AMD Navi Product Reviews and Previews: (5500, 5600 XT, 5700, 5700 XT)

Discussion in 'Architecture and Products' started by snc, Jul 4, 2019.

  1. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,025
    Likes Received:
    305
    Location:
    Finland
    I do question the benefit of 8K video. Users would need truly ginormous TVs to see the difference between 4K and 8K, unless you like to sit with your nose to the screen. Like the 3D TV fad, this is just manufacturers trying to sell useless garbage to consumers.
     
  2. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,617
    Likes Received:
    5,179
    There are 8K TVs without HDMI 2.1?
    That's preposterous! What do they use? HDMI 2.0b for 4K60 and then they just upscale the content?

    As the owner of a 65" 4K TV, I'd say if I had an 80"+ model or a projector on a 100" screen, then I'd probably want 8K too.
     
    no-X likes this.
  3. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,333
    Likes Received:
    290
    Yes, there are, e.g. the Sony Bravia KD-98ZG9. Its HDMI input allows "7680 x 4320p (24, 25, 30 Hz)" (official specification). It can play current online 8K streams (H.265/VP9), but supports neither "native" 8K codecs (AV1/H.266) nor native 8K@60 video input. So once online streaming services like YouTube or Netflix move to AV1, there will be no way to play it on these TVs: the internal decoder can't handle AV1, and external decoders can't be connected because of the HDMI 2.0 limitation (maybe it would work if the external device supports frame-rate reduction from 60 to 30 FPS, but who knows).

    The first generation of 8K TVs is just a gimmick. The second (current) generation is slightly better thanks to HDMI 2.1 support, but the lack of AV1 support is baffling. The so-called 8K Association doesn't help either: it lets manufacturers put an 8K logo on products that don't support any codec developed for 8K video (AV1/H.266).
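    A back-of-the-envelope sketch of why that HDMI 2.0 input tops out at 8K30: the uncompressed data rate for 8K@60 simply doesn't fit in HDMI 2.0's roughly 14.4 Gbps of effective video bandwidth. Link-budget figures are approximate and blanking overhead is ignored:

    ```python
    # Rough uncompressed video data rates vs. approximate HDMI link budgets.
    # HDMI 2.0 carries 18 Gbps raw, ~14.4 Gbps effective after 8b/10b encoding;
    # HDMI 2.1 FRL carries 48 Gbps raw, ~42.7 Gbps effective after 16b/18b.

    def data_rate_gbps(width, height, fps, bits_per_pixel):
        """Uncompressed pixel data rate in Gbps, ignoring blanking overhead."""
        return width * height * fps * bits_per_pixel / 1e9

    hdmi20_effective = 14.4   # Gbps, approximate
    hdmi21_effective = 42.7   # Gbps, approximate

    # 8K@60 with 8-bit RGB (24 bpp): far beyond HDMI 2.0's budget.
    r_8k60 = data_rate_gbps(7680, 4320, 60, 24)
    # 8K@30 with 4:2:0 chroma subsampling (12 bpp): just fits HDMI 2.0,
    # which matches the "7680 x 4320p (24, 25, 30 Hz)" spec quoted above.
    r_8k30 = data_rate_gbps(7680, 4320, 30, 12)

    print(f"8K60 RGB:   {r_8k60:.1f} Gbps (HDMI 2.0 limit ~{hdmi20_effective})")
    print(f"8K30 4:2:0: {r_8k30:.1f} Gbps")
    ```

    Even dropping to 4:2:0 at 60 Hz (~23.9 Gbps) still overshoots HDMI 2.0, so 30 Hz is the best the port can do at 8K.
    
    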
     
    BRiT likes this.
  4. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,025
    Likes Received:
    305
    Location:
    Finland
    Even at 100" you would need to sit closer than 2 meters to the screen to see a difference, so that is the bare minimum size. I have a 65" 4K TV as well, and I don't see any benefit to 4K video in my largish living room unless I sit on the floor with the children.
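    That 2-meter figure checks out under the standard (simplified) 1-arcminute visual-acuity rule. A sketch, assuming a 16:9 panel and 20/20 vision; real perception is messier, so treat the numbers as illustrative:

    ```python
    import math

    def max_useful_distance_m(diagonal_inches, horizontal_pixels,
                              acuity_arcmin=1.0, aspect=(16, 9)):
        """Farthest viewing distance at which one pixel still subtends the
        given visual acuity (~1 arcminute for 20/20 vision). Beyond this
        distance, extra resolution is no longer resolvable."""
        w, h = aspect
        width_m = diagonal_inches * 0.0254 * w / math.hypot(w, h)
        pixel_pitch_m = width_m / horizontal_pixels
        return pixel_pitch_m / math.radians(acuity_arcmin / 60)

    # On a 100" screen, 4K pixels stop being resolvable beyond roughly 2 m,
    # so 8K only adds visible detail if you sit closer than that.
    print(f'4K @ 100": {max_useful_distance_m(100, 3840):.2f} m')
    print(f'8K @ 100": {max_useful_distance_m(100, 7680):.2f} m')
    ```

    Doubling the horizontal resolution halves the useful distance, which is why 8K's sweet spot on a 100" screen sits around one meter away.
    
    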
     
  5. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,384
    Likes Received:
    765
    Location:
    France
    Plus, resolution isn't everything... I've seen some TV shows and movies in 4K that look like a bad upscale...
     
  6. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,617
    Likes Received:
    5,179
    I sit a bit less than 2m away from the TV and the difference between 4K and 1080p is very noticeable.
    OTOH, it might be because 1080p content has a much lower bitrate so the TV has much less data to handle.

    Yes. If I had the choice, I'd rather have Netflix serving 4K60 Dolby Vision content with significantly higher bitrates.
    If ~16Mbps can do all that, I can only imagine how much better that TV could do with 40-60Mbps at its disposal.
     
  7. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,384
    Likes Received:
    765
    Location:
    France
    And content creators / studios have to have the will to deliver good PQ. Some studios love grainy looks and weird filters, which don't deliver much in 4K... So I'll wait until everything is done properly in 4K before thinking about 8K :eek:
     
  8. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,617
    Likes Received:
    5,179
    I wish I could throw figurative punches at the directors who do this.
    Consumers and companies are spending all this money on technologies that let us record and watch video in a clearer and more immersive way, only to have the guy in charge of the film/series artificially lower said immersiveness with post-production filters, all for the sake of nostalgia.
     
    Frenetic Pony likes this.
  9. Rootax

    Veteran Newcomer

    Joined:
    Jan 2, 2006
    Messages:
    1,384
    Likes Received:
    765
    Location:
    France
    Or just don't release 4K versions in that case; sometimes it looks worse than 1080p.
     
  10. Kyyla

    Veteran

    Joined:
    Jul 2, 2003
    Messages:
    1,025
    Likes Received:
    305
    Location:
    Finland
    Yeah, bitrate can be pretty bad. There is horrible banding in streaming content in dark scenes, for example.
     
  11. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    7,440
    Likes Received:
    3,461
    Location:
    Pennsylvania
    Having only just recently started using Netflix, I've noticed that at 1080p (I don't have a 4K TV), the image quality is far superior to cable TV and much better than a lot of my H.265 Blu-ray rips, especially regarding banding in dark scenes. I'm quite impressed with it.
     
    BRiT likes this.
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,859
    Likes Received:
    5,998
    Even more, I wish I could throw figurative punches at game developers who do this because they consider it part of the "filmic" look, just because film directors do it.

    Thankfully on PC you can usually disable those effects unlike on console.

    Regards,
    SB
     
    CarstenS, ToTTenTranz and PSman1700 like this.
  13. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    3,320
    Likes Received:
    1,951
    AMD reconfigures Radeon RX 5600 XT
    https://www.guru3d.com/news-story/n...icing-amd-reconfigures-radeon-rx-5600-xt.html
     
  14. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,617
    Likes Received:
    5,179
    Honestly, at 1560MHz boost they were totally sandbagging the 5600 XT's performance.
    Though this change might indeed come at a cost to the 5700's sales.
     
  15. Esrever

    Regular Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    747
    Likes Received:
    517
    They might think the decreased 5700 sales will be offset by increased 5600 XT sales. Nvidia just lowered the price of the 2060, and AMD will be competing against it with this card, both at less than $300. With the old specs it would probably have fallen below the 2060, but now it will likely compete pretty well while maybe being slightly cheaper.
     
  16. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    840
    Likes Received:
    935
    Seems the 5700 sold badly; only the XT sold well. (Just a personal impression from following sales numbers at a bigger shop's site.)

    Slightly cheaper than the 2060 is not enough to compete, because of features. But if people are willing to ignore that just for "slightly less", or out of brand loyalty, then so be it.
    I still think AMD should undercut NV significantly; it would be no shame as long as there is no feature parity.
    But they did not, and it seems they got away with it.
    (I'm worried PC gaming is becoming a luxury niche. If only AMD would do a "Ryzen" for GPUs as well...)
     
    pharma, CeeGee and Silent_Buddha like this.
  17. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,875
    Likes Received:
    2,181
    Location:
    Germany
    That's a bit more unlikely to happen than in x86-land, though. Apart from Ryzen a) being a giant leap for AMD and b) increasingly utilizing the microarchitecture's potential, there were factors c and d which also contributed to its success: c being Intel having horrific manufacturing problems for 3+ years without taking appropriate action at the first sign of trouble, and d being Intel so full of themselves that they continued to sell the smallest increments of improvement possible. Though maybe d is more closely related to c, and I don't give c's dimension enough credit. :)

    But then, as can be seen in x86-land: as soon as it gets the upper hand, AMD behaves just like any normal company would, charging according to the value of its products, not selling them dirt cheap.
     
    Kej, Silent_Buddha, Lightman and 2 others like this.
  18. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,208
    Likes Received:
    526
    Location:
    en.gb.uk
    As any company that has shareholders and doesn't want to be sued into oblivion by those shareholders should do.
     
    Alexko likes this.
  19. JoeJ

    Regular Newcomer

    Joined:
    Apr 1, 2018
    Messages:
    840
    Likes Received:
    935
    Sure, and selling underpriced is not healthy for anyone. But when you could get a 10 TF GPU + 8 GB HBM for 230 bucks (Vega 56), and assuming they still made money on that, then looking at the 5600, which will likely cost more, something seems a bit stretched.
    Though I see the specs are better than I remembered, and after the clock increase it looks even better for this OC model: https://www.techpowerup.com/gpu-specs/sapphire-pulse-rx-5600-xt.b7552
    I'd agree to 250, but for 300 I'd get RT instead.
     
    Lightman likes this.
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,725
    Likes Received:
    2,545
    Location:
    Finland
    The 7nm process is nearly twice as expensive per mm^2, and transistor density hasn't doubled (at least on AMD's chips: Vega 10 has about 25 million transistors per mm^2, Navi 10 about 41 million).
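    Putting those two figures together shows why 7nm chips don't get cheaper per transistor. A quick sketch using only the numbers from the post (the ~2x cost-per-mm^2 figure is the post's estimate, not an official one):

    ```python
    # Figures from the post: ~2x cost per mm^2 at 7nm, density going
    # from ~25 to ~41 million transistors per mm^2 (Vega 10 -> Navi 10).
    cost_per_mm2_ratio = 2.0    # 7nm vs. 14nm wafer cost, per the post
    density_14nm = 25e6         # Vega 10, transistors / mm^2
    density_7nm = 41e6          # Navi 10, transistors / mm^2

    density_ratio = density_7nm / density_14nm            # ~1.64x
    cost_per_transistor_ratio = cost_per_mm2_ratio / density_ratio

    print(f"Density gain:        {density_ratio:.2f}x")
    print(f"Cost per transistor: {cost_per_transistor_ratio:.2f}x")
    # Cost per transistor goes UP (~1.22x): density gains don't keep
    # pace with wafer cost, so a similar transistor budget costs more.
    ```
    
    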
     
    Lightman likes this.

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.