AMD Navi Product Reviews and Previews (5500, 5600 XT, 5700, 5700 XT)

I do question the benefit of 8K video. Users would need truly ginormous TVs to see the difference between 4K and 8K unless you like to sit with your nose to the screen. Like the 3D TV fad, this is just manufacturers trying to sell useless garbage to consumers.
 
There are 8K TVs without HDMI 2.1?
That's preposterous! What do they use? HDMI 2.0b for 4K60 and then they just upscale the content?

I do question the benefit of 8K video. Users would need truly ginormous TVs to see the difference between 4K and 8K unless you like to sit with your nose to the screen. Like the 3D TV fad, this is just manufacturers trying to sell useless garbage to consumers.
As the owner of a 65" 4K TV, I'd say if I had an 80"+ model or a projector on a 100" screen then I'd probably want 8K too.
 
There are 8K TVs without HDMI 2.1?
That's preposterous! What do they use? HDMI 2.0b for 4K60 and then they just upscale the content?
Yes, there are, e.g. the Sony Bravia KD-98ZG9. Its HDMI input allows for "7680 x 4320p (24, 25, 30 Hz)" (official specification). It is capable of playing current online 8K streams (H.265/VP9), but supports neither "native" 8K codecs (AV1/H.266) nor native 8K@60 video input. So once online streaming services like YouTube or Netflix move to AV1, there will be no way to play it on these TVs: the internal decoder can't handle AV1, and external decoders can't be connected because of the HDMI 2.0 limitation (maybe it will work if the external device supports a framerate reduction from 60 to 30 FPS, but who knows). The first generation of 8K TVs is just a gimmick; the second generation (the current one) is slightly better thanks to HDMI 2.1 support, but AV1 support remains elusive. The so-called 8K Association doesn't help either: it lets manufacturers put an 8K logo on products that don't support any codec developed for 8K video (AV1/H.266).
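For anyone wondering why these sets top out at 8K30 over HDMI 2.0, a back-of-the-envelope bandwidth check makes it clear. A rough Python sketch: it counts pixel data only and ignores blanking intervals (so real link usage runs somewhat higher), and the effective-rate constants are the commonly cited figures for each standard:

```python
# Rough uncompressed-video bandwidth check: pixel data only,
# ignoring blanking intervals, so actual link usage is higher.

HDMI_2_0_GBPS = 14.4   # 18 Gbps TMDS minus 8b/10b encoding overhead
HDMI_2_1_GBPS = 42.7   # 48 Gbps FRL minus 16b/18b encoding overhead

def gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "4K60 4:4:4 8-bit":  gbps(3840, 2160, 60, 24),
    "8K30 4:2:0 8-bit":  gbps(7680, 4320, 30, 12),  # the KD-98ZG9's ceiling
    "8K60 4:2:0 8-bit":  gbps(7680, 4320, 60, 12),
    "8K60 4:2:0 10-bit": gbps(7680, 4320, 60, 15),
}

for name, rate in modes.items():
    if rate <= HDMI_2_0_GBPS:
        verdict = "fits HDMI 2.0"
    elif rate <= HDMI_2_1_GBPS:
        verdict = "needs HDMI 2.1"
    else:
        verdict = "needs compression even on HDMI 2.1"
    print(f"{name}: {rate:5.1f} Gbps -> {verdict}")
```

8K30 with chroma subsampling just squeezes under the HDMI 2.0 ceiling, which lines up with that spec sheet; anything 8K@60 needs HDMI 2.1.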
 
As the owner of a 65" 4K TV, I'd say if I had an 80"+ model or a projector on a 100" screen then I'd probably want 8K too.
Even at 100" you would need to sit closer than 2 meters to the screen to see a difference so that is the bare minimum size. I have a 65" 4k tv as well and I don't see any benefit to 4k video in my largeish living room unless i sit on the floor with the children.
 
Plus resolution isn't everything... I see some TV shows and movies in 4K that look like a bad upscale...
 
Even at 100" you would need to sit closer than 2 meters to the screen to see a difference so that is the bare minimum size. I have a 65" 4k tv as well and I don't see any benefit to 4k video in my largeish living room unless i sit on the floor with the children.
I sit a bit less than 2m away from the TV and the difference between 4K and 1080p is very noticeable.
OTOH, it might be because 1080p content has a much lower bitrate so the TV has much less data to handle.

Plus resolution isn't everything... I see some TV shows and movies in 4K that look like a bad upscale...
Yes. If I had the choice, I'd rather have Netflix serving 4K60 Dolby Vision content with significantly higher bitrates.
If ~16Mbps can do all that, I can only imagine how much better that TV could do with 40-60Mbps at its disposal.
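To put those numbers in perspective, a rough bits-per-pixel comparison helps. The bitrates below are typical published figures rather than measurements, and codec efficiency differs (HEVC is roughly twice as efficient as AVC), so this only shows the order of magnitude:

```python
# Bits spent per pixel per frame for various sources (rough figures).

def bits_per_pixel(mbps, width, height, fps):
    return mbps * 1e6 / (width * height * fps)

sources = [
    ("Netflix 4K24 stream, ~16 Mbps HEVC", 16, 3840, 2160, 24),
    ("Wished-for 4K60 stream, 60 Mbps",    60, 3840, 2160, 60),
    ("1080p Blu-ray, ~25 Mbps AVC",        25, 1920, 1080, 24),
    ("UHD Blu-ray, ~60 Mbps HEVC",         60, 3840, 2160, 24),
]

for name, mbps, w, h, fps in sources:
    print(f"{name}: {bits_per_pixel(mbps, w, h, fps):.2f} bits/pixel")
```

Even a 60 Mbps 4K60 stream would still spend far fewer bits per pixel than a 1080p disc, which is part of why hard-to-encode scenes suffer so much at streaming bitrates.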
 
And content creators / studios have to have the will to deliver good PQ. Some studios love grainy looks and weird filters, which don't deliver much in 4K... So I'll wait until everything is done properly in 4K before thinking about 8K :eek:
 
Some studios love grainy looks and weird filters, which don't deliver much in 4K...
I wish I could throw figurative punches at the directors who do this...
Consumers and companies are spending all this money on technologies that let us record and watch video in a clearer, more immersive way, just to have the guy in charge of the film/series artificially lowering said immersiveness with post-production filters because of nostalgia.
 
I sit a bit less than 2m away from the TV and the difference between 4K and 1080p is very noticeable.
OTOH, it might be because 1080p content has a much lower bitrate so the TV has much less data to handle.


Yes. If I had the choice, I'd rather have Netflix serving 4K60 Dolby Vision content with significantly higher bitrates.
If ~16Mbps can do all that, I can only imagine how much better that TV could do with 40-60Mbps at its disposal.
Yeah, bitrate can be pretty bad. There is horrible banding in streaming content in dark scenes, for example.
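Part of why dark scenes are the worst case: after the display transfer curve, only a slice of the code range covers the darkest shades, and a bit-starved encoder then flattens those few levels even further. A toy sketch, assuming a plain 2.2 gamma and (arbitrarily) calling 0-5% luminance "dark":

```python
GAMMA = 2.2  # simplified SDR transfer curve

def codes_in_range(luma_lo, luma_hi, bits):
    """Count code values covering a slice of linear luminance."""
    levels = 2 ** bits - 1
    lo = round(luma_lo ** (1 / GAMMA) * levels)
    hi = round(luma_hi ** (1 / GAMMA) * levels)
    return hi - lo

for bits in (8, 10):
    n = codes_in_range(0.0, 0.05, bits)
    print(f"{bits}-bit video: {n} code values span 0-5% luminance")
```

That roughly 4x difference in available steps is part of why 10-bit HDR/Dolby Vision streams band less, though at these bitrates the encoder's quantization usually does more damage than the bit depth itself.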
 
Having only just recently started using Netflix, I've noticed that at 1080p (I don't have a 4K TV), the image quality is far superior to cable TV and much better than a lot of my H.265 Blu-ray rips, especially for banding in dark scenes. I'm quite impressed with it.
 
I wish I could throw figurative punches at the directors who do this...
Consumers and companies are spending all this money on technologies that let us record and watch video in a clearer, more immersive way, just to have the guy in charge of the film/series artificially lowering said immersiveness with post-production filters because of nostalgia.

Even more, I wish I could throw figurative punches at the game developers who do this because they consider it part of the "filmic" look, just because film directors do it.

Thankfully, on PC you can usually disable those effects, unlike on console.

Regards,
SB
 
AMD reconfigures Radeon RX 5600 XT
Media outlets worldwide are hard at work on their 5600 XT reviews, and yes, yesterday the media samples required new firmware; I can confirm this personally. While we cannot confirm or say what triggered AMD to do that, we can confirm that there are significant changes in the firmware configuration.
...
The new OC models will get a board power allowance of 180 Watts in total; the TGP (GPU power) is increased from 150 to 160 Watts. All in all, that's a valid tweak, as the boost clock just jumped from the advertised 1560 MHz to, wait for it... 1700 MHz.

In the end, the consumer wins here; for AMD, I am not sure how this will pan out, as they are now actively cannibalizing their own Radeon RX 5700 (non-XT). Next week you can expect some reviews. But currently, we have to re-do all reviews from scratch, as this reconfiguration changes everything from power measurements to thermals and performance.
https://www.guru3d.com/news-story/n...icing-amd-reconfigures-radeon-rx-5600-xt.html
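For a sense of scale, here are the deltas in the numbers Guru3D quotes (trivial arithmetic; whether boards actually sustain the new boost clock is a separate question):

```python
# Before/after the RX 5600 XT firmware reconfiguration.
old = {"boost_clock_mhz": 1560, "tgp_w": 150}
new = {"boost_clock_mhz": 1700, "tgp_w": 160}

for key in old:
    change = (new[key] - old[key]) / old[key] * 100
    print(f"{key}: {old[key]} -> {new[key]} ({change:+.1f}%)")
```

A +9% clock bump against a +6.7% TGP bump, so on paper perf/W should even improve slightly if the boost holds.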
 
Honestly, at a 1560 MHz boost they were totally sandbagging the 5600 XT's performance.
Though this change might indeed come at a cost to the 5700's sales.
 
They might think the decreased 5700 sales will be offset by increased 5600 XT sales. Nvidia just lowered the price of the 2060, and AMD will be looking to compete with it with this card, both at less than $300. With the old specs it would probably have landed below the 2060, but now it will likely compete pretty well while maybe being slightly cheaper.
 
Seems the 5700 sold badly; only the XT sold well. (Just a personal impression from following sales numbers at some bigger shop site.)

Slightly cheaper than the 2060 is not enough to compete, because of features. But if people are willing to ignore this just for 'slightly less', or out of brand loyalty, then so be it.
I still think AMD should undercut NV significantly - it would be no shame as long as there is no feature parity.
But they did not. And it seems they got away with it.
(I'm worried PC gaming will become some luxury niche. If only AMD would do a 'Ryzen' for GPUs as well...)
 
Seems the 5700 sold badly; only the XT sold well. (Just a personal impression from following sales numbers at some bigger shop site.)

Slightly cheaper than the 2060 is not enough to compete, because of features. But if people are willing to ignore this just for 'slightly less', or out of brand loyalty, then so be it.
I still think AMD should undercut NV significantly - it would be no shame as long as there is no feature parity.
But they did not. And it seems they got away with it.
(I'm worried PC gaming will become some luxury niche. If only AMD would do a 'Ryzen' for GPUs as well...)
That's a bit more unlikely to happen than in x86-land, though. Apart from Ryzen a) being a giant leap for AMD and b) increasingly utilizing the microarchitecture's potential, there were factors c and d which also contributed to its success: c being Intel having horrific manufacturing problems for 3+ years without taking appropriate action at the first sign of trouble, and d being Intel being so full of themselves that they continued to sell the smallest increments of improvement possible. Though maybe d is more closely related to c, and I don't give c's dimension enough credit. :)

But then, as can be seen in x86-land: As soon as it gets the upper hand, AMD behaves just like a normal company would, charging according to the value of their products, not selling them for (dirt) cheap.
 
But then, as can be seen in x86-land: As soon as it gets the upper hand, AMD behaves just like a normal company would, charging according to the value of their products, not selling them for (dirt) cheap.

As any company that has shareholders and doesn't want to be sued into oblivion by those shareholders should do.
 
Sure, and selling underpriced is not healthy for anyone. But getting a 10 TF GPU + 8 GB HBM for 230 bucks (Vega 56), and assuming they still make money from that, then looking at the 5600, which will likely cost more, something seems a bit stretched.
Though, I see the specs are better than I remembered, and now after the increase it looks even better for this OC model: https://www.techpowerup.com/gpu-specs/sapphire-pulse-rx-5600-xt.b7552
I'd agree to 250, but for 300 I'd get RT instead.
 
Sure, and selling underpriced is not healthy for anyone. But getting a 10 TF GPU + 8 GB HBM for 230 bucks (Vega 56), and assuming they still make money from that, then looking at the 5600, which will likely cost more, something seems a bit stretched.
Though, I see the specs are better than I remembered, and now after the increase it looks even better for this OC model: https://www.techpowerup.com/gpu-specs/sapphire-pulse-rx-5600-xt.b7552
I'd agree to 250, but for 300 I'd get RT instead.
The 7nm process is nearly twice as expensive per mm² and transistor density hasn't doubled (at least on AMD's chips: Vega 10 has about 25 million transistors per mm², Navi 10 about 41 million).
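Putting those two figures together shows the squeeze. A quick sketch; the 2x wafer-cost multiplier is the commonly cited ballpark for 7nm vs 14nm, assumed here rather than a known number, and the densities follow from the two chips' published transistor counts and die sizes:

```python
vega10_mtr_per_mm2 = 25.0   # ~12.5B transistors / 495 mm^2 (14nm)
navi10_mtr_per_mm2 = 41.0   # ~10.3B transistors / 251 mm^2 (7nm)
wafer_cost_ratio   = 2.0    # assumed: 7nm cost per mm^2 vs 14nm

density_gain = navi10_mtr_per_mm2 / vega10_mtr_per_mm2
cost_per_transistor_ratio = wafer_cost_ratio / density_gain
print(f"density gain: {density_gain:.2f}x, "
      f"cost per transistor: {cost_per_transistor_ratio:.2f}x")
```

So each 7nm transistor plausibly costs ~20% more than its 14nm predecessor, which is why Navi can't chase Vega 56's end-of-life clearance pricing.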
 