AMD: Navi Speculation, Rumours and Discussion [2019-2020]

Status
Not open for further replies.
Right, although mainstream graphics cards tend to be more efficient anyway. I would say efficiency is not only perf/watt but also fps/price for someone's needs.

Considering @Picao84 mentioned "efficiency" after stating he didn't want to upgrade his 650W power supply, I doubt he was talking about cost efficiency.
 
I don't think "efficiency" is the word you want to use here. Perhaps you mean "lower-powered"?

Efficiency doesn't dictate a power limit. You could have a 5000W graphics card that is more efficient than an Adreno iGPU.
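To put toy numbers on that point (all values made up for illustration): perf/watt says nothing about absolute power draw, so a huge card can still come out ahead.

```python
# Efficiency (perf per watt) is independent of absolute power draw.
# All numbers below are made up purely to illustrate the argument.
def perf_per_watt(perf, watts):
    return perf / watts

big_card = perf_per_watt(100_000, 5000)  # hypothetical 5000 W card
igpu = perf_per_watt(40, 5)              # hypothetical 5 W iGPU
print(big_card > igpu)  # True: the 5000 W card is the more efficient one here
```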

Don't be so pedantic. Efficient in the context of the competition and other products from the same line of course. I'm not going to put an Adreno GPU on a desktop PC.
 
Yeah, because I like my things efficient. It has nothing to do with money.

Curious what CPU you're running in that setup. A (presumably efficient) 650W PSU, with a 65W or 95W Ryzen CPU, should allow you to run almost any GPU on the market. No need for an upgrade.
It seems to me AMD is probably making a mistake by not having Navi 22 go on sale the same day as Navi 21: high-end (not enthusiast) gamers who want to build a 5600X gaming system are going to put an RTX 3070 in it because AMD will have nothing at $500. That's 2+ months of the best sales quarter lost.

But that's pretty normal, isn't it? High end always releases first, followed by the mid-range and then low end (the only notable exception to this being RV770, of course). It's rare for two chips to release at exactly the same time. It's simply a matter of available engineering resources. Even Nvidia started selling the GA104 more than a month after the GA102 came out.

All aboard the hype train! Choo choo!
 
Curious what CPU you're running in that setup. A (presumably efficient) 650W PSU, with a 65W or 95W Ryzen CPU, should allow you to run almost any GPU on the market. No need for an upgrade.

I'm running a Ryzen 3900X. If you look at the PSU recommendation for the RTX 3080, it is a 750W PSU. The RTX 3070 already recommends a 650W one for 220W. GPU power consumption has been creeping up for a while from its usual 150W-180W for Gx104-class models. My PSU is nothing special, a Corsair VS650 I think.
 
But that's pretty normal, isn't it? High end always releases first, followed by the mid-range and then low end (the only notable exception to this being RV770, of course). It's rare for two chips to release at exactly the same time. It's simply a matter of available engineering resources. Even Nvidia started selling the GA104 more than a month after the GA102 came out.

Rumors were that Evergreen and Northern Islands were supposed to launch a whole family within one quarter, but the best we got was high-end and midrange within a month or two.
Cypress Sept 2009, Juniper Oct 2009, Hemlock Nov 2009 then Redwood in Jan/Feb 2010.
Barts Oct 2010, Cayman Dec 2010, Turks and Caicos released to OEM in Jan/Feb 2011 but retail was April 2011.

Closest family launch was OG GCN.
Tahiti Jan 2012, Cape Verde Feb 2012, Pitcairn March 2012. $500 down to $100 in 1 quarter. <$100 was all 40nm rebrands.
 
This seems promising. But if the 6800XT alone is so fast, why did AMD show benchmarks at the Zen 3 launch where Navi is a decent margin slower than the 3080? I'd love it to be as fast as these rumours are suggesting, not least for the sake of pricing, but I think it might be a bit too soon to jump on this hype train.
Agreed, synthetic 3DMark scores are still rather meaningless. Look at Intel - it's not hard to optimize for benchmarks. So we're still waiting for some "real" benches.
 
This seems promising. But if the 6800XT alone is so fast, why did AMD show benchmarks at the Zen 3 launch where Navi is a decent margin slower than the 3080? I'd love it to be as fast as these rumours are suggesting, not least for the sake of pricing, but I think it might be a bit too soon to jump on this hype train.

A little salt...

Navi1x punches above its weight in Fire Strike. Maybe that's also the case for Navi2x.
I believe these are total scores including CPU so not directly comparable.
May be an AIB card at higher clocks.
 
For those wondering about the differences in Time Spy and Fire Strike performance, this is straight from the 3dmark technical guide:

[Attached image: excerpt from the 3DMark technical guide]

My guess would be that on modern GPUs, Fire Strike is going to be more of a fillrate test, or something.

Edit: Looking at the clock speed, the potential ROP count (128) and these Fire Strike results, AMD has the potential to be the ultimate "competitive settings" GPU for CS:GO, Valorant, Fortnite, Apex etc., where people tend to play at very low settings and lower resolutions. "Ultra settings" comparisons are where things look close, but AMD still might win there too; ray tracing may muddy those waters, though. I'm curious whether Igor's Lab's Port Royal scores included DLSS for the nVidia GPU. I'd like to see native-vs-native numbers for ray tracing.
 
Why fixate on FSU scores? Igor's Lab's last leak showed a TSE score slightly above the 3080's - let's say on par. And the RT bench below that, as has been clear for months. Of course these need to be confirmed, and we need to see actual game benchmarks, but if the leaks are real, the scenario points to good competition for Ampere.
 
The Time Spy Extreme score for the 3080 looks wrong. They would have had to use a "low-end" CPU to bring a 3080 under 8000 points in the overall score. At the same time, the overall score is only 14% higher than a 2080 Ti's (FE? AIB OC version?). Sweclockers, for example, used an Intel Core i9-10900K @ 5.0 GHz, and their 3080 got 8697 points vs 6702 for the 2080 Ti FE.
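Quick sanity check of that comparison, using the Sweclockers numbers quoted above:

```python
# Sanity-checking the Sweclockers gap cited above (their published scores):
sweclockers_3080 = 8697    # RTX 3080, Time Spy Extreme overall, i9-10900K @ 5.0 GHz
sweclockers_2080ti = 6702  # RTX 2080 Ti FE, same system
gap = sweclockers_3080 / sweclockers_2080ti - 1
print(f"{gap:.1%}")  # ~29.8% - far above the 14% gap implied by the leak
```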
 
The Time Spy Extreme score for the 3080 looks wrong. They would have had to use a "low-end" CPU to bring a 3080 under 8000 points in the overall score. At the same time, the overall score is only 14% higher than a 2080 Ti's (FE? AIB OC version?). Sweclockers, for example, used an Intel Core i9-10900K @ 5.0 GHz, and their 3080 got 8697 points vs 6702 for the 2080 Ti FE.

The Igor's Lab thing is weird. I'm not sure if it's just lost in translation, but his 3DMark scores for the 6800XT are "estimated" and he did some conversion from percentages to actual scores. I don't quite get it.
 
The Igor's Lab thing is weird. I'm not sure if it's just lost in translation, but his 3DMark scores for the 6800XT are "estimated" and he did some conversion from percentages to actual scores. I don't quite get it.

Igor wrote that an AIB who produces cards for both nVidia and AMD sent him the numbers. wccftech got the same numbers: https://videocardz.com/newz/amd-radeon-rx-6800xt-alleged-3dmark-scores-hit-the-web

I think the Timespy Extreme score is switched between the 3080 and the AMD card. 8200 points overall makes more sense when the 2080TI gets ~7000.
 
Igor wrote that an AIB who produces cards for both nVidia and AMD sent him the numbers. wccftech got the same numbers: https://videocardz.com/newz/amd-radeon-rx-6800xt-alleged-3dmark-scores-hit-the-web

I think the Timespy Extreme score is switched between the 3080 and the AMD card. 8200 points overall makes more sense when the 2080TI gets ~7000.

I'm trying to make sense of this... I think he's saying that he didn't want to post their numbers, but he confirmed their 3080 numbers were similar to his. So instead of posting the 6800XT number they sent him, he "estimated" a number from his own 3080 score based on the ratio of the 3080 to 6800XT numbers they sent him. It's really convoluted, and I have no idea why he'd do it that way, but it's the only interpretation I can come up with.

[Attached image: screenshot of the relevant passage from Igor's article]
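If that interpretation is right, the "conversion" is just a ratio scale. A hypothetical sketch (function name and numbers are placeholders, not the actual leaked scores):

```python
# Sketch of the ratio-based estimation described above.
# All inputs are placeholder values, not the real leaked scores.
def estimate_score(own_3080, aib_3080, aib_6800xt):
    # Scale your own measured 3080 score by the AIB's 6800XT-to-3080 ratio.
    return own_3080 * (aib_6800xt / aib_3080)

print(estimate_score(8000, 8100, 8500))
```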
 
Yes, he wrote that he shouldn't post the real numbers for the AMD card, but he got the nVidia numbers from the source too. So he benched the 3080 and 2080 Ti FE himself (he said he got nearly the same scores...) and "estimated" the AMD number from the nVidia scores.

But the 3080 Time Spy Extreme number still looks wrong.
 