Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Those titles are having to process all lighting and shadowing on compute. If those workloads were moved onto other hardware, those 7TFs could achieve more.

It all depends on what this other 'RT hardware' is! ;)

Those titles need to run on next-generation consoles; I doubt we will have fixed-function RT lighting.

Is this coming from that NeoGAF mod?
Mentioning a specific CPU<->GPU bus speed could mean they're going with a chiplet design, with the GPU and CPU each having their own memory controllers (256-bit GDDR6 for the GPU, 64/128-bit DDR4 for the CPU).
Also, the rumored 300mm^2 for the PS5's GPU would fit those 10-11 TFLOPs if it's a chip somewhat wider than the ~250mm^2 Navi 10, running at lower, more power-efficient clocks.
In practice, it'd be a chip with 20% more compute resources (24 DCUs) clocked at ~1750MHz (which the GPU could sustain if they use TSMC's 2nd-gen 7nm).
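Quick back-of-the-envelope on those figures (a sketch only; the DCU count and clock are the rumored numbers above, and the 14Gbps GDDR6 / DDR4-3200 memory speeds are my own assumptions):

```python
# Sanity-checking the rumored figures. All inputs are rumors/assumptions.

def tflops(cus: int, clock_ghz: float, alus_per_cu: int = 64) -> float:
    """Peak FP32 throughput: ALUs x 2 ops per clock (FMA) x clock."""
    return cus * alus_per_cu * 2 * clock_ghz / 1000

# 24 DCUs = 48 CUs (RDNA groups CUs in pairs)
print(tflops(48, 1.75))       # ~10.75 -> lands inside the 10-11 TFLOPs rumor

# Bandwidth for the speculated split memory controllers
gddr6 = 256 / 8 * 14          # 256-bit bus @ 14Gbps -> 448.0 GB/s for the GPU
ddr4 = 128 / 8 * 3.2          # 128-bit bus @ DDR4-3200 -> 51.2 GB/s for the CPU
print(gddr6, ddr4)
```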



Thankfully, the PS5 isn't releasing today. It's releasing over a year from now.

Also, we have no idea about the economics of the RX 5700 series. For all intents and purposes, they're mid-range cards using mid-range-sized chips on a 256-bit memory bus, with a PCB whose power characteristics are similar to the RX 580/RX 590's.
For all we know, the RX 5700 XT could be selling for $250 and still turn a nice profit for AMD.

No, this is my guess based on that information (AMD Gonzalo, AMD Flute) and Kelly Rickards' information. The bus bandwidth and everything else are partly guesswork, and the same goes for the SSD bandwidth.
 
Also important, and not so baseless perhaps, is the build quality, and perhaps its uses other than games. My first launch (fat) PS4 got the blue light of death issue; I got it replaced, but the replacement is very loud, especially in DOOM and, strangely enough, the Fortnite menus. My sibling's PS4 has some problem with the controllers not finding the PS4, or the other way around; we tried everything to no avail. The controllers work on W10. I heard the Slim is quieter, but the Pro seems hit or miss. A good cooling solution would be very welcome, so it doesn't run hot and loud. Is it a 60fps game problem? Doom seems to torture my PS4 more than HZD; both games are lookers, but one is 60fps.

While I didn't experience it myself, I see rather a lot of used units with defective HDMI ports, broken power buttons, DualShock 4 controllers with bad batteries, etc.
Finally, the OS: right now the PS4's web browser is kinda useless. Google Maps Street View is awfully slow for some reason; I needed a quick view of a street and the phone's screen isn't big enough. The browser is slow in general and Flash isn't supported, and there's no option to install your own browser either.
Yes, the hardware is interesting, but so are the next OS and its functionality.
 
Is it a 60fps game problem? Doom seems to torture my PS4 more than HZD; both games are lookers, but one is 60fps.
Higher framerate means it's working more at every stage (off-die accesses are doubled too). :)

edit:
In the course of a frame, only parts of the GPU/CPU are active at any given time (as the workloads move from stage to stage), although async compute is also meant to increase GPU utilization, so things can heat up there as well (similarly, out-of-order execution works to fill in the bubbles with independent loads).
 
But how much space does the RT hardware take up? Since we have zero clues about anything GPU-related, we don't know.
When comparing TU106 to TU116, we know it takes very little space/transistors on Turing cards. We don't have a lot of clues, but we don't have zero clues.
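For a rough sense of scale, the published die sizes and SM counts give a naive per-SM estimate (naive because dividing area by SM count ignores uncore differences like the wider memory bus on TU106):

```python
# Naive per-SM area comparison of Turing with and without RT/tensor cores.
# Published die sizes and SM counts; treat the split as a rough bound only.
tu106_mm2, tu106_sms = 445, 36   # RTX 2070 class, with RT + tensor cores
tu116_mm2, tu116_sms = 284, 24   # GTX 1660 Ti class, no RT/tensor cores

print(tu106_mm2 / tu106_sms)     # ~12.4 mm^2 per SM
print(tu116_mm2 / tu116_sms)     # ~11.8 mm^2 per SM, only a few percent smaller
```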


All I'm seeing is wishes for the highest TF number possible. Probably to win an arms race against the competition, or something.
What competition?
Rumors are pointing to both consoles having ~10 TFLOPs GPUs. We won't see anything like 1.8 vs 1.3 TFLOPs and 32 vs. 16 ROPs like we saw with the PS4 vs. XBone.
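Those last-gen numbers fall straight out of the known CU counts and clocks:

```python
# Last-gen gap from known CU counts and clocks (64 ALUs/CU, 2 ops/clock)
ps4 = 18 * 64 * 2 * 0.800 / 1000    # 18 CUs @ 800 MHz -> ~1.84 TFLOPs
xbone = 12 * 64 * 2 * 0.853 / 1000  # 12 CUs @ 853 MHz -> ~1.31 TFLOPs
print(ps4 / xbone)                  # ~1.41x, a gap the current ~10 vs ~10 rumors don't suggest
```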


Exactly, and its general specs are set in stone.
The general specs could have been set in stone 5 years ago.
The console is still only going to be mass-produced sometime in 2020, using 2020's price-per-wafer, 2020's price-per-GB of RAM and 2020's price-per-GB of solid-state storage.
Which is what matters, not the time someone decided the spec would be closed.

The RX 5700 XT is AMD's highest-end dGPU at the moment; expecting the raw power of that + ray tracing seems far-fetched.
The RX5700XT is AMD's highest end dGPU because they're not competing on the high-end.
I'm not expecting the raw power of that, I'm actually expecting a bit more than that.


Like DF mentioned in one of their videos, there will be disappointments.
There were always going to be disappointments. Even more so with rumors of >14TFLOPs circulating the web, before RDNA brought a step back in raw floating-point throughput compared to GCN5.

That doesn't mean I think Microsoft will launch a console in 2020 that has less than 17% more raw throughput than their mid-gen refresh launched 3 years before.
People who think that must be really afraid of being disappointed, IMO.
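For reference, that 17% is just the One X's 6 TFLOPs scaled up to 7:

```python
# A 7 TFLOPs Scarlett vs. the 6 TFLOPs Xbox One X from 2017
print(7.0 / 6.0 - 1)   # ~0.167 -> under 17% more raw throughput
```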
 
Exactly, the AMD 5700 XT is not a high-end GPU. I don't expect an AMD high-end GPU, but something mid-range, in 2020. And I don't expect something much more powerful than this GPU; it could be 10 to 10.5 TFLOPs, for example.
 
When comparing TU106 to TU116, we know it takes very little space/transistors on Turing cards.

If it's anything like Turing.

What competition?
Rumors are pointing to both consoles having ~10 TFLOPs GPUs.

Obviously the Xbox camp. Not saying it's happening here, but check other forums, where TF wars are already raging.
There are also ~8TF rumors; take whichever suits you best, neither has any real meaning.

Which is what matters, not the time someone decided the spec would be closed.

I meant: did they decide on Navi and Zen 2, with a last-minute change to Navi 2/Zen 3? It's possible we might see a custom solution, a hybrid RDNA/RDNA2. This was 'leaked' somewhere.

The RX5700XT is AMD's highest end dGPU because they're not competing on the high-end.
I'm not expecting the raw power of that, I'm actually expecting a bit more than that.

They haven't been competing in that segment for years, and it doesn't seem to be their strategy either.
Hope you're right though, and that we'll see some highest-end monster of an RDNA2 GPU with the best possible RT for the time. And Zen 3, of course, while we're at it. 11/12TF and 24GB of HBM2+.

Exactly, the AMD 5700 XT is not a high-end GPU. I don't expect an AMD high-end GPU, but something mid-range, in 2020. And I don't expect something much more powerful than this GPU; it could be 10 to 10.5 TFLOPs, for example.

Maybe, we will see. Expecting a mid-range AMD GPU isn't baseless though :)
 
This is why it seems plausible. He did not even say more than 11 TFLOPs, but more than 10 TFLOPs. It could be 10.0 or 10.1 TFLOPs; maybe they push it a little for marketing purposes. Nothing far from the 9.2 TFLOPs of a 5700 XT.
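That 9.2 lines up with the 5700 XT's 40 CUs at roughly 1.8GHz, which sits between the card's official game (1755MHz) and boost (1905MHz) clocks:

```python
# RX 5700 XT peak FP32 at its published clocks: 40 CUs x 64 ALUs x 2 ops
for clock_ghz in (1.755, 1.8, 1.905):          # game, ~average, boost
    print(clock_ghz, 40 * 64 * 2 * clock_ghz / 1000)
# -> ~8.99, ~9.22 and ~9.75 TFLOPs respectively
```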
 
Nothing far from the 9.2 TFLOPs of a 5700 XT.

An RX 5700 XT is AMD's highest end so far; it doesn't even contain any ray tracing hardware and is on an older, less efficient architecture.

Why even bother?

Totally different architectures, the inclusion of ray tracing, and a 7-year time span between them.

Oh, you're right. For some reason when I saw 1 TB/s, I was thinking 1 GB/s. I guess my brain auto-corrected something that was obviously untrue.

Something like an Intel Optane 905P is already incredibly fast. If games and software were optimized for that, then with PCIe 4.0 and fast CPUs you'd have a very fast storage solution.
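To put storage speeds in perspective, here's roughly how long streaming in 40GB takes at various sequential read rates (ballpark drive figures; the 40GB install size is an arbitrary example):

```python
# Rough time to read a 40 GB install at typical sequential rates (GB/s)
game_gb = 40                       # arbitrary example size
drives = {
    "5400rpm HDD":      0.1,
    "SATA SSD":         0.55,
    "Optane 905P":      2.6,
    "PCIe 4.0 x4 NVMe": 5.0,
}
for name, rate in drives.items():
    print(f"{name}: ~{game_gb / rate:.0f} s")
```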
 
Dev kits will give you a ballpark figure, not actual specs.

X dev kits have 6.6 TFLOPs and 24 GB of RAM, which aren't retail specs.

This might be why Sony is using a beefy-looking form factor for its dev kits. Sony's current dev kits may be running 5700 XTs at boost clocks in a sustained fashion to provide more TFLOPs than the retail units will have.
 
Devkits are always more powerful, which is also why the comparison of the PS5 devkit being > Scarlett, like how X1S > X1, is largely inconsequential. It's a data point that is worthless with respect to release hardware for obvious reasons: the difference between X1S and X1 is a minor bump in clock speed, and no one knows what clock speeds are being settled on this early. At least not well enough to be able to spot a performance differential of S > One.

As for the topic of 7 vs. 10 or 11 TF: hanging the debate on which source is better seems like a waste of effort.
Let's just do the BOM on them and get an idea of where we are landing.
The closer to $399 the better; the further away, the less likely I see it happening.
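A minimal sketch of that exercise; every cost below is a made-up placeholder purely to illustrate the math, not a sourced estimate:

```python
# Toy BOM calculator -- all numbers are illustrative placeholders,
# swap in your own estimates per component.
bom_usd = {
    "APU (CPU+GPU)":        150,  # placeholder
    "16GB GDDR6":           100,  # placeholder
    "1TB SSD":              100,  # placeholder
    "board/PSU/cooling":     60,  # placeholder
    "controller/packaging":  50,  # placeholder
}
total = sum(bom_usd.values())
print(f"BOM total: ${total} vs. a $399 target (consoles often launch at or near a BOM loss)")
```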
 

AFAIK the last two PlayStation dev kits have had the exact same specs (clocks, TF, RAM amount) as the retail unit.

Unless someone here knows differently? Maybe they'll change with the PS5 dev kit, but we'll see I guess.
 

Where did you get that information from? The info always seems pretty light on dev kit specs. "Same chip" is accurate, but I've never seen detailed info in terms of CPU/GPU frequency. Even psdevwiki lacks that info.

Plus these aren't final hardware kits. Earlier PS4 dev kits used 8 Bulldozer cores, which would not have given an accurate look at the final hardware other than core count.
 

I'll wait it out until early next year. I am betting on timing similar to last gen: dev kits with actual hardware released around Dec/Jan with comprehensive docs detailing specs, then leaks with a ton of accurate info followed by official reveals. If I am right, Q1 2020 is going to be a great time.
 