Digital Foundry Article Technical Discussion [2019]

DF's written article on the PS5 and XSX spec leaks: https://www.eurogamer.net/articles/...laystation-5-xbox-series-x-spec-leak-analysed

The scale and scope of this latest leak is remarkable and the origin of the new information seems even more far-fetched than the Gonzalo story, leading many to believe that the entire thing may be a work of fiction. However, having looked into the situation and independently verified the source, the overwhelming evidence is that the data does indeed originate from AMD - and hasn't been doctored. We're lacking crucial context for sure but the reasons to doubt the veracity of the leak are somewhat thin on the ground.

From what I can gather, someone at AMD's ASIC validation department used GitHub to store fragments of internal testing data from a range of work-in-progress Team Red projects. The leaks include testing of next-gen desktop and mobile Ryzen APUs along with some deep-dive testing on the PS5 chip, now codenamed Oberon. While the data is not public, it's clear that the GitHub test data has travelled far and wide: further details from the leak mentioned in this article are being discussed at length on ResetEra, for example. The genie is out of the bottle.

There we have it, then: PS5 at 9TF. It's what I expected.

Nothing is certain until Sony gives an official statement on the PS5 specs. These leaks can always be fake, or describe something other than the PS5 chip. I've also seen claims that this leak is fake and/or that it isn't about the PS5 chip at all but something else.

AMD tests a lot of chips, so this could be anything: even a different version of the PS5 chip, with the real one never leaking, or partial tests for some compatibility mode. Who knows until it's official.

Just because a popular site/channel/person says so doesn't make it 100% trustworthy.

There are always people who like to troll or chase fame, every generation.


I would guess there's maybe a 30% chance the PS5 will be a 9 TFLOP machine and a 70% chance it's something else.

This 36 CU @ 2 GHz just doesn't sound believable to me; the clocks are too high for a console and the CU count is too low for the available tech.

Sony isn't a stupid company; they have epic engineers, so I would be surprised if they lost this badly to Xbox, unless it really is only about the price-point strategy.
 
This 36 CU @ 2 GHz just doesn't sound believable to me; the clocks are too high for a console and the CU count is too low for the available tech.

I'm also convinced 9 TF isn't it. Everything we know about the Navi architecture and 7nm thermals tells us that 36 CUs at 2 GHz is ludicrous. If it's 36 CUs, it'll be clocked lower, so under 9 TF.
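For reference, the 9 TF figure is just the standard GCN/RDNA arithmetic: 64 shader ALUs per CU, 2 FLOPs per ALU per clock from fused multiply-add. A quick illustrative check in Python, assuming those standard figures:

def tflops(cus, clock_ghz):
    # FP32 throughput: CUs x 64 ALUs x 2 FLOPs per clock (FMA)
    return cus * 64 * 2 * clock_ghz / 1000.0

for clock in (2.0, 1.8, 1.6):
    print(f"36 CUs @ {clock} GHz -> {tflops(36, clock):.2f} TF")
# 36 CUs @ 2.0 GHz -> 9.22 TF
# 36 CUs @ 1.8 GHz -> 8.29 TF
# 36 CUs @ 1.6 GHz -> 7.37 TF

So if the 36 CU count holds but the clock lands in a more conventional 1.6-1.8 GHz range, you're looking at roughly 7.4-8.3 TF rather than 9.2 TF.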

Sony isn't a stupid company; they have epic engineers, so I would be surprised if they lost this badly to Xbox, unless it really is only about the price-point strategy.
*cough* PS3 *cough*.
 
It is highly likely MS showed their premium console and kept the other one for the full next-gen announcement.
MS expects Sony to announce a product designed for the best balance between price and performance. That gives MS a chance to seize the opportunity from Sony and have the most powerful console, with the other Xbox being the more affordable one.
 
They may look worse, but only when compared to scenes with no performance implications. When performance is held identical, using these techniques produces better-looking scenes than not using them.
Again, I get what you are saying. In fact, I said as much...
Better quality via techniques that objectively offer worse quality. I get what you are saying: those techniques will allow developers to push overall image quality because they have built-in ways to stabilize performance, but that's also somewhat counter to pushing no-compromise image quality.
 
36 CUs is still about 300-315 mm^2 of die space, so it does sound believable, and the 2.0 GHz clock is probably a reaction to Xbox.

But what doesn't sound believable is Arden. That thing would be over 400 mm^2, which sounds too good to be true.
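As a very rough sanity check on that area figure, using published desktop numbers as proxies (these are assumptions, not anything from the leak): Navi 10 is about 251 mm^2 for 40 CUs including its GDDR6 interface, and a desktop Zen 2 chiplet is about 74 mm^2.

navi10_mm2 = 251.0                     # RX 5700-series die: 40 CUs + 256-bit GDDR6 PHYs
zen2_ccd_mm2 = 74.0                    # 8 Zen 2 cores + 32 MB L3 (desktop chiplet)
gpu_36cu_mm2 = navi10_mm2 * 36 / 40    # naive linear scaling, ignores fixed-function blocks
print(f"GPU portion:  ~{gpu_36cu_mm2:.0f} mm^2")
print(f"APU estimate: ~{gpu_36cu_mm2 + zen2_ccd_mm2:.0f} mm^2")
# GPU portion:  ~226 mm^2
# APU estimate: ~300 mm^2

Crude as it is (a console APU would trim the L3 and add its own I/O), it lands in the same ballpark as the 300-315 mm^2 quoted above.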
 
36 CUs is still about 300-315 mm^2 of die space, so it does sound believable, and the 2.0 GHz clock is probably a reaction to Xbox.
You can't react that way though. If the silicon can't realistically support 2 GHz, you can't just turn the clock up to hit 2 GHz. It needs to be part of the design, and would have to be a choice made when settling on the 36 CUs in the first place. Otherwise you release a very hot, unreliable system and/or need some expensive cooling solution. It just wouldn't be cost-effective to get a 10% improvement in GPU performance that makes a negligible difference on screen and won't close the numbers gap with the rival.
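To put illustrative numbers on that trade-off (the 1.8 GHz baseline and the voltages are assumptions, not leaked figures; the dynamic-power relation P ~ f*V^2 is the usual rule of thumb):

base_clock, bumped_clock = 1.8, 2.0    # GHz (assumed baseline vs. late bump)
base_v, bumped_v = 1.00, 1.10          # relative voltages (assumed: higher clocks need more voltage)

perf_gain = bumped_clock / base_clock - 1
power_gain = (bumped_clock * bumped_v**2) / (base_clock * base_v**2) - 1
print(f"GPU performance: +{perf_gain:.0%}")   # ~ +11%
print(f"GPU power:       +{power_gain:.0%}")  # ~ +34% under these assumptions

Roughly a 10% performance bump for about a third more GPU power is exactly the kind of trade a late, reactionary clock bump would force.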
 
Knowing how they designed the last several consoles, I doubt they are that reactionary. They set their TDP/price targets and went with that.

Obviously, if your BC method requires specific design decisions it can introduce certain limitations, but all in all a relatively highly clocked 36 CU, 8-core Zen 2 console WILL push the envelope of standard console design. There is no reason to think such a design is Sony "slacking" if MS delivers a 12 TF one. By all accounts, a 12 TF console would only be possible if TDP, console form factor and price were pushed considerably higher than has been the case historically.

If Sony assumed MS would push hard as well, but at the expected price ($400-500) and console form factor, they must surely have thought the current PS5 design would do more than well enough.
 
I don't know how true it is, but there were reports that the PS3 had a GPU switch and 256MB of extra RAM added really late in development because of the Xbox 360's specs. I'd qualify those changes as reactionary if the reports are true.
 
I don't know how true it is, but there were reports that the PS3 had a GPU switch and 256MB of extra RAM added really late in development because of the Xbox 360's specs. I'd qualify those changes as reactionary if the reports are true.

There are some quotes in an IGN article. The PS3 was delayed a year, from 2005 to 2006.

For a while, [PS3 had] no GPU, it was going to run everything with SPUs. The ICE team proved to Japan that it was just impossible. It would be ridiculous. Performance-wise, it would be a disaster. That’s why they finally added the GPU, closer to the end.

https://uk.ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005
 
There are some quotes in an IGN article. The PS3 was delayed a year, from 2005 to 2006.



https://uk.ign.com/articles/2013/10/08/playstation-3-was-delayed-originally-planned-for-2005
Sony announced the partnership with NVIDIA in 2005, the year both MS and Sony revealed their consoles and showcased some tech demos and specs. Which means the change happened much earlier and the GPU was finalized by then. The delay was the result of other hardware parts not being ready, the Blu-ray drive in particular.
 
Sony announced the partnership with NVIDIA in 2005, the year both MS and Sony revealed their consoles and showcased some tech demos and specs. Which means the change happened much earlier and the GPU was finalized by then. The delay was the result of other hardware parts not being ready, the Blu-ray drive in particular.

They couldn't make/purchase enough blue laser diodes. That was once they were in production, wasn't it? Europe was pushed back to an '07 launch.
 
I don't know how true it is, but there were reports that the PS3 had a GPU switch and 256MB of extra RAM added really late in development because of the Xbox 360's specs. I'd qualify those changes as reactionary if the reports are true.
The GPU was not swapped because of the 360. RAM probably wasn't either, but you can potentially react by adding more RAM, or by using a faster clock than the one you intended if the clock you intended was conservative. The suggestion here is taking a GPU already clocked as fast as it can go and then pushing it beyond that to something totally inefficient.

Look at the Saturn for a disaster created in reaction. Reactionary hardware is not going to be good business. You'll be burdened in all sorts of ways and realistically gain nothing.
 
Didn't Sony intend for the PS3 not to have a GPU, with everything done by the CPU, or something like that? Somebody made the statement that not even kamikaze Japanese programmers would be able to work with that, and that the RSX was a late inclusion.
 
Didn't Sony intend for the PS3 not to have a GPU, with everything done by the CPU, or something like that? Somebody made the statement that not even kamikaze Japanese programmers would be able to work with that, and that the RSX was a late inclusion.
There was the possibility of just using Cell, of a Toshiba-developed GPU, and of NVIDIA. Sony had patents for a variation of Cell with GPU-style rasterising units for SPUs. Indeed, the idea of Cell as a heterogeneous architecture included different types of ancillary processors, not just SPUs. RSX was definitely a last-minute addition, but not in reaction to the 360 in some way.
 
I'm also convinced 9 TF isn't it. Everything we know about the Navi architecture and 7nm thermals tells us that 36 CUs at 2 GHz is ludicrous. If it's 36 CUs, it'll be clocked lower, so under 9 TF.


*cough* PS3 *cough*.

I've got a feeling these 36 CU @ 2.0 GHz figures come from 5700-based dev kits with cherry-picked chips that are overclocked. If you want 10+ TFLOP consoles, the only choice AMD can provide with Navi is Navi 10. These are dev kits, so Sony or MS can pay a premium for higher-binned chips with extra cooling to provide something that runs near 2.0 GHz at sustained rates.
 
I've got a feeling these 36 CU @ 2.0 GHz figures come from 5700-based dev kits with cherry-picked chips that are overclocked. If you want 10+ TFLOP consoles, the only choice AMD can provide with Navi is Navi 10. These are dev kits, so Sony or MS can pay a premium for higher-binned chips with extra cooling to provide something that runs near 2.0 GHz at sustained rates.
Why not pick the full 40 CU chip and clock it to 2 GHz; why go with the 5700 part? You can overclock every 5700 XT to 2.0 GHz rather easily and put them in dev kits.

I think it's more likely that Oberon is a custom chip based on Navi 10, like the PS4 and Pro were (HD 7870 and RX 480), and that the first early dev kits, which were reported as 13 TF, were packing Vega cards.

This is from the AMD repo (Ariel):

/proj/gfxip_vega20_pv_02/junyingz/Ariel_BC_ex/CL3134324_4/out/linux_2.6.32_64.VCS/ariel/config/gc/run/block/tgltp_tc4_tcp_hit_tfRGBA8888_tfilterBilin_noitrace @CYB

This is not only because the Oberon chip can be found in the AMD repo (and is a revision of Ariel), but also because Taiwanese insiders reported the Oberon chip going into the verification process now.

Also, since we know the "V"-shaped development kits were out by June this year, they must have had early chips ready by then.
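For what it's worth, the raw TFLOP arithmetic is consistent with both readings. A quick check using the same 64-shaders-per-CU helper as earlier in the thread (Vega 64's ~1.55 GHz boost clock is the published desktop figure):

def tflops(cus, clock_ghz):
    # FP32 throughput: CUs x 64 ALUs x 2 FLOPs per clock (FMA)
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"Cut-down Navi 10 (36 CUs @ 2.0 GHz):   {tflops(36, 2.0):.2f} TF")   # ~9.2 TF
print(f"Full Navi 10     (40 CUs @ 2.0 GHz):   {tflops(40, 2.0):.2f} TF")   # ~10.2 TF
print(f"Vega 64          (64 CUs @ ~1.55 GHz): {tflops(64, 1.55):.1f} TF")  # ~12.7 TF

So a Vega-based early kit reported at roughly 13 TF and a 36 CU Navi chip at ~9.2 TF aren't mutually exclusive data points.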
 
Why not pick the full 40 CU chip and clock it to 2 GHz; why go with the 5700 part? You can overclock every 5700 XT to 2.0 GHz rather easily and put them in dev kits.

I think it's more likely that Oberon is a custom chip based on Navi 10, like the PS4 and Pro were (HD 7870 and RX 480), and that the first early dev kits, which were reported as 13 TF, were packing Vega cards.

This is from the AMD repo (Ariel):



This is not only because the Oberon chip can be found in the AMD repo (and is a revision of Ariel), but also because Taiwanese insiders reported the Oberon chip going into the verification process now.

Also, since we know the "V"-shaped development kits were out by June this year, they must have had early chips ready by then.

Actually, I hadn't thought about that. Maybe there was a more ample supply of cut-down 5700s. The early production runs of 5700s may have had more issues with defective blocks than with the ability to overclock. What do you do with 5700s that can maintain stability at higher clocks but have defective blocks?

These are pre-launch dev kits. Given how Sony and MS tend to treat dev kits (there is a reason you can't readily buy dev kits online in volume), these may be short-lived dev kits that get swapped out as soon as newer versions are available. It could have been that AMD didn't want to limit the availability of its best chips to the gaming market just to accommodate throwaway dev kits.

Some early Durango dev kits had high-end NVIDIA 7-series cards in them. Who actually knows why? The only AMD cards that will give you something close to 500 GB/s of bandwidth are high-end Vegas and Navi 10. There are metrics other than TFLOPs that manufacturers have to contend with when producing dev kits.
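For context, the bandwidth of those parts works out from their published memory configurations (bus width x data rate / 8); a quick illustrative check:

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # peak theoretical memory bandwidth in GB/s
    return bus_width_bits / 8 * data_rate_gbps

print(f"Navi 10 / RX 5700 XT (256-bit GDDR6 @ 14 Gbps): {bandwidth_gb_s(256, 14.0):.0f} GB/s")   # 448 GB/s
print(f"Vega 64 (2048-bit HBM2 @ ~1.89 Gbps):           {bandwidth_gb_s(2048, 1.89):.0f} GB/s")  # ~484 GB/s

Both sit in the mid-400s, which is the closest you can get to ~500 GB/s from a single off-the-shelf AMD card.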

Trying to parse out relevant information for early hardware kits is like trying to read tea leaves.

Who had early chips? Devs? Maybe internal or trusted devs who are part of the verification/validation process, but regular third-party devs probably not. If you are in the middle of verification and validation of the hardware, you are not providing that same hardware to devs for development purposes.
 