Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
If I had a source I wouldn't have used "IIRC", I would have just posted the source. At that point there would be no question about recalling something correctly. :)

We do know that early devkits were basically just PCs, however.


[edit]
https://www.gamespot.com/forums/sys...hat-is-inside-the-lastest-ps4-dev-k-29343809/

That also shows the PS4 devkit with a more powerful CPU than what the SOC ended up with.

Regards,
SB
From memory, wasn't it a totally different CPU with 4 cores instead of 8?
 
I don't. Why would AMD wait 1 year before releasing their PC GPU with RT if they had it in June 2019?

Fair enough, but I don't see why it would be shocking that Sony gets features before dGPUs, given the features in PS4 (extra ACEs) and Pro (half-floats).
 
I guess he can't but he did say Control on PC with RTX didn't come close. That at least suggests hardware RT rather than software?
That's unfortunately a weaker data point, despite sounding correct. Quake II is fully ray traced and doesn't hold a candle to, say, Death Stranding. So it really comes down to understanding what is being ray traced and what is not.

Most people haven't a clue how and where to identify RT. Thankfully @Dictator continues to post videos explaining what the different RT levels do. But I can't imagine Klee understanding these differences at a quick glance.
 
and the recent pics of the PS5 kit could be close to near final there. That’s not an alpha box.

Yes, I don't see how the current devkits aren't very close to final at the moment.

I don't. Why would AMD wait 1 year before releasing their PC GPU with RT if they had it in June 2019?

So you think AMD goes straight to retail with their new tech?

I would be very surprised if AMD didn't have RT capable hardware right now and possibly even in 2018.
 
Is HW RT in consoles any different to unified shaders? Wasn't that end of 2005 for XB360, with R600 releasing mid 2007?

Yes. Their main PC GPU competition did not have Unified Shaders out then. Now, AMD's main competitor has had RT out in the PC world for over a year, maybe longer. If they had it in June 2019 they'd have released it in PC-land already.
 
Hmmm, I'm not convinced. I don't think RT in the RTX cards is a huge selling point for gamers, going by chatter from RTX owners who prefer to disable RT for performance. It hasn't really taken the world by storm, with only a handful of games having RT features. I don't know why AMD would sit on the tech, but then I don't know why they sat on Unified Shaders either, and they did, with nVidia getting the first US architecture to market six months earlier. And it's not like they're missing out, what with every game having RT and every gamer wanting an RTX card to enable it.
 
IIRC, it was because unified shaders on PC were part of DX10 and Xenos wasn't fully compliant with all of the features of that spec, so the PC part required additional development. Basically, by pulling the US tech forward in the state that it was and dedicating so much of their resources to developing it into what became Xenos, ATI ended up behind in developing their PC solution.

Which, in context of this discussion, may in fact be exactly what happened here with RT support.

Edit: Turns out B3D is an excellent resource for digging up this type of information. :D

https://forum.beyond3d.com/threads/bringing-directx-10-to-xbox-360.30472/
 
You think this drive will be in the PS5?

While possible, I'm doubtful. The discs that are used in Sony's Optical Disc Archives aren't compatible with Blu-ray. Of course, Sony has no obligation to support Blu-ray playback on the PS5, but I'm not sure if that's a selling point they'd want to get rid of yet.

Since the discs aren't compatible with Blu-ray, the economies of scale are worse, and each disc in a cartridge (11 per cartridge) is likely more expensive to manufacture than a standard Blu-ray disc.

1st gen used discs that were slightly larger than 25 GB per disc (unless they used 12 discs per cartridge back then?), 2nd gen used discs that were 300 GB each, and now 3rd gen uses discs that are 500 GB per disc.
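Assuming 11 discs per cartridge across all generations (the post itself notes gen 1 may have used 12), the rough per-cartridge capacities work out like this. A back-of-the-envelope sketch only, using the per-disc figures quoted above:

```python
# Rough ODA cartridge capacities, assuming 11 discs per cartridge
# for every generation (the gen-1 disc count is uncertain).
DISCS_PER_CARTRIDGE = 11

per_disc_gb = {"gen 1": 25, "gen 2": 300, "gen 3": 500}  # approximate

for gen, gb in per_disc_gb.items():
    total = gb * DISCS_PER_CARTRIDGE
    print(f"{gen}: {total} GB (~{total / 1000:.1f} TB) per cartridge")
```

Note that 11 x 25 GB is only 275 GB, which fits the "slightly larger than 25 GB per disc" caveat above if gen-1 cartridges were actually 300 GB.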

It's a similar case to the discussion on ReRAM, albeit this is something that would be far more likely to be used than ReRAM as there are existing facilities making the optical discs for the enterprise sector.

Also, read speed may or may not be an issue, as the claimed 375 MB/s per cartridge is likely achieved when reading from all 11 discs at the same time.
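If that 375 MB/s figure really is an aggregate across all 11 discs, the per-disc rate is fairly modest. A quick sanity check, using the numbers from the post rather than any official spec:

```python
# Per-disc read rate if the cartridge's 375 MB/s is the aggregate of
# 11 discs read in parallel (an assumption, not an official spec).
cartridge_mb_s = 375
discs = 11

per_disc = cartridge_mb_s / discs
print(f"~{per_disc:.1f} MB/s per disc")

# For comparison: 1x Blu-ray read speed is about 4.5 MB/s (36 Mbit/s),
# so this per-disc rate is in the ballpark of a 7-8x BD drive.
print(f"~{per_disc / 4.5:.1f}x Blu-ray 1x speed")
```

So a single-disc read from such a drive would be nowhere near the headline cartridge figure, which matters if a console were only spinning one disc at a time.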

Granted this is information from Wikipedia (https://en.wikipedia.org/wiki/Optical_Disc_Archive ) which doesn't cite many sources, so it's quite possible it isn't entirely correct.

But if it is correct, then cost may be an issue. And depending on how different it is from Blu-ray, it may need a dual laser diode drive in order to also support Blu-ray playback, assuming Sony doesn't just decide that the PS5 won't be used for optical media playback.

Regards,
SB
 
Of course not, that's why I posted it in the baseless thread.:LOL:

There is a 10+ year long-running discussion about this somewhere here....

I'm surprised it wasn't posted in there...
 