Baseless Next Generation Rumors with no Technical Merits [pre E3 2019] *spawn*

So ... based on that flow chart, do you think it would be possible to mix and match different types of intersection test? E.g. use voxel (terrain) and tri (characters)? Err ... put both in the same acceleration structure and enumerate either type?
The main intent for DXR, as written by MS, is to keep it as flexible as possible, in line with the way GPUs are moving toward compute and further away from fixed function. For that reason DXR doesn't actually create a 'ray tracing' object; the acceleration structures are generic objects and they can be used however desired, whether for AI vision, sound, graphics, etc. I think it's entirely possible that in the future the vendors identify a way for developers to create their own intersection tests that would be as fast as fixed function. How that's accomplished is beyond me, but I suspect this is the evolutionary path.
tl;dr: yes. With your own intersection tests you can support non-triangle objects, but it will likely be very slow. Developers are also allowed to write their own intersection shaders alongside the built-in triangle test.
I believe the vendors will figure out a way to make programmable intersection tests just as efficient as fixed function, but that may take some time to happen.
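To make the mix-and-match idea above concrete: in DXR, triangle meshes and procedural primitives (declared only by their bounding boxes, with the actual intersection test supplied by your own intersection shader) are both described with the same D3D12_RAYTRACING_GEOMETRY_DESC structure, so one ray dispatch can hit voxel terrain and triangle characters alike. A rough host-side sketch; the function and parameter names are placeholders and the acceleration-structure build calls themselves are omitted:

```cpp
#include <d3d12.h>

// Sketch: describe one triangle mesh and one buffer of procedural AABBs
// (e.g. voxel bricks). Each desc is typically built into its own bottom-level
// acceleration structure, and both can then be instanced into the same
// top-level structure that the rays traverse.
void DescribeGeometries(
    ID3D12Resource* vertexBuffer, UINT vertexCount, UINT vertexStride,
    ID3D12Resource* aabbBuffer, UINT64 aabbCount,
    D3D12_RAYTRACING_GEOMETRY_DESC& triangles,
    D3D12_RAYTRACING_GEOMETRY_DESC& procedural)
{
    // Triangle geometry: intersections come from the built-in triangle test
    // (hardware-accelerated where the GPU supports it).
    triangles = {};
    triangles.Type = D3D12_RAYTRACING_GEOMETRY_TYPE_TRIANGLES;
    triangles.Flags = D3D12_RAYTRACING_GEOMETRY_FLAG_OPAQUE;
    triangles.Triangles.VertexFormat = DXGI_FORMAT_R32G32B32_FLOAT;
    triangles.Triangles.VertexCount = vertexCount;
    triangles.Triangles.VertexBuffer.StartAddress = vertexBuffer->GetGPUVirtualAddress();
    triangles.Triangles.VertexBuffer.StrideInBytes = vertexStride;

    // Procedural geometry: only bounding boxes are declared here; whether a ray
    // actually hits anything inside a box is decided by the intersection shader
    // bound in that geometry's hit group.
    procedural = {};
    procedural.Type = D3D12_RAYTRACING_GEOMETRY_TYPE_PROCEDURAL_PRIMITIVE_AABBS;
    procedural.AABBs.AABBCount = aabbCount;
    procedural.AABBs.AABBs.StartAddress = aabbBuffer->GetGPUVirtualAddress();
    procedural.AABBs.AABBs.StrideInBytes = sizeof(D3D12_RAYTRACING_AABB);
}
```

The catch, as noted above, is cost: rays that enter AABB geometry run your intersection shader as regular shader code, while triangle geometry can use the dedicated hardware path on current RTX parts.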

Hmm RTX 2060 is 6.5 TF. Lockhart is rumoured to be ~ 4TF, which on the surface of it doesn't look great. Then again, would Nvidia even try selling RT to people who didn't mind gaming at 1080p/30 fps? There's probably a point where they want the minimum die area and the maximum incentive to go up to a higher margin product like the 2060.
The feature set needs to be on both; it's unlikely they'll restrict it to just one. And knowing PS5 will have it, I can't see them impeding progress for their Windows strategy. DXR is still very much tied to DX12, and DX12 to Windows 10. It also makes streaming more appealing to players if the games they are streaming are ray traced; if you don't have the latest hardware, streaming suddenly becomes very attractive. They should align with Sony on this and have RT as a baseline. I think this is a good business decision, and Sony signalling their ray tracing was an effective way to say, hey, 'we are doing this, so let's align'. Or, you know, they've been talking about alignment of features in confidential backroom talks.

That's my thought too.

As an aside, I wonder if Nvidia tried to use hardware RT acceleration to flog Turing to either Sony or MS? That might have put some pressure on AMD, as might favourable moves toward MS for an evolution of DX. *shrug*

Nvidia had to have been working on this since 2016, at least.
I don't know nvidia well, but being first to feature sets has served them well many times over. They are making big bank in the AI industry, and have found ways to bring tensor technology to the consumer space. That is interesting, to say the least. I'm not sure if that put pressure on AMD. I think MS is happy nvidia has moved quickly, because they need developers to start coding for RT today to be ready for the console space in two years' time.

I've been wondering if the strain of building console processors has pushed Navi back. Even without RT, having to change the Navi design to allow for the possibility of compatibility with PS4 and X1 might have been more than AMD were comfortable with. With the Pro, X1X, and now PS5 and Xbox Next(s), they've had a lot of custom work on their hands that requires ongoing compatibility.
Hmm. With how we see MS doing their BC two generations back, I don't think any design would have caused them any serious issue, though it may aggrieve Sony a touch more. Both teams are extremely talented, and often I would look at 'feature' issues as more of a business problem than a technical one.
 
Redgamingtech's latest rumour "confirms" a custom embedded SSD solution paired with a 2TB HDD.
8c16t 3.2ghz
24GB RAM
Navi with 56 CU at 1.8 GHz = 12.9 TF; see the quick check below. (I wonder if it's indeed on 7nm, or even if this is the final version, since TSMC says its 6nm process is 100% compatible, without the need to change anything on the client side.)
RT is hardware based.
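For reference, that TF figure checks out under the usual assumption of 64 shader lanes per CU and 2 FLOPs per lane per clock (FMA):

56 CU × 64 lanes × 2 FLOPs/clock × 1.8 GHz ≈ 12,902 GFLOPS ≈ 12.9 TF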

I have some ideas of the possible combinations of hardware inside both the PS5 and the Xbox.

Could be:
Single CPU chiplet of 8 cores/16T: lower inter-core latency, lower power consumption, smaller area, smaller cooler, smaller overall console, more design possibilities.
Two CPU chiplets of 4 cores/8T: depending on the percentage of good/bad yields, it could be a cheaper solution, could allow better heat management and maybe a cheaper cooler design, and would allow a "Pro" version with the exact same design using two 8c chiplets.

I wonder if 3.2 GHz will be the max clock with all 8 cores active, and what clocks could be achieved with fewer cores in use.
 
The main intent for DXR as written by MS is to keep it as flexible as possible… DXR doesn't actually create a 'ray tracing' object; the acceleration structures are generic and can be used however desired, for AI vision, sound, graphics, etc.

Thanks! This all gets the imagination flowing. A firefighting game that calculated radiated heat, fire propagation and pushed you to find ways to survive and rescue people. Or a super realistic suntanning simulator for DoAX. :D


The feature set needs to be on both; it's unlikely they'll restrict it to just one… They should align with Sony on this and have RT as a baseline.

Yeah, completely agree the feature set needs to be the same for both. I've thought that since the first rumours of a dual SKU. I was pondering what may be different between where Nvidia draw the line with HW RT (SW RT is practically too slow on a 1660 for anything) and where MS include it (be it HW or SW) and actively promote it in a theoretical 4TF Lockhart.

It needs to be there for the audio and any gameplay ramifications as much as for simply how nice things might be able to look.

I don't know nvidia well, but being first to feature sets has served them well many times over… I think MS is happy nvidia has moved quickly, because they need developers to start coding for RT today to be ready for the console space in two years' time.

Yeah that's a good point. If MS want meaningful console support then the sooner developers are finding their feet - even if only on nvidia hardware atm - the better.

Hmm. With how we see MS doing their BC two generations back, I don't think any design would have caused them any serious issue…

I just remember reading about changes to later GCN that mean it's not binary compatible with Sea Islands. Didn't Cerny mention something about some of the PS4 tech still being in there along with new stuff to ensure BC, or did I imagine that? :-?

I'll check later. I'm on the worstest netbook ever at the moment, and it's dying.
 
Is their source any different to the existing sources? Nine times out of ten, new rumour articles are regurgitating old rumours.

The guess is that it is based on a deleted twitter posting [0][1] which some believe (myself included) was itself just based on the previously discussed reddit posting, and that the tweet wasn't intended as factual.

[0] [1] The URL of the deleted tweet was: https://twitter.com/user/status/1123316987067797510


EDIT: Looking at the actual video, it's directly based on the reddit post.
 
Isn't Benji a fairly trustworthy, verified Era fella? Why would he regurgitate others' rumours when he has his own sources? Dude's got his own reputation to consider, after all.
Edit: Ah yes, he later tweeted to ignore that tweet and consider it nothing worth focusing on. Could be just covering his own ass, or he was simply regurgitating others.
 
Isn't Benji a fairly trustworthy, verified Era fella? Why would he regurgitate others' rumours when he has his own sources? Dude's got his own reputation to consider, after all.
I don't know him, but I don't think he would lie or purposely mislead. I don't know if that means he's correct though. It really depends on the state of those DevKits. If they are early as in not close to the final release SDK, then a lot of things can still be subject to change.

For instance, the early SDKs are PCs in a box, for both Sony and MS. The specs in there can be vastly different from what ends up in our hands. Even the final release development console is still more powerful than the consumer unit in many regards.

So I think he's got the right numbers, but I wouldn't assume that directly carries over to the final box either.

You can follow along with how Sony did it with PS4:
https://ca.ign.com/articles/2012/11/01/report-ps4-dev-kits-surface-details-inside
Devkit 1: just the GPU
Devkit 2: Modified PC
Devkit 3: Another modification on the PC?
Devkit 4: Console final devkit

I believe with PS5 we are at modified PC; so either 2 or 3. If this is the first time we are getting specs, I think we are at 2, simply because this is the first we're reading about them.
You can sort of read between the lines when you follow along with the specs for the earlier PS4 devkits and for Xbox. Mention of A10 APUs (which are Piledriver), but they launched as Jaguar. 16GB of RAM became 8 (it was apparently actually 4 at one point in time). Xbox had an Intel processor and an nvidia GPU. That's so far away from the final product it's laughable.
 
I wouldn't bother reading too much into early dev kits. The most important thing for those is approximating architectural features to get engines up and running with some modicum of console optimization, if the APIs warrant it.

We may be converging upon feature sets these days, but we've had at least 2 generations where alpha kits barely resembled the final product in performance. They're just supercomputer parts thrown together for compiling. Right now it wouldn't be a stretch to throw in some crazy Zen part with some crazy SSG Radeon just to get started - huge compute performance overhead is needed for RTRT (as demonstrated with Pascal, and devs can get started with DXR), while Radeon Pro SSG features a unique memory hierarchy closely tying an SSD to the GPU.

Beta kits will have early runs of the designed silicon.


No “Beef hor-fun” allowed. :devilish: /fires a shot across @iroboto ’s starboard bow.
 
I believe with PS5 we are at modified PC; so either 2 or 3.

The first PS5 SDK was rumored to have shipped around March 2017, the second iteration between Jan-Mar of 2018, and the third iteration around late February of this year. So, PS5 games have been in development for at least 2 years now. The final silicon, or kit, will probably hit around late September of this year (two months after the PC market receives the Radeon RX 3060, 3070 and 3080 in July, hopefully), but that's just me guessing.
 
The first PS5 SDK was rumored to have shipped around March 2017, the second iteration between Jan-Mar of 2018, and the third iteration around late February of this year…

That's what I thought, and the way I see it the latest dev kits are starting to use final hardware, which is the reason for the Wired article. There was the rumor about the latest dev kit using the GPU, and we know it's got the storage solution as well.
 
So, PS5 games have been in development for at least 2 years now. The final silicon, or kit, will probably hit around late September of this year (two months after the PC market receives the Radeon RX 3060, 3070 and 3080 in July, hopefully), but that's just me guessing.
Just in time for an April 2020 launch. :runaway:
 
The first PS5 SDK was rumored to have shipped around March 2017, the second iteration between Jan-Mar of 2018, and the third iteration around late February of this year…
Those are fairly reasonable timelines. I don't think they'll launch right away just because the hardware is ready; it seems ideal to time it so that your software launches, service launches, OS and online infrastructure are ready, etc. But they've done their due diligence in keeping a large, flexible launch window while waiting for everything else to come together, and in doing so they have some strategic leverage to work with on their launch timing once they know what MS is doing.
 
If 13 TF and 24 GB of GDDR6 RAM turn out to be true, I honestly have no idea what kind of graphics first party would pump out when targeting 4K checkerboard. I hope we get at least Deep Down level of fidelity with all the fluid dynamics and voxel GI enabled.
 
If 13 TF and 24 GB of GDDR6 RAM turn out to be true, I honestly have no idea what kind of graphics first party would pump out when targeting 4K checkerboard. I hope we get at least Deep Down level of fidelity with all the fluid dynamics and voxel GI enabled.
Yeah, I'd rather they focus on pixel quality than marketing checkboxes.
 
Wouldn't 24GB of GDDR6 require a 384-bit memory bus? That sounds unlikely for a console, but if that happens, awesome.
 
If 13 TF and 24 GB of GDDR6 RAM turn out to be true, I honestly have no idea what kind of graphics first party would pump out when targeting 4K checkerboard. I hope we get at least Deep Down level of fidelity with all the fluid dynamics and voxel GI enabled.
To put things into perspective - this is from a finalized Scorpio devkit:

It contains four more compute units, an additional 12GB of GDDR5 memory (for a total of 24GB of RAM), and an extra 1TB SSD. Compare that to the specs of the retail version of Scorpio, which has only 12GB of GDDR5 RAM and a 1TB drive.
 
Wouldn't 24GB of GDDR6 require a 384-bit memory bus? That sounds unlikely for a console, but if that happens, awesome.

Yes, unless they use a setup with split memory pools (potentially tied together by HBCC), like the HBM2+DDR4 post from reddit, or they use clamshell mode with a 192-bit bus (which wouldn't make sense from a bandwidth perspective).
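The arithmetic, assuming 16 Gb (2 GB) GDDR6 devices with a 32-bit interface each: 24 GB ÷ 2 GB per chip = 12 chips, and 12 × 32-bit = a 384-bit bus in the normal configuration. In clamshell mode two chips share one 32-bit channel, so the same 12 chips need only 6 channels = 192-bit, which halves the bandwidth at a given data rate.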
 