Baseless Next Generation Rumors with no Technical Merits [pre E3 2019] *spawn*

AMD are inevitably working on hardware RT for their own cards (even if nothing points to that)
1) PowerVR had RTRT units. 2) nVidia has RTRT units. 3) MS have been working on DXR for a while and have obviously been in talks with both AMD and nVidia about it. 4) AMD has been working on accelerated RT for the professional market for years and will be aware of the need to make it faster. 5) The professional imaging market for raytracing is potentially huge, and if your solution is half the speed of your rival's, you'll miss all of it.

Unless you think AMD have been completely blindsided by the way the market is shaping, it's obvious they have people considering the problem. Heck, if Sony were aware of the need for HWRT acceleration when they went to AMD to set up this Navi partnership, how on earth could AMD not be aware of it???

Sony (2016): Also, we want some hardware acceleration for raytracing to make it a useable feature at realtime speeds.

AMD: Ha ha ha! Ray tracing? You're out of your tree, mate. No-one cares about fast raytracing. Whatever, sure, tree traversal hardware with intersect tests. Happy to take your money.​

Or is this RT hardware something started late last year after RTX appeared?

In the same way because the ID buffer is a modified version of....?
Whatever GCN version forms the basis of PS4Pro. AMD had an architecture, Sony said, "can we have that with an ID buffer please?", and AMD implemented a hardware ID buffer in the silicon, determining the best way to route it, as is the basis of all the customisations which, to date, we've called 'customised' and not 'co-engineered' projects.

I admit, that is speculation on my part. It could be that Sony took AMD's blueprints for their GCN GPU, worked out all the transistors and routing needed to implement the hardware ID buffer, drew up their own blueprints with the changes and got the chip made. I guess it's also possible that Sony engineers sat down and designed a hardware-accelerated raytracing unit of some sort, then presented it to AMD saying, "can you bolt this onto the Navi chip?" I'm not sure which engineers, mind. AFAIK the only chip designers Sony have work on their sensors.
 
Shifty's counter-arguments being that it's not possible that AMD could have agreed to make RT hardware only for Sony because:

- Exclusive GPU hardware customizations would break the rules of the GPU laws
- AMD are inevitably working on hardware RT for their own cards (even if nothing points to that, there are no rumours about it, and Microsoft are only talking about a pure software RT solution for their next machine for now)
- Sony would be unable to co-design anything so fancy because Cell sucked (and Cell was a long time ago anyway) and they never ever co-engineered anything valuable on PS4 hardware: the ID buffer was a modification of existing AMD hardware... what?
SONY & AMD co-design PS5 RTRT hardware and AMD develops its own RTRT solution simultaneously. I can't figure out what laws would be broken.

I remember that AMD designed exclusive GPUs for Nintendo's GameCube and MS's Xbox 360. Why can't AMD do exclusive GPU features again?


And why is it so surprising that SONY may co-design RT hardware with AMD? If SONY had designed the whole GPU, then that would be really surprising.
 
There's been no reason to crossover until very recently, though. Real-time raytracing is only months old in the consumer's hands.

True, but even with raytracing hardware the demands of games vs movies/animations will be different and, I think, will tend to attract different types of people, as with my old chums. Hopefully there will be sharing of techniques and approaches though. Better performance, and increasing the freedom of animators to work in shorter time-scales free of large companies, will be good.

That's not a fair statement at all.
Sony worked a hell of a lot on the PS3's hardware, and Cell ended up being extensively used as a pixel shader co-processor for the console, effectively contributing to its image output pipeline. Not to mention the fact that the nvidia GPU in the PS3 was reportedly a late inclusion to the system.

It's completely fair, and true. I was talking about the GPUs. Sony haven't substantially engineered a console GPU in nearly 20 years. And they haven't needed to.

How Cell ended up being used was down to developers and not due to Sony's GPU engineering prowess, which is the focus of this particular conversation (or at least I understood it to be that).

Sony understanding what's going to be available, what developers will need, and requesting changes to a chip that someone else will engineer is all Sony need.

They did a better job?
AFAICS, Microsoft launched a higher-performance console that came at a time when 16FF+ was significantly more mature, got 50% more RAM and cost substantially more to make because every single system needs to be fine-tuned to achieve those clocks.
It was a different job, not a better one.

I said "MS did a better job of getting more performance out of GCN with the X1 (though time and price were definitely on their side)."

You object to this, then basically go on to admit that MS got more out of GCN, but they had time and price on their side. :confused:

I said this in the context of a discussion in which Sony's graphics expertise is supposed to allow them to turbo-charge AMD's designs to a degree that AMD couldn't manage. Turns out that - so far - Sony's ability to turbo-charge the GPU designs they've bought in amounts to them requesting AMD engineer some nice but not game-changing modifications. Turns out they haven't been the only people to do this, and at least one other party has got more out of the base architecture (though even they had time and money on their side, as I stated).

I'm sure if there are some changes that Sony think will enhance ray tracing ability they'll ask for them. But something radically different from core Navi would be a risk for Sony and AMD, IMO. And AMD are barely able to make a single core architecture - Navi is late, Vega had limited success and never had one of its touted features enabled, and Polaris is ancient.

It would really suck if AMD had developed a HW ray tracing Navi derivative for Sony (or MS, or anyone else) at the expense of being able to use it in a product themselves.:cry:
 
SONY & AMD co-design PS5 RTRT hardware and AMD develops its own RTRT solution simultaneously. I can't figure out what laws would be broken.

Well, I'm not sure AMD have those kinds of resources. It would also be a lot of redundant work. AMD would have told Sony what they're working on, and I don't think Sony would pay for them to do something they're already doing. I can see them asking AMD to take their architecture in a modified direction given how big a customer they are, and I can see AMD agreeing if the gain is big enough, but such different, divergent paths don't seem sensible.

I remember that AMD designed exclusive GPUs for Nintendo's GameCube and MS's Xbox 360. Why can't AMD do exclusive GPU features again?

ArtX made the GC GPU. They designed it and then were looking for a customer, which turned out to be Nintendo. Then ATI bought them. Then AMD bought them. It's not really like AMD spun off an entire design team to do a new GPU. The 360 was also based on an abandoned PC architecture (R500?) that MS were forward-thinking enough to see the merit in. It wasn't a fully bespoke architecture though. Things have only got more complex and expensive since then.

I think AMD can do exclusive features, but the more engineering work they need, the bigger the cost and the bigger the impact on whatever else AMD are working on. You can't rule out a radically different version of Navi with custom Sony RT hardware, but I think it's a big ask. Sony asking for some level of customisation seems likely though, and it's also possible that three years ago (or whenever) Sony talked with AMD about the direction of GCN and what they wanted to see.

I just think that the more complex and core the feature or functionality, the less likely it'll be exclusive. I mean, MS work with AMD too and have done for a long time now (14 years and counting), and they've been one of the prime movers in pushing ray tracing into the consumer space. But I don't see them having some kind of unique hardware ray tracing variant of Navi.

I think it's most likely that if anyone has it, everyone has it. But, you know, I live to be proven wrong. :LOL:
 
On the subject of Sony working more hands-on with its APU, then yes, I won't be surprised if they do this time. Hell, there is this post by 3dilettante in the Navi thread (unashamedly taking the abridged version from VX1 from Era):
AMD shifting its encodings to more closely align with Sea Islands would tend to help both console vendors as far as backwards compatibility with any low-level GPU code without retranslation or bespoke silicon goes.
Moving the opcodes around is something GPU makers have been rather free to do, although I am curious why GFX8 and GFX9 moved so many operations as much as they did when it was possible that something in the future might want to be compatible with a Sea Islands console.

What that might mean for the PS4 Pro, given that it added rapid packed math, is unclear. I wouldn't know where those instructions fit in the opcode space relative to the GFX 8/9 instructions, or the still-incoming list of GFX10 opcodes.
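To illustrate why encoding alignment matters (a hypothetical sketch; the opcode values below are made up and are not AMD's actual ISA encodings): if the new GPU decodes the Sea Islands encoding directly, shipped shader binaries run as-is, while diverged encodings force backwards compatibility through a retranslation pass along these lines, or through bespoke silicon.

```cpp
#include <cstdint>
#include <stdexcept>
#include <unordered_map>

// Made-up opcode values, for illustration only.
enum class Gfx7Op : uint16_t { V_ADD_F32 = 0x03, V_MUL_F32 = 0x08 };
enum class NewOp  : uint16_t { V_ADD_F32 = 0x11, V_MUL_F32 = 0x04 };

// If the encodings diverge, every old instruction must be remapped before
// execution; if they align, this whole pass disappears.
NewOp translate(Gfx7Op op) {
    static const std::unordered_map<uint16_t, NewOp> remap = {
        {uint16_t(Gfx7Op::V_ADD_F32), NewOp::V_ADD_F32},
        {uint16_t(Gfx7Op::V_MUL_F32), NewOp::V_MUL_F32},
    };
    auto it = remap.find(uint16_t(op));
    if (it == remap.end()) throw std::runtime_error("unmapped opcode");
    return it->second;
}
```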

I am unclear about the "unashamedly taking the abridged version from VX1 from Era" part; is this a reference to a different post? Is there a link?


In the same way because the ID buffer is a modified version of....?
I think the ID buffer is a modified copy of the depth portion in the ROP export path. There may be some logic similar at a high level to how the binning rasterizer records primitive IDs for what is held in a tile, although such a solution could have been done independently. The ID buffer takes primitive ID information and passes the values in parallel with the associated Z buffer writes.
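To make that concrete, a conceptual sketch (my own model of the description above, not documented PS4 Pro hardware): the primitive ID travels alongside the depth value and gets written in lockstep with the Z buffer, so later passes have per-pixel primitive identity to work with.

```cpp
#include <cstdint>
#include <vector>

// One fragment headed down the ROP export path: depth plus the primitive ID
// carried alongside it, per the rumoured design.
struct RopExport {
    float    depth;
    uint32_t primitiveId;
};

struct RenderTargets {
    std::vector<float>    zBuffer;
    std::vector<uint32_t> idBuffer;  // same dimensions as the Z buffer
};

// Depth test and, on success, parallel write of depth and primitive ID.
void exportFragment(RenderTargets& rt, size_t pixel, const RopExport& frag) {
    if (frag.depth < rt.zBuffer[pixel]) {       // assume a less-than depth test
        rt.zBuffer[pixel]  = frag.depth;
        rt.idBuffer[pixel] = frag.primitiveId;  // written in lockstep with Z
    }
}
```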
 
They did a better job?
AFAICS, Microsoft launched a higher-performance console that came at a time when 16FF+ was significantly more mature, got 50% more RAM and cost substantially more to make because every single system needs to be fine-tuned to achieve those clocks.
It was a different job, not a better one.
I think the way to compare the consoles is relatively how well they hit their desired objectives. MS sold the Xbox One X as a 4K Xbox One machine, and a majority of X1X-enhanced titles met that requirement, or came very close to it with FauxK.

You'll have to ask yourself if 4Pro hit the mark of being that 4K machine Sony promised. Very few developers opted to leverage checkerboarding *in the grand scheme of titles released* and fewer managed to hit the FauxK mark. More than a handful were steadfastly held at 1080p for some reason, even though the expectation was that these 4Pro titles should be hitting 1440p.
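For context on why checkerboarding was the intended route to FauxK, some back-of-envelope arithmetic (my numbers, not anything from Sony): a 2160p checkerboard frame shades half the samples of native 4K, roughly native-1440p cost, and reconstructs the rest, which is where the ID buffer was meant to help.

```cpp
#include <cstdio>

int main() {
    const long native4K   = 3840L * 2160;  // 8,294,400 shaded pixels
    const long checker4K  = native4K / 2;  // 4,147,200 shaded samples per frame
    const long native1440 = 2560L * 1440;  // 3,686,400 shaded pixels
    std::printf("native 4K:       %ld\n", native4K);
    std::printf("checkerboard 4K: %ld\n", checker4K);
    std::printf("native 1440p:    %ld\n", native1440);
    return 0;
}
```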

I don't own a 4Pro so I won't judge it, but for what was sold to me on the X1X and how it performs, I believe they hit the mark very well.

That is, SONY has a better RTRT solution than AMD's, therefore SONY has chosen its own RTRT hardware for the PS5 rather than using AMD's existing technique. Besides, SONY's solution may even be developed faster than AMD's. While NVidia already released RTRT GPUs last year, AMD's GPUs still lack RTRT hardware, which could be a sign of AMD's slower development. Therefore SONY had to choose its own solution to ensure a 2020 release of the PS5 console with powerful ray-tracing hardware.
So one of the challenges with this argument is that the idea of RT being designed without AMD, but on AMD hardware, seems very improbable. At the end of the day the GPU is AMD's, so Sony must still work within the confines of the AMD architecture, just as AMD itself must. If AMD can't figure out how to do ray tracing in their own hardware, I'm doubtful Sony would do a better job of designing AMD's own hardware for them.

So the likely case in point is that AMD has their own RT solution and Sony can customize it.

- AMD are inevitably working on hardware RT for their own cards (even if nothing points to that, there are no rumours about it, and Microsoft are only talking about a pure software RT solution for their next machine for now)
AMD is working on RT. That's confirmed; Lisa Su has mentioned this as well, if you're unhappy with members confirming it.
The rumour that MS is software-only needs to exit the discussion. That's a complete misunderstanding of what DirectX Raytracing is. DXR is an API, and it can be enabled on any DX12 GPU; the drivers determine how the API functions, which means whether or not there is hardware acceleration depends on the hardware and drivers, not on DirectX Raytracing.
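A minimal sketch of that point; the capability query below is real D3D12/DXR (error handling trimmed). The same DXR API surface is exposed either way, and the reported tier tells you whether the driver and hardware support it.

```cpp
#include <d3d12.h>

// Returns true if the runtime/driver expose DXR for this device.
bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    // D3D12_RAYTRACING_TIER_NOT_SUPPORTED means no DXR; TIER_1_0 and above
    // mean the driver exposes it. Whether that's dedicated RT units or a
    // compute-based path is the vendor's business, not the API's.
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```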

And why is it so surprising that SONY may co-design RT hardware with AMD? If SONY had designed the whole GPU, then that would be really surprising.
So, this is more plausible than your above point. It will come down to a variety of things, like $$$ and time. Co-designing is a word I would take to mean much more than 'customizing'.
For instance, Apple has a license to customize ARM chips, and they super-customize their ARM chips for their purposes. That's already a massive undertaking.
I would interpret co-designing as a level above that, as in AMD requires the man-power from Sony and they work together to design stuff from scratch.
If Sony does co-design this RT solution, it's only for Sony; I guess it's possible it can be back-ported to AMD at a later time. But it's going to be $$$, and according to the rumours it was AMD that was providing the additional engineers for this stuff, not Sony...
so yea, I mean Sony can do whatever they want, it just costs a lot of $$$ and time, and that won't be good if they are looking to keep the console at a specific price point.
 
Some part of this discussion is probably highly influenced by how people are reading the rumour. To me, 'co-engineered' is a very specific and strong term, whereas others might be reading that as nothing more than 'customised' like any other console part. For clarity, people probably need to say what exactly they think Sony are providing/able to help with, ranging from feature requests to designing hardware solutions to actually laying out the silicon design.
 
I feel sad that PowerVR lost out almost everywhere. Their realtime raytracing stuff intended for low power devices looked cool.

I wish PVR had found its way into a console. Or some high end arcade machine. Or ... something high end.
 
Yes, I always wondered why Intel didn't buy Imagination in 2017. I mean, they hired Raja Koduri only slightly later, so I would assume they had the idea to enter the discrete GPU market before that. ~$800 million should be pocket money for Intel and, compared to the failed stuff they bought for more money like McAfee (~$8 billion), it would have been a good investment, considering that besides tile-based rendering Imagination probably has other nice GPU-related patents.

Intel Kyro would have been a nice name as well.
 
Some part of this discussion is probably highly influenced by how people are reading the rumour. To me, 'co-engineered' is a very specific and strong term, whereas others might be reading that as nothing more than 'customised' like any other console part. For clarity, people probably need to say what exactly they think Sony are providing/able to help with, ranging from feature requests to designing hardware solutions to actually laying out the silicon design.

Honestly, we don't even know anything; it could be that the PS5 doesn't have hardware RT as in RTX at all, but a software-only solution. Mark Cerny mentioned 8K support in the same vein.
 
Oh this again..
I said this in the context of a discussion in which Sony graphics expertise is supposed to allow them turbo charge AMDs designs to a degree that AMD couldn't manage.
Where is this discussion?
Who wrote this on B3D? Please, please point me to one post that describes this theory. Where is it? One post is enough.
I see this coming up over and over, yet I can't find the origin of it all. Does it even exist?


I guess not?
I'm not really sure, to be honest. I suppose it depends on how it's implemented. Sony is an unknown here for me on the software side. While I assume GNM just adds RT extensions into their API like DXR into 12, without knowing how it works it may be dramatically different. GNM is going to be supporting their hardware and whatever exotic features it has. DXR is more abstracted and not vendor-specific, so I don't really know how this is going to play out. Like you said earlier, it's entirely possible that Sony comes up with their own RT acceleration separate from AMD; I'm not going to say they can't or haven't without seeing the final release. So that leaves GNM also sort of a big question mark, which makes development optimization a big ? as well.
My opinion is Microsoft's approach will be as close to the PC as possible, since they've been pushing the Xbox Anywhere feature.
There's a much higher chance that whatever gets adopted by XBtwo will then be used by AMD's PC solutions because that's part of Microsoft's current strategy.


It's completely fair, and true. I was talking about the GPUs. Sony haven't substantially engineered a console GPU in nearly 20 years. And they haven't needed to.
The Cell does graphics work. In fact, the Cell was originally intended to do all the graphics work.
And if Sony "haven't substantially engineered a console GPU in nearly 20 years", then who made the GPU in the 15-year-old PlayStation Portable?
I'd also say the dedicated Wide I/O implementation on the SGX543MP4 for the 8-year-old Vita using TSVs should count as "substantial engineering" (since it was the first ever TSV implementation in a mass-produced device AFAIK).


I said "MS did a better job of getting more performance out of GCN with the X1 (though time and price were definitely on their side)."

You object to this, then basically go on to admit that MS got more out of GCN, but they had time and price on their side. :confused:

I admit MS got more out of GCN, and I object to the statement about them doing a better job when it was clearly a different job.
I can also say "AMD with the Vega 64 did a better job of getting more performance out of a FinFet GPU than nVidia did with the GTX1050".
It makes just as much sense as your "better job" sentence.


I wish PVR had found its way into a console.
I'm a genie and I just made your wish come true:
https://en.wikipedia.org/wiki/PlayStation_Vita


You'll have to ask yourself if 4Pro hit the mark of being that 4K machine Sony promised.
The list of non-VR Pro-enhanced games without 4K or >=1440p+upscale is really small, so the answer is yes.

Very few developers opted to leverage checkerboarding *in the grand scheme of titles released* and fewer managed to hit the FauxK mark. More than a handful were steadfastly held at 1080p for some reason, even though the expectation was that these 4Pro titles should be hitting 1440p.
Do you have a source for that first sentence? I really can only find "a handful" of titles that don't go over 1080p. It's basically just Ubisoft's open world games and a few more.


From the latest rumors it sounds more like the following:
  • Sony provided RayTracing.
  • AMD provided nothing related to RayTracing.
Did you read any of the rumors? Or any of the posts describing the rumors?
How does this post, or your previous one (or the one before that), contribute to the conversation?
In fact, I'm having a hard time finding a post from you in these last 5 pages that isn't mockery, cynicism or a joke.
Where exactly should the mods (i.e. you?) draw the line between this and trolling?
 
Have a read through the ResetEra thread to get the what and why of what was posted here, lately it being the "Only Sony has RayTracing" theme.

Besides... This is the baseless thread, I don't need to base my posts on anything. :p
 
Have a read through the ResetEra thread to get the what and why of what was posted here, lately it being the "Only Sony has RayTracing" theme.
Are we actually supposed to care about what some internet people post on a failed attempt at a NeoGAF clone?
 
The posts here are following down the same illogical paths as there, so why not short-cut the conversation to end up at the same destination?

EDIT: And ERA is far from failed, it's the better forum between the two (GAF and ERA).
 
The posts here are following down the same illogical paths as there, so why not short-cut the conversation to end up at the same destination?
What illogical path is being followed?

EDIT: And ERA is far from failed, it's the better forum between the two (GAF and ERA).
IMO users should respond to resetera posts on resetera, not on B3D.
We shouldn't have to answer for some anon's delusions on some other forum.

Resetera is garbage. That's my opinion.
 
Do you have a source for that first sentence? I really can only find "a handful" of titles that don't go over 1080p. It's basically just Ubisoft's open world games and a few more.
https://www.resetera.com/threads/all-games-with-ps4-pro-enhancements.3101/
^ All games listed with Ps4 Pro enhancements.
Feel free to count how many are at 1080p. I generally stop after 5 to call it a handful. I'm not being an ass about it, I didn't want to go through the whole list, but I counted up to 25 enhanced titles at 1080p before I quit. This isn't a slight against the hardware, but I'm not sure why there are so many at 1080p.

There are 390 titles in that list that are 4Pro enhanced. I believe this list is near exhaustive, as it counts boost and PSVR titles in separate lists.

So you only need to count 20 titles to get to 5% of that list. The number of titles at 1080p is much greater than 20 with a simple eyeball.
The number of titles that are greater than 1080p but less than 1800p is also considerable.
If I were to eyeball that list, I'd say up to 45% of the catalog doesn't make it to ~4K (which I think should start around 1800p reconstructed), with a baseline minimum of 30% that doesn't make it for sure.

This was the criteria for my statement of the first sentence.

In any event, I was specific about checkerboarding and the leveraging of their ID buffer, which was meant to assist all their titles in getting to 4K.
It would appear that most developers preferred alternate ways of reconstruction. And we see this even within their own 1P teams, like Spider-Man and HZD.

But that's not an insult to Sony's hardware team either. I've seen a ton of talked-about DX features that never get used. Almost all of Xbox's customizations around the command processor were never touched either. I don't know a single title that has leveraged execute indirect (the Xbox version) yet, perhaps some 1P titles, but definitely no 3P. I'm fairly positive that the ID Buffer got leveraged way more than the Xbox stuff, as well; at least we know some titles that leveraged the ID Buffer and RPM.

This is sort of why I'm almost more conservative about exotic customizations of features. Sounds great on paper, but it has to be used.
 
I think the rumor is bullshit, but it brings up an interesting question: what would Sony want to add/change on their SoC, specifically something AMD doesn't have the same incentive to put in a PC GPU?

Sony allows more low-level access, and they can make big changes and additions at the gen transition. AMD, however, cannot come out with a PC GPU that has a performance gain, no matter how great, which only applies to some new games, spending extra silicon on something that would look bad in benchmarks and existing games. Sony has no such limitation, since devs get devkits long before launch, with fixed and modern hardware to optimize against, and every console launches with 100% identical capabilities and performance. The better games happen at the gen transition and improve even more afterwards.

It's not about Sony knowing better than AMD how to make a generic GPU. It's that they are not bound by the PC gaming market, and have opportunities unavailable to AMD.

What they said about a completely optimized SSD data path is very telling. If all consoles have exactly the same storage path, it can be significantly better than the most expensive PCs, because it's guaranteed performance for streaming assets. The fixed console hardware and the generation transition enable an immediate paradigm shift. Sony didn't invent a magical new SSD which dwarfs Samsung's technology. They probably optimized the entire data path and used it in a way that the PC's modular, universal compatibility prevents.
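Some back-of-envelope numbers on that (my assumptions, not anything Sony has stated): filling a hypothetical 16 GB RAM pool from a typical console HDD versus a fast NVMe SSD differs by more than an order of magnitude, which is the kind of gap that changes how games can stream assets.

```cpp
#include <cstdio>

int main() {
    const double ramGB   = 16.0;   // assumed next-gen RAM pool
    const double hddMBs  = 100.0;  // rough sequential HDD throughput
    const double nvmeGBs = 5.0;    // rough fast-NVMe throughput
    std::printf("HDD:  %.0f s to fill RAM\n", ramGB * 1024.0 / hddMBs);  // ~164 s
    std::printf("NVMe: %.1f s to fill RAM\n", ramGB / nvmeGBs);         // ~3.2 s
    return 0;
}
```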
 
What illogical path is being followed?


IMO users should respond to resetera posts on resetera, not on B3D.
We shouldn't have to answer for some anon's delusions on some other forum.

Resetera is garbage. That's my opinion.
I think it's fair to be critical of resetera and GAF rumors to a certain degree; b3d has always catered to a slightly different audience. We're generally a bit more careful with what is posted; not saying better, but the fact that many come from developer and engineering backgrounds lends itself to discussions which are rooted in reality. The danger with cross-pollination of discussion here with 4chan, GAF and resetera is that foundationally we're in a very unstable place.

It probably makes more sense to discuss the rumors from the perspective of what the design choices would imply, such as allocating budget to bandwidth versus more yet slower storage, and what sort of trade-offs in terms of cost and features we might see, as well as the potential pros and cons for third-party development as a result.

Personally, I'm skeptical about either company committing many resources explicitly to RTRT, as those resources would arguably be better delivered as general computation budget, with the developer choosing how to allocate the resources.

I also think Cerny provided enough details to at least infer that Sony is going for quick movement of large amounts of data with the PS5. I question whether GDDR6 would be a good choice with that sort of approach, based on what we already saw with GDDR5 contention issues on the PS4. It's reasonable to conclude HBM with GDDR4 could be a better choice, with a quick SSD utilized so that a smaller amount of faster memory can stream assets quickly. We could also be looking at a potentially modest loss on hardware at launch, with increasing margins as HBM becomes cheaper. That said, I doubt either platform is dramatically subsidizing hardware, as streaming options are viable at the lower end to address that issue.
 
If considering another rumor, things become very interesting, namely: xb2 has the best performance but uses a software raytracing solution. That seems reasonable if SONY has its own RT hardware solution which could be developed faster than AMD's own.

That is, the SONY PS5 has a real-time hardware solution and better RT effects. Since MS chose software RT, xb2 spends more silicon and system power on better floating-point performance. If the retail prices are very close, then the two consoles would seem to have quite different designs given limited system power and BOM budgets.
 