Baseless Next Generation Rumors with no Technical Merits [pre E3 2019] *spawn*

The PS5's implementation could be 100% for gaming, using much smaller data types and not capable of producing pro imaging quality. AMD's own implementation for the PC market could be significantly different and more versatile, to be used by Radeon Pro cards for offline RT.
There are two muddled arguments going on now - DSoup saying Sony brings a load of RT experience from digital cinema, and you saying it's gaming related. And I didn't even present a timeline other than 'not two years with RTRT hardware on AMD cards.'

My point is to argue ideas like this:
"Speaking of Raytracing, if it´s the next big thing. (which it is) and Nvidia it´s pushing it in the market right now, What would be the point of devoting resources towards a solution just for Sony, and not integrating it in your hardware portfolio from now on.

If you have something better in the pipeline why not offer it in the first place"
If AMD has an RT HW solution, it'll be making an appearance in their GPUs and not be PS5 exclusive. That presents no timeline for when it'll appear in PC GPUs, but it won't be a long, long way off because AMD will be significantly disadvantaged in that area, so they'll want an (effective) RTRT solution ASAP. If they've got one developed for console, they'll want to use it.

My mind is boggled with the ease you dismiss what Sony brings to the partnership. Sony has been heavily invested in 3D computer animations since the creation of Sony Animation in 2002 and their first feature film Open Season (2006) for which they developed a bunch of new 3D modelling technologies. AMD sure have a lot more silicon design and production experience on modern processes but Sony have fifteen years of industry experience in producing successful 3D movies.
I haven't dismissed it. I said are the Sony Cinema guys involved? You said they don't need to be. I ask what then does Sony bring if the cinema people aren't involved?
 
My mind is boggled with the ease you dismiss what Sony brings to the partnership. Sony has been heavily invested in 3D computer animations since the creation of Sony Animation in 2002 and their first feature film Open Season (2006) for which they developed a bunch of new 3D modelling technologies. AMD sure have a lot more silicon design and production experience on modern processes but Sony have fifteen years of industry experience in producing successful 3D movies.

I'm not sure how much of that animated movie knowledge is directly transferable to hardware design for a console. It's possible that Cerny has gone to the studios to ask what features they'd feel are important, but I think it's more likely he's been talking to game developers and reading white papers [edit: you said their knowledge could be exploited without directly talking, I see that now]. Anecdote: my friends split into games and sfx/animation work after uni and there's no crossover in careers.

I don't think much of Sony's movie expertise will be useful in working with AMD to customise the silicon. Naughty Dog et al will have far more input, I'm sure.

Sony did nothing in particular with the PS3 GPU, and relatively little to GCN for PS4 and Pro. And whatever they did do, MS did a better job of getting more performance out of GCN with the X1 (though time and price were definitely on their side). If the next GPU is like their last three, it'll be very much like PC Navi but with a few tweaks.
 
I haven't dismissed it. I said are the Sony Cinema guys involved? You said they don't need to be. I ask what then does Sony bring if the cinema people aren't involved?

You asked "Are people from the Sony Imageworks included in the talks, or just SIE people?" and I said "Sony can exploit their animation studio's knowledge without those guys talking directly to AMD".

So uh, no. Just, no. :nope: For the first time in the history of the internet, the use of QFT is warranted. :yep2:

I'm not sure how much of that animated movie knowledge is directly transferable to hardware design for a console. It's possible that Cerny has gone to the studios to ask what features they'd feel are important, but I think it's more likely he's been talking to game developers and reading white papers. Anecdote: my friends split into games and sfx/animation work after uni and there's no crossover in careers.

Neither Nvidia nor AMD is producing a black-box RT solution; they are designing silicon to accelerate the types of calculation that raytracing requires, much like AES instructions accelerate common mathematical functions used in many types of cryptographic algorithm but aren't encryption capabilities in themselves.
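To make that analogy concrete, below is a minimal sketch of the kind of inner-loop arithmetic RT silicon exists to accelerate: the ray/box "slab" test that BVH traversal runs millions of times per frame. Purely illustrative; this is neither Nvidia's nor AMD's actual hardware design.

```cpp
// Illustrative only: the ray/AABB "slab" test at the heart of BVH traversal.
// Fixed-function RT hardware accelerates exactly this kind of tight
// arithmetic, the way AES-NI accelerates the maths inside crypto algorithms.
#include <algorithm>

struct Ray  { float ox, oy, oz; float invDx, invDy, invDz; }; // 1/direction precomputed
struct AABB { float min[3], max[3]; };

bool rayIntersectsBox(const Ray& r, const AABB& b, float tMax) {
    // Entry/exit distances along each axis (the "slabs").
    float t0x = (b.min[0] - r.ox) * r.invDx, t1x = (b.max[0] - r.ox) * r.invDx;
    float t0y = (b.min[1] - r.oy) * r.invDy, t1y = (b.max[1] - r.oy) * r.invDy;
    float t0z = (b.min[2] - r.oz) * r.invDz, t1z = (b.max[2] - r.oz) * r.invDz;
    float tNear = std::max({std::min(t0x, t1x), std::min(t0y, t1y), std::min(t0z, t1z), 0.0f});
    float tFar  = std::min({std::max(t0x, t1x), std::max(t0y, t1y), std::max(t0z, t1z), tMax});
    return tNear <= tFar; // all three slab intervals overlap = hit
}
```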
 
The people to get on board are those working on game engines.

Mark Cerny is working on Death Stranding while he is developing the PS5 feature set. If the next-gen Decima engine requires RT, they have the best team to balance the hardware requirements.
 
The people to get on board are those working on game engines.

Mark Cerny is working on Death Stranding while he is developing the PS5 feature set. If the next-gen Decima engine requires RT, they have the best team to balance the hardware requirements.

Yeah, I think it'll all be about game creators. They're the ones who'll be creating ray tracing solutions and content pipelines that they'll need to run well on the PS5.

Cerny being an accomplished developer himself means the conversations he has will be well understood.
 
Yeah, I think it'll all be about game creators. They're the ones who'll be creating ray tracing solutions and content pipelines that they'll need to run well on the PS5.
I completely agree, however the maths are identical and you have less computational capacity, so you have to scale back, for which there are many options, including reducing the number of rays and reducing the distance or number of bounces for which each ray is calculated.

What game devs don't want to have to do, because nobody does, is reinvent the wheel. Few game devs will have experience of raytracing so you learn from those who do.
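As a rough illustration of those scale-back options, here's a hypothetical set of quality knobs for a real-time path tracer. The names are made up for this post and don't correspond to any shipping engine; they just show that the maths stays the same while the budget shrinks.

```cpp
// Hypothetical quality knobs, illustrating the scale-back options above.
struct RtQualitySettings {
    int   raysPerPixel;    // film: hundreds or more; real-time: 1-2, then denoised
    int   maxBounces;      // film: many indirect bounces; real-time: 1-2
    float maxRayDistance;  // capping ray length culls distant geometry cheaply
};

// Same intersection maths either way; only the budget changes.
const RtQualitySettings offlineFilm { 1024, 16, 1e30f  };
const RtQualitySettings realtime    { 1,    2,  100.0f };
```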
 
Wouldn't be patching the binary if I'm understanding what you're saying.
Would patch from the PC branch of the code to the Xbox branch.
It would be nice for gamers; the benefit to the studio is a minor bump in sales, a bit like when games get X enhanced (not many RT games), and devs get to see what works well and what doesn't for Scarlett based on what they've currently done.

So I do expect some Scarlett RT-patched games, if performance is good enough that is, as it's using a common API so the amount of work is reduced.
DXR is just an extension library of DX12. Like all other things, they are going to have flag checks which will determine how things are done. I'm not entirely sure if they will need to recompile the whole thing just because it's Xbox. I guess it really comes down to how they set up their engine to support multiple platforms, but the structure of DX is set up such that if you have the feature, it knows which way to go. The only caveat is if DXR requires a completely different setup to rendering, in which case I can see a need for a large patch. But that doesn't seem to be what is happening. PC can toggle between DXR and non-DXR with the flick of a switch. I expect to see that left buried in the code of Xbox One titles; no reason to take it out, now that we know the future is ray tracing.
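For reference, the PC-side flag check described above looks roughly like the sketch below. Whether the Xbox flavour of D3D exposes the identical query is an assumption on my part, not something confirmed anywhere.

```cpp
// Minimal sketch of branching on DXR support at runtime via the standard
// DX12 feature query. A tier below 1.0 means: keep rasterising.
#include <d3d12.h>

bool deviceSupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```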

The concerns some senior posters have posited is that with AMD being behind on their RT implementation a lot of the code or RT shaders are being optimized for nvidia and that could have performance impacts on other IHVs.

Not even sure if they have a custom version of HLSL for Xbox. It's true that a port may well be required, but the workload should not be enormous.
 
There are two muddled arguments going on now - DSoup saying Sony brings a load of RT experience from digital cinema, and you saying it's gaming related.
Two arguments that don't seem mutually exclusive AFAICS.
The folks from digital cinema that produce full-length 3D animations probably know better than game developers how much detail / variable size needs to go into different scenes that vary in pacing, scope, etc.
Only exception might be the very few studios that still put off-engine FMVs in their games, like Blizzard.
(Actually I can't think of any other besides Blizzard at the moment.)


If AMD has an RT HW solution, it'll be making an appearance in their GPUs and not be PS5 exclusive.
Ok let's all agree here and now that AMD will eventually launch PC graphics cards with real-time raytracing hardware.
What I can't agree with is the idea that AMD is in a rush to launch RTRT-capable hardware so the 2019 Navi cards absolutely need to come with whatever RT HW is coming in the 2020 PS5.

That presents no timeline for when it'll appear in PC GPUs, but it won't be a long, long way off because AMD will be significantly disadvantaged in that area, so they'll want an (effective) RTRT solution ASAP.
They will? Are you sure?
I can't find a single RTX card review telling people to rush out buying it because of ray tracing, and I can't find a single review of a RTX-enabled game saying it's a game-changer of any kind.
OTOH I can find lots of reviews saying the exact opposite, that RTRT isn't worth the money and performance deficit right now. I see reviews saying people should buy the GTX 1080 Ti instead of an RTX 2080 if they can find it cheaper. And I see nvidia's fiscal reports saying the RTX line didn't sell as well as they had predicted.
It doesn't sound like 2019 is a critical year to bring RTRT to the PC market, at all.

How long is "a long, long way off"?



I haven't dismissed it. I said are the Sony Cinema guys involved? You said they don't need to be. I ask what then does Sony bring if the cinema people aren't involved?
You literally just dismissed my point of Sony having hardware teams that have been consistently working on imaging processors for the last decade. It's right there in my post that you quoted.
¯\_(ツ)_/¯

And you really think Sony has nothing to bring to the table when developing gaming hardware?
They have a bunch of (very) successful 1st party development studios under their wing, who are supported by a team dedicated to low-level optimization for their hardware and includes Sony's own lead system architect for the PS4 and PS5.
I feel like you're just not being rational here.


Anecdote: my friends split into games and sfx/animation work after uni and there's no crossover in careers.
There's been no reason to crossover until very recently, though. Real-time raytracing is only months old in the consumer's hands.


Sony did nothing in particular with the PS3 GPU, and relatively little to GCN for PS4 and Pro.
That's not a fair statement at all.
Sony worked a hell of a lot on the PS3's hardware, and Cell ended up being extensively used as a pixel-shader co-processor for the console, effectively contributing to its image output pipeline. Not to mention the fact that the nvidia GPU in the PS3 was reportedly a late inclusion to the system.

Somehow the B3D mythology says that all people from Sony who worked on Cell were fired or retired. Which to me is a bit odd because Sony never stopped working on processors and in companies with several thousands of employees, people are often put into projects that span many years. Maybe others here have more factual knowledge than me, IDK.

And whatever they did do, MS did a better job of getting more performance out of GCN with the X1 (though time and price were definitely on their side).
They did a better job?
AFAICS, Microsoft launched a higher-performance console that came at a time when 16FF+ was significantly more mature, got 50% more RAM, and cost substantially more to make because every single system needs to be fine-tuned to achieve those clocks.
It was a different job, not a better one.


The concerns some senior posters have posited is that with AMD being behind on their RT implementation a lot of the code or RT shaders are being optimized for nvidia and that could have performance impacts on other IHVs.
Does that really matter, now that we know both PS5 and XBTwo will have an AMD GPU?
 
You asked "Are people from the Sony Imageworks included in the talks, or just SIE people?" and I said "Sony can exploit their animation studio's knowledge without those guys talking directly to AMD".
I asked who was involved with the engineering...

me said:
and how much of Sony? Are people from the Sony Imageworks included in the talks, or just SIE people? Naughty Dog asking what they'd like to see and not engineering silicon?
Three question marks. Not a single statement. I'm asking what expertise is being involved and how.

This argument seems to be getting silly. I'll clarify my position here.
  • PS5 is rumoured to be getting RT hardware co-engineered by Sony and AMD. Note that's not expressed as designed with Sony, and engineered (executed in real silicon) by AMD, but co-engineered with Sony working on the execution.
  • With such a design involving Sony, there's the possibility that the IP from that venture won't find its way into other GPU products from AMD.
  • I'm pointing out that AMD will produce their own RT accelerating designs for PC and maybe XBN. Involvement of Sony in developing this part of PS5 does not prevent RT acceleration in other AMD products.
The reason this doesn't prevent other AMD products having 'RTRT' is because AMD are capable of designing their own solutions without requiring Sony's involvement, so AMD will be working on these solutions themselves anyway. Sony's ability to design a hardware raytracing unit is likely comparable to their ability to design a GPU themselves for PS2 or PS3 versus AMD and nVidia, or to pull together a team to design a heterogeneous CPU architecture that can find its way into every conceivable device. When it comes to designing processors for accelerating raytracing, AMD have no need for Sony.

---

In addition to that, and independent from that (could RT in PS5 mean no RT in other products?), there's the question you've raised about what Sony brings to the table for gaming RT acceleration. When it comes to understanding the requirements of RT, AMD already develop extensive solutions for raytracing users, like movie companies. AMD (want to) sell GPUs to renderfarms and they develop rendering solutions that tie in with the existing packages. The requirements of movie production are also very different from realtime (gaming) applications. eg. Sony's fork of Arnold chooses BVH in part because they're working at less than one triangle per pixel, which is not a trade-off gaming will make. What Sony (SIE) can certainly do is talk to its top-drawer developers about how they would use RT and what they think would benefit them, and then talk with AMD about focussing a solution in areas that Sony's devs (and one would hope, third parties beyond) feel they'll get the most from. Much like MS engaging in talks with devs and IHVs about directions to take GPUs.
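For anyone unfamiliar with the term, a BVH (bounding volume hierarchy) is the acceleration structure in question, and a node might look something like the sketch below. Film renderers and games both traverse something of this shape; what differs is how it's built and how dense the geometry underneath is. Purely illustrative, not Arnold's or any shipping engine's actual layout.

```cpp
// Hypothetical BVH node layout, for illustration only.
struct BvhNode {
    float boundsMin[3];  // axis-aligned box enclosing this whole subtree
    float boundsMax[3];
    int   firstChild;    // interior node: index of first child
    int   primCount;     // 0 for interior nodes; >0 marks a leaf
    int   firstPrim;     // leaf only: index into the triangle list
};
```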

Therefore, IMO, the HWRT solution in PS5 is likely to be a modified version of what AMD are already working on, in the same way Sony had an ID buffer added to the GPU in PS4Pro. Those modifications will not appear in an MS console. Those modifications may appear in PC GPUs depending on the deal. Regarding Sony's wider experience, I don't see it as particularly relevant. I doubt Sony Pictures' involvement at any level beyond everyone reading their white-papers, because they don't profit from it at all (unless the designs are going to be included in AMD GPUs that will accelerate Arnold), and it'll be engineered by AMD anyway because Sony Pictures and SIE have no experience of engineering silicon.
 
I asked who was involved with the engineering...

Which is, as I am sure you know, a nonsense question since nobody from AMD or Sony is going to talk about that. Putting aside what is known, which is next to nothing, do you honestly think that, if PS5 has hardware RT functionality, Sony wouldn't have explored the knowledge of their 3D animation studio before embarking on this?

Quite simply, they couldn't afford not to. Even if it produced nothing of worth, they'd be idiots not to. I don't think Sony are idiots. :nope:
 
Which is, as I am sure you know, a nonsense question since nobody from AMD or Sony is going to talk about that. Putting aside what is known, which is next to nothing, do you honestly think that, if PS5 has hardware RT functionality, Sony wouldn't have explored the knowledge of their 3D animation studio before embarking on this?

Quite simply, they couldn't afford not to. Even if it produced nothing of worth, they'd be idiots not to. I don't think Sony are idiots. :nope:
Sony Imageworks is its own entity far removed from SIE, and I frankly don't see the relation between their work producing CG content (using a 100% CPU-based renderer, BTW) and SIE working with AMD on a custom SoC.
 
Sony Imageworks is its own entity far removed from SIE, and I frankly don't see the relation between their work producing CG content (using a 100% CPU-based renderer, BTW) and SIE working with AMD on a custom SoC.

As somebody who manages a server farm, which has a wide variety of hardware, I can tell you that the mathematics utilised in raytracing (which is not unique to the production of 2D or 3D images) is a constant; it is only the hardware method of calculation that differs.
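To illustrate that, here's a minimal ray/sphere intersection. The same few lines of algebra run unchanged on a CPU render farm or inside a GPU shader; only the hardware executing them differs.

```cpp
// Hardware-agnostic raytracing maths: a minimal ray/sphere intersection.
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x-b.x, a.y-b.y, a.z-b.z}; }

// Returns the nearest positive hit distance, or -1 if the ray misses.
// Assumes dir is normalised, so the quadratic's leading coefficient is 1.
float raySphere(Vec3 origin, Vec3 dir, Vec3 centre, float radius) {
    Vec3  oc   = sub(origin, centre);
    float b    = dot(oc, dir);
    float c    = dot(oc, oc) - radius * radius;
    float disc = b * b - c;          // discriminant of t^2 + 2bt + c = 0
    if (disc < 0.0f) return -1.0f;   // no real roots: miss
    float t = -b - std::sqrt(disc);  // nearer root first
    return (t > 0.0f) ? t : -1.0f;
}
```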
 
Does that really matter, now that we know both PS5 and XBTwo will have an AMD GPU?
I guess not?
I'm not really sure, to be honest. I suppose it depends on how it's implemented. Sony is an unknown here for me on the software side. While I assume GNM just adds RT extensions into their API like DXR into 12, without knowing how it works it may be dramatically different. GNM is going to be supporting their hardware and whatever exotic features it would have. DXR is more abstracted and not vendor specific, so I don't really know how this is going to play out. Like you said earlier, it's entirely possible that Sony comes up with their own RT acceleration separated from AMD; I'm not going to say they can't or haven't without seeing the final release. So that leaves GNM also sort of a big question mark, which makes development optimization a big ? as well.
 
Which is, as I am sure you know, a nonsense question since nobody from AMD or Sony is going to talk about that. Putting aside what is known, which is next to nothing, do you honestly think that, if PS5 has hardware RT functionality, Sony wouldn't have explored the knowledge of their 3D animation studio before embarking on this?

Quite simply, they couldn't afford not to. Even if it produced nothing of worth, they'd be idiots not to. I don't think Sony are idiots. :nope:
You appear to be arguing for argument's sake here because you've completely ignored the rest of my post where I clarify exactly my position. Whatever.
 
I asked who was involved with the engineering...
  • I'm pointing out that AMD will produce their own RT accelerating designs for PC and maybe XBN. Involvement of Sony in developing this part of PS5 does not prevent RT acceleration in other AMD products.

There is another possible scenario, based on the rumor that PS5 has its own RT solution.

That is, Sony has a better RTRT solution than AMD's, and has therefore chosen its own RTRT hardware for PS5 rather than using AMD's existing technique. Besides, Sony's solution may even have been developed faster than AMD's. While Nvidia already released RTRT GPUs last year, AMD's GPUs still lack RTRT hardware, which could be a sign of slower development at AMD. Therefore Sony had to choose its own solution to ensure a 2020 release of the PS5 console with powerful ray-tracing hardware.
 
To be honest it would be weird for Sony Imageworks staff to work on the console's ray tracing tech, when the needs of cinematic-quality ray tracing and video games are drastically different.

The most I expect is just consulting them. I wouldn't be surprised if there are Imageworks people who have shifted into Sony Interactive, though, but that's a lot different from working with the department itself.

On the subject of Sony working more hands-on with its APU, then yes, I won't be surprised if they do this time. Hell, there is this post by 3dilettante in the Navi thread (unashamedly taking the abridged version from VX1 from Era):

“A potentially larger omission is the lack of FeatureGCN3Encoding for GFX10. I have seen discussion in various fora that Navi is a repudiation of Vega, and that it's a return to Polaris or something like that. However, the lack of GCN3 encoding flag (and I reviewed some of the opcodes listed in later updates) makes it seem like a significant number of opcodes have been changed to match the console-generation instructions, if they were present at the time. This means before Polaris, Tonga, and Fiji. Architectural advances since Sea Islands appear to still be present, such as the various parallel and packed extensions and scalar memory operations. There are also references to Primitive Order Pixel Shading (POPS) in other scalar ISA commits that were in the Vega ISA doc, message types from Vega not supported by other GPUs, and some things like DLI instructions from Vega 20.*
*One possible caveat: I am not sure whether there's more to interpret from the decision to move the scalar operation flags and others into a separate sub-version 10.1 versus the overall GFX10 set. That may mean some variation of Navi could be missing one or more of these operations, and the lack of the scalar ops would be more like the consoles--though it might be more of a regression than some of the more niche flags.

So GFX10 appears to have a mix of returning some operations in a way that might align it with the consoles, while still having more recent or new features from GFX8 and GFX9. Some changes like the HasNoSdstCMPX change, might be a place where Navi deviates from both the console and PC space.”
 
I completely agree, however the maths are identical and you have less computational capacity, so you have to scale back, for which there are many options, including reducing the number of rays and reducing the distance or number of bounces for which each ray is calculated.

What game devs don't want to have to do, because nobody does, is reinvent the wheel. Few game devs will have experience of raytracing so you learn from those who do.

Yeah, absolutely agree there's no point in reinventing the wheel. The maths for raytracing has been known about for donkey's years and there have been countless Siggraph papers on it and research into it. The thing is, I'm not sure what extra Sony Pictures Animation will know about the principles of raytracing that isn't already out there, or that AMD / Nvidia won't be able to find out from the movie studios and effects companies and academic researchers they already try to learn from.

I see those staff as being potentially useful on content creation, but I can't think of any secret sauce they can mix into architectural modifications beyond what chip makers have access to and are already working on.
 
There is another possible scenario, based on the rumor that PS5 has its own RT solution.

That is, Sony has a better RTRT solution than AMD's, and has therefore chosen its own RTRT hardware for PS5 rather than using AMD's existing technique. Besides, Sony's solution may even have been developed faster than AMD's. While Nvidia already released RTRT GPUs last year, AMD's GPUs still lack RTRT hardware, which could be a sign of slower development at AMD. Therefore Sony had to choose its own solution to ensure a 2020 release of the PS5 console with powerful ray-tracing hardware.
That's basically DSoup's argument, roughly: Sony have co-engineered (allegedly over four years or so) with AMD their own RT acceleration hardware, like they previously co-engineered their own checkerboard rendering solution (hardware + software) using their own years of experience.

The counter-arguments of Shifty being that it's not possible that AMD could have agreed to make RT hardware only for Sony because:

- Exclusive GPU hardware customizations would break the rules of the GPU laws
- AMD are inevitably working on hardware RT for their own cards (even if nothing points to that, there are no rumours about that, and Microsoft are only talking about a pure software RT solution for their next machine for now)
- Sony would be unable to co-design anything so fancy because Cell sucked (and Cell was a long time ago anyway) and they never ever co-engineered anything valuable on PS4 hardware: the ID buffer was a modification of existing AMD hardware... what?

Therefore, IMO, the HWRT solution in PS5 is likely to be a modified version of what AMD are already working on, in the same way Sony had an ID buffer added to the GPU in PS4Pro. Those modifications will not appear in an MS console. Those modifications may appear in PC GPUs depending on the deal.
In the same way, because the ID buffer is a modified version of...?
 