Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

6nm will start risk production in 1Q20, so too late for this round

I didn’t mean to suggest it as an option for launch. Obviously they wouldn’t be in a position to leverage it if they didn’t already have 7nm designs taping out :)

Seems a smart move on TSMC’s part given 7nm is going to be a long node for them. Perhaps a tacit acknowledgement that making 7nm+ incompatible was a bad move.

I don't believe we will have an APU this time around. I believe in an Xbox 360 / Intel Kaby Lake-G type solution (separate CPU and GPU dies on one package). Can't see a >10 TFLOPS GPU sharing the same die with an 8-core Zen-class CPU.

I think a GPU + memory controller die alongside a Zen 2 chiplet over IF is possible, but that would increase packaging costs. Could be worth it if the die cost goes way down, though. And it would give them the ability to shrink the dies independently.
 
The reality is that whichever way you look at it, VR is and probably will always be a niche, and it would be insane for Sony to focus on VR as much as you say. Sorry, it's just not going to happen. The average Joe will never, ever buy and use a VR set as much as other non-VR games. I'm saying this as a PSVR owner.
Agreed; as I mentioned earlier in the discussion, if we saw 20% VR adoption with PS5 I'd be shocked
 
Great state-of-the-memory-industry overview by TechInsights.

https://www.techinsights.com/blog/techinsights-memory-technology-update-iedm18

[Slides from TechInsights' memory technology update at IEDM 2018]
 
Naughty Dog must know Sony's plans and hardware. It could just as well be that the guy was told to post a retraction because Sony aren't ready to talk about the hardware beyond what's been said. It's not really proof or disproof, but I'm inclined to believe that the initial response had more insight than just the Wired article one-liner.

That's the way I see it too.
 
Naughty Dog must know Sony's plans and hardware. It could just as well be that the guy was told to post a retraction because Sony aren't ready to talk about the hardware beyond what's been said. It's not really proof or disproof, but I'm inclined to believe that the initial response had more insight than just the Wired article one-liner.

Naughty Dog are on many of the infamous Cerny patents regarding BC and GPU arch. They absolutely know what’s up. (Jason Scanlin and David Simpson)
 
This employee may well be already working with the dev kit...

Or may not. Or the dev kit might not include Navi yet, or if it does, Navi might not have whatever customisations Sony may or may not have asked for regarding RT!

It's possible they're telling the truth when they say "I assumed it's hw cos we can already do sw ray tracing (albeit glacially)". Or maybe they're covering their ass because they've just broken an NDA (e.g. "I assumed [Cerny was confirming it's] hw cos plz don't fire me").

There are lots of developers in AAA studios who don't know specifics on the level of hardware RT support, right at this very moment. It may be true for large parts of Sony's internal studios too (I expect it is to some extent).

At a basic level, artists need to know the performance parameters of what they're working on. In a sense it doesn't really matter to them whether it's software or dedicated hardware (or a particular combination thereof) that uses the work they do.

So they may know, or they may be telling the truth and not know. I'm just saying we can't take this as solid confirmation ... yet!
 
Audio is a bit like graphics: a lot of workarounds, hacks, and legwork to get different sections of games to sound half decent. And that's not even talking about surround sound output.

RT audio would help with a lot of that.
If you shoot a gun in one room with a lot of open space and it sounds different than it does outside, that's legwork to set up sound cones etc.
Wall type, wooden floors, the contents of the room, all have an impact.
You'd be surprised how much even currently goes into some games, even if it's the unsung hero.
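
To make that concrete, here's a toy sketch of the idea (made-up scene and absorption values, not any real engine's API): trace a straight ray from listener to source and multiply in the transmission loss of everything it passes through, which is roughly the legwork RT audio would automate instead of hand-placed sound cones.

```python
# Minimal ray-cast audio occlusion sketch (illustrative only).
# Materials and absorption values are made up for the example.
from dataclasses import dataclass

@dataclass
class Wall:
    x: float            # wall position along the listener->source axis
    absorption: float   # fraction of energy lost passing through (0..1)

def occluded_gain(listener_x: float, source_x: float, walls: list[Wall]) -> float:
    """Walk a straight ray from listener to source in 1D and
    multiply in the transmission loss of every wall it crosses."""
    lo, hi = sorted((listener_x, source_x))
    gain = 1.0
    for wall in walls:
        if lo < wall.x < hi:              # ray crosses this wall
            gain *= (1.0 - wall.absorption)
    return gain

# Gunshot heard through a wooden wall and a concrete wall:
walls = [Wall(x=3.0, absorption=0.4),     # wood
         Wall(x=7.0, absorption=0.8)]     # concrete
print(occluded_gain(0.0, 10.0, walls))    # 0.6 * 0.2 = 0.12
```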

I think a GPU + memory controller die alongside a Zen 2 chiplet over IF is possible, but that would increase packaging costs. Could be worth it if the die cost goes way down, though. And it would give them the ability to shrink the dies independently.
Yields could also be better going this route.
 
So they may know, or they may be telling the truth and not know. I'm just saying we can't take this as solid confirmation ... yet!
Indeed. However, anyone working on a PS5 game, or even coming across work on it in the office, is going to know about the pipeline difference. How can you not pick up on such things when some artist in the office is creating art without having to worry about all the crap you're having to worry about?!

No confirmation, but my gut tells me it was accidental. If there aren't hardware RT acceleration structures in PS5, I'll actually be surprised because of this tweet. ;)
 
Indeed. However, anyone working on a PS5 game, or even coming across work on it in the office, is going to know about the pipeline difference. How can you not pick up on such things when some artist in the office is creating art without having to worry about all the crap you're having to worry about?!

No confirmation, but my gut tells me it was accidental. If there aren't hardware RT acceleration structures in PS5, I'll actually be surprised because of this tweet. ;)

Could very well be, but it's also possible they physically separate people for that exact reason. Badge access areas work quite well. My workplace has lots of "need to know" policies. There are always things in development that I'll never see the details of until around the same time as customers or "the market" hear about it.
 
No confirmation, but my gut tells me it was accidental. If there aren't hardware RT acceleration structures in PS5, I'll actually be surprised because of this tweet. ;)
Whereas I'm inclined to believe his follow-up tweet that it wasn't a confirmation.
I'm not saying it's not HW RT, but if it was and he knew, then he wouldn't be contradicting what Cerny said and what people assumed he meant.

Hopefully now we'll start to get proper leaks.
 
Audio is a bit like graphics: a lot of workarounds, hacks, and legwork to get different sections of games to sound half decent. And that's not even talking about surround sound output.

RT audio would help with a lot of that.
If you shoot a gun in one room with a lot of open space and it sounds different than it does outside, that's legwork to set up sound cones etc.
Wall type, wooden floors, the contents of the room, all have an impact.
You'd be surprised how much even currently goes into some games, even if it's the unsung hero.


Yields could also be better going this route.
Yields will unquestionably be better. The question is whether that offsets the packaging costs, and what, if any, performance trade-offs exist.
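
A back-of-the-envelope illustration of why, using the standard negative-binomial yield model with made-up defect density and die areas (not real 7nm numbers):

```python
# Classic negative-binomial die yield model: Y = (1 + A*D0/alpha)^(-alpha).
# Defect density, clustering factor, and die areas below are illustrative
# assumptions, not actual TSMC 7nm or AMD figures.
def die_yield(area_cm2: float, d0: float = 0.2, alpha: float = 2.0) -> float:
    return (1.0 + area_cm2 * d0 / alpha) ** -alpha

monolithic  = die_yield(3.6)    # one hypothetical ~360 mm^2 APU
gpu_io      = die_yield(2.8)    # ~280 mm^2 GPU + memory controller die
cpu_chiplet = die_yield(0.8)    # ~80 mm^2 Zen 2 chiplet

print(f"monolithic APU: {monolithic:.2%}")   # ~54%
print(f"GPU die:        {gpu_io:.2%}")       # ~61%
print(f"CPU chiplet:    {cpu_chiplet:.2%}")  # ~86%
# Good dies per wafer scale with yield/area, so two smaller dies can beat
# one big one even before factoring in binning flexibility.
```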
 
What defines "hardware RT" though? The presence of fixed-function units, because Nvidia has gone that route? One could argue that RT accelerated via compute on the GPU is still hardware, as opposed to purely CPU. Performing RT on the GPU has been around for decades and is still considered hardware acceleration of RT vs. the CPU.

Mentioning "hardware RT" could simply be a marketing checkbox, as the definition is technically correct.
 
What defines "hardware RT" though? The presence of fixed-function units, because Nvidia has gone that route? One could argue that RT accelerated via compute on the GPU is still hardware, as opposed to purely CPU. Performing RT on the GPU has been around for decades and is still considered hardware acceleration of RT vs. the CPU.

Mentioning "hardware RT" could simply be a marketing checkbox, as the definition is technically correct.
To me, the bar for hardware RT is pretty low. If there's an API for it, it's hardware RT. Software RT is doing the calculations necessary to compute RT completely agnostic of the hardware and letting the compiler sort it out. If you're coding against specific APIs provided to address the hardware, it's hardware RT.
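
To make the distinction concrete, this is "software RT" in the sense above: a textbook ray-sphere intersection that runs anywhere, with no RT-specific API in sight (illustrative code, not any console's SDK):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Pure-software ray/sphere intersection: solve the quadratic
    |o + t*d - c|^2 = r^2 for the nearest t >= 0, or return None.
    No API, no special hardware assumed; any CPU or GPU can run this."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else None

print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

DXR-style hardware RT replaces exactly this kind of inner loop (plus the BVH traversal around it) with API calls backed by dedicated units.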
 
To me, the bar for hardware RT is pretty low. If there's an API for it, it's hardware RT. Software RT is doing the calculations necessary to compute RT completely agnostic of the hardware and letting the compiler sort it out. If you're coding against specific APIs provided to address the hardware, it's hardware RT.

That strikes me as such a low bar that it renders (no pun intended) the idea of hardware acceleration pretty much meaningless. You can write an API for just about any kind of purpose that runs on just about any type of system that can pass and return variables and references.

There's always hardware somewhere down there. An API doesn't somehow make it specialised.

(And specialised hardware doesn't always give you the best results long term, in a rapidly changing environment!)
 
That strikes me as such a low bar that it renders (no pun intended) the idea of hardware acceleration pretty much meaningless. You can write an API for just about any kind of purpose that runs on just about any type of system that can pass and return variables and references.

There's always hardware somewhere down there. An API doesn't somehow make it specialised.

(And specialised hardware doesn't always give you the best results long term, in a rapidly changing environment!)
The problem is that RT is a massively parallel problem by its nature, so a massively parallel compute structure (read: a GPU) is already well suited to be a hardware solution to it. Is it not hardware RT just because those compute units do other things well? Can they only accelerate RT to be considered dedicated? What about ImgTec's solution that isn't full RT, does it count? What if GPU architecture evolves to where a generic compute unit does RT better? What's the magic threshold to call it a hardware solution?
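
To illustrate the "massively parallel by nature" point: every ray's intersection test is independent of every other ray's, so generic data-parallel hardware already maps onto it well. A toy batch of rays against one sphere, with numpy standing in for a GPU's compute units (scene numbers made up):

```python
import numpy as np

# One sphere, a million independent rays: each intersection test touches
# only its own ray's data, so the whole batch vectorizes trivially --
# the same shape of work a GPU's compute units (or RT cores) chew through.
rng = np.random.default_rng(0)
origins = np.zeros((1_000_000, 3))
dirs = rng.normal(size=(1_000_000, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)   # unit-length rays

center, radius = np.array([0.0, 0.0, 5.0]), 1.0
oc = origins - center
b = 2.0 * np.einsum('ij,ij->i', oc, dirs)             # per-ray dot products
c = np.einsum('ij,ij->i', oc, oc) - radius * radius
disc = b * b - 4.0 * c                                # a == 1 for unit dirs
print(f"{(disc >= 0).mean():.2%} of rays hit")        # ~1% for this scene
```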
 