Predict: The Next Generation Console Tech

Doesn't the GPU-dedicated RAM need to increase in bandwidth as you increase the number of cores? Not theoretically of course, but practically.

You couldn't get a "high-end" version of your GPUs to "infinitely" scale linearly with increasing the number of cores without increasing memory bandwidth, right?
Right, but the bandwidth requirement for extra cores is low.
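A toy model of why that claim can hold for a tile-based renderer: overdraw traffic stays in on-chip tile memory, so off-chip bandwidth is roughly geometry plus textures plus one framebuffer write per pixel, largely independent of core count. Every constant below is an illustrative placeholder, not a vendor figure.

```python
# Toy model: external bandwidth of a tile-based renderer vs. core count.
# Per-pixel overdraw traffic stays in on-chip tile memory, so off-chip
# traffic is ~geometry + one framebuffer write per pixel per frame.
# All numbers are illustrative placeholders, not measured data.
def external_bw_gbps(cores, res_pixels=1920 * 1080, fps=60,
                     overdraw=4, bpp=4, tbdr=True):
    # An immediate-mode renderer pays for overdraw in external writes;
    # a TBDR writes each pixel out once after on-chip resolve.
    fb_writes = res_pixels * fps * bpp * (1 if tbdr else overdraw)
    geometry = 0.5e9 * cores * 0.05  # assumed small per-core geometry growth
    return (fb_writes + geometry) / 1e9

# Doubling the cores barely moves the TBDR number:
print(external_bw_gbps(4), external_bw_gbps(8))
```

On these assumptions, going from 4 to 8 cores adds only ~0.1 GB/s, which is the sense in which "the bandwidth requirement for extra cores is low".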
 
I heard, I believe from Dave, that moving data around a modern ATI graphics chip is more expensive in terms of power consumption than the actual calculations themselves. I assumed the same would apply to all graphics hardware, given they seem to follow many of the same basic principles.

So even if the data quantity increased linearly, maybe it's the size of the chip, and the distance each bit needs to travel, that causes the power use to scale up faster?
Our architecture is quite a bit different from ATI's at a low, fundamental level. It's right to say that moving data around the chip costs power, but when your architecture is fundamentally based on minimising data transport....
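A back-of-the-envelope illustration of why data transport dominates. The energy constants below are invented placeholders purely for the arithmetic, not figures from any vendor.

```python
# Back-of-the-envelope comparison of compute energy vs. data-movement energy.
# Both constants are illustrative placeholders, not measured vendor data.
FLOP_ENERGY_PJ = 1.0         # assumed energy per 32-bit FMA, in picojoules
WIRE_ENERGY_PJ_PER_MM = 0.1  # assumed energy to move one bit 1 mm on-chip

def op_cost_pj(operand_bits: int, distance_mm: float):
    """Energy to haul operands across the chip vs. the math itself."""
    movement = operand_bits * distance_mm * WIRE_ENERGY_PJ_PER_MM
    return FLOP_ENERGY_PJ, movement

# Three 32-bit operands fetched from 5 mm away for a single FMA:
compute, movement = op_cost_pj(operand_bits=96, distance_mm=5.0)
print(f"compute: {compute} pJ, movement: {movement} pJ")
```

On these assumptions the transport costs dozens of times more than the FMA, which is why an architecture built around minimising data movement (e.g. keeping the working set in tile memory) saves power even when it does the same number of calculations.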

I always wanted to see whether your graphics cores would be viable in a console... I was hoping there'd be some kind of Microsoft ARM console, given their hopes for synergy with Windows 8...
It's always been viable, it's just whether it's more viable in the eyes of the vendor than someone else bidding for the design win.

P.S. What does Molly! mean?
MOLLY is the codename for something really awesome :D
 
Just fine :LOL: Did you have anything in mind there where you think a TBDR wouldn't do so well when manipulating or amplifying/de-amplifying geometry?
Well, if the developer isn't nice (i.e. doesn't use a displacement shader that allows you to determine a tight bounding box), you have to either cycle the amplified geometry through memory during tiling or amplify it twice.
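A hypothetical sketch of why the tight bound matters: if the displacement shader declares a worst-case displacement, the binner can grow the coarse patch's bounding box by that amount and assign tiles conservatively, without running the amplification pass during tiling. The structure and names here are my own invention, not IMG's.

```python
# Hypothetical tile binner for displaced patches. Given a declared maximum
# displacement, we grow the patch's screen-space bounding box by that amount
# and bin conservatively, so tiling never has to amplify the geometry itself.
TILE = 32  # tile size in pixels (assumed)

def bin_patch(bbox, max_displacement, screen_tiles_x):
    """Return indices of tiles a displaced patch may touch, conservatively."""
    x0, y0, x1, y1 = bbox
    d = max_displacement  # worst-case growth declared by the shader
    x0, y0, x1, y1 = x0 - d, y0 - d, x1 + d, y1 + d
    tiles = set()
    ty = int(y0) // TILE
    while ty * TILE <= y1:
        tx = int(x0) // TILE
        while tx * TILE <= x1:
            tiles.add(ty * screen_tiles_x + tx)
            tx += 1
        ty += 1
    return tiles

# A 20x20 patch near a tile corner, displaced by up to 8 pixels,
# conservatively lands in the four surrounding tiles:
touched = bin_patch((30, 30, 50, 50), max_displacement=8, screen_tiles_x=40)
```

Without the declared bound, the binner can't know which tiles the amplified triangles will land in, hence the choice Rys describes: stream the amplified geometry through memory, or amplify twice.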
 
I think it's a great idea, and there's nothing inherent to either Cell or an architecture like SGX that would stop you wanting to pair them on the same die other than memories and how they'd talk to each other efficiently.
Glad to hear it. :mrgreen:

Negatives? I'm not sure there are many, at least if you're not the hardware designer trying to integrate the two (along with everything else the chip would need).
A current concern is the SGX featureset. I imagine you can't talk about this openly, but with the probability of a DX12 part in the XBox3, SGX doesn't appear to have anything like that on a roadmap. That said, enough compute resources would, by my reckoning, make up for specific hardware features in the GPU. So SPUs+SGX would constitute a division of the graphics workload more into compute shaders and rasterisers. In that respect one could even reduce the SGX complexity a little - it'd be a balancing act for the transistor budget to get the required functionality in the right places for best performance. I'm curious what instructions you'd have the SGX issue to the SPUs. Could SPUs also be made to appear as a form of shader to the SGX core? Perhaps set to a 'slave' mode when it comes to graphics rendering and taken out of the 'SPU pool' for tasks on an SPU scheduler?
 
A current concern is SGX featureset. I imagine you can't talk about this openly, but with the probability of getting a DX12 part in the XBox3, SGX doesn't appear to have anything like on a roadmap.
Where's the DX12 for Xbox come from? I haven't been keeping up on rumours (best go read the rest of this thread, I guess!). Anyway, we're looking to DX12 for future designs, just like everyone else.

That said, enough compute resources would, by my reckoning, make up for specific hardware features in the GPU. So SPUs+SGX would constitute a division of the graphics workload more into compute shaders and rasterisers. In that respect one could even reduce the SGX complexity a little - it'd be a balancing act for the transistor budget to get the required functionality in the right places for best performance.
There are definitely elements of Rogue you wouldn't want to spend area on for a fixed console design. There are also elements you'd want to improve on or add from scratch too, so the area/power/cost tradeoff is there in both directions.

I'm curious what instructions you'd have the SGX issue to the SPUs. Could SPUs also be made to appear as a form of shader to the SGX core? Perhaps set to a 'slave' mode when it comes to graphics rendering and taken out of the 'SPU pool' for tasks on an SPU scheduler?
The SPU ALUs do certain things very well that we wouldn't readily devote area to, so in such a design you'd want Rogue to be able to do compute there in a general sense, I think, if they could communicate well. Even something as simple as using the SPUs to drive the Rogue CB; in return you'd want Rogue to be able to drive the SPUs directly (at least IMHO).
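One way to picture the 'slave mode' idea being discussed: some SPUs are withdrawn from the general scheduler's pool and serve a queue of compute jobs issued by the GPU, while those same SPUs push commands back into the GPU's command buffer. Everything below (class names, job strings) is invented for illustration; it is not how any real Cell or SGX interface works.

```python
# Hypothetical model of the 'slave mode' idea: SPUs removed from the general
# task pool serve GPU-issued compute jobs, and can push draw commands back
# into the GPU command buffer. All names are invented for illustration.
from collections import deque

class SPUPool:
    def __init__(self, n_spus):
        self.general = set(range(n_spus))   # SPUs visible to the OS scheduler
        self.graphics = set()               # SPUs slaved to the GPU

    def slave_to_gpu(self, spu_id):
        self.general.discard(spu_id)
        self.graphics.add(spu_id)

class FakeGPU:
    def __init__(self):
        self.compute_jobs = deque()    # jobs the GPU issues to slaved SPUs
        self.command_buffer = deque()  # commands SPUs push back to the GPU

    def issue_to_spus(self, job):
        self.compute_jobs.append(job)

pool, gpu = SPUPool(n_spus=6), FakeGPU()
pool.slave_to_gpu(4)
pool.slave_to_gpu(5)
gpu.issue_to_spus("skin_vertices")

# A slaved SPU drains the job and answers with a draw command,
# i.e. the two units drive each other in both directions:
job = gpu.compute_jobs.popleft()
gpu.command_buffer.append(f"draw_after:{job}")
```

The two queues capture the symmetry Rys mentions: SPUs feeding the GPU's command buffer, and the GPU issuing compute work to the SPUs in return.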
 
Rys, if you are going to respond to our posts, would you be able to share some examples of how mixed SPU and SGX cores could be used together in a meaningful and effective way? (e.g. doing what real-world tasks, solving what real-world problems?)
 
Has anyone heard anything about DX12? I was under the impression that it wouldn't come out anytime in the next 2 to 3 years. I expected everyone to have gotten sidetracked by the whole Larrabee thing and therefore to be behind.

Despite what I said about penny-pinching before, I also think it would be stupid for Microsoft to once again design a console that isn't compliant with the new graphics standard they're trying to push in the PC market. I think Xenos not being DirectX 10 compliant (no geometry shaders) hurt them a bit in both the PC and console markets. IMO the next console should be compliant with whatever hardware advancements become standard with the next version of DirectX.
 
Where's the DX12 for Xbox come from? I haven't been keeping up on rumours (best go read the rest of this thread, I guess!). Anyway, we're looking to DX12 for future designs, just like everyone else.
Just going by numbers. ;) DX11 is out now, so I expect the next iteration in time for a late XB3 release. But maybe it won't be out. That said, I think the impression is that mobile parts lack features of desktop parts. The SGX 5 series in NGP is DX9-level when DX11 is out. There's been no mention of Rogue being DX11 compatible, so people are expecting it not to be, I guess.

There are definitely elements of Rogue you wouldn't want to spend area on for a fixed console design. There are also elements you'd want to improve on or add from scratch too, so the area/power/cost tradeoff is there in both directions.
It's a shame there are no rumours of Sony looking into this. It'd need a lot of work and they wouldn't be able to pull something like this off at the last minute. I guess it's just a pipedream. :(

This design would also fit my Grand Vision perfectly. You could have an SGXCell in a base unit and another in a tablet and, with an architecture that's all about jobs and distributed scheduling, seamlessly combine the two's resources when docked. Although we'd need a cross-device bus that'd allow fast interconnects.
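A toy sketch of the 'dock and combine' idea: a scheduler that merges the worker pools of a base unit and a tablet when docked, and dispatches jobs in proportion to each device's throughput. This is entirely hypothetical, with invented names and made-up relative performance numbers.

```python
# Toy sketch of combining a base unit's and a tablet's resources when docked:
# a job scheduler that always hands the next job to the device with the
# lowest outstanding-work-to-performance ratio. Entirely hypothetical.
class Device:
    def __init__(self, name, relative_perf):
        self.name = name
        self.relative_perf = relative_perf  # assumed throughput weighting
        self.jobs = []

class DockScheduler:
    def __init__(self):
        self.devices = []

    def dock(self, device):
        """Docking a device simply adds its resources to the shared pool."""
        self.devices.append(device)

    def dispatch(self, jobs):
        """Hand each job to the least-loaded device, perf-weighted."""
        for job in jobs:
            target = min(self.devices,
                         key=lambda d: len(d.jobs) / d.relative_perf)
            target.jobs.append(job)

sched = DockScheduler()
base = Device("base", relative_perf=3.0)    # assumed 3x tablet throughput
tablet = Device("tablet", relative_perf=1.0)
sched.dock(base)
sched.dock(tablet)
sched.dispatch([f"job{i}" for i in range(8)])
# Work lands roughly 3:1 in favour of the faster base unit.
```

In a design that's "all about jobs and distributed scheduling", docking really would reduce to adding another device to the pool; the hard part, as noted above, is the cross-device interconnect, which this sketch ignores entirely.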

I don't suppose this line of discussion is really valid in this thread though, which is supposed to be about predicting next-gen, and not designing one's own console!
 
Thunderbolt could be a tech solution, but I don't know whether it can be licensed for non-Intel products?

I would think so. Intel sells the Thunderbolt controller chip and should be happy if the standard takes off compared to USB 3. Limiting it in any way would harm the standard unnecessarily.


It's a shame there are no rumours of Sony looking into this. It'd need a lot of work and they wouldn't be able to pull something like this off at the last minute. I guess it's just a pipedream. :(

Touché!

This design would also fit my Grand Vision perfectly. You could have an SGXCell in a base unit and another in a tablet and, with an architecture that's all about jobs and distributed scheduling, seamlessly combine the two's resources when docked. Although we'd need a cross-device bus that'd allow fast interconnects.

I don't suppose this line of discussion is really valid in this thread though, which is supposed to be about predicting next-gen, and not designing one's own console!

Argh! Stop it. Stop teasing the idea! I'm trying to forget it.
 
Marketing for one. If Sony were to stick with Power architecture for the CPU and use PowerVR for graphics they would really have a complete Power console. I wonder how they would market it and if they could do so successfully.

It's not like IMG doesn't have experience in the console space. The PVR2DC in the Dreamcast was a beast of a chip when the machine launched. The fact that Sony picked them for NGP at least makes them as likely a contender for PS4 as Nvidia or AMD. An 8- or 16-core variant of Rogue would, I think, make for a nice GPU for PS4.
 
Has anyone heard anything about DX12? I was under the assumption that it wouldn't come out anytime in the next 2 to 3 years.
Shader Model 6 and/or DX12 most likely comes with the next-generation chip from Nvidia around 2013; not sure about AMD's roadmap.
 
Ha ha... I don't dare to dream in that direction to prevent any disappointment. I think Sony will miss their target date again if the system is too complex. I like the idea of mixing SPU and specialized GPU cores though.

I've been prepared for disappointment for over 34 years... (a gamer since 1977, and coding since 1983; today, not anymore)

But I think (dreaming) it would be the best option: a cost-effective way to maintain backwards compatibility with the PS3 ("mode 2" = Cell + SGX600 as pixel engine), while also helping developers who want to work with a more user-friendly processor ("mode 1" = ARM A15 + the same SGX600, customised "with steroids"), and also staying compatible with the NGP and mobile-phone universe.
 
Marketing for one. If Sony were to stick with Power architecture for the CPU and use PowerVR for graphics they would really have a complete Power console. I wonder how they would market it and if they could do so successfully.

It's not like IMG doesn't have experience in the console space. The PVR2DC in the Dreamcast was a beast of a chip when the machine launched. The fact that Sony picked them for NGP at least makes them as likely a contender for PS4 as Nvidia or AMD. An 8- or 16-core variant of Rogue would, I think, make for a nice GPU for PS4.

So why doesn't one hear of PowerVR in the Direct X space?
 