Predict: The Next Generation Console Tech

Exotic doesn't have to mean hard to program for. Xenos was a pretty exotic GPU at the time of release if you ask me, as was the 360's unified memory architecture, yet it still resulted in a relatively easy-to-program-for device. The problem isn't exotic hardware per se, but hardware that requires a significant change in programming methods to make the most of it. It's entirely possible for new exotic hardware not to require a big shift in programming practices behind the scenes.
 
Exotic designs seem to be what pushes the industry forward, allowing developers and the like to explore different ways of doing things. Would we have MLAA if the exotic architecture didn't both cause and allow it to happen? If the machine had what the developers wanted, they wouldn't have needed to develop MLAA, and therefore we probably wouldn't have it, right?
 
The concept of detecting edges and post-processing is hardly attributable to Cell. On PC there hasn't been a great need for such a hack since MSAA performance was good enough, but we're entering a stage where MSAA is too costly for most folks, so now it's an alternative. Even before then, there had been EDAA attempts.
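For what it's worth, the basic edge-detect-and-blend idea is simple enough to sketch in a few lines. This is just a toy illustration of the general post-process approach (plain NumPy, nothing like the real MLAA algorithm, which reconstructs edge shapes and computes coverage-based blend weights):

```python
import numpy as np

def edge_blend_aa(img, threshold=0.1):
    """Toy post-process AA on an (H, W, 3) float image in [0, 1]:
    find luminance discontinuities, then blend those pixels with
    their neighbours to soften the stair-stepping."""
    # Rough luminance per pixel.
    luma = img @ np.array([0.299, 0.587, 0.114])

    # Flag pixels whose luminance jumps sharply to the right or below.
    dx = np.abs(np.diff(luma, axis=1, append=luma[:, -1:]))
    dy = np.abs(np.diff(luma, axis=0, append=luma[-1:, :]))
    edges = dx > threshold
    edges |= dy > threshold

    # 4-neighbour average for the interior, applied only on edge pixels.
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                           img[1:-1, :-2] + img[1:-1, 2:]) / 4.0

    out = img.copy()
    out[edges] = 0.5 * img[edges] + 0.5 * blurred[edges]
    return out
```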
 
If NGP is anything to go by, I doubt we'll see another exotic architecture again.

There isn't any need, nor is it cost-friendly. It would also most likely be detrimental to performance.

We moved from all exotic so to speak in PS2, to normal GPU and half normal (PPU) half Exotic (SPU) CPU in PS3...so you're basically looking at 1/4 exotic heh. How much longer before that fraction is stamped out?
 
Depends on what you mean by exotic. Looking at all the current-gen consoles, RSX seems like the only off-the-shelf part to me, and it gets the most stick for it ;) Some parts from some machines are just more exotic than others!
 
The concept of detecting edges and post-processing is hardly attributable to Cell. On PC there hasn't been a great need for such a hack since MSAA performance was good enough, but we're entering a stage where MSAA is too costly for most folks, so now it's an alternative. Even before then, there had been EDAA attempts.

Agreed, and that is why I think the exotic tech is important: with Cell and the lack of an equally capable GPU, the devs had to explore other methods, and that led them to MLAA. It also led them to produce code that had to run on multiple cores (so to speak) and made a valid case for multi-core CPUs.

I just look at the things that both the 360 and PS3 taught developers because the architecture wasn't like a typical PC or even past-generation consoles. PC tech is so stagnant because of legacy that I don't want that for consoles. Would the PC version of the Frostbite 2 engine be as good if DirectX didn't follow more in line with what the consoles are doing?

I think the more exotic the tech, the better the chance we see more improvements in DirectX. I would love for PC hardware manufacturers to pay closer attention to the console world and start developing fundamentally different products based on the strengths of these machines instead of relying on pure brute force and insane amounts of VRAM.

I might be wrong, but today is Friday and I really don't care :cool:
 
I don't think DirectX was influenced much by the consoles. And you do know PC hardware vendors designed the console GPUs, right? They pay very close attention to consoles, and current PC GPUs are way more capable than console GPUs. IMO the only thing exotic about consoles is Cell. Xenos may have seemed exotic at the time, but no longer.
 
I don't think DirectX was influenced much by the consoles. And you do know PC hardware vendors designed the console GPUs, right? They pay very close attention to consoles, and current PC GPUs are way more capable than console GPUs. IMO the only thing exotic about consoles is Cell. Xenos may have seemed exotic at the time, but no longer.

PC GPUs are only more capable because they get a chance to progress; it's not like I can pop a newer GPU into my 360 or PS3. In fact, I would almost wager that you would be hard-pressed to find a game on the PC today, running on 2005 PC GPU hardware, that could compete visually with some of the current crop of console games. Even if I'm wrong, the distance between the PC GPUs of 2005 and the games being produced for the consoles in 2010-11 means that with substantially less hardware (back then I had 2GB of RAM and a card with 512MB in my PC) the difference isn't much.

Let's face it, the long-running discussions about Killzone 3, Halo, Crysis and Uncharted graphics wouldn't even be around if games looked like that 5 years ago on the PC. So yes, I do attribute that to the exotic hardware of both machines. A closed system, yes, but a different way of getting performance from the machines without solely relying on the brute force of the GPU.

Just my opinion
 
Agreed, and that is why I think the exotic tech is important: with Cell and the lack of an equally capable GPU, the devs had to explore other methods, and that led them to MLAA. It also led them to produce code that had to run on multiple cores (so to speak) and made a valid case for multi-core CPUs.
MLAA was initially developed by Intel, presumably as research into an effective AA solution for Larrabee, although there may not have been a target hardware architecture in mind and it was just a pure algorithm. MLAA was adopted on PS3 because it is a good fit for the available resources. Other systems use MSAA, which is a better fit because that technique is incorporated in the hardware design. MSAA itself was developed as a fast approximation of SSAA. SSAA was developed to overcome the limitations of discrete pixel-unit display devices. Every technique is an approximation, a short-cut or a compromise to work within hardware limits. Hence:

I just look at the things that both the 360 and PS3 taught developers because the architecture wasn't like a typical PC or even past-generation consoles. PC tech is so stagnant because of legacy that I don't want that for consoles. Would the PC version of the Frostbite 2 engine be as good if DirectX didn't follow more in line with what the consoles are doing?
...is untrue. Given finite resources and a need for smart approximations of real-world photon-based imagery, hardware and software developers would always be looking into new techniques. Everything that exists in the DirectX GPU space has evolved from PC architectures independently of consoles and will continue to do so even if there are no more consoles.
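To make that lineage concrete, here is a minimal sketch (plain NumPy, not any particular GPU's implementation) of the SSAA end of the chain: shade a k x k grid of samples per display pixel, then box-filter down. MSAA keeps the extra coverage samples but shades only once per pixel, which is why it's the cheaper approximation; MLAA drops the extra samples entirely and works on the final image instead.

```python
import numpy as np

def ssaa_resolve(samples, k=4):
    """Resolve a supersampled frame: 'samples' holds k*k shaded samples
    per display pixel, i.e. an array of shape (H*k, W*k, 3). Averaging
    each k*k block gives the anti-aliased display-resolution image."""
    h, w = samples.shape[0] // k, samples.shape[1] // k
    blocks = samples[:h * k, :w * k].reshape(h, k, w, k, -1)
    return blocks.mean(axis=(1, 3))
```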

I don't think DirectX was influenced much by the consoles. And you do know PC hardware vendors designed the console GPUs, right? They pay very close attention to consoles, and current PC GPUs are way more capable than console GPUs. IMO the only thing exotic about consoles is Cell. Xenos may have seemed exotic at the time, but no longer.
Smart eDRAM is exotic. ;)
 
As were unified shaders and MEMEXPORT when it launched.
But they weren't exotic because of consoles. Had R400 not been canned, these features would have been on the PC first, though MEMEXPORT wouldn't have been exposed. Hence the true distinguishing feature of consoles is the static platform that allows all useful features to be exposed.

Shifty, I agree that eDRAM has remained an exotic feature; I was just trying to keep my post short. ;)
 
TSMC last mentioned its 20nm plans in April 2010, so it's hard to say if anything has changed. Perhaps not, which would mean "safe" production may begin during 2013.

http://www.eetimes.com/electronics-news/4088580/TSMC-skips-22-nm-rolls-20-nm-process

Global Foundries just rolled out 28nm, so 20/22nm may be viable for console systems come mid-late 2013.


Thanks, that fits my prediction that the next Xbox will be launched in 2013. The next-generation AMD GPU will be in production in 2012, I assume. Thus, if MS continues to use an AMD GPU, their next-generation system may use a 20/22nm version of those GPUs. I think it is safe to expect at least HD6970-level GPU performance in the next Xbox.
 
Let's face it, the long-running discussions about Killzone 3, Halo, Crysis and Uncharted graphics wouldn't even be around if games looked like that 5 years ago on the PC. So yes, I do attribute that to the exotic hardware of both machines. A closed system, yes, but a different way of getting performance from the machines without solely relying on the brute force of the GPU.

Just my opinion

By that logic consoles should have been producing graphics like KZ3 etc. from day one. It's not the exotic architecture that produces those results; it's the fact that developers have time to focus on a single architecture and optimise for it to a huge degree.

It's no more realistic to expect G70-class hardware to have been producing games like Crysis 2 back in 2005 than it is to expect the PS3 to have done so. It takes time, regardless of platform, to get that kind of performance from an architecture. On consoles, developers have that time. On PCs they don't, so older GPUs lose ground in newer games, but that's made up for (and more) by newer, more powerful GPUs.

I'd be willing to bet a lot that if the PS3 had used a regular dual-core x86 CPU and an off-the-shelf R580 when it launched, you'd be getting better results than what you're seeing today.
 
Aside from multiplatform issues, the biggest issue was that it was pretty limited and cost too much performance to do anything fancy. I'd say it was more about getting developers' feet wet in time for when a more robust spec could be ratified (DX11) than a particularly exotic part. It shared quite a lot with the DX9 workaround that ATI was demoing with R6xx.
 
Regarding this exotic stuff, was the 360's tessellator seen as an exotic thing back then?

I'd say that, aside from the eDRAM, Xenos was less exotic and more a half-step towards ATI's then-unreleased next-generation PC architecture. It was still on the PC development path (more or less) but was released earlier than its PC counterpart.
 
Never mind, the edited post makes it clearer.

The only games I know of using it are Halo Reach, Halo Wars and Rare's games.


It's probably a lot more than that considering how limited it is in functionality - it's just not worth mentioning. For example, Alone in the Dark 5 used it. :p
 