Predict: The Next Generation Console Tech

Sony has filed a patent for a backward-compatibility module (an external module containing only the CPU, GPU, and memory, but using the console's USB ports, HDD, Blu-ray drive, power supply, output connectors, controllers, etc.). This suggests they are considering a different architecture and that they won't make any design compromises for the sake of backward compatibility.
http://www.siliconera.com/2010/09/1...nsole-to-previous-generation-console-adapter/
 
Shouldn't Kepler be considered the current Nvidia GPU?
We've yet to see the results, but they're presumably good, as the 28nm process seems to be going great.
Then you choose the sizing that meets your power requirements, rather than trying to shoehorn the flagship 250W GPU into a console.
 

Actually, I was thinking about just how feasible, say, Tahiti WOULD be in a console.

It's ~350 mm^2, similar to the ~340 mm^2 Xenos+eDRAM budget.

250-watt TDP, I believe (including RAM, cooling, etc.); just up your console power budget to 300 watts (not unrealistic, imo) and you're practically there. Downclock it to 800MHz or so to save some more.

All that's fine, but the sticking point is still the 384-bit memory bus.

I'd put in a 192-bit bus with the fastest GDDR5 I could get, and call it a day and a console :p Being limited to 1080p might help with the bandwidth constraints. Not sure if it's feasible, but a decent HD7970 GDDR5 overclock (not a bleeding-edge one) gets you about 290 GB/s; halve that and you'd have ~145 GB/s.
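
Quick sanity check on that math (a rough sketch; the 5.5 Gbps stock and ~6 Gbps overclocked per-pin data rates are my assumptions for HD7970-class GDDR5):

Code:
# GDDR5 peak bandwidth = (bus width in bits / 8) * per-pin data rate (Gbps)
def gddr5_bandwidth(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(gddr5_bandwidth(384, 5.5))  # stock HD7970: 264.0 GB/s
print(gddr5_bandwidth(384, 6.0))  # decent overclock: 288.0 GB/s (the ~290 above)
print(gddr5_bandwidth(192, 6.0))  # same memory on a 192-bit bus: 144.0 GB/s (~145)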
 

Actually, assuming the voltage could be dropped to 1.0V instead of 1.1V, and the clock to 600MHz instead of 900MHz, you're looking at a 134W GPU instead of 250W.

There are some other unnecessary bits, such as the 384-bit bus you mentioned and a ROP count well above the needs of a console, but it goes to show just how flexible the power draw can be for a top-of-the-line GPU without needing to raise the TDP to 300 watts. :cool:

Bottom line, a Tahiti-class GPU is not unrealistic to expect.
 

Dropping it to 600MHz lops off a lot of performance at that point, though. Maybe 750? :p

I don't see what's so bad about a 300-watt console anyway, if it came to that. It's not like it won't come down with later revisions. PC GPUs have certainly scaled up a lot in wattage, up to the roughly 300-watt ceiling they've apparently decided is the endgame; there's no reason consoles can't or shouldn't somewhat follow along, since the two are closely related.

The Alienware X51 (console-sized) has a 330-watt PSU, for example.

Still, like you, Chef, I believe even 200 watts can allow a pretty big GPU, or perhaps they could compromise (that word again) at 250 watts.
 

If I had my way, we'd be looking at 500-watt consoles! :devilish:

But it seems the masses would reject such a design, given the noise, heat, and size...

I still don't see why NextGen couldn't use a 17"-wide, DVR-sized case, which most people are happy with. And if the device is taking over those entertainment duties, I'm not sure why it would need to be significantly smaller. Noise can be held in check with larger fan(s), and as for heat, I don't think most people care all that much, just as they don't shop for TVs based on heat.

Anyway, pipe dream over. I think the most we can expect is a 250-watt console at a similar size to the PS3/XB360 launch units, but with a smarter cooling design (like the XB360S's side/top exhaust).


BTW, bumping that Tahiti clock up to 700MHz @ 1.0V nets 156W
750MHz @ 1.0V = 167W
800MHz @ 1.0V = 179W
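
For reference, here's the rough scaling behind those numbers (a back-of-the-envelope sketch using the usual first-order rule that dynamic power scales with clock times voltage squared, ignoring static leakage; it lands within a few watts of the figures above, which presumably came from a slightly different baseline):

Code:
# First-order GPU power scaling: P ~ f * V^2 (ignores static leakage)
def scaled_power(base_w, base_mhz, base_v, mhz, v):
    return base_w * (mhz / base_mhz) * (v / base_v) ** 2

# Baseline: the 250W @ 900MHz @ 1.1V Tahiti figures used above
for mhz in (600, 700, 750, 800):
    print(f"{mhz}MHz @ 1.0V -> {scaled_power(250, 900, 1.1, mhz, 1.0):.0f}W")
# -> 138W, 161W, 172W, 184W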
 
Xbitlabs:

AMD Working on Graphics Processor for PlayStation 4 - Report.
Sony PlayStation 4 May Be Powered by AMD

[02/23/2012 04:28 PM]
by Anton Shilov


It is well known that Sony Corp. is developing the next iteration of its popular PlayStation console, the PlayStation 4. What remains unknown is the hardware that will power it. According to a media report, the PlayStation 4 will utilize graphics processing technology designed by Advanced Micro Devices.

Former employees of AMD told the Forbes web-site that the company is working on graphics processing technology for the next-generation PlayStation 4 video game console. The ex-employees did not provide any details or evidence about the actual proceedings and, naturally, remained anonymous.

At present the information should be considered a rumour, as a custom AMD Radeon graphics chip inside the PS4 means that Sony will either have to drop compatibility with PS3 titles on its new console or pay additional royalties to Nvidia Corp., whose chips power the current PlayStation 3 system.

If the rumours about AMD's custom Radeon graphics processors inside both the Xbox Next (Loop, Durango) and the PlayStation 4 are correct, then the company has reason to celebrate: it is a massive success to power the next-generation consoles from all three major platform holders (Microsoft, Nintendo, and Sony). Such a position in the market may be very favourable for AMD, as it will allow it to scale its graphics processing architecture beyond consoles, which are the primary game platforms nowadays, to the new types of hardware that will be the gaming platforms of tomorrow.



Among other advantages, having the Radeon HD architecture inside every next-generation console will give AMD a major advantage in the personal computer market, as all major game designers will have to optimize their titles for AMD's architecture, and therefore Radeon graphics chips for PCs will have an advantage over competing solutions.

However, developing three separate graphics cores for the next-gen video game consoles means that AMD will have to pull resources from other projects, such as next-generation GPUs for PCs or ultra-portables, and dedicate them to the development of new solutions for the console platform holders.

AMD and Sony did not comment on the news story.

http://www.xbitlabs.com/news/multim...phics_Processor_for_PlayStation_4_Report.html
 
At present the information should be considered a rumour, as a custom AMD Radeon graphics chip inside the PS4 means that Sony will either have to drop compatibility with PS3 titles on its new console or pay additional royalties to Nvidia Corp., whose chips power the current PlayStation 3 system.
Is it just me, or is that kind of reasoning complete BS? RSX was a relatively simplistic design with nothing extraordinary about it that might make it complicated to emulate. Pretty much the biggest problems will be getting permission from NV to actually do it (if that's even needed) and somehow finding a way to run the SPE code fast enough on whatever architecture their CPU will use.
 
Yes, pretty much. They'd need to get libgcm running on the new GPU, which shouldn't be too difficult and would effectively be a design spec for the new GPU. That takes care of most if not all of the BC issues GPU-wise. Not sure what they expect would need NV's permission, or why it wasn't already part of the agreement with Sony in the first place.
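
To illustrate the idea (purely a toy sketch; the opcodes, packet format, and "new GPU" commands are all hypothetical, not actual libgcm or RSX internals): a compatibility layer would walk the command buffers a PS3 title pushes through libgcm and re-encode each packet for the new GPU, and anything a title did below that level is exactly where BC would break:

Code:
# Toy sketch of a command-translation layer. Opcodes, packet format,
# and "new GPU" commands are made up for illustration.
new_gpu_queue = []  # stands in for the new GPU's command ring

def translate_set_render_target(args):
    new_gpu_queue.append(("NEWGPU_SET_RT", args))

def translate_draw(args):
    new_gpu_queue.append(("NEWGPU_DRAW", args))

# Dispatch table: hypothetical RSX-era opcode -> translator for the new GPU
TRANSLATORS = {
    0x1808: translate_set_render_target,
    0x1810: translate_draw,
}

def replay_command_buffer(packets):
    """Walk the old command buffer; translate what we can, flag what we can't."""
    for opcode, args in packets:
        handler = TRANSLATORS.get(opcode)
        if handler is None:
            # Titles poking the hardware below the libgcm level land here;
            # these are the cases that would break BC.
            raise NotImplementedError(f"untranslatable command {opcode:#06x}")
        handler(args)

replay_command_buffer([(0x1808, {"surface": 0}), (0x1810, {"verts": 3})])
print(new_gpu_queue)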
 
MS had to pay nVidia for aspects of the Xbox BC in the 360. It's entirely possible that Sony may be in a similar position depending on the licence they currently have.
 
Is it just me, or is that kind of reasoning complete BS? RSX was a relatively simplistic design with nothing extraordinary about it that might make it complicated to emulate. Pretty much the biggest problems will be getting permission from NV to actually do it (if that's even needed) and somehow finding a way to run the SPE code fast enough on whatever architecture their CPU will use.

It's about the low-level stuff being used. Proprietary nV architecture junk. It's no different from what happened between the Xbox and 360. There's some really peculiar stuff down at the metal that some devs have taken advantage of, but that's all I can say really.
 
MS had to pay nVidia for aspects of the Xbox BC in the 360. It's entirely possible that Sony may be in a similar position depending on the licence they currently have.

Certainly it's possible, but they were very different agreements. MS bought chips from NV where Sony bought an IP. Sony also had the benefit of hindsight (from MS's agreement) on how not to get into an agreement with NV.
 
It's about the low-level stuff being used. Proprietary nV architecture junk. It's no different from what happened between the Xbox and 360. There's some really peculiar stuff down at the metal that some devs have taken advantage of, but that's all I can say really.

If it's that unique, though, then there's no reason to believe it would work on a current NV GPU either, as at that level they are as different from RSX as anything ATI or ImgTec would offer.

So, you get libgcm up and running, and if a game went lower than that and you can't emulate that functionality, then it's not BC. That still gets you BC for the majority of games, and the ones it doesn't cover can either be patched or not. All the multiplats have an ATI version of the game anyhow, so they wouldn't be reinventing the wheel with a patch.
 
If it's that unique, though, then there's no reason to believe it would work on a current NV GPU either, as at that level they are as different from RSX as anything ATI or ImgTec would offer.

Indeed.
where Sony bought an IP.

You probably already knew/meant this, but rather they bought a license to manufacture a design. nV would never sell them the architecture IP to G7x. :p
 
As for the close coupling between the SPEs and RSX, which some devs are using to do some very unorthodox things: would that be difficult to emulate?
 