Predict: The Next Generation Console Tech

The fact that the article mentions "very affordable" hardware might indicate the PS4 will only have an APU instead of a combination of APU + discrete GPU.

This is Sony supposedly talking; $499 retail could be "affordable" in their terminology. Who knows! I know it's only an early dev kit, but no way is the A10 capable of running next-gen games at the stated 1080p60 3D. Still plenty to be revealed yet.
 
PS4 discrete GPU on modular daughterboard.

Is this a statement founded in insider knowledge or just a suggestion? :) I'm very intrigued.

Also, sweetvar on NeoGaf, who we know to be reliable as he leaked solid info about Orbis and Durango before, pretty much confirmed that Sony's PS4 will have an APU + discrete GPU.

I don't see it being anything less :)
 

An APU + 88XX GPU and 4+ GB of fast RAM would be perfect, but who knows...
 
I'm still wondering why people are all over this APU+GPU thing. If you want an expensive machine, that's the way to go.

Pretty sure it's only for alpha/beta devkits to simulate the final APU. Or they are still undecided if an APU will be possible for the launch spec. They might be forced to separate the CPU and GPU.
 
What is better for a game console, a CPU like Bulldozer with AVX, or an APU with a good GPU inside?

Add a powerful discrete GPU in both cases.
 
I doubt any dedicated hardware will make it into the PS4 for PS3 BC, though considering the cost of Xenos vs. RSX it could also cost less; HDD costs have fallen (unless they go for SSD), and the latter could pretty much be a negligible cost.

Sony did file that patent for an outboard backwards compatibility module as well, and they spec'ed a gigabit ethernet port for the PS3 from the beginning. Maybe they'll be able to use that 7th SPU to do some video compression coming off of a PS3 and squirt the data out the gigE port to the PS4.

Or, you know, not. :???:
 
Pretty sure it's only for alpha/beta devkits to simulate the final APU. Or they are still undecided if an APU will be possible for the launch spec. They might be forced to separate the CPU and GPU.

If they use stacking and have extremely high bandwidth RAM on top of an APU, that might be pretty decent, no? Or would that bandwidth be extraneous due to Amdahl's law?
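
To put the Amdahl's law point in concrete terms, here's a minimal sketch; the fractions and speedup factors are invented purely for illustration, not estimates of any real workload:

[code]
# Amdahl's law: overall speedup when only part of the work gets faster.
# All fractions and factors below are made-up examples, not console estimates.

def amdahl_speedup(limited_fraction, factor):
    """Speedup of the whole frame when `limited_fraction` of it is sped up by `factor`."""
    return 1.0 / ((1.0 - limited_fraction) + limited_fraction / factor)

# Hypothetically, if 60% of a frame is bandwidth-bound and stacked RAM
# triples effective bandwidth, the whole frame only gets ~1.67x faster:
print(amdahl_speedup(0.6, 3.0))   # ~1.67

# If only 20% of the frame is bandwidth-bound, 3x the bandwidth buys just ~1.15x:
print(amdahl_speedup(0.2, 3.0))   # ~1.15
[/code]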
 
http://www.eurogamer.net/articles/df-hardware-amd-virgo-review

A10 - this is what the Wii U should have been at its current price.
Well the power consumption is not even in the same range as what Nintendo wanted.
--------------------------------------------------------------------------------------------------

Other than that, AMD's latest roadmap confirmed that Steamroller is delayed. Too bad, as it seems it was to bring meaty improvements to multi-threaded performance that might have justified the power budget.
I'm not sold on a Piledriver-based design; it would steal a lot of silicon and power from the GPU / rest of the system.
All AMD will have for 2013 is reworked Piledriver. Luckily it seems that the Jaguar cores are on time.

I would definitely prefer a Jaguar-based system, especially if Sony wants to deliver something like a "Wii for core gamers". I'm not that pleased with too-big, too-noisy, and ultimately too-expensive systems. I've been advocating for this for a while, and even though it's just a rumor I would be pleased if Sony made a sensible choice instead of further digging its own grave... :(

Speaking of which, as the last rumor speaks of an APU + GPU design, I don't find that a very compelling option. At this point, if they want performance for cheap they might be better off putting two APUs head to head. That way they could lower R&D costs, produce only one chip, and even use salvage parts (* more on this later). It could look like this: DDR3 <=> APU1 <=> APU2 <=> DDR3
You need a fast and coherent link between the two APUs, but then there are benefits: you end up with two 128-bit buses to DDR3, and so twice the bandwidth provided to something like Trinity.
Another win is you have more CPU resources to play with.
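
To put rough numbers on the two-bus point, a quick back-of-the-envelope sketch; the DDR3-1866 speed grade is only an assumed example (roughly what Trinity supports), not a known spec:

[code]
# Peak DDR3 bandwidth for one vs. two 128-bit interfaces.
# The DDR3-1866 transfer rate is an assumed, illustrative figure.

def ddr3_bandwidth_gbs(bus_width_bits, transfers_mts):
    """Peak bandwidth in GB/s: (bus width in bytes) x (millions of transfers/s)."""
    return (bus_width_bits / 8) * transfers_mts * 1e6 / 1e9

single_apu = ddr3_bandwidth_gbs(128, 1866)   # ~29.9 GB/s, Trinity-like
dual_apu = 2 * single_apu                    # ~59.7 GB/s with two independent buses
print(single_apu, dual_apu)
[/code]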

A perfect chip to me could be something like 4 Jaguar cores (2 MB of L2, 1.8 GHz), 8 GCN CUs, 8 ROPs, a VPU, some ARM CPU for security, and a 128-bit DDR3 bus. The chip would be somewhere between 160 mm² and 180 mm² (home-made estimate), pretty cool and cheap to produce. That's for the fully functional chip.
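
For what it's worth, a crude way to sanity-check that estimate; every per-block area below is an assumption made up for illustration (only the Jaguar figure echoes a number quoted later in the thread), not an AMD figure:

[code]
# Very rough 28nm area budget for the hypothetical single-APU die above.
# Every per-block figure is an assumed, illustrative number.
blocks_mm2 = {
    "4x Jaguar cores + 2MB L2":  20,   # ballpark also quoted later in the thread
    "8x GCN CUs":                45,   # assumes ~5-6 mm^2 per CU
    "8 ROPs + GPU caches":       15,   # assumed
    "128-bit DDR3 PHY + MC":     25,   # assumed
    "VPU + ARM security core":   10,   # assumed
    "display, I/O, misc uncore": 40,   # assumed
}
print(sum(blocks_mm2.values()))        # ~155 mm^2, same ballpark as 160-180 mm^2
[/code]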

Back to my *: Sony could even use salvage parts, pairing a fully functional chip with one that has CPU cores, CUs, etc. deactivated. That or a blend:
Like "APU1: 4 cores + 6 CUs + VPU" and "APU2: 2 cores + 8 CUs, VPU killed/fused".
Pretty much anything that matches what they are getting out of the foundry and isn't too bothersome from a software POV.
I don't know, they may want the GPUs to be identical, I could see that being practical, but be less strict on the CPU. APU2's CPU cores, for example, could run the OS, sound, etc. / any task that is not critical / performance-sensitive.
Maybe this could be more optimal (see the toy binning sketch below):
APU1 4 cores + 7 CUs + 8 ROPs
APU2 2 cores + 7 CUs + 8 ROPs

The point would be to use pretty much all the production of what should be a part with high yields.
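
A toy sketch of how that binning could work in practice, using the 4+7 / 2+7 configurations above as the two targets; the wafer test results are invented:

[code]
# Toy binning of tested dies into the two salvage configurations above.
# Target configs follow the 4+7 / 2+7 example; the die test data is invented.

APU1_SPEC = (4, 7)   # 4 Jaguar cores + 7 CUs (+ 8 ROPs)
APU2_SPEC = (2, 7)   # 2 Jaguar cores + 7 CUs (+ 8 ROPs)

def bin_die(good_cores, good_cus):
    """Assign a tested die to the most demanding config it can be fused down to."""
    for name, (cores, cus) in (("APU1", APU1_SPEC), ("APU2", APU2_SPEC)):
        if good_cores >= cores and good_cus >= cus:
            return name
    return "reject"

# Hypothetical wafer results as (working cores, working CUs):
dies = [(4, 8), (4, 7), (3, 8), (2, 7), (4, 6), (1, 8)]
print([bin_die(c, u) for c, u in dies])
# -> ['APU1', 'APU1', 'APU2', 'APU2', 'reject', 'reject']
[/code]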
 
So if it was going to be an APU, it's probably going to be 2 or 4 Jaguar cores, because Piledriver just isn't suited to a typical gaming load; it's also big and hot and power hungry. You simply aren't going to get 4 modules + a GPU of any merit in a package that can be cooled in a console case. Never gonna happen.
If they are going the Jaguar route, 2 cores aren't cutting it. And a Jaguar core is tiny (2.9mm²); 4 cores including 2 MB of L2 measure just 20mm² or something. A single BD module (also including 2 MB L2) already stands at 31mm² in 32nm (meaning it is probably at least as big as the 4 Jaguar cores even on 28nm). What do you want in a power budget of ~35W (to leave 100+W to the GPU part) and a die size of ~45-50mm² in 28nm?
(i) 8 Jaguar cores running close to 2GHz
or
(ii) 4 BD cores at maybe 2.5 to 3 GHz, both having roughly the same IPC? (Rough throughput math below.)
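
A crude way to compare the two options on paper, taking the "roughly the same IPC" statement at face value; the absolute numbers are meaningless, only the ratio matters:

[code]
# Crude aggregate-throughput comparison of options (i) and (ii) above.
# Assumes identical per-core IPC, as stated; values are relative only.

def aggregate_throughput(cores, clock_ghz, ipc=1.0):
    return cores * clock_ghz * ipc

jaguar_8 = aggregate_throughput(8, 2.0)    # option (i): 8 Jaguar cores near 2 GHz
bd_4_low = aggregate_throughput(4, 2.5)    # option (ii) at 2.5 GHz
bd_4_high = aggregate_throughput(4, 3.0)   # option (ii) at 3.0 GHz

print(jaguar_8, bd_4_low, bd_4_high)
# 16.0 vs 10.0-12.0: more peak throughput from the 8 Jaguars,
# provided the game actually scales across 8 threads.
[/code]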
 
I could believe in a dual-APU solution, or APU + A/GPU (meaning two AMD cores on the GPU itself to handle the additional processing the GPU needs that requires faster access to CPU cores).
 
You mean Sony has tortured the devs with the PS3 so long that it's now time for the next hard to program architecture? :LOL:
 
If overall performance is roughly the same, 4 bulldogs should be superior to 8 jags, as juggling lots of cores is often stressful to programmers and may lead to underutilization in a real-world situation.
 
Well the power consumption is not even in the same range as what Nintendo wanted.
--------------------------------------------------------------------------------------------------

Other than that, AMD's latest roadmap confirmed that Steamroller is delayed. Too bad, as it seems it was to bring meaty improvements to multi-threaded performance that might have justified the power budget.
I'm not sold on a Piledriver-based design; it would steal a lot of silicon and power from the GPU / rest of the system.
All AMD will have for 2013 is reworked Piledriver. Luckily it seems that the Jaguar cores are on time.

Sorry, I might have missed it, and it seems the internet has missed it too: where is this confirmed delay? Also, did you not, like, read the last frigging page or anything? Keep the clock around 3.0 GHz and you're not burning power.
 
Also, sweetvar on NeoGaf, who we know to be reliable as he leaked solid info about Orbis and Durango before, pretty much confirmed that Sony's PS4 will have an APU + discrete GPU.

How could you possibly verify that anything about Orbis or Durango is "solid info" at this point?
 
That's an assumption. We might be seeing a change in what the devkits are required to do, warranting more RAM, so that the final console is 4 GBs with a 16 GB devkit. We also don't know what the devkit RAM is. "8 or 16 GBs"? Does that mean the poster doesn't know which of the two it is, or are there two flavours of devkit with differing RAM amounts? I'd be very hesitant to call platform RAM at this point, other than a safe 4 GBs. If the rumour is true. ;)

Well, VG247 mentioned there are to be four versions of the dev kit, with January's being the final one. So that means there were already two versions in existence and now a third. So if you take the dev kits and average the numbers out, you'll probably land somewhere in the middle.

One has 8 gigs, the other 16. My money's on the PS4 probably having 8 gigabytes, because one of the kits hits 16 gigs (most likely the final version), so that basically upped the average as far as memory is concerned.


This is Sony supposedly talking; $499 retail could be "affordable" in their terminology. Who knows! I know it's only an early dev kit, but no way is the A10 capable of running next-gen games at the stated 1080p60 3D. Still plenty to be revealed yet.

Depends on the developer, but that's usually the case with ALL consoles. Don't forget PC games aren't the best source to gauge hardware. Metro 2033 and The Witcher 2 have poor support for tons of gaming rigs and waste lots of resources to run correctly.

The A10-5800 can pull off gaming with some added DX11 features like tessellation at 1080p. PC games aren't the correct source for hardware utilization.


What is better for a game console, a CPU like Bulldozer with AVX, or an APU with a good GPU inside?

Add a powerful discrete GPU in both cases.

For a console, an APU would be an "affordable" solution, which is what Sony says they want to do with the PS4 (according to VG247). Comparing the two, however, an FX is still a better CPU on its own. But they can't be compared for graphics capabilities since FX doesn't include a GPU.
 
One has 8 gigs, the other 16.
It's unclear if that's old and new devkits, or two flavours of the current devkit.
My money's on the PS4 probably having 8 gigabytes, because one of the kits hits 16 gigs (most likely the final version), so that basically upped the average as far as memory is concerned.
I consider that far from certain. As I say, it depends on what the devkit has to do, but in this age of VMs and potentially with larger services at play, the devkits may well have 4x the console RAM to accommodate other development tasks. 8 GBs of slow RAM would leave a large question mark over system bandwidth. If PS4 hasn't got eDRAM, where are those 8 GBs going to get hundreds of GB/s of BW from? Or have we got another split RAM pool?
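
To make the bandwidth worry concrete, a quick sketch of what a few hypothetical memory setups would give; the bus widths and speed grades are assumed examples, not leaked specs:

[code]
# Peak bandwidth for a few hypothetical memory configurations.
# Bus widths and data rates are illustrative assumptions only.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    """(bus width in bytes) x (data rate in GT/s) = GB/s."""
    return (bus_width_bits / 8) * data_rate_gtps

configs = {
    "8 GB DDR3-1600 on a 128-bit bus": peak_bandwidth_gbs(128, 1.6),
    "8 GB DDR3-2133 on a 256-bit bus": peak_bandwidth_gbs(256, 2.133),
    "GDDR5 at 5.5 GT/s on a 256-bit bus": peak_bandwidth_gbs(256, 5.5),
}
for name, bw in configs.items():
    print(f"{name}: {bw:.1f} GB/s")
# ~25.6, ~68.3 and ~176.0 GB/s -- plain DDR3 doesn't get anywhere near
# "hundreds of GB/s" without eDRAM or another split RAM pool.
[/code]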
 

Xbox Durango kits have 12 GB of RAM; if it is 4x the console RAM, the Xbox 720 will have 3 GB of RAM :oops: I doubt that development tasks require 9 GB of RAM.

VS2012 requirements:

Hardware requirements

  • 1.6 GHz or faster processor
  • 1 GB of RAM (1.5 GB if running on a virtual machine)
  • 10 GB (NTFS) of available hard disk space
  • 5400 RPM hard drive
  • DirectX 9-capable video card running at 1024 x 768 or higher display resolution

I know it is not the same for the XDK, but I really doubt that VMs, services, and dev tools require more than 6 GB of RAM.
 
I think the point was more that devkits usually have more memory than the retail units.
I wouldn't go drawing conclusions beyond retail units having less than what's in the dev boxes.
Especially when the devkits are glorified PCs, rather than based on final hardware.

The reason it's usually 2x is that it's the easiest thing to do if the devkit is based on the retail unit, not because devs need 2x the amount of memory, though a lot of teams screw themselves by using it in development. Best part is you can always tell who fucked up this way at E3, because they run on devkits and not testkits, and the only real reason for that is the game won't run in the final memory footprint.
 