Predict: The Next Generation Console Tech

Was that DDR4 from the leaked 2010 doc?

Does DDR4 even have a roadmap?

Also, even if DDR4 is available, would its price not make going with, say, 4GB GDDR5 a better option?

I imagine it'd be cheaper than GDDR5 as it's still slower. DDR4 was likely just guessing on their part. I think everyone hopes they're using GDDR5. Perhaps the memory arch is split (like PS3) and we're just hearing the combined numbers.
 
I think it's already been said from multiple people that the memory is 4GB GDDR5 plus 4GB of DDR3.

Whether they're just parroting the same old/fabricated rumours however is unclear.

I'd expect at least some GDDR5 in xbox3, even with their eDRAM.
 
I think it's already been said from multiple people that the memory is 4GB GDDR5 plus 4GB of DDR3.

Whether they're just parroting the same old/fabricated rumours however is unclear.

I'd expect at least some GDDR5 in xbox3, even with their eDRAM.

I think 4 GB of GDDR5 with eDRAM is a damn good combination (they could essentially do high res texture packs for every game if they wanted), and I can't see them dropping all of that BOM cost on RAM and only putting a Cape Verde equivalent in there. Doesn't make sense.
 
3GB is a lot, but it depends on how much always available functionality they have planned. Bear in mind they can never revise the amount up after launch games ship.
 
Not really, both 360 and ps3 came out with surprisingly low amounts of memory for the time.

Nothing packed more GDDR3 than the 360 back in 2005. The only device I'm aware of that had as much was the Nvidia 7800 GTX 512 - a limited availability showpiece version of their top end card. iSuppli pegged the launch 360 as having $65 worth of RAM - that's more expensive than 2GB of GDDR5 on today's top end cards.

I recall some surprise at getting 512MB instead of just 256MB, but none that it didn't have a gigabyte*.

*[Edit] Actually, come to think of it, I do recall some surprise in the PC gaming crowd, where people were comparing directly with DDR1/2 from newegg and main ram quantities in 1337 gaming rigs. [/Edit]
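
Just to sanity check that comparison in rough numbers (the per-GB GDDR5 price below is my own assumption for illustration, not a quoted figure):

```python
# Back-of-envelope RAM cost comparison. Only the $65 iSuppli figure is from the
# discussion above; the GDDR5 per-GB price is an assumption for illustration.

xbox360_ram_gb = 0.5            # launch 360: 512 MB of GDDR3
xbox360_ram_cost = 65.0         # USD, per the iSuppli teardown estimate
print(f"Launch 360 RAM: ${xbox360_ram_cost / xbox360_ram_gb:.0f}/GB")

assumed_gddr5_per_gb = 15.0     # USD/GB on a current high-end card (assumed)
gddr5_2gb_cost = 2 * assumed_gddr5_per_gb
print(f"2GB GDDR5 today (assumed): ${gddr5_2gb_cost:.0f} total")
print(f"Launch 360 RAM cost higher? {xbox360_ram_cost > gddr5_2gb_cost}")
```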
 
3GB is a lot, but it depends on how much always available functionality they have planned. Bear in mind they can never revise the amount up after launch games ship.

And unlike a PC there may not be a HDD allowing for a sizeable swapfile...
 
I think 4 GB of GDDR5 with eDRAM is a damn good combination (they could essentially do high res texture packs for every game if they wanted), and I can't see them dropping all of that BOM cost on RAM and only putting a Cape Verde equivalent in there. Doesn't make sense.

4GB GDDR5? I'd rather they lower it to 2GB GDDR5 and spend the budget on a 2-3x better GPU.
 
Yeah, presently it's samples this year, out in numbers next year, replaces DDR3 in the mass market in 2014/2015.

I think going DDR4 for a console to be released into holiday sales next year would be viable, if risky.



DDR4 should be less expensive than any of the alternatives (ex. stacking) for the majority of the lifecycle of nextgen. It might be a tad pricy in the beginning...
That's what I was thinking: use a rather expensive memory short term, with the advantage of a respectable-sized bus, and with DDR4 set to be the standard for 5-7 years or so, prices will come down very quickly.

I actually brought this up a while ago when peeps thought the 720 was arriving this year, so it wasn't very popular... I moved on to Rambus, which went dead... so now I'm going back to DDR4 @ 4GB :)
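
To put some rough numbers behind the wide-bus-plus-DDR4 idea, here's a quick sketch; the bus widths and per-pin data rates are assumptions for illustration, not rumoured specs:

```python
# Peak bandwidth = (bus width in bytes) x (effective per-pin data rate).
# All configurations below are illustrative assumptions, not leaked specs.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gtps

configs = {
    "128-bit DDR3-1600":       (128, 1.6),
    "256-bit DDR4-2133":       (256, 2.133),
    "128-bit GDDR5 @ 5.5GT/s": (128, 5.5),
    "256-bit GDDR5 @ 5.5GT/s": (256, 5.5),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {peak_bandwidth_gbs(width, rate):.0f} GB/s")
```

Even on a wide bus, DDR4 still trails GDDR5's per-pin rates by a fair margin, which is where the eDRAM would presumably have to pick up the slack.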
 
I think 4 GB of GDDR5 with eDRAM is a damn good combination (they could essentially do high res texture packs for every game if they wanted), and I can't see them dropping all of that BOM cost on RAM and only putting a Cape Verde equivalent in there. Doesn't make sense.

First, again, it's supposedly DDR3/4 vs GDDR5; they're not comparable, the cost metrics (and performance) are different.

Second, what if they want all that RAM for all the set top box stuff the console does (not necessarily Kinect, as my original theory), but de-emphasize the GPU because it's only crucial for games? That's my take.

I don't like the design philosophy, but it looks to me like you'll end up with a console in a good overall technical position vs PS4 anyway, so I don't guess it matters how they got there or if it wasn't even their intention.
 
I think it's already been said from multiple people that the memory is 4GB GDDR5 plus 4GB of DDR3.

Whether they're just parroting the same old/fabricated rumours however is unclear.

I'd expect at least some GDDR5 in xbox3, even with their eDRAM.

Really!? If that's even remotely true then we are in for a treat, because you don't spend that kind of money on RAM and have a low end CPU and GPU to connect it with. Microsoft knows how to balance a console.
 
If there is eDRAM in the design I'm not sure of the benefit of using expensive GDDR.
Same thing with the noise about 4GB of GDDR5 plus 4GB of DDR3: it doesn't make sense if there is eDRAM in the design.
It seems that Sony is likely to lead in perf at this point. With the constraints MS have with their 720 (low price, low power consumption, etc.), if they want to compete for the performance lead they may indeed push back the launch till 2014 and 22nm availability.
Either way they'll have to sit on a lot of their design goals.
 
Is the use of DDR4 even relevant to the CPU side? How insensitive to memory bandwidth is Jaguar? I'm just wondering what the cost/benefit diagram looks like for unified/split memory architectures and memory types.
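
To make the unified vs split question concrete, here's a toy model; every capacity, bandwidth and per-GB cost figure is an assumption for illustration, and Jaguar's actual bandwidth sensitivity would need real measurement:

```python
# Toy cost/benefit model for unified vs split memory pools.
# All sizes, bandwidths and per-GB costs are assumptions for illustration only.

def pool(size_gb, bandwidth_gbs, cost_per_gb):
    return {"size": size_gb, "bw": bandwidth_gbs, "cost": size_gb * cost_per_gb}

# Assumption: a low-clocked Jaguar cluster only needs a modest slice of bandwidth,
# so a split design can park the CPU on cheap DDR3/4 and keep GDDR5 for the GPU.
layouts = {
    "4GB GDDR5 (256-bit, unified)": [pool(4, 176, 15)],
    "4GB DDR4 + 2GB GDDR5 (split)": [pool(4, 68, 5), pool(2, 88, 15)],
}

for name, pools in layouts.items():
    size = sum(p["size"] for p in pools)
    bw   = sum(p["bw"] for p in pools)
    cost = sum(p["cost"] for p in pools)
    print(f"{name}: {size} GB, {bw} GB/s aggregate, ~${cost:.0f} (assumed prices)")
```

The catch is that the split design's aggregate bandwidth only helps if the CPU really is that insensitive; the GPU can't borrow the DDR pool's bandwidth for free.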
 
No argument here. 2GB is sufficient for res capped at 1080p.
I wouldn't be so quick to make such sweeping generalizations regarding a device that's liable to have another near-ten-year lifespan. You wouldn't want to rashly make decisions that could have far-reaching negative consequences that last for many years and years.

For example, Unreal Engine 4, with its voxel tree-based lighting system. Allegedly, this system gives great results at a high memory cost. Do we really want an entire new generation of consoles permanently unable to run any UE4 games effectively and with full fidelity, hm?

Or more in general, if the hardware resources to experiment with new rendering technology are lacking, then no innovation is possible, and gaming and gamers suffer as a result. Why would we want that? Just a bigger GPU doesn't necessarily open any innovation doors in and of itself. That'd simply be more of the same, not 'more of the new'.
 
I wouldn't be so quick to make such sweeping generalizations regarding a device that's liable to have another near-ten-year lifespan. You wouldn't want to rashly make decisions that could have far-reaching negative consequences that last for many years and years.

For example, Unreal Engine 4, with its voxel tree-based lighting system. Allegedly, this system gives great results at a high memory cost. Do we really want an entire new generation of consoles permanently unable to run any UE4 games effectively and with full fidelity, hm?

Or more in general, if the hardware resources to experiment with new rendering technology are lacking, then no innovation is possible, and gaming and gamers suffer as a result. Why would we want that? Just a bigger GPU doesn't necessarily open any innovation doors in and of itself. That'd simply be more of the same, not 'more of the new'.

I would argue that if the Cape Verde rumor is true, 1080p UE4 games are already in jeopardy. In my opinion, MS and Sony should be targeting the Samaritan demo as their benchmark. I also don't see a case where they go 4GB GDDR5 only to give a mediocre GPU, FWIW.

BTW, Wii U is already doomed to not run UE4 to full fidelity ;)
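
For a sense of what "high memory cost" might mean in practice, here's a rough voxel-grid estimate; the resolution, bytes per voxel and occupancy fraction are all assumptions, and UE4's actual data layout may differ considerably:

```python
# Rough memory estimate for a voxelised scene, as used by voxel-based GI schemes.
# Resolution, bytes per voxel and sparse occupancy are illustrative assumptions.

resolution      = 512     # voxels per axis (assumed)
bytes_per_voxel = 8       # e.g. packed albedo + normal + occupancy (assumed)
occupancy       = 0.15    # fraction of the sparse tree actually allocated (assumed)

dense_mb  = resolution ** 3 * bytes_per_voxel / 2**20
sparse_mb = dense_mb * occupancy

print(f"Dense {resolution}^3 grid: {dense_mb:.0f} MB")
print(f"Sparse tree at {occupancy:.0%} occupancy: ~{sparse_mb:.0f} MB")
```

Even the sparse case is a sizeable bite out of a 2GB budget once render targets, textures and game data are added on top, which is why total RAM matters for this kind of technique.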
 
Single threaded benchmarks (Cinebench R10 ST) show that a modern OoO core has over 3x higher single threaded IPC (instructions per clock) than an older OoO (speed demon) core. Of course a single Xenon core is even slower than a similarly clocked P4 (it's an in-order core after all).

Yep. It's a very long time since I last did low-level stuff for P4, but if I had to pull a number out of a hat I'd say that a Prescott thread is capable of doing ~2x work per cycle that a Xenon thread can.

I hadn't realised the difference was so large. So this puts Ivy at around 6x the single threaded performance of Xenon. Or in other words, a quad Ivy at a mere 65W and 160mm2 would deliver the nice target of 8x CPU power over the current consoles at 3.2GHz, and include a fairly decent GPU to support a discrete GPU to boot. Or does that not account for the greater advantage Xenon would take from SMT compared with Ivy?

Shame the consoles won't go the Intel route!
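
Chaining those rough ratios together (the 3x and ~2x figures come from the posts above; the core counts and the SMT fudge factor are my own assumptions):

```python
# Rough throughput comparison, chaining the ratios from the discussion above.
# The 3x and ~2x figures are from the posts; core counts and the SMT factor are assumed.

ivy_vs_p4_ipc = 3.0      # modern OoO core vs old speed-demon core (Cinebench R10 ST)
p4_vs_xenon   = 2.0      # estimated work per cycle, Prescott thread vs Xenon thread
ivy_vs_xenon  = ivy_vs_p4_ipc * p4_vs_xenon
print(f"Ivy Bridge vs Xenon, single thread: ~{ivy_vs_xenon:.0f}x")

ivy_cores, xenon_cores = 4, 3
naive_total = ivy_vs_xenon * ivy_cores / xenon_cores
print(f"Quad Ivy vs tri-core Xenon, ignoring SMT: ~{naive_total:.0f}x")

xenon_smt_gain = 1.3     # assumed extra throughput Xenon gets from its second thread
print(f"If Xenon gains more from SMT than Ivy: ~{naive_total / xenon_smt_gain:.1f}x")
```

So the naive figure lands right on the 8x target, and even a generous SMT allowance for Xenon only pulls it back to roughly 6x.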
 