Predict: The Next Generation Console Tech

Yes, hilariously, the MS Studios buildings don't contain any game studios. (Phone is in some of them, hardware in some, and Xbox in a couple.) All the game studio teams are in Redmond Town Center or one of the downtown Bellevue buildings.

Why?


You don't want to know :p
 
Yes, hilariously, the MS Studios buildings don't contain any game studios. (Phone is in some of them, hardware in some, and Xbox in a couple.) All the game studio teams are in Redmond Town Center or one of the downtown Bellevue buildings.

Why?

Would I be able to get into the Redmond Town Center building with a blue card? Would I be able to catch Durango titles in action? :D
 
You're just laying down a blanket statement that everyone says BD is "bad" and then concluding that furthermore it can't be fixed.
Everyone except AMD fanboys concluded that Bulldozer really IS bad, yes. It's slow and power hungry. Those two are a bad combination, and have been ever since the Pentium 4. Unfixable - well, if you fix the multitude of problems with Bulldozer you no longer have Bulldozer, so you could argue it's "unfixable".

The biggest problems with BD IMO are related to scheduling and cache.
Yes, only two of the most fundamental centerpieces of modern desktop processors... Face it. BD is borked.

The FPU is not one of the problems and is actually superior to Intel's current offerings; the only catch is that there is only one unit for every two integer cores.
Yes, and this is another fundamental issue with BD that can't be fixed without re-engineering much of the chip and essentially changing it into something new. BD was built with this setup in mind, just kludging on another set of FPUs isn't going to be a workable solution of course.

Personally I think with HSA and being able to offload much of the traditional FPU work to the GPU that this approach will make sense in the future and certainly in a console APU.
GPUs are unsuited, or even extremely unsuited, to running traditional FPU work. A GPU just isn't general enough to handle such problems efficiently. Also, I at least would prefer the GPU actually process GRAPHICS instead of sitting there wasting capacity on running game code. God knows the GPUs in next-gen consoles aren't exactly going to be top-of-the-line offerings; you don't want to bog them down with ANYthing unnecessary unless you want the look of your game to suffer...
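To illustrate the sort of FPU work I mean, here's a made-up C snippet (purely hypothetical, not from any real game) full of the data-dependent branching that a CPU's FPU chews through happily but that leaves a wide-SIMD GPU with divergent, half-idle lanes:

```c
/* Hypothetical game-code style loop: data-dependent branches and
 * early-outs like these run fine on a CPU's FPU, but on a wide-SIMD
 * GPU the divergent paths serialise, wasting lanes. */
#include <math.h>
#include <stddef.h>

typedef struct { float x, y, z; float health; int state; } Entity;

void update_entities(Entity *e, size_t n, float dt)
{
    for (size_t i = 0; i < n; ++i) {
        if (e[i].state == 0)           /* dead: skip entirely        */
            continue;
        float d = sqrtf(e[i].x * e[i].x + e[i].y * e[i].y + e[i].z * e[i].z);
        if (d < 1.0f)                  /* near origin: one code path */
            e[i].health -= 5.0f * dt;
        else if (e[i].health < 10.0f)  /* low health: another path   */
            e[i].state = 2;
        else                           /* common case                */
            e[i].x += dt / d;
    }
}
```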

As has been mentioned above, shedding much of the unneeded extra baggage of Orochi, coupled with these improvements, alone would make a pretty decent console CPU.
You don't just go tearing bits out of CPUs and changing things around on a whim to make a console CPU... Changing stuff in a CPU is an extremely big deal; you're going to need hundreds of engineers, hundreds of millions of dollars, and enormous amounts of verification of the new design.

but stating over and over that BD is "bad" and "everyone says so" isn't exactly an argument as far as I'm concerned.
Well... What would YOU call BD then? AFAIR, BD doesn't even beat the old Core 2 Penryn core in performance while drawing as much or more power, and that's a half-decade-old processor. That's a pretty monumental failure in my book.
 
Everyone except AMD fanboys concluded that Bulldozer really IS bad, yes. It's slow and power hungry. Those two are a bad combination, and have been ever since the Pentium 4. Unfixable - well, if you fix the multitude of problems with Bulldozer you no longer have Bulldozer, so you could argue it's "unfixable".

That's a typical rubbish, meaningless statement.
Explain then how, in nine months with no architectural changes, you can go from Bulldozer to Piledriver if everything is so fundamentally broken?

Also, I guess you never owned an i7 920 C0 stepping like I do; on the stock cooler at the stock 2.66GHz clock I could make it hit 100°C and auto-shutdown. I guess Nehalem was hot, slow and unfixable :rolleyes:.

The problem with bigots is that when you dig any deeper than the generic label, they get shitty real fast. No one is saying Orochi is a good product, or that you should put Orochi in a console, so what's your point anyway?

Well... What would YOU call BD then? AFAIR, BD doesn't even beat the old Core 2 Penryn core in performance while drawing as much or more power, and that's a half-decade-old processor. That's a pretty monumental failure in my book.

And as bad as Bulldozer is, that's a complete lie.
Here are some facts for your BS:
http://www.anandtech.com/bench/Product/49?vs=434
That is, unless your favorite program in the world is SYSmark 2007.

Ironic that you can have a spaz in one section of the forum over someone posting partisan, ill-considered rubbish, then do the same in another.
 
Also, I guess you never owned an i7 920 C0 stepping like I do; on the stock cooler at the stock 2.66GHz clock I could make it hit 100°C and auto-shutdown. I guess Nehalem was hot, slow and unfixable :rolleyes:.

I do wonder how Intel avoided class action lawsuits for that one. Mine will throttle to less than 1.66GHz running the crypto subtest in SiSoft Sandra with the stock cooler. I'm (still) running it with a giant heat-piped monster.

Cheers
 
I figure I've sat on this tidbit long enough. It seems Xbox 3's final GPU will have eSRAM instead of eDRAM. I made sure to confirm eSRAM wasn't a misspelling. I don't know the amount though.

I should also add he clarified at that time that the "1+ TFLOP" was an estimate from someone else.
 
What's eSRAM? Wikipedia has no entry other than the HP PA-8800 with ESRAM which it goes on to describe as 1T-SRAM. As on-chip cache is SRAM anyway, I don't see how you could even have eSRAM.
 
It's embedded SRAM, Shifty. The same way eDRAM is embedded DRAM (I would guess).

If you look for SRAM, there's a section on embedded SRAM in the wiki article. You can also google embedded SRAM to find lots of stuff from its manufacturers.

Also interesting is that on the Wiki eDRAM article it says,
"larger amounts of memory can be installed on smaller chips if eDRAM is used instead of eSRAM. eDRAM requires additional fab process steps compared with embedded SRAM, which raises cost, but the 3X area savings of eDRAM memory offsets the process cost when a significant amount of memory is used in the design."

Seems like MS is going for cheaper manufacturing cost over memory density. I do wonder how the system bandwidths would compare, and if this might affect backwards compatibility between Durango and X360?

Edit:
Another quote on the 1T-SRAM wiki page:
1T-SRAM has speed comparable to 6T-SRAM (at multi-megabit densities). It is significantly faster than eDRAM, and the "quad-density" variant is only slightly larger (10–15% is claimed).
1T-SRAM is explained to be the "1 transistor" variant of standard SRAM (6T-SRAM). Who knows which MS may be using. The GameCube used 1T-SRAM though...

Anyone with GameCube experience care to comment on SRAM performance? I assume this would simply be a high-BW framebuffer, like how eDRAM was used in the 360.
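For scale, here's my own rough back-of-the-envelope (assuming a plain 720p target with 32-bit colour plus 32-bit depth, no MSAA) on why a small embedded pool is enough for a framebuffer:

```c
/* Rough framebuffer sizing, assuming a 1280x720 target with
 * 32-bit colour and 32-bit depth/stencil (no MSAA). */
#include <stdio.h>

int main(void)
{
    const int w = 1280, h = 720;
    const int bytes_colour = 4;          /* RGBA8 */
    const int bytes_depth  = 4;          /* D24S8 */
    double mb = (double)w * h * (bytes_colour + bytes_depth) / (1024.0 * 1024.0);
    printf("720p colour+depth: %.1f MB\n", mb);  /* ~7.0 MB, fits the 360's 10 MB eDRAM */
    return 0;
}
```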
 
It's embedded SRAM, Shifty. The same way eDRAM is embedded DRAM (I would guess).

If you look for SRAM, there's a section on embedded SRAM in the wiki article. You can also google embedded SRAM to find lots of stuff from its manufacturers.
As far as I can see, SRAM as we talk about it (caches) is eSRAM - it's embedded on the die rather than accessed in an external chip (just as DRAM is supplied on separate chips while eDRAM is DRAM on the die). So this eSRAM in the Durango GPU would just be a SRAM cache. I guess EMS are using eSRAM to differentiate 1T-SRAM from normal SRAM? That is, perhaps Durango is using 1T-SRAM a la Gamecube?
 
What's eSRAM? Wikipedia has no entry other than the HP PA-8800 with ESRAM which it goes on to describe as 1T-SRAM. As on-chip cache is SRAM anyway, I don't see how you could even have eSRAM.

That's why I was trying to make sure it wasn't misspelled. But according to the person who said it, this is what the sheet with the alpha kit said.

The Alpha kit uses a discrete graphics card similar in capability and speed to the GPU that will be included in the final design. The card does not have the ESRAM that the final design GPU will

There was no period after will, so I don't know if it goes further than that in regards to the memory.
 
I figure I've sat on this tidbit long enough. It seems Xbox 3's final GPU will have eSRAM instead of eDRAM. I made sure to confirm eSRAM wasn't a misspelling. I don't know the amount though.

I should also add he clarified at that time that the "1+ TFLOP" was an estimate from someone else.

What's eSRAM? Wikipedia has no entry other than the HP PA-8800 with ESRAM which it goes on to describe as 1T-SRAM. As on-chip cache is SRAM anyway, I don't see how you could even have eSRAM.

It's what's used in the GameCube, if my memory is correct.


Edit: Yep

3 MB embedded 1T-SRAM within "Flipper"

http://en.wikipedia.org/wiki/Nintendo_GameCube
 
1T-SRAM is explained to be the "1 transistor" variant of standard SRAM (6T-SRAM). Who knows which MS may be using. The GameCube used 1T-SRAM though...

1T-SRAM is a DRAM array with an SRAM interface. DRAM is one capacitor and one transistor per bit cell; SRAM is typically six transistors per bit cell. 1T-SRAM, or eSRAM, is fundamentally DRAM.
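To put the density gap in perspective, here's a rough, illustrative count based on those per-bit figures (assuming a 32 MB array purely for the sake of argument, and ignoring sense amps, refresh logic and redundancy):

```c
/* Rough bit-cell transistor counts for a 32 MB array, using
 * 6 transistors/bit for 6T SRAM and 1 transistor + 1 capacitor/bit
 * for a 1T (DRAM-based) cell. Peripheral logic is ignored, so
 * treat these as ballpark figures only. */
#include <stdio.h>

int main(void)
{
    const long long bits = 32LL * 1024 * 1024 * 8;       /* 32 MB           */
    printf("6T SRAM: %lld transistors\n", bits * 6);      /* ~1.6 billion    */
    printf("1T cell: %lld transistors + caps\n", bits);   /* ~268 million    */
    return 0;
}
```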

Cheers
 
1T-SRAM is a registered trademark of MoSys, so maybe eSRAM is used as the generic term for a similar implementation from another company.
 
1T-SRAM is a registered trademark of MoSys, so maybe eSRAM is used as the generic term for a similar implementation from another company.
That's how it's looking at the moment with what little info we have. Enhanced Memory Systems have their 'eSRAM' technology, which is a 1T-SRAM implementation. I can find reference to a 72Mb ESRAM component, which matches the Wiki reference to the HP processor. That'd be 9 MB (72 Mb / 8 bits per byte). 9 MB of local cache for a GPU? Why? But then Wii has 24 MB of 1T-SRAM, so I expect we could be looking at a slightly less dense but much faster replacement for eDRAM, in theory.
 
As far as I can see, SRAM as we talk about it (caches) is eSRAM - it's embedded on the die rather than accessed in an external chip (just as DRAM is supplied on separate chips while eDRAM is DRAM on the die). So this eSRAM in the Durango GPU would just be a SRAM cache. I guess EMS are using eSRAM to differentiate 1T-SRAM from normal SRAM? That is, perhaps Durango is using 1T-SRAM a la Gamecube?

Yes, but imagine that the embedded SRAM would be software-managed and not hardware-managed like a true cache. If that's the case, the actual implementation would be a lot simpler than a cache. Perhaps we'd be looking at something like 32-64MB of SRAM with 1TB/sec of bandwidth.
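As a sketch of what "software managed" would mean in practice (all names and sizes here are my own invention, not anything from an actual SDK), the program itself decides what lives in the fast pool and when to copy it:

```c
/* Hypothetical software-managed scratchpad: the program explicitly
 * decides what lives in the fast embedded pool and when to copy it,
 * instead of a hardware cache doing it transparently. */
#include <string.h>
#include <stddef.h>

#define SCRATCH_SIZE (32 * 1024 * 1024)         /* assumed 32 MB pool        */
static unsigned char scratch[SCRATCH_SIZE];     /* stands in for the eSRAM   */

/* Copy a chunk of a render target into the fast pool, do the
 * bandwidth-heavy work there, then write the result back out. */
void process_tile(const unsigned char *main_mem_src,
                  unsigned char *main_mem_dst, size_t bytes)
{
    if (bytes > SCRATCH_SIZE)
        return;                                 /* caller must tile the data     */
    memcpy(scratch, main_mem_src, bytes);       /* "DMA in" (modelled as memcpy) */
    for (size_t i = 0; i < bytes; ++i)          /* some per-pixel operation      */
        scratch[i] = (unsigned char)(255 - scratch[i]);
    memcpy(main_mem_dst, scratch, bytes);       /* "DMA out"                     */
}
```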
 
Would I be able to get into the Redmond Town Center building with a blue card? Would I be able to catch Durango titles in action? :D
Depends. I don't know how locked down those areas are. Having a blue badge does not guarantee access to any particular building (Although it will let you into most buildings). Being caught wandering around a building with no business need to be in that building could be a career limiting move though.
 
Depends. I don't know how locked down those areas are. Having a blue badge does not guarantee access to any particular building (Although it will let you into most buildings). Being caught wandering around a building with no business need to be in that building could be a career limiting move though.

Just claim you're new and you're lost :D.

So what's the benefit of eSRAM over eDRAM?

manufacturing?
performance?
cost to move process nodes?
total cost?

If it's the 1st and the 3rd, that makes perfect sense: take a hit now while your console is expensive and numbers are low, and benefit later as your eSRAM shrinks can keep up with the rest of your system.

Now the big question: if the 8GB of DDR3 rumors are true, is 64MB enough to hide the poor bandwidth behind it? I would have thought not, unless this is just going to be another GPU buffer like in the 360, but that's a backwards step.
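For reference, the sort of arithmetic behind "poor bandwidth" (illustrative bus configurations of my own choosing, not claimed Durango specs):

```c
/* Peak bandwidth = bus width (bytes) x transfer rate.
 * Illustrative configurations only, not claimed Durango specs. */
#include <stdio.h>

int main(void)
{
    /* 256-bit DDR3-2133: 32 bytes/transfer * 2133 MT/s */
    double ddr3  = 32.0 * 2133e6 / 1e9;
    /* 256-bit GDDR5 at 5.5 Gbps/pin, for comparison */
    double gddr5 = 32.0 * 5500e6 / 1e9;
    printf("DDR3-2133, 256-bit:     ~%.0f GB/s\n", ddr3);   /* ~68 GB/s  */
    printf("GDDR5 5.5Gbps, 256-bit: ~%.0f GB/s\n", gddr5);  /* ~176 GB/s */
    return 0;
}
```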
 
Embedded SRAM is a peculiar term; we've had that forever.

Wow, the Atari 2600 had 128 bytes of SRAM, in a chip that also does I/O and timers.

Ah, clicking Shifty's link reveals it's "enhanced SRAM". I was writing a comment without reading the article, as you would do if posting on Slashdot.

If we get a fast memory pool and large, slower RAM (64MB is a good-looking number; I had that much in a trusty PC as main RAM), it may be a disappointment. No "do whatever without worrying about memory sizes".

But the PS2 and X360 followed that model and were highly successful.
Wii U does it too. The PS3 and Xbox 1 are the only outliers.
 