For the fun of it. Why are ppl even debating this?
Remember 3DO Mark II (M2) and the less well-known Atari Jaguar II / Project Midsummer? Those never happened as game consoles, or in Jaguar II's case, at all.
There are other examples, ones that actually did get released as game consoles.
The NEC SuperGrafx (originally named PC-Engine 2) is one of them. It had a 2nd graphics chip, more RAM and VRAM. It bombed, badly.
For the fun of it.

yes, what fun
We have just shipped 15 million Xbox Ones. In less than a year, this makes it the best product ever, even better than Kinect. Now you can buy the division for only 2 billion US dollars!
Shiiiiiiit, it was shipped, not sold, damn you MS!!!
well, what will the Xbox Two look like in 2018? oh what fun
Why 2018? Seeing as the last gap was 9 years, wouldn't that be 2023?
Yes, I'm guessing the main reason the last cycle was so long (except for Nintendo) is that both Sony & MS had some big losses at the start and thus wanted to mitigate those as much as possible.

Ubisoft and EA are hoping and expecting,
Yeah, at 1,088 GB/s!
If ESRAM2 in Xbox Two were crazy high bandwidth like that, then you could probably always find novel ways to make the hardware dance that are virtually impossible on conventional hardware.
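For scale, peak bandwidth is just bus width times clock. A minimal sketch, assuming the commonly cited XB1 figures (1024-bit ESRAM interface, 853 MHz GPU clock); these numbers are assumptions here, not vendor specs:

```python
# Back-of-envelope peak bandwidth: bytes transferred per cycle * clock rate.
# The 1024-bit / 853 MHz figures below are the commonly cited XB1 numbers,
# used here as assumptions for illustration.

def peak_bandwidth_gbs(bus_bits, clock_mhz, directions=1):
    """Peak transfer rate in GB/s for a bus_bits-wide bus at clock_mhz."""
    return bus_bits / 8 * clock_mhz * 1e6 * directions / 1e9

one_way = peak_bandwidth_gbs(1024, 853)      # one direction: ~109 GB/s
duplex  = peak_bandwidth_gbs(1024, 853, 2)   # simultaneous read+write: ~218 GB/s
print(one_way, duplex)
```

At these numbers, a 1 TB/s-class pool would need roughly 5x the bus width or clock of the duplex figure.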
Completely unfounded remark. I'm sure if MS wanted a small pool of 1 TB/s eDRAM, they could have made it happen. The choice of ESRAM over eDRAM was about sourcing a supply. eDRAM would have had production consequences, which is why they didn't go that route.

MS could certainly have gone with it, but they did not have the architects, nor the skills in house, to make such a thing happen.
So it could have been produced for the same price as the current Xbox 'marvel', but yeah, with about 2 times the performance...

2 times, eh? How did you calculate that?
Completely unfounded remark. I'm sure if MS wanted a small pool of 1 TB/s eDRAM, they could have made it happen. The choice of ESRAM over eDRAM was about sourcing a supply. eDRAM would have had production consequences, which is why they didn't go that route.
2 times, eh? How did you calculate that?
In this hypothetical case, we don't know how "small" it would need to be to allow 1TB/s, and we don't know how limited it would have been (directly addressable? only usable for some special instructions?). Maybe it would be completely different from the current ESRAM in XB1.
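On the "twice the performance" claim: an Amdahl-style sketch shows the whole-machine gain depends on how much of the frame time is actually bandwidth-bound. The bound fractions below are made up for illustration, not measurements:

```python
# Amdahl-style estimate: suppose faster eDRAM made bandwidth-bound work 5x
# faster (~200 GB/s -> ~1 TB/s). Overall speedup depends on the fraction of
# time that is bandwidth-bound; the fractions here are illustrative only.

def overall_speedup(bound_fraction, local_speedup):
    """Whole-machine speedup when only bound_fraction of time speeds up."""
    return 1 / ((1 - bound_fraction) + bound_fraction / local_speedup)

for f in (0.25, 0.5, 0.75):
    print(f, round(overall_speedup(f, 5.0), 2))
```

Under these made-up assumptions, "2 times the performance" would require roughly two-thirds of the workload to be bandwidth-bound, which is why the question is fair.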
But I still wonder what the real technical explanation is for why the ESRAM is so slow. Many GPUs have external memory faster than this.
It seems to fit the purpose and isn't a bottleneck, but it's certainly not what we thought it could be when we speculated about this. It could have been a special thing like the 360's eDRAM, or the PS3 Cell. But it's only compensating for the DDR3.

It isn't really slow though, is it? Memory above 200 GB/s is pretty rare, no?
Whoever codes the games, how did you calculate that n MB (where n is an unknown) of 1 TB/s eDRAM instead of 32 MB of ~200 GB/s ESRAM would result in a machine with twice the performance?

You have to read the second part as well:
It seems to fit the purpose and isn't a bottleneck, but it's certainly not what we thought it could be when we speculated about this. It could have been a special thing like the 360's EDRAM, or the PS3 Cell. But it's only compensating for the DDR3.
I mean, we would expect on-die memory to be several times faster than external memory; in that sense I think it's "slow".
A $300 midrange GPU card is about 200 GB/s, right?