Predict: The Next Generation Console Tech

Status
Not open for further replies.
Yes it's from MisterXmedia, you can link the blog if you want
http://misterxmedia.livejournal.com/
or the Twitter profile:
https://twitter.com/misterxmedia

So you don't buy this rumor but you buy other rumors; this one is ridiculous and the others are not, fine.
I prefer to wait for the official info before calling something ridiculous :rolleyes:

For the moment, I prefer to analyze the rumors as deeply as I can. Neither I nor you know the truth; you can believe what you want, and the same goes for me.

If the rumored CPU is a wafer-to-wafer (W2W) 3D-stacked CPU, then it uses 75% less power than a standard CPU and produces 75% less heat, so you could easily use double the number of transistors and stay under the 300 W mark.

What makes you believe that this is impossible?
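To make the power claim concrete, here is a back-of-envelope check using purely hypothetical numbers (the baseline CPU wattage and the rest-of-system budget are assumptions for illustration, not leaked figures):

```python
# Back-of-envelope check of the stacking claim (hypothetical numbers).
# Claim: a W2W 3D-stacked CPU uses 75% less power than a conventional
# one, so doubling the transistor count could still fit a 300 W budget.

baseline_cpu_watts = 100.0  # assumed power of a conventional CPU
stacked_cpu_watts = baseline_cpu_watts * (1 - 0.75)  # 75% reduction claim
doubled_stacked_watts = stacked_cpu_watts * 2        # double the transistors

rest_of_system_watts = 200.0  # assumed GPU + memory + I/O budget
total_watts = doubled_stacked_watts + rest_of_system_watts

print(stacked_cpu_watts)      # 25.0
print(doubled_stacked_watts)  # 50.0
print(total_watts)            # 250.0 -> under the 300 W mark, per the claim
```

Under these assumptions the arithmetic works out, but the conclusion is only as good as the 75% figure and the assumed baseline.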

I don't buy that rumor because it makes no sense at all. I'm sure two HD8000s would LOVE to share data with each other over DDR3. I also love how each CPU has 128MB of eDRAM (what would be the size of that again?) along with an HD8000 GPU and 1.5GB of GDDR5, all fitting nice and snug in a small package, and there are two of them!

But wait, there's more! Act now and you get a kitchen sink in the form of a VTE...that will do real-time raytracing! All this, coupled with an additional 5GB of DDR3, can be yours for the low, low price of? Magical raytracing chips, 2x HD8000 GPUs, and 256MB of eDRAM. Yeah...totally realistic. :|

Not to mention thermal and power limits. The magical raytracing chip alone slays me.
 
Well, the leaked specs don't point that way. I wonder where all this hype comes from.

IMO I think what happened at that time was someone with direct access to the paper specs saw 8GB of memory and what seems to be an internal comment from MS about the GPU's performance being comparable to a 680. This is what I think started the hype. Someone then passed along that "Xbox 3 is going to blow people away". Once they got their hands on the actual hardware and started working on it, the tone changed. And it hasn't been hyped up since that time period.

Doesn't bother me though if something like that is the case. I'll buy any console that has the games I want and save the raw power for the PC. Heck I'd get the next Xbox just for the potential TV control features. :LOL:

Just for some forum background: MrCteam/MrXMedia was enough of a joke that he too was banned from these forums, just like MrRigby. So yes, his rumors are complete bullshit.

Haha. I had been thinking he seemed like the MS version of Jeff. I commend them on their effort though.
 
IMO I think what happened at that time was someone with direct access to the paper specs saw 8GB of memory and what seems to be an internal comment from MS about the GPU's performance being comparable to a 680. This is what I think started the hype. Someone then passed along that "Xbox 3 is going to blow people away". Once they got their hands on the actual hardware and started working on it, the tone changed. And it hasn't been hyped up since that time period.

Doesn't bother me though if something like that is the case. I'll buy any console that has the games I want and save the raw power for the PC. Heck I'd get the next Xbox just for the potential TV control features. :LOL:



Haha. I had been thinking he seemed like the MS version of Jeff. I commend them on their effort though.

Then, we will have news about performance issues from developers.
 
I don't buy that rumor because it makes no sense at all. I'm sure two HD8000s would LOVE to share data with each other over DDR3. I also love how each CPU has 128MB of eDRAM (what would be the size of that again?) along with an HD8000 GPU and 1.5GB of GDDR5, all fitting nice and snug in a small package, and there are two of them!

But wait, there's more! Act now and you get a kitchen sink in the form of a VTE...that will do real-time raytracing! All this, coupled with an additional 5GB of DDR3, can be yours for the low, low price of? Magical raytracing chips, 2x HD8000 GPUs, and 256MB of eDRAM. Yeah...totally realistic. :|

Not to mention thermal and power limits. The magical raytracing chip alone slays me.

Mmh, I read different things: 2x 32 MB of eSRAM, 2x HD8900 (could be a mobility version) and 2x VTE (they are talking about raytraced audio, not graphics, or any vector compute task), 70-110 W each

each GPU has 1.5 GB of GDDR5 (in the system, 3 GB of GDDR5 plus 5 GB of DDR3)

the third SoC, Mars, is for BC

all the SoCs communicate with a blitter chip at 550 GB/s

of course we have to take it all with a grain of salt, but just to be precise, we're talking about a system with:
a dual GPU (maybe a mobility version)
8 GB of RAM
64 MB of eSRAM
+ 2 vector units

Is this so incredible? I don't understand why, if it consumes less than 300 W and Microsoft accepts selling it at $400, losing some money the first years, as always.

If the source is untrustworthy that changes things, but a system like this is anything but impossible, in my opinion. If the PS4 goes for a big GPU, 4 GB of GDDR5 plus some compute units, the cost isn't so different, or am I wrong?
 
IMO I think what happened at that time was someone with direct access to the paper specs saw 8GB of memory and what seems to be an internal comment from MS about the GPU's performance being comparable to a 680.

Well, if that was true, Microsoft must have had pretty good reasons to say the GPU performance was going to be comparable to a 680. The leaked specs do not suggest that. So either we are missing some important information, or the GPU was never compared to a 680 to begin with.
 
Wasn't the Xbox 360 GPU something "new" compared to 2005 GPUs? How many 2005 GPUs can run BF3, Crysis 3 or The Witcher 2 the way the Xbox 360 does?
Xenos was "new", but ATI was working on the unified shader architecture before Microsoft contracted them. So while Microsoft requested some features, they didn't dictate the base architecture.
 
It's a matter of specific implementations, though - in theory a DSP can be as incredible as its designers make it for its given workloads... it just won't be reprogrammable after the fact.

Sony's ex-CTO (Makimoto) seems to be a big fan of FPGAs. I saw some reference to Makimoto's Wave. Are there any advantages to FPGAs vs. ASICs in cost and other production factors?

Sure, it's always the mystery bits that are interesting after all!

No sh*t !

I also wonder about TrustZone. Sony usually wants total control over their security platform. Will be interesting to see if they roll a completely different one.
 
Well, if that was true, Microsoft must have had pretty good reasons to say the GPU performance was going to be comparable to a 680. The leaked specs do not suggest that. So either we are missing some important information, or the GPU was never compared to a 680 to begin with.

Probably no different than Sony saying their GPU features are like DX11.5. The 680 comment was leaked by aegies.
 
Probably no different than Sony saying their GPU features are like DX11.5. The 680 comment was leaked by aegies.

Features is one thing, performance is another. I don't see Microsoft saying the GPU performance is comparable to a GTX 680 and then put anything less than a Radeon 79xx in the devkits. Downgrading from a Radeon 79xx in the alpha kits to something in the ballpark of a Radeon 77xx, as it appears from the current rumors, would be quite disappointing for the developers. So maybe we are still missing some pretty significant information on Durango, but I don't want to be too optimistic. :)
 
My guess based on the current rumors: 32 MB on a daughter die (in the same package) with the ROPs; internal bandwidth 819.2 GB/s*; bandwidth to the main die 102.4 GB/s. Maybe a smaller pool of ESRAM (8-16 MB) could be used in the main die to facilitate data sharing between the CPU and the GPU.

* The Xbox 360 has 8 ROPs at 500 MHz and 256 GB/s of internal bandwidth in the daughter die. If Durango has 16 ROPs at 800 MHz, scaling the bandwidth gives: (16/8) * (800 / 500) * 256 GB/s = 819.2 GB/s, which is incidentally 8x the rumored 102.4 GB/s, which I believe is only the bandwidth to the compute die.

Interesting fact: the Xbox 360 has a 32 GB/s interconnect bandwidth between the GPU die and the daughter die. Scaling that bandwidth by the ratio of the size of the EDRAM pools gives us exactly the rumored bandwidth 32 GB/s * (32/10) = 102.4 GB/s. It fits too perfectly to be just a coincidence. So, either the rumor was pulled out of thin air by doing the same calculation that I'm doing or we are indeed seeing the comeback of the daughter die. :) If it was just speculation, though, they would state clearly that it is the external bandwidth and not the internal one. A scaled up daughter die, with 32 MB, 819.2 GB/s internal bandwidth and 102.4 GB/s interconnect bandwidth, seems a very reasonable setup to me.
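The scaling arithmetic above can be reproduced in a few lines (the 360 figures and the rumored Durango ROP count and clock are taken from the post itself):

```python
# Reproduce the bandwidth-scaling arithmetic from the post.
# Xbox 360 daughter die: 8 ROPs @ 500 MHz, 256 GB/s internal bandwidth.
rops_360, clock_360, internal_bw_360 = 8, 500, 256.0

# Rumored Durango: 16 ROPs @ 800 MHz.
rops_next, clock_next = 16, 800
internal_bw_next = (rops_next / rops_360) * (clock_next / clock_360) * internal_bw_360
print(internal_bw_next)  # 819.2 (GB/s)

# Interconnect: scale the 360's 32 GB/s by the EDRAM size ratio (32 MB / 10 MB).
interconnect_bw = 32.0 * (32 / 10)
print(interconnect_bw)   # 102.4 (GB/s)

# The rumored 102.4 GB/s is exactly 1/8 of the scaled internal figure.
print(internal_bw_next / interconnect_bw)  # 8.0
```

Both scaled numbers land exactly on the rumored 102.4 GB/s figure, which is what makes the "daughter die comeback" reading tempting.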
 
IMO I think what happened at that time was someone with direct access to the paper specs saw 8GB of memory and what seems to be an internal comment from MS about the GPU's performance being comparable to a 680. This is what I think started the hype. Someone then passed along that "Xbox 3 is going to blow people away". Once they got their hands on the actual hardware and started working on it, the tone changed. And it hasn't been hyped up since that time period.

Doesn't bother me though if something like that is the case. I'll buy any console that has the games I want and save the raw power for the PC. Heck I'd get the next Xbox just for the potential TV control features. :LOL:



Haha. I had been thinking he seemed like the MS version of Jeff. I commend them on their effort though.

Hmmm, when did this happen exactly? Did you get this information from your source, or are you coming to that conclusion on your own? If it's the latter, why are you stating it as fact? Because as far as I know, no devs have openly discussed the Durango GPU.
 
Mmh, I read different things: 2x 32 MB of eSRAM, 2x HD8900 (could be a mobility version) and 2x VTE (they are talking about raytraced audio, not graphics, or any vector compute task), 70-110 W each

each GPU has 1.5 GB of GDDR5 (in the system, 3 GB of GDDR5 plus 5 GB of DDR3)

the third SoC, Mars, is for BC

all the SoCs communicate with a blitter chip at 550 GB/s

of course we have to take it all with a grain of salt, but just to be precise, we're talking about a system with:
a dual GPU (maybe a mobility version)
8 GB of RAM
64 MB of eSRAM
+ 2 vector units

Is this so incredible? I don't understand why, if it consumes less than 300 W and Microsoft accepts selling it at $400, losing some money the first years, as always.

If the source is untrustworthy that changes things, but a system like this is anything but impossible, in my opinion. If the PS4 goes for a big GPU, 4 GB of GDDR5 plus some compute units, the cost isn't so different, or am I wrong?

He was quoting this fictitious hilariousness

-There isn't going to be a blitter. Something that started off as a bit of a joke has become a hopeful rumor.

-eSRAM is a bit more expensive than eDRAM, so I doubt MS is going to front the bill for 2x in each system.

-Dual GPUs? What's the point? In your setup they'd most likely just end up mirroring each other à la SLI/Crossfire. A waste of die area, among other things.

-Somehow I highly doubt that a CPU, an 8900M, 1.5GB of GDDR5, 32MB of eSRAM, and a VTE on each unit is ONLY using 110W.

-What's the point of a 'VTE' on each SoC if it's being used for audio?

-And then a third SoC containing 360's chipset, oh well pile on the PCB complexity! #YOLO!

So we have 3 SoCs, 3 separate memory controllers, multiple VTEs on each SoC just to do audio, 3 separate memory pools, and two 8900Ms. All of this for the low, low price of?
 
Hmmm, when did this happen exactly? Did you get this information from your source, or are you coming to that conclusion on your own? If it's the latter, why are you stating it as fact? Because as far as I know, no devs have openly discussed the Durango GPU.

If you're asking me this, then you didn't see the post I was responding to and the post that person was referring to. Lherre also initially said Xbox 3 "was a beast".
 
That's not the first time they've pulled quotes from B3D. I suppose they assume it's ok since it's an anonymous forum, however sooner or later someone connects the dots and that leads to trouble. That's pretty much the main reason I don't post here anything of substance anymore.
As an open forum, everything posted can be considered public domain. I guess if quoted as 'some known ex-dev on a forum said ... when asked ...' then it's legitimate reporting and unavoidable, but if presented as 'we asked a dev and they said', excluding context, or worse, misrepresented by interpretation, that's definitely bad form. Then again, lots of reporting is misrepresented. I guess maintaining the full context produces a document that's tl;dr for the masses, so corners are cut.
 
He was quoting this fictitious hilariousness

-There isn't going to be a blitter. Something that started off as a bit of a joke has become a hopeful rumor.

Nice, how do you know this?

-eSRAM is a bit more expensive than eDRAM, so I doubt MS is going to front the bill for 2x in each system.

64 MB of eSRAM would cost about as much as 10 MB of eDRAM did in 2005, so what's the point?

-Dual GPUs? What's the point? In your setup they'd most likely just end up mirroring each other à la SLI/Crossfire. A waste of die area, among other things.

it's not Crossfire, and it's transparent to the developer

-Somehow I highly doubt that a CPU, an 8900M, 1.5GB of GDDR5, 32MB of eSRAM, and a VTE on each unit is ONLY using 110W.

-What's the point of a 'VTE' on each SoC if it's being used for audio?

it's easy and elegant to fabricate a SoC with a VTE inside and put two of them on the board
a SoC is meant to contain the CPU, GPU and the rest of the microsystem, audio included, so what's so strange about that?


-And then a third SoC containing 360's chipset, oh well pile on the PCB complexity! #YOLO!

Doesn't the PS3 with the PS2 chip inside for BC mean anything to you?

A lot of normal things seem incredible to you, why?

All of this for the low low price of?

I think $400-450
 
Everyone, stop saying "special sauce," for the love of God!

Also, I'd still treat these rumours as rumours and not bullet-proof fact.

Also, the tone of this thread has re-entered "fanboy war" territory, if it ever left it.
 