Nextbox: "XBOX Loop"

Any custom part can do a specific job more efficiently than a more generalised part, but you are trading flexibility for performance, and the past 20 years have shown less and less specialisation in favour of more flexibility. PS3 features the last GPU from nVidia to have separate vertex and pixel units, with nVidia talking at the time about how discrete shaders are more efficient, yada yada, and look where that got PS3. ;) I can't think of any discrete part that'd make sense in the consoles' silicon budget. Video decoding, data encryption/decryption, physics and raytracing are all jobs served by discrete chips, but it doesn't make sense to bundle them all together like a 21st century Amiga!
 
This doesn't mean however that heterogeneous architecture doesn't make sense. It's been done with some success in PS3. ;) It's calling different execution units "this accelerator" and "that accelerator" that doesn't make much sense.
 
There is nothing illogical about refusing to do business with a company where a prior relationship was severed due to a pricing conflict.
That's a pricing conflict that is almost 10 years in the past. You're seriously saying time is not a factor in these matters, that NV and MS could never ever do business again, even after 10,000 years? :LOL: I don't think so!

You can trust MS to write better contracts this time around, now that they've got a bit more experience under their belt. All MS had made in the way of consumer hardware before the original Xbox was a bunch of input devices: mice, joysticks and stuff like that. Maybe some webcams too. Nothing particularly complicated.

And while Tegra might help Nvidia grow in the mobile market, you don't see them trying to replace their desktop GPUs with their ARM based parts anywhere in their line-up where the GPU is of utmost importance.
I don't even know what point you're trying to make here. You're totally comparing apples and oranges. Tegra isn't aimed at the desktop so that's why you're not seeing it there...

If what you are saying is so true
I'm not saying anything, I just posted about an intriguing rumor. It's probably all bullshit like with most such rumors.

then why doesn't Vita include Nvidia based tech, especially since Nvidia and Sony already have a relationship due to the PS3? Yet Sony licensed its GPU tech from Imagination instead of using Tegra 3.
IMG has the best portable graphics right now, which is most likely why Sony picked them for the Vita. NV's forte is desktop graphics (which is what would go into a theoretical XB "loop"). You're again producing convoluted reasoning to make flawed points.
 
Any custom part can do a specific job more efficiently than a more generalised part, but you are trading flexibility for performance, and the past 20 years have shown less and less specialisation in favour of more flexibility. PS3 features the last GPU from nVidia to have separate vertex and pixel units, with nVidia talking at the time about how discrete shaders are more efficient, yada yada, and look where that got PS3. ;) I can't think of any discrete part that'd make sense in the consoles' silicon budget. Video decoding, data encryption/decryption, physics and raytracing are all jobs served by discrete chips, but it doesn't make sense to bundle them all together like a 21st century Amiga!
The Amiga had it right, and did it almost 30 years ago :). PCs at the time had no dedicated hardware. By the 90s we saw specialization coming in, with Sound Blasters and VGA cards. Nowadays we're so used to these specializations we don't even see them anymore. For instance, you talk about general-purpose hardware, but ignore that the GPU is about as specialized and discrete as they come. Every PC nowadays comes with audio hardware (cheapo crap, sure, but it's there, and it still has SB compatibility buried deep under it) and the ridiculously specialized GPU. How is that so different from the Copper, Blitter and Paula?
 
That's a pricing conflict that is almost 10 years in the past. You're seriously saying time is not a factor in these matters, that NV and MS could never ever do business again, even after 10,000 years? :LOL: I don't think so!

You can trust MS to write better contracts this time around, now that they've got a bit more experience under their belt. All MS had made in the way of consumer hardware before the original Xbox was a bunch of input devices: mice, joysticks and stuff like that. Maybe some webcams too. Nothing particularly complicated.

I don't even know what point you're trying to make here. You're totally comparing apples and oranges. Tegra isn't aimed at the desktop so that's why you're not seeing it there...


I'm not saying anything, I just posted about an intriguing rumor. It's probably all bullshit like with most such rumors.

IMG has the best portable graphics right now, which is most likely why Sony picked them for the Vita. NV's forte is desktop graphics (which is what would go into a theoretical XB "loop"). You're again producing convoluted reasoning to make flawed points.

Ohh, I see where our wires are crossed. I thought your reasoning for thinking Nvidia was the clear choice was that Nvidia could produce an ARM based CPU/GPU combo and stick it in a MS console. But you seem to be pushing the idea of an Nvidia ARM based CPU tied to an Nvidia desktop based GPU. Yeah, that makes a lot more sense. Sorry about that.

I still think Nvidia is not going to get MS's console business. 10 years isn't a long time when you're talking about an issue that occurred during the last product cycle. Furthermore, there are plenty of competent ARM designers out there. Plus, one of the bright spots for the 360 was the ATI GPU, which allowed the 360 to hang with the PS3 in a way that Sony didn't expect. It's also pretty obvious that one of the weakest links for the PS3 was its Nvidia GPU.
 
I honestly don't think MS will shift away from the Power architecture, unless they go x86 or ARM in order to build congruities between the nextbox and Windows 8, which I think is a valid possibility. Software and brand platforms are, in a sense, more important I think. This next go-around will be about MS pushing Live and Kinect on both PC and Nextbox. Having as much commonality as possible between Nextbox, PC, and ARM will be in their favor. I envision Nextbox games actually being compatible with PCs that run Live software, and that's why an x86 Nextbox intrigues me. Like I said, I'm stressing the software platform side of things. With a PC Live client, there are no losses incurred with selling the hardware, and you can still market accessories and software to the PC user. A console centric gamer can still have their experience, the PC gamer theirs, and one disc you buy at retail can work in both machines. We could also see a "buy the console disc, get the PC version on DD for free with cross platform play and cloud storage" offer, or something similar to that extent.

I think there are so many ways to go about this. All have risks and incentives, though in the end I see the convergence of MS's various hardware platforms under one all encompassing software platform (Live). Even if mainstream, high end games will be unavailable on mobile platforms like phones and tablets, simply having Live Arcade games as well as user profile access, communication, etc. will be a big plus and keep the service within short distance of the digitally connected consumer.
 
FYI here are the original sources to the 2 different rumors making the rounds. I had seen these before but didn't think they were too credible. Evidently everybody else did though.

It first started from MSNerd with this post on October 21, 2011:

http://msnerd.tumblr.com/post/11725284513/rhythm

E3 2013
Xbox "loop" announce

...

Build 2013
Xbox "loop" launch

I believe that is the original source of the "Xbox Loop" code-name.

He followed with another post on November 2, 2011:

http://msnerd.tumblr.com/post/12233928364/clarity

The Xbox is another story altogether. With a heady mix of rumors, tips and speculation, I am now stating that Xbox codename “loop” (the erstwhile XboxTV) will indeed debut a modified Win9 core. It will use a Zune HD-like hardware platform—a “main” processor with multiple dedicated assistive cores for graphics, AI, physics, sound, networking, encryption and sensors. It will be custom designed by Microsoft and two partners based on the ARM architecture. It will be cheaper than the 360, further enabling Kinect adoption. And it will be far smaller than the 360. It will also demonstrate how Windows Phone could possible implement Win9’s dev platform on the lower end.

The next rumor was started by Paul Thurrott of WinSuperSite.com on the Windows Weekly #233 podcast with Mary Jo Foley which aired November 3, 2011 on TWiT.tv.

http://twit.tv/show/windows-weekly/233

LiveSide was the first to post a summary on November 4, 2011.

http://www.liveside.net/2011/11/04/...tablets-xbox-and-windows-phone-all-next-year/

Basically Paul is saying that Microsoft will announce the next Xbox at CES 2012 and is...

Codenamed TEN, is all about Metro, embedded Silverlight and an Apple-like integration with Windows and Windows Phone

Hopefully that helps put the latest rumors in perspective.

Tommy McClain
 
Hmmm, how is an ARM design going to fare against a new CELL design in area and power? Methinks the ARM design would run away with its tail between its legs. Am I wrong?
 
The Amiga had it right, and did it almost 30 years ago :). PCs at the time had no dedicated hardware. By the 90s we saw specialization coming in, with Sound Blasters and VGA cards. Nowadays we're so used to these specializations we don't even see them anymore. For instance, you talk about general-purpose hardware, but ignore that the GPU is about as specialized and discrete as they come. Every PC nowadays comes with audio hardware (cheapo crap, sure, but it's there, and it still has SB compatibility buried deep under it) and the ridiculously specialized GPU. How is that so different from the Copper, Blitter and Paula?

Every new generation of processors has eventually integrated, or functionally absorbed, specialized parts into the more central processing elements of the computer.

FPUs, memory controllers, video decode, etc, and GPUs are part of this trend. However, high end GPUs remain too large and specialized for cost effective integration with CPUs, and it will probably be a long time before their complete absorption, by which time, there may be new computing paradigms in place. However, dedicated GPU dies themselves have absorbed the functions of other parts too like video decode.
 
Every new generation of processors has eventually integrated, or functionally absorbed, specialized parts into the more central processing elements of the computer.

FPUs, memory controllers, video decode, etc, and GPUs are part of this trend. However, high end GPUs remain too large and specialized for cost effective integration with CPUs, and it will probably be a long time before their complete absorption, by which time, there may be new computing paradigms in place. However, dedicated GPU dies themselves have absorbed the functions of other parts too like video decode.
I think we're talking about specialized silicon, not necessarily a separate chip. Even in the current gen Intel Core chips that have a GPU on the die, there is still silicon dedicated to the specialised task of pushing pixels. This is different from the early 90s, where the video hardware was purely for setting screen modes, with (if you were lucky) some memory for a buffer. All the work of setting pixels, shading pixels and processing geometry was handled by the general purpose CPU. Even if we absorb the graphics functions back into the CPU die, I think it'll always have a "graphics block" that is specialised for the kinds of loads that graphics and physics processing require.
 
I shudder at every mention of the nextbox being Zune-like. Might as well give up straight away if that's going to be the design philosophy... :D
 
Heterogeneous architecture with different CPUs for different tasks (AI, physics, sound etc.) is laughable.

More likely it will be a powerful SoC with 4GB of DDR3 and maybe some EDRAM or whatever. Perhaps the EDRAM will be a separate die as it is on the 360.

My suggestion (which is worthless) would be a tiny but very capable dual or tri core CPU and a big, flexible DX11+ GPU (think GTX460) on one die (or at least the same package). This would enable some very interesting possibilities and be perfectly feasible even on 40nm. Would maybe draw around 175-200W.

When I see games like Uncharted 3 or GT5 running on PS3, I have to think such a console would be capable of absolutely mind-blowing visuals.
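For what it's worth, that 175-200W guess can be sanity-checked with a rough component budget. Every figure below is my own illustrative assumption, not a sourced spec (a minimal sketch in Python):

```python
# Rough power budget for the suggested console: a small CPU plus a
# GTX460-class GPU on one die.  All numbers are illustrative
# assumptions, not measured or quoted specs.
budget_watts = {
    "cpu (small dual/tri core)": 25,
    "gpu (GTX460-class, down-clocked console bin)": 110,
    "4GB DDR3": 10,
    "optical drive + hdd": 15,
    "psu/vrm losses + misc": 25,
}
total = sum(budget_watts.values())
print(f"estimated total draw: {total} W")
```

With those assumptions the total comes out inside the 175-200W window, with the GPU dominating the budget as you'd expect.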
 
After the massive success of the X360's ATI GPU vs the PS3's Nvidia GPU, why would MS switch to Nvidia?
Just because Nvidia can make ARM cores?
Please, anyone can make ARM cores. That's the whole idea.
If Apple can build a custom SoC and slap an ARM CPU and a PowerVR GPU together, I see nothing difficult about slapping N ARM cores + an ATI/AMD GPU together.
And then, if you want to fit the rumors, call the compute shaders specialized physics, sound and AI "cores".
 
I just mean performance on the level of a GTX460, not necessarily made by NVIDIA. AMD's GCN looks like it would be great for a next gen console. I think the next gen is when we'll see GPGPU really taking off in games, and GCN is pretty badass in that regard. Then again, so is Fermi and Kepler will be even better.

For the CPU, I think something like the i3-2100 would be more than enough. It could handle the things the GPU isn't good at while being small and low power. Unfortunately the chance we'll see an Intel design in a next gen console is exactly 0%, but damn it would be nice.
 
Are there examples of a similar-cost single-die CPU/GPU combo besting a discrete CPU and a discrete GPU of similar cost (and power; also remember that a single die chip won't have the same footprint, due to defects, power/cooling, etc.)? While significant bandwidth is a good thing, I am not sure the communication benefits of a single die solution have really been shown to be a big win over having a significantly larger chip (e.g. a 50% larger GPU).
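The defect point can be made concrete with the classic Poisson yield model, Y = exp(-A·D0), where yield falls off exponentially with die area. The defect density, die areas and wafer cost below are all assumed values purely for illustration:

```python
import math

def poisson_yield(area_mm2, d0):
    """Classic Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-area_mm2 * d0)

# Assumed, illustrative numbers -- not real process data.
D0 = 0.004            # defects per mm^2
COST_PER_MM2 = 0.10   # silicon cost in $/mm^2
cpu_area, gpu_area = 120.0, 240.0

# Expected silicon cost per *good* part = area cost / yield.  With two
# discrete dies a defect only scraps the die it lands on; with one big
# combined die it scraps the whole thing.
cost_two_dies = (cpu_area * COST_PER_MM2 / poisson_yield(cpu_area, D0)
                 + gpu_area * COST_PER_MM2 / poisson_yield(gpu_area, D0))
cost_one_die = ((cpu_area + gpu_area) * COST_PER_MM2
                / poisson_yield(cpu_area + gpu_area, D0))

print(f"two discrete dies: ${cost_two_dies:.2f} of silicon per console")
print(f"one combined die:  ${cost_one_die:.2f} of silicon per console")
```

Under these toy numbers the two-die option costs roughly half as much silicon per working console, which is exactly the defect argument; the single die has to win that back through packaging, board and communication savings.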
 
Are there examples of a similar-cost single-die CPU/GPU combo besting a discrete CPU and a discrete GPU of similar cost (and power; also remember that a single die chip won't have the same footprint, due to defects, power/cooling, etc.)? While significant bandwidth is a good thing, I am not sure the communication benefits of a single die solution have really been shown to be a big win over having a significantly larger chip (e.g. a 50% larger GPU).

It may not be about winning the performance game. As we saw with the last generation, cost matters a lot to consumers, and no one wants to lose as much money this time around. I would expect a US$300 price point at launch, and counting inflation that is a lot cheaper than last time around. Further, all console vendors will want to start out the cycle a lot closer to breaking even.

A lower power SoC would also make for a much cheaper cooling solution, and saving $20 a box on cooling, case size, and decreased warranty claims due to a simpler design is absolutely huge in a $300 retail product.

EDIT: to be clear, Sony is going to be subject to these same pressures. I expect all of the big three to be more conservative than last time.
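As a back-of-envelope check on why $20 of per-box savings is "absolutely huge": multiply it over a console generation's unit volume. The 60 million figure below is an assumption for illustration; last-gen consoles each sold on that order:

```python
# Back-of-envelope: what a $20 per-box saving is worth over a console
# generation.  Unit volume is an assumed, illustrative figure.
saving_per_box = 20          # dollars, from the estimate above
units_sold = 60_000_000      # assumed lifetime units
total_saving = saving_per_box * units_sold
print(f"${total_saving / 1e9:.1f} billion over the generation")
```

Over a billion dollars back, which for a vendor that lost money on every box last generation is the difference between a write-off and a profitable platform.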
 
Are there examples of a similar-cost single-die CPU/GPU combo besting a discrete CPU and a discrete GPU of similar cost (and power; also remember that a single die chip won't have the same footprint, due to defects, power/cooling, etc.)? While significant bandwidth is a good thing, I am not sure the communication benefits of a single die solution have really been shown to be a big win over having a significantly larger chip (e.g. a 50% larger GPU).

Next gen won't be driven around visuals.
 
Hmmm, how is an ARM design going to fare against a new CELL design in area and power? Methinks the ARM design would run away with its tail between its legs. Am I wrong?

Any design on the roadmap from ARM itself (A15) wouldn't be competitive. But, there isn't any reason one couldn't build an aggressive, high power processor that implements the ARM instruction set: you would have to implement it yourself rather than licensing the design from ARM. Microsoft has the resources to do this, if they choose. I just have no idea if they will. But, given Microsoft's obvious desire to own all the IP wherever possible, I could see this route being an appealing option from that angle.

The claims I've seen indicate that Nvidia intends to take this route with Project Denver: it is claimed to be much more of a PC-level part (in both power consumption and performance) despite being an ARM device.
 
A nice little gimmick would be an SLI/Crossfire expansion module to enable stereoscopic mode and/or enhanced visuals. That way you could still cater for the hardcore crowd while the base model has a modest (for the time) GPU.
 