tongue_of_colicab
Veteran
We're not living in the '80s anymore though. There is a reason it died out. '80s monitors were large and expensive, but these days you can get 22" flatscreen monitors for under 100 euros.
So, um... you do realize that the 360 is running Windows, right? Sure, it's a heavily modified, limited Windows kernel, but it's still mostly compatible. Most of my tools will compile for either Xbox or Windows with a single makefile change.
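To illustrate what that kind of shared toolchain usually looks like (a generic sketch, not the poster's actual setup; the paths, file names and data layout are purely illustrative), the source stays common behind a couple of #ifdefs because the 360 XDK exposes a large subset of the Win32 API, and the makefile change mostly amounts to pointing the compiler and include/lib paths at the XDK instead of the desktop SDK:

    /* Generic sketch only: the same tool source built against either the
     * desktop Win32 headers or the Xbox 360 XDK. _WIN32 comes from the
     * desktop compiler; _XBOX is the macro the XDK toolchain defines.
     * The data path and file name are made up for the example. */
    #include <stdio.h>

    #ifdef _XBOX
      #include <xtl.h>                  /* the 360's stand-in for windows.h */
      #define DATA_ROOT "game:\\data\\"
    #else
      #include <windows.h>
      #define DATA_ROOT "data\\"
    #endif

    int main(void)
    {
        char path[MAX_PATH];
        sprintf(path, "%stextures.pak", DATA_ROOT);

        /* Much of the Win32-style file API exists on both targets. */
        HANDLE h = CreateFile(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                              OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
        if (h == INVALID_HANDLE_VALUE) {
            printf("could not open %s\n", path);
            return 1;
        }
        CloseHandle(h);
        return 0;
    }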
Intel's highest-end processor at the end of 2005 was the Pentium D, and I doubt anyone wanted that abomination in a console.
AMD had the Athlon X2 @ 2.4 GHz (with a TDP of 110 W). The Athlon might have higher performance than the Xenon in some areas, but it has nowhere near the same SIMD capabilities.
I don't think I need to compare these CPUs with the Cell, because everybody reading here should know by now that the Cell is helping the RSX in ways an Athlon X2 or a Pentium D never could.
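A rough sketch of what that SIMD gap looks like in code (a generic illustration, not from the thread; the GCC-style AltiVec and SSE intrinsics are stand-ins for the actual console toolchains, which have their own flavours). Xenon's VMX128 units do a 4-wide fused multiply-add per instruction, while K8-era SSE needs a separate multiply and add and internally splits each 128-bit operation into two 64-bit halves; back-of-envelope peaks of roughly 77 GFLOPS for Xenon's three 3.2 GHz cores versus roughly 19 GFLOPS for a 2.4 GHz Athlon X2 follow directly from that difference.

    /* Generic illustration: y[i] = a[i] * b[i] + c[i] over a large batch.
     * Assumes 16-byte-aligned arrays and n being a multiple of 4. */
    #include <stddef.h>

    #if defined(__ALTIVEC__) || defined(__VEC__)   /* VMX/AltiVec (Xenon, Cell PPU) */
    #include <altivec.h>
    void madd_batch(float *y, const float *a, const float *b,
                    const float *c, size_t n)
    {
        for (size_t i = 0; i < n; i += 4) {
            vector float va = vec_ld(0, &a[i]);
            vector float vb = vec_ld(0, &b[i]);
            vector float vc = vec_ld(0, &c[i]);
            /* one fused multiply-add per 4 floats */
            vec_st(vec_madd(va, vb, vc), 0, &y[i]);
        }
    }
    #elif defined(__SSE__)                         /* Athlon X2 / Pentium D */
    #include <xmmintrin.h>
    void madd_batch(float *y, const float *a, const float *b,
                    const float *c, size_t n)
    {
        for (size_t i = 0; i < n; i += 4) {
            __m128 va = _mm_load_ps(&a[i]);
            __m128 vb = _mm_load_ps(&b[i]);
            __m128 vc = _mm_load_ps(&c[i]);
            /* separate multiply and add; the K8 also executes each
             * 128-bit op as two 64-bit halves internally */
            _mm_store_ps(&y[i], _mm_add_ps(_mm_mul_ps(va, vb), vc));
        }
    }
    #else
    void madd_batch(float *y, const float *a, const float *b,
                    const float *c, size_t n)
    {
        for (size_t i = 0; i < n; ++i)
            y[i] = a[i] * b[i] + c[i];             /* scalar fallback */
    }
    #endif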
I thought Xenos was the X360 GPU, or was it Xenon? Microsoft seriously needs to dump the confusingly close codenames.
The reasons for the age of integrated home computers bought for games and productivity dying out are numerous. The point is that back then, one device used for everything was popular. It's also been popular with PCs. It's also popular in mobile form on tablets, and even on smartphones. It's even being extended to CE devices like Android TVs, with productivity apps being runnable on TVs via web services, though clearly that's early days yet. The notion that people take an active dislike to added value in devices confuses me. Where adding features adds cost, then yes, there's a reason against it. But where the internals of a device contain a processor, graphics, local storage and access to high-capacity non-volatile storage, and the device already runs a variety of software, extending that comes at little cost.
Not to derail this into a discussion that's been had many times before. Suffice it to say multifunction devices are on the increase, proving people are not against versatile CE devices.
Ironic comment coming from a poster named Xenus.
For power and heat reasons, absolutely. I still think both console CPUs are pretty impressive for their size/heat/power. For raw performance, though, I'm betting most devs would rather have seen a 3.2 GHz PD in the 360 than Xenon.
And what exactly does that SIMD performance translate into in the real world? Let me put it another way: Xenon has more theoretical SIMD performance than an i7 920. Which app/game/anything demonstrates that?
Possibly; it's never been demonstrated or even attempted, so who knows? But Cell is halfway to a GPU anyway, which is why it performs so well under those types of workloads. No doubt it helps RSX out a lot, but what would have performed better, an Athlon X2 with Xenos or Cell with RSX? If your GPU is good enough in the first place, then it's better to have a good all-round CPU than a very weak CPU that can also act as a decent prop for a slow GPU.
Out of curiosity, I wonder what developers would have preferred: Xenon with an 8800GTX or Cell with an 8800GTX (the 8800 is there to remove the GPU limitations).
It really isn't that hard...
Xenos is the GPU; Xenon is the CPU.
Raw performance of what??
Performance certainly isn't defined by the ISA in any capacity so I'm not sure what you're trying to say here...?
Read my previous posts...
CELL is far from very weak & in typical PS3 game workloads today would piss all over an Athlon X2 with great ease.
Square Enix: Next gen to bring 'a big leap' in graphics
Stylised games may seem to plateau, but realism will raise the bar, says technology director Julien Merceron.
The next generation of games machines will be driven to new levels of graphics realism, says Julien Merceron, worldwide technology director at Square Enix.
"I think that we're still going to see a big leap in graphics," Merceron told VideoGamer.com at Develop this week. "In terms of technology I think we'll see developers taking advantage of physically-based rendering, physically-based lighting. I think people will take advantage of global illumination, or at least some form approximation of global illumination, so that could have a significant impact on graphics quality.
Physically-based rendering and global illumination are techniques that allow coders to create photo-realistic effects. Both processes are already used in CGI film-making - and this in turn could benefit game developers.
"It's going to enable new forms of art direction, but it's also going to enable deeper convergence between multiple media - being able to share more assets horizontally between movies, TV series and games," said Merceron.
"This means that when you're doing a cartoon, or when you're doing an animated movie, you could think about an art direction for the game that could be far closer [than current tie-ins]. Obviously it won't be the same, because the processing power won't be there, but you can think about art directions being way closer. And you can think about assets being re-used."
But Merceron believes that graphical advances will be most evident in games that strive for a realistic appearance.
"There's a lot of room for improvement, and consumers will be able to see that in future graphics innovation techniques," he said. "Now, if you take most of the Pixar movies from the last five to six years… do you see a big difference between one that was released five years ago, and one that was released last year? I'm actually not sure we see a huge difference.
"But if you take a film like Avatar, there's a huge leap in the graphics techniques that are being used and the level of realism. The conclusion I would draw from that is we might end up seeing the difference way more in realistic-looking games, rather than those trying to achieve a cartoony look. At some point, with all these games [that are] going for a cartoony look, consumers might get the feeling that it's plateauing. But for games striving for a very realistic look, it's going to be easy to see all the improvements, all the time."
During the same conversation, Merceron expressed his belief that social networks and similar experiences will play a major role in the future of gaming.
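For anyone wondering what "physically-based rendering" in the article boils down to at the shading level, here is a minimal, single-channel sketch of the Cook-Torrance/GGX style lobe that most PBR pipelines are built around (plain C for readability rather than shader code; the function and parameter names are just for this example, and real engines evaluate full RGB with textured inputs and energy-conserving weighting between the lobes):

    #include <math.h>

    typedef struct { float x, y, z; } vec3;

    static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static vec3  add3(vec3 a, vec3 b) { vec3 r = { a.x+b.x, a.y+b.y, a.z+b.z }; return r; }
    static vec3  norm3(vec3 v) { float s = sqrtf(dot3(v, v)); vec3 r = { v.x/s, v.y/s, v.z/s }; return r; }
    static float clamp01(float x) { return x < 0.f ? 0.f : (x > 1.f ? 1.f : x); }

    /* Lambert diffuse plus a Cook-Torrance specular lobe with a GGX normal
     * distribution, Schlick Fresnel and a Schlick-GGX (Smith) geometry term.
     * Returns reflectance for one light direction; a renderer multiplies this
     * by the light colour and N.L and sums over lights / GI samples. */
    float pbr_brdf(vec3 n, vec3 v, vec3 l, float roughness, float f0, float albedo)
    {
        vec3  h  = norm3(add3(v, l));               /* half vector */
        float nl = clamp01(dot3(n, l));
        float nv = clamp01(dot3(n, v));
        float nh = clamp01(dot3(n, h));
        float vh = clamp01(dot3(v, h));
        float a  = roughness * roughness;

        /* GGX normal distribution function */
        float a2 = a * a;
        float dd = nh * nh * (a2 - 1.f) + 1.f;
        float D  = a2 / (3.14159265f * dd * dd);

        /* Schlick's Fresnel approximation */
        float F  = f0 + (1.f - f0) * powf(1.f - vh, 5.f);

        /* Schlick-GGX geometry term, separable Smith form */
        float k  = a / 2.f;
        float G  = (nv / (nv * (1.f - k) + k)) * (nl / (nl * (1.f - k) + k));

        float specular = (D * F * G) / (4.f * nv * nl + 1e-5f);
        float diffuse  = albedo / 3.14159265f;      /* Lambert */
        return diffuse + specular;
    }

Global illumination in this picture then amounts to feeding a BRDF like this with indirect light gathered from the scene rather than only direct lights, which is where most of the extra processing power Merceron is anticipating would go.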
You listed a few theoretical examples where you think SIMD performance is useful.
I'm asking you to put that into the context of everything else that goes on in a game and provide some real-world examples of where it's resulted in something like Xenon, with its apparent SIMD advantage, giving superior results to something like an i7 920.
Obviously there's a lot of power in Cell, but it's all down to the SPUs, and most of it is spent on tasks that would be better performed on a decent GPU.
That's just it: what exactly would you want your general-purpose CPU to do? By very weak I was really referring to the PPE. As I said, if you have a decent GPU in the first place, thus negating the need for much of the work Cell does in those PS3 workloads, then are you better off with Cell or a good all-round CPU?
As a processor to help out with GPU work, Cell certainly is much better than an AX2, and probably in some ways better than modern quads too. But if you don't need that help on the GPU front, is it better than an AX2 for everything else? Certainly I don't think there's anything on PS3 that couldn't be done on an Athlon X2-powered console with a better GPU.
I'd rather we not end up in situations where x86 CPUs have been adopted under the assumption that GPGPU means we can de-emphasize SIMD performance on the main processor. What if it means we're stuck at 720p because the GPU is spending so much time doing physics, sound, or whatever? Especially when, at the same transistor budget, you could probably have a Cell-like design where you're producing 1080p and actually offloading graphics tasks to the CPU without sacrificing anything. What task that a Sandy Bridge is best at is so critically important that it would be worth potentially giving up 4-8 times the SIMD power?
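For context on the kind of work that argument is about, here is a generic sketch (not from the thread; the function name and data layout are illustrative) of a flat, data-parallel physics loop, the sort of job that lives comfortably on SPUs or wide CPU SIMD and otherwise eats into the GPU's frame budget. Plain SSE is used here; an SPU or VMX128 version would be structured the same way. It assumes structure-of-arrays layout (one coordinate axis per call), 16-byte-aligned arrays, and a count that is a multiple of 4.

    /* Semi-implicit Euler integration of four particles per iteration. */
    #include <xmmintrin.h>
    #include <stddef.h>

    void integrate_particles(float *pos, float *vel, const float *acc,
                             float dt, size_t count)
    {
        __m128 vdt = _mm_set1_ps(dt);
        for (size_t i = 0; i < count; i += 4) {
            __m128 a = _mm_load_ps(&acc[i]);
            __m128 v = _mm_load_ps(&vel[i]);
            __m128 p = _mm_load_ps(&pos[i]);

            v = _mm_add_ps(v, _mm_mul_ps(a, vdt));   /* v += a * dt */
            p = _mm_add_ps(p, _mm_mul_ps(v, vdt));   /* p += v * dt */

            _mm_store_ps(&vel[i], v);
            _mm_store_ps(&pos[i], p);
        }
    }

Whether loops like this run on CPU SIMD or get pushed to the GPU is exactly the trade-off being debated: every millisecond the GPU spends on them is a millisecond not spent on resolution or shading.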