About next-gen consoles' CPUs...

darkblu said:
Yes, the next generations of consoles will be 'powerful enough' to run modern apps. question is, will they be more powerful at it when compared to the next generation of x86s/g5s? i doubt it very much. (no, i don't need to see the zeta benchmarks, beos 5 has been my home platform for the past 4 years, i'm aware what difference the os design/architecture makes). so would i rather see zeta running on a dual athlon64 mp rather than on an xcpu/cell - bloody right yes.
What apps do we need to run faster than they currently do? There's word processing, maybe other office apps, video encoding/decoding, audio encoding/decoding, photo editing, games, web browsing. These are the major activities, are they not?

Word processing runs in the main at the speed the user does. Same with most office apps unless you're doing big database stuff. Video, audio and imaging work should be ideal for Cell. That is, a hundredfold increase in office performance is going to have negligible results for your average home user, whereas a hundredfold increase in media performance will. The web's only as fast as your connection, so processors are only good for Flash/video, which comes under the other categories.

If PS3 Linux, say, can run most 'office' stuff as fast as my Athlon 2500, that's fast enough. Heck, I used a PIII 800 before this and it ran Office fine. I only upgraded because of media work. Even if Cell is so poor that it only runs office apps as fast as an 800 MHz PIII, as long as it massively boosts media apps as we expect, that's a better use of performance advancement than continuing GP advances.

I ask the question of you personally: what do you use your PC for, and what would you expect a next-gen x86/G5 to be noticeably and importantly better at than Cell?
 
Could someone explain to me what actually constitutes general purpose processing?

The most important hardware-related demand for the user experience is that the computer updates the screen fast enough and keeps waiting to a minimum.
All the demanding tasks I can think of, for anything but strictly scientific use, involve multimedia-style processing (copying lots of data, transforming it, matrix transformations, etc.).

There are the OS and certain kinds of applications (spreadsheets?) that do much of their work in cache only and have no use for SIMD or FLOPS, but aren't the requirements of those kinds of apps minimal today compared to the power of the rest of a modern computer?
 
"General purpose computing"

Generally (at least in the context of these discussions) refers to data shuffling.
Most applications touch a lot of data and do very little actual work on the data.

Graphics programs and certain other numerically intensive tasks tend to have much more predictable data access patterns, have data that is independent from neighboring data, and do much more work per piece of data.

Data shuffling is prevalent even in graphics to some extent though; many PC games use complex spatial subdivision schemes to reduce the geometry they draw. Walking these data structures is a potentially very expensive operation, but on a PC you have a lot of CPU power and it's "good" at walking these types of data structures. On consoles (PS2 in particular) you tend to avoid the overhead of these data structures in favor of just drawing extra stuff.
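To make the "data shuffling" point concrete, here's a rough, made-up sketch (C++ with an invented node layout, not from any real engine) of the kind of structure walk being described: almost every step is a dependent pointer load into memory you probably haven't touched recently, with only a dot product of actual math per node.

```cpp
#include <vector>

// Hypothetical node of a spatial subdivision tree: mostly pointers and
// bookkeeping, very little arithmetic per visit.
struct Node {
    float splitPlane[4];          // plane this node splits space on
    Node* child[2];               // child[0] assumed to be the positive side
    std::vector<int> meshIndices; // geometry stored at the leaves
};

// Walk the tree front-to-back relative to the camera and collect geometry.
// Each recursion is a cache-unfriendly pointer chase plus one dot product of
// real work -- the sort of code a big out-of-order PC CPU hides well.
void collectGeometry(const Node* n, const float cam[3], std::vector<int>& out)
{
    if (!n) return;
    if (n->child[0] == nullptr && n->child[1] == nullptr) {          // leaf
        out.insert(out.end(), n->meshIndices.begin(), n->meshIndices.end());
        return;
    }
    float d = n->splitPlane[0] * cam[0] + n->splitPlane[1] * cam[1]
            + n->splitPlane[2] * cam[2] + n->splitPlane[3];
    collectGeometry(n->child[d >= 0.0f ? 0 : 1], cam, out);          // near side first
    collectGeometry(n->child[d >= 0.0f ? 1 : 0], cam, out);          // then far side
}
```

A big out-of-order CPU with large caches hides most of that latency; an in-order console CPU doesn't, which is why on something like PS2 it can be cheaper to skip the traversal and just draw more.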

The bulk of a modern game is basically data shuffling, but the small percentage of "stream processing" can dominate execution time.
 
Shifty Geezer said:
What apps do we need to run faster than they currently do? There's word processing, maybe other office apps, video encoding/decoding, audio encoding/decoding, photo editing, games, web browsing. These are the major activities, are they not?

Word processing runs in the main at the speed the user does. Same with most office apps unless you're doing big database stuff. Video, audio and imaging work should be ideal for Cell. That is, a hundredfold increase in office performance is going to have negligible results for your average home user, whereas a hundredfold increase in media performance will. The web's only as fast as your connection, so processors are only good for Flash/video, which comes under the other categories.

If PS3 Linux, say, can run most 'office' stuff as fast as my Athlon 2500, that's fast enough. Heck, I used a PIII 800 before this and it ran Office fine. I only upgraded because of media work. Even if Cell is so poor that it only runs office apps as fast as an 800 MHz PIII, as long as it massively boosts media apps as we expect, that's a better use of performance advancement than continuing GP advances.

I ask the question of you personally: what do you use your PC for, and what would you expect a next-gen x86/G5 to be noticeably and importantly better at than Cell?

<rant> darn, i absolutely refuse to retype all the text i had just before my browser went down, so this time it'll have to do with the terse version</rant>

shifty, forget the office apps. there's your os there (kernel, services, window managers and what not) with generally tons of code that cares about completely different things than the data access habits of some 'exotic' cpus. now just imagine having your desktop os re-written, not just recompiled with quick patches, in order to provide the same level of performance as the generic desktop near you. 'heck, why is my new freetype5 library under ps4linux drawing so much slower than on my old desktop?'. the solution to that is, for every significant piece of general-purpose system code (see ERP's definition of gp code above), for somebody to spend adequate time and effort to get things to a proper level of your-exotic-cpu-friendliness, or even above that, so that you'd not notice that it was not originally written for it. now, i'm personally all for that, i just don't see it happening. at least not during this 'next-gen' of consoles.

as for how i use my home machine time - it's approximately 80% textual data reading/writing, 15% compiler runs, 25% compiled apps runs/debugging, 100% mp3 playback (percentages amount to more than 100% because activities overlap).
 
darkblu said:
...for somebody to spend adequate time and effort to get things to a proper level of your-exotic-cpu-friendliness, or even above that, so that you'd not notice that it was not originally written for it. now, i'm personally all for that, i just don't see it happening. at least not during this 'next-gen' of consoles.
I guess the question is how much non-reworked OS and system stuff is going to crawl on a PPE? Even at a quarter the speed of my current PC it'd be comfortably usable. But if performance is more like 1/10th of my machine then there will be significant slowdown for non-optimized code. I only hope Sony are serious about their Linux ideas and aren't just bundling a recompiled, non-optimized kernel. Otherwise they're kinda shooting themselves in the foot, offering what they claim is a super-powerful processor and yet showing a dog-slow OS because it's not written for the platform.
 
I must ask this about the SPEs' 256 KB memory.
Do you think it would have been better with a regular cache?
What pros/cons do you see as a whole, or is it too early to say?
 
Shifty Geezer said:
I only hope Sony are serious about their Linux ideas and aren't just bundling a recompiled, non-optimized kernel.
The only money to be made with Linux is in services and support. What you should be asking yourself is "how is Sony going to make money from this product?" rather than hoping they'll pour money into making it optimum. I think it's more likely that it'll be up to the open source community to make it good, and that's kind of a crap shoot. I can see IBM putting money into it, they might leverage that later into servers or something, but the sorts of things Sony would bring (DRM enabled media player, maybe) aren't what most people running Linux would want.

overclocked said:
I must ask this about the SPEs' 256 KB memory.
Do you think it would have been better with a regular cache?
This was beaten into submission (or bickering) in another thread, do a search. It basically comes down to: "if whatever you're doing fits into the LS then it'll run very well indeed, if it doesn't then not so much". The point of moving to LS was to help alleviate the memory bandwidth problems, so having it be cache would have been pointless from that standpoint. If you're using the SPEs in a way where bandwidth is a limiting factor then you're using them sub-optimally. For example, it'd be better to do expensive calculations (or come up with a good approximation, which is probably always the better choice actually) rather than use a lookup table, which would be the old-school optimization approach. The caveat is that I don't think achieving maximum efficiency and the real world will necessarily coincide when making games for Cell, so it's always better to use the SPEs for something, even if they're not going to be blowing through calculations, rather than letting them sit idle.
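As a toy illustration of that lookup-table-versus-calculation trade-off (made-up C++ for illustration, not anyone's actual SPE code): the table version turns every call into memory traffic, while the short polynomial burns a few extra multiplies but stays entirely in registers.

```cpp
#include <cmath>
#include <cstddef>

// Old-school approach: a table in main memory.  On an SPE, anything that
// doesn't already sit in local store turns into a DMA round trip, so the
// "optimization" becomes the bottleneck.
static float sinTable[4096];     // assumed to be filled at startup

float sinFromTable(float x)      // x in [0, 2*pi)
{
    std::size_t i = static_cast<std::size_t>(x * (4096.0f / 6.2831853f)) & 4095;
    return sinTable[i];
}

// Compute-heavy approach: a short polynomial that lives entirely in
// registers -- more arithmetic, zero memory traffic, which is exactly the
// trade the SPEs favour.  (These are just the first Taylor terms for
// illustration; a proper minimax fit would be more accurate.)
float sinApprox(float x)         // x in [-pi, pi]
{
    float x2 = x * x;
    return x * (1.0f + x2 * (-1.0f / 6.0f + x2 * (1.0f / 120.0f)));
}
```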
 
chachi said:
The only money to be made with Linux is in services and support. What you should be asking yourself is "how is Sony going to make money from this product?" rather than hoping they'll pour money into making it optimum.
Indeed, we need to understand their ulterior motive for providing 'computing' functions. One is software licenses for more programs than just games. If every art package and media player saw a few bucks head in Sony's direction they'd be very pleased. Also it'll work as the framework for the distributed media. Linux-based content programs would be sending 20 cents per music single, $1 per album, $1.50 per film, downloaded over Sony's network services. You wouldn't NEED a full OS for this, but if you've got a Linux kernel it wouldn't take much more to open it up into a full OS - in essence Sony adds the OS as an afterthought for extra value in the HDD package. And if I were Sony I'd allow community development but provide all content through an access portal. By all means let Eddy Geek write his own little games and apps for PS3 owners to buy for a few measly dollars, but ensure EVERY transaction slices a little profit in Sony's direction. With a suitable DRM system so that content wouldn't run without passing through their portal, they could get the rest of the world developing content for them to profit from.
 
chachi said:
This was beaten into submission (or bickering) in another thread, do a search. It basically comes down to: "if whatever you're doing fits into the LS then it'll run very well indeed, if it doesn't then not so much".
Of course, the Cell isn't limited to using just one SPU on a problem. Do one half of your algo on one SPU, then punt the partly finished result across the EIB to another to finish it off. The bus transfers 768 bits/cycle after all, which is quite a lot, and this way memory size is essentially almost doubled (256 KB SRAM to 512 KB, though a small amount would get eaten up by transfer buffers). Even more needed? Use a third SPU, or a fourth; this is the way the machine's designed to function. No need to access main memory, just pass the data along internally; there's one and three quarter megabytes of on-chip SRAM and 896 (!) 128-bit registers usable for processing in PS3, on top of what the PPE also offers. Will STORAGE really be a problem? :)
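A rough sketch of what one stage of such an SPE-to-SPE pipeline might look like, written against the Cell SDK's SPU DMA intrinsics (spu_mfcio.h) as far as I know them. CHUNK_BYTES, firstHalfOfAlgorithm and the way the downstream SPE's local-store address arrives are all hypothetical, so treat it as an illustration of the idea rather than production code.

```cpp
#include <spu_mfcio.h>   // Cell SDK SPU DMA intrinsics (mfc_get / mfc_put)
#include <stdint.h>

#define TAG          1
#define CHUNK_BYTES  16384                    // 16 KB chunk, fits easily in LS

static char buffer[CHUNK_BYTES] __attribute__((aligned(128)));

// Hypothetical first half of some algorithm -- here it just scales the data.
static void firstHalfOfAlgorithm(char* data, unsigned bytes)
{
    for (unsigned i = 0; i < bytes; ++i)
        data[i] = static_cast<char>(data[i] * 2);
}

// srcEA: effective address of the input in main memory.
// nextSpeLsEA: effective address of the downstream SPE's local store,
// handed to this stage by the PPE at startup (hypothetical arrangement).
void processAndForward(uint64_t srcEA, uint64_t nextSpeLsEA)
{
    // Pull a chunk of input from main memory into our local store.
    mfc_get(buffer, srcEA, CHUNK_BYTES, TAG, 0, 0);
    mfc_write_tag_mask(1 << TAG);
    mfc_read_tag_status_all();                // wait for the DMA to land

    firstHalfOfAlgorithm(buffer, CHUNK_BYTES);

    // Push the partly finished result straight into the next SPE's local
    // store over the EIB -- main memory is never touched in between.
    mfc_put(buffer, nextSpeLsEA, CHUNK_BYTES, TAG, 0, 0);
    mfc_write_tag_mask(1 << TAG);
    mfc_read_tag_status_all();
}
```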
 
chachi said:
The only money to be made with Linux is in services and support. What you should be asking yourself is "how is Sony going to make money from this product?" rather than hoping they'll pour money into making it optimum. I think it's more likely that it'll be up to the open source community to make it good, and that's kind of a crap shoot. I can see IBM putting money into it, they might leverage that later into servers or something, but the sorts of things Sony would bring (DRM enabled media player, maybe) aren't what most people running Linux would want.
Actually there's already an excuse for how Linux benefits SCE - in the EU you can claim the PS3 is a computer and be exempt from the tax on video game consoles.

Also, if Sony brings a DRM media player, it'll be built into the "main" real-time OS in the PS3 firmware, not into a Linux install on the HDD. As Cell can host multiple OSs simultaneously, I expect the main OS runs in isolated mode (protected DRM mode) on an SPE even while you boot Linux from the HDD. It's like one of the SPEs is always crippled when you have fun with Linux, but that won't be a big letdown as you have enough power in Cell already. It's also a necessary step from the point of view of preserving the interests of licensed PS3 developers. A Cell PC with 2 gigs of RAM would be ideal, but PS3 is enough for hobby use and OSS projects, seeing PSP hackers having fun (though firmware hacking is not intended :p ). Just my 2 cents.
 
I think you're overestimating the performance demands of these simple desktop tasks. Try watching the speed of your notebook CPU when browsing the web -- it runs at only a fraction of its top frequency. I've seen GNOME run fine on a superscalar in-order processor with no on-chip L2, so I don't see why a vastly higher clocked dual-issue in-order chip with 512 KB on-chip L2 and multithreading would somehow be so much slower, as you seem to think.

Hell, I've even run desktop apps on my A64 PC alongside several minimized games without noticing any significant performance degradation. Just because today's desktop PC processors have a certain level of performance does not mean that level is required or important for all applications.

darkblu said:
<rant> darn, i absolutely refuse to retype all the text i had just before my browser went down, so this time it'll have to do with the terse version</rant>

shifty, forget the office apps. there's your os there (kernel, services, window managers and what not) with generally tons of code that cares about completely different things than the data access habits of some 'exotic' cpus. now just imagine having your desktop os re-written, not just recompiled with quick patches, in order to provide the same level of performance as the generic desktop near you. 'heck, why is my new freetype5 library under ps4linux drawing so much slower than on my old desktop?'. the solution to that is, for every significant piece of general-purpose system code (see ERP's definition of gp code above), for somebody to spend adequate time and effort to get things to a proper level of your-exotic-cpu-friendliness, or even above that, so that you'd not notice that it was not originally written for it. now, i'm personally all for that, i just don't see it happening. at least not during this 'next-gen' of consoles.

as for how i use my home machine time - it's approximately 80% textual data reading/writing, 15% compiler runs, 25% compiled apps runs/debugging, 100% mp3 playback (percentages amount to more than 100% because activities overlap).
 
ban25 said:
I think you're overestimating the performance demands of these simple desktop tasks. Try watching the speed of your notebook CPU when browsing the web -- it runs at only a fraction of its top frequency. I've seen GNOME run fine on a superscalar in-order processor with no on-chip L2, so I don't see why a vastly higher clocked dual-issue in-order chip with 512 KB on-chip L2 and multithreading would somehow be so much slower, as you seem to think.

it may not be noticeably slower for generic code written one or two generations ago, but it very likely will be slower for generic code written for the contemporary generation of desktop cpus (next-gen x86s/g5s), to the degree that the previous generation of desktop cpus may beat next-gen console cpus when running that code. you know, developers are the kind who will always find ways to consume any extra performance hw vendors throw at them. and desktop cpu vendors have been throwing quite some power of the 'generic' type at desktop devs for the past several generations, and will do so this time around again. that's exactly the cause of the whole 'desktop game developers complain about next-gen cpus' soap opera that emerges now and then.

Hell, I've even run desktop apps on my A64 PC alongside several minimized games without noticing any significant performance degradation.

<sidenote> minimized games don't eat cpu time when written properly. </sidenote>

which games were those? what desktop apps did you run in parallel?

Just because today's desktop PC processors have a certain level of performance does not mean that level is required or important for all applications.

i beg to disagree. it is often required for very fundamental parts of the desktop like the os, and various other essentials. when was the last time you opened a postscript-abundant document (pdf, ps) under a viewer that has a precise type3 rasterizer (e.g. libxpdf)? notice the page redraw time there.
 
nAo and others:
Does doing a ray trace on a particle imply that the particle will pretty much follow a straight line path to the next object it encounters?
If so, how do you take into account physics affecting particles, like wind blowing snow or sand?
 
You'd cast a ray along the direction vector the particle is travelling and measure for a distance up to the particle's velocity that frame. Each frame the physics engine might change that direction, so you sample a different direction vector and distance. It doesn't factor in true smooth curves, but at 60 fps it'll be good enough that people are unlikely to notice errors.
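Something like this, as a minimal made-up sketch (C++, with a ground plane standing in for the real collision query):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Stand-in scene query: intersect the ray with a ground plane at y = 0.
// A real engine would test against its own collision structures instead.
static bool raycast(const Vec3& o, const Vec3& d, float maxDist, float* hitDist)
{
    if (d.y >= 0.0f) return false;            // moving up or parallel: no hit
    float t = -o.y / d.y;                     // distance along the ray to the plane
    if (t < 0.0f || t > maxDist) return false;
    *hitDist = t;
    return true;
}

// One frame of a particle: cast a ray along the current velocity, only as far
// as the particle can travel this frame, and stop at whatever it hits.
static void stepParticle(Vec3& pos, Vec3& vel, float dt)
{
    float speed = std::sqrt(vel.x * vel.x + vel.y * vel.y + vel.z * vel.z);
    float travel = speed * dt;
    if (travel <= 0.0f) return;

    Vec3 dir = { vel.x / speed, vel.y / speed, vel.z / speed };
    float hitDist;
    if (raycast(pos, dir, travel, &hitDist))
        travel = hitDist;                     // clamp the step to the surface

    pos.x += dir.x * travel;
    pos.y += dir.y * travel;
    pos.z += dir.z * travel;
}
```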
 
blakjedi said:
nAo and others:
Does doing a ray trace on a particle imply that the particle will pretty much follow a straight line path to the next object it encounters?
No, it only means a particle moves on a straight line instantaneously.
blakjedi said:
If so, how do you take into account physics affecting particles, like wind blowing snow or sand?
External forces can be modeled as impulses that modify the particle's velocity vector.
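A minimal sketch of that idea (made-up C++, illustrative numbers only): the forces only nudge the velocity each frame, so each individual step is still a straight line, but the path curves over many frames.

```cpp
struct Vec3 { float x, y, z; };

// Fold external forces (wind, gravity) into the particle's velocity each
// frame before the straight-line step from the previous post is taken.
void applyForces(Vec3& vel, const Vec3& wind, float dt)
{
    const float gravity = -9.8f;              // m/s^2, purely illustrative
    vel.x += wind.x * dt;                     // wind nudges the velocity vector...
    vel.y += wind.y * dt + gravity * dt;
    vel.z += wind.z * dt;
    // ...so next frame's ray is cast along a slightly different direction,
    // and the accumulated path curves even though each step is straight.
}
```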
 