So there's no way PS3 CPU is getting more than one PE~Cell?

HDTV is coming to Europe too, it just takes a bit longer than in the US. I personally refuse to play next-generation games in 480i, even if I have to hook the console up to my monitor. Here in Finland LCD TVs are starting to sell very well; prices have come down a lot over the last few years and I expect that to continue. I think it is clear that even the PAL versions of the next-gen machines will support HDTV resolutions, and that is all we need. It doesn't matter then if grannies don't want to upgrade.
 
london-boy said:
I think there should be a push of content first, to persuade people that there is a reason for going HDTV. Flat panels on their own won't convince people to upgrade, cause if you can't play any HD content on them, they're useless.
People don't buy flatscreens because they support higher resolutions. People buy flatscreens because they are flat. It just so happens that when people buy a flatscreen they often also buy a HD set whether they know it or not.

Obviously when content becomes available it will become a factor as well but there will be lots of people buying HD sets before then.
 
cybamerc said:
london-boy said:
I think there should be a push of content first, to persuade people that there is a reason for going HDTV. Flat panels on their own won't convince people to upgrade, cause if you can't play any HD content on them, they're useless.
People don't buy flatscreens because they support higher resolutions. People buy flatscreens because they are flat. It just so happens that when people buy a flatscreen they often also buy a HD set whether they know it or not.

Obviously when content becomes available it will become a factor as well but there will be lots of people buying HD sets before then.

That's why I made my previous post: once there is content, people will know what HDTV is and go buy an HDTV, whatever HD set is available.
If there is no content, they will just buy "a plasma TV", the cheaper the better, which will likely only support 480p.
If there is content and public awareness, the whole HDTV transition will be much smoother.
 
cybamerc said:
london-boy said:
If there is no content, they will just buy "a plasma TV", the cheaper the better, which will likely only support 480p.
Or a dirt-cheap LCD that supports 720p.

Yep. Personally I don't care; I'll plug my next-gen machine into my LCD screen myself.
 
Megadrive1988 said:
[image: ee1.jpg (Sony slide of EE transistor counts)]


thanks, One :oops:


I either have not seen this slide, or haven't seen it in a long, long time.

You know that picture only shows it at around 200-300 million btw, not 500 million (it's a log scale, remember).

Also, the figures Sony is showing there look much higher than Moore's law to me (Intel processors traditionally followed Moore's law up to the Pentium, and you can see the slope is much steeper for Sony's EEs).
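
As a rough sanity check on that slope, here's a quick calculation in C. The ~10.5 million transistor figure for the original EE is my own assumption (I can't read exact values off the slide), and Moore's law is taken as a doubling roughly every two years:

Code:
#include <stdio.h>
#include <math.h>

int main(void)
{
    double start    = 10.5e6; /* assumed EE transistor count at launch, 1999 */
    double years    = 6.0;    /* 1999 to 2005 */
    double doubling = 2.0;    /* assumed Moore's law doubling period, in years */

    /* Project the transistor count forward under Moore's law. */
    double projected = start * pow(2.0, years / doubling);
    printf("Moore's law projection for 2005: %.0f million transistors\n",
           projected / 1e6);
    /* Prints about 84 million, so a chip in the 200-300 million range
     * by 2005 would indeed sit well above the Moore's law slope. */
    return 0;
}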
 
DemoCoder said:
If you're still using a 480p SDTV/EDTV in 2006-2011, you're a loser. :) 720p sets will exist at the sub-$1000 price point by then, and 1080p sets will be sub-$2000

IMHO, with HD-DVD and Blu-Ray supporting 1080p, and all major manufacturers moving to 1080p, designing a next-gen system to support anything less than 1080p framebuffers as standard is a big mistake.

Maybe not 720p but 1080i under $1000. There are a few models there already, but they're 30-inch CRTs.

The only products supporting close to 720p are fixed-pixel displays like LCD or plasma, and that's because they're not quite ready to do 1080p yet. The same goes for DLP, I'm sure. But they are all aiming at 1080p, which is going to be found only on the high-priced units for the next year or two.
 
slashdot CELL article:
-------------------------

The Cell is just a PowerPC with some extra vector processing.
Not quite. The Cell is 9 complete yet simple CPUs in one. Each handles its own tasks with its own memory. Imagine 9 computers, each with a really fast network connection to the other 8. You could probably treat them as extra vector processors, but you'd then miss out on a lot of potential applications. For instance, the small processors can talk to each other directly rather than going through the PowerPC at all.
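
To make the "9 computers on a fast network" picture concrete, here is a toy model in plain C with pthreads. This is not Cell SDK code, just an illustration of cores that keep private memory and talk through mailboxes instead of shared state; all the names are invented:

Code:
#include <stdio.h>
#include <pthread.h>

#define N_WORKERS 8
#define LOCAL_WORDS 1024        /* stand-in for a small local store */

typedef struct {
    pthread_mutex_t lock;
    int value;
    int full;
} mailbox_t;

static mailbox_t mbox[N_WORKERS];

static void mbox_send(int to, int value)
{
    pthread_mutex_lock(&mbox[to].lock);
    mbox[to].value = value;
    mbox[to].full = 1;
    pthread_mutex_unlock(&mbox[to].lock);
}

static int mbox_try_recv(int id, int *out)
{
    int got = 0;
    pthread_mutex_lock(&mbox[id].lock);
    if (mbox[id].full) {
        *out = mbox[id].value;
        mbox[id].full = 0;
        got = 1;
    }
    pthread_mutex_unlock(&mbox[id].lock);
    return got;
}

static void *worker(void *arg)
{
    int id = (int)(long)arg;
    int local[LOCAL_WORDS];     /* private memory: no other worker sees it */
    int msg;

    local[0] = id * 100;
    /* Workers talk to each other directly, bypassing the "PPC" entirely. */
    mbox_send((id + 1) % N_WORKERS, local[0]);
    while (!mbox_try_recv(id, &msg))
        ;                       /* spin until our neighbour writes to us */
    printf("worker %d received %d from worker %d\n",
           id, msg, (id + N_WORKERS - 1) % N_WORKERS);
    return NULL;
}

int main(void)
{
    pthread_t t[N_WORKERS];
    for (int i = 0; i < N_WORKERS; i++)
        pthread_mutex_init(&mbox[i].lock, NULL);
    for (int i = 0; i < N_WORKERS; i++)
        pthread_create(&t[i], NULL, worker, (void *)(long)i);
    for (int i = 0; i < N_WORKERS; i++)
        pthread_join(t[i], NULL);
    return 0;
}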

Sony will have to sell the PS3 at an incredible loss to make it competitive.
Hardly. Sony is following the same game plan as they did with their Emotion Engine in the PS2. Everyone thought they were losing $100 to $200 per machine at launch, but financial records have shown that, aside from the initial R&D (the cost of which is hard to pin down), they were only selling the PS2 at a small loss initially, and were breaking even by the end of the first year. By fabbing their own units they took a huge risk, but they reaped huge benefits. Their risk and reward is roughly the same now as it was then.

Apple is going to use this processor in their new machine.
Doubtful. The problem is that though the main CPU is PowerPC-based like current Apple chips, it is stripped down, and the Altivec support will be much weaker than in current G5s. Unoptimized, Apple's code would run at G4 speeds on this hardware. They would have to commit a lot of R&D to making their OS use the additional 8 processors on the chip, and redesign all their tweaked Altivec code. It would not be a simple port. A couple of years to complete, at least.

The parallel nature will make it impossible to program.
This is half-true. While it will be hard, most game logic will run on the traditional PowerPC part of the Cell and thus be normal to program. The difficult part will be concentrated in specific algorithms, like a physics engine or certain AI. The modular nature of this code means you could buy a physics engine already designed to fit into the 128k limitation of the subprocessors and add the hooks into your own code. Easy as pie.
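
As a sketch of what such a drop-in module might look like, here is some hypothetical C. None of these names come from real middleware; the point is only that the module's entire working set is one static arena that provably fits the stated 128k budget, fed through a narrow hook:

Code:
#include <stdio.h>
#include <string.h>

#define LOCAL_STORE_BYTES (128 * 1024)
#define BATCH_BODIES 512

typedef struct { float x, y, z, vx, vy, vz; } body_t;

/* Everything the module touches lives in this one arena... */
static struct {
    body_t bodies[BATCH_BODIES];
    float  dt;
} arena;

/* ...and a compile-time check enforces the budget: this typedef
 * fails to compile if the arena outgrows the local store. */
typedef char arena_fits[(sizeof(arena) <= LOCAL_STORE_BYTES) ? 1 : -1];

/* The hook: the host copies a batch in, the module integrates it
 * in place, and the results are copied back out. */
void physics_run_batch(body_t *in, int n, float dt)
{
    memcpy(arena.bodies, in, n * sizeof(body_t)); /* stand-in for a DMA transfer */
    arena.dt = dt;
    for (int i = 0; i < n; i++) {
        arena.bodies[i].x += arena.bodies[i].vx * arena.dt;
        arena.bodies[i].y += arena.bodies[i].vy * arena.dt;
        arena.bodies[i].z += arena.bodies[i].vz * arena.dt;
    }
    memcpy(in, arena.bodies, n * sizeof(body_t)); /* results back to main memory */
}

int main(void)
{
    body_t world[2] = { { 0, 0, 0, 1, 0, 0 }, { 1, 1, 1, 0, -1, 0 } };
    physics_run_batch(world, 2, 1.0f / 60.0f);
    printf("body 0 x after one step: %f\n", world[0].x);
    return 0;
}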

The Cell will do the graphics processing, leaving only rasterization to the video card.
Most likely false. The high-end video cards coming out now can process the rendering chain about as fast as the Cell can, going by the raw specs: roughly 256 GFLOPS for the Cell versus about 200 GFLOPS for current video cards. In two years video cards will be capable of much more, and they are already optimized for this work where the Cell is not, so video cards will get closer to their theoretical limits.

The OS will handle the 8 additional vector processors so the programmer doesn't need to.
Bwahahaha! No way. This is a delicate bit of coding that is going to need to be tweaked by highly-paid coders for every single game. Letting an OS predictively determine what code needs to be sent to which processor is insane in this case. The cost of swapping code in and out is going to be very high, so any switch will need to be carefully considered by the designer, or the frame rate will hit rock bottom.
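
Here is a sketch of the alternative the designer would choose: a fixed, hand-tuned table mapping each job to one coprocessor, decided once during development, so no code ever gets swapped in and out at runtime. The jobs and the table are invented for the example:

Code:
#include <stdio.h>

enum job { JOB_PHYSICS, JOB_AUDIO, JOB_ANIMATION, JOB_AI, N_JOBS };

/* Chosen by hand after profiling, and then never changed at runtime. */
static const int job_to_coprocessor[N_JOBS] = {
    [JOB_PHYSICS]   = 0,
    [JOB_AUDIO]     = 1,
    [JOB_ANIMATION] = 2,
    [JOB_AI]        = 3,
};

void dispatch(enum job j)
{
    /* A real system would kick the job off on that unit; here we just
     * show that the mapping is static and predictable every frame. */
    printf("job %d always runs on coprocessor %d\n",
           j, job_to_coprocessor[j]);
}

int main(void)
{
    for (int j = 0; j < N_JOBS; j++)
        dispatch((enum job)j);
    return 0;
}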

The Cell chip is too large to fab efficiently.
This is one myth that could be correct. The Cell is huge (relatively), and given IBM's problems in the recent past with making large, fast PowerPC chips, it's a huge gamble on the part of all parties involved that they can fab enough of these things.
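
To see why die size is such a gamble, here is the standard Poisson yield model, yield = exp(-D * A), with purely illustrative numbers (not actual Cell or fab figures):

Code:
#include <stdio.h>
#include <math.h>

int main(void)
{
    double defects_per_cm2 = 0.5; /* assumed defect density for the process */
    double small_die_cm2   = 1.0; /* a modest CPU die */
    double big_die_cm2     = 2.2; /* a large, Cell-sized die (assumed) */

    /* Poisson model: the chance a die has zero defects is exp(-D*A). */
    printf("small die yield: %.1f%%\n",
           100.0 * exp(-defects_per_cm2 * small_die_cm2));
    printf("big die yield:   %.1f%%\n",
           100.0 * exp(-defects_per_cm2 * big_die_cm2));
    /* About 61% versus 33% good dies: the bigger die wastes far more
     * silicon on the same process, which is why size hurts so much. */
    return 0;
}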
 
Play any PC game and you'll realize that the difference between 480 and >1000 lines of resolution is not all that mind-blowing. You get less aliasing and less shimmering, but the difference in quality isn't really worth $300 for the latest video card, let alone >$1,000 for the latest TV set. Of course some people will upgrade and some won't; it's pointless to argue about the exact proportions. Personally I've got a high-res DLP, a high-res LCD, and an old Magnavox TV that my PS2 is hooked up to; I'm most likely to keep using the Magnavox when I get my PS3, unless I get an HMD, which would most likely still be 480 lines of resolution. This is all highly subjective, depending on what's important to you and what makes a difference in your eyes.
 
BOOMEXPLODE said:
Play any PC game and you'll realize that the difference between 480 and >1000 lines of resolution is not all that mind-blowing. You get less aliasing and less shimmering, but the difference in quality isn't really worth $300 for the latest video card, let alone >$1,000 for the latest TV set. Of course some people will upgrade and some won't; it's pointless to argue about the exact proportions. Personally I've got a high-res DLP, a high-res LCD, and an old Magnavox TV that my PS2 is hooked up to; I'm most likely to keep using the Magnavox when I get my PS3, unless I get an HMD, which would most likely still be 480 lines of resolution. This is all highly subjective, depending on what's important to you and what makes a difference in your eyes.


:oops: Errr ok........
 
Can you really say that you've played a game at 640x480, and then at 1600x1200, and have been blown away by the difference? Maybe you can, like I said it's a subjective thing.
 
I concur with my esteemed colleague's :oops:

In any modern game, 640x480 is simply unacceptable on the PC. And I've never come close to buying a $300 GPU, nor have I dealt with 640x480 resolutions in years and years.

While HDTVs haven't made that transition quite as mass-market yet, once it GETS closer, and more and more people have it, it will become just as hideous to think of going back.

Do you think people don't notice the quality difference between VHS and DVD? :p
 
I think you're right about the resolution thing, especially with the latest games that have higher poly counts. Playing Doom 3, the difference between 800x600 and 1024x768 wasn't huge. I know Doom 3 didn't have very high-res textures, but I imagine that's what console games will be like with their limited RAM. I think the more polys and effects there are, the less you notice the resolution in most situations.
 
BOOMEXPLODE said:
Can you really say that you've played a game at 640x480, and then at 1600x1200, and have been blown away by the difference? Maybe you can, like I said it's a subjective thing.


The problem is not that maybe I can... The issue here is that you certainly should see the difference...

You either have severe eye problems, or... I have no idea.
 
sir doris said:
I think you're right about the resolution thing, especially with the latest games that have higher poly counts. Playing Doom 3, the difference between 800x600 and 1024x768 wasn't huge. I know Doom 3 didn't have very high-res textures, but I imagine that's what console games will be like with their limited RAM. I think the more polys and effects there are, the less you notice the resolution in most situations.

He's talking about 640x480 to 1600x1200. If someone can't see the difference, I advise an urgent visit to their GP for an eye test.

Also, more geometry and more detail will have to go along with the increase in resolution, or there won't be enough pixels to display all the extra work the platform is doing.
 
london-boy said:
BOOMEXPLODE said:
Can you really say that you've played a game at 640x480, and then at 1600x1200, and have been blown away by the difference? Maybe you can, like I said it's a subjective thing.


The problem is not that maybe I can... The issue here is that you certainly should see the difference...

You either have severe eye problems, or... I have no idea.

Yes of course I can see the difference, but I'm not blown away, in other words my hat doesn't fly off my head and steam doesn't shoot out of my ears. A new leather couch and a few mood lights would make a more appreciable difference to my gaming experience. But of course, these are all subjective things.
 
BOOMEXPLODE said:
london-boy said:
BOOMEXPLODE said:
Can you really say that you've played a game at 640x480, and then at 1600x1200, and have been blown away by the difference? Maybe you can, like I said it's a subjective thing.


The problem is not that maybe I can... The issue here is that you certainly should see the difference...

You either have severe eye problems, or... I have no idea.

Yes of course I can see the difference, but I'm not blown away, in other words my hat doesn't fly off my head and steam doesn't shoot out of my ears. A new leather couch and a few mood lights would make a more appreciable difference to my gaming experience. But of course, these are all subjective things.

Well, obviously if you run Quake 2 at 640x480 and then at 1600x1200, it will still look like crap. Just sharper.

The difference will be much more visible in a game with lots and lots of detail (geometry, textures and shaders), like Far Cry or HL2 for example.
 
V3 wrote:
For gaming, I actually prefer a higher frame rate to a higher resolution.

Indeed. I totally agree.

To me, resolution means almost nothing; framerate means almost everything.

As long as the resolution is not below 480i/p I'm fine, but anti-aliasing does help.

High resolution won't make games look like CG. You could watch Toy Story on a 1980 color television through a VCR, which is less than 480i, and it will still look like CG despite the low resolution.
 
Megadrive1988 said:
High resolution won't make games look like CG. You could watch Toy Story on a 1980 color television through a VCR, which is less than 480i, and it will still look like CG despite the low resolution.

It will look like Toy Story in low resolution, not like Toy Story should look.
 