ATI - PS3 is Unrefined

Edge said:
Great thread!

I wonder how many Japanese games ATI produces? :D

I'm gonna guess the same as Nvidia....

These arguments are pretty meaningless, especially when RAM will be the limiting factor for both consoles. The Xbox 1's superior technology didn't stop God of War or Resident Evil 4 from making a lot of Xbox 1 releases look bad, and those were both on inferior hardware to boot. Quality this upcoming generation will depend even more on the developer. I just don't see hardware being the limiting factor for either camp.

As for the less-than-stellar Xbox 360 launch lineup as proof of hardware limits, I'd refer you to the PS2 launch lineup, which wasn't that good either, especially compared to the DC games being released at the time.
 
weaksauce said:
" through what is effectively a PCI express bus", haha! Isn't that like 4gb/s?
And didn't epic say the did the demo on ps3? All they had to do was to "swith to opengl and check that the shadows and lighting were alright".

Which may or may not be true and has nothing to do with what hardware the demo was actually running on.

Aaron Spink
speaking for myself inc.
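
As an aside on the "~4 GB/s" figure quoted above: that is roughly the per-direction bandwidth of a first-generation PCI Express x16 link, which is presumably what the "effectively a PCI Express bus" jab refers to. A minimal back-of-the-envelope sketch, assuming standard PCIe 1.x signalling (the actual Cell/RSX FlexIO link is specified differently):

    # Rough check of the ~4 GB/s figure for a PCI Express 1.x x16 link.
    # Generic PCIe 1.x numbers only; not RSX/FlexIO specifics.
    lanes = 16
    signal_rate_gt_s = 2.5           # GT/s per lane, per direction
    encoding_efficiency = 8 / 10     # 8b/10b line coding overhead

    gbit_per_lane = signal_rate_gt_s * encoding_efficiency    # 2.0 Gbit/s per lane
    gbyte_per_dir = lanes * gbit_per_lane / 8                  # bits -> bytes
    print(f"{gbyte_per_dir:.1f} GB/s per direction")           # -> 4.0 GB/s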
 
weaksauce said:
Dr Evil, Motorstorm was already said to be real-time. I think the game is not so much about polygons as about effects, and as you've seen, those are awesome. Wouldn't they be a good fit for Cell?

Then give me a pointer to where they say "Motorstorm was demoed in real time on real hardware", because I haven't seen it anywhere.

Aaron Spink
speaking for myself inc.
 
Laa-Yosh said:
I wouldn't make any guesses from such a video; it's pretty obvious that it shows a highly simplified process. Have you seen any actual scanning in the clip, for example? All they show is a few short cuts of the photo shoot... They could've used a lot of pre-built assets modified a bit to look like the host, or they could've done most of the work prior to the actual visit. In my experience, all the work for such a character takes at least a few weeks...
They probably could have had the body done beforehand, since there is a "create a character" mode in most EA games, so that part was probably a given. The animations and the code to put him in the game were probably done in advance too. But Kudo said "here's your head scan", so it's pretty safe to assume they scanned his head and not some other guy who looks like him. If you watch closely, the clothes he comes in with before he meets Kudo are the same ones he leaves with after playing as himself in the game. Of course there's no way to know for sure, but I think it's obvious it didn't take them as long as modelling his face manually would have (which can take weeks). The fact that they might have done this for free (who knows; that's up in the air too) is even more interesting to me.
 
Repeat: scanning does not replace modeling, it's used for 1. reference 2. normal map details.
 
one said:
Have you watched the pre-E3 Sony conference? If you don't think Tim Sweeney is a liar...
http://www.gamespot.com/news/2005/05/16/news_6124681.html

http://consolewar.1up.com/do/newsStory?cId=3140617

The same demo was also played at PS Meeting in September, with more details. AFAIK it was demoed in other venues too.
http://media.ps3.ign.com/articles/635/635625/vid_1180296.html
This recent article is a good read about the PS3 version of UT2007.
http://www.beyond3d.com/forum/showpost.php?p=653851&postcount=11

Basically, it seems Mr. Richard Huddy managed to ditch his credibility completely.

I will just point out that at no time did Sweeney actually say he was demoing it on an actual PS3 dev kit. Considering that the vast majority of their development is on the PC side, and the demos are most likely more stable on the PC side, it wouldn't be surprising to see it demoed on a PC while the comments were about the PS3. It also wouldn't be the first time something like this happened with any of the console makers.

Aaron Spink
speaking for myself inc.
 
The_Standard said:
Who says it isn't an all-new GPU without console-specific features? Certainly not NVIDIA...

David Roman: "We've been working with them to produce a customized version that is customized specifically to connect that to the Cell processor, so that they could work together."

Um, of course it is customized, they had to add in the flexIO interface. What do you think is taking them so long?

Aaron Spink
speaking for myself inc.
 
Laa-Yosh said:
Repeat: scanning does not replace modeling, it's used for 1. reference 2. normal map details.
I never said it did. But the time to do a facial/head scan and model a low-poly head by hand is probably way shorter than the time to model a high-poly head by hand and a low-poly head by hand. I'm sure you can agree with this?
 
mckmas8808 said:
So, people: Nvidia said in Dec 2004 that they had been working with Sony for about two years, right? Wouldn't that mean they started working on RSX back in early 2003 or very late 2002?

Intel and AMD have been working with Microsoft for the past 10+ years. That doesn't mean that if MS had announced either an Intel or AMD processor for X360 that it would have been custom designed for 10 years.

The likely theory is that Sony had been working with Nvidia on developer tools like cg, etc for some time. All indications point to Nvidia's sourcing of the hardware being fairly last minute.

Aaron Spink
speaking for myself inc.
 
aaronspink said:
Um, of course it is customized, they had to add in the flexIO interface. What do you think is taking them so long?

Aaron Spink
speaking for myself inc.

I don't think it takes a company like Nvidia some 8 months to glue flexIO onto a stock G70 (which taped out in... March?)
 
Edge said:
The 360 GPU claims to have "greater efficiency" and "unified shaders", as if those two phrases were enough to settle the debate, but I wonder how many people would choose a machine with 48 * 10 horsepower motors versus (24 * 22 horsepower motors (pixel pipelines) + 8 * 10 horsepower motors (vertex pipelines)). One could argue all day that the 10 horsepower motors are more efficient and offer better load balancing, but in the end the G70 can still be more powerful. Sure, it has fewer execution units, but each pixel execution unit looks quite a bit more powerful than each unified shader on the 360 GPU.

Following your analogy, would it not be fair to say that in the RSX those 24 * 22 horsepower motors will be sitting idle if the 8 * 10 ones are maxed out? Neither chip, when running at max efficiency, will starve us for eye candy, but the philosophy of Xenos is to remove the weakest link in the chain.
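
For what it's worth, a quick tally of the analogy's own numbers (purely illustrative: the "horsepower" figures are Edge's, not measured shader throughput, and peak totals say nothing about utilisation or load balancing):

    # Totals implied by the motor analogy, taken at face value.
    xenos_total = 48 * 10            # 48 unified "motors" at 10 hp each -> 480
    g70_total = 24 * 22 + 8 * 10     # 24 pixel + 8 vertex "motors"      -> 608
    print(xenos_total, g70_total)    # 480 608

On paper the split design comes out ahead; the whole argument here is about how much of that peak each side actually reaches under a real mix of pixel and vertex work.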
 
aaronspink said:
Um, of course it is customized, they had to add in the flexIO interface. What do you think is taking them so long?

Aaron Spink
speaking for myself inc.

Customized for more than just the flexIO, if you read through all the posts and opinions.
 
aaronspink said:
I will just point out that at no time did Sweeney actually say he was demoing it on an actual PS3 dev kit. Considering that the vast majority of their development is on the PC side, and the demos are most likely more stable on the PC side, it wouldn't be surprising to see it demoed on a PC while the comments were about the PS3. It also wouldn't be the first time something like this happened with any of the console makers.

http://www.ga-forum.com/showpost.php?p=2477080&postcount=119

The statement "The Epic demo was running on a PC, and it was done using an early 7800 in SLI mode, so that was a high-end PC demo" is completely incorrect.

Our demo was NOT running on a PC. Our E3 PS3 demo was a live demo running on a PS3 development kit.


--Mark Rein, Epic Games Inc.

http://www.ga-forum.com/showpost.php?p=2476272&postcount=48

For the actual content for the demo, which runs on Unreal Engine 3, we did most of the work on the PC and then copied it over to the PLAYSTATION 3 and just ran it.

So, they developed a demo on the PC and ported it to a PS3 devkit (Cell plus a PC GPU), which was the equipment used for the demo.

Bearing in mind that the graphics programming for the demo is the same on both PC and PS3, you can see why the graphics would have been relatively easy.

The tricky bit, presumably, was getting the UE3 engine code to run on Cell - on the PPE, perhaps dual-threaded.

The PLAYSTATION 3 stuff is easy to get up and running because it uses OpenGL and Linux, as well as nVidia graphics and PowerPC architecture.

Jawed
 
3roxor said:
I don't think it takes a company like Nvidia some 8 months to glue flexIO onto a stock G70 (which taped out in... March?)
NVidia still hasn't delivered any desktop PC 90nm GPUs. They're running behind schedule (3 months+). They were supposed to be on the market this past autumn.

So it wouldn't surprise me in the least if NVidia is struggling to get 90nm RSX working at Sony's fab.

Jawed
 
3roxor said:
I don't think it takes a company like Nvidia some 8 months to glue flexIO onto a stock G70 (which taped out in... March?)
They could easily have started work on integrating flexIO into the design in parallel, before it even taped out. Such an extended period of time hints at something else, as others have said.

Jawed said:
NVidia still hasn't delivered any desktop PC 90nm GPUs. They're running behind schedule (3 months+). They were supposed to be on the market this past autumn.

So it wouldn't surprise me in the least if NVidia is struggling to get 90nm RSX working at Sony's fab.

Jawed
They've no need to rush things; right now they're easily dominating and making some $$$.

 
Jawed said:
NVidia still hasn't delivered any desktop PC 90nm GPUs. They're running behind schedule (3 months+). They were supposed to be on the market this past autumn.

So it wouldn't surprise me in the least if NVidia is struggling to get 90nm RSX working at Sony's fab.

Jawed

Ow.. where did you read that?
 
Edge said:
The 360 GPU claims to have "greater efficiency" and "unified shaders", as if those two phrases were enough to settle the debate, but I wonder how many people would choose a machine with 48 * 10 horsepower motors versus (24 * 22 horsepower motors (pixel pipelines) + 8 * 10 horsepower motors (vertex pipelines)). One could argue all day that the 10 horsepower motors are more efficient and offer better load balancing, but in the end the G70 can still be more powerful. Sure, it has fewer execution units, but each pixel execution unit looks quite a bit more powerful than each unified shader on the 360 GPU.

When the machine you speak of isn't 24x22 but really 24x2x11, then you don't really have fewer execution units at all. Besides, what do you do when your 24x2x11 ends up doing only a little bit more work than a competing machine that is (1.45)x16x12? Time to stop looking purely at the numbers and start looking at what they're being used for, I'd imagine.

G70 can still be more powerful. It can still be weaker. Whatever the difference, name one person who honestly thinks it's going to be more than, say, 10 or 15% between the two - if that.
 
3roxor said:
I don't think it takes a company like Nvidia some 8 months to glue flexIO onto a stock G70 (which taped out in... March?)

It does when they have to move over to a whole new process. Just the process port can take up to a year in a heavy ASIC flow.

aaron spink
speaking for myself inc.
 
Here's one indication from March:

http://www.beyond3d.com/forum/showthread.php?t=17878

On the discussion of silicon process nodes Mike later mentioned this: “If you look at when we go to 90, my guess will be is we’ll have one or two products this year going from 90 in the second half”. On the face of it this would tend to confirm our suspicions that the Spring refresh will be 110nm based, as much of the GeForce 6 product line is now, whilst the high end fall product will likely be 90nm.

Latest statement was that January would see a flurry of 90nm desktop PC GPUs, not including a high-end GPU.

Indications are that we prolly won't see a high-end 90nm PC GPU from NVidia until Feb/March.

Jawed
 