ATI - PS3 is Unrefined

mckmas8808 said:
ALL PS3 dev-kits had Cell processors in them. None of them had a P4 or a G4 in them. You are thinking about the 360 dev kits.

And this is what the ATI guy said.



He is wrong. Mark Rein even said so, and that was his first post at GAF. What is it that you don't understand? Why do you feel like you HAVE to believe this ATI guy? Sony never had PCs for devkits.

You seem to be failing to comprehend that the demo was not utilizing Cell. Remember that the demo was a scripted cutscene with no A.I. or physics being used at all. If it was only using two PC GPUs in SLI, what would you call it? I know I'd call it a high-end PC demo, as most people would.
 
Nerve-Damage said:
Dude what are you talking about?

I know the Evaluation System uses the 4x PCI Express Bus System.

I have the PS3 Evaluation System listed as is (i.e. PCI Express)

The 16x PCI Express figure is for the current PS3 Dev-Kits!!

The 7800 GTX based PS3 Alpha Kits (the ones running PS3 UT2007) use the 16x PCI Express system.
Then check out what I quoted from your post again.

http://www.beyond3d.com/forum/showpost.php?p=658865&postcount=50
PS3 Alpha Dev-Kit:
* CPU: Cell @ 2.4GHz
* GPU: NVIDIA 7800 GTX
* Memory: 512MB of XDR
* Bus System: 16x PCI Express
Note: SLI Configuration models are also rumored (PS3 UT2007).

As you see in the above diagram, the FlexIO bandwidth to the South Bridge limits it to 5GB/s, so it's limited to 4x PCI-E.
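
A rough sketch of the arithmetic behind that claim, assuming standard PCI-E 1.x signalling (250 MB/s per lane, per direction) and taking the 5 GB/s South Bridge figure from the linked diagram:

```python
# Back-of-envelope check on the "limited to 4x PCI-E" claim.
# Assumes PCI-E 1.x lane rates; the 5 GB/s South Bridge budget is
# the figure quoted from the diagram, not vendor documentation.

PCIE_LANE_GBPS = 0.25   # PCI-E 1.x: 250 MB/s per lane, each direction
FLEXIO_SB_GBPS = 5.0    # FlexIO budget to the South Bridge, both directions

def pcie_aggregate_gbps(lanes: int) -> float:
    """Aggregate (both-direction) bandwidth of a PCI-E 1.x link in GB/s."""
    return lanes * PCIE_LANE_GBPS * 2

for lanes in (4, 8, 16):
    agg = pcie_aggregate_gbps(lanes)
    verdict = "fits within" if agg <= FLEXIO_SB_GBPS else "exceeds"
    print(f"{lanes:>2}x PCI-E: {agg:.1f} GB/s aggregate, {verdict} the 5 GB/s link")

# A full 16x link needs 8 GB/s aggregate, so a "16x" slot hanging off
# this link could never actually be fed at the full 16x rate.
```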

EDIT: Oh, you seem to distinguish between the evaluation system and the "PS3 Alpha Dev-Kit", which I've never heard of. Where did you see it was bumped up to 16x PCI Express? If it's still connected via the south bridge, it's meaningless.
 
Nerve-Damage said:
PS3 Final Dev-Kit (Fine Tuned To Spec):
Note: Rumored to be released in Mid-February or early March 2006.

Er, that's pretty tight for a spring launch, even if it's in Japan only. What's going on with the PS3... do we have to wait until CES to find it out?
 
Laa-Yosh said:
Er, that's pretty tight for a spring launch, even if it's in Japan only. What's going on with the PS3... do we have to wait until CES to find it out?

Well it's less than two weeks left. Must.. hold... out..
 
Hardknock said:
You seem to be failing to comprehend that the demo was not utilizing Cell. Remember that the demo was a scripted cutscene with no A.I. or physics being used at all. If it was only using two PC GPUs in SLI, what would you call it? I know I'd call it a high-end PC demo, as most people would.

HA! You're so funny. And it's kinda cute that you are still fighting this. So that means every Xbox 360 game was made on a PC up until this August right? No 360 dev kits existed.
 
one said:
Then check out what I quoted from your post again.

http://www.beyond3d.com/forum/showpost.php?p=658865&postcount=50


As you see in the above diagram, the FlexIO bandwidth to the South Bridge limits it to 5GB/s, so it's limited to 4x PCI-E.

Are you blind?

You keep quoting my PS3 Alpha Kits specs, not the Evaluation specs!!

The Evaluation System was to give an overall view of the PS3 system (in its infancy stage) with an SLI NVIDIA 6800 setup.

Since then……….

THE CURRENT PS3 ALPHA KITS ARE USING a 16x PCI EXPRESS BUS SYSTEM WITH THE 7800 GTX CARDS!!
 
mckmas8808 said:
HA! You're so funny. And it's kinda cute that you are still fighting this. So that means every Xbox 360 game was made on a PC up until this August right? No 360 dev kits existed.

Yes, and I've stated this before. The 360 demos at E3 were not running on actual 360s, but dev-kits made of PC parts.
 
Let's focus the thread

The discussion is becoming more and more hostile.

Let's try to argue politely and in a non-aggressive manner, if possible. And we all know that it's possible.
 
Nerve-Damage said:
Are you blind?

You keep quoting my PS3 Alpha Kits specs, not the Evaluation specs!!

The Evaluation System was to give an overall view of the PS3 system (in its infancy stage) with an SLI NVIDIA 6800 setup.

Since then……….

THE CURRENT PS3 ALPHA KITS ARE USING a 16x PCI EXPRESS BUS SYSTEM WITH THE 7800 GTX CARDS!!
Well please look at the above edit ;)
EDIT: Oh, you seem to distinguish between the evaluation system and the "PS3 Alpha Dev-Kit", which I've never heard of. Where did you see it was bumped up to 16x PCI Express? If it's still connected via the south bridge, it's meaningless.

Laa-Yosh said:
Er, that's pretty tight for a spring launch, even if it's in Japan only. What's going on with the PS3... do we have to wait until CES to find it out?
Apparently "Nerve-Damage" speculates there's another line of machines that are explicitly labelled as "devkit" unlike Evaluation System and Reference Tool, which I've never heard of and probably no one knows.
 
Hardknock said:
The dev kit they used had a 7800 PC GPU. Not only that, but the Unreal Tournament 2007 demo at E3 only showed the power of the GPU, not Cell. Here's a comment they made.


http://ps3.ign.com/articles/615/615178p1.html
He did not say that Cell was not used, he said it was barely utilised (i.e. what you saw was barely optimised). That only makes the demonstration much more impressive.

What is your goal here, Hardknock? Are you trying to get people to agree to something that is already proven factually incorrect because it gels with your HW bias?


I'll repeat my question:
Did you transcribe the interview, Hardknock? If not, can you link to the original source?
 
Nicked said:
He did not say that Cell was not used, he said it was barely utilised (i.e. what you saw was barely optimised). That only makes the demonstration much more impressive.

What is your goal here, Hardknock? Are you trying to get people to agree to something that is already proven factually incorrect because it gels with your HW bias?


I'll repeat my question:
Did you transcribe the interview, Hardknock? If not, can you link to the original source?

The demo in question was a scripted cutscene with pre-canned animation. It was not being controlled in real-time. Cell was not used because it wasn't needed. And unfortunately he never elaborates on whether Cell was used at all for the demo, but I doubt it. He just states he'll really dive into Cell when he gets home.

Me not believing this has nothing to do with a hardware bias; I believe it because it makes sense. This could actually mean the PS3 is going to be much more powerful than the dev-kit at E3. Why people are getting so defensive is beyond me.

No, I didn't transcribe the interview, IGN did, and I linked to it in my original post.


Edit: My bad, you're talking about the ATI interview? I got it from this topic: http://www.ga-forum.com/showthread.php?t=77662&page=1&pp=50
 
one said:
Well please look at the above edit ;)



Apparently "Nerve-Damage" speculates there's another line of machines that are explicitly labelled as "devkit" unlike Evaluation System and Reference Tool, which I've never heard of and probably no one knows.

I'll pop up the link in a minute............

I'm tracking some news I just posted in the technology section :cool:

Edit: Sorry about that, I posted an old news link in the technology thread!! :oops: :cry:
 
Hardknock said:
The demo in question was a scripted cutscene with pre-canned animation. It was not being controlled in real-time. Cell was not used because it wasn't needed. And unfortunately he never elaborates on whether Cell was used at all for the demo, but I doubt it. He just states he'll really dive into Cell when he gets home.
No, the words "only really" and "seriously" imply otherwise.

edit; thanks.
 
Hardknock said:
The demo in question was a scripted cutscene with pre-canned animation. It was not being controlled in real-time. Cell was not used because it wasn't needed. And unfortunately he never elaborates on whether Cell was used at all for the demo, but I doubt it. He just states he'll really dive into Cell when he gets home.

Me not believing this has nothing to do with a hardware bias; I believe it because it makes sense. This could actually mean the PS3 is going to be much more powerful than the dev-kit at E3. Why people are getting so defensive is beyond me.

No, I didn't transcribe the interview, IGN did, and I linked to it in my original post.
:rolleyes:
So it took 2 months to port it to a f'ing PC, NOT known for its exotic architecture, and Tim Sweeney bragged about this speedy achievement and went on the record on the stage at E3. Very... strange if you ask me.
 
Nicked said:
No, the words "only really" and "seriously" imply otherwise.

edit; thanks.

Here's the thing. Sony is the absolute king of PR. They know exactly what to say and what not to say to keep themselves in a positive light. All this implying and hinting means nothing to me. If it's not explicitly stated that it was running on Cell, then I don't believe it. Sony has had developers state the pre-rendered footage at E3 was real-time. PSM was quoted as saying Killzone was real-time on PS3, but only at 5fps. Sony continues with their "launching in Spring" stance... There is just so much bull going around... I even find it hard to believe Nvidia's statement that they've been working on RSX for 3 years now... Anything dealing with Sony I take with a grain of salt; they have lied too much in the past.
 
I am not a tech person (even though only a select few are in this forum anyway), so I can't have a definite opinion about this, but from my experience as a gamer with consoles over the years I will say that raw power is better than "efficiency".

This was an argument I was hearing when the Xbox and the GC came out; Nintendo fans would go on and on about how the GC is more efficient. Well, you know what..... playing all 3 current-gen consoles on an HDTV makes the Xbox look like half a generation above both the PS2 and the GC, and no mentioning of things like RE4 (a very overrated game when it comes to graphics) can change that.

With the original Xbox you could get amazing graphics from day 1 (see DOA3, or even Halo, which was a great-looking FPS for 2001), and this was not the case with the PS2, for example, where only a select few could make good-looking games.

Right now the 360 reminds me more of the PS2, with underwhelming-looking launch games (not looking like current-gen games as some people claim, but still not huge leaps either) and promises of "efficiency" and "way better looking games in the future". Now I am not denying that much better looking games will come out for the 360 (hell, the UE3 games confirm that), but I still think Nvidia is the company to go with for consoles, judging by both the Xbox 1 and the solutions they have for the early PS3 development kits (with the SLI 6800 GTs, which are way more powerful and closer to the final PS3 GPU than what the X850-powered dev kits were for the 360).

Bottom line: I believe that all this talk about "efficiency" is bullshit excuses and PR "smoke and mirrors", and it will look moronic if the first PS3 games look much better than the Xbox 360 launch games (which will probably be the case).
 
Hardknock:

Please just stop. You're making a complete fool out of yourself.

Don't drag Sony and your own hate into this. This is all about an ATI rep's PR BS, which is being shot down by Mr. Rein himself on GAF (at least the part about the UT demo being a high-end PC demo), and it has got nothing to do with Sony at all.

As all the real-time demos (demos shown by Phil Harrison, including the lots-of-ducks demo, the Fight Night demo and the UT demo) shown at Sony's E3 conference were using the PS3 Evaluation System (Cell CPU @ 2.4GHz, GF 7800GTX @ 500MHz), you can be certain that all demos used the Cell microprocessor even if they didn't all use it fully (which makes the UT demo even more impressive).
 
Hardknock said:
Here's the thing. Sony is the absolute king of PR. They know exactly what to say and what not to say to keep themselves in a positive light. All this implying and hinting means nothing to me. If it's not explicitly stated that it was running on Cell, then I don't believe it. Sony has had developers state the pre-rendered footage at E3 was real-time. PSM was quoted as saying Killzone was real-time on PS3, but only at 5fps. Sony continues with their "launching in Spring" stance... There is just so much bull going around... I even find it hard to believe Nvidia's statement that they've been working on RSX for 3 years now... Anything dealing with Sony I take with a grain of salt; they have lied too much in the past.
All companies stretch the truth; Sony just do it well, and that's a good thing. You lap up this PR hungrily because it falls on the right side (hint: it's bullshit too).
 
ATI PR Idiot Strikes!

RSX is not G70 bolted to Cell! Sony wanted a PC part? Nvidia have G71 taped out at 90nm; Sony could've sent developers full kits for Christmas!

The story about a last-minute switch is bunk. RS and RSX were competitors for the PS3 GPU as early as 2003. Nvidia drew up contractual terms for Sony after M$ grew closer to ATI. Sony sent a small team on attachment to Nvidia during the winter of FY'03; they were to study Nvidia's software first and the GPU second. The RS team was formed in FY'02, and they started from scratch. Sony people made up 1/3 of the RS team, and since it was researched internally, outsiders saw it as the final GPU. It is not; it was only one GPU under evaluation. A huge project like the PS3 is not a clean-cut deal.
Sony took RSX development more seriously by FY'04. The plan to use Nvidia software with RS hit a few bugs. The RS team were late on their updates. Publishers felt the pinch of RS delays. Nvidia made a final push with several IP deals. Sony awarding the contract to Nvidia was a late decision, but there was no switching in a PC part. RSX is not G70; 70:30 of it may be Nvidia's, but a full custom job on the G7 modular architecture would be a silly waste of time.

I heard that Xenos is closer to R480 in performance. The line is blurred between exotic_unproven and unified_competing. The XCPU and the X360 non-launch games ... are the ones needing to be refined.
 
fulcizombie said:
Bottom line: I believe that all this talk about "efficiency" is bullshit excuses and PR "smoke and mirrors", and it will look moronic if the first PS3 games look much better than the Xbox 360 launch games (which will probably be the case).

The first PS3 games are entirely... guaranteed to look better. PC and Xbox ports with development... no, tweaking, for a few months on actual architecturally-accurate chips, versus the right architecture for well over a year. The comparison becomes completely and utterly useless as a hardware benchmark, and this should be absolutely and utterly obvious. So it'll look moronic. It's nobody's fault except the individuals' if they fail to account for the other factors, so from the hardware (analysis) perspective it shouldn't matter if it looks pathetic in a comparison of launch-title visual quality.

And in the real world, efficiency is absolutely and utterly important. Raw power means squat when usable performance is the product of the raw, theoretical value and your efficiency. What's the point of being able to do two multiply-adds per clock in a pipeline, tops, if you get one cut down due to a texfetch or because you can't co-issue the instructions you want for that clock cycle? Zero. Real-world, you have one in that case.

A lot of the things these companies state aren't invalid in and of themselves; it's that people fail to put them into proper context. A GFLOPS number isn't useless. You just need to know how they're trying to BS you and avoid it. It's great that you have X GFLOPS. Whether I know that you lose Y because of normal, standard rendering tasks that are taken care of by hardware shared amongst numerous tasks, or lose K because it's really only there for special cases, is my problem.
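
As a toy illustration of that peak-versus-effective point (all figures below are invented, describing no real GPU):

```python
# Toy model of "effective = peak x efficiency". The numbers are
# purely illustrative, mirroring the two-MADDs-per-clock example above.

PEAK_MADDS_PER_CLOCK = 2.0  # what the pipeline can issue on paper

def effective_madds(lost_slot_rate: float) -> float:
    """Average multiply-adds retired per clock when a texfetch or a
    failed co-issue costs one of the two slots on some clocks."""
    return PEAK_MADDS_PER_CLOCK - lost_slot_rate

for rate in (0.0, 0.5, 1.0):
    eff = effective_madds(rate)
    print(f"slot lost {rate:.0%} of clocks -> {eff:.1f} MADDs/clock "
          f"({eff / PEAK_MADDS_PER_CLOCK:.0%} of peak)")
```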

Nicked said:
Sony just do it well, and that's a good thing.

Please, tell me that I am misinterpreting your statement, as the way I read it now, it's a good thing for a company to attempt to mislead me. Don't think I actually WANT to wade through all the extra BS. But perhaps I'm alone in that.

tema said:
I heard that Xenos is closer to R480 in performance. The line is blurred between exotic_unproven and unified_competing.

The only time I've ever heard R480 referenced when speaking of Xenos was in a flawed comparison between the two, in which the conclusion was that it would act like a 20-pipe R480 (or something like that). A lovely comparison that figured a pipeline that occasionally co-issues an add before the full-ALU op is worthy of being considered [edit: just as] capable as using a completely capable ALU unit in two subsequent clocks. As stated by ATI, however, Xenos will surpass R520 at the targeted resolution (whilst losing out at higher resolutions, which implies hitting fillrate limits or having to reprocess geometry for probably more than half a dozen tiles, as would be the case for an AA'd 1080p framebuffer), which should be a satisfactory cut into that comparison for you.
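
For what it's worth, a back-of-envelope check on that tile count, assuming 4xAA at 1080p, 32-bit colour plus 32-bit Z/stencil per sample, and Xenos's 10 MB of eDRAM (rough assumptions, not vendor data):

```python
# How many eDRAM tiles would an AA'd 1080p framebuffer need?

import math

width, height = 1920, 1080
aa_samples = 4
bytes_per_sample = 4 + 4                 # 32-bit colour + 32-bit Z/stencil
edram_bytes = 10 * 1024 * 1024           # Xenos eDRAM

framebuffer_bytes = width * height * aa_samples * bytes_per_sample
tiles = math.ceil(framebuffer_bytes / edram_bytes)
print(f"{framebuffer_bytes / 2**20:.1f} MiB framebuffer -> {tiles} tiles")
# ~63.3 MiB -> 7 tiles, each of which means re-submitting the geometry
# that touches it.
```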

So long as shaders keep to their present math:tex ratio, Xenos can remain quite limited, though. I don't think the X1600 quite gets the 3:1 math advantage it holds over R520, but it certainly doesn't perform like a 4-pipe card either. It gets a little extra whenever it can issue an op for the add/input-modify-only ALU before another ALU op. Having seen those two code snippets posted at Rage3D recently, that mini-ALU doesn't seem to have many situations where it actually comes into play, and math-to-tex ops are already pretty high. Which really raises the question of why G70 doesn't perform better, since it should have considerably greater opportunity to use that second ALU. Or perhaps I'm too tired to think straight.
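
A crude, entirely hypothetical model of why such a mini-ALU adds far less than a second full ALU (the pairing rates below are invented):

```python
# One full ALU plus an add/input-modify-only mini-ALU: the second unit
# only helps on clocks where the compiler can pair an eligible add with
# the main op.

def ops_per_clock(pairing_rate: float) -> float:
    """Average shader ops retired per clock for one such pipe."""
    return 1.0 + pairing_rate   # 1 full op, plus an occasional paired add

for rate in (0.1, 0.3, 0.5):
    print(f"pairing on {rate:.0%} of clocks -> {ops_per_clock(rate):.1f} "
          f"ops/clock (vs. 2.0 for two full ALUs)")
```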

But of course you've taken to those "exotic vs. unproven" and "unified vs. competing" lines. Where did they come from again? They're fairly brilliant statements, worthy of some admiration. IIRC, what's stated is "it takes careful work to balance between the work you need done and what's being done", and the result is people take it as "it's pretty likely there's going to be serious issuuuueees!" That's brilliant PR BS at work, right there. The best gives you a simple fact and lets you put the spin they want on it. They haven't lied, but they've still gotten the same message across that they wanted to.
 