Tim Sweeney says G70 > R520

DemoCoder said:
But I think I'd take Carmack or Sweeney's opinion over the opinion of anyone in this forum regardless of their involvement with IHVs.

If a developer demonstrated an overt bias, I wouldn't. And stating that NV30 is on par with R300 is overt.
 
wireframe said:
Sweeney's reply of "G70 for sure" hardly qualifies as a quip, and I think it would be fair to say that the way it was asked and answered was more joking than serious, and well beyond any meaningful analysis.

This is probably the most likely scenario given that the interviewer was from NVNews. Sweeney was probably grinning because he knew the quote would be the catalyst for many a forum flamefest. :)
 
For Carmack's needs, it was. Doom3 was not primarily a PS2.0 game; it was a game that started development in the DX7 days and relied heavily on stencil fillrate. NV3x was no slouch on DX7/DX8 workloads, despite the way people want to portray it, and ran Doom3 at speeds competitive with R3xx. Where the R300 differentiated itself and pulled way ahead was on PS2.0 workloads, but the NV3x was already dead by the time we got a PS2.0 game worth playing shipped (HL2).
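For context on why stencil fillrate mattered so much: Doom3-style shadow volumes are rasterized twice per light with color and depth writes disabled, so the stencil unit does almost all the work. Here is a minimal sketch of the depth-fail ("Carmack's reverse") pass; draw_shadow_volumes() and draw_scene() are hypothetical helpers, and it assumes a live GL context with a stencil buffer (PyOpenGL shown for brevity):

[code]
from OpenGL.GL import *  # assumes PyOpenGL and an existing GL context

def shadow_stencil_pass():
    glEnable(GL_STENCIL_TEST)
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)  # no color writes
    glDepthMask(GL_FALSE)                                # depth is read-only
    glStencilFunc(GL_ALWAYS, 0, 0xFF)
    glEnable(GL_CULL_FACE)

    glCullFace(GL_FRONT)                         # rasterize back faces...
    glStencilOp(GL_KEEP, GL_INCR_WRAP, GL_KEEP)  # ...increment on depth fail
    draw_shadow_volumes()                        # hypothetical helper

    glCullFace(GL_BACK)                          # rasterize front faces...
    glStencilOp(GL_KEEP, GL_DECR_WRAP, GL_KEEP)  # ...decrement on depth fail
    draw_shadow_volumes()                        # every volume drawn twice

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE)
    glStencilFunc(GL_EQUAL, 0, 0xFF)             # lit only where stencil == 0
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP)
    draw_scene()                                 # additive lighting pass
[/code]

Every volume hits the stencil unit twice per light, which is why raw z/stencil throughput dominated Doom3 performance far more than PS2.0 shader speed did.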

You can ignore the opinion of Sweeney et al. if you want, but it seems he now has what may be the most popular game engine in the world. If Sweeney says his work runs better on one set of HW, then that means games on that engine may run better. Likewise, when Valve said HL2 would run better on ATI HW, that meant that if you like the Source engine and games derived from it, you'd be better off with an ATI card.

I don't think you should ignore what these people have to say; after all, the software they author has a disproportionate representation among 3D workloads. Ultimately, it's the games that matter, not the card, and if you find, as in the Quake days, that a majority of the games you want to play run on the UE3 engine, and that Sweeney has optimized that engine to run better on one IHV, then it doesn't matter what you think of his bias: you should listen to what he said and act appropriately.

I think Carmack et al. have been pretty honest about their opinions, and have hedged their comments and been truthful in criticizing the limitations of NV3x and complimenting ATI. Especially if you read John's comments on Slashdot. I think this effort to paint him as a transparent Nvidia shill is weak.

Besides which, maybe Sweeney knows something about the R520 that you don't?
 
DemoCoder said:
For Carmack's needs, it was. Doom3 was not primarily a PS2.0 game; it was a game that started development in the DX7 days and relied heavily on stencil fillrate. …

And maybe money has something to do with their opinions. Personally, I don't throw out their opinions because of their relationship with a certain IHV. However, let's not conveniently ignore some facts: Carmack was touting FP when the NV30 path was using FX.
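For those who missed that debate: "FX" here means NV30's 12-bit fixed-point register format, versus the FP16/FP32 float formats. A rough sketch of the precision difference, assuming FX12 is an s1.10 layout with range [-2, 2) as described in NVIDIA's fragment-program documentation; treat the exact constants as illustrative:

[code]
import numpy as np

def quantize_fx12(x):
    # Assumed FX12 layout: 12-bit signed fixed point, step 2**-10,
    # clamped to [-2048, 2047] steps, i.e. roughly [-2.0, 2.0).
    step = 2.0 ** -10
    return np.clip(np.round(np.asarray(x) / step), -2048, 2047) * step

vals = [0.5, 1.3333, 3.0, -2.5]
print(quantize_fx12(vals))                   # 3.0 and -2.5 clamp near +/-2
print(np.asarray(vals, dtype=np.float16))    # FP16 keeps the magnitudes
[/code]

The clamped range and fixed step are exactly why "FP16 is enough" sounded odd next to an FX12 code path.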
 
DemoCoder said:
For Carmack's needs, it was. Doom3 was not primarily a PS2.0 game; it was a game that started development in the DX7 days and relied heavily on stencil fillrate. …

So what if he has the most popular game engine in the world? Does that somehow mean he has no obvious, unmerited bias? No, because that's exactly what he has.

Carmack is a different story. Nvidia clearly had a far superior OpenGL rendering system and was much better equipped for his games. Of course, he practically helped Nvidia develop, or laid the basis for, their tech, so read into that what you will.

Sweeney, on the other hand, has an unadulterated bias for Nvidia regardless of the facts. It's a deeper issue than ATI vs. Nvidia. Look at the way he treated PowerVR, or the last days of 3dfx, or any other company. He is totally, 100% in the sack with Nvidia. He bases engines around their hardware, promotes them above all else, and even when a competitor has obvious technologies that suit his immediate needs better, or actually performs better, he doesn't care. He still praises Nvidia and builds his future engines around their so-called *exclusive* technology ideas.

Further, they outright lie about features, maps, and such that are supposedly "only" able to run on Nvidia hardware. Then consider techs like 3Dc, which *exactly* targets a major feature, if not the primary development tool, of their engine, and *still* they ignore it completely and talk up Nvidia. Even at the last E3, ATI's hardware ran circles around the current Nvidia tech in the next-gen Unreal Engine. There were many witnesses to this fact. And what was promoted or pointed out? The fact that Nvidia was "superior" with their IEEE FP32.
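For reference, 3Dc is ATI's two-channel normal-map compression: it stores only the X and Y of a unit tangent-space normal, each in its own BC4-style 4-bpp block, and the shader rebuilds Z from the unit-length constraint. A minimal, illustrative sketch of the reconstruction step:

[code]
import numpy as np

def reconstruct_normal(x_tex, y_tex):
    # x_tex, y_tex: the two channels sampled from a 3Dc texture, in [0, 1].
    x = np.asarray(x_tex) * 2.0 - 1.0   # unpack to [-1, 1]
    y = np.asarray(y_tex) * 2.0 - 1.0
    # Z follows from x^2 + y^2 + z^2 = 1; clamp guards rounding error.
    z = np.sqrt(np.maximum(0.0, 1.0 - x * x - y * y))
    return np.stack([x, y, z], axis=-1)
[/code]

That direct fit to normal mapping is why 3Dc maps so neatly onto an engine built around it.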

You guys don't even know what's under the hood of the R520. When you find out, it's going to be that much more obvious how ridiculous his bias is. But we will still have the DemoCoders of the world making excuses for it and bashing the people who can see the obvious bias and point it out.
 
Hellbinder said:
You guys don't even know what's under the hood of the R520. When you find out, it's going to be that much more obvious how ridiculous his bias is. But we will still have the DemoCoders of the world making excuses for it and bashing the people who can see the obvious bias and point it out.

What brings you to this conclusion?
Do you have a R520?
Do you have a R520 and a G70 and were able to compare them?
Is this just your opinion based on your past experience with ATI and nVidia?
 
Fred da Roza said:
And maybe money has something to do with their opinions. Personally, I don't throw out their opinions because of their relationship with a certain IHV. However, let's not conveniently ignore some facts: Carmack was touting FP when the NV30 path was using FX.

Yes, let's not forget some facts. Like the fact that Carmack said nVidia was "probably cheating" with their drivers WRT D3's performance or that D3 _the game_ was first demonstrated at E3 on an ATi card.
 
Mordenkainen said:
Yes, let's not forget some facts. Like the fact that Carmack said nVidia was "probably cheating" with their drivers WRT D3's performance or that D3 _the game_ was first demonstrated at E3 on an ATi card.

If he claimed that, he shouldn't have said that they are on par, bud.
 
Fred da Roza said:
If he claimed that, he shouldn't have said that they are on par, bud.

Not "if"; feel free to use the search feature present on these forums. And what he said (as Joe DeFuria pointed out on page 2) was that for developers, and for him specifically, the NV30 was better because of its higher instruction count and better developer driver support. He even said that in terms of general fragment program performance the R300 was faster than the NV30.

I think it's obvious that a developer can (and should) only speak for their games. Or that readers should only take those views as they pertain to that developer's games.
 
Just watched it (skipped to the end, really). He was clearly caught off guard by the question. I'm certain that if he had his time back he would have put more PR spin into his answer, but it was interesting to hear his "reflex" response.
 
Mordenkainen said:
Not "if"; feel free to use the search feature present on these forums. …

A little history. A series of posts beginning with

Fred da Roza said:
When someone says the decision between a 9700 and a 5800 isn't clear-cut, I have to question their objectivity (regardless of when the 5800 was cancelled). I don't think there is any question about which of these two cards is better, and JC can't claim ignorance for his comment.

http://www.beyond3d.com/forum/viewtopic.php?p=164704&highlight=#164704

and ending with


Fred da Roza said:
Don't assume I have not read JC's .plan(s) and interviews because I don't agree with you. I haven't read them all, but I have read some, and one of them actually contained that quote. Although he has said that he has had to do more work on the NV30 path, he has also said things like "FP16 is enough." The "FP16 is enough" comment is strange when you are actually using FX. Didn't he only recently acknowledge that the NV30 path actually uses FX?

When the evidence consistently shows how impressive a chip the R300 is compared to the NV30, you only look foolish denying the obvious. I believe someone who works at this site said something similar about Kyle. The fact is JC's statement in question is clearly wrong, and he knew better.

This speaks volumes about Carmack's objectivity.
 
The G70 is better than the R520. For sure.

Wait, I meant to say NVIDIA is better than ATI. For sure.

I have a feeling Tim is cracking up reading a whole thread dedicated to what he had/has to say about an NVIDIA product vis-a-vis a competitor's. The comments in this thread are just plain silly. Please read what Tim said, and it should be clear how he conducts himself in interviews compared to how he supervises the rendering engine in UnrealEngineX.

In the future, please ignore anything Tim says in interviews when it comes down to which particular hardware (or, ahem, IHV) he prefers. Whenever he gets asked such questions, he probably answers (or types) while grinning like a maniac.

BTW, he has said NOTHING technical about the G70 or R520 that deserves to be mentioned in this particular forum, not that he'd be allowed to. Mods... time to move this thread...?
 
In case anyone has forgotten, it doesn't matter which company brings the best technology to us first.

...unless of course you're a shareholder of one of the two companies. And golly, none of the posters with extreme opinions would be in that position, would they?
 
Temporary Name said:
...unless of course you're a shareholder of one of the two companies. And golly, none of the posters with extreme opinions would be in that position, would they?

I'm a shareholder in both! ;)
 
Hellbinder said:
You guys don't even know what's under the hood of the R520. When you find out, it's going to be that much more obvious how ridiculous his bias is. But we will still have the DemoCoders of the world making excuses for it and bashing the people who can see the obvious bias and point it out.
Okay then, you sound like you know as much about the G70 as you do about the R520 to be able to make such a statement. Let's hear it. If not, then you're just like Tim Sweeney.
 
Hellbinder said:
Even at the last E3, ATI's hardware ran circles around the current Nvidia tech in the next-gen Unreal Engine. There were many witnesses to this fact.

This is so inaccurate it's funny. Let's recap what really happened at GDC with regard to UE3:

Some guy: "OMG, I just saw UE3 on NV40 and R420 (neither had launched at the time), and it was great on R420 at 1024x768 and very slow on NV40 at 640x480"

Local contingent of ATI fanpeople:
"Haha, NV40 is obviously horrible! Our day is complete!"

jvd: "God, I hope this is true! ATI must own Nvidia, for the common good!" (It's that kind of even-handed approach that earns you a moderator position on B3D).

Epic people (In the immortal words of Mark Rein): "Complete bullshit!"

ATI fanpeople: "OMG, they are saying that NV40 does not suck compared to R420? Blasphemy! Clearly TWIMTBP money at work."

/curtains
 
I do enjoy the entertainment value of these things, because in another couple of months we will know, and everyone throwing down with such certainty will get to point at, or be pointed at, "scoreboard!"
 
Geeforcer said:
This is so inaccurate it's funny. Let's recap what really happened at GDC with regard to UE3: …

Heh, don't forget to add WaltC telling everyone that Nvidia is obviously cheating on their benchmarks and will bring Armageddon to mankind.
 