Tim Sweeney...

Does anyone remember, a little while back, Tim Sweeney stating that the GTX would be much better for UE3 than the next-gen ATi board?

Knowing what we know now about the hyperthreaded design, the branching capabilities, pure FP32, etc., and assuming that he had to know all that...

What the heck was he smoking? With all the speeches about IEEE 32 and similar stuff from the 5800 days... any idea how he can justify this continued pro-Nvidia spin?

Do you guys remember, or have a link to, those somewhat recent comments?
 
There was something on theinq like 6 months ago... I'm not sure that would be relevant at this point, and I expect it will be even less relevant when the game is released.
 
Hellbinder said:
Does anyone remember, a little while back, Tim Sweeney stating that the GTX would be much better for UE3 than the next-gen ATi board?

Did he really say that though? Do you mean what he said during the "interview" at E3 with one of the nvnews guys?
 
As I recall it was a one-word answer to a generic question along the lines of:

Q: G70 or R520?
A: G70

No "much better" or anything else, and so clearly a "we're having fun" situation that its only interest at the time were those who have gone beyond tea leaves all the way to tarot cards and chicken bones.
 
IbaneZ said:
Did he really say that though? Do you mean what he said during the "interview" at E3 with one of the nvnews guys?

*slings mud back*

ATi stole my cabbage *fart*.

I seem to remember this on TheInq, probably some of their more dodgy journalism.
 
AlphaWolf said:
There was something on theinq like 6 months ago... I'm not sure that would be relevant at this point, and I expect it will be even less relevant when the game is released.
Not to mention that the R520 that Sweeney saw/used/knew of 6 months ago was very probably not the same R520 that was just released. ;)
 
Hellbinder said:
Does anyone remember, a little while back, Tim Sweeney stating that the GTX would be much better for UE3 than the next-gen ATi board?

No. He was asked "G70 or R520" and he said "G70, of course". That's all. He could have meant anything, but nowhere did he use any words like "better", "bigger" or such.
 
BTW, TS replied to me (I asked about all the new HW being FP32) back in 7/05:

Tim Sweeney said:
We're thrilled that everyone is moving to full 32-bit IEEE-754 floating-point support. For Unreal Engine 3, we can cope with existing FP24 (Radeon X800, etc) though it complicates things and causes some projection artefacts when not used very carefully. 32-bit will definitely improve everything long-term.

FP16 is only useful as a data storage format, for data types where less precision is needed. It's not reasonable to use such a low-precision format in computations.

Final Fantasy was clearly CG and though it's artistically interesting, it falls way short of reality, which is the real goal of computer graphics.

"Duh!" stuff, but if you know TS like I do, you'll recognize a prepared, canned reply when you see one.
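For what it's worth, the precision gap he's describing is easy to demonstrate. Here's a rough little C sketch of my own (nothing from TS, and it fakes FP24/FP16 simply by truncating a float's mantissa, ignoring rounding and the narrower exponent ranges), showing how a big texture coordinate holds up at each precision:

/* Crude sketch: emulate FP24/FP16 precision by truncating a float's
 * 23-bit mantissa. This ignores rounding modes and exponent-range
 * differences, so treat the numbers as illustrative only. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Keep only the top `bits` bits of an IEEE-754 single's mantissa. */
static float truncate_mantissa(float x, int bits)
{
    uint32_t u;
    memcpy(&u, &x, sizeof u);              /* reinterpret the bit pattern */
    u &= ~((1u << (23 - bits)) - 1u);      /* clear the low mantissa bits */
    memcpy(&x, &u, sizeof u);
    return x;
}

int main(void)
{
    /* A large-ish texture coordinate, the kind you get when tiling a
     * texture across a big surface. */
    float coord = 512.1f;

    printf("FP32: %f\n", coord);                         /* ~512.099976 */
    printf("FP24: %f\n", truncate_mantissa(coord, 16));  /* ~512.093750 */
    printf("FP16: %f\n", truncate_mantissa(coord, 10));  /*  512.000000 */
    return 0;
}

FP24's extra mantissa bits obviously help over FP16, but you can also see why he says it "complicates things": once coordinates or intermediate values get large, the low-order detail is the first thing to go, which is presumably where the projection artefacts he mentions come from.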

TS said G70 is/was better? Oh Wow, Gee Whiz. Anyone know if he grinned (cheekily, perhaps) when he said so, in that everyone-must-trust-an-Internet-"report" quoted above?

Tim Sweeney is an NVIDIOT. Please quote me.

That reply of his came after we joked about whether he or JC is a better programmer (since we had such a joke in these forums in the past).
 
Hellbinder said:
Does anyone remember, a little while back, Tim Sweeney stating that the GTX would be much better for UE3 than the next-gen ATi board?

Knowing what we know now about the hyperthreaded design, the branching capabilities, pure FP32, etc., and assuming that he had to know all that...

What the heck was he smoking? With all the speeches about IEEE 32 and similar stuff from the 5800 days... any idea how he can justify this continued pro-Nvidia spin?

Do you guys remember, or have a link to, those somewhat recent comments?


Nvidia ships him and his entire studio hardware for free; I don't see the problem. When someone scratches your back, you scratch theirs right back, or you lose perks. And to be completely honest, there were what, a total of six actual running R520 cards at the time of this comment? Probably more; that's more of a sarcastic comment, but I doubt any of them were clocked much higher than a current X800. With low clocks we know the R520 doesn't hold a candle to the GTX. I'd be surprised if he knew or cared much how the R520 performed at the time anyway. It's a huge technological leap in some areas, mainly the memory controller, but it falls short in overall performance without the raw speed. And at the time of the comment, there simply weren't any 500 MHz+ cards.

When you develop on a card you learn its tricks and how to use it, as I'm sure they have with the G70; I'm sure they really like the core. I believe they said as much at E3 about how well it was running the engine. Many other developers, even small ones, have said much the same. You're putting too much weight on his response. The R520 may have theoretically better advanced shader performance, but that doesn't make a developer who's been using NV30s, 40s and 47s, someone who's used to the core architecture, how it works and how to develop for it, dishonest when they answer based on what they know.

"excuse me sir, you've been developing this engine for years now on Nvidia hardware, which do you prefer, ATI or Nvidia?" der
 
SugarCoat said:
Nvidia ships him and his entire studio hardware for free; I don't see the problem.
'Problem' is, when you're Epic, you can fucking afford to buy a bunch of video cards and still have enough loose change rattling around for a Ferrari or two, so you don't HAVE to be indebted to any one particular video card vendor. I think any reasonable, sensible person would not want to whore themselves and their games out to - in this case nvidia - for what in the end amounts to petty cash! They'd have more brains, balls, ethics and understanding than that.

And to be completely honest, there were what, a total of six actual running R520 cards at the time of this comment?
If that's the case, Sweeney should have declined to comment. That he chose otherwise speaks to his motives as an nvidia mouthpiece.

The guy has a history of making less-than-informed statements; his early comments about tiled deferred renderers being fundamentally incompatible with transform & lighting acceleration come easily to mind.
 
Or his interview from a few years back where he prognosticated that the entire 3D rendering pipeline would be moved back to the CPU "within a decade." Had he merely counted the number of operations required to do simple, Voodoo 1-type things such as basic FB blends or a bog-standard bilinear filter, he would have quickly seen how completely improbable that was, even with all the ludicrously overoptimistic predictions of future CPU/HW scaling that most of us believed in back in the '90s.
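Just to put some rough numbers on that (my own assumed figures, not anything from the interview), here's the back-of-the-envelope count in C for a Voodoo-1-class bilinear filtering workload:

/* Back-of-the-envelope estimate, with assumed (not sourced) figures, of
 * the arithmetic needed just to bilinear-filter a Voodoo-1-class scene
 * in software. */
#include <stdio.h>

int main(void)
{
    const double width    = 640.0;   /* Voodoo 1 era resolution          */
    const double height   = 480.0;
    const double fps      = 30.0;
    const double overdraw = 2.5;     /* assumed average overdraw         */

    /* Per bilinear sample: 4 texel fetches, 3 lerps (a mul-add or two
     * each), plus address math; call it ~20 ops, conservatively.        */
    const double ops_per_sample = 20.0;

    const double samples_per_sec = width * height * fps * overdraw;
    const double ops_per_sec     = samples_per_sec * ops_per_sample;

    printf("texture samples/s: %.1f million\n", samples_per_sec / 1e6);
    printf("filtering ops/s:   %.1f million\n", ops_per_sec / 1e6);
    return 0;
}

That works out to something on the order of 460 million arithmetic ops per second just for texture filtering of a mid-90s-class scene, before blending, geometry, the game itself, or any of the memory traffic, and the scenes people were actually shipping a decade later were far heavier than that.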
 
Meh. All I have to say is, "Cheers to Sweeney and his continued support of nvidia and linux" since that's all I use.
 
Guden Oden said:
I think any reasonable, sensible person would not want to whore themselves and their games out to - in this case nvidia - for what in the end amounts to petty cash! They'd have more brains, balls, ethics and understanding than that.


If that's the case, Sweeney should have declined to comment. That he chose otherwise speaks to his motives as an nvidia mouthpiece.

Your indignation is pretty funny. You're certainly no stranger to voicing strongly opinionated comments in public. So why shouldn't Sweeney be allowed to say whatever the hell he feels like? The only real difference is that people are interested in hearing what he has to say.
 