XMAN26 said: It is totally useless to have discussions with some people because they truly want to believe that Nvidia is the devil of GPUs and Intel the devil of CPUs and AMD/ATI the angels over them all.
Ironic post, don't you think, XMAN?
-FUDie
Quote: No, there were rendering problems with the patch.
So instead of fixing the alleged problems, they didn't release the patch. Yeah, that's a good solution.
In any event, you still haven't responded to the TRAOD patch being retracted and the benchmark being removed.
-FUDie
XMAN26 said: No, because unlike you and others, I'm not making AMD/ATI out to be devils, whereas to you, anything Nvidia does = evil, even when it's proven to NOT be of their doing.
Actually, you do put down AMD/ATI with every post and put nvidia on a pedestal, whereas I don't make out nvidia to be any more evil than they actually are. But please, feel free to show where I've accused nvidia of anything that wasn't publicly verified.
Here, I'll help: http://forum.beyond3d.com/search.php?searchid=1310667&pp=25
Oh, and here's a classic too: http://forum.beyond3d.com/showthread.php?p=1360269#post1360269
One more: http://forum.beyond3d.com/showthread.php?p=1339955#post1339955
XMAN26 said: WOW! Going all the way back to the 5800 FX days on that one with 3DMark03, huh, and when was that a game? One where the pic used in the "Shader scape" was shown to be rendered more true on the Nvidia card than ATI's. And yes, real world, ATI has long since beaten up Nvidia in the 3D pro synthetic apps, but in the ones used by pros it was just the opposite. 4870 and GTX 285 was no different: on paper, ATI had the better specs for most things, but yet got beat in actual apps. I doubt that is going to change this time around either.
XMAN26 said: I still think this launch was a fail on AMD's part.
Guess what? AMD has released 4 chips from the 5xxx series, where's the fail?
You aren't the objective observer you think you are.
-FUDie
Quote: I responded to the AC claim. But we can speak about Half-Life 2 and the non-existence of the "NV3x path" in the retail version. Where was it?
Half-Life 2 did have an NV3x path. You don't recall all the PS1.x shaders and liberal use of _pp for the few 2.0 shaders? Maybe you should provide some source to back up your comment. Here, I'll help: http://forums.hexus.net/graphics-cards-monitors/3367-halflife-2-ati-only-route.html That post links Anandtech: http://www.anandtech.com/showdoc.aspx?i=1862.
Anandtech said: even with the special NV3x codepath, ATI is the clear performance leader under Half-Life 2 with the Radeon 9800 Pro hitting around 60 fps at 10x7. The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath.
So there is a special NV3x path, and it's Valve's fault for not making the 5900 faster than it is? Maybe Valve should have tried flat-shaded polygons?
Quote: Apparently the only people concerned about the removal of a benchmark are ATI fans. Gotta cling to every last shred of anything that shows Nvidia in a negative light in gaming results.
There was plenty to show NV30 in a negative light, but TRAOD was the first game to make use of lots of new DX9 features: SM2.0, 10_10_10_2 render targets, etc. Why wouldn't a gaming enthusiast be interested in benchmarking that?
-FUDie
Quote: You should check the retail version. There is no NV3x path - only DX7, DX8 and DX9.
Retail version with patches or without? In any event, what's the point of an NV3x path when DX8 works just as well for it? http://www.anandtech.com/showdoc.aspx?i=1863&p=8 Not exactly stellar performance for the 5900 in mixed mode.
I know the 5900 is faster than the 4600, so use the DX8 path, get faster performance and be happy.
-FUDie
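(To make the terminology above concrete, here is a minimal, hypothetical sketch of what a "rendering path" choice looks like from the engine's side. This is not Valve's actual code; the function name pick_render_path and the hardware groupings are illustrative assumptions based only on the claims in this thread: NV3x gets a mixed mode built from PS1.x shaders plus partial-precision 2.0 shaders, while R3x0 runs the full DX9 path.)

def pick_render_path(gpu: str) -> str:
    # Hypothetical illustration of per-GPU path selection.
    full_dx9 = {"R300", "R350", "R360"}        # Radeon 9700/9800: full-precision PS2.0 runs well
    nv3x = {"NV30", "NV31", "NV34", "NV35"}    # GeForce FX: PS2.0 is slow, prefer PS1.x and _pp
    dx8_class = {"NV25", "NV28", "R200"}       # GeForce4 Ti / Radeon 8500 era
    if gpu in full_dx9:
        return "dx9"          # full SM2.0 shaders
    if gpu in nv3x:
        return "dx9_mixed"    # mostly PS1.x shaders plus partial-precision 2.0 shaders
    if gpu in dx8_class:
        return "dx8"
    return "dx7"              # fixed-function fallback

print(pick_render_path("NV35"))   # -> "dx9_mixed"
print(pick_render_path("R350"))   # -> "dx9"

(Whether that mixed path buys anything over simply falling back to "dx8" is exactly the disagreement in the posts above.)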
XMAN26 said: 1. Where has anything been proven other than pure conjecture from Huddy? But then again, in your eyes, if Huddy says it's so, it must be all the proof you need.
Where did I say that? Your statement that "You won't believe AMD until Ubisoft admits to being wrong" is ludicrous because there's no way Ubisoft will do that.
XMAN26 said: 2. And where in that statement was I wrong?
Because you are probably the only person to think this way?
XMAN26 said: 3. Again, where am I wrong? For years ATI has ruled synthetic benches and the performance results there have not translated to gaming performance wins gen vs. gen: R600 vs G80, RV670 vs G92, RV770 vs GT200. R580 being the lone exception since '04. Before that, R300/350.
R300/350 were far faster than anything around. Did you not even look at the Half-Life 2 numbers posted a couple posts back? And that's just one game. The only place NV3x did well was Doom 3, but I don't think that holds anymore.
XMAN26 said: Now this is cute. It's shown that Valve, in conjunction with ATI, released a version that hurt a competitor and you are TOTALLY OK with it. But yet, with conjecture and speculation about Nvidia doing the same, even when proven false, it's "OMG, look what they are doing" whether or not they are doing anything.
The difference here is that the NV3x path didn't make 5900 performance competitive. It also didn't add new features (the DX9 path was already there). So what is the real point? How does this compare to, say, the AC patch, which added DX10.1 features?
XMAN26 said: And FYI, the NV3x code path wasn't released as a patch until after NV40 launched. What's the point of releasing it then?
Maybe it was buggy, like the alleged bugs in the AC patch? I don't know, why don't you ask nvidia/Valve.
XMAN26 said: 4. I still find it somewhat of a fail. The 5770 being slower than the previous high end is nothing to be happy about. Hell, the 8600 GTS was an even worse launch, as its performance fell back two gens and it wasn't even the high-end part; 7600 GTs were faster. If the midrange for nvidia doesn't at least equal the GTX 285, it too is a fail. As to why I consider R870 a fail: you double nearly everything from RV770 and at best it's 30% faster than your competitor's DX10 hardware overall.
Your lack of logic is astounding. RV770 was slower than GTX 285, right? Where did you think doubling RV770 would put performance? I'll give you a big hint: 5870 is not double 4870 in one big area. But 5870 is easily double 4870 in many ways, such as shader/compute performance, as long as you avoid the one big thing that's not doubled.
Since when have the next gen's midrange been faster than the previous gen's high-end? Never? I'll draw up a list of ATi's products, you can do the same with nvidia:
High end: R300->350->360->420->480->520->580->600->RV670->770->Cypress
Midrange: RV350->370->410->530->635->730->740->Juniper
RV740 was certainly faster than RV670, but I wouldn't call RV740 a midrange chip. Plus, RV670 was sort of broken.
-FUDie
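(To put rough numbers on the "doubled everything except one big thing" argument above: a quick back-of-the-envelope sketch using the commonly published specs for the two cards. Treat the figures as approximate; the script is an illustration, not data from this thread.)

# Approximate published specs: Radeon HD 4870 (RV770) vs HD 5870 (Cypress).
specs = {
    "HD 4870": {"shader_units": 800,  "gflops": 1200, "mem_bw_gb_s": 115.2},
    "HD 5870": {"shader_units": 1600, "gflops": 2720, "mem_bw_gb_s": 153.6},
}

for key in ("shader_units", "gflops", "mem_bw_gb_s"):
    ratio = specs["HD 5870"][key] / specs["HD 4870"][key]
    print(f"{key}: {ratio:.2f}x")
# shader_units: 2.00x, gflops: 2.27x, mem_bw_gb_s: 1.33x

(Compute roughly doubles while memory bandwidth only rises by about a third, which is why a straight 2x gain over the 4870 in games was never on the table.)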
What's the problem? This comes directly from Huddy: http://www.thinq.co.uk/news/2010/3/8/amd-game-devs-only-use-physx-for-the-cash/
Did you think that through? So AMD is going to make chipsets that only work with its graphics cards, and graphics cards that only work with its chipsets, and in doing so hand the entire desktop PC market over to Nvidia? Nice plan.
PhysX adds value for Nvidia's customers. What value does a proprietary peripheral interface add when it does exactly the same thing that an existing widely adopted standard does?
If the slot has more bandwidth, it would add value there. It can also provide more power through the slot, so you don't need as many connectors on the card, and so on.
"Eidos’ legal department" does not equal Nvidia.