Why all the artefacts? (for example in Farcry)

Ylandro

Newcomer
Reading the post 'Brent latest review', I checked out the article http://www.hardocp.com/article.html?art=NTk5 and started to wonder about the lighting artefacts the GeForce cards show in Farcry.

A simple question: Why do they exist?

I mean... this generation of videocards has already existed for more than a year. Why do we still see these kinds of problems?

Is it something in the game itself? Then why isn't the ATI card affected? Why does it show up in Need for Speed also? Then you would think it's the drivers... But shouldn't the drivers be mature by now? And why are only these two games affected? Are games so completely different from each other that they always stumble upon new driver bugs?

I don't see this kind of behaviour with CPUs. It's not like you suddenly get weird behaviour when running an AMD instead of an Intel. Why with videocards? Do videocards have so much room for their own 'interpretation' of what they should do?

Or are the videocard manufacturers so busy 'optimizing' performance by second-guessing the game developers that they introduce artefacts themselves?

Anyone care to shed some light on the matter?


NB: Before people think I'm attacking NVidia specifically... It's not just Farcry. You see these kinds of problems very often, with both NVidia and ATI cards. I'm just using this case as an example.
 
well... it could be an "optimization"

To get an idea of how much the number of lights can affect performance, just look at the 3DMark 2001 high polygon count test and compare performance with 8 lights vs 1 light.

I guess there are ways to hack a scene to look almost correct by counting just the nearest light or two and using some approximation for lights further away...
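
To make that concrete, here's a minimal CPU-side sketch of that kind of shortcut (all names and numbers are invented for illustration; this isn't Farcry or driver code): lights inside a cutoff radius get the real falloff math, while everything farther away gets lumped into a single flat term. If the cutoff varies per object or per detail level, the lighting on a surface can jump as you move.

Code:
/* A sketch only: evaluate lights within a cutoff radius exactly and fold
 * everything farther away into one flat ambient term. All names and numbers
 * here are invented for illustration. */
#include <stdio.h>

typedef struct { float x, y, z, intensity; } Light;

float shade_point(float px, float py, float pz,
                  const Light *lights, int count, float cutoff)
{
    float lit = 0.0f;      /* nearby lights, full 1/d^2 falloff        */
    float far_sum = 0.0f;  /* distant lights, approximated in one lump */
    for (int i = 0; i < count; ++i) {
        float dx = lights[i].x - px;
        float dy = lights[i].y - py;
        float dz = lights[i].z - pz;
        float d2 = dx*dx + dy*dy + dz*dz;
        if (d2 < cutoff * cutoff)
            lit += lights[i].intensity / (d2 + 1.0f);  /* the "real" math */
        else
            far_sum += lights[i].intensity;            /* cheap guess     */
    }
    /* distant lights become a constant-ish ambient estimate */
    return lit + far_sum / (cutoff * cutoff);
}

int main(void)
{
    Light lights[3] = { {1, 0, 0, 10}, {50, 0, 0, 10}, {200, 0, 0, 10} };
    printf("shaded value: %f\n", shade_point(0, 0, 0, lights, 3, 25.0f));
    return 0;
}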
 
Ylandro said:
A simple question: Why do they exist?

I mean... this generation of videocards has already existed for more than a year. Why do we still see these kinds of problems?

Well, PS 1.4 support has existed for more than two and a half years, yet ATI broke it in their latest drivers (for both the R200 and the R300).

I mean ... in the driver world anything can happen.
 
OpenGL_guy spending too much time on B3D again :D

 
I'm not sure laymen can say whether the game is at fault or the drivers are. I think we've all seen plenty of cases where a developer points at the IHV, and the IHV points at the developer. For example, IIRC with the KOTOR soft shadows, ATI blamed Bioware and Bioware blamed ATI.

The situation can become even more obfuscated when you consider that drivers may be buggy, and software developers code around those bugs. Then if the bugs in the driver are fixed, the game breaks.
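
A purely hypothetical sketch of that pattern (the vendor string, version numbers and functions are all made up, not taken from any actual game or driver): the game detects the reported driver version and switches to a fallback path for versions known to be buggy.

Code:
/* Hypothetical illustration only: the vendor string, version numbers and the
 * query function are made up. A game works around a known driver bug by
 * checking the reported driver version; once a later driver fixes the bug,
 * this path either keeps a slower fallback alive or breaks in a new way. */
#include <stdio.h>
#include <string.h>

struct DriverInfo { char vendor[32]; int major, minor; };

/* Pretend this is filled in from the D3D/OpenGL device strings. */
static void query_driver(struct DriverInfo *info)
{
    strcpy(info->vendor, "ExampleVendor");
    info->major = 56;
    info->minor = 56;
}

int main(void)
{
    struct DriverInfo drv;
    query_driver(&drv);

    /* Work around a (made-up) blending bug present before version 57.x */
    int use_workaround = (strcmp(drv.vendor, "ExampleVendor") == 0 && drv.major < 57);

    printf(use_workaround ? "using fallback blend path\n"
                          : "using native blend path\n");
    return 0;
}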

And of course, let's not discount the possibility of over-aggressive compilers. Always a possibility ;)
 
I can't wait for the day that graphics chips are as robust and rigorously specified as CPUs.

It's not that a PowerPC or x86 chip is without bugs, but man, if a processor shipped with even one of the problems that developers seem to find by the handful in every GPU, the outcry would be horrendous.

Yet a GPU or driver set can at any moment corrupt texture outputs or fudge filtering, and the IHVs can still sell the cards with barely a peep of protest.
 
3dilettante said:
I can't wait for the day that graphics chips are as robust and rigorously specified as CPUs.

It's not that a PowerPC or x86 chip is without bugs, but man, if a processor shipped with even one of the problems that developers seem to find by the handful in every GPU, the outcry would be horrendous.

Yet a GPU or driver set can at any moment corrupt texture outputs or fudge filtering, and the IHVs can still sell the cards with barely a peep of protest.

And you don't think there are bugs in compilers for CPUs that cause similar issues?
 
There probably are a few, but nowhere near the level that current GPUs are at.
Looking at the features the NV3x chips can't do compared to what the R3xx cores can, even though both series supposedly fully support the same standard, is something that would never fly for a CPU.

Heck, even something as relatively minor as the alpha-blending bug in the R3xx chips would be something worthy of a recall if it were an Intel chip suffering from a similarly-sized error. I don't see any campaigns for a recall like there was for the FDIV bug.
 
3dilettante said:
There probably are a few, but nowhere near the level that current GPUs are at.

Also remember that CPU compilers have had a steady platform to work on since the 80386 and even earlier [over 15 years]. But despite that, there have been numerous bugs fixed in GCC over the years that were very bad: GCC 2.7.x's optimizer producing completely broken code (it runs, but gives incorrect results), other releases producing significantly slower code, or a newer revision of GCC taking nearly twice as long to compile as an earlier one... Give it another 15 years and GPU drivers will have the same perception of being flawless.
 
BRiT said:
3dilettante said:
There probably are a few, but nowhere near the level that current GPUs are at.

Also remember that CPU compilers have had a steady platform to work on since the 80386 and even earlier [over 15 years]. But despite that, there have been numerous bugs fixed in GCC over the years that were very bad: GCC 2.7.x's optimizer producing completely broken code (it runs, but gives incorrect results), other releases producing significantly slower code, or a newer revision of GCC taking nearly twice as long to compile as an earlier one... Give it another 15 years and GPU drivers will have the same perception of being flawless.

Which is why I can't wait. ;)
I'm not holding the rough status of the chips against them, but I would hope they would work a little harder and a little faster on getting to a more rigorous standard.

A lot of the work that the CPU industry went through is available for their use now, so there are a bunch of roadblocks the video manufacturers don't need to run into.
 
They don't need to; if the graphics card screws up, some pixels are the wrong color. Until the GPU is used for something that is actually "important", like making movies, no one will care.
 
BRiT said:
Also remember that CPU compilers have had a steady platform to work on since the 80386 and even earlier [over 15 years]. But despite that, there have been numerous bugs fixed in GCC over the years that were very bad: GCC 2.7.x's optimizer producing completely broken code (it runs, but gives incorrect results), other releases producing significantly slower code, or a newer revision of GCC taking nearly twice as long to compile as an earlier one... Give it another 15 years and GPU drivers will have the same perception of being flawless.

How long ago was this and how long did it take to get fixed?
 
GCC 2.7.x was around 2000, and its optimizations would even completely break the following code:

Code:
#include <stdio.h>

int main(void) {
  unsigned char     sign;
  unsigned char     integer[2] = { 0xff };   /* integer[0] = 0xff, integer[1] = 0 */

  if ( integer[0] != 0 )
    sign = 0xff;
  else
    sign = 0x00;

  printf ( "sign: %x\n", sign );
  printf ( "sign == 0xff: %d\n", (sign == 0xff) );

  return 0;
}

Broken output with optimization enabled ("gcc -O1 example.c"):
sign: ff
sign == 0xff: 0

Correct output with optimization disabled ("gcc -O0 example.c"):
sign: ff
sign == 0xff: 1

For more, google "gcc 2.7 bugs". A majority of the bugs were fixed in GCC 2.8; some weren't fixed until 2.9.

As for compile speed, take a look at any forum or mailing list dealing with GCC and compiling the Linux kernel with the newer GCC versus the older ones. Here's one such report on the C++ side, as recent as Thu, 12 Feb 2004 21:50:24: "We've got some generated C++ which compiles in about 10s (on a P4 2800) at -O0 with gcc-3.2 or 3.3.2, but takes about 1000s with 3.4.0 or 3.5.0." [ http://gcc.gnu.org/ml/gcc/2004-02/msg00796.html ]

There's also major compile-time differences between GCC 2.9.x and GCC 3.2 as documented here: "The compile times is probably the biggest difference between 2.95.3 and 3.2. I compiled arts, kdelibs and kdebase with both. 2.95.3 took a total of about 55 minutes, while 3.2 took one hour and 32 minutes for the same source, with the same configure options (--disble-debug and --enable-final)." [ http://www.linuxandmain.com/modules.php?name=News&file=article&sid=185 ]

And here's a comparison of compiling the Linux kernel with both GCC versions [ http://www.ussg.iu.edu/hypermail/linux/kernel/0302.0/0356.html ]:
Code:
Kernbench-2: (make -j N vmlinux, where N = 2 x num_cpus) 
        Elapsed User System CPU 
gcc2.95 46.08 563.88 118.38 1480.00 
gcc3.21 69.93 923.17 114.36 1483.17 


Kernbench-16: (make -j N vmlinux, where N = 16 x num_cpus) 
        Elapsed User System CPU 
gcc2.95 47.45 568.02 143.17 1498.17 
gcc3.21 71.44 926.45 134.89 1485.33

But it's somewhat understandable, in that GCC 3.x majorly revamped its implementation of code analysis and optimizations.
 
These are some screens I took a few weeks ago, when the second demo came out, on a 5900 with the 56.56 drivers.

The rocks just in front of me are dark..
1.jpg


Walk one foot further, and BAM, it's like someone hit a light switch..
2.jpg


Look at the rock in the background to the right.. suddenly dark..
3.jpg


This time, just the opposite, lights on..
4.jpg


One foot further.. lights off..
5.jpg


Look to the right, towards the dock. That huge rock has the lights on..
6.jpg


Move a tad farther, and the lights are suddenly "off".. If someone is standing over there, it's going to be hard to see them...
7.jpg


As Brent mentioned, it happens in NFS:U also. It also happens in BF with water, which is pretty annoying.
 
What I really don't understand is why the lights go on and off when you walk. A driver bug that goes on and off? :?

I'm not a 3D programmer... I've just done a few lines in DarkBasic.
But from what I understand, everything about those rocks is static: not only the geometry and textures, but also the lighting. It's not like the player has a flashlight introducing a dynamic light source. So why would anything change, especially for those faraway rocks?
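
As a rough illustration of what "static" lighting would mean here (everything below is invented for illustration, not engine code): a baked lightmap is just a precomputed table looked up by the surface's own coordinates, so the player's position shouldn't enter into it at all.

Code:
/* Purely illustrative, made-up data: a baked lightmap is a precomputed table
 * looked up by the surface's own (u, v) coordinates. The camera or player
 * position is never an input, so the result should not change as you walk. */
#include <stdio.h>

#define LIGHTMAP_W 4
#define LIGHTMAP_H 4

/* Precomputed at level-build time; never changes at run time. */
static const float lightmap[LIGHTMAP_H][LIGHTMAP_W] = {
    { 0.2f, 0.3f, 0.4f, 0.4f },
    { 0.3f, 0.6f, 0.7f, 0.5f },
    { 0.4f, 0.7f, 0.9f, 0.6f },
    { 0.4f, 0.5f, 0.6f, 0.4f },
};

/* Lookup keyed only on the rock's surface coordinates. */
static float baked_light(float u, float v)
{
    int x = (int)(u * (LIGHTMAP_W - 1));
    int y = (int)(v * (LIGHTMAP_H - 1));
    return lightmap[y][x];
}

int main(void)
{
    /* Same surface point, so the same brightness no matter where the player stands. */
    printf("rock brightness at (0.5, 0.5): %.2f\n", baked_light(0.5f, 0.5f));
    return 0;
}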
 
In the demo, anyway, my 9800 had issues with the rocks as well. I didn't complain because I didn't care since it was a demo, but it was pretty noticeable.
 
It didn't have lighting problems.

I have a right to complain; it does the same thing in the retail version, and it's very annoying.
 
Ylandro said:
But from what I understand, everything about those rocks is static. Not only the geometry and textures, but also the light.

Why would the rocks be static?
Ever heard of level-of-detail?

It's possible that the bug is affecting only one of the detail levels.
And LOD might change more than just the polycount, you know...
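
A minimal sketch of how that could happen (the distances, names and lighting paths below are my own assumptions, not anything from Farcry): if the LOD switch changes the lighting path along with the mesh, a bug in just one level's path would appear and disappear as the camera moves.

Code:
/* A sketch under my own assumptions (distances, names and lighting paths are
 * invented, not Farcry's): the LOD switch changes the lighting path along
 * with the mesh, so a bug in one level's path pops in and out as you move. */
#include <stdio.h>

typedef enum { LOD_NEAR = 0, LOD_MID = 1, LOD_FAR = 2 } LodLevel;

static LodLevel pick_lod(float distance)
{
    if (distance < 20.0f) return LOD_NEAR;   /* full mesh, per-pixel lighting */
    if (distance < 80.0f) return LOD_MID;    /* reduced mesh, vertex lighting */
    return LOD_FAR;                          /* imposter, baked lighting only */
}

int main(void)
{
    const char *paths[] = { "per-pixel lighting", "vertex lighting", "baked lighting only" };
    for (float d = 10.0f; d <= 100.0f; d += 30.0f)
        printf("distance %5.1f -> %s\n", d, paths[pick_lod(d)]);
    return 0;
}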
 