Perlin Noise?

lopri

Could someone explain what exactly it is? I have a GTX 280 and an HD 4890 side by side and have been doing some comparisons, but it isn't exactly limited to GT200 and RV770/RV790. I believe I had a similar experience with an 8800 GT and an HD 3850 as well.

Basically, what happens is that NVIDIA's cards take a much bigger hit once AA is applied, and they just don't feel as smooth as ATI cards at the same AA level. I am using a 30" monitor, so the frame buffer size is probably a factor, but I am comparing cards with the same amount of VRAM. And GT200 has superior specs to RV790 in just about every aspect (at least on paper).

So I decided to find out what is going on, and I did find something while running 3DMark06.

Compare_Details.jpg


This isn't an apples-to-apples test. The 280 is paired with an E8400 @ 3.6 GHz and the 4890 with an X2 4050e @ 3.0 GHz (waiting for a PII). But I figured it wouldn't matter much at 2560x1600/8AA/16AF.

The detailed feature tests show the 4890 is better at the simple vertex shader and Perlin Noise tests, so I got curious: what is Perlin Noise, and does it have anything to do with AA?

Any help is appreciated.
 
Perlin Noise is used to generate solid procedural textures. There's plenty about it on the web (Ken Perlin even won an Academy Award for it), including his seminal paper.
 
Thank you for the reference. But I thought this was a beginner's forum.. (maybe it was meant for folks who actually started to learn 3D at school or a similar institution?)

I am not even at such level. I am more like a 'consumer' compared to the rest of you. Maybe a mod could move this thread to 'Hardware' forum or somewhere more appropriate? Thank you much.
 
Perlin Noise can be seen as a smooth randomized function. It generates values between -1 and 1 given an input parameter. Think of it roughly like a sin(x) function, but instead of smoothly going from -1 to 1 at regular intervals, it smoothly goes from random values between -1 and 1 to other such random values at regular intervals. There's a 2D version of the Perlin noise function as well, which takes an x and a y parameter and generates a smooth "surface" in a similar fashion. Then there are 3D and higher-dimensional versions too. Up to 4D is useful in graphical applications, where the fourth dimension is time, so you get an animated 3D noise function.
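
Here is a minimal 1D sketch of that idea in Python (a toy example only, not Ken Perlin's reference implementation; the table size and function names are just placeholders):

[code]
import math, random

# Toy 1D gradient noise: a random slope at every integer point, blended
# smoothly in between. Illustration only, not the reference implementation.
random.seed(0)
GRADIENTS = [random.uniform(-1.0, 1.0) for _ in range(256)]

def fade(t):
    # Perlin's 6t^5 - 15t^4 + 10t^3 curve: zero slope at t = 0 and t = 1
    return t * t * t * (t * (t * 6 - 15) + 10)

def noise1d(x):
    i = int(math.floor(x))
    f = x - math.floor(x)              # position inside the cell, 0..1
    g0 = GRADIENTS[i % 256]            # gradient at the left lattice point
    g1 = GRADIENTS[(i + 1) % 256]      # gradient at the right lattice point
    v0 = g0 * f                        # each gradient contributes
    v1 = g1 * (f - 1.0)                # slope * distance to the sample
    t = fade(f)
    return 2.0 * (v0 + (v1 - v0) * t)  # scaled so output spans roughly -1..1

for x in (0.0, 0.25, 0.5, 0.75, 1.0, 1.25):
    print(round(x, 2), round(noise1d(x), 3))
[/code]

The value is zero at every integer point and wobbles smoothly in between, which is exactly the "random sin(x)" behaviour described above.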

Perlin noise is a basic primitive in many graphical techniques. It's useful anywhere you want some kind of controlled randomness. As a practical example, the wood shader I used in this demo is generated from Perlin noise. By using noise instead of a plain wood texture I can get each tile to look unique.
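
Just to illustrate the "controlled randomness" point, here is a toy stand-in (not the actual shader from the demo): warping a simple concentric-ring pattern with 2D noise gives something vaguely wood-like, and a different noise seed gives each tile a unique look.

[code]
import math, random

# Toy 2D value noise (standing in for real Perlin noise) plus a ring pattern.
random.seed(1)
LATTICE = [[random.random() for _ in range(16)] for _ in range(16)]

def smooth(t):
    return t * t * (3 - 2 * t)                  # smoothstep blend

def noise2d(x, y):
    xi, yi = int(math.floor(x)), int(math.floor(y))
    xf, yf = x - xi, y - yi
    v00 = LATTICE[yi % 16][xi % 16]             # four surrounding lattice values
    v10 = LATTICE[yi % 16][(xi + 1) % 16]
    v01 = LATTICE[(yi + 1) % 16][xi % 16]
    v11 = LATTICE[(yi + 1) % 16][(xi + 1) % 16]
    u, v = smooth(xf), smooth(yf)
    top = v00 + (v10 - v00) * u
    bottom = v01 + (v11 - v01) * u
    return top + (bottom - top) * v             # value in [0, 1]

def wood(x, y):
    # distance from the centre, wobbled by noise; the fractional part of the
    # radius produces repeating "growth rings"
    r = math.hypot(x, y) * 8.0 + 2.0 * noise2d(x * 4.0, y * 4.0)
    return r - math.floor(r)

for row in range(10):
    print("".join(" .:-=+*#"[int(wood(col / 16.0, row / 16.0) * 7.99)]
                  for col in range(48)))
[/code]

With a plain texture every tile would repeat exactly; reseeding the noise (or offsetting its input per tile) is what makes each tile look unique.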
 
The detailed feature tests show the 4890 is better at the simple vertex shader and Perlin Noise tests, so I got curious: what is Perlin Noise, and does it have anything to do with AA?

I am not sure AA is even applied in the feature tests if you switch it on via 3DMark's own settings. But nevertheless: the HD 4890 packs massive shading power; its theoretical peak is somewhere between 1.5 and 2x that of the GTX 280 (depending on who you ask). The Perlin Noise tests in 3DMark06 and Vantage scale almost linearly with core/shader clock on these chips, so it's much like a directed test running their shading engines at full throttle.
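
Roughly, using the commonly quoted paper specs (the clocks and flops-per-clock below are my assumptions, so treat this as a back-of-the-envelope sketch):

[code]
# Theoretical ALU throughput: GFLOPS = ALUs * flops per clock * clock in GHz
def gflops(alus, flops_per_clock, ghz):
    return alus * flops_per_clock * ghz

hd4890     = gflops(800, 2, 0.850)  # 800 ALUs, MAD = 2 flops, 850 MHz  -> ~1360
gtx280_mul = gflops(240, 3, 1.296)  # 240 SPs, MAD + MUL counted        -> ~933
gtx280_mad = gflops(240, 2, 1.296)  # 240 SPs, MAD only                 -> ~622

print(hd4890 / gtx280_mul, hd4890 / gtx280_mad)  # ~1.46x and ~2.19x
[/code]

That's where the "between 1.5 and 2x, depending on who you ask" comes from: it hinges on whether you count GT200's extra MUL.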

WRT the vertex stuff - I think there's more going on than meets the eye. I suspect non-user-adjustable settings in the driver are responsible for much of the vertex behaviour we're seeing today. (As a reference: take AMD's DX10 shader tests, which Humus should know very well, that were sent out at the HD 2900 launch. They showed vastly superior vertex shading performance for the HD 2900 compared to the GF8800 at the time. Later, the vertex performance was reduced by a factor of ten or so via drivers.)
 
Humus - Oh, that's a really cool site! Is that your site? I've downloaded a bunch of demos. And thank you so much for the explanation of Perlin Noise. Along with the demo, I can understand what it does. From what I understand, it has little to do with anti-aliasing. Correct? (I'd imagine AA would be applied with more 'fixed' patterns.) It seems rather suitable as a replacement for simple textures (including transparent textures?).

CarstenS - Thinking about it, you might be correct that AA/AF doesn't apply to the feature tests in 3DMark. I know AMD's shader units and NV's are different, but I assume by 'shading power' you mean FLOPS, correct?

But according to the feature tests, GT200's shaders seem to give better performance? (Again, except for Perlin Noise.)

The whole thing began with this.



The first row is the default run (1280x1024), and the second row is 2560x1600/8AA/16AF. It started with simple stress-testing on HD 4890, then I noticed the Game Test 4 looked a lot smoother on HD 4890. So I ran the whole thing a couple times and verified my eyes aren't totally bad. :) The GTX 280's SM3.0 test score drops dramatically at 2560/8AA, and this somewhat mirrors my experience in at least two games based on the same engine: Oblivion and Fallout 3.

But according to you guys, this is not something related to Perlin Noise. Am I correct?
 
But according to you guys, this is not something related to Perlin Noise. Am I correct?

Correct! Perlin Noise is something that can easily be run at peak rates on ATI hardware. Well, it can also run just as effectively on NV hardware, but they have a lower peak.

The reason you're seeing such a performance drop at 8x AA in the shader model 3.0 tests is probably that these tests (or parts of them) are significantly bound by GPU fillrate, which will be lower on GT200 than on HD 4890 at these settings. Are you seeing the same pattern at 4x AA?
Or it could also be that ATI's early Z rejection is more effective in these tests. And of course it could be a combination of that with the lower fillrate at 8x AA.
 
humus,

Sure, it was called AMD DX10 Shadertests and consisted of a series of shaders with dozens of iterations of the same instruction.

There were 5 tests: float MAD, float4 MAD, int MAD, int4 MAD (those run once serially, once in parallel) and SQRTs, plus a 5-instruction mix. All of these were run once as vertex shaders and once as pixel shaders.

I was under the impression that you were the one writing those tests - if that's not true, please accept my apologies.



lopri,

Yes, I meant the theoretical throughput of the ALUs, but that alone never tells the whole story. Most of the other feature tests, Perlin Noise excepted, are not mainly limited by GFLOPS throughput - shader particles, for example, is mainly limited by bandwidth (internal or external, I am not 100 percent sure).
 
I was under the impression that you were the one writing those tests - if that's not true, please accept my apologies.

Me and another guy wrote some simple tests to measure the R600's strengths and weaknesses vs. G80, but that was intended for internal use and I don't recall this was ever released externally. Can't say that it wasn't though. We had a test that sounds similar to that, though my memory is weak, but that would be just one test out of like 20+ different tests, and I didn't write that one in any case.

Googling on this I found only one reference to it:
http://techreport.com/articles.x/12458/3
It sounds too generic for me to know if it's really something ripped out of our test suite or if it's a completely different test.
 
The first row is the default run (1280x1024), and the second row is 2560x1600/8AA/16AF. It started with simple stress-testing on HD 4890, then I noticed the Game Test 4 looked a lot smoother on HD 4890. So I ran the whole thing a couple times and verified my eyes aren't totally bad. :)

You're testing with two wildly different CPUs. The E8400 is maybe twice as fast as the 4050e, meaning that the 4890 is heavily limited by the slow CPU at the lower resolution (and less so at the higher one, which is why it looks like the GT200 is taking a larger hit). If you want to compare them you should really run them in the same machine.
Generally the 4890 and the GTX 280 are about the same speed.
 
Me and another guy wrote some simple tests to measure the R600's strengths and weaknesses vs. G80, but that was intended for internal use and I don't recall this was ever released externally. Can't say that it wasn't though. We had a test that sounds similar to that, though my memory is weak, but that would be just one test out of like 20+ different tests, and I didn't write that one in any case.

Googling on this I found only one reference to it:
http://techreport.com/articles.x/12458/3
It sounds too generic for me to know if it's really something ripped out of our test suite or if it's a completely different test.

It's from that test suite, from what I recall being said during that time period... unless the "other guy" you're referring to isn't Guennadi :-? I also think Wavey provided the exe in the R600 review thread.
 