Anyone else wondering if 6800 Ultra is really the 6800?

Chalnoth said:
DemoCoder said:
Integer codecs don't necessarily mean a loss of quality.
Well, since I don't really know the math that goes into an MPEG-4 encode, I can't really give an intelligent comment on that. All I can say is that when I'm encoding a movie, any small errors that creep in will be there for good, so I'd rather let my computer encode for a few hours while I'm gone or sleeping than deal with slightly lower quality.

Most video codecs are based on the concept of a transform (DCT, FFT, DWT). All three of these can be done absolutely losslessly via all-integer code. It's been proven that any linear transform that is invertible and finite-dimensional has a lossless integer implementation. JPEG-2000 implementations take advantage of this discovery.
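To make that concrete, here's a rough sketch (my own toy illustration, not code from any shipping codec) of the S-transform, the integer Haar transform behind JPEG-2000's reversible lifting filters. Forward and inverse use nothing but integer adds, subtracts and shifts, and the inverse recovers the inputs bit-exactly:

```c
#include <stdio.h>

/* floor(x/2); >> 1 gives this on two's-complement machines
 * with arithmetic right shift (assumed here). */
static int floor_half(int x) { return x >> 1; }

/* Forward S-transform: rounded-down average and difference. */
void s_forward(int a, int b, int *low, int *high)
{
    *high = a - b;
    *low  = floor_half(a + b);
}

/* Inverse: recovers a and b exactly -- no rounding error accumulates. */
void s_inverse(int low, int high, int *a, int *b)
{
    *a = low + floor_half(high + 1);
    *b = *a - high;
}

int main(void)
{
    int low, high, a, b;
    s_forward(87, 203, &low, &high);   /* two sample pixel values */
    s_inverse(low, high, &a, &b);
    printf("%d %d\n", a, b);           /* prints: 87 203 */
    return 0;
}
```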

But even if you don't want to cope with lossless transforms, you can support arbitrary-precision fixed point, which is how many high-end DVD players with DSPs do it today.
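A fixed-point multiply is just an integer multiply plus a shift with rounding. A toy sketch (Q-format and constant picked purely for illustration):

```c
#include <stdint.h>

/* Q15 fixed point: 15 fractional bits.  The constant is
 * cos(pi/4) ~= 0.7071, scaled: round(0.70710678 * 32768). */
#define COS_PI_4_Q15 23170

/* Multiply by a Q15 constant: widen to 64 bits to avoid overflow,
 * add half an LSB (16384) to round to nearest, shift back down. */
static int32_t mul_q15(int32_t x, int32_t c_q15)
{
    return (int32_t)(((int64_t)x * c_q15 + 16384) >> 15);
}

/* e.g. mul_q15(1000, COS_PI_4_Q15) == 707, vs. 1000 * 0.7071 = 707.1 */
```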

The same goes for arithmetic encoding (use Q-coding); Huffman is already integer. Subpixel motion compensation can be done with integers, etc.
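Half-pel motion compensation is a good example of how trivial the integer version is: MPEG-style half-pel interpolation is just an average with round-up. A rough sketch, function names mine:

```c
#include <stdint.h>

/* (a + b) / 2, rounded up -- the standard half-pel average. */
static uint8_t half_pel(uint8_t a, uint8_t b)
{
    return (uint8_t)((a + b + 1) >> 1);
}

/* Predict a w x h block from a reference frame at a half-pel
 * horizontal offset: each output pixel averages two neighbouring
 * reference pixels.  No floating point anywhere. */
void predict_halfpel_h(const uint8_t *ref, int stride,
                       uint8_t *dst, int w, int h)
{
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
            dst[y * w + x] = half_pel(ref[y * stride + x],
                                      ref[y * stride + x + 1]);
}
```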
 
First of all, were those Far Cry shots really comparing PS 2.0 to PS 3.0 for Nvidia, or were they comparing PS 1.1 to PS 3.0? Some of them look like the HDR effects ATI is already capable of with its PS 2.0 shader tech.

and...
I have been wondering that myself, because of not only the name string but also the clock speeds...

While I will say that I think fanATIcs claiming the R420 will have 16 pipelines and be clocked close to 600 MHz are living in a fairy tale, I do believe that the R420 will be clocked higher than 400 MHz... In fact, the 400 MHz figure is what puzzles me so much. That is EXTREMELY low. Bandwidth is very low as well...

550 MHz? That's only a 75 MHz (150 MHz DDR) increase over the NV38.

I think something is going on beneath the surface here. The core and memory speeds both seem *puzzlingly* low. Perhaps we're looking at a 6900 or 6850 at E3?

If these are final shipping specs, I can see a 550 MHz, 12-pipeline, 600 MHz RAM X800 Pro defeating the 6800 Ultra - for 100 bucks less to boot.
Fanatics are not the ones saying it's 600/600; they are only REPEATING info that is currently out there. It's not some fanboy-made-up pipe dream like you seem to think. In fact it's on more than one website, including [H], and posted by Kyle. They are the ones saying that the X800 XT is 600/600.

Which gets into Uttar's assumption a page or two back.

If the X800 XT is 600/600, then Nvidia releasing a card at 475/600 a month later is not going to accomplish squat.

Some of you, in fact many of you, also seem to be making the mistake of thinking the R420 is simply a 16-pipe version of the R300 technology. Totally false. It has shader processing enhancements as well.

I guess all will be revealed in time.
 
DaveBaumann said:
Is that "PS3.0 making a difference" or "What the application is doing makingf a difference"?

Mostly the latter. Offset mapping can of course be done on PS2.0. So can stencil shadow volumes. The only PS3.0 feature I can think of that FarCry might be using is FP texturing/blending.
 
DemoCoder said:
DaveBaumann said:
Is that "PS3.0 making a difference" or "What the application is doing makingf a difference"?

Mostly the latter. Offset mapping can of course be done on PS2.0. So can stencil shadow volumes. The only PS3.0 feature I can think of that FarCry might be using is FP texturing/blending.

That was what I expected too. The question is: if Nvidia helped implement it, would they want the additions to be used on PS2.0 hardware? I somehow doubt it.
 
Ardrid said:
So does anyone know what the first image was? It seems far too bland to be PS 2.0 based.

Looks like DX7 water. And as far as the floor goes, sure, it's nice with less flat-looking surfaces. But as usual, it also looks more like plastic.
 
I don't see a difference between the SM3.0 pics and what my 9700 Pro renders in Far Cry. But who am I to see any difference.

The SM2.0 mode in FC does not render water like a GF4MX, though.
 
Ardrid said:
So much for PS3.0 not making a difference.

So going from seeing shitty water to seeing no water is a difference???

And we know that the 6800U has Far Cry bugs, so why not compare the visuals between a 9800 XT on PS2.0 and a 6800U on PS3.0 and show that difference?

Don't be led by the nose, fellas.
 
Sorry I didn't elaborate, but I thought it was obvious the first shot of the pairs wasn't SM2.0. Heck, it looks too horrible to be programmable anything. Maybe it's a pool cover. :LOL:

Stryyder, there's water in the SM3.0 pic, it's just mostly translucent.

I, too, am somewhat confused by the Crytek dev's speech in EvilMofo.com's video. He mentions "SM2.0 and 3.0" a couple of times, so I'm not entirely sure the "3.0" screens are impossible to render with 2.0. They may take more passes or something, though.

I think those three weeks Crytek spent whipping up an SM3.0 mod would have been better spent fixing NV3x's IQ problems, though. :p
 
Pete said:
Sorry I didn't elaborate, but I thought it was obvious the first shot of the pairs wasn't SM2.0. Heck, it looks too horrible to be programmable anything. Maybe it's a pool cover. :LOL:

Stryyder, there's water in the SM3.0 pic, it's just mostly translucent.

I, too, am somewhat confused by the Crytek dev's speech in EvilMofo.com's video. He mentions "SM2.0 and 3.0" a couple of times, so I'm not entirely sure the "3.0" screens are impossible to render with 2.0. They may take more passes or something, though.

I think those three weeks Crytek spent whipping up an SM3.0 mod would have been better spent fixing NV3x's IQ problems, though. :p

Totally missed it because the second picture has such a low water level compared to the first!!

As to the IQ issues, who knows, maybe that's the way it's meant to be played!! ;)
 
I think longer shaders and subroutines/branches are a quality issue in the shaders and not a performance issue. It can be considered a performance issue if you need to get the desired effect with 2.0 shaders and the effect is very long; it might mean switching the shaders in hardware, which I would believe would take longer.

Since the shaders can be larger and more modular, this would mean less switching of shaders in and out of the system. With the larger amounts of memory coming on video cards nowadays (256 MB, and maybe with NV40, 512 MB), this means less overhead in the long run.

I am a programmer, but not for games and such, so I may be off on this... But logically thinking, it sounds like PS3.0 would be a lot better than 2.0. The only reason ATI states it's not is that they don't have it, it's an Nvidia thing, and they don't want developers moving to it.
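To illustrate the switching-overhead point, here's a toy sketch (not from any real engine, all names made up) of the usual trick: sort draw calls by shader so each shader is bound once per frame. The fewer, more general shaders you have, the fewer expensive bind calls you pay:

```c
#include <stdlib.h>

/* Hypothetical draw-call record: which shader program a mesh needs. */
typedef struct {
    int shader_id;   /* handle of the compiled shader */
    int mesh_id;     /* geometry to draw with it      */
} DrawCall;

static int by_shader(const void *a, const void *b)
{
    return ((const DrawCall *)a)->shader_id - ((const DrawCall *)b)->shader_id;
}

/* Bind each shader once and draw everything that uses it.  With
 * fewer, more general shaders, the number of bind_shader() calls --
 * the expensive hardware switches -- drops. */
void draw_frame(DrawCall *calls, size_t n,
                void (*bind_shader)(int), void (*draw)(int))
{
    qsort(calls, n, sizeof *calls, by_shader);
    int bound = -1;
    for (size_t i = 0; i < n; i++) {
        if (calls[i].shader_id != bound) {
            bind_shader(calls[i].shader_id);  /* hardware shader switch */
            bound = calls[i].shader_id;
        }
        draw(calls[i].mesh_id);
    }
}
```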
 
Stryyder said:
Ardrid said:
So much for PS3.0 not making a difference.

So going from seeing shitty water to seeing no water is a difference???

And we know that the 6800U has Far Cry bugs, so why not compare the visuals between a 9800 XT on PS2.0 and a 6800U on PS3.0 and show that difference?

Don't be led by the nose, fellas.

You must have missed my other post. I stated that I didn't believe the first pic was PS2.0.
 
Snarfy said:
farcry on my 9800xt:

http://www.azuretwilight.org/gallery/2004_04/FarCry0015

pictures say a thousand words,
ps3.0 is a gimmick, pure n simple

PS 3.0 is a set of features and, for each piece of hardware, the performance tradeoffs of implementing them and the possibilities that result. The issues with these screenshots do not establish anything about that relationship, and so do not make PS 3.0 a "gimmick" for the NV40.

What is a "gimmick" is the apparent "FarCry PS 3.0" presentation, and the attempt to foster the perception that a whole-number jump in pixel shader version by itself determines the significance of the difference between PS 3.0 and PS 2.0. That's what these screenshots prove.
 
Snarfy said:
farcry on my 9800xt:

http://www.azuretwilight.org/gallery/2004_04/FarCry0015

pictures say a thousand words,
ps3.0 is a gimmick, pure n simple
It's too early to write off PS 3. However, I doubt it'll have a huge impact in the near future. Based on the developer's comments in evilmofo's launch party Far Cry clips, it seems everything they've done (so far) can be done in both SM 2 and SM 3.

So why bother with ps 3?
  • One, nVidia might pay developers to do it or even have their own dev relations people write it.
  • Two, perhaps a given shader will be faster with 3.0 than with 2.0.
  • Three, perhaps some shaders can be more easily implemented with 3.0. For long-term projects, perhaps some effects will only be possible with SM 3.
  • Four, why not? HLSL should help make it easier to support both 2.0 and 3.0 (see the sketch below).
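On point four, a rough sketch of what "write once, target both" looks like with the D3DX9 compiler: the same HLSL source compiled against the ps_2_0 and ps_3_0 profiles. The file name and entry point are made up; I'm only assuming the stock D3DXCompileShaderFromFileA call:

```c
#include <d3dx9shader.h>

/* Compile one HLSL pixel shader for a given profile ("ps_2_0" or
 * "ps_3_0").  "water.psh" and psMain are hypothetical names. */
ID3DXBuffer *compile_for(const char *profile)
{
    ID3DXBuffer *code = NULL, *errors = NULL;
    HRESULT hr = D3DXCompileShaderFromFileA(
        "water.psh",  /* HLSL source file                        */
        NULL, NULL,   /* no #defines, no custom include handler  */
        "psMain",     /* entry-point function in the source      */
        profile,      /* target profile                          */
        0,            /* no compiler flags                       */
        &code, &errors, NULL);
    if (errors)
        errors->lpVtbl->Release(errors);
    if (FAILED(hr))
        return NULL;  /* e.g. too long or too branchy for ps_2_0 */
    return code;      /* bytecode for CreatePixelShader()        */
}
```

A renderer could try compile_for("ps_3_0") first and fall back to compile_for("ps_2_0") when the stricter instruction and branching limits get in the way.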
 
Ardrid said:
So does anyone know what the first image was? It seems far too bland to be PS2.0 based.

http://www.presence-pc.com/news/n3625.html

My translation of the author's comments on those screenshots (and remember not to shoot the messenger, since I am in no position to corroborate or refute his claims):

"But NVDA goes further, and demonstrates 4 screenshots from the game FarCry, the first set allegedly using pixel shader 2.0 versus 3.0 for the second. Obviously, if taken at face value, the difference is considerable and we would quickly conclude that this technology provides a significant improvement in image quality in a game that has already been released. Except that several remarks can be made when looking at these screenshots.

Indeed, the statue image labeled shader 2.0 seems to have been rendered without using any shader technology at all, and even without Normal Mapping. The output is therefore visibly not up to par with the capabilities of even the original GeForce.

The screenshot of the staircase using shader 2.0 seems to be of higher quality. From this shot, the rendering of the stones on the wall seems to use Parallax Mapping. This technique can be rendered via shaders 1.X (DirectX 8.X), but is ideally obtained with shader 2.0. The Displacement Mapping used on the left screenshot is only possible using shader 3.0, which would make this consistent. It is therefore more than probable that these shots are authentic, although we cannot state this with certainty. Note that in this case, the difference in quality compared to shader 3.0 is also smaller. It is difficult to investigate this issue further without a GeForce 6800 card to run FarCry."
 