Post your breaking NV30 news links here!

Essentially all modes of antialiasing are available at all resolutions without any performance hit. (In Quake 2 and Starcraft)

If essentially all antialiasing modes are essentially always available for essentially no performance hit, then how come several of the screenshots of the oh-so-impressive CineFX tech demos show rather blatant jaggies?
 
Joe DeFuria said:
I'm surprised that Carmack "approved" (if he did) of nVidia giving out Doom3 benchmarks...it's not like Carmack to do such a thing. Until the game is close to release, all he usually ever does is give vague references to relative performance.

Not that odd at all really. id used to release pretty comprehensive performance metrics for games that were still under development. Detailed 3D chip comparisons were released for both Quake 2 and Quake 3 well before the actual games shipped.
 
An extremely nice piece of technology, but considering it'll hit the market almost 8 months after the R300, is the GeforceFX really all that impressive? I really expected something a little more.....spectacular.

agreed. totally.

that's why I'll take the plunge into the CineFX architecture with the refresh of GeForceFX - hopefully that will have a 256-bit bus (8x 32-bit or 4x 64-bit controllers), 2 texture units per pipeline, more vertex engines, etc.

I was expecting more, and I was expecting it sooner.


I see how desperate Nvidia was to catch ATI... Nvidia had to clock GeForceFX to near 500 MHz and use 1 GHz effective (500 MHz) DDR-II.

everyone expected the 128-bit bus (4x 32-bit controllers)

but the single TMU per pipe took me by surprise :cry:
 
I don't know... If Anand is to be believed, it seems that the specs have not changed since he was made aware of them way back in March. I really don't know if it's accurate to say that nVidia "had to" clock it to 500 MHz to be competitive... Sounds like 500 MHz was the target all along, the 9700 aside.
 
BoddoZerg said:
Essentially all modes of antialiasing are available at all resolutions without any performance hit. (In Quake 2 and Starcraft)

If essentially all antialiasing modes are essentially always available for essentially no performance hit, then how come several of the screenshots of the oh-so-impressive CineFX tech demos show rather blatant jaggies?

More importantly, if there was no performance hit for AA, then why would you offer any of the lower quality modes? Maybe for those people who want lower quality AA at the same performance as high quality AA?

:eek:
 
I really don't know if it's accurate to say that nVidia "had to" clock it to 500 MHz to be competitive... Sounds like 500 MHz was the target all along, the 9700 aside.

The 500 MHz may have been the target all along, but I doubt that cooling solution was. In other words, if not for the 9700, and the Ti 4600 were the current best on the market, I'm willing to bet the clock speed would be low enough not to require "2 slot" cooling.

On a related note, Anand had also said "since March" that the NV30 clearly beats the R300 on paper.

I don't know how one could reach that conclusion... they both use similar bandwidth-saving techniques; the NV30 has a higher pixel rate, but the R300 has more bandwidth.
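For a rough sanity check of that tradeoff, here's the back-of-the-envelope arithmetic (assuming the announced NV30 figures of a 128-bit bus with 500 MHz DDR-II, and the 9700 Pro's 256-bit bus with 310 MHz DDR and 325 MHz core; treat the exact clocks as approximations):

```python
def bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr_factor=2):
    """Peak memory bandwidth in GB/s: bus width times effective data rate."""
    bytes_per_transfer = bus_bits // 8
    transfers_per_sec = mem_clock_mhz * 1e6 * ddr_factor
    return bytes_per_transfer * transfers_per_sec / 1e9

# NV30 (GeForce FX): 128-bit bus, 500 MHz DDR-II (1 GHz effective)
nv30_bw = bandwidth_gb_s(128, 500)   # 16.0 GB/s
# R300 (Radeon 9700 Pro): 256-bit bus, ~310 MHz DDR
r300_bw = bandwidth_gb_s(256, 310)   # ~19.8 GB/s

# Peak pixel fill rate: 8 pipes times the core clock
nv30_fill = 8 * 500e6 / 1e9   # 4.0 Gpixels/s
r300_fill = 8 * 325e6 / 1e9   # 2.6 Gpixels/s
```

So on raw numbers the NV30 wins on fill rate and the R300 wins on bandwidth, which is why "clearly beats on paper" is hard to justify either way before the compression schemes are factored in.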
 
Joe DeFuria said:
I don't know how one could reach that conclusion... they both use similar bandwidth-saving techniques; the NV30 has a higher pixel rate, but the R300 has more bandwidth.

Anand appears to have a blind spot about ATI using compression with AA, just the same as NVIDIA (although that also works without AA).
 
andypski said:
BoddoZerg said:
Essentially all modes of antialiasing are available at all resolutions without any performance hit. (In Quake 2 and Starcraft)

If essentially all antialiasing modes are essentially always available for essentially no performance hit, then how come several of the screenshots of the oh-so-impressive CineFX tech demos show rather blatant jaggies?

More importantly, if there was no performance hit for AA, then why would you offer any of the lower quality modes? Maybe for those people who want lower quality AA at the same performance as high quality AA?

:eek:

How dare you insult those of us who love our jaggies? Jaggies have rights too, you can't just squash them with high performance antialiasing! SAVE THE JAGGIES FROM GEORGE W BUSH! ANTIALIASING IS VIOLENCE AGAINST JAGGIES! REMEMBER - JAGGIES ARE HUMAN JUST LIKE YOU. DOWN WITH THE IMPERIALIST ANTIALIASING PIGS!
 
Maybe I'm blind, but I didn't find anything about gamma-corrected FSAA in the official Intellisample PDF - do you think they accidentally forgot to include it?

Here is that paragraph:

"New Antialiasing Modes
The NVIDIA GeForce FX GPUs support a new 6XS mode under Direct3D and new 8X modes for both OGL and Direct3D. These modes, either enabled in the latest Microsoft® DirectX® titles, or available through the control panel settings, provide a higher level of quality than 4X or 4XS antialiasing. By calculating 1.5X as many samples as 4X AA, 6XS takes image quality higher than any 4-sample solution can. Additionally, the new 8X modes provide the highest image quality by calculating twice the number of samples as 4X modes calculate. These new 8X modes are clearly the choice for top antialiasing quality. All of these choices empower PC users to fine-tune their display settings to fit their applications and style of computing: as a result, they get fluid frame rates for intense gaming action and great image quality too."


EDIT: http://www.nvidia.com/docs/lo/2415/SUPP/TB-00651-001_v01_Intellisample_110402.pdf
 
Yay, my hunch of 500 MHz was right :) but I didn't expect that gigantic cooling device.

And a release date of Feb 2003? And the performance increase over the R300 is only so-so; this is all rather disappointing.
 
By calculating 1.5X as many samples as 4X AA, 6XS takes image quality higher than any 4-sample solution can.

This line makes it sound like their 6XS AA doesn't beat the R9700's 6x AA and has to be compared to the 4x mode instead.
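Taking the PDF's own multipliers at face value (an assumption - NVIDIA doesn't spell out the raw sample counts in that paragraph), the arithmetic works out like this:

```python
base = 4                   # samples per pixel in 4X AA
six_xs = int(base * 1.5)   # "1.5X as many samples as 4X AA" -> 6
eight_x = base * 2         # "twice the number of samples as 4X" -> 8

# By sample count alone, 6XS matches the 9700's 6x mode, yet the PDF
# only claims superiority over "any 4-sample solution" - which is
# exactly the hedge being pointed out above.
```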
 
malcolm said:
Laa-Yosh said:
I'm sure that sooner or later you will... but right now, you'll have to trust me on that. It's 128 bit and not 256 ;)

After reading that it is only 25-50 percent faster than the Radeon 9700, I'm glad it's a 128-bit bus...

So what does this really mean then 5-10% faster? 50% faster my ass... But anyway the card isn't coming out until February so it will be competing with the Radeon 10,000. Nvidia really screwed up this launch. December was bad enough, February is just insanely horrible. I remember someone at the R9700 launch predicting the NV30 would be out in February (I think it was Noko) and everyone just said he was being a fanboi (and he probably was), but now it looks like his prediction was 100% accurate.
 
Nagorak said:
I remember someone at the R9700 launch predicting the NV30 would be out in February (I think it was Noko) and everyone just said he was being a fanboi (and he probably was), but now it looks like his prediction was 100% accurate.

Dammit, that was me... more of a lucky/educated guess for a time frame though... then I became unsure as a result of various other posters who claimed the NV30 had already had its first "tape out". Bah....
 
Humus said:
By calculating 1.5X as many samples as 4X AA, 6XS takes image quality higher than any 4-sample solution can.

This line makes it sound like their 6XS AA doesn't beat the R9700 6xAA but has to be compared to the 4x mode.

Maybe they are only comparing it to the GF4 Ti there, kinda like saying "look how much better it is than our last GPU", etc...
 
Sabastian said:
Well, aside from being a huge card packed with DDR-II and nearly twice as many capacitors, etc... the Radeon 9700 Pro, while having a 10-layer PCB, seems to be considerably more compact. I am not saying that the NV30 is necessarily more expensive to produce, but it certainly appears to be a physically larger card.

card_front.jpg

That's been the case for the last 3 years. Nvidia makes monstrous cards while ATi makes much more compact ones. Also X-Box vs. GameCube, although that's also down to Nintendo vs. MicroSuck.
 
Sabastian said:
Nagorak said:
I remember someone at the R9700 launch predicting the NV30 would be out in February (I think it was Noko) and everyone just said he was being a fanboi (and he probably was), but now it looks like his prediction was 100% accurate.

Dammit, that was me... more of a lucky/educated guess for a time frame though... then I became unsure as a result of various other posters who claimed the NV30 had already had its first "tape out". Bah....

Sorry, I knew it was someone. ;)
 
Personally I like small, compact, cards that pack a lot of power...

not a big huge honkin thing that takes up half the inside of your case :p

bigger isn't always better :D Precision and Elegance are important too
 
Sabastian said:
The Radeon 9700 pro while having a 10 layer PCB seems to be considerably more compact.

FYI, just to clear that up - talking with Brian Skelton from Sapphire (the company that manufactures most of the boards for ATI and other vendors), he stated that the production R300 boards are 8-layer boards, not 10.
 