NV30 Update

Chalnoth said:
Just because somebody made an ass out of themselves by releasing totally wrong clock speeds on older parts, nVidia won't be able to get a decent clock speed out of the NV30? That makes no sense at all.

Um, no.
Your interpretation is flawed.
Look: In each "Gen" there are multiple parts, right?
And in the document referred to above, each "gen" has a labeled "speed"=MHz.
His point is NOT that the clock speeds are WRONG for each generation (because they AREN'T), but merely that the FIRST parts released in each generation have substantially lower clock speeds than the one listed for that particular generation. The later parts of that generation have fulfilled the MHz listed.
This leads to the idea that MAYBE the NV30 (first in its generation) MIGHT NOT have the high clock speed listed for its generation - i.e., that like all the other generations, that clock speed is met by later parts (refreshes) in that generation.
Your failure to even bother to try and understand something that implicates "your" company unfavorably is quite irritating.
You aren't stupid - why the huge blind spot, then?
 
Those clock speeds are still very wrong, Althornin, whether related to a generation or not.

Examples:
Gen1=125MHz: In this generation we had core clocks of 90MHz, 125MHz, 150MHz, 175MHz, and even 195MHz (though 150MHz was the top "official" from nVidia...).
Gen2=250MHz: In this generation the max was 250MHz.
Gen3=350MHz: If this includes the GF4, then the max here was 300MHz.

In other words, this makes no sense at all! The numbers are just way off. I think it's far more accurate to conclude that this person just doesn't know what he/she's talking about, and thus this means nothing.

Side note, can somebody post an exact link to the pdf? I didn't quite get where this thing was...
 
Chalnoth said:
mboeller said:
I'm not sure myself if the NV30 will be faster than the R300. A new presentation (26_2slides.pdf) showed the NV30/Gen4 as running at 450MHz, but the same presentation showed Gen1/TNT at 125MHz (but we know only the TNT2 was that fast); Gen2/GF at 250MHz (but the GF1 was only 120MHz); and finally Gen3/GF3 at 350MHz (but the GF3 was only 200MHz, and maybe the NV28 will reach 350MHz). So it seems highly unlikely that the NV30 will reach more than 350MHz.

Just because somebody made an ass out of themselves by releasing totally wrong clock speeds on older parts, nVidia won't be able to get a decent clock speed out of the NV30? That makes no sense at all.


WOW, Nvidia does something wrong! And that statement from you! :D
Sorry, I have no direct link, but the link was given in this forum, and by the way the PDF is from Nvidia :D
 
Nagorak said:
woolfe99 said:
This is not a very objective observation. Each one of those ATI cards that you are comparing to an Nvidia card came out afterwards. Then you conveniently overlook the flipside of the coin. Here's the other way of writing history:

Geforce1 v. Rage Fury Pro

GF1 has T&L (later FSAA via driver hack)

Geforce2 GTS v. Rage Fury Maxx

Geforce2 has T&L, dot3 bump mapping, FSAA

Geforce3 v. Radeon 64 meg Vivo

GF3 has pixel and vertex shaders

Nvidia: the first 32 bit card (TNT), the first T&L, the first pixel and vertex shaders, the first FSAA, the first AF.

Gee look, Nvidia has always produced more "advanced" cards than ATI. :rolleyes:

The only point in time where your observation is defensible is with the GF4 vs. the 8500, since the GF4 was really just a speed bump as compared to the GF3. There you have a bona fide situation where a card that was released later (the GF4) has fewer features than a card released earlier (the 8500). Compensating for that is a fairly sizeable performance delta however.

Let's not get TOO fanATIcal here...

Ok so by your way of doing things we can compare R9700 vs GF4 Ti4600 and say it's more advanced (even after NV30 is released)? He's comparing the same generations of cards, which does somewhat make sense.

The R9700 release more or less invalidates your point (although I realize you were just using the listing as an example). The point is the Radeon was ATi's answer to the Geforce 2. The Rage Fury was made to combat the TNT. Yes, they were late, but that doesn't mean they were not meant to compete with them. And that all changed (fortunately) with the R9700... more competition is good. :)

But now that the R9700 is out and ATi's had its day in the sun, I really wish the NV30 was released. NV30 just taping out now is just really bad news for us consumers. :(

My point, I thought, was obvious. Doomtrooper in every instance was comparing an ATI card to an Nvidia card that was released BEFORE. It doesn't matter what was intended to be an "answer" to what. Doomtrooper says ATI has more "advanced" cards, and he uses as one of his examples the fact that the 8500 had PS1.4 and Truform where the GF3 did not. That's logically absurd, because the 8500 came to market 6 months later. The same is true for Radeon 1 vs. Geforce 2. Yes, it had more features, but it came out 5 months later. It is axiomatic that a card released later will have more features. ATI cannot fairly benefit from the perception that its cards are more "advanced" than Nvidia's simply because it has been on a different (and later) time schedule for several years now. What was odd is that in spite of later releases ATI could never really outpace Nvidia's performance until just now. Doom also listed examples of ATI's "firsts" but totally ignored all of Nvidia's "firsts." Anyone who can only see one company's contributions and not the other's is not really looking at things in a particularly impartial way.

That said, ATI has done a great job with its current generation and deserves all the kudos it gets.
 
That's logically absurd, because the 8500 came to market 6 months later. The same is true for Radeon 1 vs. Geforce 2. Yes, it had more features, but it came out 5 months later. It is axiomatic that a card released later will have more features. ATI cannot fairly benefit from the perception that its cards are more "advanced" than Nvidia's simply because it has been on a different (and later) time schedule for several years now.

The GF4 was approx 5/6 months after the 8500. Yet the 8500 still had more advanced features vs. the GF4...
 
jb said:
The GF4 was approx 5/6 months after the 8500. Yet the 8500 still had more advanced features vs. the GF4...
Precisely what I was thinking. Yes, ATi parts that came out later have been more advanced, but this isn't always the case with NVIDIA parts. That is debatable, however, because the GF4 does do things the 8500 can't (e.g. MSAA, trilinear + AF), so it might just be a wash.
 
woolfe99 said:
My point I thought was obvious. Doomtrooper in every instance was comparing an ATI card to an Nvidia card that was released BEFORE. It doesn't matter what was intended to be an "answer" to what. Doomtrooper says ATI has more "advanced" cards, and he uses as one of his examples the fact that the 8500 had PS1.4 and truform where the GF3 did not. That's logically absurd, because the 8500 came to market 6 months later.


Let me help you with some History here:

Radeon 1 was released at the same time as the GTS; following that, Nvidia released the Geforce 2 Pro, Geforce 2 Ultra, then the Geforce 3.

The Radeon 8500 is released, and Nvidia releases the Geforce 3 Ti and Geforce 4 Ti...

How the hell were the Geforce 2 ULTRA, Geforce 3 Titanium, and Geforce 4 Titanium released before the ATI cards, when these cards came out 6-9 months after the ATI offerings?

 
I could very easily dig out a ton of threads on the topic of Unreal Tournament 2003...and how the 8500 was going to kick the living crap out of the nVidia boards (GF3/4) simply because of the supposed "higher tech" Pixel Shader 1.4 support...

Then, when the numbers were finally analyzed, it turned into another nut-roll...It was like, "Ah crap...darn...there goes that whole argument...let's try another angle!"

Anyhow, I don't ever equate things like higher Pixel Shader versions to more features... at least, not all by themselves. If there's some data to support, say, PS 1.4 totally mopping the floor with a PS 1.3 product, then we're talking... or a massive visual quality difference. Something.

Man...I really cannot wait to do some surround gaming. There isn't a feature, IMHO, that will come close to _really_ changing the way you game than this one...
 
I could very easily dig out a ton of threads on the topic of Unreal Tournament 2003...and how the 8500 was going to kick the living crap out of the nVidia boards (GF3/4) simply because of the supposed "higher tech" Pixel Shader 1.4 support...

Does UT2003 even make use of PS1.4? My understanding was that it was mainly a DX7 game.
 
Side note, can somebody post an exact link to the pdf? I didn't quite get where this thing was...

Here's the PDF (from *ahem*, nVidia)

http://videos.dac.com/videos/39th/26/26_2/26_2slides.pdf

See Page 7.

No matter how you look at the data, one thing is clear:

In not one instance does the clock speed represent the debut clock speed of a generation. It's a refresh. So perhaps the 350 MHz "Generation 3" is a clue as to what the NV28 is... it will be interesting to see.

This does give good reason to doubt that the first-gen NV30 is targeted at 450 MHz, as has often been speculated.
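
Just to make that pattern explicit, here's a rough back-of-the-envelope sketch in Python, using the debut clocks quoted earlier in this thread (the figures are the thread's, not independently verified). The point is only that debut parts land well below the generation figure, not that this predicts an exact NV30 clock.

Code:
# Rough sketch: in each generation, the PDF's clock matches the refresh part,
# while the debut part shipped well below it. Numbers are those quoted in
# this thread, not independently verified.
generations = {
    "Gen1 (TNT line)":     {"listed": 125, "debut": 90},   # TNT debuted around 90MHz
    "Gen2 (GeForce line)": {"listed": 250, "debut": 120},  # GF1 debuted at 120MHz
    "Gen3 (GF3 line)":     {"listed": 350, "debut": 200},  # GF3 debuted at 200MHz
}

ratios = []
for gen, clocks in generations.items():
    ratio = clocks["debut"] / clocks["listed"]
    ratios.append(ratio)
    print(f'{gen}: debut {clocks["debut"]}MHz vs listed {clocks["listed"]}MHz ({ratio:.0%})')

average = sum(ratios) / len(ratios)
print(f"Average debut/listed ratio: {average:.0%}")
print(f"Applied to the 450MHz 'Gen4' figure: roughly {450 * average:.0f}MHz at debut")

Crude, obviously, but it lines up with the "debut clock well under the listed figure" reading of page 7.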
 
Does UT2003 even make use of PS1.4? My understanding was that it was mainly a DX7 game.

I agree. I don't recall anyone speculating that the Radeon 8500 would outdo the GeForce3/4 in UT because of better pixel shaders, for exactly that reason.

That being said, I believe the "first" UT Tech demo benchmarks showed 8500 significantly ahead of the GeForce (which is when people started fishing for reasons why). Curiously, the very next "release" of the demo levelled the playing field between 8500 and Geforce3/4.
 
Chalnoth said:
Just because somebody made an ass out of themselves by releasing totally wrong clock speeds on older parts, nVidia won't be able to get a decent clock speed out of the NV30? That makes no sense at all.

It does show, however, that Nvidia has a tendency to overestimate clock speeds. With all the problems, I am skeptical of them hitting 450 MHz. 400 MHz seems more likely, or even 350. I kind of suspect they'll make sure to release it at a fast enough speed to ensure it beats the R300, though. Their reputation is on the line.

woolfe99 said:
Doom also listed examples of ATI's "firsts" but totally ignored all of Nvidia's "firsts." Anyone who can only see one companies contributions and not the other is not really looking at things in a particularly impartial way.

I see your point...I was more or less just nitpicking.

Anyway, I don't think anyone would accuse Doomtrooper of being "impartial" or "unbiased", so I wouldn't worry too much about it. ;)

Typedef Enum said:
I could very easily dig out a ton of threads on the topic of Unreal Tournament 2003...and how the 8500 was going to kick the living crap out of the nVidia boards (GF3/4) simply because of the supposed "higher tech" Pixel Shader 1.4 support...

Then, when the numbers were finally analyzed, it turned into another nut-roll...It was like, "Ah crap...darn...there goes that whole argument...let's try another angle!"

Anyhow, I don't ever equate things like higher Pixel Shader versions to more features...at least, not all by themselves. If there's some data to support, say, PS 1.4 totally mopping the floor over a PS 1.3 product, then we're talking...or, a massive visual quality difference. Something.

Man...I really cannot wait to do some surround gaming. There isn't a feature, IMHO, that will come close to _really_ changing the way you game than this one...

Does UT2003 even support PS 1.4? Because if not, it'd be kind of hard for it to be any kind of advantage whatsoever. I think Epic may have optimized the game specifically for the GF4 (and not even all Nvidia cards). Look at the comparatively lackluster performance of the GF3 Ti500 and Radeon 8500. As much as I think they make pretty good games, it wouldn't be the first time Epic did something dumb. ;)
 
I also consider Truform an advancement in visual quality, much the same as FSAA, and of course n-patches are needed for displacement mapping, which was done in hardware on the 8500.

I also remember a certain individual taking some of my Serious Sam shots and making a mockery of them on the forums, yet no one can deny the improvement.

[Serious Sam screenshot without Truform: ssam-sans.jpg]

[Serious Sam screenshot with Truform: ssam-avec.jpg]
 
jb said:
The GF4 was approx 5/6 months after the 8500. Yet the 8500 still had more advanced features vs. the GF4...

The GeForce4 was also not a new architecture. It was just a refresh. Yes, nVidia did improve the architecture more than for the usual refresh, but it still was just a refresh.
 
Chalnoth said:
The GeForce4 was also not a new architecture. It was just a refresh. Yes, nVidia did improve the architecture more than for the usual refresh, but it still was just a refresh.
I don't consider the NV25 a "refresh."

I consider the GF256 DDR (was that NV11?), NV16, NV20-Ti, and NV28 to be "refresh" parts, as all but the GF256 DDR fell on approximately 6-month cycles after the "new architecture" was released (the GF DDR was more like 4 months).

NVIDIA's naming scheme seems consistent: first number denotes DX generation/major core family, second number denotes position in that family. x0 and x5 parts are architecturally different from the previous cores, and subsequent numbers are typically core/mem speed bumped parts (or perhaps AGP 8X added).

NV10, NV15, NV20, NV25, NV30... these are architecturally different, even if only minor changes (like NV20 to NV25). NV11, NV16, NV20-Ti, NV28... these are for the most part identical to previous core, only with faster speeds, DDR memory controller, or updated AGP interface.

Just my opinion of course, but I just don't consider the GF4 to be the "refresh" that most people think of when talking about NV's 12-month cycle with a 6-month refresh. The NV20 to NV25 was a year, and the NV20-Ti was the refresh (and the NV28 will be the GF4's refresh).
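
For what it's worth, here's a minimal sketch of that codename rule as I read it (my own illustration, not an official NVIDIA scheme): first digit = core family, second digit of 0 or 5 = new architecture, anything else = refresh.

Code:
# Minimal sketch of the codename rule described above (my reading of the post,
# not an official NVIDIA scheme).
def classify(codename):
    digits = codename[2:]             # e.g. "NV25" -> "25"
    family, position = digits[0], digits[1]
    kind = "new architecture" if position in ("0", "5") else "refresh"
    return f"{codename}: family {family}, {kind}"

for part in ("NV10", "NV11", "NV15", "NV16", "NV20", "NV25", "NV28", "NV30"):
    print(classify(part))

By that rule the NV28 falls out as the GF4's refresh and the NV30 as a new architecture, which is exactly the reading above.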
 
Those shots were mocked for a very simple and obvious reason...they looked ridiculous, and deserved to be mocked. It's that simple.

When a feature ends up making players look like they're constructed from a bunch of balloons, it actually has the effect of making a mockery out of both the game, as well as the hardware...

Surely you can't sit there (though I'm sure you can/will/would) and deny that those screenshots looked ridiculous.
 
I agree. I don't recall anyone speculating that Radeon 8500 would out-do GeForce3/4 on UT because of better pixel shaders, for that reason.

Joe,

I quite honestly don't have the fortitude to go grab a bunch of quotes (and believe me, I'm not talking about 1 or 2) that insisted that Pixel Shader 1.4 support was going to separate the men from the boys, as it related to UT2003...I think you can pretty well figure out where they likely came from and where you could possibly go to dig them up.
 
Saem said:
I could very easily dig out a ton of threads on the topic of Unreal Tournament 2003...and how the 8500 was going to kick the living crap out of the nVidia boards (GF3/4) simply because of the supposed "higher tech" Pixel Shader 1.4 support...

Does UT2003 even make use of PS1.4? My understanding was that it was mainly a DX7 game.
Yes, it does use pixel shaders (no vertex shaders, mind). It falls back to DX7 texture operations if applicable, but pixel shaders are in. Mostly for speed-up purposes, mind--the game won't look any different on a DX8/9 board when compared to a DX7 VGA.

I, too, can't remember discussions about how the Radeon 8500 would benefit hugely in UT2003, speed-wise, from PS1.4 support compared to the GF3's earlier PS versions.

ta,
-Sascha.rb
 
Hi Type,
Typedef Enum said:
I quite honestly don't have the fortitude to go grab a bunch of quotes (and believe me, I'm not talking about 1 or 2) that insisted that Pixel Shader 1.4 support was going to separate the men from the boys, as it related to UT2003...I think you can pretty well figure out where they likely came from and where you could possibly go to dig them up.
Aaah, ok. I thought you were talking about the B3D forums. ;)

ta,
-Sascha.rb
 
Typedef Enum said:
I quite honestly don't have the fortitude to go grab a bunch of quotes (and believe me, I'm not talking about 1 or 2) that insisted that Pixel Shader 1.4 support was going to separate the men from the boys, as it related to UT2003...I think you can pretty well figure out where they likely came from and where you could possibly go to dig them up.
There are also quotes from you insisting that the GF3 Ti series would "eat the 8500 for lunch" due to the obvious architectural advantages it would have over the GF3. The point? When speculating, everyone makes a fool of themselves, present company included.
 