NVIDIA: Beyond G80...

Maybe NV received some info about R600 performance and had to react "urgently"? ;)

Maybe they heard it's going to suck so bad that no-one is going to buy it... so they're anticipating a massive surge in demand for GF8800GT[SX] parts from all those who so far have been holding off waiting for X2800XTXXX ;)
 
It's probably something really simple, like wanting to have high volumes of lower-end cards available for the big Vista push. I'm sure OEMs would love to be able to tout those brand new entry-level desktops with DX10 support.
 
I didn't want to open a new thread for this, but here's a tidbit relayed (via vr-zone) by next-gen.biz:

Nvidia: Graphics Don't Matter? 'Nonsense'

In a recent Next-Gen interview, Nvidia’s Roy Taylor argues that graphics are key to eliciting emotion from gamers in the same manner as movies. And to those who say “graphics don’t matter,” this GPU guy says “nonsense.”

“They’re talking nonsense. It’s ridiculous to say that graphics don’t matter,” says Taylor, who is VP of content at 3D graphics firm Nvidia. “That’s like saying, ‘The quality of my TV screen doesn’t matter.’ Oh really? So then in that case, you can go watch 24 in black-and-white on a seven-inch screen,” he laughs.

But Taylor does concede that gameplay oftentimes takes a back seat to visuals. “It is a fair criticism to say that sometimes graphics have been applied, not at the expense of gameplay, but without equal consideration to it,” he says.

In particular, Taylor says that graphics will bring more emotion into games, arguing that gamers seldom become as emotionally attached to in-game characters as they would to characters in a movie.

Taylor adds that a lot of this in-game emotion will be conveyed through improved facial expressions and animations, and having developers focus gamers more closely on, for example, faces of enemies or a gamer’s character in a mirror.

“It’s nonsense to say graphics don’t matter,” Taylor reiterates. “I think we can do a lot better, but I think we’d like to see graphics encompassing the gameplay.”

http://www.next-gen.biz/index.php?option=com_content&task=view&id=4648&Itemid=2

I could infer this has some bearing on a future push towards powerful mid-range and high-end discrete GPUs for the mobile market, where the "graphics don't matter" attitude is most prominent.
Mobile users are starting to pay more attention to the graphical prowess of their notebooks than ever before, I believe.
 
If you take the original leakage seriously, the shader clock was supposedly limited to 1.5GHz scaling... which implies a core clock of roughly 635MHz. Take that with a grain of salt, of course...
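
For what it's worth, here's a rough back-of-envelope check of that figure: a minimal sketch assuming the shader domain keeps G80's stock shader:core ratio (1350MHz shader at 575MHz core on the 8800GTX); the 1.5GHz cap itself is just the rumoured number.

```python
# Back-of-envelope check: implied core clock if the rumoured 1.5GHz shader
# cap scales with G80's stock shader:core ratio (assumption, not confirmed).
stock_shader_mhz = 1350   # 8800 GTX shader clock
stock_core_mhz = 575      # 8800 GTX core clock
ratio = stock_shader_mhz / stock_core_mhz    # ~2.35

shader_cap_mhz = 1500     # rumoured scaling limit
implied_core_mhz = shader_cap_mhz / ratio
print(f"implied core clock: ~{implied_core_mhz:.0f} MHz")   # prints ~639 MHz
```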
 
Limited artificially or physically? Overclocked 90nm cards are already bumping up against 1.5GHz without issues.
 
I dunno that I'd say "physically". Maybe "by design" would be a better phrase. If you think back to some issues reported with 7800GTX512 and 7900GTX OC models, you'd get a better sense of what I mean by that, I think. Speculative on my part of course, but I seem to recall that appearing in the original VR-Zone G80 leak.

Edit:

Core clock scalable up to 1.5GHz

http://www.vr-zone.com/?i=4007
 
Maybe they heard it's going to suck so bad that no-one is going to buy it... so they're anticipating a massive surge in demand for GF8800GT[SX] parts from all those who so far have been holding off waiting for X2800XTXXX ;)

I wouldn't be surprised if there was at least some "Hmm, what do they think they're doing?" talk.
 
On the Inq article, some people have speculated that the "driver tweaks" that are supposed to have a key role in the 8900's speed boost could include making the MUL available for general shader instructions. I'd like to read others' thoughts on that notion.
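
For context on why that MUL keeps coming up, here's a rough peak-throughput comparison: a minimal sketch assuming the commonly quoted 8800GTX figures (128 scalar SPs at 1.35GHz), counting a MAD as 2 flops per SP per clock and a co-issued MAD+MUL as 3. Purely illustrative arithmetic, not a claim about what any driver actually exposes.

```python
# Illustrative peak-FLOPS comparison for G80 (assumed 8800 GTX figures:
# 128 scalar SPs at 1.35 GHz). A MAD counts as 2 flops per clock; if the
# extra MUL were usable for general shader instructions, that adds 1 more.
sps = 128
shader_ghz = 1.35

mad_only_gflops = sps * shader_ghz * 2      # ~345.6 GFLOPS
mad_plus_mul_gflops = sps * shader_ghz * 3  # ~518.4 GFLOPS

print(f"MAD only:  {mad_only_gflops:.1f} GFLOPS")
print(f"MAD + MUL: {mad_plus_mul_gflops:.1f} GFLOPS")
print(f"gain:      {mad_plus_mul_gflops / mad_only_gflops - 1:.0%}")  # +50% ALU peak
```

Real shader throughput obviously wouldn't scale anywhere near that cleanly, but it shows why the "missing MUL" is such a tempting explanation for a driver-side speed boost.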
 
On the Inq article, some people have speculated that the "driver tweaks" that are supposed to have a key role in the 8900's speed boost could include making the MUL available for general shader instructions. I'd like to read others' thoughts on that notion.

You're not suggesting that such a "driver tweak" would be exclusive to a possible future refresh only? I'd consider such an idea unlikely, but if so, insert an ugly compilation of curses on my behalf :D

That wouldn't be a "refresh" but a joke IMO, and it wouldn't be the best publicity for NV either; sooner or later the userbase will find out what is going on. Just like Transparency AA turning out not to be a G7x-exclusive feature after all, which was gently swept under the rug due to a lack of protests from the userbase.
 
Stupid guessing on my part, but it could just as well be that the MUL ain't working right in G80 due to some sort of failure, and will work properly in the tweaked version.
 
Stupid guessing on my part, but it could just as well be that the MUL ain't working right in G80 due to some sort of failure, and will work properly in the tweaked version.

I think they'll use it exclusively to output 1080i in SLI correctly.

But I agree with you, I don't expect anything drastic either. It will be G80 done right, which makes you wonder why nVidia had to launch last year instead of waiting another spin and coming out with a 100% functional unit.
 