Disappointed With Industry

"Slouching towards Gomorrah"? That was always the nightmare scenario, but I'm not there yet. I think a cigar is just a cigar on this one.

I agree, which is why I couched my comment with a "could possibly" wording. I certainly don't want to see that happen either. And I think we'll see an effort to regain the performance crown with a 65nm 29xx.
 
And I think we'll see an effort to regain the performance crown with a 65nm 29xx.

Or maybe not, especially if ATI is late again. I found this quote:

"NVIDIA says its G92 high-end graphics card will deliver almost a teraflop of computing performance. In an analyst webcast, Nvidian Michael Hara says that the chip will be ready for Christmas, a release cycle the company adopted with G80, where high-end products come out for Chrimbo and the mid-range and low-end products hit in the spring. The actual power of G92 might surprise some. The 8800 can rustle up about 330Gflops, which means the green team is suggesting that the 9800 could be three times more powerful!"
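For what it's worth, the "three times more powerful" figure in that quote is just the ratio of the two throughput numbers; a quick back-of-the-envelope check (the Gflops values are the ones quoted above, not measured):

```python
# Sanity check of the quoted claim (figures come from the quote above).
g80_gflops = 330    # quoted throughput for the 8800 (G80)
g92_gflops = 1000   # "almost a teraflop" claimed for G92

speedup = g92_gflops / g80_gflops
print(f"G92 would be roughly {speedup:.1f}x the G80")  # prints "G92 would be roughly 3.0x the G80"
```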
 
Well, there's a difference between making an effort and succeeding at it. :cool:

For ATI/AMD this tends to be a significantly larger difference. :cry: It tends to be a significantly smaller difference for Nvidia/Intel. :cool:
 
My last blog entry talks about my borrowing a co-worker's spouse's 8800 GTS and my initial, very positive (albeit limited) impressions of the board after a few hours spent with it playing LOTRO. He wants $300 for it and I'm tempted, though I'd much rather have a GTX.
 
Is that a 320MB or 640MB GTS you were running? The cheapest (after rebate) 640MB GTS on NewEgg is $320. The cheapest 320MB GTS on NewEgg is $270.
 
It's a 640MB board. Not sure how the 320 versions would handle me running at 2560x1600 with 8x AA, especially with the highest texture settings. I think he paid $300 for it months ago and without a PCIe mobo it's been collecting dust at his house.
 
That's a good price for a 640MB board. It would be another $220 ($520 total) to step up to a GTX. Not sure if that makes the decision any easier for you. Having only a GTS would also make it easier to justify an upgrade to the G92 this winter. :LOL:
 
Re - Power and Heat.

Isn't that almost always the case with new generations of hardware, though? Especially in the graphics arena.

After all, we went from graphics cards with no heatsinks, to cards with heatsinks, to cards with heatsinks and fans, to cards with dual-slot coolers.

The power envelope required to support new features has always grown.

The 7800 GTX to 7900 GT isn't a new architecture and even then it's more an anomaly than a truism.

Sure, there was no "need" for DX10 and its additional transistor budget. However, there was also no "need" for DX9, DX8, etc.

So yes, we could stagnate the industry and declare that from this point forward we no longer need innovation or new features. Personally, though, I like to see new features and what developers can do with them.

From what I can see, the features of current DX10 don't bring immediate bling-bling to games. It appears to have been designed to let you do what you could in the past more efficiently, allowing more work to be done in the same amount of time. Which, in theory, should lead to better visual quality at roughly the same framerates.

As for why there are no DX10 titles yet, and whether that means the industry is purposely holding back on DX10: I just don't see it. So far it seems to be mirroring the deployment of DX9 and DX8 titles relative to DX9 and DX8 hardware deployment.

I really don't see the adoption of Vista as a major roadblock, especially considering that virtually all machines sold in the next year, leading up to a greater inflection of DX10 titles, will have Vista pre-installed.

Vista also does a few things, albeit clumsily at the moment, to try to inform the casual computer user of whether or not their machine is capable of running a "Vista rated" game. Assuming MS doesn't abandon the idea, the new Vista performance rating is at least a step toward letting casual users find out which lowest-performing part of their system might prevent a game from running well, or at least acceptably.

And going forward, Vista has the potential to be a much better gaming platform, as it appears (from my limited experience on three machines: a laptop, a gaming rig, and a two-year-old gaming rig) to use resources much better than XP did, with the benefit of being significantly more stable when gaming.

I think the perceived slowness in adoption is more a result of so many changes being introduced in a short period of time.

I don't think there's any fault to assign, as it appears to be a repeat of past DX introductions. Now, if two years from now there's still only a small handful of DX10 titles, then I'd be willing to investigate where the blame lies, but honestly, right now I just don't see it.

Regards,
SB
 
It's a 640MB board. Not sure how the 320 versions would handle me running at 2560x1600 with 8x AA, especially with the highest texture settings. I think he paid $300 for it months ago and without a PCIe mobo it's been collecting dust at his house.
The best price I could find for a 640MB GTS is $370, so that doesn't sound like a bad deal if you like the card.

Offer him $250. :devilish:
 
Yeah, those MIRs are a trap; I've forgotten to submit quite a few. But I just picked up a 640MB Evga GTS from buy.com for $295 AR. I was hoping not to have to upgrade until DX10 round 2, but my inability to enable triple buffering in Prey at widescreen resolutions on my XT just about sealed the deal, since that's the only game I'm playing currently.

PS: If you're in the market for an X1900XT-512 with a pre-attached Zalman VF-900CU, let me know.
 
The 7800 GTX to 7900 GT isn't a new architecture and even then it's more an anomaly than a truism.

I don't see it that way. The 7800 GTX to 7900 GT situation has applied in the past and likely will in the future. We saw it in the FX era too, as well as the R300 era (9500 Pro to 9600 XT). (I'd have to take a closer look at X800 Pro power consumption compared to the 9800 XT, as that wasn't something discussed much back then.)

These are specific instances where the technology remained the same and benefited from the move to a smaller process. And I'm sure that with the 8800 GTS/8800 GTX we'll see something similar in regards to power consumption once the process changes.


Sure, there was no "need" for DX10 and its additional transistor budget. However, there was also no "need" for DX9, DX8, etc.

So yes, we could stagnate the industry and declare that from this point forward we no longer need innovation or new features. Personally, though, I like to see new features and what developers can do with them.

There's a definite "need" for DX10, DX9, and DX8 adoption. These adoptions allow devs to move forward. How useful some of these features are is hard to judge. In the past we have seen some companies use hacks/tricks/workarounds to implement features they didn't see as useful, but they weren't allowed that luxury this time. I still believe higher performance per watt could have been achieved on a DX9 setup, but that's simply because DirectX 9.0 has been around for so long.
 
I don't see it that way. The 7800 GTX to 7900 GT situation has applied in the past and likely will in the future. We saw it in the FX era too, as well as the R300 era (9500 Pro to 9600 XT). (I'd have to take a closer look at X800 Pro power consumption compared to the 9800 XT, as that wasn't something discussed much back then.)

The 9500 Pro to 9600 XT isn't a very good comparison, as performance on the 9600 XT often wasn't better than on the 9500 Pro.

One thing I'm wondering about is Nvidia's newly announced product cycle. It sounds like they are going to a one-year cycle. Do you know if that's one year between new products, or a one-year cycle alternating new products with refreshes?

It'd be odd if they were going with a refresh one year after the original product, meaning two years between new products.

However, if it's a one-year cycle for new products, having a refresh occur after six months is rather ambitious, to say the least.

That is: new product intro, refresh six months later, new product intro six months after that, refresh six months after that.

I think that would do more to kill ATI than anything else. But I have doubts that even Nvidia could pull that off.

Or are they dropping the whole idea of a "refresh" part? Figured you might have a better insight into this.

Regards,
SB
 
For the most part they were equal, and I think that's the point I'm getting at: you can achieve similar performance at lower wattage, but it usually comes with the second generation of products.
 