The Graphical Divide Between High End and Mainstream

I agree that we need a mainstream card that supports DX8 shaders. As long as it supports pixel shaders in hardware, vertex shaders can be emulated.
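For what it's worth, Direct3D 8 supports exactly that split: if the device is created with software vertex processing, the runtime runs vertex shaders on the CPU while pixel shaders still run on the card. A minimal sketch, assuming a d3d8.h build environment and an existing window handle hWnd (both illustrative, not from anyone's actual code):

<code>
// Minimal sketch: create a D3D8 device that emulates vertex shaders on the
// CPU while pixel shaders run in hardware. Assumes d3d8.h/d3d8.lib and an
// existing window handle hWnd (illustrative).
#include <windows.h>
#include <d3d8.h>

IDirect3DDevice8* CreateDeviceWithEmulatedVS(HWND hWnd)
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    // Match the back buffer to the current desktop format for windowed mode.
    D3DDISPLAYMODE mode;
    d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = mode.Format;

    // D3DCREATE_SOFTWARE_VERTEXPROCESSING: vertex shaders are emulated on the
    // CPU; pixel shaders still require hardware support.
    IDirect3DDevice8* dev = NULL;
    HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                   D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                   &pp, &dev);
    d3d->Release();
    return SUCCEEDED(hr) ? dev : NULL;
}
</code>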

I wonder how much cheaper the TNT2 actually is to manufacture than a GeForce 2 MX. I hope the difference is significant, since otherwise I can't imagine why they're still selling them.
 
The Radeon 8500 64MB already seems to meet the author's requirement. I'm pretty sure the GF3 Ti200 does as well, or will very soon due to 8500 prices. And I'm talking retail, not even OEM.

The article seems very old...now that those conditions are met (sub $200 and DX 8, 8.1 even), I'm curious if the desired result (DX8 being established as a baseline) will come true. With the GF4MX coming out, it seems that nVidia is hindering this (as well as the Kyro cards), and that it won't...unless the low price of the 8500 can be sustained (i.e., the 128MB OEM models will still be as cheap, as the 64MB cards are being replaced) and forces the competition to keep in line.
 
Even if DX8 cards were free, most game developers wouldn't support DX8 effects until the majority of PC game buyers have the card installed.

If people upgrade their computers every three years, it would take two to three years after DX8 cards become cheap until game developers can reasonably support them.

In the short term the best hope for DX8 effects in games is either ambitious software developers who can self-fund their product (e.g. id Software), or ports of Xbox games (e.g. Halo).
 
3dcgi said:
I agree that we need a mainstream card that supports DX8 shaders. As long as it supports pixel shaders in hardware, vertex shaders can be emulated.

I wonder how much cheaper the TNT2 actually is to manufacture than a GeForce 2 MX. I hope the difference is significant, since otherwise I can't imagine why they're still selling them.

Well, I think SiS just made one... check out the "SiS, Cebit and AGP 8x" thread.

Or another possibility is that it is actually high end, and will be priced the way SiS products usually are: very aggressively.
 
Compaq Evo D500 SFF P4/1700, 20GB, CD, 128MB, Vanta 16MB AGP, W98, i845, NIC: €1273.99

A P4 with a 16MB Vanta card; the Evo D300 is the same thing. The price range is 1000-2000 euros, and the differences are small: just more memory and HD space. I didn't see any with anything higher than a GF2 MX (16MB, 32MB max), and most are M64 class too.

A few have a GF2 Pro, but they were hard to find. (I just took Compaq as an example; it goes for all the big brands.)

And Joe Schmoe usually buys a complete PC in a store, loads Quake3 or any other new game from late 2001/early 2002, and can't play it, or at least it plays like crap. Come on, it's 2002: Athlon XP, Pentium 4, 1GHz+... you can't put in a broken video card......

So this article isn't that old; it's still valid.

<quote>
A bandwidth impaired and limited GeForce MX won't do any better than a TNT2 class video card. Even worse though is the fact that NVIDIA didn't release just one but three models in their GeForce MX line: the MX, MX200, and MX400 (essentially an MX). The normal Dell or Compaq, especially in this sluggish market, will say "hey this will add to my bottom line as well. Cheaper video cards, I'll get the worst MX of the bunch". Nothing wrong with that. But one reason the market is sluggish is when John Doe goes out to buy Quake3 on his new computer with a shiny 1.4GHz Thunderbird that he paid $1500 for, and abruptly finds out that Quake3 doesn't work since he has a 2 year old graphics processor. This is not only bad for the game developers, which for the most part develop to the lowest denominator, but also for computer manufacturers which see less business because of scorn and confusion on the average consumer's side. And if you think this is far fetched take a stroll on over to HP's website. While surfing the site you notice that the GeForceMX 400 is included in some of their computers. Great. But the thing is, they have brand new 2GHz Pentium4 models which feature a "32 MB SDR Nvidia TNT2 M64 w/ TV-out",
</quote>
 
With Ti 200s and Radeon 8500 retail cards going for as low as $139.99, what's the problem here?? It's been one of my main disgusts with the PC gaming industry: you go out and buy a decent card with an advanced feature set and pay a premium price, only to see 10% of these features used during the card's lifespan.
A classic example is Unreal 2, basically another DX7 game with SOME DX8 functions. That's also why I refuse to pay more than $399.00 Canadian for a video card (approx the same price as a console here in the 'Great White North').
 
Hey, I'm the author of that article. I post here every now and then. The site hasn't been updated in about 2 months since I've been redesigning it; a few articles were posted to test server load. Anyway, the article is probably ~2 months old, and it still holds true. Retail boards may be easy to come by, granted; a Radeon 8500 is at a very appealing price, I believe. But the article was also meant to discuss the OEM side of this. Vanta, TNT, TNT2, TNT M64, Rage 128s, etc....they are all being bundled in new computers. My friend bought a 2GHz Sony VAIO recently; it even included 256MB of Rambus memory. He paid north of $2000 USD for the computer, yet it only had a TNT2 I believe (it might even have been a TNT, I honestly can't remember right now).

I personally believe ATI and NVIDIA should both phase out their old product lines, not slowly but rapidly. A nice chunk of NVIDIA's revenue still comes from sales of their TNT line of products (especially in Asia). The graphics product cycle may be 6-8 months, but OEMs see the product cycle as more like 2 years, and release computers still carrying 2-year-old cards. Integrated graphics are also becoming more and more standard, which is why Intel is the biggest graphics company, IIRC. NVIDIA and ATI have both recently entered the chipset market, which may help out this 'graphic divide' if they produce some higher quality chips.
 
A good article, fremin 8)


I live in Europe and it is the same here, although nVidia has the most OEM cards. A TNT2 M64 16MB with a 1.2GHz Athlon; come on people, that thing was 1800 euros. That card can't even play GP3 or Baldur's Gate 2 (or 1 for that matter; it runs, it just stutters a LOT), it has bad texture quality, etc. Oh, and it came in a Packard Bell.........

They should stop making those old cards a.s.a.p. And while I am at it, why the hell did ATI kill T&L on the Radeon VE? And disable HyperZ by default? OK, you can turn it on in the registry or with a tweaker, as if everyone knows that.

They don't care, that is why....... Of course you could say, why should they care, they're just trying to make a living. Mmmm, more like they see you as a milk cow.

You bought the PC with a cheap OEM card in it, OK... Now pretty soon your games won't run, so what do you do? You buy a faster, more expensive one. And what do they recommend at the store, or your friends, or whatever source you get info from? Right, nVidia. They sell more cards. How nice....


Mmm, it's late but I'll post it anyway; if there are any mistakes in this post I'll correct them later....... :(
 
Why the hell did ATI kill T&L on the Radeon VE? And disable HyperZ by default? OK, you can turn it on in the registry or with a tweaker, as if everyone knows that.

There has to be some form of incentive to buy a higher end model over a lower end one. Intel and AMD both realize this and many times disable cache (it's physically there, however) or use other tactics to produce lower-cost chips (such as the Celeron and Duron).

Most consumers are uneducated in the field of computers and especially 3D graphics, don't realize what they are buying, and go for the computer with the higher MHz regardless of other speed metrics.


Even if DX8 cards were free, most game developers wouldn't support DX8 effects until the majority of PC game buyers have the card installed.

If people upgrade their computers every three years, it would take two to three years after DX8 cards become cheap until game developers can reasonably support them.

In the short term the best hope for DX8 effects in games is either ambitious software developers who can self-fund their product (e.g. id Software), or ports of Xbox games (e.g. Halo).

I absolutely agree with Duffer here. Even if new computers came out with support for DX8, most game development houses would wait until most computers had support for it (why does The Sims sell so well?). Still, there is a niche market for high-end games as well. I'm sure Doom and Unreal 2 will sell well ;).
 
*sigh*

Yet another topic bemoaning the upgrade cycle, the lack of support for advanced features, and all the rest. And yet so few replies to my suggestion of a lengthened hardware cycle? Pah! Perhaps it just seems pointless to discuss because you believe it will never happen...?

In any case, I think I've already made my feelings on this clear. The current hardware upgrade cycle is ridiculous. It is tough on the consumers *and* the developers, hardware and software. It makes it more difficult for smaller companies to enter the market and increase innovation and competition. At the same time it stretches the resources of even the best hardware developers thin, and forces the software developers to settle for third best in terms of feature support.

But the solution is of course some miracle new card out of left field that will offer everything we want at an attractive price, right? Then everything will be fixed. Yeah! If not that, then what do you all suggest as a solution, as opposed to my proposal?

- JavaJones
 
JavaJones, I couldn't agree with you more.

BTW, I love Java - the language of course, don't really dig coffee.
 
Saem said:
BTW, I love Java - the language of course, don't really dig coffee.
(OT) But what about the island?

<My first post via Konqueror "browser">
 
OT: Java 2 rules
 
I personally believe ATI and NVIDIA should both phase out their old product lines, not slowly but rapidly.

I think both companies are now moving to complete product lines instead of letting new high-end cards filter down to the low end over time. nVidia does seem to be the most guilty of the tactic of introducing a high-end product that makes a name for itself, and then introducing a low-end version mostly for OEM use - TNT M64, TNT Vanta, GF2 MX, GF2 MX200. Now the GF4 MX lineup makes that concurrent with the high-end release, and takes advantage of the GF4 name. But the MX440 and 460 are really very capable cards, just not DX8 cards. So they add the MX420, a real dog, that in turn takes advantage of the improved MX reputation provided by those cards.

But the problem described in your article is really an OEM PC manufacturer problem, not a graphics card manufacturer problem, and it arises out of competition and consumer ignorance (the "proc speed is all that matters" thing). Beyond that, the suggestion that the graphics companies would first introduce a cheaper version of new technology makes no sense to me. They want/need to make high margins on the stuff at first, to pay for the R&D, and also need to work out the bugs before the stuff hits the mainstream market. As much as we complain, enthusiasts will and do tolerate the bleeding edge.

nVidia has introduced mainstream cards as new products this time around with those MX boards. But they aren't really cheap; they cost more than GF2 boards run now. They're reasonably fast, but don't include DX8 stuff, pretty much the same deal as the Radeon 7500. That makes sense to me, because if adding DX8 capability adds cost, then suddenly they aren't budget cards anymore, are they? Why add cost if the benefit today is negligible?

The irony of it all from the game development side is that the best-selling games are the ones that can afford to build in the new features, like Quake3 did with T&L. Then those engines get reused and the feature spreads. But the fact that those games sell so well means they will be run on many PCs that simply don't have the graphics horsepower. So they have to build in the capability to adjust the game to the graphics capability, which they also can afford to do.

I just think that these newest techniques aren't going to be widely adopted by anyone - the PC OEMs, the game developers, or even the graphics companies. But they are cool, so they sell to the high-end and enthusiast market at premium prices, even if they largely go unused except for a few demos. It's just the reality of the thing - the mainstream wants capability for the stuff that's actually out there (DX7 games) but doesn't want to pay too much, so they give them cards ranging from the 7500 and MX460 down to the GF2 MX and Radeon 7000, and the small enthusiast sector is already tiring of DX8 hardware whose features haven't even shown up in games yet and now wants to hear about DX9 hardware instead. The divide goes farther than the hardware itself.
 
Why are people focusing on ATi and nVidia when looking at this problem? Intel and PowerVR are the ones lagging behind in this field; nV and ATi, if guilty of anything, should be blamed for pushing features too fast.

Out of the roughly one hundred million PCs sold in a year, the overwhelming majority are bought and used for years with the most demanding game ever thrown their way being Solitaire or the like. You don't need a 55" Plasma 1080i HDTV to watch the local news, and you don't need a GF4 or R8500 to look at spreadsheets or browse the web. The problem then is one of consumer ignorance: people purchasing PCs with inferior graphics chips while fully intending to game on them. This is why so many PCs ship with TNT2s and integrated graphics solutions. With the latter, nVidia has actually helped out tremendously by offering a comparatively feature-loaded integrated graphics solution while still providing an AGP slot for upgrade options. It remains to be seen if this will be of benefit or not in the long run, of course.

When you have the problem isolated you then need to look at what can be done to rectify the situation for the gaming industry. Knocking boards such as the GF2MX doesn't really make any sense. How is the GF2 Ultra any better? Only when increasing the resolution/color depth or increasing the number of effects utilized. Running at 640x480x16 there is very little difference between the two of them in the overwhelming majority of games on the market or in the near future. The casual gamer, the type that will have the GF2MX or a comparable board, is not going to care that much about the increased resolution or enabling FSAA; if they did, they would know better than to purchase a PC with a board such as that in the first place, or they would learn enough about it to rectify the situation.

Java brings up a good point about extending the life cycle being a likely solution. MS is giving every indication that they are doing this by significantly prolonging the DX life cycle, although it seems to me that the Xbox would be a more likely reason for this than the possible side benefit for PC gaming. However, that said, you run the risk of stagnating the hardware market if this is taken too far, and furthermore you begin to push performance, of which most of the boards already offer obscene levels, over improvement of the technology.

I see the most viable solution coming from the game companies themselves. By making increasingly flexible engines that scale with hardware to a greater degree than what we have seen to date, they can cater to the lowest common market while pushing the higher end rigs, pleasing both ends of the spectrum. No matter how you look at it, the one way that is completely foolish is to try and lay the blame at the feet of the industry leaders. If Intel and AMD both pushed out 10GHz CPUs tomorrow, would we begrudge them because software makers couldn't keep up? That's foolish to say the least.
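To make that concrete, here is a rough sketch of the kind of caps-driven path selection such a scalable engine implies, using the Direct3D 8 caps structure; the path names and thresholds are illustrative, not taken from any shipping engine:

<code>
// Sketch: pick a rendering path from the device caps so one engine can span
// TNT2-class, DX7 T&L, and DX8 shader hardware. Names are illustrative.
#include <windows.h>
#include <d3d8.h>

enum RenderPath { PATH_FIXED_FUNCTION, PATH_DX7_TNL, PATH_DX8_SHADERS };

RenderPath ChooseRenderPath(IDirect3D8* d3d)
{
    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return PATH_FIXED_FUNCTION;

    // DX8 path: needs ps.1.1 in hardware (vertex shaders can be emulated).
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_DX8_SHADERS;

    // DX7 path: hardware transform & lighting, fixed-function blending.
    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return PATH_DX7_TNL;

    // Lowest common denominator: TNT2/Rage-class, software T&L.
    return PATH_FIXED_FUNCTION;
}
</code>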

As for the argument that nV and ATi should drop the lower end parts: do you people have some deep hatred of poor people? Do you think they shouldn't be able to buy a cheap PC for their family? That line of argument is both elitist from that angle, and plain moronic on a business level, where the majority of the market could not care less about 3D PC gaming and doesn't want to pay for it. How do you think ATi and nV afford development of new parts? Is anyone under the impression that the GF3, even down to ~$120 levels, is going to sell anything resembling a high enough quantity to support a company of nV's size, or the same for the R8500 and ATi?

The entire PC industry isn't here to please 3D enthusiasts, whether we like it or not. Time would be better spent looking at viable ways of dealing with the PC market, and realizing that it is not going to change on the hardware end, than dreaming up some scheme that would end up putting all of the leading companies into bankruptcy. Whether developers like it or not, it is up to them to deal with the gaps in the PC market. They are likely to increase continually for the next several years, not decrease. That is the curse and blessing of PC gaming.
 
The casual gamer, the type that will have the GF2MX or a comparable board, is not going to care that much about the increased resolution or enabling FSAA; if they did, they would know better than to purchase a PC with a board such as that in the first place, or they would learn enough about it to rectify the situation.

This is a good point. I'd guess the reality is that most casual gamers don't know much about adjusting settings in their cards or games to get the best out of them anyway, so why build even more complicated features into the cards for these folks? ATi's Radeon drivers default to 16-bit color depth in OGL and D3D, and how many people do you think are out there obliviously running at that setting? This is why games like UT detect a user's hardware and recommend settings based on that - the users don't know enough to do it themselves.

The complaint I have regarding the industry is that they attempt to convince people who do want gaming capability that they are getting more of it than they are. The MX line is kind of like that - you could say the original MX was actually a GeForce because it had a T&L engine, but it certainly didn't have what one might expect from a GF2 in terms of raw rendering power. The MX200 was even worse, and who would have expected a GeForce card with a 64-bit SDR memory interface at the same time the GF3 was being hyped everywhere? ATi's Radeon VE also had a 64-bit bus, although it was DDR, and that card had no T&L engine. And now the GF4 MX thing. At least Intel didn't call the Celeron a Pentium MX...

Then there's the other marketing-driven aspect of graphics board design decisions - selling them on memory quantity specs. This is the equivalent of PC sales built on CPU MHz specs. How many times have you seen someone say something like, "I am upgrading my video card and want a 64MB card..."? There was a reason why the TNT2 had 32MB and the Voodoo3 only 16MB, but that difference in itself gave nVidia a marketing advantage, one not entirely downplayed by the e-press. There was a reason why the V5 and Rage Fury Maxx had 64MB, but that became the high-end standard with 64MB single-chip cards from S3 and nVidia. So then we saw the 64MB MX400, the 64MB Radeon SDR/7200, and then the 128MB Ti 200s and now 8500s, and that's really just marketing. But it adds to the cost, so more people are driven to buy even lower-end cards.

I think both the graphics companies and the PC OEMs could be more up front about the real 3D capability of some of these cards. Maybe that's viewed as counterproductive in terms of sales, but I'm not certain it actually is. Most PC makers offer graphics upgrades, and wouldn't they prefer to sell a PC to a customer for more money and have that customer more satisfied with what they get? But the combination of competition and consumer ignorance push them away from doing the right thing.
 
3dcgi said:
I agree that we need a mainstream card that supports DX8 shaders. As long as it supports pixel shaders in hardware, vertex shaders can be emulated.

It looks like SiS was thinking the same thing I was, since their new card does exactly this.
 