10 years ago we woke up to this news....

wingless

Every now and then I like to do a search on old hardware. I was taking a look at the Chip Index and saw the Rendition V2200. I remembered it's been almost exactly 10 years since it came out, and I'm happy with how far 3D acceleration has come since those days. I used to want a V2200, but then the Voodoo and Riva TNT came out and blew it the hell away. I bought a Riva TNT because I had faith in a little company called Nvidia. Anyway, I found some articles online with benchmarks of our old video cards, and it only makes me wonder what we'll be buying 10 years from now.

http://findarticles.com/p/articles/mi_m0EIN/is_1997_August_11/ai_19658520

http://www.guru3d.com/review/mercury3d.html

http://www.guru3d.com/review/MAGICTNT.html

http://www.guru3d.com/review/velocity.html

Also, how many of you still have that FINAL REALITY benchmark from around 1998? I do, and I still benchmark each new card with it. Strangely, my R600 has some error that won't allow the 3D scores to be read accurately. It also pisses me off that I still can't seem to max the damn thing out after 10 years!
 
http://www.guru3d.com/review/visiontek/geforce256/index.shtml

Who the hell gamed at 1600x1200 32-bit way back then? I couldn't afford a monitor that could do that until last year! Anyway, it's always good to go back and look at a GeForce 256 review; it's almost prophetic. I skipped from a Riva TNT to a GeForce4 MX, so the jump to a T&L card was badass for me even if it was a low-end model. Either way, this GeForce 256 article is a fun read and brings back a lot of memories.

PS: Guru3D still has all their articles on their site under the reviews section, so they're easy to find. Read up and remember all your glory days...
 
I did, actually, later that year when the GeForce 2 was released (but only with a few old and not-too-demanding games that would also let me select 1600x1200x32, though :)). You could actually get pretty cheap 19" CRTs even back then.
I can't think of a 19" monitor that could run a 1600x1200 resolution at 85 Hz or higher. Any refresh rate lower than 85 Hz on a CRT is useless IMO. I have my professional 20" CRT (made in '96) set to 1152x864 although it supports much higher resolutions than that.
 
I can't think of a 19" monitor that could run a 1600x1200 resolution at 85 Hz or higher.
The one I bought could only do 77hz at 1600x1200, but I remember a friend of mine bought one that could do ~90hz at 1600x1200. So they do (did?) exist.
Any refresh rate lower than 85 Hz on a CRT is useless IMO. I have my professional 20" CRT (made in '96) set to 1152x864 although it supports much higher resolutions than that.
At 100+ Hz? I prefer resolution over refresh rate. I tried reducing my desktop to 1440x1080 @ 85 Hz, but it felt too cramped. That's why I didn't upgrade to an LCD until I could afford a 1920x1200 one.
 
I can't think of a 19" monitor that could run a 1600x1200 resolution at 85 Hz or higher. Any refresh rate lower than 85 Hz on a CRT is useless IMO. I have my professional 20" CRT (made in '96) set to 1152x864 although it supports much higher resolutions than that.

Putting a personal qualification on there doesn't prove your point. The vast majority of people can handle a CRT at around 75 Hz with no problems, and many can handle 60 Hz as long as they're not at it for hours. Just look at what so many public places have set their monitors to for ages.
 
Thowllly
I remember a friend of mine bought one that could do ~90hz at 1600x1200. So they do (did?) exist.
I didn't know that. All decent 19" monitors that I've seen maxed out at ~75 Hz.

At 100+ Hz? I prefer resolution over refresh rate. I tried reducing my desktop to 1440x1080 @ 85 Hz, but it felt too cramped. That's why I didn't upgrade to an LCD until I could afford a 1920x1200 one.
At 85 Hz, it is a very old monitor. I just tried setting 1280x960 @ 76 Hz - the flickering is annoying and would probably give me a headache if I left it on long enough. I consider an LCD to be a downgrade (but that's a whole different topic). And aren't high resolutions and affordability their only selling points?
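A quick aside on why 1600x1200 @ 85 Hz was so rare on CRTs of that era: the RAMDAC pixel clock has to scale with resolution × refresh rate, plus blanking overhead. A rough back-of-the-envelope sketch in Python (the ~32% blanking overhead is an assumed, roughly GTF-like figure, not an exact spec):

```python
# Sketch: estimate the pixel clock a CRT mode demands of the RAMDAC.
# The blanking_overhead factor (~1.32) is an assumption in the spirit of
# VESA's GTF timings, not a value taken from any monitor's datasheet.

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.32):
    """Approximate pixel clock in MHz: visible pixels * refresh * blanking."""
    return width * height * refresh_hz * blanking_overhead / 1e6

print(round(required_pixel_clock_mhz(1600, 1200, 85)))  # roughly 215 MHz
print(round(required_pixel_clock_mhz(1600, 1200, 75)))  # roughly 190 MHz
```

At roughly 215 MHz, 1600x1200 @ 85 Hz sits beyond what most affordable late-90s 19" tubes and their RAMDACs were specced for, which fits the ~75-77 Hz ceilings people mention above.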

Skrying
Putting a personal qualification on there doesn't prove your point. The vast majority of people can handle a CRT at around 75 Hz with no problems, and many can handle 60 Hz as long as they're not at it for hours. Just look at what so many public places have set their monitors to for ages.
True, but those people can also handle something like an 800x600 resolution, even when it's not native to their large wide-screen LCDs :) (I am not making this up - I've seen too many people using whatever the default resolution is in their OS no matter what their monitor is).
 
Skrying
Putting a personal qualification on there doesn't prove your point. The vast majority of people can handle a CRT at around 75 Hz with no problems, and many can handle 60 Hz as long as they're not at it for hours. Just look at what so many public places have set their monitors to for ages.
True, but those people can also handle something like an 800x600 resolution, even when it's not native to their large wide-screen LCDs :) (I am not making this up - I've seen too many people using whatever the default resolution is in their OS no matter what their monitor is).

Please don't overgeneralize like that; it comes across as patronizing even if you don't mean it to be. :)

I myself generally can't stand a non-optimal resolution on an LCD (other than on the 30" Gateway), and yet I do perfectly fine with a 70-75 Hz refresh on a CRT. I honestly can't tell the difference once it goes higher than 75 Hz. Then again, I also have never had a headache in my life. :)

I also have quite a few friends who do CAD and graphic design who have no problems with a 75 Hz refresh, although I also know a few who can't handle anything less than 100 Hz (but these are in the minority).

Regards,
SB
 

Awesome post! Thank you for showing me that. I was pretty new to the forum back then and didn't frequent it as much as I do now. I'm glad to see so many didn't forget the days we wished we had computers as powerful as our current machines. I really hate that Final Reality doesn't work well with my 2900 XT; it doesn't report some of the scores.

Also, so many of you had high-resolution CRTs 10 years ago! I was still in high school, so they were a little out of my budget. I could only run 17-inchers back then, so I missed out. This year I upgraded from a 19" CRT to a 22" LCD. I've simply been behind the times for too long.
 
Most humans can't detect flicker above 70 Hz - and many only notice it through peripheral vision (i.e. don't look at, but to the left of, your screen) - though they can still get a headache from 60 Hz even if they can't actively tell. That's why one of our government labor standards specified 72 Hz a long time ago as a minimum requirement for desktop CRTs. I still seem to be able to tell the difference between 75 Hz and 85 Hz, but it's very subtle.

I had a 1600x1200 @ 60 Hz monitor a long time ago as well, an IIYAMA Vision Master Pro 17". (60 Hz is still good enough for games, but I almost never used that resolution anyway: in addition to the 60 Hz being flickery, my video card was slow, though pictures looked very nice at that resolution.) What I miss most about CRTs vs. LCDs is how well they handled different resolutions, though a fixed resolution also saves you having to worry about it too much. Still, my current screen supports 1680x1050, which the CRT basically could still have matched.
 
I'm glad to see so many didn't forget the days we wished we had computers as powerful as our current machines.

Yup, I am so stoked that my 7600 GT is 8.77 times more powerful than an S3 ViRGE :D

ps: another old benchmark
3D Winbench
http://www.majorgeeks.com/3D_Winbench_2000_d108.html

My 8800 GTS actually loses overall to a Q6600 (software rendering) but wins in just the 3D tests.
I wonder why the bus transfer rate is so much faster in software.
 
Back then, we were reading FiringSquad and SharkyExtreme ;)

It was the reign of 3dfx; we were gaming at 640x480 and were amazed by the high resolution and the increased 3D image quality (MIP mapping! Bilinear filtering!). The Riva 128 was still buggy with OpenGL, the TNT was considered vaporware, and the almighty Voodoo2 was on the horizon. ATI was considered to be out of the race, too ;)

We were waiting for Unreal, Half-Life didn't look like more than a simple Quake clone, and there was Prey, too. Oh, and some lame-looking 2D strategy game with a sci-fi setting called Starcraft...

However, the most important thing about this thread is that it makes me feel old :) so I probably shouldn't read it...
 
Ah yes, waiting for Unreal and the promise of online multiplayer spanning multiple servers across the internet.

You would be able to go directly to another server without leaving the game by going through a door or a portal. And so Deathmatches would be able to span multiple servers spread across the net...

Of course, that feature never made it but it was one of the hyped features at the beginning. :)

Regards,
SB
 
Hello.

I would like to give you my humble opinion: this test doesn't really measure the 3D accelerator alone, but the whole chip + interface.
Games today, I think, rely on the CPU for the physics & collision engine, but there isn't one here.
It's only some small rendering of textures and polygons, something which could be done at something like 1000 Hz, right?

I would like to use the term "out of range" :)
 