Engineer forgot bilinear filtering unit in PowerVR PCX1?

Joe DeFuria said:
[Matrox Mystique] Lacked bilinear filtering, and if memory serves, lacked alpha blending as well. Matrox PR touted "screen door transparency" (dithering) as a performance feature. Only problem was that other cards (Voodoo, PCX, Verite...) handled alpha blending just fine. ;)
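For anyone unfamiliar with the term, "screen door" transparency doesn't mix colors at all: it just skips pixels in a dither pattern so the background shows through the holes. A rough C sketch of the difference (purely illustrative, not the Mystique's actual hardware logic):

Code:
#include <stdint.h>

/* True alpha blending: mix source and destination per channel. */
static uint8_t alpha_blend(uint8_t src, uint8_t dst, uint8_t alpha)
{
    return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
}

/* Screen-door "transparency": no blend math at all; for ~50% opacity,
 * draw only the pixels on a checkerboard and leave the rest untouched. */
static int screen_door_keep_pixel(int x, int y)
{
    return (x + y) & 1;
}

No per-pixel blend math, which is why it was cheap, and also why it looks like a screen door next to a card that can actually blend.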

Hey, but on the plus side, the Mystique had hardware support for 16-color (4-bit CLUT) textures, just like the PlayStation 1. The marketing material said 4-bit textures saved video RAM (half the size of a 256-color CLUT texture).
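The saving is easy to see if you sketch the texel fetch: two 4-bit indices pack into each byte, and each index selects one of 16 palette entries, so the index data is half the size of an 8-bit (256-color) CLUT texture. Illustrative C only; the function name and nibble order here are made up, and real hardware differs on such details:

Code:
#include <stdint.h>

/* Hypothetical 4-bit CLUT texel fetch. 'texture' holds packed 4-bit
 * indices (two texels per byte); 'palette' is the 16-entry CLUT,
 * e.g. 16-bit RGB colors. */
static uint16_t fetch_texel_4bpp(const uint8_t *texture,
                                 const uint16_t palette[16],
                                 int x, int y, int width)
{
    uint8_t packed = texture[(y * width + x) / 2];
    uint8_t index  = (x & 1) ? (packed >> 4) : (packed & 0x0F);
    return palette[index];
}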

Of the 'mediocre 3D crowd', the Matrox Mystique's Direct3D engine was faster than the infamous S3/Virge, ATI Rage3D, and Trident 3D/975. (When doing an equal comparison -- i.e. bilinear filtering disabled on all devices.) As a matter of fact, I think the Cirrus Logic GD-5465 and Mystique were comparable in performance (but at least the GD-5465 had bilinear filtering and RAMBUS!)

My experience with the S3/Virge ... Tomb Raider 1 (yup, S3 had an S3/Virge edition) in '320x200 Virge' mode was slower than the software 320x200 MCGA mode. And that was on my AMD K5-90 (another story in itself...) On the plus side, the Virge mode, thanks to 16-bit color, did look better.

To sum up the Mediocre-Era of <$200 PC/3D, you would do well to remember this quote from the movie Minority Report: "In the land of the blind, the one-eyed man is King!" And the Matrox Mystique was certainly King of the Blind. (Not counting the Verite, PCX1, and Voodoo...)
 
asicnewbie said:
Of the 'mediocre 3D crowd', the Matrox Mystique's Direct3D engine was faster than the infamous S3/Virge, ATI Rage3D, and Trident 3D/975. (When doing an equal comparison -- i.e. bilinear filtering disabled on all devices.)
It varied. The Mystique had more consistent performance, while the Virge was very dependent on what was turned on (each of bilinear, Z, colour, perspective correction, texture, etc. added cycles). It was also somewhat later, and the ATI/S3 chips were already being updated with their successors by the time it was widely available.

My experience with the S3/Virge ... Tomb Raider 1 (yup, S3 had an S3/Virge edition) in '320x200 Virge' mode was slower than the software 320x200 MCGA mode. And that was on my AMD K5-90 (another story in itself...) On the plus side, the Virge mode, thanks to 16-bit color, did look better.
I certainly know it had an S3/Virge edition ;). I found that on the original Virge it tended to be pretty much a wash with software rendering on the CPU, unless it was a very fast CPU for the time, but you had to leave perspective correction off. On the Virge DX/GX (which were already available when the Tomb Raider port was done), perspective correction was free and bilinear was faster, which was a large improvement.

What should have been done (in hindsight) was to modify the hardware driver to go back to using depth-sort (which is what the software rendering engine did) instead of the hardware Z-buffer (which is what the 3dfx port, and thence the other hardware ports, did). It would probably have run at the frame-rate limit (30 fps) then.
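Something along these lines (names invented for illustration, certainly not the actual Tomb Raider source): sort the polygons back-to-front on the CPU and submit them with Z reads/writes disabled, so the card never pays the per-pixel Z-buffer cost.

Code:
#include <stdlib.h>

typedef struct {
    float avg_z;               /* average view-space depth of the polygon */
    /* ... vertex / texture data ... */
} Polygon;

/* Sort farther polygons first (painter's algorithm ordering). */
static int farther_first(const void *a, const void *b)
{
    float za = ((const Polygon *)a)->avg_z;
    float zb = ((const Polygon *)b)->avg_z;
    return (za < zb) - (za > zb);
}

static void draw_scene(Polygon *polys, size_t count)
{
    qsort(polys, count, sizeof(Polygon), farther_first);
    for (size_t i = 0; i < count; ++i) {
        /* submit polys[i] to the card with the Z test and Z writes off */
    }
}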
 
asicnewbie said:
To sum up the Mediocre-Era of <$200 PC/3D, you would do well to remember this quote from the movie Minority Report: "In the land of the blind, the one-eyed man is King!"

Erasmus of Rotterdam is surely rotating in his grave... ;)
 