Captain Chickenpants
Regular
The rest of us are very shy.
Joe DeFuria said: [Matrox Mystique] Lacked bilinear filtering, and if memory serves, lacked alpha blending as well. Matrox PR touted "screen door transparency" (dithering) as a performance feature. Only problem was that other cards (Voodoo, PCX, Verite...) handled alpha blending just fine.
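For readers who haven't met the trick: "screen door" transparency punches holes in the surface with a fixed dither mask instead of mixing colours, so the chip never has to read the destination pixel or do per-pixel blend math. A rough sketch of the two approaches (the function names and the 4x4 Bayer matrix here are illustrative choices, not anything from this thread or from Matrox's hardware):

```python
def alpha_blend(src, dst, alpha):
    """True alpha blending: out = src*a + dst*(1-a), per channel."""
    return tuple(round(s * alpha + d * (1 - alpha)) for s, d in zip(src, dst))

# A standard 4x4 ordered-dither (Bayer) threshold matrix, values 0..15.
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def screen_door(src, dst, alpha, x, y):
    """Screen-door transparency: keep the source pixel only where the
    dither mask passes; no read-modify-write, which is why it was cheap."""
    threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
    return src if alpha > threshold else dst

red, blue = (255, 0, 0), (0, 0, 255)
print(alpha_blend(red, blue, 0.5))   # smooth mix: (128, 0, 128)

# At 50% alpha the screen door keeps src on exactly half of a 4x4 tile:
kept = sum(screen_door(red, blue, 0.5, x, y) == red
           for y in range(4) for x in range(4))
print(kept)  # 8 (of 16 pixels)
```

Up close the dithered version looks like a fly screen rather than glass, which is what the other cards' real blending avoided.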
asicnewbie said: Of the 'mediocre 3D crowd', the Matrox Mystique's Direct3D engine was faster than the infamous S3/Virge, ATI Rage3D, and Trident 3D/975. (When doing an equal comparison -- i.e. bilinear filtering disabled on all devices.)

It varied. The Mystique had more consistent performance, while the Virge was very dependent on what was turned on (each of bilinear, Z, colour, perspective correction, texture, etc. added cycles). It was also somewhat later, and the ATI/S3 chips were already being updated with their successors by the time it was widely available.

asicnewbie said: My experience with the S3/Virge ... Tomb Raider 1 (yup, S3 had an S3/Virge edition) in '320x200 Virge' mode was slower than the software 320x200 MCGA mode. And that was on my AMD K5-90 (another story in itself...) On the plus side, the Virge mode, thanks to 16-bit color, did look better.

I certainly know it had an S3/Virge edition. I found on the original Virge that it tended to be pretty much a wash with the CPU unless it was a very fast CPU for the time, but you had to leave perspective correction off. On the Virge DX/GX (which were already available when the Tomb Raider port was done), perspective correction was free and bilinear was faster, which was a large improvement.
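Why did perspective correction add cycles on these chips? Affine texturing interpolates texture coordinates linearly in screen space (one multiply-add per pixel), while perspective-correct texturing interpolates u/w and 1/w and then needs a divide at each pixel. A minimal sketch of the difference, with illustrative function names (not anything from the ViRGE's actual pipeline):

```python
def affine_u(u0, u1, t):
    # Affine: linear interpolation in screen space, one mul-add per pixel.
    return u0 + (u1 - u0) * t

def perspective_u(u0, w0, u1, w1, t):
    # Perspective-correct: interpolate u/w and 1/w linearly, then recover u
    # with a per-pixel divide -- the step that cost extra cycles.
    uw = (u0 / w0) + ((u1 / w1) - (u0 / w0)) * t
    inv_w = (1 / w0) + ((1 / w1) - (1 / w0)) * t
    return uw / inv_w

# Midpoint of a span whose far end is twice as deep (w0=1, w1=2):
print(affine_u(0.0, 1.0, 0.5))                            # 0.5 (warps the texture)
print(round(perspective_u(0.0, 1.0, 1.0, 2.0, 0.5), 4))   # 0.3333 (correct)
```

Skipping the divide is why affine texturing made walls and floors swim in early titles, and why "perspective correction is free" on the DX/GX was a real selling point.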
asicnewbie said: To remember the Mediocre-Era of <$200 PC/3D, you would do well to remember this quote from the movie Minority Report: "In the land of the blind, the one-eyed man is King!"