I can't understand German.

I didn't like the Chip.de review much. Copy/paste straight out of Matrox marketing documents, plus uncritical benchmarks with no interesting commentary, can't make for a valid review. Also, I am far more interested in minimum framerates in actual GAMES than in Aquamark scores and the like...

Another review will be up in about 10 minutes (9am CET) at www.3dcenter.de , again in German. 3DC benches games exclusively, so the review might be a slightly more interesting read than the other stuff online at this time.

Please keep in mind, though, that Leo only had 4 days to write the review and hasn't slept for the last 50h. ;) Additions and corrections will certainly follow in the next couple of days.

ta,
-Sascha.rb
 
I think the thing to make note of is the product that we won't see for another 9 months or so...Parhelia2.

Once they get these drivers ironed out...use a smaller process...crank up the core...add another 4 pipelines...they will have a much easier time selling people on the $399 price tag.

As it now stands, I think you nailed it...The main problem is that this card scales _very_ poorly. You cannot, for example, drop a feature here and there and reclaim like 80 FPS, the way you can on a Ti4600. The flip side is that you don't really lose out a heck of a lot either.

I really think Matrox has a ton of headroom in their drivers...I mean, a TON. It just seems impossible to fathom a product featuring FOUR times the number of vertex shaders of an 8500 losing out that badly...And even with the relatively slow clock, it should still perform much better without all the IQ features enabled.

In the end, I think it's somewhat apparent that Matrox went down the stability route first...get the drivers up to scratch, so that the thing doesn't BSOD left and right...and then focus on optimizations.

I'm going to keep an eye on this thing over the summer, as I'm very much convinced that we will end up seeing significant strides in performance. Of course, they will end up fighting...and in all likelihood, getting trashed by R300/NV30...but Rome wasn't built in a day, as the saying goes.
 
A question regarding drivers (as my Matrox days ended with the Mystique ;) ):

is Matrox known for actually performance-tuning drivers later in a product's lifecycle? Or are we talking one set of drivers each 6 months to cover some compatibility issues?

ta,
.rb
 
OK, review online:

Matrox Parhelia First Look (German)
http://www.3dcenter.de/artikel/parhelia_firstlook/

Games tested include Aquanox (game, not Aquamark), Dungeon Siege, Max Payne, and even Ultima IX. ;) All benches feature screenshot comparison to GF4Ti, btw.

Interesting note:
Leider hat der aktuelle Treiber 1.0.1.225 noch einen Bug beim anisotropen Filtern - er erreicht maximal den Level 2 (kann man gut mit Serious Sam 2 mittels der Option "gap_iTextureAnisotropy" nachprüfen), obwohl der Parhelia-Chip eigentlich auf Level 8 wie die GeForce4 Ti ausgelegt ist. An diesem Punkt sind dann auch alle geplanten Benchmarks mit anisotropen Filter flachgefallen, weil dies auf Level 2 (dem niedrigsten, Level 1 ist der isotrope Filter) mehr oder weniger überhaupt keinen Sinn macht. Wir müssen diese Benchmarks schuldig bleiben, bis Matrox diesen Treiberbug ausräumt - wenn unser Testsample dann überhaupt noch bei uns ist.

"Unfortunately, there's a bug regarding anisotropic filtering with driver 1.0.1.225 -- only 2x AF is supported (tested with Serious Sam: The Second Encounter's gap_iTextureAnisotropy option), even though the Parhelia supports up to 8x, just like the GF4 Ti. At this point, I decided to skip benchmarks with AF enabled, simply because 2x AF doesn't make any sense. We'll provide AF benches as soon as the bug has been fixed by Matrox--if we still have our test sample by then."

ta,
-Sascha.rb
 
Drivers seem to have a ton of room for improvement. I'd also like to know why the Ultima IX framerate was locked in at 49.5fps--some features not being usable in Matrox's driver?

Is Matrox's "accidental" disabling of aniso greater than 2x a way to hide their slower core speed until they pump out some driver speed increases, or does core speed not matter so much in increasing the degree of aniso?

Can't wait to see this review translated into English. Heck, I can't wait to read English reviews. :)
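On the aniso question above: as a rough intuition (a toy cost model of my own for illustration, not anything Matrox has documented about Parhelia's hardware), higher AF levels mean more texture samples per pixel, so fill rate, and hence core clock, does still matter:

```python
# Toy cost model for anisotropic filtering: a trilinear lookup takes 8
# texture taps, and anisotropy level N can multiply that by up to N
# along the axis of anisotropy. This is a common simplification for
# intuition only, not a description of Parhelia's actual hardware.
def max_taps(aniso_level: int, trilinear_taps: int = 8) -> int:
    """Worst-case texture taps per pixel at a given anisotropy level."""
    return trilinear_taps * aniso_level

for level in (1, 2, 4, 8):
    print(f"{level}x AF: up to {max_taps(level)} taps/pixel")
```

In this simple model, 8x AF can cost up to four times the taps of 2x AF, which is why a driver capped at level 2 sidesteps most of the fill-rate cost.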
 
Pete said:
Drivers seem to have a ton of room for improvement. I'd also like to know why the Ultima IX framerate was locked in at 49.5fps--some features not being usable in Matrox's driver?

this is an interesting note... I hadn't noticed that until I read the review again...

only time will tell...
 
I don't think I've seen any CPU scaling results with Parhelia yet, I'd be very interested in seeing how it would do with a 2.5+ GHz P4.
 
[attached benchmark charts]
 
Of course it's still too early to come to conclusions on my part, yet I have serious doubts that, with the predictable AA algorithms next-generation products will employ, they'll be able to achieve lower performance drops with AA enabled, especially if they allow 8x or 16x sample AA.

The price tag of the Parhelia is high, but if they intend to re-enter the graphics market and stay, they're already quite high on my list of future upgrade possibilities.
 
well, there's nothing badly wrong with the card... But it is definitely in the wrong price class. If they could drop the price here in Europe by 200 Euros, it would be a very attractive card.

(now: 553 Euros in Finland.)

as a note for our friends on the other side of the Atlantic:
553.00 EUR = 536.765 USD

so, here it is about a hundred bucks more expensive.

EDIT: fixed the price; the one I first posted was actually too low.
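For anyone checking the arithmetic, here's a quick sketch; the exchange rate is back-computed from the two figures quoted above, so treat it as illustrative rather than an official quote:

```python
# Rough price comparison; the exchange rate is inferred from the
# figures quoted above (553.00 EUR = 536.765 USD), not an official rate.
EUR_PRICE = 553.00    # Finnish retail price quoted above
USD_EQUIV = 536.765   # USD equivalent quoted above
US_MSRP = 399.00      # Matrox's US price tag

rate = USD_EQUIV / EUR_PRICE    # implied USD per EUR
premium = USD_EQUIV - US_MSRP   # European premium over US MSRP, in USD

print(f"Implied rate: {rate:.4f} USD/EUR")
print(f"Premium over US MSRP: ${premium:.2f}")
```

The premium over the $399 MSRP works out closer to $140 than $100, which only strengthens the poster's point about the European price.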
 
What framerate is it pulling in surround gaming with UT2003 then? In the 20's?

I am wondering that as well.

I think that this card should have done better given the raw hardware specs. We all know that specs mean nothing, but we should wait and see whether a driver rev can improve it or not.
 
Typedef Enum said:
I think the thing to make note of is the product that we won't see for another 9 months or so...Parhelia2.

Once they get these drivers ironed out...use a smaller process...crank up the core...add another 4 pipelines...they will have a much easier time selling people on the $399 pricetag.

Such a beast would run even slower.
 
faa

We can all talk about raw speed as much as we like, but keep in mind a few things -

Matrox specifically said that this card would not be a framerate champion in today's games.

I think some of you aren't really _looking_ at those AA scores. Note that even at 1024x768x32 with 16x FAA, that card is _still_ pulling 101 frames per second in Quake3. 101 frames per second is a lot. Add the gorgeous edge AA that Matrox has, and you're looking at stellar image quality. Even the 4600 is only pulling 10 fps more at just 4x AA at this resolution. And, at that AA setting, the Parhelia remains fully playable (above 30fps) all the way up to 16x12x32!

Don't get me wrong, I've never owned a Matrox video card at all, and I currently use a GF3, but I don't even bother with AA on it because it doesn't look nice (I used to have a Voodoo 5, so I'm spoiled) and it hits the framerate too hard for such a small improvement. I wish we had more detail on filtering, too.

The GF4 drops 40fps at 4xAA in SS, while at 16x FAA the Matrox card only drops 10fps. Think about that: somebody has finally built usable AA into a video card. Now the question is, how much performance is left in the drivers, as they've got to be fairly immature at this point?
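The drop comparison can be made concrete with a small sketch. Note that the no-AA baselines below are placeholder numbers purely for illustration; only the size of the drops (40fps and 10fps, in Serious Sam) comes from the thread:

```python
# Relative AA performance hit from an absolute fps drop. The drops
# (40fps for GF4 at 4xAA, 10fps for Parhelia at 16x FAA) are from the
# post; the no-AA baselines are hypothetical placeholders.
def aa_hit_percent(fps_no_aa: float, fps_drop: float) -> float:
    """Return the AA hit as a percentage of the no-AA framerate."""
    return 100.0 * fps_drop / fps_no_aa

gf4_drop, parhelia_drop = 40.0, 10.0   # quoted in the post
gf4_base, parhelia_base = 120.0, 110.0 # made-up baselines for illustration

print(f"GF4 4xAA hit:        {aa_hit_percent(gf4_base, gf4_drop):.1f}%")
print(f"Parhelia 16xFAA hit: {aa_hit_percent(parhelia_base, parhelia_drop):.1f}%")
```

Even with generous baselines, a 10fps drop is a far smaller relative hit than a 40fps drop, which is the whole argument for FAA.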

Too bad that Matrox is pricing themselves out of the market, though.
 
Re: faa

Actually, 100fps is not a "lot", because it's Quake3 -- one of the least demanding games on the market. It may be a "lot" with decent AA compared to what we've seen so far, but certainly not enough to justify the high price tag. The point of FAA was not only that it looks better than 4X FSAA, but that it comes with a minimal performance hit -- right now, this is not entirely the case. Yes, it looks better than 4X FSAA -- but enough to justify the price, and the fact that it won't work with all games, as 4X FSAA will? And I doubt most Quake3 players would consider just over 30fps "playable". You're not paying $400 for a Quake3 card, as Matrox has said themselves. While it may be in the 4600's ballpark, so is the 4200 (which would probably still beat it in many games with these drivers) -- and it costs less than half the price. Comparing it to a 4600 is actually the best-case scenario for the Parhelia -- it's the only way Matrox's offering isn't completely embarrassed on the price/performance scale.

Folks, when's the last time you saw a video card release that actually exceeded your expectations? It seems we go through this every year: hype builds for a product, people state "It's on my purchase list!", then the benchmarks come out, and people are dismayed. Drivers, drivers, drivers -- a lot of us seem to forget this whenever we hear the specs of a new card. I can understand those who want this as a workstation card for Matrox's excellent 2D display and multi-monitor support, but that's an incredibly small fraction of the market.

It's obvious that driver tweaks could/should bring about a large performance boost, but that's quite a variable. How much? When? Is Matrox's driver team up to the task? Are Nvidia's drivers the "standard", or is their driver development process unmatched, such that Matrox will never catch up? I have to side with those who just don't see a market for this card. Even if the R300 and NV30 aren't orgasm-inducing (well, any card can be, provided you go to the right sites :) ), they would have to be screw-ups of gargantuan proportions not to trounce these benchmarks. Chances are both will offer far more DX9 capability as well, even if they aren't screamers with today's games.

I'm particularly surprised at the 1600*1200 scores - even with immature drivers I would have expected the raw bandwidth available would let Matrox distance themselves in that area, but unfortunately not.

I'll confess that I was never really interested in Parhelia; I'm more interested in graphics products that can advance the PC gaming market (or at least keep it afloat!). $399 cards simply don't do that, especially in light of increased competition from consoles. What they do, however, is provide a platform from which to launch more economically feasible (from a consumer's perspective) models, which might be interesting. But again, that's another huge variable -- the drivers have to mature quickly, and the price has to plummet. I can't see a 64MB version of this card having a snowball's chance in hell of succeeding unless it's priced in the ~$250 range, max.
 