Reading all these Parhelia reviews...

Typedef Enum said:
- It will give you pretty darn awesome signal quality...In all likelihood, second to none.

Absolutely, though I don't think this will make too much difference to most.

- Although the Anisotropic filtering capability is limited right _now_, it is capable of achieving GF3/4 level. It will surely be addressed in future drivers.

Why do you think it is capable of achieving GF3/4 level? I see no evidence yet that supports that...got any info on the maximum degree of anisotropy that the hardware supports?

- There is no single antialiasing method in consumer boards that's without some sort of 'con.' The closest thing to perfect I've seen is B3D's overview of 3DLabs' older Wildcat board...and you just have a feeling that it would suck major wind in an actual game.

Granted, but I would personally consider the lack of AA in certain situations with FAA enabled on the Parhelia to be worse, at least, than the GF3/4's drawbacks. Since newer games are using alpha blends, the alpha test issue is more or less a mute point. It may be much harder for software developers to figure out a way to keep the Parhelia's FAA behaving nicely all the time (if the problems aren't fixed in drivers...and there is a pretty good possibility that they will be fixed).

- On the topic of FAA...Although there are some technical issues to consider, I feel that some of the issues are probably driver related...though we will see how it pans out in the near future.

The only real question as to whether the issues are driver related or not has to do with how Matrox designed the edge detection algorithm. It really just depends on how adjustable the technique is by the drivers (i.e. it could be as adjustable as a specialized fully programmable processor on-die that does the sorting, or it could be fully hardwired...).

- On the topic of performance, these are about as close to 'raw' as you will probably find. Without a doubt, we will see the performance increase. The question is more a 'when' than an 'if.'

I agree...though I think the question is more "by how much" and "when," with "if" not even appearing.

- Cost: Matrox needs to pull a GeForce3 maneuver in order to bring the cost down...Whatever it takes, they need to shave a good $100 off.

But this is the big problem. With a 256-bit memory interface, Matrox can't lower the cost by too much and still make money. You can be sure they're already making much less money per card than nVidia makes on the Ti4600 (which is a scary thought...).

- Performance: Right now, on very raw drivers, Parhelia is generally capable of GF4-level performance when you crank everything up. The downside, obviously, is that you do NOT get GF4 level performance when you turn everything off. So the question remains...Which is better?

If the FAA worked 100% correctly, I'd agree...it's really, really too bad Matrox didn't bother with a multisampling FSAA technique, at least as a fallback from FAA if nothing else. Multisampling also has the added benefit of improving memory access in highly-complex scenes when FSAA is off (with a little bit of caching, naturally...).

And yes, I feel that all hardware designers should be focusing 100% on performance with high degrees of comprehensive anisotropic filtering and good FSAA, instead of just baseline rendering.
 
Chalnoth said:
Multisampling also has the added benefit of improving memory access in highly-complex scenes when FSAA is off

what?
Explain further, please?
How does multisampling (a form of AA) affect memory access when it is turned off?
 
Rev, enjoy your break, have fun with your family. I hope to see you back here soon, helping put things in perspective for the more educated gamer.
-------
Icrontic seems to have a "big picture" review, along the lines of FS'. Quite a few reviews were kind to Parhelia, taking into account its added features and being cautious with their final recommendations. Here's the problem: benchmark numbers are merely backup data to support your hypotheses. Many of the bigger review sites simply base their conclusions on the benchmark data, relying more on automated tests than personal play. It's easy to see which is which, which is why I root for Reverend's quick return. (Hooray for alliteration! ;) )

The issue with focusing on benchmarks is this: you're judging not only current "playability," but, more importantly, longevity. A card that scores an "unnecessary" 60fps above the mainstream 60fps minimum will just be able to use that extra speed for either extra features or newer, more demanding games. As unpleasant as it is, price/performance is an issue for most of the people who have the time and inclination to frequent game-hardware-oriented websites. For professionals, the extra cost of a Parhelia is justified by its deservedly premium image (both consumer-consciousness- and display-wise).

Shave $100 off a Parhelia, and you'll be left with a GF4Ti and its associated unreliable 2D image quality, lack of triple head, lack of gigacolor, lack of 16xFAA, lack of glyph AA, etc., etc. The Parhelia can only get better. Those who spend long periods of time staring at text will find the extra scratch for the Parhelia, just as they found the extra scratch for a flat-screen 19"+ CRT.

Hellbinder: 16xFAA certainly counts as better IQ than the competition. And "unplayable" is from your FPS POV. Flight and racing sims, strategy games, RPGs--all those will run more than acceptably at around 60fps with FAA. Don't forget that, just as with the 8500, Parhelia's drivers have a long way to go to garner default developer support in an nVidia-centric world.

Lil' Penny: SV does filter textures, as it's nothing but a fancy version of SS.
 
Althornin,

My bad. I didn't realize you were correcting Chalnoth's typo. Of course it should have read OGMS in my former post too.

I thought you were implying that it's been marketed by NVIDIA as 4x9, hence my reply. It's not even present in the current drivers, for what it's worth, heh.

A misunderstanding; I apologize 8)
 
Nagorak said:
Brent said:
EgonOlsen said:
Maybe instead of Average FPS we need to start focusing on the Minimum FPS in a game or benchmark. The higher the Minimum FPS the better. The game will be smooth throughout gameplay.

There's nothing to show that Parhelia has a higher "minimum fps". This is a perfect example of just grasping at straws to support this product. When it's running almost half as fast as a GF4, I guarantee you that its minimum FPS is going to be lower. All the marketing nonsense about how this card "doesn't slow down" in complex situations turned out to be just BS. At 1600x1200 it's too slow to be playable without AA/aniso... So what if it only takes a 30% performance hit when they're on, it will still be running that much slower. Maybe this is a good card if all you play is the Tomb Raider series of games. :rolleyes:

I totally agree. If the average fps is low, it is very likely the minimum fps will be low, unless the low average is due to vsync (as has happened with many Radeon 8500 reviews regarding UT). In fact, I have proof right here:

[Serious Sam framerate graph at 1024x768: ssam-1024.gif]

There's no sign of Matrox having any higher minimums, or lower standard deviation in framerate. Well, on the 1600x1200 graph on the same page, Matrox is a bit more steady, but its minimums are still far below the competition's.

Also, just looking for minimums is not a good idea. What if a video card spilled over its texture load routine into the first few frames of a benchmark? It would have a horrendous minimum framerate, but the playability is still there for the rest of the game without any dips. Also, a card that frequently dips to a given minimum would score the same as one that only touches it once. What would be better is to throw out say 1-5% of the slowest frames, and then take the minimum framerate of what's left.

Either that or include a standard deviation with the average framerate, but we all know Joe Blow would be thinking, "What a geeky website! WTF is this deviation thingy? I need a nap."
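
To make the trimmed-minimum (and standard deviation) idea concrete, here's a quick sketch; the framerate numbers, the 2% cut-off, and the helper names are all made up for illustration:

```python
import random

random.seed(0)
# Hypothetical per-frame framerates from a benchmark run: mostly ~60 fps,
# with two one-frame dips at the start (say, a texture upload spilling over).
frame_rates = [60 + random.uniform(-3, 3) for _ in range(200)]
frame_rates[0] = 11.0
frame_rates[1] = 14.0

def trimmed_minimum(rates, discard_fraction=0.02):
    """Throw out the slowest 1-5% of frames, then take the minimum of what's left."""
    ordered = sorted(rates)
    discard = int(len(ordered) * discard_fraction)
    return min(ordered[discard:])

def mean_and_std_dev(rates):
    """Average framerate plus the 'geeky' standard deviation."""
    mean = sum(rates) / len(rates)
    variance = sum((r - mean) ** 2 for r in rates) / len(rates)
    return mean, variance ** 0.5

print("raw minimum:     %.1f" % min(frame_rates))              # dragged down by the two spike frames
print("trimmed minimum: %.1f" % trimmed_minimum(frame_rates))  # ignores them
print("avg / std dev:   %.1f / %.1f" % mean_and_std_dev(frame_rates))
```

Throwing out a couple of percent of the worst frames keeps a one-off texture-upload hiccup from dominating the "minimum," while the standard deviation still flags a card that oscillates from frame to frame.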
 
Mintmaster said:
Either that or include a standard deviation with the average framerate, but we all know Joe Blow would be thinking, "What a geeky website! WTF is this deviation thingy? I need a nap."
Joe Blow is more likely to think "Standard deviant?" Then report the site as supporting "deviant behavior" :)
 
Althornin said:
what?
Explain further, please?
How does multisampling (a form of AA) affect memory access when it is turned off?

MSAA requires multiple z-checks per pixel pipeline to operate. When FSAA is disabled, these extra z-checks (with a little bit of cache, obviously) can be used for early z rejection, and are especially helpful when the scenes are very complex (polygon-based grass would be one such situation).

Of course, the existence of MSAA doesn't guarantee the use of early z rejection, but it does make it almost trivial for the hardware to implement.
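
As a toy illustration of why those spare z-checks matter (this is a software sketch of early z rejection in general, not of how Matrox or anyone actually wires it up in hardware), note how the back layer of "grass" never reaches the shading/texture-fetch stage:

```python
# Toy illustration of early z rejection. shade() stands in for the expensive
# per-pixel work: texture fetches, i.e. the memory traffic you'd like to skip
# entirely for hidden fragments in a complex scene.

WIDTH, HEIGHT = 4, 4
z_buffer = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
frame_buffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]
shaded_fragments = 0

def shade(x, y):
    global shaded_fragments
    shaded_fragments += 1            # pretend this is a bundle of texture reads
    return (x * 16, y * 16, 128)     # dummy colour

def rasterize(fragments, early_z=True):
    for x, y, depth in fragments:
        if early_z and depth >= z_buffer[y][x]:
            continue                 # rejected before any texture access happens
        colour = shade(x, y)         # the expensive part
        if depth < z_buffer[y][x]:   # conventional (late) depth test
            z_buffer[y][x] = depth
            frame_buffer[y][x] = colour

# Two full-screen layers of "grass": the back layer is entirely hidden by the front one.
front = [(x, y, 0.2) for y in range(HEIGHT) for x in range(WIDTH)]
back = [(x, y, 0.8) for y in range(HEIGHT) for x in range(WIDTH)]

rasterize(front)
rasterize(back)
print("fragments shaded with early z rejection:", shaded_fragments)  # 16 instead of 32
```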
 
Mintmaster said:
Also, just looking for minimums is not a good idea. What if a video card spilled over its texture load routine into the first few frames of a benchmark? It would have a horrendous minimum framerate, but the playability is still there for the rest of the game without any dips. Also, a card that frequently dips to a given minimum would score the same as one that only touches it once. What would be better is to throw out say 1-5% of the slowest frames, and then take the minimum framerate of what's left.

Either that or include a standard deviation with the average framerate, but we all know Joe Blow would be thinking, "What a geeky website! WTF is this deviation thingy? I need a nap."

No, I think looking for minimums is a very good idea, and horrendously-low spikes should definitely be considered (but other, more stable minimums are just as important).

Obviously, terrible performance spikes are generally a result of poor texture management, and are at least as much a problem as lower overall performance. More important than the actual minimum framerate under these circumstances is the frequency of such drops. If they happen very often, it's a very bad thing, especially if you're talking about a 128MB video card... (Not that I'm saying anybody's ever seen one in the Parhelia...there's no evidence for poor texture management yet...).

More-or-less sustained minimum framerates (wider dips than a frame or two...) are absolutely everything when you're trying to discern playability.

This is also why framerate graphs are definitely a good thing. You want to see relatively stable framerates (no frame-to-frame oscillations of framerate...as we saw with the Rage Fury MAXX, if I remember correctly), and wide dips are of utmost importance.

The frequency of spike dips would be more meaningful in a very long framerate graph (30 mins+).
 
You know, I do think that the Parhelia is interesting. And at 300MHz it would be truly awesome.

I guess what really irritates me is the fact that if this were ATI's latest offering you would all be ripping it to shreds, as would every review site on earth. No one thinks that the 8500 is better than the GF4 because it has better, smarter aniso. Instead it is called a driver hack, controversial, or not even looked at. Matrox led people to believe this would be a true gamer's card. They even stated that "of course it's faster than a GF4" on more than one occasion.

Here we have FAA that is not totally working right, not 100% compatible, and gives Parhelia the win with sub-50 FPS (much more sub than that; I am being kind here). Yet we are all supposed to give them a break, be nice, look at the bright side??? Etc., etc., etc. It's not fair at all if you ask me.

If they had positioned this as a workstation card with a little gaming on the side, this would be an entirely different matter.
 
Hey, I rip into the lack of completeness of ATI's features just as much as I've been ripping into Matrox's lack of completeness in FAA.

Of course, it should be obvious that the shortcomings of ATI's aniso are a hardware issue, while whether the shortcomings of Matrox's FAA are hardware or software is up in the air at the moment.

And yes, I also tend to at least mention nVidia's cards' shortcomings as well; though only one has ever bothered me (the compressed texture issue), I've managed to get around or solve every one of them in the games I play (including the TC issue).

So far, we have no evidence of true hardware shortcomings on the part of the Parhelia, in comparison to today's video cards, except perhaps their anisotropic implementation (that is, the lack of high-degree aniso).
 
BoardBonobo said:
LittlePenny said:
...
-its 10bit color does not have enough Alpha bits for a 4 pipe card

Again this is a limitation Carmack spoke of that I wouldn't mind hearing more about.
...

I think the limitation is a current DoomIII design issue and concerns the use of stencil buffers. The same issue (but not the same reason) that meant Truform couldn't be used.

It actually doesn't relate to the stencil buffers. In any mode Parhelia will render 10-bit color, but only when Gigacolor is enabled will 10 bits be written to the frame buffer. Storing only 2 bits of alpha is a problem if the blend function tries to use the frame buffer's alpha value. Not many games do this, so there generally aren't any problems using a 2-10-10-10 frame buffer. Carmack obviously plans to support these problematic blending modes in Doom3, so he doesn't like it.
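
A rough sketch of the bit budget involved (the field order below is just illustrative, not necessarily Parhelia's actual layout): the colour channels keep 1024 levels each, but any alpha written to a 2-10-10-10 buffer survives with only four levels, which is what breaks destination-alpha blend modes:

```python
# Illustrative packing of a 2-10-10-10 pixel: 10 bits each for R, G, B and 2 bits for alpha.
# The exact bit layout is arbitrary here; it's the alpha bit budget that matters.

def pack_2_10_10_10(r, g, b, a):
    """r, g, b, a are floats in [0, 1]."""
    r10 = round(r * 1023)
    g10 = round(g * 1023)
    b10 = round(b * 1023)
    a2 = round(a * 3)        # only four representable alpha values: 0, 1/3, 2/3, 1
    return (a2 << 30) | (r10 << 20) | (g10 << 10) | b10

def unpack_alpha(pixel):
    return ((pixel >> 30) & 0x3) / 3.0

# Any blend mode that reads destination alpha back from the frame buffer sees this:
for a in (0.1, 0.25, 0.5, 0.9):
    stored = unpack_alpha(pack_2_10_10_10(0.5, 0.5, 0.5, a))
    print("wrote alpha %.2f -> read back %.2f" % (a, stored))
```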
 
May I remind you that the R200 did not have Smoothvision operational for quite some time after launch, just garden-variety OGSS? And don't anyone tell me that it has always been 100% trouble-free; its issues have been pointed out here already.

I haven't seen anyone yet point out that comparing completely different algorithms doesn't exactly make the comparison fair.

R200 = max 4x supersampling

NV25 = max 4x multisampling (valid against SSAA only with anisotropic enabled)

Parhelia = 16x edge AA (4x the samples of the other two, and antialiasing only edges)

Pick 2 in any combination of the 3 cards above and I will have to object to each and every one of them.

That doesn't change much either when you set the NV25 against the R200 in terms of anisotropic filtering. Each algorithm has its positives and its negatives. Sad but true: no card can have it all, as none is perfect either.

Finally, if Parhelia's raw performance is due to very shaky drivers at the moment, the picture can change by a lot, provided that Matrox sets out to optimize them first. But if the card has hardware flaws, then I'm afraid it looks really bad.

Neither the fillrate numbers make any sense so far, nor its linear performance across resolutions in DroneZ (as another example), and definitely not its rather low performance in quad-textured games like Serious Sam.

Why would it be awesome at 300MHz? Why is clock frequency suddenly so important? If it were that important, then the R300 should have one underwhelming aspect too.
 
So far, we have no evidence of true hardware shortcomings on the part of the Parhelia, in comparison to today's video cards, except perhaps their anisotropic implementation (that is, the lack of high-degree aniso).

Why do I have the impression that it's limited to 2x aniso at the moment due to another driver quirk?
 
Ailuros said:
Pick 2 in any combination of the 3 cards above and I will have to object to each and every one of them.

The FSAA implementations of the Radeon and Parhelia are currently buggy (The Radeon 8500's will probably not be fixed, while I'd say there's a greater than 50% probability that the Parhelia's will).

The FSAA implementations of the GeForce3/4 work exactly as advertised, with the only real drawback being taken care of by games using alpha blends (Note that I'm talking about base 2x and 4x...I don't personally like Quincunx...).

While it could be said that there should be little difference between a problem caused by bugs and one that is simply a result of the implementation, I beg to differ. For one, bugs are hard or impossible for game developers to work around: some game situations will simply expose them, while others won't.

By contrast, drawbacks that are a result of the implementation are much more predictable, and therefore much easier to work around. It's just a nice plus that the "workaround" for the alpha + MSAA issue (using alpha blends instead of alpha tests) turns out to improve image quality on all video cards, and appears to be gaining widespread support from game developers.
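
For anyone wondering what that workaround looks like in practice, here's a toy fragment-level sketch (the helper names and values are made up, not any real API): the alpha-test path produces a hard keep/discard edge inside the triangle, which edge-based AA like MSAA or FAA never touches, while the alpha-blend path fades the edge on its own:

```python
# Toy comparison of alpha test vs alpha blend for a foliage-style texture whose
# alpha fades from 1.0 to 0.0 across the edge of a leaf.

ALPHA_REF = 0.5

def alpha_test_path(src_rgb, src_alpha, dst_rgb):
    """Alpha test: each fragment is either fully kept or discarded, so the visible
    edge is a hard step in the middle of the triangle. Edge-only AA (MSAA, FAA)
    only touches polygon edges, so it never smooths this step."""
    if src_alpha < ALPHA_REF:
        return dst_rgb                # fragment discarded, background shows through
    return src_rgb

def alpha_blend_path(src_rgb, src_alpha, dst_rgb):
    """Alpha blend (SRC_ALPHA, ONE_MINUS_SRC_ALPHA): the edge fades gradually,
    which is why switching to blends doubles as an image-quality win everywhere."""
    return tuple(round(s * src_alpha + d * (1.0 - src_alpha), 3)
                 for s, d in zip(src_rgb, dst_rgb))

leaf, sky = (0.1, 0.6, 0.1), (0.5, 0.7, 1.0)
for alpha in (0.9, 0.6, 0.4, 0.1):    # walking across the leaf edge
    print(alpha, alpha_test_path(leaf, alpha, sky), alpha_blend_path(leaf, alpha, sky))
```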

If Matrox can indeed solve the FAA problems in drivers, I will say it's good. Until then, I consider it very little more than useless. One thing that does bode well for FAA is that Quake3 appears to show no issues with it. It stands to reason that other games should therefore be fixable as well with updated drivers. It would be nice....
 
Althornin said:
Smoothvision DOES help texture aliasing. Just like any Supersampling form of FSAA.
I could have sworn that in that old AA article Dave and Kristof wrote for 3dfx, they mention supersampling does not help with texture aliasing.
 
I'm taking leave from writing articles/reviews/what-nots; I won't be taking leave from complaining when I feel like it :)

I'm not saying performance should be downplayed - I'm not disputing the fact that performance will always be the top criterion (it is!). My beef is with almost all sites making a lame attempt at a shootout (with inconsistencies in how they perform their shootouts, like unequal settings amongst different cards that have different ways of doing what appears to be similar technology, giving the viewer the illusion that the same technology is being compared), passing it off as a review, and then stressing how important it is for a new card to be better than or "be in the running" with the currently available ones. That is the wrong way to inform the public about why the new card should be a purchase consideration. That is not a review. That is a shootout, about which is fastest or very nearly the fastest, plain and simple.

A minimum of 60fps is plenty good enough for ALL games. Saying it's not enough means you have fallen into the drugged mindset. Yes, it depends on the resolution, and a reviewer should include at least 1600x1200 numbers. You give the cautionary message that it may not be good enough for some upcoming games, but you also need to consider whether whatever upcoming games you have in mind will be bought by the majority (I doubt everyone, or almost everyone, will buy DOOM3 regardless of how good it will look). Just because you're a reviewer who likes games that give you the ability to benchmark them doesn't and shouldn't mean that they are or will be important games to consider in a video card review.

The comment on minimum framerate is an important one, especially so in a shootout. I started out wanting to do this in F1 2001 in B3D's forthcoming GeForce4 Ti 4200/4400/4600 shootout but I honestly forgot.

The comment on the possibility that the Parhelia just ain't that good: that's a subjective matter that depends on what is meant by "good". "Ain't that good" is different from "Ain't good enough". At this point in time, having read the reviews and not having a Parhelia to experience on a first-hand basis, I'm likely to subscribe to the former ("ain't that good")... yes, the numbers speak for themselves even if I discard the scores of the other cards it is being compared with. I'll withhold any "Ain't good enough" judgement for now, however. You know how it is... drivers, different games, etc.

I don't care if a particular card "gets smeared" by other cards performance-wise if the card already gets at least 60fps. If it gets me 60fps minimum with everything it is capable of IQ-wise turned on (and remember, we may have different technologies at work here with the various cards, all trying to approach the same end result), it is a contender for my money. It is then that I try to weigh in any extras that card may have over its competitors.

My first video card reviews were of the Voodoo3s and the original Quadro (two to three years ago!). The Voodoo3 reviews were of a quality that I am now too embarrassed to talk about. The Quadro review, while not of particularly good quality either, was an eye-opening experience (in reviewing it, that is). I received emails from regular folks saying how informative that review was. And then I received emails from developers because of that review. That was a real surprise to me. Then I got similar reactions (from the public and from developers) with my series of Voodoo5 reviews and articles. They appreciated the stuff that rarely gets mentioned on other sites. I'm not championing myself... I'm merely stating that reviewers need to start doing things differently for the sake of providing real, relevant information. Time constraints and the need to be first out the door with a review... the first is unavoidable, the second isn't. Ah, if only everyone thought alike and talked to each other for the sake of the public (isn't that what a website is for, unless the website is a business?).

All in my humble opinion of course.
 
Reverend, awesome post man, that brought a tear to my eye ;)

I agree, as long as the minimum FPS is around 60 with all features on then I'm happy. That's the goal: high res, features on, game settings maxed, a minimum of 60fps... mmm, that would be some good gaming....
 
Chalnoth said:
Why do you think it is capable of achieving GF3/4 level? I see no evidence yet that supports that...got any info on the maximum degree of anisotropy that the hardware supports?
ATM, Matrox is indicating that it's either just a driver bug, or it's been limited for performance, and they may enable it later.

the alpha test issue is more or less a mute point
Moot point! Moot! :p
 