Parhelia: "Five pixel shader units per pixel pipe?&quot

Pete

(I asked this in a news post, but I haven't rec'd a reply yet, so I'm trying again.)

I'm reading an interview with Matrox over at Gamer's Depot, and this struck me as unusual:
5 pixel shader units per pixel pipe
I was under the impression that current video cards have one pixel shader per pipeline--are you telling me Parhelia has 4x5=20 vs. 8500's/GF4Ti's 4x1=4, or is my current impression wrong? Or is he getting fancy with his definitions?

BTW, do Parhelia's pixel shaders have comparable range to the 8500's? I'm thinking of that globe picture from Tech-Report, showing the 8500's shading as more pronounced because of greater range. Does this extended range translate to better IQ in games--is Morrowind's water more "ripply" with the 8500?
 
Re: Parhelia: "Five pixel shader units per pixel pipe?"

Pete said:
(I asked this in a news post, but I haven't rec'd a reply yet, so I'm trying again.)

I'm reading an interview with Matrox over at Gamer's Depot, and this struck me as unusual:
5 pixel shader units per pixel pipe
I was under the impression that current video cards have one pixel shader per pipeline--are you telling me Parhelia has 4x5=20 vs. 8500's/GF4Ti's 4x1=4, or is my current impression wrong? Or is he getting fancy with his definitions?

BTW, do Parhelia's pixel shaders have comparable range to the 8500's? I'm thinking of that globe picture from Tech-Report, showing the 8500's shading as more pronounced because of greater range. Does this extended range translate to better IQ in games--is Morrowind's water more "ripply" with the 8500?

I think that means that Matrox's hardware can do 5 pixel shader operations per pipe per cycle, i.e. 20 per clock. The Radeon 8500 and GeForce3/4 only do 2 per cycle per pipe, hence 8 per clock. However, remember that Parhelia's clock rate is much lower, so it's something of a trade-off. Also, shaders with only a few instructions will leave much of that hardware idle.
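To put rough numbers on that trade-off, here's a quick back-of-the-envelope sketch in Python. The per-pipe op counts are the ones discussed above; the core clocks (roughly 220 MHz for Parhelia, 300 MHz for a GF4 Ti 4600) are my own ballpark assumptions, so treat the totals as illustrative rather than official specs.

Code:
# Back-of-the-envelope theoretical pixel shader op throughput.
# Per-pipe op counts are from the discussion above; the clock speeds
# (~220 MHz Parhelia, ~300 MHz GF4 Ti 4600) are assumed ballpark figures.

cards = {
    "Parhelia":    {"pipes": 4, "shader_ops_per_pipe": 5, "clock_mhz": 220},
    "GF4 Ti 4600": {"pipes": 4, "shader_ops_per_pipe": 2, "clock_mhz": 300},
}

for name, c in cards.items():
    ops_per_clock = c["pipes"] * c["shader_ops_per_pipe"]
    ops_per_sec = ops_per_clock * c["clock_mhz"] * 1_000_000
    print(f"{name}: {ops_per_clock} shader ops/clock, "
          f"~{ops_per_sec / 1e9:.1f} billion ops/sec theoretical peak")

Even with the lower clock, the wider per-pipe shader array comes out ahead on paper; the catch, as noted above, is that short shaders won't keep all five units busy.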
 
Interesting--Parhelia is even more powerful than I thought. I suppose it is true that it's not the card that's limited, but the games/benchmarks used to "test" it.
 
Actually, the Parhelia has 4 texture units per pipe too. So technically it can do (4+5)x4 = 36 operations per pass, which is why they call it a 36-stage pixel shader...
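For the record, a tiny sketch of that arithmetic (the unit counts are the ones from the post above; nothing here is official Matrox data):

Code:
# Stage count as described above: 4 texture units + 5 shader stages per pipe.
pipes = 4
texture_units_per_pipe = 4
shader_stages_per_pipe = 5

stages_per_pipe = texture_units_per_pipe + shader_stages_per_pipe  # 9
total_stages = stages_per_pipe * pipes                             # 36
print(total_stages)  # 36 -- hence the "36-stage pixel shader" marketing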
 
Pete said:
Interesting--Parhelia is even more powerful than I thought. I suppose it is true that it's not the card that's limited, but the games/benchmarks used to "test" it.

I'm sorry, but these sorts of statements are just sheer nonsense. Yes, the games are flawed, causing Parhelia to look bad.

What were you planning on using the card for, playing a Matrox custom demo and no games? :rolleyes:
 
What can I say? Given the specs Parhelia appears to be packing, I must conclude that the reason it's not cleaning up is that current games haven't been designed with Parhelia in mind. It's like using the wrong measuring tool. Obviously I would want Parhelia to perform well in both current and future games, but I suppose the core clock disparity will prevent that.

I accept that Parhelia is a disappointment in current games, as every single review has shown. But I do think that it may perform better in future games that use more than the standard nVidia feature set (4x2, which ATi was forced to adopt because no one was exploiting their extra Radeon texture unit). I also expect its drivers to improve (hopefully as significantly as the 8500's), as a few benchmarks left me scratching my head (such as its 49.5fps across-the-board performance in Ultima IX).

Call me an optimist, or ignorant, but until I see future games benchmarking as poorly, I'll conclude it's more a case of unoptimized games and drivers. If I do see future games performing poorly, then I'll have to admit that a lack of occlusion culling and subpar engineering (software and/or hardware) are the culprits. Just don't call me a fanboy, as your sarcastic post seems to imply.

But, seriously, you think games can't be "flawed" (as benchmarks), as in not optimized to take into account future hardware? Take a look at Morrowind's reverse draw order.
 
I think it's just that the drivers aren't yet optimized. At launch, the G400 had roughly 50% of the OpenGL performance it would eventually attain.

My only worry for Matrox is that once the drivers have matured (which I am pretty confident they will), there will be better hardware out from nVidia and ATI.

I wouldn't want to get anybody's hopes up too much, however. I seriously doubt the current drivers will improve that drastically, but I wouldn't be surprised if the Parhelia eventually managed to outperform the GeForce4 Ti 4600 in a fair number of games (mostly future games that use lots of textures per pixel...though probably not DOOM3, given JC's statements that hardware designers weren't optimizing nearly enough for single-texture performance...).
 
Pete said:
What can I say? Given the specs Parhelia appears to be packing, I must conclude that the reason it's not cleaning up is that current games haven't been designed with Parhelia in mind. It's like using the wrong measuring tool. Obviously I would want Parhelia to perform well in both current and future games, but I suppose the core clock disparity will prevent that.

I accept that Parhelia is a disappointment in current games, as every single review has shown. But I do think that it may perform better in future games that use more than the standard nVidia feature set (4x2, which ATi was forced to adopt because no one was exploiting their extra Radeon texture unit). I also expect its drivers to improve (hopefully as significantly as the 8500's), as a few benchmarks left me scratching my head (such as its 49.5fps across-the-board performance in Ultima IX).

Call me an optimist, or ignorant, but until I see future games benchmarking as poorly, I'll conclude it's more a case of unoptimized games and drivers. If I do see future games performing poorly, then I'll have to admit that a lack of occlusion culling and subpar engineering (software and/or hardware) are the culprits. Just don't call me a <bleep>, as your sarcastic post seems to imply.

But, seriously, you think games can't be "flawed" (as benchmarks), as in not optimized to take into account future hardware? Take a look at Morrowind's reverse draw order.

We already know it performs slower in Unreal Tournament 2003. That's a future game, right? How far into the future do you have to look?

This same argument was used with the original Radeon, and honestly, as much as I liked the card, by the time games came out where it performed faster than the GF2, both cards were too slow to run them decently. So, I'm just saying, don't bet the farm on the "future games" argument (I never bought my Radeon with the expectation of it being faster in future games anyway).

The truth is a video card really needs to run what's out there at the time, because by the time newer stuff is out, the card itself will be totally out of date. It seems to me that the "future games" argument has been used a lot as an excuse for flawed hardware that performed poorly with current games (and not a whole lot better with new ones either).

I'd say there's a chance that bad drivers could explain the problems with the Parhelia. But there's also the chance that the card is just a PoS for some unknown reason. I guess time will tell... The only problem is, unless they fix the drivers in the next couple weeks, they're going to be behind the eightball again, regardless.
 
Nagorak said:
We already know it performs slower in Unreal Tournament 2003. That's a future game, right? How far into the future do you have to look?

I'm trying to say that its performance in games like UT2k3 could increase significantly within the next six months as drivers improve. But even this will likely be too little, too late (and should certainly not influence anybody's buying decisions today...).
 
The GF3 started life slower than a GF2U. The 8500 started out slower than a Ti200. I'm giving Parhelia a chance.

I also don't buy hardware on release--I'd rather wait 3-6 months for driver issues to be ironed out. I also save money on the hardware and on the corresponding games, which will have been patched by then. It's a win-win situation, IMO.

I know UT2K3 is a new game, but as even the demo is unreleased, it may well be as unoptimized as Parhelia's drivers. And one game does not a great card make. Not to mention Anand is the only person currently using the benchmark.
 
Pete said:
The GF3 started life slower than a GF2U. The 8500 started out slower than a Ti200. I'm giving Parhelia a chance.

I also don't buy hardware on release--I'd rather wait 3-6 months for driver issues to be ironed out. I also save money on the hardware and on the corresponding games, which will have been patched by then. It's a win-win situation, IMO.

I know UT2K3 is a new game, but as even the demo is unreleased, it may well be as unoptimized as Parhelia's drivers. And one game does not a great card make. Not to mention Anand is the only person currently using the benchmark.

I wouldn't put too much stock in that benchmark either; it seems like it's either out of date or incomplete. We know that UT2k3 will run on 3dfx cards (no clue how well, though I bet it won't be the greatest), yet they won't run the test. So take it with a grain of salt.

We'll just have to see how Matrox's driver work goes, but I think it makes a lot more sense to expect improvements from their driver team, rather than magically from newer games.
 
Two quick things on that benchmark and 3dfx cards:

1. The post I saw where they were running 3dfx cards had them, I believe, running at 640x480 or below. I'm not sure they'd run acceptably at higher resolutions...

2. The benchmark may make use of features that the 3dfx cards do not support, making any comparison meaningless.
 