Matrox working on new GPU

BrynS

X-bit Labs report that Matrox is apparently developing a 90nm GPU using the memory interface and controller IP of Mosaid Memorize.

[...] “Mosaid Memorize IP was the best choice to meet specific price-performance targets we set for future 90nm graphics processing units,” said David Chiappini, director of ASIC engineering, Matrox. “Mosaid’s complete solution allowed us to reduce risk and time to market through the use of proven third party IP, allowing us to complement our own memory interface specialists and tape out more products in a shorter period of time”. [...]
As mentioned in the X-bit Labs piece, none of Matrox's "current" designs comply with Windows Vista Premium (SM2.0) requirements. Is this likely to be a baseline Parhelia revamp for Vista Premium compliance or might Matrox pursue something a bit more ambitious (<=SM4.0?), albeit in low-end form?
 
I so much wanted the Parhelia to be something special.

I do hope Matrox realise (if they are going for the high end)

they have to match the big 2 on price
and beat them on performance.

Anything less and they might as well not bother.

The Parhelia at 2x the price of a Ti 4200 and half the performance, I mean who in their right mind would think this was a good idea....
 
I so much wanted the Parhelia to be something special.

I do hope Matrox realise (if they are going for the high end)

they have to match the big 2 on price
and beat them on performance.

Anything less and they might as well not bother.

The Parhelia at 2x the price of a Ti 4200 and half the performance, I mean who in their right mind would think this was a good idea....

Well, it was a good card for design... And... I'm sure it was good at something else!
 
Actually, the original Parhelia was a bug-ridden mess. There was even a really bad hardware issue with overlays, I believe. AND, after initially touting VS2.0 support, they disappeared that from the spec sheet ~1 yr later. The card can hardly run Doom3 either; I think an R200 would outperform it. Never mind the bad OpenGL driver issues I read about that were preventing the card from working right with the game for quite a while. LOL.

The only people I think really fell for those cards were Matrox loyalists who were blinded by, well, brand loyalty. Their old standby of superb analog quality was almost a moot point by then too, cuz NV and ATI had that down almost 100% in 2002.

I'm not really sure what to expect from Matrox these days. I believe they lost the bulk of their talent from even the Parhelia days. And it's not like they can be raking in cash to really dump $$ into the complex R&D required for a high-end modern card. It would be neat for sure if they could do something, but they are so out of touch with the mainstream gaming enthusiast market that I really doubt this will turn into much.
 
I just don't see Matrox having any of the required skills, talent, time, or money to do R&D or drivers. Their best option would be to license IP from ImgTech/PowerVR.
 
Parhelia sounded incredible on paper with 4 TMUs per pixel pipeline, but it was ultimately a dud.


It was the first consumer GPU to have a 256-bit memory bus, so it does deserve a place in history.

I wouldn't be too hopeful about a new Matrox GPU. They just don't have the resources to compete with the giants.
 
The big problem with Parhelia was that it wasn't followed 6 months later by Parhelia II.

But then that was always why Matrox was going to get their clock cleaned: they couldn't keep up with the speed at which others were revving/improving their tech.
 
Isn't Matrox still popular among multiple-display users?
Couldn't they just make a card for that market which runs Aero?
 
Nvidia and ATI have been competing so fiercely that no other graphics company can even get close to what they have achieved. Matrox (and everyone else besides ATI/Nvidia) needs to get their driver act together.
 
Parhelia sounded incredible on paper with 4 TMUs per pixel pipeline, but it was ultimately a dud.


It was the first consumer GPU to have a 256-bit memory bus, so it does deserve a place in history.

I wouldn't be too hopeful about a new Matrox GPU. They just don't have the resources to compete with the giants.


It was not really that good on paper either. What happened is that when the paper specs were listed, everyone started to freak out over this new partial DX9 card with displacement mapping and a 256-bit bus, and they all forgot to take into consideration the lack of occlusion culling, which is what effectively killed it in 3D. It was an important paper spec (or lack thereof) which most people didn't notice.
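To make the point above concrete, here is a toy software sketch (not Parhelia's actual hardware, and the quad/pixel setup is purely illustrative) of what early occlusion/depth rejection buys: fragments hidden behind already-drawn geometry can be discarded before any expensive texture work is done.

```python
# Toy illustration of early depth/occlusion rejection. Each "quad" is
# (depth, coverage), where coverage is the set of pixels it touches.

def draw_quads(quads, early_reject):
    """Return how many fragments reach the (expensive) texturing stage."""
    zbuffer = {}   # pixel -> nearest depth seen so far
    textured = 0
    for depth, coverage in quads:
        for px in coverage:
            if early_reject and zbuffer.get(px, float("inf")) <= depth:
                continue  # occluded: skip texturing entirely
            textured += 1  # texture/shade this fragment
            if zbuffer.get(px, float("inf")) > depth:
                zbuffer[px] = depth
    return textured

# Two full-screen quads over a 100-pixel screen, drawn front-to-back.
screen = set(range(100))
quads = [(1.0, screen), (2.0, screen)]  # nearer quad first

print(draw_quads(quads, early_reject=False))  # 200 fragments textured
print(draw_quads(quads, early_reject=True))   # 100: back quad rejected
```

Without early rejection, every covered fragment burns texture bandwidth even when it will lose the depth test anyway; with lots of TMUs per pipeline, that wasted work is multiplied.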
 
Matrox (and everyone else besides ATI/Nvidia) needs to get their driver act together.
Drivers certainly seem to be Matrox's weakest point. I bought a P650 for our home office computer because I wanted a passively cooled, low-power-consumption unit that would give very good 2D quality; the only other requirement I asked of it was that it would be able to run Real Arcade games to appease the wife. You can pretty much guess what I'm going to say... it wouldn't even run RA games, let alone anything vaguely 3D. The only program that it actually managed to run properly was (and here's no surprise) 3DMark2001, where it ran better in Software TnL mode than Pure Hardware:

http://service.futuremark.com/compare?2k1=8696054 - Software TnL
http://service.futuremark.com/compare?2k1=8696029 - Hardware TnL

It also ran 3DMark03, amazingly enough, but the ORB system won't permit the score to be published:

Code:
3DMark Score	407 3DMarks

Game Tests
GT1 - Wings of Fury	29.4 fps
GT2 - Battle of Proxycon	2.3 fps
GT3 - Troll's Lair	2.3 fps
GT4 - Mother Nature	Not Supported

Feature Tests
Fill Rate (Single-Texturing)	281.3 MTexels/s
Fill Rate (Multi-Texturing)	794.2 MTexels/s
Vertex Shader	2.6 fps
Pixel Shader 2.0	Not Supported
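As a back-of-envelope check on those fill-rate figures: peak texel fill rate is just clock × pipelines × TMUs per pipeline. The 4 TMUs per pipeline and ~220MHz core come from this thread; the pipeline count of 4 for the full Parhelia is an assumption here, and the P650 benchmarked above is a cut-down, lower-clocked derivative, so its numbers will be lower still.

```python
# Back-of-envelope peak multi-texturing fill rate:
# core clock (MHz) * pixel pipelines * TMUs per pipeline.

def peak_fill_rate_mtexels(core_mhz, pipelines, tmus_per_pipe):
    """Theoretical multi-texturing fill rate in MTexels/s."""
    return core_mhz * pipelines * tmus_per_pipe

# Assumed full-Parhelia configuration: 4 pipelines x 4 TMUs at 220MHz.
parhelia_peak = peak_fill_rate_mtexels(220, 4, 4)
print(parhelia_peak)  # 3520 MTexels/s on paper

# The P650's measured 794.2 MTexels/s is a small fraction of that:
measured = 794.2
print(round(measured / parhelia_peak * 100, 1))  # 22.6 (% of assumed paper peak)
```

Real workloads never reach the paper peak anyway, which is part of why the "incredible on paper" specs translated so poorly into benchmarks.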
 
I don't know if I am one of those Matrox fanboys that Swaaye talked about, but I am still going to comment on this thread.

So much talk and so few actual facts about Parhelia.

Here are some raw facts about why it wasn't what it was supposed to be:
A) Parhelia was 8-10 months late. (It was supposed to come out about a year before R300, or if you want to see it from the Nvidia point of view, 4-6 months after GeForce4.)
B) It never reached the originally designed speeds. (Original clock speeds were planned to be 275MHz to 300MHz; it was released at 200-220MHz.)


And yet, it's still the only card in the prosumer market space supporting:
A) three displays
B) ...of which 2 have a hardware video layer.

And that's the thing keeping them alive: if you need those features, you really don't have any other options.
About the article/quote in the opening post... I might be wrong here, but I have a feeling that I already saw that comment/press release years back, around the time when they switched the original Parhelia core to a revision that had the worst faults fixed and got a small speed bump, so I would not be jumping the gun here. It is still true that they need to work out a new core if they want to stay on the niche boat. BUT then again, their latest market developments have pointed more in the direction of multimonitor products not dependent on their own chips.

Again, he who lives will see...
 
B) It never reached the originally designed speeds. (Original clock speeds were planned to be 275MHz to 300MHz; it was released at 200-220MHz.)
The second revision (AGP 8x) was about 20% faster and as I remember, comparable to GF4Ti (pure performance). I didn't find any review containing some game benchmarks with AA. I think Parhelia AGP 8x with FAA 16x could be even faster than FX5800/5900 with FSAA 8x...
 
The second revision (AGP 8x) was about 20% faster and as I remember, comparable to GF4Ti (pure performance). I didn't find any review containing some game benchmarks with AA. I think Parhelia AGP 8x with FAA 16x could be even faster than FX5800/5900 with FSAA 8x...
The clock difference was 30MHz for the core and memory (220MHz -> 250MHz; the bulk version with 200MHz clocks was dropped from sale at the same time). Also, several users at MURC tested its overclockability, and even with stock cooling it reached the originally planned clocks. With the new memory controller also helping memory bus efficiency, overclocking made a bigger difference than on the original.

Depending on the game, the second/new revision Parhelia was faster than a Ti4400/Ti4600; the more texture layers, the bigger the difference in Parhelia's favour. FAA, when it worked, was quite powerful and the quality was top notch. Then again, even though the problem with stencils was fixed in the new revision, other aliasing problems arose, so the new revision didn't make much difference for AA.


...but of course you can't say all this aloud, or you will be stamped as a blind Matrox fanboy for the rest of your life. (Nothing comes for free, as they say...)
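A quick sanity check on the numbers in this exchange: the revision's clock bump alone doesn't account for a ~20% performance gain, so the remainder would have to come from the reworked memory controller's better bus efficiency claimed above.

```python
# Clock bump quoted in the thread: 220MHz -> 250MHz on core and memory.
old_mhz, new_mhz = 220, 250
clock_gain_pct = (new_mhz - old_mhz) / old_mhz * 100
print(round(clock_gain_pct, 1))  # 13.6 (% clock increase)

# So a ~20% overall performance gain can't come from clocks alone;
# improved memory bus efficiency would have to supply the rest.
```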
 
I myself have a nice collection of Matrox cards prior to Parhelia: G400 MAX, G200, Mil2, Mystique 220. I was a real Matrox buff until they couldn't get a new board out for 2 years (instead bringing out rip-off crippled G400s) and got stomped utterly in game performance. Then Parhelia showed up at a monster price, seemingly designed for some niche willing to pay big time to run 3 monitors. The drivers were and are horrid. If the thing hadn't had triple head I don't know what M woulda done with that card.

It was never even remotely a worthwhile games card. It couldn't reliably beat the older GF4Ti, and then the 9700 totally slaughtered it a few months (2?) after it launched.

It's not even a worthwhile pro card, if you look around at the benchies. Radeon 9000s were beating it up.

BTW, I have an account at MURC I started in 2000. I was really active on there in the G400 days.
 
The second revision (AGP 8x) was about 20% faster and as I remember, comparable to GF4Ti (pure performance). I didn't find any review containing some game benchmarks with AA. I think Parhelia AGP 8x with FAA 16x could be even faster than FX5800/5900 with FSAA 8x...


That would be really hard to do without occlusion culling.
 
I believe Matrox was also the first graphics company to build their drivers around .NET. Anyway, I remember the whole Parhelia fiasco, and it's a shame it didn't work out for Matrox. They are never going to compete in the 3D market again, but hopefully they can release a nice budget card that will allow them to actually make sales when Vista comes out and tie in with their video products.
 
They are never going to compete in the 3D market again, but hopefully they can release a nice budget card that will allow them to actually make sales when Vista comes out and tie in with their video products.

Budget and Matrox? Maybe from the performance point of view, but not the price. Anyway, I expect that they will only do enough to get a "Ready for Vista" logo.
 