Matrox: Launch date official - May 14th.

Joe: god damn, you were faster... :)
just a few seconds faster. :)

well, I deleted another thread. :)

umm, hey, how am I gonna get down from here?? I mean from the roof... :D

I just can't wait! This is great! Another problem is how I am going to be able to keep my mouth shut before launch... so far, every piece has matched this puzzle... :LOL:
 
rumors indicate that Doom3 may not be demoed at E3 with an Nvidia card...

that means Matrox finally got OpenGL right... well, at least better than before
 
rumors indicate that Doom3 may not be demoed at E3 with an Nvidia card...

Well, I've not heard these rumours, but unless they are specifically saying that it's being demoed on a Matrox card, I'd say that 3Dlabs are in with a good shout as well. I don't know how programmable/flexible Matrox's part is, but 3Dlabs certainly goes the furthest towards Carmack's calls so far (higher precision, virtual texturing, programmability).
 
Oh, and based on the teaser graphics and some other rumors at MURC, the official name of the new chip / card will in fact be "Parhelia." ;)
 
Quake 3 was first demoed on ATI hardware. Is it that unthinkable that Doom 3 may be as well?

That's what I find really amusing: it couldn't possibly be one of the fastest and most feature-rich cards on the market, right?

Chances are, from the cards available CURRENTLY, I'm pretty sure the game will look best on the ATI 8500; they've got better shaders and plenty of raw power for polys/texturing. And besides, didn't Carmack's last .plan say he's done the work for the 8500 code path? So it should be good to go.

Just my thoughts on the matter.
 
As exciting as a Matrox launch is, and not to rain on the Parhelia parade, even assuming Parhelia is some uber chip, it took them this long to get it out the door, so it would seem to me that Nvidia and ATI could catch up rather quickly.


Or, Matrox takes a strong lead and gets busy on another chip. As it sits now, though, it seems that their development cycle rivals 3dfx's.
 
<mumble> Who cares about Doom3, when it is demoed, and with what? </mumble>

Parhelia is public in a week. Yeeee-haaaa!!! Some say it's still a month or two to shipping cards, but who cares. YeeeEEEE-HHHAAAAAA!!!!!

Nappe, c'mon, is it really 4 pipes, 4 TMUs each? 256-bit wide DDR? How about them groundbreaking video features? Spit it out now, be a sport. :LOL:
 
GunHead: No Comment yet.

there is too high a risk that some people could get in trouble if my info is right...
of course there is also the risk that I'm in deep trouble if my info is incorrect and I hype it, but I basically already lost all my credibility with Bitboys' Avalanche (no, they are not dead either. Just quiet... ;) ), so that really isn't a good reason...

anyways, maybe I'll drop a few hints during this week.

one thing I can say: I am SOLD! and I want that card ASAP, even though it will cost a small fortune.
 
It is there

Hmm, what should I say... Matrox is back... and they will be responsible for getting some other firms into trouble

hmm, did anyone notice that Nvidia's stock is falling :)

hmm, it is really a pity that a certain small Canadian firm does not have some...

At last, we will see more soon...
 
Chances are, from the cards available CURRENTLY, I'm pretty sure the game will look best on the ATI 8500...and besides, didn't Carmack's last .plan say he's done the work for the 8500 code path?

Well, in Carmack's latest relevant .plan, he basically said the following:
1) Radeon 8500 is a "fine card" for Doom 3
2) There was some technical issue that he felt was holding back Radeon 8500 performance from realizing its potential. He's not sure if it can be worked around or not.
3) He was "guessing" that the GeForce3 would be a "safer buy" for Doom3.

Now, that was quite a while ago, before several driver releases, so who knows if things have changed. However, based on what he said back then, I would conclude that the GeForce4 would be the best of the "current cards" to display Doom3 on.

Though for all we know, ATI will launch the Radeon 8500 successor (R250?) at E3, and maybe it will be displayed on that. ;)

I am hoping that Doom3 is displayed on either the Matrox or 3D Labs part though...because that will give some instant credibility to the GL drivers (as far as gaming goes). The current rumor surrounding Doom3 is that it will be shown on a "multimonitor" rig...add that to the Parhelia triple-head rumors, and the "Doom3 will be shown on Parhelia" rumor is born. ;)
 
Joe,

I'm going to have to disagree. Carmack doesn't really show oodles of enthusiasm in his posts, so if he says it's "fine", I'm guessing that's a strong compliment. I think he was bummed because it didn't reach its full potential, but I think the shading effects on the 8500 will be superior; they are more flexible and capable. The latest ATI betas are supposed to do wonders for OpenGL performance, so I think things might be different.
 
I agree that Carmack saying "it's fine" is indeed a strong compliment. I also agree that Radeon's shaders are more capable and flexible...so does Carmack. The fact that the Radeon didn't do as well as he thought it should based on its increased flexibility is the problem.

The most notorious fix in the recent GL betas is the one for the "GLExcess / high-poly" bug, which is a texturing issue. Carmack is in fact on record saying that particular bug fix did not address the specific technical issue he was speaking of in his .plan. This is not to say that he doesn't have drivers that have significantly boosted Radeon performance; I'm just saying we don't have any evidence that he does.

I own a Radeon 8500, and fully expect it to be a "fine" card for Doom 3. ;) However, based on Carmack's known statements I'm not expecting it to be as good as a GeForce4 Ti 4600, that's all. ;)
 
However, based on Carmack's known statements I'm not expecting it to be as good as a GeForce4 Ti 4600, that's all.

How about the following statement: since the 8500 is more flexible and capable, it'll have more visual detail (greater amounts of effects possible), or perhaps run at a higher visual detail (more effects at once; there may be some level-of-effects setting). But because the GF4 has more raw power (texturing/poly), it'll be able to run with a greater amount of filtering/FSAA or resolution.
 
Personally, I don't see it. The only way Carmack is planning on taking advantage of the Radeon's more flexible hardware is to increase performance. As we know, the Radeon's increased shader flexibility is supposed to give it "relatively more raw power" when rendering a Doom3-type scene, due to fewer rendering passes being required compared to a GeForce board.

Carmack is coding a Radeon-specific code path, but not for the specific purpose of putting in more or "higher quality" effects for the platform...it's just to wring out as much performance as possible on the platform. I fully expect GeForce / Radeon quality to be very similar.
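
To make the "fewer passes" point concrete, here's a rough sketch of how a Doom3-style renderer pays for lighting: the scene geometry gets drawn again for every lighting pass, and the passes are summed with additive blending. This is not id's actual code; the helper names and the lights_per_pass figure are made up for illustration.

/* Minimal sketch, not id Software's code: per-light additive passes.
 * A chip whose shaders can evaluate more lights (or more of the
 * bump/specular math) per pass simply runs this loop fewer times;
 * the final image is the same, it just gets there faster. */
#include <GL/gl.h>

/* hypothetical stand-ins for the engine's own state/draw code */
extern int  num_lights;
extern int  lights_per_pass;      /* 1 on less flexible parts, 2+ where shaders allow */
extern void bind_light_state(int first_light, int count);
extern void draw_scene_geometry(void);

void draw_lit_scene(void)
{
    /* assumes a depth-only pass was already laid down */
    glDepthMask(GL_FALSE);
    glDepthFunc(GL_EQUAL);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);  /* framebuffer += this pass's contribution */

    for (int i = 0; i < num_lights; i += lights_per_pass) {
        bind_light_state(i, lights_per_pass);
        draw_scene_geometry();    /* full geometry cost paid on every pass */
    }

    glDisable(GL_BLEND);
    glDepthFunc(GL_LESS);
    glDepthMask(GL_TRUE);
}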

Just for reference, here's a link to the relevant .plan update:
(See February Entry)

http://www.webdog.org/cgi-bin/finger.pl?id=1&time=20020315204819&highlight=carmack

Again, I am certainly hopeful that whatever issue ATI identified as the likely culprit for the lower-than-expected performance is something they can address with a driver update. Even if they can't "fix" it, I am fully content knowing that the Radeon's performance is at GeForce3 levels, just as it is in every other game.
 
Even if they can't "fix" it, I am fully content knowing that the Radeon's performance is at GeForce3 levels, just as it is in every other game.

Pfft, other games. ;P
 