I can't understand German.

Fuz said:
What is "jääräpää"?

My friend at IRC explains it to you:
Fiiu @ IRCNet said:
[23:06] <Fiiu> A 'jääräpää' is a person who will buy the Parhelia whether it's the fastest or the slowest, because he has already made up his mind and is going to buy it. :LOL:

;)
 
Is anyone really surprised by these scores? Seriously? From everything I have heard and read about the card, it performs just about exactly how I thought it would. Maybe a little slower, but it will get a little faster with drivers, though not by much, I would expect.
 
Fuz said:
What is "jääräpää"?

Well... it's a well-known word in Finland. It basically means the same as stubborn. But we also have a perfect human example, known as jordan1 to many people in Finland's hardware scene. I guess I don't have to explain this any further. :)
 
Yeah, I think they have pulled it down. I see HardOCP has commented on the card as well and still seems bitter. :LOL: Also, Ben, did you get your card?
 
The question we should ask ourselves is: is it worth $399?

Easy answer for me.

No graphics card is worth $399, especially knowing that it won't be the best thing since sliced bread in a few months' time.

However, I was slightly disappointed it did not perform faster. The TnL engine seems very poor, as does the vertex engine. Again, people will say to wait for the drivers and compare it to the Radeon 8500 launch. But when the Radeon 8500 launched with the 3286 drivers (for XP), it was only slightly behind a GF3.

Time will tell, and I do not want to bash the Matrox effort at all. Going from nothing to this is an impressive feat in itself, and if you like the FAA feature (who wouldn't?) and happen to have three screens available to you, then more power to you.

PERSONALLY I will wait for the R300, P10 and NV30 before completely dismissing the Parhelia.

Another point to note is that the benchmarks covered games that are either currently available or use features that are becoming mainstream in PC gaming (pixel shaders, Dot3 bump mapping, etc.), and the previous-generation cards run them at the same speed as or faster than the Parhelia. This is not a problem in itself - the problem for ME is that when I make an investment in a graphics card, I expect it to last at least 12 months before upgrading (more often 18 months).
At the moment I do not believe the Parhelia fits this category, as my Radeon 8500LE is already having problems running some games at max settings (I bought it in November '01).

:)

Edited: just 'cos
 
Hi Guys'n'Gals, my first posting here :)

What struck me most was that Lars Weinand mentioned the AF quality being only equivalent to that of a GF4 with just 2x AF turned on.

Wasn't it supposed to have at least 8-tap quality?

Although it's barely visible in the screenies, he also mentioned FAA problems with some objects in the Max Payne scene, where the ticket counter was not AA'ed as it was supposed to be...

Odd, I agree. A chip with this texel fillrate and this memory bandwidth at its disposal is supposed to be much faster, especially at higher resolutions and with FSAA turned on.
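
Just to put rough numbers on that "faster on paper" point, here's a quick back-of-the-envelope sketch. The clocks, bus widths and pipeline counts below are the commonly quoted launch specs as I recall them, not figures from the review, so treat them as assumptions:

```python
# Rough comparison of theoretical throughput.
# The clock/bus figures are the commonly quoted launch specs -- assumptions, not gospel.

def mem_bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width (bits) x effective data rate."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

def texel_fillrate_mt_s(core_mhz, pipes, tmus_per_pipe):
    """Peak texel fillrate in Mtexels/s."""
    return core_mhz * pipes * tmus_per_pipe

# Parhelia-512: 220 MHz core, 4 pipes x 4 TMUs, 256-bit DDR @ 275 MHz (550 effective)
parhelia_bw  = mem_bandwidth_gb_s(256, 550)      # ~17.6 GB/s
parhelia_tex = texel_fillrate_mt_s(220, 4, 4)    # ~3520 Mtexels/s

# GeForce4 Ti 4600: 300 MHz core, 4 pipes x 2 TMUs, 128-bit DDR @ 325 MHz (650 effective)
gf4_bw  = mem_bandwidth_gb_s(128, 650)           # ~10.4 GB/s
gf4_tex = texel_fillrate_mt_s(300, 4, 2)         # ~2400 Mtexels/s

print(f"Parhelia: {parhelia_bw:.1f} GB/s, {parhelia_tex} Mtexels/s")
print(f"GF4 4600: {gf4_bw:.1f} GB/s, {gf4_tex} Mtexels/s")
```

On paper that works out to roughly 70% more memory bandwidth and around 45% more texel fillrate than a Ti 4600, which is exactly why the benchmark numbers look so odd.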

Sorry for Matrox, but as the saying goes, "you never get a second chance to make a first impression"...

Hope tomorrow clarifies some of these issues....
 
I have a feeling a lot of Parhelia reviews are going to mention the R300 coming soon, and then the R300 reviews are going to mention the NV30.

As of right now, if I were buying a computer, I'd get a Parhelia. Imagine how much less window flipping you would have to do with two extra screens. I could program in the center, have my assignment tasks on the right, and debug on the left!

And they benchmarked Aquanox? Gimme a break.
 
I use a 19" NEC at home, and even at 1600x1200 I run out of space. With that said, I'm really leaning towards a multi-monitor setup. Coding really sucks with so little screen space.
 
SirXcalibur said:
Is anyone really surprised by these scores? Seriously? From everything I have heard and read about the card, it performs just about exactly how I thought it would. Maybe a little slower, but it will get a little faster with drivers, though not by much, I would expect.
Word. *nods*

Just as a reminder: this review was written by Borsti of Rivastation. You know, the same Borsti (Lars) who thought it surprising how ATi manages to get close to NV's "64x AF", quality-wise, by offering "only" 16x AF ("NV also offers 32-tap and 64-tap anisotropic filtering, ATi only 16-tap")... Take anything he writes with a pinch of salt, especially if he's talking about technical stuff.

But more reviews will be up soon, quite possibly on 3DC, too. ;)

ta,
.rb

P.S. oh, hi Quasar. *waves* ;) .rb
 
I guess I'll retype my response here as well...

I applaud Matrox for having "come back" to the high-end scene. Unfortunately, they had a _lot_ of ground to make up, and I feel the performance numbers that we have seen are rather indicative of that. nVidia and ATI have been able to build upon a solid 2-3 years' worth of 3D architecture in the GF4 and 8500... Matrox basically folded the tent and started anew.

I fully expect the drivers to get much better over time... In fact, the one thing you can be sure of is that these numbers will only go _up_... But at the end of the day, you have to ask yourself this one question...

R300 is probably going to be announced sometime in the next 4 weeks or so, give or take... There is no doubt that this chip is going to bury Parhelia in a really significant manner. About the only thing that Matrox will *really* have going for it is the image/picture quality angle... With that said, does anybody honestly expect Parhelia to be able to compete, dollar-wise, with R300? I expect that ATI is going to target the ~$350 figure for R300, which is smack-dab where Parhelia currently sits.

Given what we can _probably_ surmise with regard to R300 performance... doesn't it seem impossible to imagine Parhelia selling at $399? In fact, as soon as R300 is shipping in volume, I think Parhelia will almost certainly have to drop to $299 just to get consideration... if not even lower.

Then, of course, there is NV30... Nobody really knows what this thing is all about... But rest assured, there is a 100% certainty that NV30 will end up smoking the Parhelia as well... And I might even extend that to image quality (signal quality aside... but in terms of features).

About the best spin I can put on Parhelia is this... They just _had_ to get something out the door. They've done just that. Now they need a good 8-9 months of driver tweaking... and then in the spring, they can focus on a 0.13µ DX9 part with significantly better drivers.
 
ConeK, you should be ashamed of coming to B3D to post ad hominem remarks behind someone's back. I don't know Jordan1, but while I prefer Finclockers, I do read Muropaketti now and then (yeah, yet another Finn here), and I'd peg you as the most ignorant besserwisser ever seen on the fora there, Nappe1's inexplicable and excessive tantrums over anything even remotely critical of Bitboys or Matrox notwithstanding. Jordan1, on the other hand, seems to be the clueful party, although he invariably gets stuck in your never-ending pissing contests. Hopefully this thread can now continue on topic without more interludes nobody called for. Just my 0.02 euro.
 
*waves back @ nggalai*

Well, _if_ Parhelia is the same chip that was rumored last year as the G800 and Matrox just did not finish it in time, I'd say that last year it would have blown everything in its vicinity out of the water. But product cycles are so damn fast in this industry that they can eventually ruin a whole company, as they did in the past when another former big player couldn't keep up in the race and finished its chip the better part of six months (on the product cycle) too late.

Let's hope Matrox does slightly better; chances are they will, thanks to being a privately owned company...

I guess that, especially in high-quality gaming, where bandwidth matters most, their 256-bit interface can save them a lot of ground in the fight, at least against NV25 and R200. And maybe, just maybe, R300 and NV30 are not as close around the corner as they appear to be now.

If Matrox delivers both on good drivers and on Parhelia-based cards in July, they'll have a definite high-end competitor with an edge in image quality for at least three months (my guess) until R300 arrives on the shelves, and another one to two months ahead of NV30, which, if really built on 0.13µ, could rip everything else apart...
 
I'd like to know why everyone here seems to be bending over backwards to defend the Parhelia, yet you're openly skeptical of the R300.

People talk about bias, but it's obviously running rampant here. You see comments like "it's slow, but that's to be expected". I haven't seen any benchmarks of the card, but from what I've heard it doesn't sound good; still, I'll reserve judgement for the time being.

That said, if it's slow, then how do you figure it's going to sell for $400?????? Supposedly this is a "professional card", not a "gaming card" (very convenient to change the focus when it looks like it's turning out to be a lackluster card for gaming). Please explain to me why professionals won't just buy a G400 or G450... they don't need the gaming performance. The only real difference other than that is the third monitor, and I'm just not sure how many people will consider that worth a $400 investment (plus an additional monitor, which for a "professional" would be another $2500).

I agree with Typedefenum; I have a hard time understanding how this card is going to sell and who it's going to sell to. The price is just way too high for what is being offered. For $400 you really need a product that blows everything away, and even then the card won't sell in huge numbers until it drops into the $200-$250 range. The price is just all wrong, and anyone who buys it at this price is throwing away money, since within a month it's going to have to drop drastically in order to remain competitive with the R300 (and if it doesn't drop in price, that's probably even worse in terms of sales).
 
Quasar said:
...but product cycles are so damn fast in this industry that they can eventually ruin a whole company, as they did in the past when another former big player couldn't keep up in the race and finished its chip the better part of six months (on the product cycle) too late.

Not just 3DFX, but ATI was getting close to the centre of the whirlpool as well - they managed to pull out before going under, though. The Radeon was probably meant to compete with the original GeForce, but ATI probably got a bit backlogged by the many problems of the Rage Fury MAXX. ATI lost a huge amount of market share during that period, and the 8500 was just enough to keep them healthy. Even the R300 seems like it should have been out already, assuming the accepted rumour of a 0.15 micron process holds true. If their lead time on NVidia isn't enough, they'll probably lose out on another round.

I think the same will happen to Matrox as well. Good effort, but they should have released it earlier with less pixel shading power (I believe Parhelia does 5 ops per pixel per clock), or later on a smaller process with more pixel pipelines, so that it could compete with R300/NV30 and games would actually use that many ops per pixel.
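
If anyone wants a rough sense of what that "5 ops per pixel per clock" figure would mean in aggregate, here's a trivial back-of-the-envelope calculation. The 4 pipelines and 220 MHz core clock are the commonly quoted launch specs and, like the 5-ops figure itself, should be treated as assumptions:

```python
# Hypothetical peak pixel shader throughput for Parhelia, assuming
# 4 pixel pipelines, a 220 MHz core clock, and ~5 shader ops per pixel
# per clock (the figure quoted above -- none of these are confirmed specs).
pipes = 4
core_hz = 220e6
ops_per_pixel_per_clock = 5

peak_ops_per_second = pipes * core_hz * ops_per_pixel_per_clock
print(f"~{peak_ops_per_second / 1e9:.1f} billion pixel shader ops/s")  # ~4.4
```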
 
Nagorak said:
I'd like to know why everyone here seems to be bending over backwards to defend the Parhelia, yet you're openly skeptical of the R300.

People talk about bias, but it's obviously running rampant here. You see comments like "it's slow, but that's to be expected".
Um, the absence of criticism isn't the same thing as praise.
 
I'm with Nagorak here... It's nice that people are polite and give Matrox a chance, but to put it plainly: Parhelia blows.

Good IQ, no doubt; drivers will mature, etc. etc. It's just that with all that silicon space and bandwidth, you should do a lot better. Matrox's timing to market is also sadly rather bad; R300 will be here very soon. And that GPU will have both power and features at the same or even a lower (!!!) price than Parhelia.

To sum it up: Parhelia is a nice card for serious work, but NOT for games.
 