So, who thinks their predictions were right about the NV40?

K.I.L.E.R said:
It may look like the most futureproof card now but that may change in about a few months or so.

I think the R420 has to be much, much faster than the NV40 to make it more futureproof, since it's (well, from what we know now at least) SM2.0 vs SM3.0.

But then again, there's the rumoured coming of the PowerVR Series 5 ....... :)
 
AlphaWolf said:
Is NV45 moving to TSMC? I haven't heard that IBM has worked out their low-k issues yet.
The current rumors seem to indicate that the GeForce 6800 is being manufactured at IBM, while the GeForce 6800 Ultra is being manufactured at TSMC. That would suggest that nVidia's high-end parts will be made by TSMC for the time being.

I think the fall refresh parts will need to focus more on memory than core speeds in any event.
That will be nice for current games, but I think future games will be more core limited.
 
ben6 said:
I'm not surprised. Course something tells me I'll have some down and dirty time with a MSI 6800 Ultra in a few days :). Hope my brand new 450W PSU can handle it :)
Well, I don't know what you have in your PC, but HFR reviewed the 6800U with a 350W Enermax PSU, so you should cope ;)
 
Mintmaster,
I also see bandwidth being the most limiting factor for the R420.
Let's say:

Core: 500MHz vs 412MHz for the XT
Memory: 600MHz vs 365MHz
Pipelines: 16 vs 8

If I calculate correctly, per pipe that makes
1/2 * 412/500 * 600/365 = 0.67, which means ATI will need to increase its bandwidth savings by about 50% to have the same effective bandwidth per pipe as the 9800XT.

Of course, if the X800 XT core ends up at the same frequency as the 9800 or even the 6800U, it would reduce the need for extreme bandwidth/bandwidth-saving techniques.
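
For anyone who wants to check that arithmetic, here's a minimal sketch (using the rumored 500MHz core / 600MHz memory / 16 pipes for the R420 and the 9800XT's 412/365/8; none of these are confirmed figures):

```python
# Rough "memory bandwidth per unit of raw fillrate" comparison.
# Figures are the rumored R420 specs vs. the 9800XT, as quoted above.

def mem_per_fillrate(core_mhz, mem_mhz, pipes):
    """Memory clock divided by raw pixel fillrate (core clock * pipes)."""
    return mem_mhz / (core_mhz * pipes)

r420 = mem_per_fillrate(core_mhz=500, mem_mhz=600, pipes=16)
xt = mem_per_fillrate(core_mhz=412, mem_mhz=365, pipes=8)

ratio = r420 / xt
print(f"R420 bandwidth per pipe-clock vs 9800XT: {ratio:.2f}")          # ~0.68
print(f"Extra bandwidth savings needed to match: {1 / ratio - 1:.0%}")  # ~48%
```

Same result, give or take rounding: the R420 would have roughly two thirds of the 9800XT's bandwidth per unit of fillrate, so it needs close to 50% better bandwidth savings (or a lower core clock) to stay in the same place.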
 
Evildeus said:
ben6 said:
I'm not surprised. Course something tells me I'll have some down and dirty time with a MSI 6800 Ultra in a few days :). Hope my brand new 450W PSU can handle it :)
Well, I don't know what you have in your PC, but HFR reviewed the 6800U with a 350W Enermax PSU, so you should cope ;)

It seems like NVidia was very conservative when recommending a 480W PSU. Maybe they want to make sure that people are ready for upcoming products.
 
I am impressed with the advancements that Nvidia has made with the NV40. Two handicaps I see:

- The process they used limits the clock speed to 400MHz at the moment to keep power and heat dissipation reasonable.
- The other is the NV3x legacy, meaning workarounds that developers wrote for the NV3x may very well hold the NV40 back for a while, as in FarCry, where the NV3x path downgrades the IQ on the NV40 (hopefully short-lived). Still, many programs, even Half-Life 2, where significant effort was made to make it a reasonable performer on the NV3x (supposedly 5x the effort and time), now have another path to take, or should I say a future path, since it will be a while before there are a significant number of NV40 cores out there to even bother. In short, I think Nvidia has hosed the developers with the NV3x cores, and themselves now.

In the end I think current titles and near-future titles (all this year) will show better performance on ATI's R420-based cards, except maybe DoomIII. ATI will have a significantly higher clock speed with an already great design (hopefully improved somewhat) and optimized drivers. In addition, many developers already know how to get the most out of ATI's design, while they are now faced with two very different DX9 designs from Nvidia. While ATI's next core may not be as advanced in a number of respects, the process they used will just plain outdo what Nvidia has now. Now if only Nvidia would get an NV4x onto .09 micron 8).
 
noko said:
Still, many programs, even Half-Life 2, where significant effort was made to make it a reasonable performer on the NV3x (supposedly 5x the effort and time), now have another path to take, or should I say a future path, since it will be a while before there are a significant number of NV40 cores out there to even bother. In short, I think Nvidia has hosed the developers with the NV3x cores, and themselves now.

That shouldn't be a big issue, though, since the NV4x should be able to run the R300 path (the standard DX9 path, that is) without problems.
 
Chalnoth said:
jimmyjames123 said:
It's not like NVDA is going to sit still with a 400Mhz core clock, especially when we all can see how efficient the underlying NV40 architecture is, and how much performance can potentially increase with core clock speeds.
Well, of course not. We can be pretty much certain that the NV45 is on the way for a fall release, and with a die shrink to .11 micron with low-k rumored, 600MHz for the NV45 seems conservative.

I was under the impression that NV45 is to nVidia what the R423 is to ATI, but that's probably just me.
 
I'm not sure what to make of it. Compared to the NV3x, this card is amazing. Yet its IQ is, at best, comparable to the R3xx (and it's hard to decide, since we're still stuck with crappy JPGs instead of PNGs, which would be ACTUALLY USEFUL!), there is angle-dependent AF (and I thought this would be the generation to lead us away from that... and those who say nVidia wanted to match ATI need to be reminded that even when the cards competed in a full-on 90° contest, ATI's 16x AF was faster, IIRC, than nVidia's 8x AF), and most disappointing is that pure MSAA doesn't go beyond 4x AA.

So, what I have here is a card that can go up to 4x (non-gamma-adjusted) MSAA and 16x angle-dependent AF, something my Radeon 9800 Pro already does (with gamma adjustment). In the end, what I'd be paying for is the ability to notch up the res by one, and a bit of SM 3.0 future-proofing. Right now, I'm not feeling a great urge to upgrade.
 
So, what I have here is a card that can go up to 4x (non-gamma-adjusted) MSAA and 16x angle-dependent AF, something my Radeon 9800 Pro already does (with gamma adjustment). In the end, what I'd be paying for is the ability to notch up the res by one, and a bit of SM 3.0 future-proofing.

That's exactly the way I see it (OK, there might be rarer cases where it's going to be two resolutions). Frankly, I didn't expect that much more either, and I don't expect more from ATI's solutions in the end either.

Assume former high-end solutions were playable in game X at 1024; being able to use the exact same settings and gain playability at 1280 is a nifty step upwards. To expect it to have gone all the way up to 1600, for instance, is a bit too much, don't you think? (A quick sanity check of the numbers is sketched below.)
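
Just to put figures on that resolution-step argument, here's a minimal back-of-the-envelope sketch (the resolutions are only the usual 4:3/5:4 steps, nothing from a specific benchmark):

```python
# Pixel counts relative to 1024x768, i.e. roughly how much extra fillrate
# each resolution step costs with otherwise identical settings.
base = 1024 * 768

for w, h in [(1280, 960), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {w * h / base:.2f}x the pixels of 1024x768")
```

One step up is about 1.6x the pixels, while 1600x1200 is about 2.4x, so a roughly 2x fillrate gain buys you one resolution notch, but not two.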

Agreed on the IQ evaluation.
 
Man you guys are tough to please. 2x performance of previous gen + new features, and still not impressed. It used to be that 2x performance was a pretty impressive gain for people.

I think you are both off base, BTW, because it's not simply a matter of resolution. Future games that need more fillrate and ALU power are going to run more than 2x slower on older HW.
 
I don't see how anyone can complain about the 6800U..... it's a home run, period. It betters anything ATI has got in almost every way, with maybe the exception of FSAA, and there, let's face it, the differences are minor. It's not hot and it's not noisy. If the 6800Us were available today, anyone looking to spend more than $350 would be absolutely stupid to even consider buying anything else. Well, as long as your power supply will work AND the damn thing can fit in your box. ;)

However - yes DC, gotta add this :LOL: - we still don't know what ATI has coming, but will shortly. The R420 is going to have to bring something to the table beyond what the 6800U has IF there's no 3.0 shaders, OR cost a lot less. Speed would do it......

As it stands right now, there will be 6800Us in my gaming PC and HDPC as soon as they are available at CompUSA (I use their trade-in warranty!) UNLESS ATI shows me more..... and ATM, I'm keeping an open mind.

One more thing, a note to nVidia, if anyone there is listening - stop the "optimizations" RIGHT NOW. You no longer need them, and the industry doesn't either. You've got an excellent product; don't soil it by continuing what has become a really dirty, not-so-secret habit, OK?
 
Hard to please? To put things into perspective: had my old Ti4k4 not died on me, I doubt I would have upgraded to the R300, if that rings any bell.

Future games that need more fillrate and ALU power are going to run more than 2x slower on older HW.

Which future games exactly? Careful: for future games there is also going to be future HW available.

I don't think a gamer would have a hard time skipping a generation before upgrading, not in the past and not today. I skipped the GF3 too and I didn't regret it either considering what NV25 had to deliver compared to that.

It's not 2x faster across the board at the moment, rather up to 2x or 3x faster than former high-end solutions. About the same degree as the NV25 compared to the R300.
 
Ailuros said:
I was under the impression that NV45 is to nVidia what the R423 is to ATI, but that's probably just me.

You're right, but the traditional definition of the NVx5 parts has been the refreshes, so people still use NV45 when talking about the fall refresh, rather than the PCI-E part.

martrox said:
One more thing, a note to nVidia, if anyone there is listening - stop the "optimizations" RIGHT NOW. You no longer need them, and the industry doesn't either. You've got an excellent product; don't soil it by continuing what has become a really dirty, not-so-secret habit, OK?

Interesting point there - we don't actually know how genuine these scores are yet. Really need to sniff out any application detections, and see how much the scores drop. On paper, it shouldn't need any - but I think it's best to withhold judgement.
 
martrox said:
However - yes DC, gotta add this :LOL: - we still don't know what ATI has coming, but will shortly

Yeah, but my point was that Ailuros seemed to be saying it didn't matter: neither the R420 nor the NV40 excites him.
 
Ailuros said:
Hard to please? To put things into perspective: had my old Ti4k4 not died on me, I doubt I would have upgraded to the R300, if that rings any bell.

Well, that's you. Like I said, tough audience. Most people will consider a 2x perf increase to be a big deal. 2x, + new features, + better 2D video processing = impressive for some people.

Which future games exactly? Careful: for future games there is also going to be future HW available.

The EQ2 beta. Runs like a dog on everything, even next-gen HW. Doom3 @ 1024x768 w/ 4xAA and all the bells and whistles. HL2 on non-CPU-bound maps. Battle for Middle Earth: geometry instancing provides a super-linear speedup. All near-term (hopefully) titles.


I don't think a gamer would have a hard time skipping a generation before upgrading, not in the past and not today. I skipped the GF3 too and I didn't regret it either considering what NV25 had to deliver compared to that.

You can always skip a generation if you're willing to drop res. My 9700 PRO has problems with many games in "high" detail mode now. Does that stop me? No. Do I hate lower res and choppy framerates? Yes. I have an LCD monitor, so running at anything less than its native res sucks.

It's not 2x faster across the board at the moment, rather up to 2x or 3x faster than former high-end solutions. About the same degree as the NV25 compared to the R300.

It's pretty much 2x except in CPU- or bandwidth-limited cases. There are not many cards that are truly, absolutely 2x faster across the board in all games. Some benchmarks still show older midrange cards hanging with the big guys at the top.

Seems like the only thing that could truly impress you guys is something like the Playstation-3 being shipped now, on a PCI-E card. :)
 
R420 will mark a significant step up for me personally if I see a vastly improved AF algorithm and an 8x sparse (8*8 grid) MSAA mode. Legacy support for SSAA would be highly welcome too.

Just because the hybrid MS/SSAA or pure SSAA modes usually get overlooked doesn't mean they DO NOT have their uses or aren't welcome at all. In fact, for older CPU-bound games an 8xS + AF combination is unbeatable in terms of IQ.
 
Seems like the only thing that could truly impress you guys is something like the Playstation-3 being shipped now, on a PCI-E card.

I've no idea what SONY has in store in terms of anti-aliasing; judging by their past track record alone (which isn't exactly fair), I have second thoughts even here.

***edit:

Yes. I have an LCD monitor, so running at anything less than its native res sucks.

21" CRT here; not that much different, since anything lower than 1152*864 lookslikea**(tm). Of course I will upgrade within this year, but that doesn't mean I'm not allowed to have higher expectations in terms of IQ, especially since it would have been perfectly possible.

I never took any part in the filtering-related conversations of the past; in fact, I recall being almost crucified for stating that angle-dependency doesn't bother me that much. It's still true; I can just see that specific topic being swept elegantly under the carpet, with the former "offenders" most likely staying extremely silent about it.
 
DemoCoder said:
martrox said:
However - yes DC, gotta add this :LOL: - we still don't know what ATI has coming, but will shortly

Yeah, but my point was that Ailuros seemed to be saying it didn't matter: neither the R420 nor the NV40 excites him.

DC, no product is going to be everything to everyone. I really do wish the NV40 had 6x/8x MSAA..... because, with its power, it can use them. Many will say, well, just jack up the rez... and they're right, to a point. Here's the problem - well, for these old eyes, anyway - I have real problems reading most text in games past 1024x768/1280x1024..... so just what benefit does higher rez do me? The real sweet spot for me would be those resolutions at 6x/8x FSAA and full AF. The NV40 has the power, no doubt...... but the mixed 8x FSAA is really not usable. Jeez, find another way to drop this incredible card to its knees....... But, as I stated before, the card is a home run, no doubt about that!
 