When on Tuesday does the G70 NDA expire?

Chris .. looking at your AA review .. I'm not very impressed with the GTX's AA.

2xAA/16xAF
http://www.nvnews.net/articles/chrisray_iq/7800gtx/2xAAnorm.png

4xAA/16xAF
http://www.nvnews.net/articles/chrisray_iq/7800gtx/4xaanorm.png

8xAA/16xAF
http://www.nvnews.net/articles/chrisray_iq/7800gtx/8xSnorm.png

8xAA/16xAF (Gamma Correct, Transparent Supersample)
http://www.nvnews.net/articles/chrisray_iq/7800gtx/8xStsaagamma.png

If you have a look at the leaves and the gun, aliasing is still visible even with SSAA.

Isn't SSAA supposed to be like FSAA?

US
 
There's something you're missing from those shots, though:
The GTX allows one to enable supersampling on only alpha textures, which will fix those problems you're seeing, at a rather modest performance hit. As for the AA on the gun, well, bear in mind two things:
1. Gamma correct AA wasn't enabled.
2. That is a pathological case, and you really should have something to compare it to before making a determination.

Overall, the GeForce 7800 GTX, despite its lack of 6x multisampling, should provide the best anti-aliasing of any video card out there, considering its support for forced AA applied to alpha tested surfaces.
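For what it's worth, "gamma correct" AA just changes where the downfilter averages the samples: in linear light rather than on the gamma-encoded framebuffer values. A rough sketch of the difference (the 2.2 exponent and the sample values are illustrative assumptions, not the driver's actual resolve path):

```python
# Sketch: why "gamma correct" AA resolve matters. The 2.2 exponent and
# values are illustrative; this is not NVIDIA's actual hardware path.

GAMMA = 2.2

def resolve_naive(samples):
    """Average the stored (gamma-encoded) sample values directly."""
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples):
    """Decode to linear light, average, then re-encode."""
    linear = [s ** GAMMA for s in samples]
    avg = sum(linear) / len(linear)
    return avg ** (1.0 / GAMMA)

# A black/white edge pixel covered half-and-half by two samples:
samples = [0.0, 1.0]
print(resolve_naive(samples))          # 0.5 in gamma space -> edge looks too dark
print(resolve_gamma_correct(samples))  # ~0.73 -> perceptually mid-grey edge
```

That's why non-gamma-correct AA tends to make bright-on-dark edges look too dark: 0.5 in gamma space is well below a perceptual mid-grey.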
 
Gamma correction was enabled in the control panel, as was transparency supersampling. I was a little surprised at the results for supersampling in Far Cry myself. Yet transparency multisampling seemed far more effective at removing the aliasing on the leaf edges.

http://www.nvnews.net/articles/chrisray_iq/7800gtx/8xStmaagamma.png

Anyway, I have found the results for transparency supersampling and transparency multisampling to vary. Sometimes multisampling seems to do a better job than supersampling, and vice versa. I'm not entirely sure why this is the case, but this is my overall impression of the various modes. Since transparency multisampling looks better and carries a smaller performance hit in Far Cry's case, I have preferred using it. But you may have to do some testing to see what provides optimal image quality.
 
Ah yes .. the MSAA does look better.

Just another question

http://www.nvnews.net/articles/chrisray_iq/7800gtx/hl8xnorm.png
http://www.nvnews.net/articles/chrisray_iq/7800gtx/hl28xtmaagamma.png
http://www.nvnews.net/articles/chrisray_iq/7800gtx/hl28xtsaagamma.png

I see there was an issue in the top left-hand corner where one rail goes almost white. A driver problem?

The fence does look better with TSAA though.

Just wish you had used a better lit area for demonstration. Those pics are pretty dark.

The aliasing at the window sill, though, again calls the GTX's AA into question.

Since I hadn't looked at the FC and HL2 MSAA screenies .. I'm looking at them now.

With MSAA .. the fence looks exactly like 8xAA norm. Actually .. there doesn't seem to be a difference. With TSAA .. the fence is different, though (better).

US
 
The fence issue is a known bug. ((I asked Nvidia about it, and it's in the release notes. I was told it's one of their "bug fixes" in the works.)) I mainly chose the dark scene because of the way light bounced off the distant fence in the very far background. ((It also didn't appear as dark, and for some reason Fraps didn't capture the same gamma.))

I decided against gamma correcting this scene because I felt adjusting the color properties would defeat the purpose of the test. The bug note was explained in the post, actually. ;)
 
RE: "5000+" (single core) — I'm not entirely convinced it will happen in that time frame. Intel seems incapable of going beyond 3.8GHz (or rather, of producing higher-clocked chips at reasonable cost), so AMD is in no hurry to press on, and the most recent roadmaps point to dual core (at lower clocks), lower power, and lower thermal density as the goals for 2005/2006.
 
Who knows what will happen.

Two years ago .. you would've laughed at a dual-core CPU .. never mind a four-core CPU (which is said to be coming by 2007).

Saying that ... Intel are behind atm .. they know it .. and they'll do anything (if the last two years are anything to go by) to try and claw their way back. Meanwhile .. I'm sure AMD will try to nail Intel to the wall by any means necessary.

Have you forgotten the articles about Intel's breakthroughs which could enable CPUs to hit 10GHz in the future?

http://www.pcworld.com/news/article/0,aid,66464,00.asp
http://news.zdnet.com/2100-9584_22-888838.html
http://news.zdnet.com/2100-9584_22-1015424.html

While I know they've hit a snag at 3.8GHz .. it seems they'll do whatever it takes to get the performance title back.

US
 
Unknown Soldier said:
If you have a look at the leaves and the gun, aliasing is still visible even with SSAA.
The gun? The gun is AAed as well as it gets, IMO.
TSAA doesn't have an effect on the leaves; that's a bug. It does work on the palm trees, however.
TMAA doesn't always work (which IMO is a bug, too), but when it does, it looks great (especially in combination with 8xS, because that kills the dithering pattern).

I'm pretty sure it won't take long until we see transparency AA from ATI, too. It just makes a huge difference.
 
Maybe you're right, but having a quick look at both pics, it could just be that the AA looks better in the TSAA pic because the scene is in a slightly different position.

Maybe if the pics could've been taken at the same frame.

Maybe Chris should try the FS benchmark, which allows a capture at a specific frame .. even though I know these can also vary, the frame should still be more accurate than what Chris has done at present.

US
 
Unknown Soldier said:
Two years ago .. you would've laughed at a dual-core CPU .. never mind a four-core CPU (which is said to be coming by 2007).
Well, HyperThreading was known about over two years ago. I know I was talking about the possibility of multicore chips back then:
http://www.beyond3d.com/forum/viewtopic.php?t=3645&postdays=0&postorder=asc&start=51

Saying that ... Intel are behind atm .. they know it .. and they'll do anything (if the last two years are anything to go by) to try and claw their way back. Meanwhile .. I'm sure AMD will try to nail Intel to the wall by any means necessary.
Intel basically just needs to switch their focus to the Pentium M architecture to give the Athlon 64 a run for its money.

While I know they've hit a snag at 3.8GHz .. it seems they'll do whatever it takes to get the performance title back.
And going for higher frequencies isn't it. They've hit a wall, and that wall will not be breached.
 
Yes, I know they've hit a wall with the current technology .. that's why they went the dual-core route.

But it's not to say they can't go higher.

I guess the next few years will tell.

US
 
If they stick with the Pentium 4 architecture, the move to higher clock speeds will be excruciatingly slow. They may be able to push the barrier back slightly with a lot of effort, but it's not going away.
 
I expect we will see a new processor based on the Centrino core, with a built-in memory controller and 64-bit support.......... :rolleyes: :rolleyes: :rolleyes:
 
Intel's reluctance to do a 4GHz part and keep Netburst going also has deep-seated marketing reasons. I've personally seen 4GHz Intel CPUs (clearly marked as such, with that as their working clock) that'll never see the light of day, and marketing is one of the main reasons why, along with heat and power, of course.

It all stems back to the P3 days.
 
martrox said:
I expect we will see a new processor based on the Centrino core, with a built-in memory controller and 64-bit support.......... :rolleyes: :rolleyes: :rolleyes:

Since when is Centrino a core?
 
Rys said:
It all stems back to the P3 days.

Pentium M IS Pentium III; don't think Intel suddenly had a new design ready for its budding Pentium 4 and Pentium 4-M laptop implementations.
Just like Centrino isn't a processor but a "marchitecture." (Gotta love L'Inq.)

If you saw the lecture by Bob Colwell, you know that the P4 was just a step too far .. actually a step into a ridiculous architecture.
So for damage control, they went back to the PIII and used their P4 experience to create a processor that had fewer problems and was far more controllable.

That it turned into a near-perfect architecture shows how badly Intel messed up with the P4.
By "perfect" I mean that it's almost able to deliver the same performance at the clock speeds that Athlons are running at; see the link Chalnoth gave you.
 
I'm talking about the marketing reasons for not introducing a 4GHz P4, not the technical merits of each CPU core. I was certain my post put that across.

I know full well what the current CPU lineage is :D
 