AMD: Beyond R600

In Q1'08, R700 is to fight against G100? So, in the space of a year we're going to see the G8x refresh, G90, G9x refresh and then G100?? Given NV's previous GPU launch schedules I find it highly unlikely we'll see that rate of progress.
 
Nvidia has no incentive to release anything until they have some competition. Apart from occasional inflection points like a new OS or major DX revisions, why release early when they can keep raking in the cash from their current cards and improve the return on their investment? In the meantime they can build up a war chest for any future requirements, only releasing their newer tech when their sales start to slow down.

If anything, the lack of any competition from AMD will slow down the release (not necessarily the development) of new chips from Nvidia.
 
I agree, that G100 in 2008 talk is hogwash.
What the original story behind this claim says is simply that the next chip Nvidia releases will be called G90. In other words, "G90" will be the name of the G80 refresh (as opposed to "G81" or whatever). The next chip after "G90" will, of course, be "G100". That will be DirectX 10.1, SM 5.0, and will be available in Q1 2008.
 
AFAIK, DirectX 10.1 is still tied to SM 4.0. SM 5.0 will most likely arrive when DirectX 11 arrives. :p

As for AA and ultra-high resolutions:

Higher resolutions don't do diddly squat to reduce jaggies if the display size grows at the same rate as the resolution does.

The only way for higher resolution to reduce the need for AA is if the size of the display remains constant while the resolution increases. Even then, I doubt it would completely remove the need for advanced filtering techniques. AF will still be needed: a high resolution won't get rid of texture crawl, and most likely wouldn't get rid of shimmering from shader aliasing either. With a high enough pixel density, though, it might obviate the need for high levels of edge AA, or even for edge AA entirely.

I wish I had that one 22" IBM monitor with 3840x2400 resolution to play with. Yes, its pixel response is absolutely horrid and unusable in anything resembling a game, but it would be interesting to see whether (at max res) jaggies were reduced enough to not be immediately obvious.
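For a back-of-the-envelope sense of why that monitor is interesting, pixel density follows directly from resolution and diagonal size (the comparison display below is my own assumption, not from the thread):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a display of the given resolution and diagonal."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

# The 22" IBM panel at 3840x2400
print(round(ppi(3840, 2400, 22)))  # ~206 PPI

# A typical 20" 1600x1200 display of the era, for comparison
print(round(ppi(1600, 1200, 20)))  # 100 PPI
```

Roughly double the linear pixel density, which is exactly the regime where edge aliasing might start to fade from view.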

Regards,
SB

PS - What? You don't think graphics card power will at least quadruple in the next 2 years? ;) :LOL:
 
PS - What? You don't think graphics card power will at least quadruple in the next 2 years? ;) :LOL:

For a single card, not counting multi-slot kind of situations? I think that'd be the outer limit. But did you have a current single card in mind as your base or were you already SLIing 8800GTX for your baseline?
 
I am assuming the R700 will be an incremental update rather than a big one. In addition, does this mean no R650? Or is the R700 merely an R650?

Isn't the R700 going to start using PCIe 2.0? With double the bus bandwidth, it could be noticeably faster in bandwidth-limited cases.
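To put numbers on that "double bandwidth" claim, here's a quick sketch using the standard PCIe lane rates (the function is my own illustration, not from the thread):

```python
def pcie_bandwidth_gbs(lanes, gt_per_s):
    """Per-direction payload bandwidth in GB/s.
    PCIe 1.x and 2.0 use 8b/10b encoding, so each 10-bit symbol
    on the wire carries 8 bits of payload."""
    return lanes * gt_per_s * (8 / 10) / 8

print(f"PCIe 1.x x16: {pcie_bandwidth_gbs(16, 2.5):.1f} GB/s")  # 4.0 GB/s
print(f"PCIe 2.0 x16: {pcie_bandwidth_gbs(16, 5.0):.1f} GB/s")  # 8.0 GB/s
```

So the slot itself doubles from 4 to 8 GB/s per direction; whether a game actually sees that depends on how often it's bus-limited.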
 
For a single card, not counting multi-slot kind of situations? I think that'd be the outer limit. But did you have a current single card in mind as your base or were you already SLIing 8800GTX for your baseline?

Geo, notice the wink ;) and the unhideable laughter :LOL: in my comment. :) I was being glib at that point. I hardly expect a quadrupling of performance in single cards. Doubling or tripling performance in 2 years seems more reasonable.
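For what those totals imply per year (just arithmetic, assuming steady compound growth):

```python
def annual_growth(factor, years):
    """Implied annual growth rate for a total speed-up over `years` years."""
    return factor ** (1 / years) - 1

for factor in (2, 3, 4):
    print(f"{factor}x in 2 years -> {annual_growth(factor, 2):.0%}/year")
# 2x -> 41%/year, 3x -> 73%/year, 4x -> 100%/year
```

So the glib quadrupling would mean literally doubling every year, while doubling or tripling over 2 years works out to a 41-73% annual gain.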

However, if the next generation of GPU designs really is modular (R700, Intel's secret chip), then who knows what may be possible if cost is not an issue.

Which just leads to my conclusion/argument (not very well made, I admit) that AA of some form will be needed for the foreseeable future, since a 24" display running 3840x2400 (or a similar pixel density) with fast response times is at least as unlikely as a graphics card capable of running said resolution at playable framerates.

So as far as I can see, higher forms of AA are much more reasonable to implement than such a beast of a combo, which may or may not remove the need for at least edge AA. And even at such a resolution, objects in motion may still need at least 2x AA to avoid crawling jaggies that might not otherwise be noticeable at that pixel density.
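The framerate half of that argument is easy to quantify: the raw pixel throughput needed scales with the pixel count (the baseline resolution below is my own assumption for comparison):

```python
def pixels_per_second(width, height, fps):
    """Raw pixel throughput needed to sustain `fps` at a given resolution."""
    return width * height * fps

baseline = pixels_per_second(1600, 1200, 60)
target = pixels_per_second(3840, 2400, 60)
print(f"3840x2400 needs {target / baseline:.1f}x the pixel "
      f"throughput of 1600x1200 at the same framerate")  # 4.8x
```

Nearly 5x the pixels per frame before shaders even enter the picture, which is why "just outresolve the jaggies" is a tall order for one card.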

Regards,
Croaker
 
However, if the next generation of GPU designs really is modular (R700, Intel's secret chip), then who knows what may be possible if cost is not an issue.

If cost is not an issue, most things are possible. Throw enough money at it, and you could have huge amounts of eDRAM, fast GDDR4 and some GPUs all clustered together into one humongous bale of 3D wonder. :oops:

But let's not get the "high end" quite that high just yet.... :D
 
According to Wikipedia, the R700 will be the first one from ATI that uses PCI-e 2.0: http://en.wikipedia.org/wiki/PCI_Express#PCI_Express_2.0

There aren't any motherboards that support PCIe 2.0 yet, and I haven't heard of one coming out soon. So, probably next year...

Motherboards lacking the support shouldn't stop anything, though. If I understood it right, PCIe 2.0 is backwards compatible "both ways" (as in, 2.0 cards work in 1.x mobos and vice versa).
 