there is no NV47, but G70 is on the way

_xxx_ said:
Uhm, how many "top" games are still playable @2048?
I don't know. Why do I not know? Maybe because... nobody bothers to benchmark at that resolution! Except FS

If they did, then I could decide for myself if the framerate would be good enough for me.
 
calm down mate. :) I also want more games that take advantage of higher res (read multi-monitor), but i dont go nuts with the font size. 8)

epic
 
epicstruggle said:
calm down mate. :) I also want more games that take advantage of higher res (read multi-monitor), but i dont go nuts with the font size. 8)

epic
Believe it or not, I was actually perfectly calm when I made both those posts. I wrote it like that because on previous occasions when I have brought the point up I've been ignored, and for comedic effect (well, I thought it was funny anyway, in a melodramatic kind of way :p)

But now that I've gotten your attention I can get back to the point. :) What I'm after is not better support in games for higher resolutions (although it wouldn't hurt). Most games I play already support it, either directly in their menus, or indirectly through editing setup files. What I want is for reviewers to start actually testing games and GPUs at those resolutions. Being able to play at 2048x1536 would be a big plus both for the games that are playable at that resolution and for the GPUs that make the games playable at that resolution. I would just like to know what I get before I spend any money on a game or a GPU.
 
NV48 is a native AGP NV40 core that was most probably ported to TSMC 110nm (as NV41 which became NV42 in the process).

NV48-based cards are already in retail as you can see.
 
geo said:
nutball said:
geo said:
The plot thickens. :LOL: Damnit, the dual-core hints just will not die.

They will if people will let them.

See this?

http://graphics.tomshardware.com/graphic/20050622/nvidia_7800_gtx-23.html

Why would I expect anything else from THG?

I wasn't reacting to speculation about multi-chip solutions, I was reacting to the use of the words "dual-core", and particularly in the same paragraph as the words "AMD" and "Intel" (as admirably demonstrated by that THG article). I hate people picking up buzz-words from one context and misusing them in another. What next, speculation about HyperThreading in GPUs? Hard-drives with 48x read/write speed?

Multi-chip graphics solutions exist already, you can buy them today. It doesn't take a genius to work out that they're an inevitable next step over the next couple of generations for more mainstream configurations.

AMD dual-core is two CPU cores on the *same piece of silicon*. In what way is that a solution to the problem of GPUs being ****-off great pieces of silicon? Intel is slightly different (two discrete dies in an MCM), and a more likely model for GPUs. But slapping two dies in a package and doubling the width of the memory bus sounds like a recipe for cooling and PCB design challenges; then again, WTF do I know :?
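
A rough back-of-envelope sketch of that last point (purely illustrative figures, not anyone's actual specs): doubling the bus width doubles the peak bandwidth, but it also roughly doubles the memory traces and pads the package and PCB have to carry.

```python
# Back-of-envelope only: illustrative figures, not anyone's actual specs.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth = bus width in bytes * effective data rate."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

single_die = bandwidth_gb_s(256, 1200)   # one die with a 256-bit bus
dual_die   = bandwidth_gb_s(512, 1200)   # two dies, bus width doubled

print(f"256-bit @ 1.2 GHz effective: {single_die:.1f} GB/s")
print(f"512-bit @ 1.2 GHz effective: {dual_die:.1f} GB/s")
# Twice the peak bandwidth, but also roughly twice the memory traces and
# pads to route and cool -- which is the PCB/cooling worry above.
```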

But don't you think that the Xenos model of multi-die makes more sense than the two-G70s-in-a-package model of multi-die anyway? Is Xenos "dual-core" using your semantics?
 
I wish more reviews would concentrate on higher than 1600x1200 resolutions and abandon 1024x768, which is utterly useless for that type of card - many games are CPU limited even at 16x12 without AA/AF! The argument that most gamers don't have monitors that support over 16x12 is absurd; most gamers will NOT buy such a card anyway!
 
Kombatant said:
I wish more reviews would concentrate on higher than 1600x1200 resolutions and abandon 1024x768, which is utterly useless for that type of card - many games are CPU limited even at 16x12 without AA/AF! The argument that most gamers don't have monitors that support over 16x12 is absurd; most gamers will NOT buy such a card anyway!
I agree
 
Kombatant said:
I wish more reviews would concentrate on higher than 1600x1200 resolutions and abandon 1024x768, which is utterly useless for that type of card - many games are CPU limited even at 16x12 without AA/AF! The argument that most gamers don't have monitors that support over 16x12 is absurd; most gamers will NOT buy such a card anyway!

So gamers have high-end CRTs or massive LCDs capable of 1600x1200 or more, but can't find the pennies for high-end graphics boards to drive them :?:
 
Rys said:
Kombatant said:
I wish more reviews would concentrate on higher than 1600x1200 resolutions and abandon 1024x768, which is utterly useless for that type of card - many games are CPU limited even at 16x12 without AA/AF! The argument that most gamers don't have monitors that support over 16x12 is absurd; most gamers will NOT buy such a card anyway!

So gamers have high-end CRTs or massive LCDs capable of 1600x1200 or more, but can't find the pennies for high-end graphics boards to drive them :?:

Actually that's not what I said, so I will rephrase, hoping to make my point clearer (in any case, excuse my Engrish :LOL: ). I've heard (more than once) the argument that "we don't bench in resolutions over 16x12 because there are only a few people with monitors that can run such resolutions". What I am simply pointing out is:
a) People who are interested in buying a G70 or an R520 are not people with 17" monitors. Those people will NOT see a difference by buying such a card because at the resolutions their monitors support, everything will be CPU limited. They can easily buy an X800 non-XL and get the exact same performance at 1024x768, for $400 less. There are exceptions, but they're just that: exceptions. More and more people (aka enthusiast gamers) are getting monitors that can easily display resolutions over 16x12, and they want to know how the latest and greatest fare at those resolutions.

b) Why is everyone STILL benching at 1024x768? What's the point? We already knew that this resolution was CPU-limited with the previous generation. Is there a purpose in presenting graph after graph where every single card in the review has the same performance? Hell, even at 1600x1200 many games get CPU limited nowadays with the 7800!

Hope I phrased things better this time :)
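
To make point (b) concrete, here's a quick sketch with made-up numbers (hypothetical cards and framerates, not real benchmark results) of what a CPU-limited chart looks like when you put it in data form:

```python
# Hypothetical 1024x768 results -- made-up numbers, not real benchmarks.
# When every card lands within a few percent of the others, the game is
# CPU-limited at that resolution and the chart tells you nothing about the GPUs.
results_1024 = {
    "7800 GTX SLI": 142,
    "7800 GTX": 141,
    "6800 Ultra": 139,
    "X800 XL": 137,
}

def looks_cpu_limited(fps_by_card, tolerance=0.05):
    """True if the fastest and slowest cards are within `tolerance` of each other."""
    fastest = max(fps_by_card.values())
    slowest = min(fps_by_card.values())
    return (fastest - slowest) / fastest <= tolerance

print(looks_cpu_limited(results_1024))  # True -> this chart isn't testing the GPUs
```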
 
I get you now, sorry for the misunderstanding :)

I can sort of see your point, but only as far as new high-end hardware goes, and only for the no AA and no AF case.

For example, a 6800 Ultra can't play Splinter Cell in full SM3.0 mode with HDR at framerates I'd call very playable, at 1024x768. Far Cry's little better.

For something like Riddick, 1024x768 is playable but only with around 2x AA and 8x AF. 4x AA if you don't mind an average framerate under 60fps. And most people will want some AA at 1024x768, won't they?

And that's on a 6800 Ultra, remember. What if you own a card slower than that, as almost every bugger on the planet does? Drop 1024x768 with new game titles and nobody will read your reviews, since all you'll do is show people how bad their hardware is! Most sites review the mid-range and low-end SKUs, too.

~720p is still disgustingly relevant, even in the PC space, even with new hardware and new games, I'd argue.

And playing new games is what most people do, right?
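
For scale, here's a rough pixel-count comparison of the resolutions being argued about in this thread (just arithmetic, nothing vendor-specific):

```python
# Pixel counts for the resolutions being argued about in this thread.
resolutions = {
    "1024x768":  (1024, 768),
    "1280x720":  (1280, 720),    # "720p"
    "1280x1024": (1280, 1024),
    "1600x1200": (1600, 1200),
    "2048x1536": (2048, 1536),
}

base = 1024 * 768
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,d} pixels ({pixels / base:.2f}x the load of 1024x768)")
# 2048x1536 pushes 4x the pixels of 1024x768, which is why the high-res
# numbers separate the cards while 10x7 often doesn't.
```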
 
In my previous post I said:
I wish more reviews would concentrate on higher than 1600x1200 resolutions and abandon 1024x768, which is utterly useless for that type of card...
which is what I am talking about really, those high end cards. You can't expect to have a review out about the 6200 or the X550 and not bench 1024x768. Heck, I'd bench 800x600 and lower if necessary too :)

But you are right about games that are pushing high-end cards hard even today; these should be benched in 10x7 because there is something to be seen from the reader's perspective. But a chart that shows every card having the same performance, ranging from a 7800 SLI combo to a lowly X800 is of no use imho.
 
One day I'll read an entire post and see everything in it :LOL: Sorry for missing the key point, I'm in doh mode today (started benching a CPU earlier for max performance after setting a lower multiplier in software while testing power outputs, only to wonder why performance was off...)

Your point was well made, I just didn't see it.
 
Rys said:
One day I'll read an entire post and see everything in it :LOL: Sorry for missing the key point, I'm in doh mode today (started benching a CPU earlier for max performance after setting a lower multiplier in software while testing power outputs, only to wonder why performance was off...)

Your point was well made, I just didn't see it.

No worries, we're all like that from time to time... I took an X850XT home to do some testing last weekend, only to find out when I got home that I forgot to get the special 6-pin power cable these cards need. Result? I had it in my office, looking at it the whole weekend, without being able to do anything with it :LOL:
 
nutball said:
Multi-chip graphics solutions exist already, you can buy them today. It doesn't take a genius to work out that they're an inevitable next step over the next couple of generations for more mainstream configurations.
Inevitable? For mainstream configurations?
If we just look at high-end gfx chips and extrapolate forward, then multi-chip looks fairly logical.
However, that flies in the face of the overall trend towards integration, continuously lower ASPs for desktop systems, and the trend towards mobile computers (not only are half of all computers sold laptops, but among private consumers the proportion is even higher).

So going to multi-chip solutions may be logical, but only within an ever narrowing niche, hence my questioning that "mainstream" statement. Even now, among people who play games, only a low percentage goes for the highest performing parts. Multi-chip will never be a cheap solution due to packaging and interconnect costs. If costs are driven even higher than today, how large is the market? Mainstream?

The only thing I can see happening in this direction is that you might go to a multichip solution that allows you to save on other parts of the gfx system at a given performance point, most notably the memory.

Graphics lends itself nicely to parallel execution, so more transistors will help performance substantially for the foreseeable future. But chips, packaging, interconnects and testing cost money, and when the market already mostly rejects the larger single-chip solutions, I have a really hard time seeing the inevitability of multi-chip graphics processing as an attractive mainstream solution.
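
(To be clear, the "lends itself to parallel execution" part isn't in dispute. Here's a toy sketch of why, which has nothing to do with how any real multi-chip product would actually divide its work:)

```python
# Toy illustration: pixels are independent of one another, so a frame can
# simply be split between two "chips". This has nothing to do with how any
# real multi-chip product divides its work.
WIDTH, HEIGHT = 640, 480

def shade(x, y):
    """Stand-in for a pixel shader: any per-pixel computation goes here."""
    return (x * 31 + y * 17) % 256

def render_rows(y_start, y_end):
    """Render a horizontal band of the frame, one row of pixels at a time."""
    return [[shade(x, y) for x in range(WIDTH)] for y in range(y_start, y_end)]

# "Chip 0" takes the top half, "chip 1" the bottom half; neither needs
# anything from the other, which is why adding silicon scales so well.
top = render_rows(0, HEIGHT // 2)
bottom = render_rows(HEIGHT // 2, HEIGHT)
frame = top + bottom
print(len(frame), "rows rendered across two independent halves")
```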
 
If you do not include 1024 or 1280 as well, you can't see where CPU limitation stops being a factor, so 1024 is still important; otherwise you have to take the reviewer's word for it.

Really you want 1024 to a very high resolution so you can see any scaling effects.
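
Something like this is what you'd want the data to show (made-up framerates for a single hypothetical card, purely to illustrate the scaling): the framerate stays flat while the card is CPU-limited and only starts tracking the pixel count once it becomes GPU-limited.

```python
# Made-up results for a single hypothetical card -- not real benchmarks.
fps_by_resolution = {
    (1024, 768):  120,   # flat...
    (1280, 1024): 117,   # ...still flat: CPU-limited up to here
    (1600, 1200): 92,    # starting to drop
    (2048, 1536): 58,    # falling with the pixel count: clearly GPU-limited
}

base_pixels = 1024 * 768
for (w, h), fps in fps_by_resolution.items():
    load = (w * h) / base_pixels
    print(f"{w}x{h}: {fps:>3d} fps at {load:.2f}x the pixels of 1024x768")
```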

I agree with Kombatant though, if a review site only does 1024 or 1280 then that is a waste.

With a lot of 17/19 inch LCD monitors being sold nowadays, 1280x1024 with a lot of permutations of AA and AF is important to a lot of readers.
 
nutball said:
I wasn't reacting to speculation about multi-chip solutions, I was reacting to the use of the words "dual-core", and particularly in the same paragraph as the words "AMD" and "Intel" (as admirably demonstrated by that THG article). I hate people picking up buzz-words from one context and misusing them in another.

Actually, I completely agree with that personally. Nevertheless, the anecdotal evidence is growing that that's where NV is going. If you think THG (and some other forum posters, even here) is starting that rumor rather than reacting to/passing on what they're hearing, that's where we part company --and I suspect the rumor mill on this one starts at an @nvidia.com email addy.
 