Anandtech is doing AMD a huge favor by testing 4K and leaving out 1080p. The vast, vast majority of people gaming on these cards are going to be using 1080p, considering it is the most common resolution.
There is likely a strong correlation between those who buy >=US$550 graphics cards and those who have multi-monitor or UHD gaming display setups.
Exactly, the rush beyond 1080p is IMO premature, especially on the next-gen consoles, but also on PC. Even for the top single GPUs, running 1080p@60fps is almost impossible with demanding games, and it seems every game nowadays becomes demanding when you add those PC-specific features: TressFX, DOF, HBAO, global lighting, deferred MSAA, TXAA, tessellation, etc. Heck, with the mass of "next gen" games incoming, even 1080p is going to be a struggle for these GPUs in some games at max settings while maintaining 60fps.
There's probably a strong correlation the other way around.
Why on Earth would that be the case?
Yes, I completely agree here. Despite these GPUs being the most suitable options for ultra-high resolutions, I'll bet the vast, vast majority of people that use them will still be gaming at 1080p, and it's there that the Ti extends its lead over the 290X.
4K benchmarks, if anything, should be left to a separate chart at the end, given that they're of little more than academic interest (rather than practical use) at present.
People with an R9 290 in current games should be playing at 4K downsampled to 1080p, or at least with super-sampling AA at 1080p, which should give the Hawaii chips the same advantage as playing at high resolutions.
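For anyone wanting to see what that downsampling actually does, here's a minimal sketch (NumPy, with a random array standing in for a frame rendered at 4K): rendering at 3840x2160 and box-filtering every 2x2 block down to one pixel is effectively 4x ordered-grid supersampling of a 1920x1080 image.

```python
import numpy as np

# Stand-in for a frame rendered at 4K (2160 rows x 3840 cols, RGB).
# In a real renderer this buffer would come off the GPU.
frame_4k = np.random.rand(2160, 3840, 3).astype(np.float32)

def box_downsample_2x(img):
    """Average each 2x2 block of pixels into one output pixel.

    Rendering at twice the width and height and filtering down like this
    is the simplest form of supersampling: 4 shaded samples per output pixel.
    """
    h, w, c = img.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_1080p = box_downsample_2x(frame_4k)
print(frame_1080p.shape)  # (1080, 1920, 3)
```

The GPU still pays the full cost of shading at 4K; only the final image is 1080p, which is why the relative standings at 4K carry over to this way of playing.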
You're asking why people running multi-monitor setups are likely to also buy expensive graphics cards? Ummm, because they're necessary to run games adequately at those resolutions?
But that's exactly what kalelovil said and you replied that the correlation was probably the other way around. Or am I drunk or something?
Kalelovil said that a high-end GPU correlates with multi-monitor/high-res, but trinibwoy didn't argue the opposite; he argued that high-end GPU owners don't necessarily game at very high resolutions, while high-resolution gaming setups are very likely to be run with high-end GPUs.
So "the other way around" is not the same as the opposite in this case. Basically, a high-end GPU owner can easily have only one 1080p display, but almost all 4K or triple-display gamers have a very high-end GPU.
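To put some numbers on that distinction (purely made-up figures, just to show how both statements can hold at once):

```python
# Hypothetical numbers, invented only to illustrate the asymmetry between
# the two conditional probabilities being discussed.
high_end_gpu_owners = 1000        # imagined number of $550+ GPU buyers
of_those_high_res = 200           # of those, how many also run 4K / multi-monitor
high_res_gamers = 220             # imagined number of 4K / multi-monitor gamers
of_those_high_end_gpu = 200       # of those, how many own a $550+ GPU

print(f"P(high-res setup | high-end GPU)  = {of_those_high_res / high_end_gpu_owners:.0%}")
print(f"P(high-end GPU  | high-res setup) = {of_those_high_end_gpu / high_res_gamers:.0%}")
# -> 20% vs. 91%: most high-end GPU owners are on a single 1080p screen,
#    yet almost every high-res setup is driven by a high-end GPU.
```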
I'd move the time frame to say that GPUs passed that point generations ago, likely before Furmark was even a thing.

Regarding boost/PowerTune/turbo, while it is definitely a can of worms, it's ultimately unavoidable. As these cards are increasingly power-limited, you enter the space where you can't turn the whole chip on at once. If you design your chip to run at the same clocks in Furmark as in a game, you're going to be leaving a lot of useful performance on the floor.
This is no different than the situation on CPUs for the last couple years, particularly on ultra-mobile (15W, etc. and down).
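As a rough illustration of why that is, here's a toy model of a power-limited boost loop. The constants and the linear power estimate are invented purely for the example and have nothing to do with how PowerTune or GPU Boost are actually implemented: a typical game leaves power headroom, so the clock climbs to the boost ceiling, while a power virus that lights up the whole chip gets held at a lower clock by the power limit.

```python
# Toy model of a power-limited boost controller. All constants and the
# power estimate are invented for illustration; real PowerTune / GPU Boost
# firmware is far more sophisticated.

POWER_LIMIT_W = 250.0
BASE_CLOCK_MHZ = 800.0
MAX_BOOST_MHZ = 1000.0
STEP_MHZ = 10.0

def estimated_power(clock_mhz, activity):
    """Crude power estimate: grows with clock speed and with how much of
    the chip the workload keeps busy (activity in [0, 1])."""
    return 80.0 + 200.0 * activity * (clock_mhz / MAX_BOOST_MHZ)

def settle_clock(activity):
    """Step the clock up while under the power limit, down when over."""
    clock = BASE_CLOCK_MHZ
    for _ in range(100):  # iterate until the clock settles
        power = estimated_power(clock, activity)
        if power < POWER_LIMIT_W and clock < MAX_BOOST_MHZ:
            clock = min(clock + STEP_MHZ, MAX_BOOST_MHZ)
        elif power > POWER_LIMIT_W:
            clock = max(clock - STEP_MHZ, BASE_CLOCK_MHZ)
    return clock

print(f"typical game (activity 0.7): {settle_clock(0.7):.0f} MHz")  # boosts to 1000
print(f"power virus  (activity 1.0): {settle_clock(1.0):.0f} MHz")  # held around 850
```

Fixing the clocks so that the worst case always fits under the limit would mean giving up the headroom that games can actually use, i.e. the performance left on the floor mentioned above.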
The issue, IMO, is that the silicon is usually designed to work at suboptimal performance levels; after all, a power virus is just code that maximizes silicon activity. At least we're finally getting GPU hardware designs that have moved beyond the "might die from rendering furry donuts" stage.
That's fair, I'm just noting that push is finally coming to shove more than in the past, i.e. cooling is at its limits for chip sizes/power, and throttling is becoming fairly significant.
Yes, I remember railing against how unacceptable stuff was there... at the time we got a lot of silly PR replies about how Furmark isn't legitimate and so on. Maybe we can get an apology now that they've done what we were saying they needed to do in the first place? The notable thing was just how exceptionally primitive they were shown to be, with hacky driver blacklists and cards killing themselves on demanding applications (or more recently, StarCraft 2's menu screen).
No vsync, FTL I guess. The most recent and roundly confirmed case of GPUs offing themselves without anyone even touching the cooler was in 2010, when StarCraft 2's menu screen fried GPUs.
Let's dial it back a few notches, muzz...