AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/Rumour Thread

4K res, IMO, is a pipe dream at this point. It's taking me two cards right now to get "good" and "fluid" fps on my 120 Hz overclocked Korean IPS 1440p monitor in the most demanding titles. I'm in no hurry to jump to sluggish 3840x2160 performance at 30-60 Hz.
In a lot of games you can turn down one marginally useful visual effect and boost framerates substantially. And from raw specs alone, it's reasonable to expect many Xbox One games that run fluidly at 1080p to run even better at 4K on a 290X.
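For what it's worth, here's the back-of-envelope arithmetic behind that raw-specs argument, using the commonly quoted peak FP32 figures. Purely illustrative; real games are bound by more than shader throughput:

```python
# Back-of-envelope: peak shader throughput vs. pixel count.
# These are the quoted peak FP32 figures, not measured performance.
XBOX_ONE_TFLOPS = 1.31   # 768 SPs x 853 MHz x 2 ops
R9_290X_TFLOPS = 5.6     # 2816 SPs x 1000 MHz x 2 ops

PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

compute_ratio = R9_290X_TFLOPS / XBOX_ONE_TFLOPS   # ~4.3x
pixel_ratio = PIXELS_4K / PIXELS_1080P             # exactly 4x

print(f"compute ratio {compute_ratio:.2f}x vs. pixel ratio {pixel_ratio:.0f}x")
# If a title is purely shader/fill bound, ~4.3x the compute covers the 4x
# pixel increase. Bandwidth, CPU and engine limits make this an upper bound.
```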

I honestly don't understand the 4K hype. The performance hit is extreme, the benefit (compared to other investments in graphics technology like physics, lighting, geometry) is meager at best.
Have you compared the visual payback we get from Ultra and even high quality in various games? It's often quite marginal nowadays.

EDIT: And borrowing from Dr. Evil's point, multi-monitor is largely analogous to 4K.
 
Okay, if you sit in a living room environment, it has its benefits. On a typical desktop it should be quite different, though.

Next gen consoles <<< PC ;)
Quality will be scaled back, I'm sure of it. Even now, some games won't run in 1080p.
 
I suppose with Mantle we will have 4K at acceptable performance, but I'd rather stick with my Eyefinity setup until I can upgrade all 3 monitors to 4K ones.
 
IMO 3D gives a massive impression of increased screen size. On a 'mere' 27" screen from 3-4 feet away, I feel like I'm playing on a cinema screen half the time when 3D is activated. Switching back to 2D makes the screen feel tiny. It is highly dependent on convergence settings, though.
 
Have you compared the visual payback we get from Ultra and even high quality in various games? It's often quite marginal nowadays.

EDIT: And borrowing from Dr. Evil's point, multi-monitor is largely analogous to 4K.

Very true. Yet cutting fps by about two thirds just for more pixels, not better content, doesn't seem like a good solution either. A Far Cry 2 or a Skyrim can still look like crap in 4K. But if you compare graphics at 1080p between games from, say, 2007 and today, the difference is easy to see. Spinning that thought further, I'd like to see tech like those fancy GPU-PhysX and tessellation demos used extensively in games to improve immersion and atmosphere. Surely that will cost a lot of performance too, but in my opinion those are the areas where investment is much more urgent than "just" more pixels over the same static or detail-lacking content.
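For reference, the "about two thirds" figure is roughly what naive pixel-count scaling predicts. A minimal sketch, assuming the game is entirely pixel-bound:

```python
# Worst-case fps scaling from 1080p to 4K for a purely pixel-bound game.
fps_1080p = 60
scale = (1920 * 1080) / (3840 * 2160)   # = 0.25, i.e. 4x the pixels
fps_4k_worst = fps_1080p * scale
print(f"{fps_1080p} fps at 1080p -> {fps_4k_worst:.0f} fps at 4K (worst case)")
```

In practice some of the frame cost is resolution-independent (CPU, geometry), so real-world drops tend to land a bit above that worst case, near the quoted two thirds.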
 
Well, I'd take 4K at 30-60 Hz over 1440p at 120 Hz any day, and IMO it's the best thing that's happened in a long time. It allows one to game with a huge perceived screen size; that's what I'm after. It's a bit challenging to achieve at the moment, but things are changing quickly.

I have no doubt 4K will be all the rage in five years or less. But I can foresee even a couple of 20nm GM110s or Radeon 390Xs struggling in the next year or two. How quickly will people adopt it? Surely the new consoles won't have the grunt to push 4K.
 
I have no doubt 4K will be all the rage in five years or less. But I can foresee even a couple of 20nm GM110s or Radeon 390Xs struggling in the next year or two. How quickly will people adopt it? Surely the new consoles won't have the grunt to push 4K.

No, but they do support the resolution in theory, which will help the adoption rate if someone brings 4K video services to the consoles.
 
I have a standard GTX 580 in my desktop and it's not silent at all when I'm playing games. The GTX 660 Ti in my HTPC is quieter, but it doesn't use the standard cooling. As I explained before, from what I see in some reviews, the quiet mode should be something different from just a more efficient cooling setup.
It was a joke, Totz.
 
Is AMD only allowing previews of 4K numbers because that's the 290X's best showing, or because 4K is going to be an integral part of a new marketing drive? Given that the only decent 4K monitor is $3,500, it may be a bit premature...

Both. That is, 4K is going to be an integral part of the new marketing drive because this is where Hawaii shines. Also because most GPUs will struggle in 4K, which AMD hopes will shift their product mix higher or entice upgrades.
 
I think 4K screens can be nice once they're small and cheap (like $200). Then people with slower GPUs can use any weird low res and have it look somewhat decent :mrgreen:
High PPI would give more pixels to work with for scaling. Worst case, use 1280x720; that's exactly 3x fewer pixels in each direction.
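The arithmetic behind that: these are the resolutions that divide a 3840x2160 panel evenly, so every source pixel maps to a whole NxN block. A quick sketch:

```python
# Lower resolutions that map onto a 3840x2160 panel as exact NxN pixel
# blocks, so each source pixel covers a whole number of panel pixels.
W, H = 3840, 2160
for n in range(1, 7):
    if W % n == 0 and H % n == 0:
        print(f"{W // n}x{H // n} -> {n}x{n} blocks")
# Prints 3840x2160 (1x1), 1920x1080 (2x2), 1280x720 (3x3, the case above),
# 960x540 (4x4), 768x432 (5x5) and 640x360 (6x6).
```

Whether the scaler actually replicates pixels cleanly instead of applying bilinear blur depends on the display and driver; the integer ratio just makes it possible.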
 
Unless the Rift uses 4K screens, 4K screens are irrelevant :LOL:

That is probably going to take a couple of years, but once it happens everyone will understand what 4K is about :) Huge perceived screen size. You won't hear people asking "how can you sit so close to the screen?!" when, ironically, the screen is put so close to your face that it's all you see :)
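To put a rough number on "perceived screen size": it's really just viewing angle. A minimal sketch, with the distances as illustrative assumptions (a headset uses lenses rather than raw distance, but the principle of filling the visual field is the same):

```python
import math

def horizontal_fov_deg(diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal viewing angle of a flat 16:9 screen at a given distance."""
    aw, ah = aspect
    width = diagonal_in * aw / math.hypot(aw, ah)  # screen width in inches
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

print(horizontal_fov_deg(27, 42))  # 27" at ~3.5 ft: ~31 degrees
print(horizontal_fov_deg(27, 8))   # same panel inches from the eyes: ~112 degrees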

4K TV prices are falling hard at the moment; they have already essentially replaced the high-end 1080p LCD market in the $3,500-5,000 segment, where those same models were $5,000-7,000 just 2-3 months ago. I expect monitors to drop too, but you want a big one.

I have no doubt 4K will be all the rage in five years or less. But I can foresee even a couple of 20nm GM110s or Radeon 390Xs struggling in the next year or two. How quickly will people adopt it? Surely the new consoles won't have the grunt to push 4K.

Thing is, you don't have to run with max settings, and yes, I know that sounds a little counterproductive. But as mintmaster hinted, many of the more demanding settings these days are little more than frame-rate dividers: low visual benefit, but a large hit on frame rate. Some optimization will be required, but running next-gen console ports at 4K with good settings at 30-60 fps should not be too big a problem if you have more than one high-end GPU.
 
Very true. Yet cutting fps by about two thirds just for more pixels, not better content, doesn't seem like a good solution either. A Far Cry 2 or a Skyrim can still look like crap in 4K. But if you compare graphics at 1080p between games from, say, 2007 and today, the difference is easy to see. Spinning that thought further, I'd like to see tech like those fancy GPU-PhysX and tessellation demos used extensively in games to improve immersion and atmosphere. Surely that will cost a lot of performance too, but in my opinion those are the areas where investment is much more urgent than "just" more pixels over the same static or detail-lacking content.

I've been gaming at 1500p (2400x1500 in a window) for over 4 years now and dread going down to 1200p when I visit my mother in Japan.

I might go in on this if it offers a nice boost at my preferred gaming resolution, but I'm not sure I want to buy any video card over $400. Will have to see.

It'll also be useful for people that do multi-monitor gaming. 4k isn't the only thing that pushes a lot of pixels. :)

Regards,
SB
 
Also, there's almost no difference between Uber Mode and Quiet Mode.

Because Quiet mode doesn't throttle clocks/performance in the vast majority of situations.

WhyCry is asking why Uber mode is worse than Quiet mode in BioShock Infinite, but the results are well within the margin of error.

We need more data on power usage, actual clockspeed, temperature and power state to get a full picture of the situation.
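Something like the following run-to-run comparison is what's missing from those previews. The numbers here are made up purely to illustrate the margin-of-error point:

```python
# Minimal sketch: decide whether a mode-vs-mode fps delta is meaningful
# compared to run-to-run variance. All fps values below are hypothetical.
from statistics import mean, stdev

quiet_runs = [92.1, 93.4, 91.8, 92.9, 93.0]
uber_runs = [92.8, 91.9, 93.5, 92.4, 93.1]

delta = mean(uber_runs) - mean(quiet_runs)
noise = max(stdev(quiet_runs), stdev(uber_runs))

print(f"delta = {delta:+.2f} fps, run-to-run noise ~ {noise:.2f} fps")
if abs(delta) < 2 * noise:
    print("Within the margin of error - no real difference shown.")
```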
 
A new slide, supposedly leaked from the AMD Reviewer's Guide for the R9 290X, shows the card scaling from 1.8x to 2.0x in CrossFire configurations.

Maybe, but AMD claims "great CrossFire scaling" with similar charts for every new high-end GPU, and it never actually happens across the board.
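For scale, here's what those claimed numbers would mean in practice, with a hypothetical single-card result (and note that average fps scaling says nothing about frame pacing):

```python
# Reading a "1.8x to 2.0x" dual-GPU scaling claim as fps and efficiency.
single_fps = 40.0                  # hypothetical single-card result
for scale in (1.8, 2.0):
    print(f"{scale}x -> {single_fps * scale:.0f} fps "
          f"({scale / 2:.0%} of ideal dual-GPU scaling)")
# CrossFire's alternate-frame rendering raises average fps, but 1.8x on a
# bar chart isn't necessarily 1.8x in perceived smoothness.
```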
 
WRT Uber vs. Quiet: perhaps they are testing the card on an open-air test bench, where the plentiful cool air is enough to stop the card from throttling. Stick it into a smaller case, and the difference between Uber and Quiet modes should be much bigger.
 
WRT Uber vs. Quiet: perhaps they are testing the card on an open-air test bench, where the plentiful cool air is enough to stop the card from throttling. Stick it into a smaller case, and the difference between Uber and Quiet modes should be much bigger.

Good point.
 
The key will be whether users can override the Quiet and Uber modes altogether and overclock the card the traditional way, disregarding noise and temperature considerations.
 