What's the definition of a 4K capable GPU? *spawn *embarrassment

Discussion in 'Architecture and Products' started by Malo, Jan 7, 2019.

  1. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,421
    Likes Received:
    605
    Not as clear cut, but implied by yourself:

If it's clear-cut that NVIDIA wants to convince graphics card customers to spend more money to play at 4K, as you said, then you are in fact arguing that NVIDIA would treat the RTX 2060 as perfectly fine for 4K, so that gamers spend less, and brand it as such.

    But there is more:

Regarding my remark about tilting at windmills, I'm sorry you took it the wrong way, but I was not explicit enough either.

I did not intend that as flamebait, but only as an expression of something you finally came to grips with in this answer: that there is no conspiracy here to hide RTX 2060 4K results.

Not against the card, no. But let's drop that, yes; it is just "déjà vu" and I'm not expecting any sort of closure.

    On that note:

    I'm glad you have finally seen that, but that was not what you defended until now:

PS - It is funny how I'm a bit too expressive sometimes (not always intended as flamebait), but in all honesty, you are not far behind, with very strong assertions (involving the word "clear" a lot, or "I don't believe for a second"), only to backtrack a short time later on what you said (a little bit, come on, it's true; I'm not trying to flamebait...).
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,082
    Likes Received:
    10,279
    Location:
    Under my bridge
    :???: 4K's not a binary switch. If a video card can output 4K to a 4K display, it's a 4K card. It'll play games at whatever framerate based on the user's settings.

    I'd have thought it helpful for consumers to have info on how well any GPU runs at different resolutions and settings to make an informed decision over which card to buy, whether an alternative in the same family that offers better bang-per-buck, or if they'll stretch for more frames, or if the alternative company offers a better option, particularly if one can grab a good deal somewhere.

    What other cards are priced <$350? Are these cards options for someone with a limited budget to play games on a 4K display, or are 2070+ now the only options for people wanting to play 4K?

In terms of cost, a 4K monitor can be had for <£300. A 2060 is £350; a 2070 is £460. I don't believe that people who have £650 to spend on a 4K display and an RT-enabled GPU can be assumed to have an extra £100+ to get a 2070. If 4K monitors were £1000, sure, but they're not, and I don't see a fair argument that people who are price sensitive aren't going to be interested in gaming at 4K.
     
    BRiT, vipa899 and ToTTenTranz like this.
  3. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,531
    Likes Received:
    4,194
To that, I'll add that many people have 4K TVs, and the budget for those is considered "home appliance" money rather than gaming PC money.
In practice, it means gamers will often split the bill for a new TV with their spouses/girlfriends/boyfriends, whereas PC monitors come out of their own nerd-stuff budget.
And people often connect their gaming PCs to TVs. It's not that far-fetched.


You could have watched the video you so vehemently asked for the link to...
     
  4. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
What's so hard for people to understand? If it can run games at lowered settings at 4K, at perhaps 30fps with dips, it's still a 4K-capable card. It's like some think 4K has become the standard now? It's still a very high resolution, with steep hardware requirements.
If the 1080 was considered a 4K-capable GPU, then the 2060 is too. Its performance is very close, even at 4K.
What about the PS4 Pro/One X? They have much less capable hardware, but they are considered 4K capable, and rightfully so, as both can output 4K games.

IQ/visuals are probably the most prominent thing to most people, and ray tracing on the new GPUs impresses everyone — well, not the naysayers, but many. I played BFV once, and I just hope I'll get an RTX GPU sooner or later, perhaps an RTX 30xx series card. It's too expensive right now. Screen-space reflections are a generation behind now.

Turing also adds DLSS, variable rate shading, and things like AI/deep learning. Much can be done with the new Turing architecture; we just need more software to show it in games, I think. There's a rather large list, but I would like to see that trend continue in the future.
     
  5. Benetanegia

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    216
    Likes Received:
    131
And again, I disagree vehemently. The 1080 was a 2016 product and could play (some) 2016-2018 games at 4K fairly well. It still plays many, but far fewer of them than when it launched. In a year or two, will it play new games at 4K? Nope. Highly unlikely. However, it has been a card capable of "being a 4K card" for most of its life. In 2020 the RTX 2060 will most likely still be selling; will it play most games at 4K then, especially with its 6GB of VRAM? Hell no.
     
    DavidGraham and Picao84 like this.
  6. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,421
    Likes Received:
    605
Sure, my Schrödinger's remark was about the flip-flopping of opinions on RTX 2060 performance at 4K and the associated shenanigans of NVIDIA constraining reviewers' actions, i.e. claims that NVIDIA was controlling reviewers both because 4K performance is good enough (therefore competing with the RTX 2080) and because performance is bad (e.g. RE7).

This is where we will have to agree to disagree, then. From my own experience and that of friends, anything north of £150 for a monitor is hugely expensive, never mind £300!

Yes, people who buy TVs might spend a bit more, but they are taking into account the size of the screen, which will be much larger than the average monitor, as well as the fact that it has a tuner, smart apps (especially with Android TV now), etc., something most monitors won't have. The difference in value proposition between the two product types is really huge.

I would definitely expect someone who buys a £250+ monitor to get at least an RTX 2070 to go with it. It's not like they have to spend the money all at once! If you already have a decent PC and the monitor, you are just upgrading the GPU.

Edit - Looking at the Steam Survey, FWIW, users with 4K monitors represent only 1.42% of all users. This should give you an idea of how niche a market 4K still is for PC gaming.
     
    #46 Picao84, Jan 10, 2019
    Last edited: Jan 10, 2019
  7. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
Neither will the Pro or One X, yet they are classified as 4K capable now. Even a 2080 will suffer at 4K in two or three years.
     
  8. Benetanegia

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    216
    Likes Received:
    131
Yes, they will be able to do 4K; there's no going back from what they are offering right now (which is not true 4K anyway). Console hardware specs are not a moving target, unlike PC.

As for the 2080, in two or three years it will be replaced, and by then it will have done its expected job. The 2060 will not be replaced in just a year, and it will most likely start to fail at doing 4K in that timeframe. No one should expect a card like the 2060, with its 6GB of VRAM and its bandwidth, to be able to do 4K in the near future. It doesn't even do such a great job right now...
     
    Picao84 likes this.
  9. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,082
    Likes Received:
    10,279
    Location:
    Under my bridge
Again, it's not a binary switch. You can lower quality settings to hit playable framerates, or play 4K at lower framerates. You can also play your old and favourite games in 4K. So unless future games absolutely won't run at 4K (a fat G-buffer that doesn't fit the VRAM, that sort of thing), it's still a 4K card.

Technically (and this is still supposed to be a technical forum ;)) a 4K GPU is simply any card that can output 4K and has enough VRAM to render a 4K output, even if that's 1 fps in some game. If you want to be more exclusive, you can mandate that it has to run at 30 fps minimum on the lowest quality settings.
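As a minimal sketch, the two readings above (permissive "can output and render 4K" vs. strict "also sustains 30 fps at lowest settings") could be expressed like this. The `GpuReport` fields, the sample numbers, and the ~4 GB VRAM floor are illustrative assumptions on my part, not anything NVIDIA or reviewers actually define:

```python
# Hypothetical sketch of the two "4K capable" definitions discussed above.
# All field names and thresholds are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class GpuReport:
    can_output_4k: bool              # card/driver can drive a 3840x2160 display
    vram_gb: float                   # on-board VRAM
    min_fps_lowest_settings: float   # worst-case fps at 4K, lowest settings

def is_4k_capable(report: GpuReport, strict: bool = False) -> bool:
    """Permissive: outputs 4K and has VRAM to render it (assumed ~4 GB floor).
    Strict: additionally sustains 30 fps at the lowest quality settings."""
    basic = report.can_output_4k and report.vram_gb >= 4
    if not strict:
        return basic
    return basic and report.min_fps_lowest_settings >= 30

# Made-up numbers, not benchmark data:
rtx_2060 = GpuReport(can_output_4k=True, vram_gb=6, min_fps_lowest_settings=42)
print(is_4k_capable(rtx_2060))               # True under the permissive reading
print(is_4k_capable(rtx_2060, strict=True))  # True only if it holds 30 fps
```

The point of the split is that both camps in this thread can be right at once: the same card passes the permissive test and may pass or fail the strict one depending on the game used to measure the minimum.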
     
    BRiT and vipa899 like this.
  10. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
The 2060 is faster than the One X and Pro, and will still be, even in 20 years. By the way, RDR2 on the One X offers native 4K; it's the most impressive game so far.
     
    BRiT likes this.
  11. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,421
    Likes Received:
    605
Just forget it; I said exactly the same things you are saying now, pages ago. This will go in circles until everyone is exhausted...
     
    pharma likes this.
  12. Benetanegia

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    216
    Likes Received:
    131
    Come on now. That's not what's being discussed here. I don't think it's in anyone's mind to buy a new card to play pong at 4K...
     
    pharma and Picao84 like this.
  13. Benetanegia

    Newcomer

    Joined:
    Sep 4, 2015
    Messages:
    216
    Likes Received:
    131
    Sigh... But in 20 years, if they still choose to make games for them, they'll target and optimize for that hardware. That will not be the case for the 2060...
     
    pharma and Picao84 like this.
  14. Malo

    Malo Yak Mechanicum
    Legend Veteran Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    6,770
    Likes Received:
    2,819
    Location:
    Pennsylvania
Yay, the lowest of the low Intel iGPUs are now considered 4K gaming GPUs!

What's your break point? If I can run CS:GO at low settings at 15fps, is that considered a 4K gaming GPU then? Generally, neither PC gamers nor reviewers consider 4K30 at high settings, on average across titles, to be a true 4K gaming GPU. The 60fps target has always been considered the true measure, and it has been that way for a long time. "Can it run Crysis?" has never been about achieving 30fps...

Yes, it's somewhat subjective, and yes, depending on the game and the level of settings you're content with, you could indeed be happy at 4K with a lower-mid GPU. But it's generally not what is considered a true PC 4K gaming experience.
     
    w0lfram, pharma and Picao84 like this.
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,082
    Likes Received:
    10,279
    Location:
    Under my bridge
You're reducing the argument to make it fit. Anthem will run on a 2060 at 4K with reduced quality. Anthem 2 will probably run on a 2060 at 4K with reduced quality.

Pick a 1080p GPU from yesteryear. Can it play modern games at 1080p? Battlefield 1: minimum spec, GTX 660; performance, 1080p at >30fps. If you bought a GTX 660 in 2012 to play 1080p games, you'd still be playing 1080p versions of the latest games on it 6 years later.

That's science, and data. Real, hard data that a GPU bought for a resolution still games at that resolution a good 6 years later, which is as long as a console generation and a fair lifetime for a GPU. The GTX 1080 is still an option for 4K gaming now and for a few years yet. The RTX 2060 is an option for 4K gaming now and for a few years yet.
     
    BRiT and vipa899 like this.
  16. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,082
    Likes Received:
    10,279
    Location:
    Under my bridge
Precisely. It's stupid to hold GPUs to fuzzy definitions, because users choose their own settings. It's ridiculous that everyone's so caught up on this and has utterly derailed this thread, which I haven't had time to clean up by spawning a "what's the definition of a 4K GPU?" thread.

It should be dropped. The fact is, people can choose to buy a 2060 for playing 4K games at a decent framerate, and it will be able to play 4K games at a decent framerate for years to come on the latest software, as borne out by all the GPUs that have come before and continue to game at whatever resolution. There's no point arguing that. Attention should return to how well the 2060 is performing in games and whatever else these icky PC threads attempt to discuss.
     
    vipa899 likes this.
  17. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,768
    Likes Received:
    1,497
Oh, that. I did watch until his complaining got on my nerves. Let's see if he does any better with the Vega VII.
     
  18. vipa899

    Regular Newcomer

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
The GPU in the Xbox One X provides native 4K, with a quite stable 30fps for RDR2. It has some of the most impressive graphics out there, up there with Sony's AAA games, and that says a lot. That's RX 580/1060-class performance.
The 2060 will do just fine at lowest settings at 4K in two years, if not more.

DF found RDR2 technically very impressive, running at 4K 30fps. 30fps isn't always a bad thing, especially in single-player games.

People who buy entry-level Turing products for the price tag might be content with 4K 30fps. You're raising the bar to a minimum of 60fps somehow; that's a steep requirement even for mid-range GPUs. You sure don't like gaming on consoles, do you? There we live with 30fps and upscaled 4K for the most part. There's nothing wrong with that, as 4K is a huge resolution with huge requirements even on today's hardware.

I can confirm this: my PC that's connected to the TV has an MSI GTX 660, and I'm able to run Wolfenstein 2 at 1080p 30fps, quite stable too. Yes, I have to reduce settings, but nowhere below base PS4 settings, which is very acceptable IMO. Same for Doom; it looks and plays even better.
Not my video, but a 760 isn't far off from a 660.




Edit: Saw Shifty's last post after I wrote this one. Won't continue about it :)
     
  19. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    15,553
    Likes Received:
    4,458
    You mean like the 980 TI? Where NV wanted it tested at 4k?

    Or the 1070? Which performs worse at 4k than the 2060, but was tested at 4k at launch?

This is a ridiculous assertion, as no graphics card performs the same on future games as it does on past games. :p

    This would be like saying no card should ever be tested at anything other than 640x480 because in the future games it won't perform as well at higher resolution as it does now. Oh wait, maybe 640x480 is too high? Perhaps 320x240 would be better? :)

    Regards,
    SB
     
  20. w0lfram

    Newcomer

    Joined:
    Aug 7, 2017
    Messages:
    109
    Likes Received:
    27

But that is the point. The RTX 2060 is not a 4K card, because the RTX 2070 isn't either...

Just because someone is able to, or decides to, hook an RTX 2060 up to their 4K television for cinematic movies and arcade games doesn't mean it can push 3840x2160 pixels at stable frames. And yes, a stable 60 frames/Hz is the de facto standard for being able to push a particular resolution.

My RTX 2080 is barely ahead of the game at 3440x1440, and my Ti struggled too... so how is an RTX 2060 going to handle 4K when a 2080 Ti can't handle 2K at stable frames?

I think an RTX 2060 would make a great 4K desktop computer and media machine, for light cinematic gaming or downscaled 1080p stuff.
     
    Picao84 likes this.