What's the definition of a 4K capable GPU? *spawn *embarrassment

Discussion in 'Architecture and Products' started by Malo, Jan 7, 2019.

  1. What I meant is crystal clear, yes.
    It's just that you didn't get it.

    First of all, 4K is not an absolute term that perfectly defines all the variables that influence IQ and framerate. It just says how many pixels are rendered per frame.
    Therefore, I'd never say that card X or Y "should run 4K" no matter what the price point. It's a hollow argument from the get-go.
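    To make the pixel-count point concrete, here's a quick back-of-envelope calculation (purely illustrative; the resolution figures are the standard ones):

```python
# Pixels rendered per frame at common resolutions, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
# 4K pushes exactly 4x the pixels of 1080p and 2.25x those of 1440p.
```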

    A $350 discrete graphics card should have been tested at 4K, and the price isn't a valid argument, because consoles that cost $300-400 run at that resolution. That holds especially true for the Xbox One X, which AFAIK doesn't even have an ID buffer for checkerboard rendering.


    What's the excuse for GTA V then?

    The last time I recommended a graphics card purchase it was an RTX 2070, which a good friend of mine did buy and is using ATM. It was a great purchase at the time, even despite the unused features.
    I'm interested in the frametimes for the RTX 2060 at 4K because I'm considering buying one for my HTPC. Which, lo and behold, is connected to a 4K TV (and currently has an RX 480). On my HTPC I play mostly 3rd-person action games like Tomb Raider, so I don't really need a solid 60 FPS, and the RT hardware would be convenient for offline renders in my job.


    Am I really the one with an axe to grind here? Your very first post earlier today was already loaded, and this last one is 80% flamebait and personal accusations.
    Kindly back off or just focus on the arguments, please.
     
  2. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Read my post again and you will get your answers.
     
  3. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    It should run at 4K at that price point and it can run at 4K at that price point. Just change the game settings (to match the consoles). Since the GPU can run 4K games, why should benchmarks of its 4K performance be ignored? What's needed are benchmarks comparing it at different settings and framerates versus the other cards, which is the very purpose of benchmarks.

    Malo is right that benchmarks versus higher-end cards running high-end settings are not a basis for calling the 2060 a crap option, but the implication that a $350 card shouldn't be considered for 4K isn't valid either. The 2060 should be benchmarked at different resolutions and different quality settings for fair, widespread comparisons. Although that is predicated on the purpose of benchmarks being fair comparison and not just fanboy internet pissing contests!

    I don't give a shit about 'axes to grind'. With none of the PC history baggage the long-time posters here are carrying, I'm just seeing an argument about 2060 benchmarking.

    Amiable posters should assume a misunderstanding rather than an agenda. This line..."You are just grasping at straws, even going to the point of defending that reviewers should test old games to see if a new card can ran them. Do you know how ridiculous that sounds?" is just making an argument. Totz hasn't even made a strong point to be 'grasping at straws' with. He just said the 2060 should be benchmarked at 4K, as it's a viable option for PC gamers in the market for a new GPU wondering what they can get for their money, and whether $350 is good enough for a reasonable 4K experience.

    I'm even somewhat bemused as to your response and position. Should the 2060 be exempt from 4K benchmarks? How does that help consumers?
     
  4. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    I'm not saying it should be exempt, but I'm not shocked reviewers don't do it either. How many people who are interested in the cutting edge (which 4K still is) and have the money for it would be looking at a $349 card? Or, in other words, for anyone whose spending limit is $349, running 4K is a bonus, not the main objective.

    This is the kind of reasoning reviewers apply to better optimise the time they spend on reviews. There is no conspiracy here; it's not significantly different from what reviewers have always done. Equally, the GPU manufacturers, both Nvidia and AMD, have always tiered products by target resolution. Why is this concept alien and, OMG, the spawn of Al <ModEdit>, all of a sudden?

    Do you think reviewers should test a GTX 1050 Ti at 4K as well? They can reduce the settings and maybe achieve 30 FPS? What about a GT 1030? No, connect a laptop that costs more than a console and try that MX150! After all, they are both full systems! Consumers need to know!!!!

    See how ridiculous this argument is now?

    Edit - If anything, what we should be discussing is the huge increase in price for an x60-tier card, which overshadows the fact that it performs like a GTX 1070 Ti. Barely any improvement in price/performance.
     
    #24 Picao84, Jan 8, 2019
    Last edited by a moderator: Jan 9, 2019
    pharma likes this.
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    It's not ridiculous. It's a fair argument and you present a fair counter-argument. In an ideal world, the 2060 would be benchmarked at a wide range of resolutions and quality settings, but time constraints mean websites are going to limit their attention to targets they think represent their audience. Fair points. It can still be argued that it's reasonable for someone buying a $350 GPU to be interested in gaming at 4K. If that discussion were to proceed in an orderly fashion, people would first present anecdotal evidence as the go-to, followed by stats if someone finds them for PC hardware expenditure, monitor resolutions, and even national average incomes and PC gaming populations.

    Plenty of sane debate to be had without needing to make it personal or claiming the different view to yours is absurd. Most importantly, you could have presented your counterpoint in the above fashion without jumping straight to accusations. ;) Would have saved on typing and internet electrons too, making polite discussion the more efficient, ecofriendly option. :D
     
    #25 Shifty Geezer, Jan 8, 2019
    Last edited: Jan 9, 2019
    BRiT, vipa899 and pharma like this.
  6. Computerbase.de has done a frametime comparison between the 2060 and the 2070, covering games where they found the 2 GB less VRAM to make a substantial difference:
    https://www.computerbase.de/2019-01...gramm-final-fantasy-xv-2560-1440-6-gb-vs-8-gb

    They only did it at 1080p and 1440p, but in both Final Fantasy XV and CoD: WWII there are some pretty tall spikes. FFXV shows a ~18 ms average with spikes up to 60-70 ms, so this is bound to produce stutter.
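    A rough way to quantify that kind of stutter from a frametime trace is to flag frames that take far longer than the typical frame. A minimal sketch (the trace and the 2x-median threshold are made up to mimic the ~18 ms average with 60-70 ms spikes described above):

```python
# Flag stutter spikes in a frametime trace: frames taking more than
# `threshold` times the median frametime count as visible hitches.
from statistics import median

def find_spikes(frametimes_ms, threshold=2.0):
    med = median(frametimes_ms)
    return [t for t in frametimes_ms if t > threshold * med]

# Hypothetical trace: mostly ~18 ms frames with two 60-70 ms hitches.
trace = [18, 17, 19, 18, 65, 18, 17, 70, 18, 19]
print(find_spikes(trace))  # → [65, 70]
```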


    Since the RTX 2070 has the same architecture and doesn't show any of those spikes, I'm guessing this is the VRAM filling up and the card sitting idle while waiting for data to stream from main system RAM over the PCIe 3.0 bus.

    This is seemingly happening with only a handful of games (high-profile games aren't showing this behavior); I wonder if nvidia is hard at work doing manual driver optimizations on a per-game basis to avoid filling the VRAM with latency-sensitive data.
    Since Turing doesn't have a multi-level memory access organizer like HBCC, maybe they're doing the same as AMD did when they launched Fiji (which was also tight-ish at 4 GB at the time of release).
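    The overflow theory is easy to sanity-check with rough bandwidth numbers (both figures are approximations, not from the article): PCIe 3.0 x16 moves on the order of 16 GB/s, while the 2060's 192-bit, 14 Gbps GDDR6 moves 336 GB/s, so anything that has to stream from system RAM arrives over a link roughly 20x slower than VRAM.

```python
# Back-of-envelope comparison of local VRAM vs. PCIe 3.0 bandwidth.
pcie3_x16_gb_s = 16.0        # approx. usable PCIe 3.0 x16 bandwidth, GB/s
gddr6_gb_s = 14 * 192 / 8    # 14 Gbps/pin * 192-bit bus / 8 bits = 336 GB/s

print(f"VRAM is ~{gddr6_gb_s / pcie3_x16_gb_s:.0f}x faster than the PCIe link")
# → VRAM is ~21x faster than the PCIe link
```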
     
    #26 Deleted member 13524, Jan 8, 2019
    Last edited by a moderator: Jan 9, 2019
  7. vipa899

    Regular

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Since the 2060 is more capable at 4K than the 4K consoles, it should be tested at that res too. It's as much a 4K GPU as a 1070 Ti/1080. It will do fine at 4K if you can accept reduced fps/settings.

    My GTX 670 2GB / i7 920 PC is older than the base PS4 but can still run games at higher fps/settings: a 2008 CPU and an early-2012 GPU. My GPU was in that price range then.
    Wolfenstein and Doom both give a better experience on the 670 PC. Upgrading every 6 months is a thing of the past.
    That 2060 is probably in line with PS5 performance.
     
    Deleted member 13524 likes this.
  8. I hope not... But it should definitely be closer than an Xbox One X.
     
    vipa899 likes this.
  9. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Not for enthusiasts, it isn't (as long as there are new cards every 6 months, lol). And even if you are right about slow upgrades, that's precisely why going for the top GPU makes sense for enthusiasts. That's been the whole point from the beginning. 4K monitors are still expensive, so whoever has one will surely not settle for a $349 GPU that can play 4K at medium settings in the short term and low settings in the medium term. The x60 range was never for enthusiasts.

    I sure hope the PS5 is at least 30% more powerful than an RTX 2060.
     
    #29 Picao84, Jan 8, 2019
    Last edited: Jan 8, 2019
    pharma likes this.
  10. vipa899

    Regular

    Joined:
    Mar 31, 2017
    Messages:
    922
    Likes Received:
    354
    Location:
    Sweden
    Same here, but it's not too unrealistic, with the 2060 being like a 1080, even beating it. A while ago people thought ballpark 1080 performance wasn't far off what could be in there. TFLOPS don't say much.
    It all depends on what AMD has with Navi.
    I'm off-topic with this though :)
     
  11. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Yup, that's hyperbole, reflecting how hyperbolic I considered your arguments to be, in light of what has always been the practice (most reviewers didn't test the GTX 1060 at 4K either, for example, and that includes the Tom's Hardware review from your argument). You implied that reviewers were somehow in bed with nvidia for not testing the RTX 2060 at 4K, so nvidia could steer them towards the more expensive options. In my view that's nonsense, hence my sassiness.
     
    #31 Picao84, Jan 8, 2019
    Last edited: Jan 8, 2019
  12. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,451
    Likes Received:
    471
    So the RTX 2060 is faster than the GTX 1080 already? Yesterday, when I checked the reviews, average performance was at the level of a GTX 1070 Ti. I'd say the RTX 2060 is faster in high-FPS situations and slower in low-FPS situations: significantly faster than a GTX 1070, but in no way better than a GTX 1080. As for the GTX 1070 Ti, its 8 GB of memory and more stable performance seem to make it the better solution.
     
    Picao84 likes this.
  13. OK, just to clear up the 4K conversation: it started with me writing this:




    When sending out new GPUs to reviewers, IHVs provide them with guidelines on how to review the hardware. It's a fact that these guidelines exist, as it's also a fact that the tables they provide show the resolutions/settings they want the cards to be tested at.
    I'll stand by what I wrote: the guidelines for the RTX 2060 were clear about testing at 1080p and 1440p. I don't believe for a second that nvidia encouraged reviewers to test games at 4K. On the contrary.

    And they did so to avoid showing some embarrassing results, like that one in Resident Evil, not because the RTX 2060 is inherently unable to run many games at 4K with high/ultra settings (it's not).
    It's also not because it has "60" in the name, nor because it costs $350.
    Name and price don't fully determine how far a card can go. Performance does.
     
  14. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Performance has always dictated name and price, so name and price create expectations of what the performance is. No matter how you twist it, you know it's true.
     
  15. Benetanegia

    Regular

    Joined:
    Sep 4, 2015
    Messages:
    394
    Likes Received:
    425
    I disagree with the second sentence. Performance at release may not hold true on future titles throughout the life of a card, thus painting a false picture of the card's capabilities when people down the line read launch-day reviews and expect that launch 4K performance to still hold many months or two years later. Cards tested and found capable of 4K at launch were labeled "4K cards" in the past, and that label has stuck for their lifetime despite no longer being true. I find painting a mid-range card (regardless of price, it is still mid-range Turing) as 4K capable quite problematic, IMHO. Just my opinion though, and I'm not against testing at 4K as an extra data point; I'm kinda against the conclusions that inevitably arise from testing at such a resolution, with cards that may appear to punch above their weight on games that are, or will be, old during the card's lifetime.
     
    Picao84 likes this.
  16. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    And yet none of those jumps happened in two consecutive generations.

    Between the GTX 260 and the GTX 560 there were the 8600 GTS, the undying variations of G92, and the GTX 460.

    Between the GTX 560 and the GTX 760 there was the GTX 660.

    Between the GTX 760 and the GTX 1060 there was the GTX 960.

    In the context of the RTX series, where 4K is still the top "doable" resolution with a single GPU, it's only natural that a mid-range GPU (one can argue whether $349 is dead-centre middle, but it's not high end) is not expected to perform great at 4K.

    Just like you said, there will be a point in time when it will not make sense not to test an xx60 at 4K. But it is not today, when even an RTX 2080 Ti has trouble running some games at 4K with all the bells and whistles!

    Why do we keep beating around the bush with this? It's just common sense.

    Look, if you think the RTX 2060 should be tested at 4K, fine. Build your own website, buy the card and test it yourself. Enough with tilting at imaginary windmills.
     
    #36 Picao84, Jan 10, 2019
    Last edited: Jan 10, 2019
  17. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Exactly! You can bet that if nvidia had branded the RTX 2060 as 4K ready, we would now be discussing whether nvidia was misleading customers.

    I mean, how many discussions have there been here already about Nvidia cards not performing as well in the future (especially Kepler ones) compared to AMD cards of the same age?

    Now people want a mid-range card to be branded as 4K when 4K is still a tough nut to crack? Madness, madness, I tell yah!
     
    DavidGraham and pharma like this.
  18. Picao84

    Veteran

    Joined:
    Feb 15, 2010
    Messages:
    2,109
    Likes Received:
    1,195
    Why would NVIDIA care in this situation? They did not say the RTX 2060 was 4K capable. If they had, and reviews then showed its 4K performance was limited, I would understand you. But that's not the case.

    Or do you really think the usual buyer of x80-and-above tiers would suddenly go "oh, hold on, I'll get the RTX 2060 instead because I do not need an uncompromised solution"?

    This conversation is borderline schizophrenic.

    Totz first says that the RTX 2060 can run 4K and that NVIDIA is not letting reviewers test it because they want people to buy the RTX 2080 instead.

    But then the same Totz says "oh, they don't let reviewers test it at 4K so people don't see how badly it runs RE7 at 4K".

    If you don't understand how these two sentences are mutually exclusive, you need to go back to school.

    Ladies and gentlemen, say hello to Schrödinger's RTX 2060. Either it can run 4K but NVIDIA doesn't want you to know, or it can't run 4K but NVIDIA doesn't want you to know. The baseline is that NVIDIA doesn't want you to know "something", but for the performance you really need to open the box!
     
    DavidGraham and pharma like this.
  19. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,491
    Likes Received:
    978
    Location:
    en.gb.uk
    Honestly man you need to accept that this place bleeds red when cut. It always has. Accept it and move on, there's no point getting frustrated about it.
     
  20. Hardware Unboxed is the vlog arm of TechSpot. They've been around for 3 years and already have 320k subscribers.
    Gamers Nexus is 9 years old and has 500k subscribers.
    If an audience of several hundred thousand isn't "mainstream", then I don't know what is.

    And if AMD punished tech sites for unfavorable review scores, that's terrible, and it doesn't excuse nvidia doing the same.
     
    Lightman likes this.