NVIDIA Maxwell Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 9, 2011.

  1. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,129
    Likes Received:
    904
    Location:
    still camping with a mauler
    Couldn't we tackle the problem from the other end and go for really low black levels? There would no doubt be lawsuits if a monitor came out that could be bright enough to damage your eyesight. If you looked up at the sun in a game you would actually have to put on sunglasses! :cool2:
     
  2. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
No, but outdoors, on a sunny day, all of your field of view is bright. It's not a glum, dimly lit periphery with a very bright, window-sized rectangle in the middle blasting sunlight-level brightness at your face. :) It's the contrast between your lamp-lit surroundings and your computer monitor's screen that does it, and the difference doesn't have to be that great to be tiring. My Apple Thunderbolt Display goes up to 400 nits or something like that according to the specs. At max brightness it is nowhere near a summer day, but it is very, very uncomfortable to look at for any length of time. With auto-brightness enabled, the backlight slider never goes above 50% for me, and often sits at about 1/3 brightness.

    "Easily." :lol: Sorry, but most regular folks can't afford super huge displays. Or even have room to fit them. Besides, a display that is larger than what can fit in the center of my vision would be even more tiring than one that is small and extraordinarily bright I would think, not to mention a lot of software is not designed with displays so large they stretch into the peripheral vision. UI elements (like a HUD in a game) would slide out into the periphery and become useless.

HMDs are not for everyone. They're also another gadget that costs money and hooks up to your PC with cables, which some feel is annoying and clutter-y, and there are also people who argue they're just not ready yet (screen door effect, color aberration and whatnot).

Also, the blueness of computer light (and the alleged problems that brings) would not be fixed by just putting a set of goggles over your head. In fact, you might exacerbate the issue by covering more of your vision with the display... :p
     
  3. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,456
    Likes Received:
    578
    Location:
    WI, USA
Yeah, for non-gaming and non-video use I run one of the low blue light modes and drop the brightness way down. LCD brightness has always caused me discomfort in dimly lit rooms and I really like this monitor's options.

    I don't like auto brightness though because it likes to fluctuate and that's unwanted.
     
  4. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    858
    Likes Received:
    260
You can create additional illumination of any strength behind the monitor, as long as you can pay for the electricity:
    https://en.wikipedia.org/wiki/Ambilight
     
  5. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,172
    Location:
    La-la land
    My apartment is perpetually gloomy (due to poorly placed, small windows), so there's no noticeable fluctuation. Having auto brightness enabled also lets me watch movies, or even browse the web at night with all lights turned off without straining my eyes... :)

Yes, I have a couple of low-wattage LED bulbs hanging from a bookshelf behind my monitors for this reason, but as they don't adjust according to what's shown on the screen, they can't compensate for a very bright image. Even if they could, the paint on my wall just might catch fire from the level of flux hitting it! :lol:

    Also, an overly bright ambilight would mess with dark monitor imagery, so the opposite situation is also a bother. :p
     
  6. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    858
    Likes Received:
    260
Maybe switch to Oculus HDR. :p
     
  7. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,907
    Likes Received:
    1,607
    http://www.theplatform.net/2015/07/07/nvidia-ramps-up-gpu-deep-learning-performance/
     
    nnunn and Grall like this.
  8. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    771
    Likes Received:
    200
    pharma likes this.
  9. A1xLLcqAgt0qc2RyMz0y

    Regular

    Joined:
    Feb 6, 2010
    Messages:
    985
    Likes Received:
    277
  10. Frenetic Pony

    Regular Newcomer

    Joined:
    Nov 12, 2011
    Messages:
    322
    Likes Received:
    82
Not really, your eyes can tell the difference in brightness between, say, 500 nits and 1,000 nits regardless of contrast level. It's honestly the software support that I'd be worried about. We have standards for "HDR images" and a new color space, and all that hardware manufacturers need to do is hit that target (or at least get close enough in the case of cheaper hardware).

But on the software side, I'm just afraid a new image compression format to replace JPEG will take forever. BPG seemed the obvious choice, until HEVC licensing turned overly greedy and evil. WebP just doesn't have any future-proofing at all, with 8-bit encoding at most and nothing beyond the sRGB colorspace. Not to mention how consoles would cope, as you'd need increased bandwidth for increased color precision across the board to avoid banding. Oh well, probably just worrying for no reason. The new HDR and Rec. 2020 standards are probably only going to be widespread enough to care about in five years or so anyway; LCDs can't even reach 1,000 nits reliably, or the Rec. 2020 colorspace at all, yet.
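    A rough back-of-the-envelope sketch (Python, purely illustrative; it assumes a naive linear mapping rather than the actual PQ transfer curve, and a hypothetical 1,000-nit panel) of why 8-bit output invites banding once displays get this bright:

        # Quantization step size of an 8-bit vs 10-bit signal driving a
        # hypothetical 1,000-nit display, under a naive linear mapping.
        peak_nits = 1000.0
        for bits in (8, 10):
            levels = 2 ** bits                # 256 vs 1024 code values
            step = peak_nits / (levels - 1)   # nits per code step
            print(f"{bits}-bit: {levels} levels, ~{step:.2f} nits per step")
        # 8-bit:  256 levels, ~3.92 nits per step
        # 10-bit: 1024 levels, ~0.98 nits per step

    Steps of several nits apiece show up as visible banding in smooth gradients, which is why the HDR/Rec. 2020 pipeline pushes for 10-bit (or more) end to end; the real PQ curve spends its codes more cleverly, but the 8-bit budget is still the bottleneck.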
     
  11. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    Good news for you!
Nvidia's latest GeForce drivers for W10 have selectable 8/10-bit color output!
...so Nvidia does listen too... :D
     
    DavidGraham, Lightman and pharma like this.
  12. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,907
    Likes Received:
    1,607
    Nvidia: Geforce GTX 980 (990M) for notebooks based on the desktop GTX 980?

The article is in German, so there could be errors in the above translation.

    http://www.notebookcheck.com/Nvidia...s-auf-Basis-der-Desktop-GTX-980.147678.0.html
     
  13. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,797
    Likes Received:
    2,056
    Location:
    Germany
A bit coarse, but the facts are true to the German source.

    One thing should be added:
The rumored chip TDP is said to be variable between 100 and 200 watts, so notebook manufacturers will have a wide performance range to work with, depending on the cooling capacity of their devices.
     
    pharma likes this.
  14. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,838
    Likes Received:
    4,455
    The current 980M already uses a GM204.

It would only be newsworthy if the 990M were not using a GM204.
     
  15. Kaarlisk

    Regular Newcomer Subscriber

    Joined:
    Mar 22, 2010
    Messages:
    293
    Likes Received:
    49
    It uses a GM204 with 1/4 of shaders fused off. A full GM204 at 10%-30% higher clocks can give you 30-70% more performance.
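    A quick sketch of where a range like that could come from, assuming (idealistically) that performance scales with shader count times clock; the known GM204 configurations are 2048 shaders for the full chip and 1536 on the 980M:

        # Idealized upper bound: performance ~ shaders * clock.
        full_shaders, gtx980m_shaders = 2048, 1536
        for clock_gain in (1.10, 1.30):
            ideal = (full_shaders / gtx980m_shaders) * clock_gain
            print(f"+{clock_gain - 1:.0%} clocks -> up to ~{ideal - 1:.0%} more performance")
        # +10% clocks -> up to ~47% more performance
        # +30% clocks -> up to ~73% more performance

    Real gains land below those ideals (bandwidth, power and scaling losses), which is roughly where the more conservative 30-70% figure ends up.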
     
  16. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,838
    Likes Received:
    4,455
30% higher clocks sounds a bit unbelievable. The 980M already has a 1038-1127 MHz range of clocks. 30% above that would bring the 990M towards 1.45 GHz, which not even the most expensive factory-overclocked 980 models reach at default values, with 2x the TDP.
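    The clock arithmetic behind that scepticism, using the 980M's quoted base and boost clocks:

        # 980M clock range scaled by the rumored +30%.
        for clk in (1038, 1127):  # MHz, base and boost
            print(f"{clk} MHz * 1.3 = {clk * 1.3:.0f} MHz")
        # 1038 MHz * 1.3 = 1349 MHz
        # 1127 MHz * 1.3 = 1465 MHz

    A boost clock around 1.45 GHz or more inside a notebook power envelope is the part that looks implausible next to desktop factory-overclocked 980s.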


    I know some people believe Maxwell to be the second coming of Christ, but let's be real.
     
  17. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,907
    Likes Received:
    1,607
    990M Speculation ...


    http://muropaketti.com/nvidia-valmistelee-uutta-lippulaivaa-kannettaviin-geforce-gtx-990m
     
  18. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    I'm ready for the Aorus X3 update to this new graphics chip :D
     
  19. Kaarlisk

    Regular Newcomer Subscriber

    Joined:
    Mar 22, 2010
    Messages:
    293
    Likes Received:
    49
    10-30% higher actual clocks, not maximum boost clocks.
    There is a reason I specified a range. GTX 980M is probably, at least in some laptops, power and/or temperature limited when boosting.

    A site that did a direct comparison had GTX 980M at 58-75% of GTX 980 performance.
    In other words, GTX 980 was 33-72% faster than a GTX 980M.
The rumors mentioned claimed a 200W TDP. The current 980M has a 100W TDP and a quarter of its shaders disabled.
    I do not see how unlocking the full GPU and giving it twice the TDP will not result in a huge performance increase, in most applications.
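    Spelling out the ratio conversion, for the record:

        # If the 980M delivers a fraction of 980 performance,
        # the 980 is (1/fraction - 1) faster than the 980M.
        for frac in (0.75, 0.58):
            print(f"980M at {frac:.0%} of a 980 -> 980 is {1 / frac - 1:.0%} faster")
        # 980M at 75% of a 980 -> 980 is 33% faster
        # 980M at 58% of a 980 -> 980 is 72% faster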
     
    pharma likes this.
  20. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,012
    Likes Received:
    112
My problem is that the article mentions a variable TDP from 100-185W. At 100W it would probably be barely faster than a 980M, but at 150W (or more) that's going to look quite different.
     