AMD Radeon finally back into laptops?

Discussion in 'Architecture and Products' started by ToTTenTranz, Jul 12, 2016.

  1. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    Go on then since you asked... ;)
     
  2. RedVi

    Regular

    Joined:
    Sep 12, 2010
    Messages:
    387
    Likes Received:
    39
    Location:
    Australia
    Comparing different models of different ages from different eras...

    The majority of TVs have had LED backlights since 2010; anything you bought in 2013 certainly did, thick, thin or otherwise. LED vs. fluorescent definitely has an impact on colour, with LED often looking more artificial because of the range of colour the white of the backlight can produce. These days, though, it's all down to the calibration/settings, as most modern sets are decent. All TVs look horrible out of the box. Most of the time you will need to reduce the backlight right down to 2 or 1, enable any available game mode (even for watching movies), turn off image interpolation (the 120Hz/240Hz stuff), turn off any other enhancers (contrast/black enhancer, etc.), change the colour temperature (to warm, usually; sometimes neutral is fine) and often still play with the gamma or colour balance.

    Just look up calibration settings for your model of TV, put them in, and adjust to taste from there.
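
    As a rough illustration of that checklist written out, a starting point might look like the sketch below. Every setting name and value here is illustrative, not from any specific model; menus differ per brand, so look up numbers for yours.

    ```python
    # Illustrative starting point for the calibration checklist above.
    # Setting names and values are hypothetical; real menus vary by brand.
    tv_settings = {
        "backlight": 2,                  # right down to 2 or 1
        "picture_mode": "Game",          # even for watching movies
        "motion_interpolation": "Off",   # the 120Hz/240Hz processing
        "contrast_enhancer": "Off",
        "black_enhancer": "Off",
        "colour_temperature": "Warm",    # sometimes Neutral is fine
        "gamma": 2.2,                    # adjust to taste from here
    }

    for setting, value in tv_settings.items():
        print(f"{setting}: {value}")
    ```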

    Aside from that, having four times as many subpixels is going to let more backlight through, so I think that is what you are experiencing. Yes, backlights are getting brighter to reach higher nits for HDR, but that push is not synonymous with a bad viewing experience.

    For power use on an LCD, letting more light through is not going to use significantly more power. The backlight itself is what affects that, and on most modern TVs the backlight will scale up and down with the brightness of the scene; this affects power use but is not at all related to the TV being LED or 4K. On an OLED, yes, having more subpixels will directly result in higher power use, all else being equal.
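
    A back-of-the-envelope model makes the distinction concrete: LCD power is dominated by a backlight that doesn't care how many subpixels sit in front of it, while an emissive OLED's draw scales with how many subpixels are lit and how hard they are driven. A minimal sketch, with all coefficients invented for illustration:

    ```python
    # Toy display power model for the LCD-vs-OLED point above.
    # Coefficients are invented for illustration, not measurements.

    def lcd_power_w(backlight_level, max_backlight_w=60.0):
        """LCD draw is dominated by the backlight; resolution barely matters."""
        return max_backlight_w * backlight_level  # level in 0.0..1.0

    def oled_power_w(n_subpixels, avg_brightness, w_per_subpixel=2e-6):
        """OLED draw scales with lit subpixel count times drive level."""
        return w_per_subpixel * n_subpixels * avg_brightness

    # Quadrupling the resolution leaves the LCD unchanged at a given
    # backlight level, but roughly quadruples OLED draw for the same image:
    print(lcd_power_w(0.5))                    # ~30 W at 1080p or 4K alike
    print(oled_power_w(1920 * 1080 * 3, 1.0))  # ~12 W, full-white 1080p
    print(oled_power_w(3840 * 2160 * 3, 1.0))  # ~50 W, full-white 4K
    ```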
     
  3. Erinyes

    Regular

    Joined:
    Mar 25, 2010
    Messages:
    647
    Likes Received:
    94
    Can't say I'm surprised... I knew it was coming out of your lower back. And comparing apples to oranges, as usual. Oh, and the very same personal experience which claimed that a 15" laptop display consumes 50 watts... no, 65 watts... no wait, 20 watts. As for your last statement, I have nothing to say; it does not merit a response.

    To once again try to get back to the actual topic: we have yet to see any official news from AMD on mobile Polaris parts.
     
    #163 Erinyes, Nov 11, 2016
    Last edited: Nov 11, 2016
    gamervivek and ToTTenTranz like this.
  4. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Of course you can't, 'cause I have the freakin' TVs; I know what I have been seeing, lol.

    I can put up the pics if you like, but no, you don't want that, because then it means I'm not making shit up, which is what you are trying to imply. Sorry, but that doesn't work with me when you can't say shit, because you have not seen the differences between today's LCDs and yesteryear's LCDs.

    http://www.onsemi.com/pub_link/Collateral/TND353-D.PDF

    Back on topic? Ok sounds good.
     
    #164 Razor1, Nov 12, 2016
    Last edited: Nov 12, 2016
  5. gamervivek

    Regular Newcomer

    Joined:
    Sep 13, 2008
    Messages:
    715
    Likes Received:
    220
    Location:
    india
    Comparison of the 460 Pro (16.6 drivers on Boot Camp) vs. the 960M in the XPS 15: slight edge for the Nvidia chip in standard DX11 graphics benchmarks, while the 460 Pro does better in the Time Spy tests, by ~25% and 12%.
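
    For reference, percentage leads like that are just relative-score arithmetic. A quick sketch with placeholder scores (the real numbers are in the video):

    ```python
    # How the percentage leads above are derived. The scores below are
    # placeholders, not the actual results from the video.

    def pct_lead(score_a, score_b):
        """Percentage by which score_a leads score_b."""
        return (score_a - score_b) / score_b * 100.0

    time_spy_460pro = 1250   # hypothetical graphics score
    time_spy_960m = 1000     # hypothetical
    print(f"460 Pro leads by {pct_lead(time_spy_460pro, time_spy_960m):.0f}%")  # 25%
    ```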

    GTA V numbers are close between the two as well, which is promising since AMD tends to do worse in it, though he goes above the 2GB limit in the settings.

    Starts at 5:20 in the video.

    Looks very good compared to the expectation that it would barely match Maxwell's efficiency two years later.
     
    Nemo and Lightman like this.
  6. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    The 860M and 960M are quite similar (same chip, GM107: http://graphics-cards.specout.com/c...a-GeForce-GTX-860M-vs-Nvidia-GeForce-GTX-960M); the 960M was actually a disappointment in the Maxwell 2 lineup, and even more of a disappointment in laptops. The performance difference was within 5%, too.

    http://gpuboss.com/gpus/GeForce-GTX-860M-vs-GeForce-GT-750M#performance

    Let's look at Geekbench from the link above: according to it, the 750M is faster than an 860M? So let's throw that right out the window, shall we? Geekbench isn't showing us anything at all.

    Comparing the Sky Diver scores seems to line up with what the guy got with the MacBook Pro vs. the Dell.
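
    The sanity check being applied there, i.e. whether a benchmark ranks the two cards the same way the others do, can be written out explicitly. A minimal sketch with invented scores, flagging any benchmark whose ratio disagrees with the consensus:

    ```python
    # Cross-benchmark consistency check, per the reasoning above: if one
    # benchmark ranks the cards opposite to all the others, distrust it.
    # All scores are invented for illustration.
    scores = {
        #                     (GTX 860M, GT 750M)
        "3DMark Sky Diver":   (12000, 7000),
        "3DMark Fire Strike": (3900, 2300),
        "Geekbench":          (28000, 31000),  # the odd one out
    }

    ratios = {name: a / b for name, (a, b) in scores.items()}
    median = sorted(ratios.values())[len(ratios) // 2]

    for name, r in ratios.items():
        flag = "  <-- inconsistent, throw it out" if abs(r - median) / median > 0.3 else ""
        print(f"{name}: 860M/750M ratio = {r:.2f}{flag}")
    ```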
     
    #166 Razor1, Nov 20, 2016
    Last edited: Nov 20, 2016