AMD Radeon finally back into laptops?

Personal experience: going from a then year-old HD LCD without an LED backlight, to an HD LCD with an LED backlight, to a 4K set with an LED backlight, I can tell the damn difference. I'm not blind, and I can tell the backlight is much brighter on the 4K. And I have only been buying Samsung for the past 10 years or so! AND I still have all 3 TVs in my house.

Comparing different models, from different eras, of different ages...

If you want to see this, you just have to find out whether they're using the top three lines from whichever company they are comparing; if they aren't, they won't be using LED or OLED tech. And the TVs that do use lower resolutions will not be in the top lines. Regular HD TVs from Samsung go up to the 5000 line, I think? The 6000, 7000, and 8000 lines are all QHD or higher, with slim forms too.

Pretty sure the 5000 line doesn't have slim versions, and thus no LED backlight in 2013. This article was from 2015, so it might have changed now, but the last TV I bought was for my gathering room: a 50-inch Samsung 7000, the top version in that line, with an LED backlight (late 2013). The 5000 didn't have LED, nor did it go higher than HD. The 6000s had LEDs but did not go past HD in 2013. The 7000s were LED in 2014 with a thin form factor, again nothing past HD. The 8000s did have an LED backlight when they were introduced in 2013 but didn't have 4K; in 2015 they were the first to come with 4K and all the goodies. This has changed a bit now. So you can't even compare across the lines over the different years. Yesteryear's 7000 is today's 5000, or something like that, and they don't move down line segments like that every year either. This is pretty much the same for all TV manufacturers too, because I was looking at Sharp, LG, and Sony when I made my purchase.

The majority of TVs have had LED backlights since 2010; anything you bought in 2013 certainly did, thick, thin, or otherwise. LED vs. fluorescent definitely has an impact on colour, with LED often looking more artificial because of the range of colour the white in the backlight could produce. These days, though, it's all down to the calibration/settings, as most modern sets are decent. All TVs look horrible out of the box. Most of the time you will need to reduce the backlight right down to 2 or 1, enable any available gaming mode (even for watching movies), turn off image interpolation (the 120 Hz/240 Hz stuff), turn off any other enhancement settings (contrast/black enhancer, etc.), change the colour temperature (to warm, usually; sometimes neutral is fine), and often still play with the gamma or colour balance.

Just look up calibration settings for your model of TV. Put them in and adjust to taste from there.
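For illustration, here is that checklist written out as a starting-point settings table. The menu names and value scales below are assumptions, since every manufacturer labels these differently; treat it as a sketch of the advice above, not as values for any specific set.

```python
# Rough baseline picture settings, per the advice above.
# Menu names and value scales are assumptions -- every
# manufacturer labels and ranges these differently.
baseline_settings = {
    "backlight": 2,                 # right down, 1-2 on a typical 0-20 scale
    "picture_mode": "Game",         # enable game mode, even for movies
    "motion_interpolation": "Off",  # the 120 Hz / 240 Hz stuff
    "contrast_enhancer": "Off",
    "black_enhancer": "Off",
    "color_temperature": "Warm",    # sometimes "Neutral" is fine
    "gamma": 0,                     # then adjust to taste
}

for name, value in baseline_settings.items():
    print(f"{name}: {value}")
```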

Aside from that, having 4 times as many subpixels is going to let more backlight through, so I think this is what you are experiencing. Yes, backlights are getting brighter to reach higher nits for HDR, but that trend is not synonymous with a bad viewing experience.
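The "four times" figure is just resolution arithmetic; here's a quick sanity check using the standard 1080p and 4K UHD resolutions, assuming the usual three RGB subpixels per pixel:

```python
# 4K UHD vs. Full HD pixel and subpixel counts (RGB stripe assumed).
fhd_pixels = 1920 * 1080   # 2,073,600
uhd_pixels = 3840 * 2160   # 8,294,400

print(uhd_pixels / fhd_pixels)   # 4.0 -- exactly four times the pixels
print(fhd_pixels * 3)            # 6,220,800 subpixels at 1080p
print(uhd_pixels * 3)            # 24,883,200 subpixels at 4K
```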

For power use on an LCD, letting more light through is not going to use significantly more power. The backlight itself is what affects that, and on most modern TVs the backlight will scale up and down with the brightness of the scene; this affects power use but is not at all related to the TV being LED or 4K. On an OLED, yes, having more subpixels will directly result in higher power use, all else being equal.
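To make that distinction concrete, here's a toy model of the two cases. Every constant in it is an illustrative assumption, not a measurement; the point is only that LCD draw tracks the backlight setting while OLED draw scales with how many subpixels are lit and how hard.

```python
# Toy power model -- all constants are illustrative assumptions.

def lcd_power_w(backlight_level, backlight_max_w=60.0, base_w=15.0):
    """LCD draw tracks the backlight setting (0.0-1.0), not the content;
    letting more light through the panel costs essentially nothing extra."""
    return base_w + backlight_level * backlight_max_w

def oled_power_w(avg_drive_level, num_subpixels,
                 w_per_subpixel=2e-6, base_w=15.0):
    """OLED draw scales with subpixel count times average drive level."""
    return base_w + avg_drive_level * num_subpixels * w_per_subpixel

# Same bright scene on each: the 4K OLED pays for its extra subpixels,
# while the LCDs differ only by their backlight setting.
print(lcd_power_w(0.8))                      # 1080p or 4K LCD, same backlight
print(oled_power_w(0.8, 1920 * 1080 * 3))    # 1080p OLED
print(oled_power_w(0.8, 3840 * 2160 * 3))    # 4K OLED: ~4x the subpixel term
```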
 

Is that enough for you, or do you need pics of my 7,000 sq ft house with a pool and tennis court? And my 2 AMG Mercedes 550 4Matics and Audi A8L?

Can't say I'm surprised... I knew it was coming out of your lower back. And comparing apples to oranges, as usual. Oh, and the very same personal experience which claimed that a 15" laptop display consumes 50 watts... no, 65 watts... no wait, 20 watts. As for your last statement, I have nothing to say; it does not merit a response.

To once again try to get back to the actual topic: we have yet to see any official news from AMD on mobile Polaris parts.
 

Of course you can't, 'cause I have the freakin' TVs. I know what I have been seeing, lol.

I can put up the pics if you like, but no, you don't want that, because then it means I'm not making shit up, which is what you are trying to imply. Sorry, but that doesn't work with me, when you can't say shit because you have not seen the differences between today's LCDs and yesteryear's LCDs.

http://www.onsemi.com/pub_link/Collateral/TND353-D.PDF

Back on topic? OK, sounds good.
 
Comparison of the Radeon Pro 460 (16.6 drivers on Boot Camp) vs. the 960M in the XPS 15: slight edge for the Nvidia chip in standard DX11 graphics benchmarks, while the 460 Pro does better in the Time Spy tests, by ~25% and 12%.
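(For clarity on what those percentages mean: the advantage is just the score ratio minus one. The scores below are placeholders for illustration, not the actual results from the video.)

```python
# Percentage advantage from two benchmark scores.
# These scores are placeholders, NOT the actual Time Spy numbers.
def advantage_pct(score_a, score_b):
    """How much faster score_a is than score_b, in percent."""
    return (score_a / score_b - 1) * 100

print(advantage_pct(1250, 1000))  # 25.0 -> "better by ~25%"
print(advantage_pct(1120, 1000))  # ~12.0 -> "better by 12%"
```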

GTA V numbers are close between the two as well, which is promising since AMD tends to do worse in it, though he goes above the 2 GB VRAM limit in the settings.

Starts at 5:20


Looks very good compared to the expectations that it would barely match Maxwell's efficiency two years later.
 

The 860M and 960M are quite similar chips (same GM107 die, http://graphics-cards.specout.com/c...a-GeForce-GTX-860M-vs-Nvidia-GeForce-GTX-960M); the 960M was actually a disappointment in the Maxwell 2 lineup, and even more of a disappointment in laptops. The performance difference between them was within 5% too.

http://gpuboss.com/gpus/GeForce-GTX-860M-vs-GeForce-GT-750M#performance

Let's look at Geekbench from the link above: according to it, the 750M is faster than an 860M? So let's throw that right out the window, shall we? Geekbench isn't showing us anything at all.

Comparing the Sky Diver scores does seem to line up with what the guy had with the MacBook Pro vs. the Dell.
 