AMD Radeon finally back in laptops?

Well, it depends on the laptop. We were talking about the Razer Blade, which has a 14-inch QHD screen and a 165-watt power brick, so it's quite different from what you have.

Its CPU is rated for 45 watts and its motherboard for 30 watts; all that's left that we don't know is the screen. You would be hard-pressed to find a QHD panel under 20 watts. In any case, it all ends up with around 75 watts left over for the graphics card, which is right around the 80 watts you stated, right?
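A minimal sketch of the back-of-the-envelope budget being described here; every component figure is this post's assumption, not a measurement, and later replies dispute several of them:

```python
# Rough power budget for the Razer Blade as argued above.
# All component figures are assumptions from this post, not measurements.
brick_w = 165        # power brick rating
cpu_w = 45           # claimed CPU rating
board_w = 30         # claimed "motherboard and everything else it powers"
screen_w = 20        # claimed floor for a QHD laptop panel

gpu_headroom_w = brick_w - (cpu_w + board_w + screen_w)
print(f"Left over for the GPU: ~{gpu_headroom_w} W")  # ~70 W
```

The strict subtraction gives about 70 W; the post calls it "around 75 watts".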

Ahh, the Blade wasn't mentioned in the post I responded to. But even then, it isn't 50 watts. And I still don't know where you are getting the idea that a motherboard consumes 30 watts. There are also plenty of other components that consume power, by the way (hard drive(s), RAM, fans, Wi-Fi chip, etc.), so it's really not possible to determine the graphics card's power the way you suggest.
This article lists LCD (IPS) and OLED power consumption for a 14" display as 5.2 W for the LCD and 8.7 W for the OLED (on a completely white screen; in regular use, less than half that). Both screens are WQHD resolution. Even with a 15.6" 4K screen, you won't exceed 10 W.

Cheers

Great find. I think that settles it.
Complete white is the lowest power level for the pixels ;) OLEDs are a bit different, and I'm not familiar with many laptops that use OLEDs; I thought most of them hadn't switched over because of the life expectancy and glare.

It's something like 3 or 4 times less power than black. So yeah, it's still going to be above 20 watts when doing daily things.

Err... complete white is the lowest power level for pixels in an LCD display? Isn't it the exact opposite, since the backlight would be at full power? The last chart in the article posted by Gubbi shows that for both 50% white and 100% white, the IPS LCD consumes about 5 W.
 
When I was talking about the motherboard, I meant all the other components it powers.

Pixel color and backlight are two different things; to show a white pixel you don't need the backlight at full (of course, to our eyes it looks whiter with the backlight at full).
 
Pixel colors matter for LCDs, whether IPS, VA, or TN. VA does use the most power of the three purely at the pixel level, but all of them have similar characteristics from a power perspective when you look at the color a pixel is showing.
Nope. Now you actually made me look it up... TN pixels default to let all light through, IPS pixels default to block all light, hence that is their respective low power state. Some people actually measured it (no VA in there, and I'm really too lazy to look that up too...):
http://techlogg.com/2010/05/black-vs-white-screen-power-consumption-24-more-monitors-tested/17
But these differences are irrelevant in comparison to backlight brightness.
 
Nope. Now you actually made me look it up... TN pixels default to let all light through, IPS pixels default to block all light, hence that is their respective low power state. Some people actually measured it (no VA in there, and I'm really too lazy to look that up too...):
http://techlogg.com/2010/05/black-vs-white-screen-power-consumption-24-more-monitors-tested/17
But these differences are irrelevant in comparison to backlight brightness.


And this is what I stated earlier,

As I stated, 20 watts for the LCD is conservative, because the brightness and what is being shown make a big difference in how much wattage the LCD burns. The brightness would have to be turned down, and depending on the model turned down A LOT, to get it down to 20 watts. And if you don't believe the things I have posted thus far, I suggest you look up Microsoft's breakdowns of laptop power consumption; the LCD is by far the component that takes the most, sometimes as much as 5 times more than the CPU!


Brightness has more effect on power consumption than pixel color, but pixel color also affects it.

I was thinking the power savings from the pixel color shifts are around 5%, and then you get around 40% savings from not using the full backlight. You end up with around 50 watts.

And if you go from a non-touch screen to a touch screen, add another 30% to that, because touch screens drain the battery around 30% faster (the notebook in question does have a 14-inch touchscreen).

There is no way that on the Razer Blade with a 1060 and a 14-inch high-res touch screen, the screen alone is using less than 20 watts, unless you have the backlight down to something like 100 nits, which is very low, less than half of what the screen is capable of. That might work in a completely dark room, but that's about it.

https://www.luculentsystems.com/techblog/how-backlight-brightness-affects-battery-life/

This is a standard 15-inch screen (I highly doubt this guy was testing a high-res screen with a touchscreen), but while gaming it's using 35 watts total for the entire laptop, and we can see the backlight increases the screen's power usage greatly.

http://soar.wichita.edu/bitstream/handle/10057/3214/GRASP_2010_133-134.pdf?sequence=1

In all of the breakdowns I can find, the screen uses the largest share of the power, and we are talking about 50-watt laptops in most of these breakdowns, not gaming ones; 40% of 50 W is about 20 watts. I just can't see how that is going to change much when you have higher resolutions and features like a touch screen to boot.

PS: the increased resolution alone, going from 1080p to QHD, will double the power consumption.
 
http://www.ultrabookreview.com/10704-razer-blade-14-2016-review/

This finally talks about wattage used by different modes and what the user is doing.

  • 7.6 W (~ 9h 12 min of use) – idle, Power Saving Mode, screen at 0%, Wi-Fi OFF;
  • 11.2 W (~ 6 h 15 min of use)– light browsing and text editing in Microsoft Word, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 11.5 W (~ 6 h 5 min of use)– 1440p full screen video on YouTube in EDGE, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 10.2 W (~6 h 52 min of use)– 1080p full screen .mkv video in Movie App, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 30 W (~2 h 20 min of use)– heavy browsing in EDGE, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 42 W (~1 h 40 min of use)– heavy gaming in 1080p, Balanced Mode, screen at 50%, Wi-Fi ON;
42 W when not plugged in is what a laptop with a 970M eats while gaming at 50% screen brightness, and this is a low-res screen. I wish there were a 1060 review with something similar, but there's nothing right now.

This screen from idle to the use case is 4 watts; add in quadruple the res, 16 watts; add in a touch screen, 21 watts.
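To make the extrapolation explicit, here is the arithmetic exactly as claimed in this post; the linear-with-resolution and +30%-for-touch factors are this poster's own assumptions, which later replies dispute:

```python
# The screen-power extrapolation as claimed above.
# Both scaling factors are the poster's assumptions, not measurements.
base_screen_w = 4.0     # rounded idle-to-light-use delta, attributed entirely to the screen
res_factor = 4          # "quadruple the res", assumed to scale panel power linearly
touch_factor = 1.3      # "touch screens drain battery around 30% faster"

after_res_w = base_screen_w * res_factor        # 16 W
after_touch_w = after_res_w * touch_factor      # ~21 W
print(f"{base_screen_w} W -> {after_res_w} W -> {after_touch_w:.0f} W")
```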
 
http://www.ultrabookreview.com/10704-razer-blade-14-2016-review/
<snip>

42 W when not plugged in is what a laptop with a 970M eats while gaming at 50% screen brightness, and this is a low-res screen. I wish there were a 1060 review with something similar, but there's nothing right now.

This screen from idle to the use case is 4 watts; add in quadruple the res, 16 watts; add in a touch screen, 21 watts.

This is a super cool review. But I agree with you - I wish they had the new 1060 one. That'll be neat to see.

I'm also incredibly interested in this kind of analysis for the larger Blade Pro and its 1080. I can't believe they fit a laptop 1080 in a sub-1" thick chassis. I've been wondering if it would throttle.
 
This screen from idle to the use case is 4 watts; add in quadruple the res, 16 watts; add in a touch screen, 21 watts.

Err... no, double the res does not mean double the power; far less than that. And higher-res displays are usually higher quality and more efficient. From this link, 4K vs FHD is 30% more power - http://www.geek.com/chips/4k-tvs-use-30-more-power-backlights-and-processing-power-to-blame-1639893/ - and this one shows how a higher-res screen can even have lower power - http://pocketnow.com/2014/10/28/qhd-smartphone-displays
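Here is that comparison worked out with the ~30% average from the NRDC testing in the linked article (individual panels obviously vary):

```python
# FHD vs 4K per the linked NRDC figure: ~4x the pixels for ~30% more power on average.
fhd_pixels = 1920 * 1080
uhd_pixels = 3840 * 2160
pixel_ratio = uhd_pixels / fhd_pixels         # 4.0
power_ratio = 1.30                            # ~30% more total power (NRDC average)

per_pixel_ratio = power_ratio / pixel_ratio   # ~0.33
print(f"{pixel_ratio:.0f}x the pixels for {power_ratio:.2f}x the power")
print(f"=> each 4K pixel costs roughly {per_pixel_ratio:.2f}x as much as an FHD pixel")
```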

Either way, we are straying far off topic with this, so I will get back to the topic of the thread. I'm surprised that even after the Apple announcement we still haven't heard ANYTHING from AMD on mobile Polaris parts for Windows. I really do hope they're planning to release in time for the holiday season.
 
http://www.ultrabookreview.com/10704-razer-blade-14-2016-review/
<snip>

42 W when not plugged in is what a laptop with a 970M eats while gaming at 50% screen brightness, and this is a low-res screen.

Ultrabook Review said:
Screen: 14-inch, 3200 x 1800 px resolution, 10-finger multi-touch, IGZO IPS panel

3200x1800 is low-res to you? The 4 W is with touch.

This screen from idle to the use case is 4 watts; add in quadruple the res, 16 watts; add in a touch screen, 21 watts.

More math and numbers from the lower part of your back.

Cheers
 
Err... no, double the res does not mean double the power; far less than that. And higher-res displays are usually higher quality and more efficient. From this link, 4K vs FHD is 30% more power - http://www.geek.com/chips/4k-tvs-use-30-more-power-backlights-and-processing-power-to-blame-1639893/ - and this one shows how a higher-res screen can even have lower power - http://pocketnow.com/2014/10/28/qhd-smartphone-displays



That is because the 4K TVs they are showing use OLED and LED technologies.
 
3200x1800 is low-res to you? The 4 W is with touch.



More math and numbers from the lower part of your back.

Cheers


They aren't using the full res.....

Guess you couldn't be bothered to read the article.

The 4 watts is from idle with the screen off :/ to the screen on with touch not being used. Regular use of Windows and web browsing doesn't affect the screen's power usage as much as gaming either. If you are switching pixel colors more often, that will affect the power usage GREATLY; the reason white doesn't affect IPS panels as much as other colors is that the panel has to send electricity through the crystals to create black or other colors. Now, if you are shifting pixel colors, imagine what happens, and what happens when you aren't using the full resolution...

I have tested resolution versus power on my old Dell 30-inch LCD. At full res it used to burn 150 watts; when I dropped it to half the resolution, guess what, 75 watts. That is what my battery backup showed. I know it's not that accurate, but when looking at the differential, margins of error should mostly cancel out. As refresh rates of newer IPS monitors get better, this will also improve power usage, and increasing resolution will not give a 1-to-1 increase anymore, but it's not that far off. Do I need to explain this, or can you understand from basic math why refresh rates can affect power consumption? As for the resolution differences and power usage, this only happened in gaming, 3D work, or 2D high-res photo manipulation (during regular desktop use the power consumption was maybe 10% different). Do I need to explain why neighboring pixels with the same color use less power, or can you figure that out?

Edit: also, the way the monitor's crystal grid is set up will have a large impact on how much neighboring pixels with the same color reduce power usage. Hopefully I don't need to explain how, at lower resolutions on the same screen, you end up with more neighboring pixels of the same color, do I?

There are a lot of variables the review didn't get into, because that wasn't the point of the review. So if you don't understand those things and yet want to present them as absolutes, nah, it doesn't work that way.

Reading comprehension is just too much for ya, so you need to insult. What's the use of typing everything up for you when you can't read a link and then use basic knowledge of how these monitors work to figure things out? This is what I was expecting from you, not reading, so I didn't go into the full description. PS: you want to talk about backlight again? Yeah, OK, there is a reason they used 50% backlight: because it saves a shit ton of power.

In any case, it doesn't matter. The original point being: for nV to reach 35 watts on Pascal, they can probably do it with the 1060, let alone the 1050.

Now let's get back to the TDP figures for P11 and the new Apple MacBook Pro. I think the TDP figures are BS; the MacBook Pro's GPU TDP is 35 watts, but we have no clue what the P11 in those things is doing: downclocked with no throttling, or throttling? Can't even trust that 35-watt TDP.
 
The 4 watts is from idle with the screen off :/ to the screen on with touch not being used.
The four watts is the difference between idling with the screen off and light usage (from 7.6 W to 11.2 W). The browsing is not done at a reduced resolution. Actually, that's only 3.6 W, and that's assuming the CPU and Wi-Fi don't use any additional power (they do).

Regular use of Windows and web browsing doesn't affect the screen's power usage as much as gaming either. If you are switching pixel colors more often, that will affect the power usage GREATLY; the reason white doesn't affect IPS panels as much as other colors is that the panel has to send electricity through the crystals to create black or other colors. Now, if you are shifting pixel colors, imagine what happens, and what happens when you aren't using the full resolution...

The only power saved by using a lower resolution is in the back end processing the image. The TFT matrix doesn't magically drop to 720p when you watch a video; the individual frames are scaled to the TFT's native resolution and every pixel in the panel is driven with the result. And once again, power consumption in an LCD is dominated by the backlight; this isn't an OLED or plasma panel. Someone else quoted an article stating that going from FHD to 4K increased power consumption by 30%: that's four times the pixels for 30% more power.
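As a quick sanity check, here is the generous upper bound on screen power implied by the review's own figures, assuming (unrealistically, in the screen's favour) that the entire idle-to-light-use delta goes to the screen:

```python
# Whole-laptop figures from the ultrabookreview.com measurements quoted earlier.
idle_w = 7.6          # idle, Power Saving Mode, screen at 0%, Wi-Fi off
light_use_w = 11.2    # browsing / Word, Balanced Mode, screen at 50%, Wi-Fi on

# Attribute the entire delta to the screen, even though the CPU,
# GPU and Wi-Fi also draw more in the second case.
screen_upper_bound_w = light_use_w - idle_w
print(f"Screen at 50% brightness draws at most ~{screen_upper_bound_w:.1f} W")  # ~3.6 W
```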
 
The only power saved by using a lower resolution is in the back end processing the image. The TFT matrix doesn't magically drop to 720p when you watch a video; the individual frames are scaled to the TFT's native resolution and every pixel in the panel is driven with the result. And once again, power consumption in an LCD is dominated by the backlight; this isn't an OLED or plasma panel. Someone else quoted an article stating that going from FHD to 4K increased power consumption by 30%: that's four times the pixels for 30% more power.


And you didn't read that article. What is the difference between the sets they were listing? The old non-4K ones were not LED or OLED. So comparing across different types of LCDs and resolutions doesn't give us anything useful beyond the fact that new tech can drop power usage below the expected amount at higher resolutions.

And how does the power flow in the matrix when you have same-color pixels neighboring each other? What happens when you lower the resolution on a screen? Don't you have more same-color pixels neighboring each other? What is the implication of that across the different types of panels? Unfortunately, no one has tested all of the different LCD variations at this fine a grain. I don't have the tools to do that; the only thing I did, or was capable of doing in 2005, was what I mentioned.
 
That is because the 4K TVs they are showing use OLED and LED technologies.
And you didn't read that article. What is the difference between the sets they were listing? The old non-4K ones were not LED or OLED. So comparing across different types of LCDs and resolutions doesn't give us anything useful beyond the fact that new tech can drop power usage below the expected amount at higher resolutions.

Did YOU even read the article? Where in the article does it say the old TVs were not LED? Unless my eyesight is very much failing me, I seem to have missed that part completely...
The Natural Resources Defense Council (NRDC) has found that a typical 4K TV uses 30% more energy than that of a HD TV. That’s the result of testing 21 TVs (20 LCD, 1 OLED) of sizes between 49 and 55-inches manufactured by LG, Panasonic, Samsung, Sharp, and Sony within the last 2 years.
Not a laptop, but the first usage I've seen of an RX 480 MXM module:
http://www.anandtech.com/show/10811...f-pc-with-intel-core-i5-amd-radeon-rx480-usbc

'AMD Radeon RX480 at 1050 MHz, 2304 stream processors
4 GB of GDDR5 7 GT/s memory, 256-bit memory interface'

Not the first usage, actually; AMD has previously announced an embedded MXM-based Polaris 10 - http://www.anandtech.com/show/10710/amd-announces-embedded-radeon-e9260-e9550
 
Did YOU even read the article? Where in the article does it say the old TVs were not LED? Unless my eyesight is very much failing me, I seem to have missed that part completely...



Not the first usage, actually; AMD has previously announced an embedded MXM-based Polaris 10 - http://www.anandtech.com/show/10710/amd-announces-embedded-radeon-e9260-e9550


The Natural Resources Defense Council (NRDC) has found that a typical 4K TV uses 30% more energy than that of a HD TV. That’s the result of testing 21 TVs (20 LCD, 1 OLED) of sizes between 49 and 55-inches manufactured by LG, Panasonic, Samsung, Sharp, and Sony within the last 2 years.

The extra power is required because the backlights these screens use has to be a lot brighter in order to deliver those vibrant colors. The other big energy draw: the processing power required to drive that imagery.

Do you know which TVs have brighter backlights, or more nits? YEAH, LED TVs have brighter backlights!

Did you ever wonder, when watching a movie on an LED TV versus older LCD tech, why it looked like soap-opera lighting? I hate that look, which is why I have mostly been watching movies on my monitor instead of on my LED TV. That problem wasn't there before 4K TVs were out, or at least wasn't as bad. The lighting is just too flat and fake because of the brightness of the backlight, and it's hard to change because it applies to the entire screen; even with a higher contrast ratio, it didn't help with the lighting.

If you want to see this, you just have to check whether they are using the top three tiers of a TV lineup from whichever company they are comparing; if not, they won't be using LED or OLED tech. And the TVs that tend to use lower resolutions will not be in the top tiers. Regular HD TVs from Samsung go up to the 5000 line, I think? The 6000, 7000, and 8000 are all QHD or higher, with slim forms too.

Pretty sure the 5000 line doesn't have slim versions, and thus no LED backlight in 2013. This article was from 2015 and it might have changed now, but the last TV I bought was for my gathering room, a 50-inch Samsung 7000, the top version in that line, with an LED backlight (late 2013). The 5000 didn't have LED, nor did it go higher than HD. The 6000s had LEDs but did not go past HD in 2013; the 7000s were LEDs in 2014 with a thin form factor, again nothing past HD; the 8000s did have LED backlights when they were introduced in 2013 but didn't have 4K, and in 2015 were the first to come with 4K and all the goodies. This has changed a bit now, so you can't even compare across the lines over different years. Yesteryear's 7000 is today's 5000, or something like that, and they don't shift down a line segment every year either. This is pretty much the same for all TV manufacturers too, because I was looking at Sharp, LG, and Sony when I made my purchase.

So if these guys are comparing over 2 years, you only have two generations of TVs to look at. A TV from the last gen (same size) is not going to have the same features as the latest gen, and 2015 was the first year QHD came out, so if they were comparing against TVs from 2013-14 (same size), there is a lot more than the resolution that changed; one of those things is the backlight change, and another is the OLED changes. There are too many variables to sit around and make the general claim that article made, at least from a power perspective.

I used to upgrade my TV every year and a half to two years. I'm not doing that anymore, and I won't upgrade until high-resolution HDR TVs come out and come down in price, and that is because of the crappy brightness of the backlight.
 
And how does the power flow in the matrix when you have same-color pixels neighboring each other?

But they are not identical. A pixel is made up of subpixels; even if each pixel is the same colour, the subpixels will vary in intensity.

Which is almost completely irrelevant anyway, since each subpixel is driven individually.

Cheers
 
http://www.ultrabookreview.com/10704-razer-blade-14-2016-review/

This finally talks about wattage used by different modes and what the user is doing.

  • 7.6 W (~ 9h 12 min of use) – idle, Power Saving Mode, screen at 0%, Wi-Fi OFF;
  • 11.2 W (~ 6 h 15 min of use)– light browsing and text editing in Microsoft Word, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 11.5 W (~ 6 h 5 min of use)– 1440p full screen video on YouTube in EDGE, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 10.2 W (~6 h 52 min of use)– 1080p full screen .mkv video in Movie App, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 30 W (~2 h 20 min of use)– heavy browsing in EDGE, Balanced Mode, screen at 50%, Wi-Fi ON;
  • 42 W (~1 h 40 min of use)– heavy gaming in 1080p, Balanced Mode, screen at 50%, Wi-Fi ON;
42 W when not plugged in is what a laptop with a 970M eats while gaming at 50% screen brightness, and this is a low-res screen. I wish there were a 1060 review with something similar, but there's nothing right now.

This screen from idle to the use case is 4 watts; add in quadruple the res, 16 watts; add in a touch screen, 21 watts.

Hmmm?

7.6 W when everything is idle, monitor at 0%, Wi-Fi off.

10.2 W when playing a locally stored video.

2.6 W difference covering the monitor at 50% brightness, CPU, GPU, memory, storage, and Wi-Fi module. Storage and Wi-Fi will be negligible, as the storage won't be accessed frequently and the same goes for the Wi-Fi module. The GPU will offload most video processing from the CPU, so it'll be mostly the GPU and monitor.

Power consumption of the monitor at 50% brightness is going to be significantly less than 2.6 W, unless by monitor at 0% he means brightness at 0% rather than the monitor in sleep mode. That seems plausible, as <2.6 W seems unlikely but not impossible for the backlight driving a 14" screen. But regardless, it isn't going to be anywhere remotely close to 16 W at any point. Heck, it's unlikely to be greater than 4 W except at much higher brightness.

It doesn't matter that it isn't a game, it doesn't matter that it's a 1080p video. All 3200x1800 pixels will be in use and changing colors, unless they are just looking at a static image, but that wouldn't qualify as a video.

As resolution increases, the power consumption of the GPU is going to increase, but the consumption of the monitor isn't. The 11.5 W with a 1440p video points to that, as well as some increased power use from the Wi-Fi solution. The monitor itself will not have changed its power consumption in any significant way.

Unless they are doing multiplayer gaming over the Wi-Fi connection, the best comparison is between the 42 W when gaming and the 10.2 W when watching the 1080p video. In that case, the ~31.8 W increase will be almost entirely due to the CPU and GPU (the storage solution will contribute a negligible increase in power consumption).
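For reference, here are the deltas being used in this post, computed straight from the review's figures quoted above; attributing each delta to specific components is this post's reasoning, not something the review measured:

```python
# Whole-laptop readings from the ultrabookreview.com list quoted above.
readings_w = {
    "idle, screen 0%, Wi-Fi off": 7.6,
    "1080p local video, screen 50%": 10.2,
    "1440p YouTube, screen 50%": 11.5,
    "heavy gaming 1080p, screen 50%": 42.0,
}

idle = readings_w["idle, screen 0%, Wi-Fi off"]
video = readings_w["1080p local video, screen 50%"]
gaming = readings_w["heavy gaming 1080p, screen 50%"]

print(f"video  - idle  = {video - idle:.1f} W")   # ~2.6 W: screen + CPU + GPU + Wi-Fi combined
print(f"gaming - video = {gaming - video:.1f} W") # ~31.8 W: almost entirely CPU + GPU load
```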

But all that said, it's a rather poor review when it comes to power use. The author isn't clear about what he means by idle (though that can most likely be assumed), and more importantly, WTF does the author mean by monitor at 0%? That last one is quite important, as monitor at 0% could mean the backlight is still on, which would be different from the monitor in standby, where the backlight is off.

Heavy gaming in 1080p is rather meaningless as well; there's no clue about GPU or CPU load, as neither the game nor the settings used are listed. Likewise, what video did they use? One with slow pans and minimal camera movement? A movie with lots of action and camera pans?

But regardless of all that, the only thing we can be certain of is that the monitor is nowhere close to 16 W.

Meh, it's decent to give an idea of what the notebook is like, but you can derive virtually no information about how much each of the components may or may not be consuming.

Regards,
SB
 
Do you know which TVs have brighter backlights, or more nits? YEAH, LED TVs have brighter backlights!
And do you have a source to back up that claim? Also, in the case of TVs, the brightness of backlights varies considerably from manufacturer to manufacturer. The article clearly says 4K TVs on AVERAGE consume 30% more than HD TVs. I don't even know what you are trying to say with the rest of the post, and since it's full of guesses and theories, you can't draw any conclusions from it anyway.
But regardless of all that, the only thing we can be certain of is that the monitor is nowhere close to 16 W.

Meh, it's decent to give an idea of what the notebook is like, but you can derive virtually no information about how much each of the components may or may not be consuming.

Exactly... extrapolating from made-up numbers with more made-up theories still gives you made-up numbers.
 
And do you have a source to back up that claim? Also, in the case of TVs, the brightness of backlights varies considerably from manufacturer to manufacturer. The article clearly says 4K TVs on AVERAGE consume 30% more than HD TVs. I don't even know what you are trying to say with the rest of the post, and since it's full of guesses and theories, you can't draw any conclusions from it anyway.


Exactly... extrapolating from made-up numbers with more made-up theories still gives you made-up numbers.


Personal experience: going from an HD LCD TV without an LED backlight (then a year old) to an HD LCD with an LED backlight, and then to a 4K with an LED backlight, I can tell the damn difference. I'm not blind, and I can tell the backlight is much brighter on the 4K. And I have only been buying Samsung for the past 10 years or so! AND I still have all 3 TVs in my house.

Is that enough for you, or do you need pics of my 7,000 sq ft house with a pool and tennis court, and my 2 Mercedes AMG 550 4MATICs and Audi A8L?
 