AMD Radeon finally back into laptops?

Now, if we want some data behind this: temps of the 1050 and 1050 Ti don't increase until the core starts boosting.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1050-ti,4787-7.html

[Attached image: 01-Clock-Rate_w_727.png (clock-rate chart from the review)]


Just looking at the temps, they increase by about 35 degrees C; that alone tells us they can save 35 watts or so if they stop the card from boosting.

I agree with your conclusion that this chip can get to 35 W with pretty good performance; easily, actually...

I am not sure I understand some of that though.

"Temps of the 1050 and 1050ti don't increase unitl the core starts boosting"

Yes, because the card goes from idle to max boost, not from base clock to boost clock. I'm not sure how you draw conclusions from the temperature of the card; the fan profile alone makes a big difference here.
 

There have been tests done on other forums where people locked their voltages; the temps don't change much at base clocks.

Come on guys, it's not as if people who have tested these things are going to come on forums and lie to others. I could see it if it was one person, but here you have more than one person confirming the things we have been discussing.
 

Increasing the power limit appears to do nothing for the clocks, and several independent tests have shown a maximum of precisely 1911 MHz. It didn't affect power draw either.

It would be nice if someone with a 1050 Ti dropped the power limit to 50% and showed us the clocks it runs at.
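
For anyone who wants to try it, here's a rough, untested sketch of how that experiment could be scripted with nvidia-smi. The helper names are mine, and 37.5 W is simply 50% of the 1050 Ti's 75 W board power; the minimum allowed limit is board-dependent, so the driver may clamp the request.

[CODE]
# Rough sketch (untested): cap the power limit on a 1050 Ti and log the clocks
# it settles at. Assumes nvidia-smi is on PATH and the script runs as admin/root.
import subprocess
import time

def set_power_limit(watts: float) -> None:
    # nvidia-smi -pl sets the software power cap in watts
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def sample_clocks(samples: int = 10, interval_s: float = 1.0) -> None:
    # SM clock, power draw and temperature, one line per sample
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=clocks.sm,power.draw,temperature.gpu",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True)
        print(out.stdout.strip())
        time.sleep(interval_s)

if __name__ == "__main__":
    set_power_limit(37.5)  # ~50% of the 1050 Ti's 75 W board power (assumption)
    sample_clocks()        # run a game or benchmark alongside this and watch the clocks
[/CODE]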
 
There have been tests done on other forums where people locked their voltages; the temps don't change much at base clocks.

Yes, but the fan profile makes a big difference here. The fans don't spin at idle, so the idle temps are "unnaturally" high compared to a moderate load, and the higher the power draw goes, the more the fans try to even out the temperature.
 
That is true!

Just for you Tots

http://www.tomshardware.com/news/nvidia-10-series-pascal-mobile-gpus,32471.html
Nvidia was vague on the subject of TDP, saying power would differ between OEM implementations and that the company typically does not provide a TDP for its mobile GPUs. Undoubtedly, the power consumption of each module will be lower than the desktop versions in order to work within the constraints of a notebook form factor. In fact, the company revealed that the GTX 1060, 1070, and 1080 would fall into the same power envelope as the GTX 970M, 980M, and 980, respectively.

What did I say about form factors and TDP? You can't compare these things the way you want to with explicit figures when it's like this, can you? Yeah, there is going to be guesstimating going on, but when different people are saying similar things, guess what, it's more likely to be true than just blind guessing.

If you want to break down the mobile 1060 and work it out, you might get a closer number:

A typical 15-inch LCD, how many watts does it use? 50 watts? 50 watts is actually low; I just looked it up and it looks like most are at 65 watts...
Then you have the CPU, 45 watts?
Then you have the motherboard and what it powers, another 30 watts?

What are you left with? Out of the 165 watts, those components take out about 125, leaving roughly 40 for the GPU.

The 1060 is already hitting 45 watts. Even if you take the LCD down to 20 watts, it's still 75 watts, man; I was being conservative with my estimates for these current batches of laptops.
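
Just to put that arithmetic in one place, here's a quick back-of-envelope version. The figures are the rough estimates from the post above, not measurements, and gpu_headroom is just a throwaway helper name.

[CODE]
# Back-of-envelope GPU power headroom, using the rough estimates above
# (guesses, not measurements).
BRICK_W = 165   # power brick rating
CPU_W = 45      # 45 W mobile CPU
BOARD_W = 30    # motherboard and everything it powers

def gpu_headroom(lcd_w: float) -> float:
    """Watts left for the GPU out of the brick."""
    return BRICK_W - (lcd_w + CPU_W + BOARD_W)

print(gpu_headroom(50))  # 40 W left with the original 50 W screen guess
print(gpu_headroom(65))  # 25 W left if the screen really pulls 65 W
print(gpu_headroom(20))  # 70 W left even with a 20 W screen
[/CODE]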

All you have to do is look around and it jumps right out at you: in laptops with the 1060, the 1060 doesn't even have the power needed to fully unleash itself, just because the other components and the form factor are limiting it, yet it still gets close to the desktop variant even when severely power limited.

https://ssj3gohan.tweakblogs.net/blog/7954/10w-lcd-screen-part-3-analyzing-test-results.html

As I stated, 20 watts for the LCD is conservative, because the brightness and what is being shown make a big difference in how much wattage the LCD burns. The brightness has to be turned down, and depending on the model turned down A LOT, to get it down to 20 watts. And if you don't believe the things I have posted thus far, I suggest you look up Microsoft's breakdown of laptop power consumption; the LCD is by far what takes up more than any other component, sometimes as much as 5 times more than the CPU!

If I remember correctly it was on lifehacker.com; it's a fairly old article though, 2010 or so?

Just because you think others don't know what they are talking about doesn't mean anything.

Ok found it

https://blogs.msdn.microsoft.com/e7/2009/01/06/windows-7-energy-efficiency/

They aren't talking about gaming machines, but if you want to talk about the TDP of certain parts, use these figures to work out how much is left over for the GPU in a gaming machine based on the brick it's got. Granted, LCD tech has evolved somewhat, so you need to account for that too, but the other components shouldn't shift much since they will be going down nodes at a similar rate.

http://www.notebookcheck.net/NVIDIA-Quadro-M1000M.151582.0.html

Yeah, an M1000M Quadro is based on a GM107 and its TDP is 40 watts! And performance will be similar too!

The power consumption of the Quadro M1000M is rated for a 40 Watt TGP including the board and memory components, which is 5 Watt lower than the K1100M. Therefore, the card is suited for 15-inch notebooks and greater.
 
Calling you a petulant child would indeed be an insult. Saying your behavior/attitude in your last few posts in this thread is that of a petulant child is not.
Come on man. Both are insults. Hopefully people can stop the insults and move on.

The sales trend for Apple is not so great, and I'm not at all convinced the latest crop of phones, tablets and laptops will change that significantly. Meanwhile Microsoft really brought their A-game with the latest releases. I had hoped for more from Apple, as the current products are pretty minor upgrades and Apple is missing big potential trends like VR. VR will take a significant effort to get right on both the hardware and the software side. If Apple is ignoring VR, it's not something they can fix overnight later on.

http://www.macworld.co.uk/news/appl...-results-iphone-mac-sales-down-again-3581769/

Edit: it's also a very difficult proposition for developers once Apple does support VR, as Apple's VR-capable hardware base is zero.
I wouldn't say Apple's VR base is zero. You're correct for Oculus' performance target, but the content that runs on Gear VR or Google Cardboard requires far less processing power than the Mac Pro is capable of. So if Apple throws their hat into the VR game, the size of the installed base will depend on the content Apple wants its users to experience.
 
A typical 15-inch LCD, how many watts does it use? 50 watts? 50 watts is actually low; I just looked it up and it looks like most are at 65 watts...

I'd say it's closer to 5-10 W max with modern laptops on normal settings; max brightness of course uses more, but laptop displays won't go anywhere close to 50 W, I'd guess not even to 20 W?

https://ssj3gohan.tweakblogs.net/blog/7954/10w-lcd-screen-part-3-analyzing-test-results.html

As I stated, 20 watts for the LCD is conservative, because the brightness and what is being shown make a big difference in how much wattage the LCD burns. The brightness has to be turned down, and depending on the model turned down A LOT, to get it down to 20 watts. And if you don't believe the things I have posted thus far, I suggest you look up Microsoft's breakdown of laptop power consumption; the LCD is by far what takes up more than any other component, sometimes as much as 5 times more than the CPU!

https://blogs.msdn.microsoft.com/e7/2009/01/06/windows-7-energy-efficiency/

The first link is quite interesting! Thanks for that. The figures he has compiled are for desktop monitors from 22" and up, and that makes quite a bit of difference compared to smaller laptop displays with more power management as well. The display uses a lot of power as a percentage in laptops because it's always on, while the processor and the GPU are either idling or at low to moderate load. If you stress the CPU and GPU to near max levels you'll see the consumption percentages radically flip over, as the battery is drained in a heartbeat.

edited some typos...
 
Well, the main change in LCD tech is the use of LED backlights, which save around 30% power, so I'm still thinking they are around 15 to 20 watts, and that is if the screen is set to power-savings mode. Something like the Razer Blade with its high-resolution screen seems about right; it's more about the resolution of the screen than the size.

So in any case, the point being: TDP, power usage, all that good stuff, is limited by the form factor. Even if the MXM module is rated for a higher TBP, it might not even reach that, because the form factor won't allow it to.

Then you have to look at the reason why these mobile GPUs sip power compared to the desktop versions. The power circuitry on the boards is very different, and they have been voltage binned (both the VRAM and the GPU). So getting a 1050 down to 35 watts should be the same task as getting a 750 Ti down there too; it should be a no-brainer to figure that out, and they probably don't even need to lock voltages either...

And we can also see the 1060's perf/watt is worse than the 1050's and the 1070's and 1080's. What does that tell us? It's being pushed a little harder on the performance side, so its perf/watt dropped compared to the other cards. So what happens if we drop clocks on that card? A major drop in power used. That card might even get down to 40 watts or so if needed.
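
For why downclocking cuts power so hard, the usual rule of thumb (not a figure from any review here, just the standard CMOS approximation) is that dynamic power scales with frequency times voltage squared, and voltage has to climb with frequency near the top of the curve; the ratios below are illustrative assumptions only.

[CODE]
# Rule-of-thumb model only: dynamic power ~ f * V^2, so dropping clocks (and the
# voltage that comes with them) cuts power much faster than it cuts performance.
def relative_dynamic_power(clock_ratio: float, voltage_ratio: float) -> float:
    """P_new / P_old under the P ~ f * V^2 approximation."""
    return clock_ratio * voltage_ratio ** 2

# Example: 15% lower clocks with a 10% lower voltage (hypothetical numbers)
print(relative_dynamic_power(0.85, 0.90))  # ~0.69, i.e. roughly 30% less dynamic power
[/CODE]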
 
Heavily downclocked

specs here

http://creators.radeon.com/radeon-pro/

Radeon Pro 460: 1.86 TFLOPS
16 (1024) compute units (stream processors)
80 GB/s memory bandwidth
~900 MHz max clocks

Radeon Pro 455: 1.3 TFLOPS
12 (768) compute units (stream processors)
80 GB/s memory bandwidth
~900 MHz max clocks

Radeon Pro 450: 1 TFLOPS
10 (640) compute units (stream processors)
80 GB/s memory bandwidth
~800 MHz max clocks

Pro 460 max ~907 MHz
Pro 455 max ~846 MHz
Pro 450 max ~781 MHz
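
Those clocks fall straight out of the published TFLOPS numbers, since FP32 TFLOPS = 2 FLOPs per shader per clock × stream processors × clock. A quick check (the helper is just mine for illustration):

[CODE]
# FP32 TFLOPS = 2 FLOPs per shader per clock * stream processors * clock
def max_clock_mhz(tflops: float, stream_processors: int) -> float:
    return tflops * 1e12 / (2 * stream_processors) / 1e6

print(round(max_clock_mhz(1.86, 1024)))  # ~908 MHz -> Pro 460
print(round(max_clock_mhz(1.30, 768)))   # ~846 MHz -> Pro 455
print(round(max_clock_mhz(1.00, 640)))   # ~781 MHz -> Pro 450
[/CODE]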
 

So 25% lower clocks for 50% of the TDP. Not too bad a tradeoff. I'm guessing this is the only scenario where Polaris achieves the 2.5x improvement in perf/watt.
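
Quick sanity check on that tradeoff, assuming performance scales roughly linearly with clock (an assumption, not a measurement):

[CODE]
# "25% lower clocks for 50% of the TDP", assuming perf scales ~linearly with clock
relative_perf = 0.75    # 25% lower clocks
relative_power = 0.50   # half the TDP
print(relative_perf / relative_power)  # 1.5 -> ~50% better perf/watt than the full-TDP part
[/CODE]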
A typical 15-inch LCD, how many watts does it use? 50 watts? 50 watts is actually low; I just looked it up and it looks like most are at 65 watts...
Then you have the CPU, 45 watts?
Then you have the motherboard and what it powers, another 30 watts?

You think a 15 inch LCD consumes 50 watts? And a laptop motherboard consumes 30 watts?

For reference, the 14" laptop I am currently typing on has a 768p display, a 17 W CPU and a mechanical HDD, and its power adapter is rated at 45 W.

Either way, I don't have a source, but AFAIK the GTX 1060 MXM has a typical TDP of ~80 W.
Pro 460 max ~907 MHz
Pro 455 max ~846 MHz
Pro 450 max ~781 MHz

What I am most surprised by is the fact that memory bandwidth is the same for all of the configurations. I would have expected at least the 460 to have higher speed GDDR5.
 
You think a 15 inch LCD consumes 50 watts? And a laptop motherboard consumes 30 watts?

For reference, the 14" laptop I am currently typing on has a 768p display, a 17 W CPU and a mechanical HDD, and its power adapter is rated at 45 W.

Either way, I don't have a source, but AFAIK the GTX 1060 MXM has a typical TDP of ~80 W.


Well, it depends on the laptop. We were talking about the Razer Blade, which has a 14-inch screen with QHD resolution and a 165-watt power brick, so it's quite different from what you have.

Its CPU is rated for 45 watts and its motherboard is rated for 30 watts; all that's left that we don't know is its screen. You would be hard pressed to find QHD panels under 20 watts. In any case it all ends up with around 75 watts left over for the graphics card, which is right around the 80 watts that you stated, right?
 
This article lists LCD (IPS) and OLED power consumption for a 14-inch display as 5.2 W for the LCD and 8.7 W for the OLED (with a completely white screen; in regular use, less than half of that). Both screens are at WQHD resolution. Even with a 15.6" 4K screen, you won't exceed 10 W.

Cheers


Complete white is the lowest power level for LCD pixels ;) OLEDs are a bit different, and I'm not that familiar with many laptops that use OLEDs; I thought most of them haven't switched over because of the life expectancy and glare.

It's like 3 or 4 times less power usage than black. So yeah, it's still going to go above 20 watts when doing daily things.
 
Yeah, that is why I stated OLEDs change that, but to my knowledge not many laptops use OLEDs for the higher-end, higher-resolution versions, because of the glare factor.

I just looked up a few laptops; yeah, most still seem to use LCD/IPS, the Razer Blade does... many people wished it came with an OLED screen, but the only laptops that seem to come with those are super thin ones, like the MacBook Air.
 
Yeah, that is why I stated OLEDs change that

Eh? o_O Google says otherwise (what Gubbi said).

Wiki said:
While an OLED will consume around 40% of the power of an LCD displaying an image that is primarily black, for the majority of images it will consume 60–80% of the power of an LCD. However, an OLED can use more than three times as much power to display an image with a white background, such as a document or web site.
 
Razor1 is saying that an LCD uses the least amount of power when it is displaying a white screen, and he is also saying that this is not the case with OLED.
Actually, for LCDs it depends on whether it's IPS, VA or TN (but I forgot which ones have a higher power level with white and which with black...). But in any case, unlike OLED, the differences are rather small - the power draw of the backlight dominates easily (hence, with an adaptive backlight, power draw will be lower with darker content no matter whether it's IPS, VA or TN).
 
An adaptive backlight works on the entire screen, which is a component of what I'm talking about specifically for power savings. The pixel colors matter for LCDs, whether IPS, VA or TN; VA does use the most power of them all purely at the pixel level, but all of them have the same characteristics from a power perspective when looking at the color a pixel is set to.

Even with an adaptive backlight, the panel still has to drive the individual pixels, which will draw different amounts of power based on what is being rendered. A pure white screen at, let's say, 100 nits will still burn more watts than a gray screen at the same nits.
 