AMD Radeon finally back into laptops?

Discussion in 'Architecture and Products' started by ToTTenTranz, Jul 12, 2016.

  1. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Without boost clocks, I'm sure it can get down that far, if not farther when downclocked. With boost clocks the 1050 Ti stays around 70 watts.

    If you lock a 1060 to its base clocks, it goes down to 65 watts... a 1070 goes down to 75 watts.

    They can cut power usage by 50% just by locking clocks (more specifically voltage, so boost clocks can't be used).

    They don't need to downclock a 1050 or 1050 Ti to get to where an RX 460 is when it's downclocked by 25%.

    I'm also pretty sure this is why the mobile Pascals seem a bit slower than the desktops: laptops are still limited to a lower TDP than what the desktop Pascals can use, so they don't boost as much.

    Then factor in binning for lower-voltage chips, and there ya have it. Even gaming laptops have a cap on their TDP. For a 1060 gaming laptop I would expect a 70-watt TDP, no more; a 1070 laptop, probably 105 watts.
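    (As a rough sanity check of the "lock the voltage, halve the power" claim above, here is a minimal sketch using the standard CMOS dynamic-power approximation P_dyn ∝ f·V². The clock and voltage points below are hypothetical round numbers chosen for illustration, not measured GTX 1060 values.)

[code]
# Illustrative only: dynamic power scales roughly with frequency times
# voltage squared (P_dyn ∝ f * V^2). Constants below are hypothetical.

def dynamic_power(p_ref, f_ref, v_ref, f, v):
    """Scale a reference power figure to a new frequency/voltage point."""
    return p_ref * (f / f_ref) * (v / v_ref) ** 2

# Hypothetical boost point: 120 W at 1900 MHz, 1.06 V.
# Hypothetical base point: 1506 MHz at 0.80 V (voltage locked, no boost).
p_base = dynamic_power(120.0, 1900.0, 1.06, 1506.0, 0.80)
print(f"Estimated power with boost voltage locked out: {p_base:.0f} W")
# -> ~54 W: the voltage drop contributes more than the ~21% clock drop,
#    which is how a clock/voltage lock can roughly halve power draw.
[/code]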
     
    #81 Razor1, Oct 28, 2016
    Last edited: Oct 28, 2016
  2. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,099
    Likes Received:
    4,675
    According to PCGamer:
    And as I said before, there's no guarantee that lowering the clocks on the GP107 will make the full card go as low as 35W.
    Regardless, that chip wasn't ready in time for the Surface Book, so it probably wouldn't be ready in time for this MacBook Pro either.
     
  3. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    The discrete card at its base clock of 1508MHz (something like that anyway) uses 62W in game; at 2050-2100MHz it uses 120W, and the higher 130W-140W figures have more to do with custom models and top clocks (both memory and core).
    So you can extrapolate from that, and that is for the full discrete 6GB card.
    Cheers
     
    #83 CSI PC, Oct 28, 2016
    Last edited: Oct 28, 2016
    pharma and Razor1 like this.
  4. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    They are guessing; they don't know. Other websites have stated 65 and 75 watts.

    And hell yes it can get to 35 watts. Just search for voltage locks for the 1060 and 1070; people have done them on other forums, and they are getting the 50% power savings I have been talking about.

    This is a moot point anyway, because we know Pascal kills GCN Polaris in perf/watt. You shouldn't even question whether a 1050 or 1050 Ti could get down to 35 watts and still have better performance; the question is how much more performance it will have, and the end result is most likely retaining the base clocks, with no boost.

    Now, if we want some data behind this: temps of the 1050 and 1050 Ti don't increase until the core starts boosting.

    http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1050-ti,4787-7.html

    [Chart: GTX 1050 Ti temperature measurements, Tom's Hardware]

    Just looking at the temps, they increase by 35 degrees C. That alone tells us they can save 35 watts or so if they stop the card from boosting (this is a seriously overclocked card), so just to point out, temps play a pretty important role in power usage when this card is overclocked.

    With regular boost disabled, the lower temps alone will save 10 watts or so; then take into account the effect of the voltage change, which increases power usage considerably more than the temps do, and you are looking at a lot of savings just by disabling the boost voltage.
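    (To illustrate the temperature/leakage point above, here is a toy model in which leakage power doubles every ~25 °C. Both the doubling interval and the 15 W leakage share are assumptions made purely for illustration, not GP107 silicon data.)

[code]
# Toy model only: static leakage grows roughly exponentially with die
# temperature. The doubling interval and wattage split are assumptions.
def leakage_power(p_leak_ref, t_ref_c, t_c, t_doubling_c=25.0):
    """Scale leakage power, assuming it doubles every t_doubling_c degrees C."""
    return p_leak_ref * 2.0 ** ((t_c - t_ref_c) / t_doubling_c)

# Suppose (hypothetically) 15 W of the card's draw is leakage at 75 C.
saved = 15.0 - leakage_power(15.0, 75.0, 40.0)
print(f"Running 35 C cooler saves ~{saved:.0f} W of leakage alone")
# -> ~9 W from temperature alone, before any dynamic-power savings
#    from dropping the boost voltage.
[/code]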
     
    #84 Razor1, Oct 28, 2016
    Last edited: Oct 28, 2016
    -Sweeper_ and xpea like this.
  5. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,099
    Likes Received:
    4,675
    No, you cannot extrapolate "downwards" from whatever happens "upwards". It just doesn't work that way, because power consumption versus frequency is not linear.
    I get that you often take those line charts from Tom's Hardware as some kind of holy grail that tells you the exact power consumption at each exact frequency, which is a tremendous mistake IMO (and this is something the author himself says in the article).

    Just to show how ridiculous that would be: that GTX 1060 line shows a slope of 0.1 W/MHz (a 20W difference on the y-axis over a 200MHz difference on the x-axis, between 60 and 80W). With y = m*x + b and the point (1500MHz, 60W), we get [ power = 0.1*frequency - 90 ].

    This is fantastic, because according to your extrapolation, if the frequency of the GTX 1060 is 850MHz, then power = 0.1*850 - 90 = -5 W.

    Congratulations nvidia and @CSI PC, you just invented a graphics card that generates power out of nothing!
    It's a miracle! Let's stop all the investment in renewable power sources and just harvest the power generated by those GP106 GPUs below 900MHz, and we'll have free clean energy for all!


    Now to the serious part: just because you can't see a horizontal asymptote down at 1500MHz (the way you can see the start of a vertical one close to 2GHz), it doesn't mean it won't be there at 1400MHz or 1300MHz.
    As I stated before, there's a range of frequencies and voltages (not shown in the graph, BTW) within which the chip behaves with that fairly linear slope. The chip was simply binned for that range. Outside those values, there will be little or no returns.
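    (A quick sketch of the argument above: the same straight-line fit that looks fine inside the measured 1500-1700MHz band produces physically impossible numbers once extrapolated below it. The slope and intercept are exactly the ones derived above.)

[code]
# The 0.1 W/MHz line fitted through (1500 MHz, 60 W) and (1700 MHz, 80 W):
def linear_power(f_mhz):
    return 0.1 * f_mhz - 90.0

for f_mhz in (1500, 1700, 850):
    print(f"{f_mhz:>4} MHz -> {linear_power(f_mhz):+.0f} W")
# 1500 MHz -> +60 W  (inside the fitted band)
# 1700 MHz -> +80 W  (inside the fitted band)
#  850 MHz ->  -5 W  (extrapolated: impossible, so the line says nothing
#                     about power at low clocks)
[/code]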


    They are testing, and they put the number in that ballpark.
    Feel free to share these other websites that claim 65 to 75W through testing and not guessing.

    Different chips, made at a different foundry, using a different process.
    Feel free to share links of people claiming they got their GP107 cards to pull 35W total from the 12V and 3.3V rails, and at what clocks.


    Like the 1050 Ti would unquestionably reach 2 GHz on air with a 6pin connector?


    In terms of power envelope, the GP107 is the successor of GM107 (GTX 750 Ti, 850M, 950M), and nvidia never had a single GM107 mobile card go down to 35W.
    A smaller GM108 was developed to reach those values.
     
  6. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    You don't seem to understand how leakage works and what happens with the increased voltages of the boost states. I suggest you look things like that up when you have free time, before spouting about EE as if it couldn't be figured out with math.

    Are you guessing on that? Because I don't see the data showing it...

    I think you should search for them. You kinda put your foot in the fire here with some pretty absurd comments. One of them is ExtremeTech, so go look.

    I think it can do it; it's just that the BIOS isn't letting them. We can see the cooling on most of these 1050 Tis is not enough to reach 2GHz. We can also see that when overclocking without the 6-pin attached they still get to 1750MHz, which means at the 75-watt max board power they have a butt load of room to push power.
    Oh, I think you missed the Razer Blade ultrabook then... Yes, they did.
     
    #86 Razor1, Oct 28, 2016
    Last edited: Oct 28, 2016
  7. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,099
    Likes Received:
    4,675
    The Razer Blade didn't miraculously halve the GTX 1060's TDP. It just uses very loud fans to dissipate the heat of an 80W graphics card, just like its 2015 predecessor.
    If it were using a 35W CPU together with a 35W GPU, it wouldn't need a 165W power supply.


    As for the bile, personal attacks and derailment, I'll just make good use of B3D's top tip of 2016.
     
  8. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    You stated the 750 Ti; the 750 Tis that were used in notebooks had TDPs of around 35 to 45 watts ;)

    BTW, you were the one who stated you thought nV can't get to that power rating.

    If you want me to quote it, here ya go


    Don't sit here and try to redirect your line of thought onto others when it was directly associated with what you now call derailment.

    As for the rest of it, I did no such thing. I just stated you need to look some stuff up before you post, because you haven't looked for the information. If I were to do something like that, ya know (and everyone knows) I have no qualms about calling out what a person truly is in exact words, and I don't see myself posting like that here, so I didn't.

    http://www.extremetech.com/gaming/1...nt-amd-and-nv-graphics-cards-by-a-huge-margin

    The GTX 800 series (performance chips) used GM107 chips, not GM108; GM108 is the chip discussed in the ExtremeTech article I just linked.

    The lower-end GTX 800 parts (I think the 840 and 830) were the ones with GM108 chips.

    https://en.wikipedia.org/wiki/Maxwell_(microarchitecture)

     
    #88 Razor1, Oct 28, 2016
    Last edited: Oct 28, 2016
  9. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,137
    Likes Received:
    2,939
    Location:
    Well within 3d
    The radeon.pro page has a (partially obscured, off-angle, limited area in focus) visual representation of what appears to be the 460 die.

    Assuming it's representative:
    Without doing any visual manipulation, does it seem like the CUs were engineered to be narrower?
    The SIMD portions are not symmetrical in these CUs, unlike Polaris 10 and others, but like some of the APU implementations like Carrizo. The inner SIMD section looks to be more heavily reworked.
    Various other blocks seem to have been rotated and various SRAM blocks stacked along the length of the CU rather than width.
     
    Lightman likes this.
  10. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    You know your point is bloody stupid and deliberately baiting/twisting my information.
    I think you just like arguing with me, just like when you insisted the Nintendo Switch was not Nvidia and was most likely DMP.

    By extrapolate, I mean look at the figure at base clock/0.8V and go from there in a more sensible way than the wrong figure quoted by PCGamer, not via your pointless example of -5W at 850MHz and a silly voltage figure. There is a limit to how stable the chip will be at minimum voltage values, and yeah, it is obvious the linear behaviour only holds within the silicon node's optimal performance envelope; you have read enough of my posts to know I go on about that a lot in various threads.
    Given the choice between a figure that has no basis in measurements and an actual measured benchmark, albeit for the discrete 1060 6GB, analysed for voltage-frequency-watts-performance from 0.8V to 1.1V, I know which I prefer.
    So you're using as fact the made-up estimate from PCGamer to make a point, and you still want to argue about it....
    Case in point, the article you use says
    But instead of arguing and posting a baiting/sarcastic response, you would know that at 1508MHz it ONLY USES around 62W when measured accurately.
    I have posted the Tom's Hardware analysis many times, and ironically you also argued briefly against it in the past.
    Here is part of it again, and please cut back on the deliberate baiting/taking my posts out of context.

    [Chart: GTX 1060 voltage/frequency/power analysis, Tom's Hardware]

    And then they checked their analysis against gameplay trend analysis as well. Anyway, it is far more precise and accurate than the PCGamer figure you want to argue about: http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1060-pascal,4679-7.html

    I guess you also want to argue that Nvidia is wrong in stating that their P4 (GP104, the same chip as the GTX 1080) Tesla is spec'd at 75W with 5.5TFLOPs FP32; and yeah, that has a clock range of 810MHz-1114MHz (75W is at the top clock, not base).
    Anyway, I am out of this discussion now, but I would not use the PCGamer figure; I would use the Tom's analysis to extrapolate the watts, and yes, I assume everyone here has the common sense to do that with reason.
    Cheers
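    (Side note: the quoted 5.5TFLOPs figure is easy to sanity-check. Peak FP32 throughput is 2 ops per clock per shader via FMA, and a full GP104 has 2560 FP32 lanes. This is just arithmetic on the quoted specs, nothing more.)

[code]
# Which clock makes a full GP104 (2560 FP32 lanes) hit 5.5 TFLOPs FP32?
# Peak FP32 = 2 ops/clock (FMA) * lanes * clock.
lanes = 2560
target_flops = 5.5e12
clock_mhz = target_flops / (2 * lanes) / 1e6
print(f"{clock_mhz:.0f} MHz")
# -> ~1074 MHz, which sits inside the quoted 810-1114 MHz clock range.
[/code]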
     
    #90 CSI PC, Oct 28, 2016
    Last edited: Oct 28, 2016
    xpea, ieldra, pharma and 2 others like this.
  11. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,099
    Likes Received:
    4,675
    Have you seen this entry at radeon.com?
    Perhaps it has something to do with this:



    @CSI PC, the mere fact that you put in bold some power/frequency values that I actually wrote in the post you quoted shows how little you read it or paid any attention to it.
    I really don't care. Go ahead and use that graph as your holy grail for predicting power at clocks and frequencies that are not within the values depicted in it. I just tried to explain how that doesn't make any sense, but whatever.

    And then you go on treating whatever average power values Tom's Hardware measured in a single test as if they validated that card's Thermal Design Power (which is kind of important and restrictive for small enclosures like a laptop). Even the author himself states those results are limited because of their time constraints:
    Metro is 1 game, 1 engine in 1 API, probably even using a timedemo, which is 1 scripted run in 1 scenario, and all of this in 1 settings config (for 3 different resolutions, yay). For you, apparently, it's everything. Hint: don't pursue a Quality Assurance career.

    I also think it's awfully childish to rub in some of my guesses that didn't come true but are completely unrelated to the matter at hand. It's called a strawman.
    Yes, up until the last moment I thought the NX was more likely to have either an AMD SoC or a Nintendo SoC with a GPU by DMP. I thought that because a) AMD had yet to announce their 3rd semi-custom design win, b) DMP had just announced a new family of GPUs capable of scaling up to 1TFLOPs, c) nvidia never hinted they had a semi-custom design win during their investor calls, d) Nintendo had used Tegra for dev boards of the 3DS and this could be the case again, and e) IIRC Jen-Hsun Huang had stated that console wins weren't the deals with the kind of profits they were looking for (turns out he was full of shit).

    My opinion had zero to do with arguing with you or anyone else. I don't remember, nor care, whether my opinions about the NX's SoC were made through replies to your username or someone else's.
    You give yourself way too much importance.
     
  12. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Yeah, pretty familiar with wafer thinning; Intel has been using it for quite some time now. This is nothing new.

    Well, you are guessing when there is information that means you don't need to guess, and that is what CSI_PC and I have been saying.

    For a person who is saying a quality test engineer should follow more than one data point: it's a hell of a lot better than guessing in the first place, isn't it?
     
    #92 Razor1, Oct 28, 2016
    Last edited: Oct 28, 2016
    CSI PC likes this.
  13. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,137
    Likes Received:
    2,939
    Location:
    Well within 3d
    That seems to involve a manufacturing step more concerned with the Z axis of a chip, perpendicular to the plane the CUs are on.
    Perhaps there are considerations for the implementation that can influence this to a limited extent, if for example there are mechanical reasons for specific choices.
    With die thinning the big win would be shaving (ed: mechanically grinding) the non-patterned side of the wafer, which makes up most of the height.
     
    #93 3dilettante, Oct 28, 2016
    Last edited: Oct 28, 2016
    ieldra, pharma and Razor1 like this.
  14. CSI PC

    Veteran Newcomer

    Joined:
    Sep 2, 2015
    Messages:
    2,050
    Likes Received:
    844
    Sigh,
    you do realise it was Tom's Hardware that proved the 480's TDP and power distribution were incorrect, using a similar methodology and context, followed up by PCPer?
    Funny that AMD corrected their card based on Tom's Hardware and PCPer, but hey, what do we all know. And Metro gave results good enough for the point back then; it is hard on power draw for both manufacturers, hence why it is used.
    Yeah, time constraints, and yet they spent a solid 2 days just mapping the envelope of 2 GPUs. And again you ignored that they double-checked by running the game and analysing default/minimum/max clocks to see how they correlated with their earlier model.
    Anyway, enough, but I had to respond, as again you skewed the context and my knowledge. Well done.
     
    #94 CSI PC, Oct 28, 2016
    Last edited: Oct 28, 2016
    ieldra likes this.
  15. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Interestingly, even though Metro is an old game, it's demanding when it comes to power consumption, so what we are looking at is one of the more power-intensive games that give out those kinds of figures.
     
  16. ieldra

    Newcomer

    Joined:
    Feb 27, 2016
    Messages:
    149
    Likes Received:
    116
    I don't think anyone needs to point out that you pulled a very dull and banal strawman argument, with an irritating mocking tone to boot, but the hypocrisy of then accusing others' posts of being full of bile was really the cherry on the cake.

    Good God man, this is the kind of thing I expect on reddit...

    Obviously I cannot speak for Razor or CSI, but I am quite certain nothing they said suggested power and frequency scale linearly.

    The graph posted above shows fairly linear behavior in the clock range tested.

    I'm going to take care to be clear, because I don't want you pulling another strawman; pretty soon this thread will be a barn.

    1. Indisputable fact: the entirety of a GTX 1060 consumes 62 watts at 1500MHz; at 2000MHz it's ~120W.

    2. Indisputable fact: the Tesla P4 is a full GP104 config with slower GDDR5, clocked at ~1GHz boost. The power draw claimed by Nvidia is 50-75W for the whole board.

    So what exactly is your problem?

    The 1060 has a ~50% reduction in power from a 25% reduction in clock, going from 2000MHz to 1500MHz.
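    (The two quoted points make this easy to check. Here is the implied power-law exponent, assuming nothing beyond the 62W/1500MHz and ~120W/2000MHz figures above.)

[code]
import math

# Quoted measurement points for the GTX 1060:
p1, f1 = 62.0, 1500.0    # watts, MHz
p2, f2 = 120.0, 2000.0

clock_cut = 1.0 - f1 / f2                      # fraction of clock removed
power_cut = 1.0 - p1 / p2                      # fraction of power removed
n = math.log(p2 / p1) / math.log(f2 / f1)      # exponent in P ∝ f^n

print(f"clock cut {clock_cut:.0%}, power cut {power_cut:.0%}, n = {n:.2f}")
# -> clock cut 25%, power cut 48%, n ≈ 2.3: consistent with voltage
#    dropping alongside frequency, not with linear P ∝ f scaling.
[/code]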



    Seriously? Come on.
     
    Razor1 and DavidGraham like this.
  17. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    10,099
    Likes Received:
    4,675
    Original statement:

    Counter-arguments:

    1 - Personal attacks
    2 - Flamebait
    3 - Mentioning completely unrelated stuff like my opinion on the Nintendo NX SoC to try to discredit one's opinion
    4 - Putting "likes" in each of the BFF's posts
    5 - But GP106 (a different chip on a different process) consumes 60W at 1500MHz in one test by one website
    6 - But GP104 (a different chip on a different process, in what are probably very highly binned products sold at huge margins) consumes 75W at 800MHz, according to the IHV.


    And now there's this 75-post guy opening his arguments by calling me irritating and a hypocrite.
    With the obvious likes from the BFFs.


    I just wrote we don't know if GP107 can scale down while maintaining its original power/performance curve.
    And all hell broke loose.
    Let that sink in.
     
  18. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    What happened to the post before you posted that? The one I quoted you on? Is that just not important anymore for this conversation?

    I don't get you, ToTT: you start something, but don't like it when people show you the contrary of what you stated.

    I showed you the GP107 did get down there. Yet you keep going with your imaginary thinking...

    BTW, I did no such things to you, other than stating you should look up some of the things you are posting about, hinting that you are not correct. Maybe next time I will call you out as you are? Would that make you feel better? At least that way, when you accuse me of personally attacking you, it will actually be real.
     
    #98 Razor1, Oct 29, 2016
    Last edited: Oct 29, 2016
    CSI PC, DavidGraham and ieldra like this.
  19. FriendlyNeighbour

    Newcomer

    Joined:
    Sep 18, 2013
    Messages:
    21
    Likes Received:
    8
    I'm laughing as I am reading all this flaming. You people take yourselves way too seriously. It's just a message board. Nobody will remember what you wrote 48 hours from now.
     
  20. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Now we have come down to Pokemon memes, lol! At least it's entertaining :)

    BTW, from the way AMD has been marketing their TDPs this gen in their manuals, the TDP figure for the laptop GPU is for the GPU only. So yeah, the question arises: how much is it with the memory included?
     
    #100 Razor1, Oct 29, 2016
    Last edited: Oct 29, 2016