AMD Radeon HD 6000M Series Laptop GPUs Launched

Discussion in 'Architecture and Products' started by Berek, Nov 29, 2010.

  1. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,015
    Likes Received:
    112
    Idle power is hugely important for mobile. I don't think there will be a 3 W difference for the mobile parts, though - but GF108 supports Optimus, which could be just what the OEMs want (yes, I'm aware that's pretty much a difference in software only, but the OEMs won't care whether it's hw or sw...).
    It's true that for desktop graphics cards the HD 5570 walks all over the GT 430 while drawing less power under load, though I'm not sure the situation is the same for the mobile parts (it could be closer there depending on clocks - something hard to judge given those parts can ship with vastly different specs).
     
  2. wishiknew

    Regular

    Joined:
    May 19, 2004
    Messages:
    332
    Likes Received:
    6
    So can this MVC 3D magic in theory be applied to the other 5x00 chips?
     
  3. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    Optimus has a pretty terrible reputation with online gamers who play anything that hooks into the PunkBuster framework. Even outside of PB, it still has mounds of problems with app detection on BOTH sides of the fence, i.e. not turning on when it should, or turning on when it shouldn't. There are dozens of forums with many hundreds of complaints about Optimus and its inability to get things working correctly. Just type "Optimus issues" into Google and you'll find plenty of very recent examples.

    Given the options, I'm far happier to "flip the switch" to get my needed 3D performance when I want it, versus hoping that NV's driver can figure it out for me with no other way to tell it otherwise. That was one of the fundamental decision points in my purchase of a Lenovo Y460 earlier this year...
     
  4. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,555
    Likes Received:
    699
    I have an Optimus laptop, and a lot of what you said is pretty much BS. You can configure which apps should launch on the discrete GPU whenever you want; you have full control over it. As for telling whether the GPU is on or not, I have two ways of knowing:

    1 - A light in my laptop changes color (Blue - Intel HD; White - GeForce).
    2 - A taskbar icon where I can see which applications are using the GPU, if any.

    I have never had any problems with it, except one time when Intel updated its Intel HD drivers, and even that was corrected by nVIDIA in no time.
     
  5. TKK

    TKK
    Newcomer

    Joined:
    Jan 12, 2010
    Messages:
    148
    Likes Received:
    0
    - on par only in raw performance, not in [gaming] image quality or driver stability.

    - it will take quite some time until notebook vendors finish the transition from Clarkdale to SB dual-cores, and most Llano notebooks probably won't see the light of day before Q4 '11. Such transitions don't happen overnight, so there's enough time left to sell those 6300Ms.


    Possibly. We don't know yet which chips will get which numbers, but I have a hunch it might look like this...

    6900M = Barts/Blackcomb
    6800M = Juniper/Granville rename
    6700M = Turks/Whistler GDDR5
    6600M = Turks/Whistler DDR3, *maybe* Redwood/Capilano GDDR5
    6500M = Redwood/Capilano rename
    6400M = Caicos/Seymour
    6300M = Cedar/Robson rename
     
  6. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    Say what you want, but it's still very much a problem. Perhaps not for you, but the PunkBuster bug is STILL there even after a year of the technology being available. There are still issues with Adobe Flash, and there are still issues where you add games in the driver's manual configuration but lose the "Apply" button needed to make the change stick.

    And I still have no way to turn the damned thing off if I don't want the NV card running (i.e. I'm on a flight and don't mind using the onboard graphics for something like Minecraft or Fractal).
     
  7. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,555
    Likes Received:
    699
    What? LOL!
    I do that all the time when I'm on battery.
    Just run the browser with the integrated GPU (right-click the browser icon -> Run with graphics processor -> Intel HD; this has to be enabled first via the nVIDIA taskbar icon -> Show context menu options).
    If it's Flash, just go to the Flash options and disable hardware acceleration.
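    A side note: application developers can also opt in from their side. NVIDIA documents an exported global that its Optimus driver checks at process start; defining it requests the discrete GPU for that program. A minimal sketch (Windows/MSVC only; whether it helps any given title still depends on the driver honoring it):

        #include <windows.h>

        // Exported global checked by NVIDIA's Optimus driver when the
        // process starts. A value of 1 requests the high-performance
        // (discrete) GPU; it is ignored on systems without Optimus.
        extern "C" {
            __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
        }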
     
  8. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    Dunno, I didn't make the mistake of buying an NV-based video solution in my laptop, so mine works exactly as I expect. But it's a very common request on a rather large pile of forums, even as recently as this month.

    Guess it's not as easy as you suggest?
     
  9. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,555
    Likes Received:
    699
    Say what? :roll:
    You just said:

    What? Do you want it to read your thoughts? :razz:
    Of course, if you don't want it to use the discrete GPU in an application where it normally would, you have to configure it not to.

    All I see here is bashing Optimus just because it's nVIDIA, sorry.
     
  10. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    745
    Likes Received:
    39
    Location:
    Copenhagen
    Huh?
    I've also seen a lot of trouble with it. If only it were possible to turn it off in the BIOS or something (I think some laptops allow this); I wouldn't mind if such laptops were always running on the nvidia GPU. But no, we have to rely on the automatic software working, which isn't always the case. Good for you that you haven't run into the problems, but don't call BS just because of that.
     
  11. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    I want the product to do what I tell it to. If I want the graphics card ON, then it should turn ON. If I want it OFF, then it should turn OFF. Your tongue-in-cheek response is that the software can figure it out -- reality (and a whole lot of people on a big pile of forums) suggests that it isn't always as easy as you say it is.

    What part are you having a hard time understanding? That it doesn't always work the way you expect it to? Or that somehow, in some way, NVIDIA may not be infallible?
     
  12. Picao84

    Veteran Regular

    Joined:
    Feb 15, 2010
    Messages:
    1,555
    Likes Received:
    699
    I'm not saying it is infallible.
    But, contrary to you it seems, I have experience with it, while you are just bashing it based on what you see on the internet. And even if it were AMD technology it would still be prone to fail, as it concerns "choice". So quit the nVIDIA bashing, which is your objective here:

    Or did you say ANYTHING about the topic at hand, namely the apparent rebranding by AMD of Evergreen chips? :roll:
     
  13. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    Personally, I don't see great value in Optimus. My ATI based notebook turns the discrete GPU on when plugged in and uses the integrated one when on battery power. That's exactly how I want it. I don't want to have to worry about automatic switching software goofing up at some point and killing battery life without me noticing.

    Of course, manual override is just a button press away, but battery life is probably halved when gaming with the discrete GPU, so it's not often that I want to do that.
     
  14. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,015
    Likes Received:
    112
    Well, apart from Optimus being more flexible (it should certainly be possible to implement the same scheme with Optimus, i.e. use the IGP on battery and the discrete GPU otherwise), I think the major reason OEMs like it is very simple: cost. The switching scheme requires external mux chips for switching the display outputs, and on top of that this stuff doesn't really seem to be standardized, hence requiring the OEMs to do some driver work. Optimus is much simpler from that point of view: no mux required, and the software needed is the same for each device, hence easily incorporated into the standard nvidia driver.
    Personally, though, I'm not really a friend of either scheme ;-). Since I'm not using Windows, switchable graphics is a whole lot more useful than Optimus, even if not optimal :).
     
  15. Berek

    Regular

    Joined:
    Oct 17, 2004
    Messages:
    271
    Likes Received:
    4
    Location:
    Houston, TX
  16. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,496
    Likes Received:
    910
    Honestly, with the bus sizes/memory capacities, XX% faster than YY and everything… this sounds made-up.
     
  17. TKK

    TKK
    Newcomer

    Joined:
    Jan 12, 2010
    Messages:
    148
    Likes Received:
    0
    Could simply be AMD's performance targets, though.

    But I agree that 192-bit and 3 GB sound strange. If we were talking about Nvidia it would be more plausible, but AMD hasn't used anything other than 64/128/256/512-bit interfaces so far.
     
  18. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,457
    Likes Received:
    580
    Location:
    WI, USA
    It sounds like they had a slide with some numbers showing 30% differences between the three segments.

    Just another process node. I'm sure we will again have a midrange GPU given the same model number as a completely different, bigger chip on the desktop. I'd guess that it will bring the 6950 performance level to notebooks; Juniper brought notebooks to about desktop 4850 level. Unless memory clocks go up quite a bit more, though, it's going to have a significant bandwidth deficit compared to the current desktop parts. ATI doesn't seem to want to use a 256-bit bus in notebooks anymore.
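    To put the deficit in rough numbers: peak memory bandwidth is just bus width times effective data rate. A small sketch (the clocks are illustrative assumptions, not announced specs):

        #include <cstdio>

        // Peak bandwidth in GB/s: (bus width in bits / 8) bytes per
        // transfer, times the effective data rate in GT/s.
        double peak_gb_s(int bus_bits, double data_rate_gt_s) {
            return bus_bits / 8.0 * data_rate_gt_s;
        }

        int main() {
            // Desktop GDDR5 tends to clock well above what notebook
            // power budgets allow, so the gap compounds.
            std::printf("256-bit @ 5.0 GT/s (desktop): %.1f GB/s\n",
                        peak_gb_s(256, 5.0));   // 160.0 GB/s
            std::printf("128-bit @ 3.6 GT/s (mobile) : %.1f GB/s\n",
                        peak_gb_s(128, 3.6));   //  57.6 GB/s
            return 0;
        }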
     
  19. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy
    Hmmm... if Blackcomb is a Barts derivative, it will probably use a 256-bit bus. And I think it will be, as any other solution (a Turks or Caicos derivative) would almost surely be a step backward with respect to the mobile version of Juniper.
     
  20. Mobius1aic

    Mobius1aic Quo vadis?
    Veteran

    Joined:
    Oct 30, 2007
    Messages:
    1,649
    Likes Received:
    244
    6400M - 160 SPs/8 TMUs/16:4 ROPs - interesting, but Jesus, it's on a 64-bit bus... (though available with DDR3 or GDDR5).

    It's a performance gap AMD should've filled a while back in the 3xxx series, or at least it should have had a 160 SP part as the lowest-end desktop part when the 5xxx series arrived. I'm very interested to see how it does in its GDDR5 form.
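    (Rough arithmetic with assumed clocks, using peak bandwidth = bus width / 8 x effective data rate: a 64-bit bus gives about 64/8 x 1.8 GT/s = 14.4 GB/s with DDR3, versus 64/8 x 3.2 GT/s = 25.6 GB/s with GDDR5, nearly double.)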

    And yes, I wouldn't have bothered with the 6300M, but I guess AMD still has excess Cedars to get rid of.

    And yay for the return of 256-bit memory buses to AMD's high-end mobile graphics, but only on the 6900M :(
     