AMD Radeon HD 6000M Series Laptop GPUs Launched

That's the point: a quick rebrand is cheap to do, and not all laptops sold will have SBs on board.


104mm² vs. 130mm²? With the HD5570 @ 39W being faster than a GT430 @ 49W? At least it idles 3W or so lower... ;)
Idle power is hugely important for mobile. I don't think, though, that there will be a 3W difference for the mobile parts - but GF108 supports Optimus, which could be just what the OEMs want (yes, I'm aware that's pretty much a difference in software only, but the OEMs won't care whether it's hw or sw...).
It's true that for desktop graphics cards the HD5570 walks all over the GT430 while drawing less power under load, though I'm not sure the situation is the same for the mobile parts (it could be closer there depending on clocks - something that's hard to judge given those parts can ship with vastly different specs).
 
Optimus has a pretty terrible name with online gamers who play anything that snaps into the PunkBuster framework. Even outside of PB, it still has mounds of problems with app detection on BOTH sides of the fence -- i.e. not turning on when it should, or turning on when it shouldn't. There are dozens of forums with many hundreds of complaints about Optimus and its inability to get things working correctly. Just type "Optimus issues" into Google and you'll find plenty of very recent examples.

Given the options, I'm far happier to "flip the switch" to get the 3D performance I need when I want it, versus hoping that NV's driver can figure it out for me with no other way to tell it otherwise. That was one of the fundamental decision points in my purchase of a Lenovo Y460 earlier this year...
 
Optimus has a pretty terrible name with online gamers who play anything that snaps into the PunkBuster framework. Even outside of PB, it still has mounds of problems with app detection on BOTH sides of the fence -- i.e. not turning on when it should, or turning on when it shouldn't. There are dozens of forums with many hundreds of complaints about Optimus and its inability to get things working correctly. Just type "Optimus issues" into Google and you'll find plenty of very recent examples.

Given the options, I'm far happier to "flip the switch" to get the 3D performance I need when I want it, versus hoping that NV's driver can figure it out for me with no other way to tell it otherwise. That was one of the fundamental decision points in my purchase of a Lenovo Y460 earlier this year...

I have an Optimus laptop, and a lot of what you said is pretty much BS. You can configure which apps should launch the discrete GPU whenever you want. You have full control of it. As for telling whether the GPU is on or not, I have two ways of knowing:

1 - A light in my laptop changes color (Blue - Intel HD; White - GeForce).
2 - A taskbar icon where I can see which applications are using the GPU, if any.

I have never had any problems with it, except one time when Intel updated its Intel HD drivers, but that was corrected by nVIDIA in no time.
 
I don't really see the point of the Radeon 6300M. Sandy Bridge's graphics will be on par with it, so what would be the point of using it given that it's a lot more power hungry?
- on par only in raw performance, not in [gaming] image quality or driver stability.

- it will take quite some time until notebook vendors finish the transition from Clarkdale to SB dual-cores, and most Llano notebooks probably won't see the light of day before Q4 '11. Such transitions don't happen overnight, so there's enough time left to sell those 6300Ms.


GZ2007 said:
I think the name change from 5600M to 6600M is not actually that bad. They are quite close to the desktop cards. The 6800M and 6700M will be much more misleading.
Possibly. We don't know yet which chips will use what number, but I have a hunch it might look like this...

6900M = Barts/Blackcomb
6800M = Juniper/Granville rename
6700M = Turks/Whistler GDDR5
6600M = Turks/Whistler DDR3, *maybe* Redwood/Capilano GDDR5
6500M = Redwood/Capilano rename
6400M = Caicos/Seymour
6300M = Cedar/Robson rename
 
I have an Optimus laptop, and a lot of what you said is pretty much BS. You can configure which apps should launch the discrete GPU whenever you want. You have full control of it. As for telling whether the GPU is on or not, I have two ways of knowing:

1 - A light in my laptop changes color (Blue - Intel HD; White - GeForce).
2 - A taskbar icon where I can see which applications are using the GPU, if any.

I have never had any problems with it, except one time when Intel updated its Intel HD drivers, but that was corrected by nVIDIA in no time.

Say what you want, but it's still very much a problem. Perhaps not for you, but the PunkBuster bug is STILL there even after a year of the technology being available. There are still issues with Adobe Flash, and there are still issues with adding games but losing the "Apply" button to make the change in the manual config of the driver.

And I still have no way to turn the damned thing off if I don't want the NV card running (i.e. I'm on a flight and don't mind using the onboard graphics for something like Minecraft or Fractal).
 
And I still have no way to turn the damned thing off if I don't want the NV card running (i.e. I'm on a flight and don't mind using the onboard graphics for something like Minecraft or Fractal).

What? LOL!
I do that all the time when I'm on battery.
Just run the browser with the integrated GPU (right-click the browser icon -> Run with graphics processor -> Intel HD; this has to be enabled via the nVIDIA taskbar icon -> show context menu options).
If it's Flash, just go to the Flash settings and disable hardware acceleration.
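
For what it's worth, there's also an application-side way to steer this: NVIDIA documents an exported NvOptimusEnablement variable that asks for the discrete GPU at process start (none of the posters mention it, and whether the drivers of that era honored it is another question). A minimal sketch, assuming a Windows/MSVC build:

Code:
#include <windows.h>
#include <cstdio>

// Exported global that the NVIDIA Optimus driver reads when the process
// starts; 0x00000001 requests the high-performance (discrete) GPU.
// Per-app profiles and the context-menu choice can still override this,
// so treat it as a hint, not a hard switch.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

int main()
{
    std::puts("Started - the GPU activity icon should now list this process on the GeForce.");
    return 0;
}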
 
Dunno, I didn't make the mistake of purchasing an NV-based video solution on my laptop, so mine works exactly as I expect. But it's a very common request on a rather large pile of forums, even as recently as this month.

Guess it's not as easy as you'd suggest it is?
 
Dunno, I didn't make the mistake of purchasing an NV-based video solution on my laptop, so mine works exactly as I expect. But it's a very common request on a rather large pile of forums, even as recently as this month.

Say what? :rolleyes:
You just said:

And I still have no way to turn the damned thing off if I don't want the NV card running (i.e. I'm on a flight and don't mind using the onboard graphics for something like Minecraft or Fractal).

Guess it's not as easy as you'd suggest it is?

What? Do you want it to read your thoughts? :p
Of course, if you don't want it to use the discrete GPU for an application where it normally would, you have to configure it not to.

All I see here is bashing of Optimus just because it's nVIDIA, sorry.
 
Huh?
I've also seen a lot of trouble with it. If only it were possible to turn it off in the BIOS or something (I think some laptops allow this) - I wouldn't mind if those laptops always ran on the nvidia GPU. But no, we have to rely on the automatic software working, which isn't always the case. Good for you that you haven't run into the problems, but don't call BS just because of that.
 
What? Do you want it to read your thoughts? :p
Of course, if you don't want it to use the discrete GPU for an application where it normally would, you have to configure it not to.

I want the product to do what I tell it to. If I want the graphics card ON, then it should turn ON. If I want it OFF, then it should turn OFF. Your tongue-in-cheek response is that the software can figure it out -- reality (and a whole lot of people on a big pile of forums) suggests that it isn't always as easy as you say it is.

What part are you having a hard time understanding? That it doesn't always work the way you expect it to? Or that somehow, in some way, NVIDIA may not be infallible?
 
I want the product to do what I tell it to. If I want the graphics card ON, then it should turn ON. If I want it OFF, then it should turn OFF. Your tongue-in-cheek response is that the software can figure it out -- reality (and a whole lot of people on a big pile of forums) suggests that it isn't always as easy as you say it is.

What part are you having a hard time understanding? That it doesn't always work the way you expect it to? Or that somehow, in some way, NVIDIA may not be infallible?

I'm not saying it can't fail.
But, unlike you it seems, I have experience with it, while you are just bashing it based on what you see on the internet. And even if it were AMD technology it would still be prone to failure, as it comes down to "choice". So quit the nVIDIA bashing, which is your objective here:

Dunno, I didn't make the mistake of purchasing an NV-based video solution on my laptop

Or did you say ANYTHING about the topic at hand, namely the apparent rebranding by AMD of Evergreen chips? :rolleyes:
 
Personally, I don't see great value in Optimus. My ATI-based notebook turns the discrete GPU on when plugged in and uses the integrated one when on battery power. That's exactly how I want it. I don't want to have to worry about automatic switching software goofing up at some point and killing battery life without me noticing.

Of course, manual override is just a button press away, but battery life is probably halved when gaming with the discrete GPU, so it's not often that I want to do that.
 
Well, apart from Optimus being more flexible (it should certainly be possible to implement the same scheme with Optimus, i.e. use the IGP on battery and the discrete GPU otherwise), I think the major reason OEMs like it is very simple: cost. The switching scheme requires external mux chips for routing the display outputs, and on top of that this stuff doesn't really seem to be standardized, hence requiring the OEMs to do some driver work. Optimus is much simpler from that point of view: no mux required, and the software needed is the same for each device, hence easily incorporated into the standard nvidia driver.
Personally, though, I'm no friend of either scheme really ;-). Not using Windows, though, switchable graphics is a whole lot more useful than Optimus, even if not optimal :).
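
If anyone wants to see the "no mux" part for themselves, here's a rough sketch of my own (not from any of the posts above): on Windows, simply enumerating the DXGI adapters on a typical Optimus notebook should show both the Intel IGP and the GeForce, since the discrete GPU renders while the IGP still owns the display outputs.

Code:
#include <dxgi.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "dxgi.lib")

int main()
{
    // List every GPU the OS exposes; on a muxless Optimus machine this
    // should include both the integrated and the discrete adapter.
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        std::wprintf(L"Adapter %u: %ls (%u MB dedicated VRAM)\n",
                     i, desc.Description,
                     (unsigned)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}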
 
Honestly, with the bus sizes/memory capacities, XX% faster than YY and everything… this sounds made-up.
Could simply be AMD's performance targets, though.

But I agree that 192-bit and 3 GB sounds strange. If we were talking about Nvidia it would be more likely, but AMD hasn't had anything other than 64/128/256/512-bit interfaces so far.
 
It sounds like they had a slide with some numbers showing 30% differences between three segments.

Just another process node. I'm sure we will again have a midrange GPU called the same model number as a completely different, bigger chip on the desktop. I'd guess that it will bring 6950-level performance to notebooks. Juniper brought notebooks to about desktop 4850 level. Unless memory clocks go up quite a bit more, though, it's going to have a significant bandwidth deficit compared to the current desktop parts. ATI doesn't seem to want to use a 256-bit bus in notebooks anymore.
 
It sounds like they had a slide with some numbers showing 30% differences between three segments.

Just another process node. I'm sure we will again have a midrange GPU called the same model number as a completely different, bigger chip on the desktop. I'd guess that it will bring 6950-level performance to notebooks. Juniper brought notebooks to about desktop 4850 level. Unless memory clocks go up quite a bit more, though, it's going to have a significant bandwidth deficit compared to the current desktop parts. ATI doesn't seem to want to use a 256-bit bus in notebooks anymore.

Hmmm... if Blackcomb is a Barts derivative, it will probably use a 256-bit bus. And I think it will be, as any other solution (a Turks or Caicos derivative) would almost surely be a step backward with respect to the mobile version of Juniper.
 
6400M - 160 SPs/8 TMUs/16:4 ROPs - interesting, but Jesus, it's on a 64-bit bus......... (but available with DDR3 or GDDR5).

It's a performance gap AMD should've filled a while back in the 3xxx series, or at least they should have had a 160 SP part as the lowest-end desktop part when the 5xxx series arrived. I'm very interested to see how it does in its GDDR5 form.

And yes, I wouldn't have bothered with the 6300M, but I guess AMD still has excess Cedars to get rid of.

And yay for the return of 256-bit memory buses to AMD's high-end mobile graphics, but only on the 6900M :(
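
A quick back-of-the-envelope on why the GDDR5 variant of the 6400M is the interesting one (the per-pin rates below are my own assumptions, not announced specs):

Code:
#include <cstdio>

// GB/s = (bus width in bits * effective per-pin data rate in Gbps) / 8
static double bandwidth_gb_s(int bus_bits, double gbps_per_pin)
{
    return bus_bits * gbps_per_pin / 8.0;
}

int main()
{
    // Assumed per-pin rates: DDR3 ~1.8 Gbps, GDDR5 ~3.2 Gbps (guesses).
    std::printf("64-bit DDR3 : %.1f GB/s\n", bandwidth_gb_s(64, 1.8)); // ~14.4
    std::printf("64-bit GDDR5: %.1f GB/s\n", bandwidth_gb_s(64, 3.2)); // ~25.6
    return 0;
}

Even with GDDR5, a 64-bit bus is of course still only half of what the same memory would deliver on the 128-bit parts above it.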
 