GF4 4200Go

Quite amusing.

I heard from ATI quite a while ago that Nvidia would have no mobile part to compete with the M9, and that what we'd probably see would be Nvidia trying to stuff their desktop graphics chip into laptops. That's exactly what we got... Nvidia hasn't managed to put anything worth spit on the market for the last half of this year or so.

I wouldn't expect to see the GF4go-4200 in anything but a few "desknote" systems that don't have batteries, and in one or two models from manufacturers trying to keep good relations with Nvidia.

Nvidia got away with a lot in that review by not letting Anand have time to test how it really works... how its power saving works, what battery life is like, etc. The article shows impressive performance numbers for Nvidia vs. the M9 in many cases, but that doesn't mean squat in 99% of the mobile market. Implementing the NV28M will not only kill power consumption; it sounds like it'll also be a lot more expensive to implement due to its lack of TV-out and LVDS support.

Nvidia needed to put something on the table, and they did... they'll sell a few in the niche that the NV28M fits, but it seems to me this is more for Nvidia to at least have a part on the market than to contribute much to Nvidia's bottom line...
 
Ichneumon said:
Quite amusing.

I heard from ATI quite a while ago that Nvidia would have no mobile part to compete with the M9, and that what we'd probably see would be Nvidia trying to stuff their desktop graphics chip into laptops. That's exactly what we got... Nvidia hasn't managed to put anything worth spit on the market for the last half of this year or so.

I wouldn't expect to see the GF4go-4200 in anything but a few "desknote" systems that don't have batteries, and in one or two models from manufacturers trying to keep good relations with Nvidia.

Nvidia got away with a lot in that review by not letting Anand have time to test how it really works... how its power saving works, what battery life is like, etc. The article shows impressive performance numbers for Nvidia vs. the M9 in many cases, but that doesn't mean squat in 99% of the mobile market. Implementing the NV28M will not only kill power consumption; it sounds like it'll also be a lot more expensive to implement due to its lack of TV-out and LVDS support.

Nvidia needed to put something on the table, and they did... they'll sell a few in the niche that the NV28M fits, but it seems to me this is more for Nvidia to at least have a part on the market than to contribute much to Nvidia's bottom line...

That's a great summary, Ichy, and I wholeheartedly agree with it.
 
IMHO the important parts are here :

The lack of integrated display hardware is not the only stumbling block in the GeForce4 4200 Go's way. As a desktop chip, emphasis on designing the NV28 did not lie in reducing heat and power consumption. Power saving hardware, such as PowerMizer, has the ability to reduce the bottom line power consumption of a chip but the peak power characteristics of a mobile variant remain fairly similar to that of the desktop one. This results in very high peak power consumption and thermal characteristics in the NV28M under situations when the chip is stressed, such as when rendering a 3D scene.

NVIDIA would not disclose the maximum power draw of the NV28M but they did mention that it was high. They were able to get the voltage down to 1.25 volts when at the maximum performance setting (the same as the Mobility Radeon 9000) but apparently the power draw of the chip is still desktop-like.

Desktop power usage with 3D running, in other words. So you can use your laptop in 2D when unplugged, and only when plugged in can you play 3D games, and then only if you can manage to keep the laptop cool enough :rolleyes:
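The quoted point about voltage can be made concrete with the standard CMOS dynamic-power relation, P ≈ α·C·V²·f: dynamic power falls only with the square of voltage, so dropping from a typical desktop core voltage of around 1.5 V to the reported 1.25 V trims power by roughly 30%, leaving a desktop-sized chip still drawing desktop-like wattage. A minimal sketch (the capacitance, clock speed, and 1.5 V baseline are illustrative assumptions, not NV28M specifications):

```python
# Sketch: why lowering core voltage alone leaves a desktop-derived chip's
# power draw "desktop-like". Dynamic CMOS switching power scales roughly
# as P ~ alpha * C * V^2 * f. All numbers are illustrative assumptions.

def dynamic_power(c_eff_farads: float, voltage: float, freq_hz: float,
                  activity: float = 0.5) -> float:
    """Approximate dynamic switching power of a CMOS chip, in watts."""
    return activity * c_eff_farads * voltage**2 * freq_hz

# Hypothetical desktop operating point (assumed values).
desktop = dynamic_power(c_eff_farads=8e-8, voltage=1.5, freq_hz=250e6)

# Same die at the reported mobile voltage of 1.25 V, same clock.
mobile = dynamic_power(c_eff_farads=8e-8, voltage=1.25, freq_hz=250e6)

print(f"desktop-like point: {desktop:.1f} W")   # 22.5 W
print(f"at 1.25 V:          {mobile:.1f} W")    # 15.6 W
print(f"reduction:          {1 - mobile/desktop:.0%}")  # 31%
```

The quadratic dependence is the whole story here: a 17% voltage drop yields only about a 31% power saving, which is why the article notes that the peak power characteristics of a mobile variant "remain fairly similar to that of the desktop one" unless the chip was designed for low power from the start.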
 
Nvidia got away with a lot in that review by not letting Anand have time to test how it really works... how its power saving works, what battery life is like, etc. The article shows impressive performance numbers for Nvidia vs. the M9 in many cases, but that doesn't mean squat in 99% of the mobile market. Implementing the NV28M will not only kill power consumption; it sounds like it'll also be a lot more expensive to implement due to its lack of TV-out and LVDS support.

Notice again how Anand goes out of his way to showcase the speed aspect, without even an effort at an honest comparison of the technologies. 99% of hardware sites are nothing but mercenary sites, or worse.
 
Well, I don't know about you guys, but 90% of the time I use my laptop on the power adapter. It's more the heat that I'm afraid of.

Otherwise, I don't know how the PC builders have responded to this new chip. Do you? :-?
 
Sure it's a stopgap chip with a limited niche market. A face-saving effort. But ATI had better watch out - with Nvidia first on 0.13 micron, they would appear to have the lead on producing a high-performance low-power DX9 mobile chip. Or a very fast low-power DX8 chip.

The mobile arena is the place where 0.13 micron can really pay off, and Nvidia knows this. If ATI suffers anywhere near the delays that Nvidia did moving to 0.13, then Nvidia may completely own the mobile GPU market in 9 months.
 
Hellbinder[CE] said:
Notice again how Anand goes out of his way to showcase the speed aspect, without even an effort at an honest comparison of the technologies. 99% of hardware sites are nothing but mercenary sites, or worse.

Here we go again with the imaginary Anand bias. :rolleyes: Did I read the same article as you? The only reason you know of any technology deficiencies with the Nvidia part is that Anand pointed them out to you. He repeatedly slammed the Nvidia part for:
-- anticipated heat problems
-- anticipated power consumption problems
-- anticipated cost issues due to lack of LVDS and TV-out
-- Nvidia's reluctance to disclose power or heat ratings
-- paper launch: no current availability

What did he miss? I finished reading the article with the clear impression that this product has so many limitations that we won't be seeing much of it. But it's obvious that for a niche market (e.g. travelling presenters that demonstrate 3D graphics, or gamers that need a portable solution but typically play plugged in to the wall), this is the mobile GPU to have. Not a biased conclusion, just the obvious one.

Not to mention Anand went OUT OF HIS WAY to plug ATI in an Nvidia product review! "ATI is not ready to roll over and let NVIDIA steal the performance crown from them one more time. There are certainly forthcoming mobile ATI products on the horizon that are set to not only perform on par with the GeForce4 4200 Go but offer better power management and thermal characteristics." (I emphasize this because the Anand conspiracy theorists eagerly pointed out that Anand went "out of his way" to mention Nvidia in a recent ATI review.)

Yet somehow Anand was Nvidia-biased and "dishonest" in his review? I guess some people will always just see what they want to see, rather than what is really there.
 
SteveG said:
Sure it's a stopgap chip with a limited niche market. A face-saving effort. But ATI had better watch out - with Nvidia first on 0.13 micron, they would appear to have the lead on producing a high-performance low-power DX9 mobile chip. Or a very fast low-power DX8 chip.

The mobile arena is the place where 0.13 micron can really pay off, and Nvidia knows this. If ATI suffers anywhere near the delays that Nvidia did moving to 0.13, then Nvidia may completely own the mobile GPU market in 9 months.

I agree with your last point. Next year should prove interesting in this arena! When ATI moves to a .13 micron process, though, the transition shouldn't be as severe as Nvidia's was, since TSMC retains whatever experience it gains from the process and applies it for later customers.
 
Yes, but who knows how long ATI has been playing with .13 micron. They may have been prepping the R350 at .13 while finishing the R300 at .15. Hell, they could be ahead of Nvidia for all we know.
 
Brent said:
That's a great summary, Ichy, and I wholeheartedly agree with it.

Likewise. And just to add a little spice: maybe it's just a plan to clear a few more Ti4200s out of the inventory @ nVidia :)

Ultimately it seems like an entirely flawed product (although without a review describing the power-saving features, or lack thereof, that may be jumping the gun a bit).

Edit: I should also say that, while I have had 'issues' with some of Anand's conclusions about nVidia and ATi products, I thought this review was well written and quite even-handed.

LW.
 
I really like the new chip, because a lot of people (like me) have a laptop so that they can take it wherever they're going... I don't use it in the car while I'm driving; I use it in the hotel room when I get there, and thus a power outlet is available. Now, if you are on a bus (igg) or at an airport / in a plane, then the situation may be different, but just how many people who spend a lot of time in suits, flying on 8+ hour plane trips, are going to want an NV28? I think that it is perfect for its intended market: portable gaming :) ...... and who wouldn't mind being able to fit your LAN rig into a briefcase-size bag?
 
Sage said:
I really like the new chip, because a lot of people (like me) have a laptop so that they can take it wherever they're going... I don't use it in the car while I'm driving; I use it in the hotel room when I get there, and thus a power outlet is available. Now, if you are on a bus (igg) or at an airport / in a plane, then the situation may be different, but just how many people who spend a lot of time in suits, flying on 8+ hour plane trips, are going to want an NV28? I think that it is perfect for its intended market: portable gaming :) ...... and who wouldn't mind being able to fit your LAN rig into a briefcase-size bag?

Agreed, it fits that niche fine... however, do you have any idea what a minute portion of the mobile market that niche is? To top it off, Nvidia has completely missed the window of system building for the holiday season, which is why we see the M9 in 20+ system models built by different companies right now...

The NV28M isn't a completely useless product, but it isn't going to make Nvidia any money, nor is it going to get many OEM wins, simply because it likely costs a lot more than the M9 to implement and doesn't have anything compelling feature-wise to compensate for that cost difference (and even its performance isn't *that* much of a crusher of the M9 across the board, either)... It isn't going to gain them any market share, which is what ATI is eating up this generation in a mobile market it already completely dominates...

It's face-saving for Nvidia so they have *something* on the market this generation, but that's about it. It isn't going to gain Nvidia much of anything.

It's a completely missed generation for Nvidia...

I too am looking forward to the spring/summer, when we start hearing about the M10 from ATI and the NV30(ish)-based Nvidia mobile part, as that will be a much more interesting generation in the mobile marketplace... this generation belongs completely and utterly to ATI and the M9, and I expect their market share in this area will reflect that come next year's numbers.

To top it off, next year, while the M10 and NV30M (or 31M or whatever it'll be) are competing at the high end for laptops, the M9 will still be a great buy for OEMs for lower-end systems, so ATI keeps on winning because of Nvidia's missed generation this time around.

Ok... I've blathered enough... I think I've gotten my point across. :)
 
Your typical suit with a laptop doesn't need 3D, so it is irrelevant. Next time you get on an airplane, count the number of people playing 3D games, or games of ANY sort besides Solitaire or Minesweeper, on their laptop.

Gamers don't buy notebooks to play games on them UNLESS they are docked with a real keyboard and mouse. If you want something to do on an 8-hour flight, buy a Gameboy Advance.

Most of the laptops I've seen can't even run 8 hours on battery with Windows NOTEPAD as the only application.

The only time you're going to be exercising any of the 3D circuits on your mobile GPU is when it is docked.


When ATI/Nvidia produces a killer 3D chip that runs in an iPAQ or Game Boy form factor, then maybe we can talk about the mobile games market. Today, the reality is, mobile games == Java applets on your cellphone or Game Boy.

Mobile, power-saving 3D performance is irrelevant today.
 
Ichneumon said:
Implementing the NV28M will not only kill power consumption; it sounds like it'll also be a lot more expensive to implement due to its lack of TV-out and LVDS support.

Read this again. Regardless of the assumed usage pattern, this affects all designs with this chip. Also, all designs must allow for 3D use; it is not acceptable if your design overheats when used to play a game. Which means that not only do you have to design in a hefty cooling solution for the GPU itself, but the overall design of the notebook has to accommodate the higher total power dissipation. That means noise and size, and nobody likes that pair in a portable.

Entropy

PS DemoCoder, why the %&£$!! would anyone in their right mind who is not interested in 3D gaming want this chip at all, as it's a liability in all other cases?
 