What is this: GeForce Go 7200

Per B

Newcomer
In my fairly new HP Pavilion 2400 I have GeForce Go 7200 graphics. What exactly is this hardware like? Does it have any dedicated memory? I did find some details on Nvidia's site, but the memory type column contained just a dash.

I was hoping to be able to play some games, but lost hope after installing 3DMark06 and getting some really sucky results (watching it run, it looked like less than 1 fps). However, I just downloaded the HL2 demo, and that actually seems playable even at 1024x768 (there's no option for the native resolution of 1280x800).

Per
 
Thanks for the info.

As the HL2 demo seems to work, I thought I'd give CS: Source a try, but it says I need to update my drivers. I can't find the correct drivers on Nvidia's site, though. It lists drivers for the GeForce Go 7800/7900 separately, and then there's the 7 series: the series 7 ones didn't detect my hardware, and the Go 7800/7900 ones aren't available for Vista (in addition to not being for the 7200). Help!?
 
Despite the warning that I should get new drivers, it did work with the drivers I already had. Got around 11 fps at 1024x768 with default settings (which are high details on both geometry and textures), so I had to lower the resolution to 640x480. But I'm pretty impressed it works at all: on my old laptop with integrated Intel graphics, none of the games I tried to start worked, not even The Sims 2. Good work Nvidia! :)
 
Dude, lower the texture settings and watch your FPS skyrocket. Then turn the res back up a notch or two so it doesn't look like utter shyte.
 
I just tried disabling only MSAA (2x was the default) while keeping the other settings on high. I get 44 fps on the stress test, and the game is absolutely playable. Not bad for a chip with a 32-bit memory interface! :)
 
That works too! Can't believe AA would be enabled by default on such a GPU in any game with an auto-detect feature though, unless said auto-detect feature just enables AA by default for low resolutions on a certain generation (e.g. 1024x768 and lower all get 2xAA on GF7-class GPUs).
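To see why 2xAA hurts so much on a 32-bit memory bus, here's a rough back-of-envelope sketch (the model and numbers are my own simplification, not from the thread): per-frame color and depth write traffic roughly doubles with 2x MSAA, and on a narrow bus that bandwidth is precious.

```python
# Rough estimate of per-frame framebuffer write traffic at 1024x768.
# Assumes one 32-bit color and one 32-bit depth sample per MSAA sample;
# ignores texture reads, overdraw, blending and compression.
def frame_traffic_mb(width, height, bytes_per_pixel=4, msaa=1):
    """Color + depth write traffic for one frame, in MiB."""
    samples = width * height * msaa
    color = samples * bytes_per_pixel
    depth = samples * 4  # 32-bit depth buffer assumed
    return (color + depth) / (1024 * 1024)

no_aa = frame_traffic_mb(1024, 768)            # 6.0 MiB per frame
with_2x = frame_traffic_mb(1024, 768, msaa=2)  # 12.0 MiB per frame
print(f"no AA: {no_aa:.1f} MiB/frame, 2xAA: {with_2x:.1f} MiB/frame")
```

At 44 fps, even the no-AA case already adds up to hundreds of MiB/s of write traffic alone, which is why turning MSAA off buys so much on a bandwidth-starved chip.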
 
Years ago Number Nine released a 128-bit chip. What was the 128-bit referring to?
PS: I'm guessing the Go 7200 is about as fast as a GeForce4 Ti 4200, so you should be OK with most games released in or before 2004.
 
Apparently the #9 Imagine 128 had a 128-bit memory bus. Remember, though, that the Imagine 128 was quite a high-end product back then and was expensive. RAM was also very slow compared with what we have today, so the bus had to be wide to get decent speed.

from a google groups search:
The OptiColor 128 and Imagine 128 for Power Mac are the only true 128-bit graphics accelerators on the market today, offering 128-bit capabilities in each of the three major sub-systems -- the Imagine's 128-bit graphics engine, its internal processor data path, as well as a 128-bit data bus between the graphics processor and on-board graphics memory. Data is moved across the optimized 128-bit memory bus to high-speed VRAM at sustained drawing bandwidths in excess of 500 megabytes/second, enabling RasterOps OptiColor 128 to simultaneously process sixteen 8-bit, eight 16-bit or four 32-bit pixels in a single instruction. The extra-high bandwidth ensures that this speed remains nearly constant across all color modes, producing no noticeable performance loss when color depth is increased to 16-bit (65,000 colors), or even 32-bit (16.8 million colors).

The "RasterOps OptiColor" is apparently a re-labeled Imagine 128. The post was about a cooperation between the two companies.
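The pixels-per-instruction figures in that press release follow directly from the bus width; a quick sanity check (my own arithmetic, not from the post):

```python
# A 128-bit memory bus moves 128 bits per transaction, so pixel
# throughput per transfer is just bus width divided by color depth:
# sixteen 8-bit, eight 16-bit, or four 32-bit pixels.
BUS_WIDTH_BITS = 128

for depth_bits in (8, 16, 32):
    pixels_per_transfer = BUS_WIDTH_BITS // depth_bits
    print(f"{depth_bits}-bit color: {pixels_per_transfer} pixels per transfer")
```

Which matches the claimed 16/8/4, and also explains why performance stayed nearly constant across color depths: the bus moves the same 128 bits either way.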
 
16-bit mode should be useful too; I remember 800x600 in 16-bit with 2x AA being a good compromise on a GeForce 6100 sometimes. (And some amount of AF is worth it.)
 
There are no settings for 16-bit color. I realized I CAN run at the native resolution of 1280x800 though; I needed to select 16:10 first. I got 36 fps in the stress test then. I guess it would be even better if I lowered the texture details.
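Those two stress-test numbers line up fairly well with simple fill-rate scaling; a quick check (assuming, which may well be wrong, that the game is roughly fill-rate bound):

```python
# Scale the measured 44 fps at 1024x768 by the pixel-count ratio
# to predict the frame rate at the native 1280x800.
fps_1024 = 44
pixels_1024 = 1024 * 768   # 786,432 pixels
pixels_1280 = 1280 * 800   # 1,024,000 pixels

predicted = fps_1024 * pixels_1024 / pixels_1280
print(f"predicted: {predicted:.1f} fps, measured: 36 fps")
```

The prediction comes out just under 34 fps, in the same ballpark as the measured 36, so the chip seems mostly pixel-throughput limited at these settings.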

My point with this thread is that you don't need anything better than a Geforce Go 7200! :p

Per
 
HL2 isn't exactly a GPU-intensive game nowadays ;) Throw F.E.A.R., Oblivion, DiRT or one of a whole host of other modern games on there and then tell us that you don't need a faster GPU :p
 