Is it time for disposable gaming PCs?

rwolf

With graphics cards costing as much as complete systems, isn't it time to create a disposable PC where the processor, GPU, memory, and chipset are all surface-mounted on the motherboard?

Look at the 6800 and X800: you need a top-of-the-line CPU to take advantage of these graphics cards.

Every time I upgrade, I need a better power supply, a better case for cooling, or a new motherboard to support a new bus standard, a faster processor, or a new memory standard.

Why not upgrade everything in one shot and have one set of drivers? I'm talking about the high end here, not garbage integrated graphics chipsets.

When I go to most big-box stores like Best Buy, all the PCs are low-end integrated-graphics boxes.

It's so painful building a best-of-breed gaming system when there are so many products out there: motherboards, CPUs, memory (latency, speed, implementation), etc.

I have to research and custom-order all the parts, or take my chances with whatever my local computer store has in stock. You can go to companies like Dell and buy canned systems, but usually they are not quite top end, especially regarding graphics cards and gaming. Another problem is that you never get to see or use the system you are buying until the deal is done. Maybe for some of you the computer stores in large urban centers are better than mine, but I have found the experience to be painful.

Why not put the GPU on the motherboard and create an integrated high-end system?

What about upgrading and expandability? By the time it's time to upgrade, technology has evolved enough that you have to buy a whole new system anyway, so why bother? There are also so many plug-and-play interfaces now that you really don't need slots in the computer anymore.

I know someone is going to say, "Why don't you just buy an Xbox?" However, I still use my PC for much more than gaming, as do most people.

So what do people think? Is it time for high-end disposable gaming PCs?
 
I think you are just caught up in all the high-end video card reviews and have fallen victim to the upgrade bug :D. I always have to fight the urge to upgrade - upgrade needlessly, that is. And to be honest, one of the things I've discovered to be a myth is this:

Look at the 6800 and X800: you need a top-of-the-line CPU to take advantage of these graphics cards.

From the perspective of getting the most performance out of the video card and posting the highest FPS possible in a benchmark, then yeah - your CPU needs to be as fast as possible to reach those peak-performance situations. However, when reviewers say, "The bottleneck is the CPU in this situation; a faster CPU would do better," I find it a little misleading.

The first thing I ask myself when building or upgrading my PC is this: "What component in my system is preventing my games from running at a consistent 60fps?" The answer is almost always the video card - the video card is the limiting factor in reaching a consistent 60fps. Which brings me back to the CPU 'bottleneck' issue: if the CPU really is the bottleneck in a game's performance, but it's "bottlenecking" at, say, 89 frames per second, why do I need to upgrade it? Look at all the CPU-limited game benchmarks out there - where are they capping out? If they are capping out at over 60fps, then, IMO, you don't need to upgrade that CPU.
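That 60fps rule boils down to a couple of lines of arithmetic. A toy sketch in Python (the per-component fps ceilings are invented numbers for illustration, not benchmarks):

```python
# Toy model: a game's effective frame rate is capped by the slowest component.
# The fps ceilings below are hypothetical.
TARGET_FPS = 60

caps = {"cpu": 89, "gpu": 48, "memory": 120}

effective_fps = min(caps.values())  # the frame rate you'd actually see

# Only parts whose ceiling falls below the target are worth upgrading;
# an 89fps CPU "bottleneck" is irrelevant if all you want is a steady 60.
upgrade_candidates = [part for part, fps in caps.items() if fps < TARGET_FPS]

print(effective_fps)       # 48
print(upgrade_candidates)  # ['gpu']
```

In this made-up example the CPU is technically a bottleneck relative to the memory, but since its ceiling is above the target, only the GPU shows up as worth replacing.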

Again, my main goal is to upgrade the parts that prevent the games I play from running at a consistent 60fps. And in my case, an Athlon XP @ 2.0GHz/400FSB will not be limited to under 60fps for quite some time - at least in the games I play. The same goes for my motherboard, my memory, my hard drive, etc. Upgrading my whole system when I upgrade my video card would be a bit redundant and a waste of money - given my goal of consistent 60fps performance, of course. So, to answer your question: no, I don't think it's time for high-end disposable PCs (not to mention, that would be some expensive disposable equipment).

I hope that all made sense, but that's how I see it.
 
I agree with dksuiko. You're approaching this like you have to upgrade every time something new comes out, which simply isn't the case. There is a difference between needing to upgrade and just wanting to have all the latest toys. My P4 2.4B and 9500 Pro can play Far Cry at max detail with an average of 40fps, which is perfectly reasonable for me. Sure, I would like to upgrade, but it's not necessary.
 
We already have them... they're called consoles ;)

These next-gen consoles are pretty much going to be PCs anyway, what with all the features they are putting in them that have nothing to do with gaming.
 
I already do what the original poster suggests, but I do it every four years. You just need to exercise a little control. :)
 
Sabastian said:
london-boy said:
four years? sheesh... just get a console then!

PC gaming > console gaming... IMO, of course.


Depends which games... But then that's another point. And as time goes by, consoles seem to be headed towards PC-land and vice versa. In the end I think they'll meet in the middle. Then we'll have a single "thing" with different designs of the same compatible "thing" - no distinction between the incompatible "thing1" (consoles) and "thing2" (PCs).
 
london-boy said:
Depends which games... But then that's another point. And as time goes by, consoles seem to be headed towards PC-land and vice versa. In the end I think they'll meet in the middle. Then we'll have a single "thing" with different designs of the same compatible "thing" - no distinction between the incompatible "thing1" (consoles) and "thing2" (PCs).

I agree. I am biased, though, toward the x/y axis that the mouse provides in first-person shooters and the communication that the keyboard provides in multiplayer RTS games (not to mention hotkeys). I really don't like the console controller as a result. Given that real major advancements in games take years, it makes sense that you don't really need to upgrade your whole PC every six months (or even part of it, for that matter). My system is almost four years old and cost me thousands of dollars in the first place; I plan on getting another year out of it yet. Sure, it is getting a tad slow, comparatively speaking, but I can still play the games I want for the most part - granted, newer games are really pushing the minimum system requirements.
 
From my point of view, the thing that limits consoles is what they are (typically) plugged into, i.e. a television. TVs just don't cut the mustard compared with a high-end flat panel for image quality. There are many things I do on a PC that I wouldn't consider doing on a console/STB/HTPC-type effort (something plugged into my telly, basically), due to the IQ issues of the display and the clumsiness of console controllers, as Sabastian pointed out.
 
Yeah, I agree about the controls (depending on the games) and the resolution. But it's just a matter of time before consoles will be able to output PC-like resolutions (already from next gen, and it will only get better), and, more importantly, before someone comes up with more intuitive controls. In the end, it looks like advancements in controls are driven by consoles first and foremost. PCs have been stuck with keyboard/mouse for years, while consoles have evolved their controls ever since the NES (digital to analogue, force feedback, EyeToy recently, etc.). All of those were actually "born" on PC, but it looks like the console universe is more "ready" for implementations that people are actually willing and interested in using.
 
london-boy said:
And as time goes by, consoles seem to be headed towards PC-land and vice versa. In the end I think they'll meet in the middle. Then we'll have a single "thing" with different designs of the same compatible "thing" - no distinction between the incompatible "thing1" (consoles) and "thing2" (PCs).

Hardware- and feature-wise you might be right. But games-wise I think they have diverged in recent years. Today, you have very distinct genres that are successful on either PC or console, but not both - take RTS for PCs, or jump'n'runs/beat'em-ups for consoles. Interestingly, the classic PC genre FPS seems to be gaining ground on consoles - thanks to Halo, I guess.

On the other hand, gamepads and even analogue sticks have become a niche market on the PC. You almost can't sell a PC game today that prefers an input mechanism other than mouse/keyboard, racing games (wheel) being somewhat of an exception. That's something I'll never understand. So many games play best with a gamepad - PoP:TSOT and Trackmania, for example.
 
Why ever four years?

My sister alternates every year: video card / CPU.

Two years ago an Athlon XP 1500+ was $50-ish, with an nForce mobo going for about the same.

Today you can get a 2500+ for about the same price, with an nForce2 mobo for $50-ish.

Normally I ride a socket out. That way, depending on motherboard tech, I can either update the mobo and keep the CPU, or update the CPU and keep the mobo.

With video cards - well, I'm sure if you bought a 9600 Pro and don't need FSAA and aniso, you would be happy with it for at least another year. So that's two and a half years of gaming for $200.

Not too bad.

You can build a gaming PC for about $500 if you plan it right.
 
jvd said:
Why ever four years?

Because it's expensive... not everyone who likes PC gaming is loaded. Sometimes I have a hard time just paying the bills around here. Secondly, it's good enough to just upgrade the memory and/or video card a couple of years into the life of the machine.
 
Sabastian said:
jvd said:
Why ever four years?

Because it's expensive... not everyone who likes PC gaming is loaded. Sometimes I have a hard time just paying the bills around here. Secondly, it's good enough to just upgrade the memory and/or video card a couple of years into the life of the machine.

That's why you alternate updates.

If you do it right, you'd be spending $100 or less a year on your PC.

If you don't keep up on it, of course, you will get stuck spending more.

If you're into PC gaming, it's not enough to just upgrade the memory. You might get away with just the video card.
 
I buy bleeding edge every four years, and it takes four years before it becomes completely obsolete. Upgrading is an expensive pastime, and is rarely worth the hassle. Each upgrade is never taken full advantage of because you open a new bottleneck. To build a cost-efficient system you need to match the parts so as to avoid the ridiculous situation whereby component X is so far behind the rest of the computer that you need to upgrade it, but now Y is behind everything, then Z, and so on. By upgrading piecemeal you're turning your PC into a little ladder where the weakest remaining part is holding everything else back. It's a constant investment that never really performs as well as the amount of money spent on it should warrant.

If the ability to upgrade components were the only advantage the PC held over the consoles, it would have been eradicated as a games machine long ago. Thankfully it isn't, and thus it carries on and I don't have to move to the crap console market.

You don't need to upgrade every five minutes to own a PC.

Now, I far prefer to just swap out my PC every few years. It gives me a spare box, monitor, etc, and makes for a far cheaper hobby.
 
My policy for the past 3-4 years has been to upgrade components when I start to notice them (e.g. more RAM when I'm running out of RAM, a faster processor when Photoshop starts to chug, etc.).

Moreover, to first order, I have also tried to double the size or performance of whatever resource I'm upgrading.

Typically this has meant that any given component lives in my system for about 18 months, and (by Moore's Law) my spend profile is roughly constant - a few hundred pounds per year.
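That "roughly constant spend" claim checks out on paper: if price per unit of capacity halves every ~18 months (a loose reading of Moore's Law), then doubling what you buy every 18 months costs the same each time. A quick sketch with invented prices:

```python
# Hypothetical: 100 (pounds) per GB today, halving every 18 months.
PRICE_TODAY = 100.0
HALVING_MONTHS = 18

def price_per_gb(months_from_now):
    """Price per GB after the given number of months, under the halving assumption."""
    return PRICE_TODAY * 0.5 ** (months_from_now / HALVING_MONTHS)

spend_now = 1 * price_per_gb(0)     # buy 1 GB today       -> 100.0
spend_next = 2 * price_per_gb(18)   # buy 2 GB in 18 months -> 100.0
spend_after = 4 * price_per_gb(36)  # buy 4 GB in 3 years   -> 100.0
```

Each purchase doubles capacity yet costs the same, which is exactly the flat spend profile described above.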

I never buy leading-edge processors, memory, or graphics cards; the price premium just isn't worth it. One or two notches down is the sweet spot in price/performance IMO, and I don't feel the need for my PC to reflect the enormity of my manhood!
 
ANova said:
I agree with dksuiko. You're approaching this like you have to upgrade every time something new comes out, which simply isn't the case. There is a difference between needing to upgrade and just wanting to have all the latest toys. My P4 2.4B and 9500 Pro can play Far Cry at max detail with an average of 40fps, which is perfectly reasonable for me. Sure, I would like to upgrade, but it's not necessary.

Really? At what resolution in Far Cry? 800x600? What are your settings, exactly?

I don't believe you!

RainZ
 
rainz said:
Really? At what resolution in Far Cry? 800x600? What are your settings, exactly?

I don't believe you!

RainZ

10x7, no AA or AF. Good enough for me, and it still looks visually stunning. It's the only game that taxes my system; everything else plays fluidly.
 
Quitch said:
I already do what the original poster suggests, but I do it every four years. You just need to exercise a little control. :)


My rule of thumb used to be to upgrade my processor once processor speeds doubled.

When I had my 400MHz and the going speed went to 800MHz, I got one.
When it went to 1600MHz, up I went.
I did go to 2.4GHz a tad sooner than usual, but I'm going to try to hold firm until speeds go to 4.2GHz or thereabouts.

I have upgraded my video card many times, though. I agree that unless you're benchmarking or trying to outdo your friends, gaming buddies, or other upgrade-happy message board freaks, processor speed is overrated.
 