If it wasn't for gaming would you still upgrade frequently?

  • Yes: 9 votes (10.5%)
  • No: 77 votes (89.5%)
  • Total voters: 86

Deepak

I presume the single biggest reason for constant upgrading is gaming. Games keep pushing hardware, and pretty quickly too. So, would you still have upgraded frequently if it wasn't for games?

And, what config would you have had currently if it wasn't for games?
 
On a related note, what effect will gaming consoles have as PC gaming continues to diminish and consoles continue to grow? Or maybe future consoles will have PC features like patch/mod/KB+M support by default, making them even more compelling for the average PC gamer. I wonder how that is going to affect Intel/AMD?
 
Without games, probably only memory upgrades when multitasking feels slower... besides the usual once-every-2/3-years upgrade.
 
I wouldn't upgrade. My old 500MHz P3 with 256MB RAM and an 8GB hard drive runs Win98, Office 2k and IE 5.5 just fine. Under Win2K/XP it tends to get more sluggish -- and Firefox or IE aren't half as snappy. My 2500XP/6800GT runs most games surprisingly well, too, which is why I'm putting off the next upgrade.

With regard to gaming consoles, I feel certain niche genres are under-represented or non-existent, which makes them slightly less appealing to me.
 
I know for a fact I wouldn't upgrade, I only upgrade as needed.

Keeping your rig able to play the latest games is the name of the upgrade game, without it what would be the point? :-|
 
digitalwanderer said:
I know for a fact I wouldn't upgrade, I only upgrade as needed.

Keeping your rig able to play the latest games is the name of the upgrade game, without it what would be the point? :-|

Agreed.
 
Probably not. Not the graphics card, anyway. I know myself well enough to realize, however, that my upgrade itch doesn't really stem from any real desire to improve my PC for gaming in any particular way. It's more of a 'need a new toy to tinker with' thing. Given that, I'd probably still upgrade *something*, but I would have to find another avenue besides gaming to 'justify' it. Video editing: new CPU, bigger HD. Digital photography: new camera, bigger monitor, more RAM...

I'm sure I'd be able to think of something...
 
I need an SM3.0, 32-shader-pipe, 700MHz card when I run Maxthon, Word or Excel.


On a side note, and as a corollary to this thread, it'd be interesting to see how the big 3D IHVs are reacting to the declining importance of the PC gaming market.
I know that the market isn't going anywhere, but still, if it does slow down, it could have heavy repercussions on the IHVs' plans, roadmaps and, ultimately, incomes.
 
NV and ATI are already in the console arena and are likely to be there for quite some time; I wonder how much comfort they can take from that!? :rolleyes:
 
The entire industry is affected by the fact that the decades-long "automatic" speed increases (coming primarily from finer lithography) are grinding to a halt.
Intel's 3.06GHz Northwood was introduced over three years ago, and precious little has happened since, whereas performance used to double (roughly) every 18 months. The pace hasn't just slackened a little; we are talking about a complete paradigm shift.
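A quick back-of-the-envelope check of that claim (the 18-month doubling is the historical rule of thumb; the arithmetic is mine):

$$\frac{36\ \text{months}}{18\ \text{months per doubling}} = 2\ \text{doublings} \quad\Rightarrow\quad 2^{2} = 4\times\ \text{the gain we would once have expected over those three years}$$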

Graphics, being eminently parallelizable, hasn't seen as dramatic a slowdown as general-purpose processing, although it is painfully obvious that much of the performance gain of the last 4 years has been bought with enormous increases in power draw, compared to the Radeon 8500 and GeForce4, which fit within the 25W total envelope of AGP. This is not sustainable. There is no way in hell that we will see another factor of 5 increase in the next 4 years.
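To put rough numbers on that (the 25W AGP envelope is from the post above; treating the increase since then as roughly fivefold, i.e. a ~125W high-end card today, is my assumption for illustration):

$$25\,\mathrm{W} \times 5 \approx 125\,\mathrm{W}\ \text{(today's high end)}, \qquad 125\,\mathrm{W} \times 5 = 625\,\mathrm{W}\ \text{(another factor of 5)}$$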

One effect of the slower performance development is that institutions and corporations are trying to shift from 3-year upgrade cycles to 5-year upgrade cycles. While this might seem far removed from gaming, these are the volume drivers of the industry, and if this becomes established, it will have a significant effect on the industry as a whole. PCs are slowly becoming mature technology, and Intel is being very shrewd in trying to tie its brand name to wireless and other technologies, and in aiming to lower power draw so that it can point to energy savings and better ergonomics for institutional and private purchasers, as well as being better positioned to ride the mobile wave.

Future upgrading is likely to be less frequent, at least as justified by performance. Frankly, I think this is good for gamers, as well as for game developers.
 
Entropy said:
Intels 3.06GHz Northwood was introduced over three years ago, and precious little has happened since, whereas performance used to double (roughly) every 18 months.

That's a good point. I have the Northwood 3.06GHz CPU and I agree. I've had no reason to look into upgrading, although as of late I've been tempted to get an X2 4400. Only tempted.

Other specs:
1GB RAM (533MHz for the 3.06)
6800GT
450GB HDs
Audigy 2 ZS
Dell 24" 2405

If not for gaming I would have very little reason to upgrade anything.
 
Vysez said:
I need a SM3.0 32 shaders pipes 700MHz card when I run Maxthon, Word or Excel.

Hey come on, a 3D Excel program would be kinda nice to have for those graphs (input data in 3D form). :)

But yeah, if I didn't care about some sort of 3D Acceleration, I'd probably end up getting a multicore CPU and be done with upgrades.
 
Entropy said:
Intels 3.06GHz Northwood was introduced over three years ago, and precious little has happened since, whereas performance used to double (roughly) every 18 months. The pace hasn't slackened a little, we are talking a complete paradigm shift.
While a decent analysis on the whole, that is a very special case caused by Intel's lack of vision with the P4. AMD's been able to increase performance much more steadily.

PCs are slowly becoming mature technology,
I wouldn't say that so much as I'd say that PCs are reaching the limits of basic physics (rather, silicon lithography is). This means that for the PC industry to continue to thrive, we need some dramatically new technology. In the meantime, IHVs are going to rely upon ever greater levels of parallelism to sell more and more chips. It's going to be less about more transistors and more about smarter use of available transistors.
 
Chalnoth said:
It's going to be less about more transistors and more about smarter use of available transistors.

Say it isn't so! :oops:

"Work smarter, not harder" :LOL:

So... who's going to invent the optical RAM? :p
 
This is a pretty relevant read for those interested in these issues, as server needs drive much of the higher end of x86 processing.
http://news.com.com/2100-1010_3-5988090.html?tag=st.txt.caro

What Intel has realised, and what I think the rest of the industry needs to realise, is that they need to sell something other than performance. Wireless range, better sound, improved displays, longer battery life, lower weight, lower noise, better looks, smaller size et cetera are all reasons for people to upgrade, and they will when the aggregate benefit is sufficient. Note how many of the above Intel is already pursuing. Also note how important lower power draw is to many of the items on that list.

Games and graphics performance have been helping to drive the upgrade cycle even in the face of slowing performance benefits, but I would argue that this is not sustainable, as graphics processing has some of the same basic problems as general computation, and the graphics IHVs' approach of accepting higher power draw actually works against a lot of the other items on the list that could help drive sales. High-end graphics currently swims against the greater trends and currents. It won't hold. (Or it might become a separate evolutionary branch on the PC tree, but I doubt that.)

For a few years now, gamers have been a target of marketeers, as they are one of the few groups who have any real reason to care about the performance of their computers at all, and they seem to be perceived as a group with more dollars than sense. This desperate appeal to juvenile male gamer egos is annoying but understandable, as is, for instance, the naming game in graphics cards or the manoeuvring with various sockets in the MB/CPU scene. It is all done in order to maximise sales and keep driving an established consumer pattern that is no longer justified. The industry has a common interest, and it is up to the buyer to make sure the value is there.

I'll submit that in the future, that value will be found less in performance and more in other aspects of computer usage. (Dual-core processors are an anomaly in this trend, and a welcome one at that, but I can't see much benefit for most people in going to two cores, and very little beyond four, meaning that this won't drive sales for more than a generation or possibly two of lithography at most.)

But I also feel that focusing on general usability rather than performance alone is a sign of maturity, and most welcome.
 
I think, even without games, I would eventually upgrade to a dual-core CPU for better multitasking performance. The only other thing that could really drive me to upgrade is if they started releasing components with better power-saving features and lower power consumption in general, while maintaining solid performance. Video card-wise, I'd have something a LOT simpler if I wasn't gaming. I probably wouldn't have 1.5 gigs of RAM either. What I have right now could last me a very long time if I decided to stop gaming.
 