Remote game services (OnLive, Gaikai, etc.)

The average Dell or HP box the "average" person has can't support a decent video card on its power supply, though.

And that "average" person is never going to replace his own power supply (cost aside); it's a non-starter.

What computer can't support a Radeon 5650 at this point in time? It should provide a much better experience than this.
 
Being a beta tester, I'd say it's nice as a rental platform. However, that would depend on the games that you could rent. There will also be a free portal where anyone can stream demos and pay for rentals only, without a subscription; that will be coming in future months. As far as buying (long-term rental) goes, it's just not worth it.
 
Actually they can. I've replaced GPUs in such machines with ones capable of running multiplatform games very well. Those average Dell/HP boxes tend to have a 350W+ PSU with a decent 12V line.


If they can support ANY add-in card (unlikely), it won't run any games. You need at least a 4870 to run console ports at medium settings on a typical 1080p monitor, and that draws a lot of power.

Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer. I had to upgrade mine from a 500 to a 650 (a name-brand Antec, far higher quality than whatever no-name supply is in a Dell) just to run my 4890, which takes 200+ watts by itself. I was getting shutdowns during Crysis with the 500 watt Antec. My 4890 is hardly a cutting-edge card anymore, it's mid-range. The rest of my PC's specs are low end (Q6600, etc.).

I'd say for building any gaming rig today you need to start with a minimum 800 watt quality power supply; I wouldn't go with less.
 
If they can support ANY add-in card (unlikely), it won't run any games. You need at least a 4870 to run console ports at medium settings on a typical 1080p monitor, and that draws a lot of power.

Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer. I had to upgrade mine from a 500 to a 650 (a name-brand Antec, far higher quality than whatever no-name supply is in a Dell) just to run my 4890, which takes 200+ watts by itself. I was getting shutdowns during Crysis with the 500 watt Antec. My 4890 is hardly a cutting-edge card anymore, it's mid-range. The rest of my PC's specs are low end (Q6600, etc.).

I'd say for building any gaming rig today you need to start with a minimum 800 watt quality power supply; I wouldn't go with less.

I have a 620 watt power supply. I have a Q9650 @ 3.8 GHz, 8 GB of RAM, four 1 TB drives, one 60 GB SSD, a Radeon 4850, and an X-Fi sound card.

Seems to work well. Could probably get by on less.
 
The average Dell or HP box the "average" person has can't support a decent video card on its power supply, though.

And that "average" person is never going to replace his own power supply (cost aside); it's a non-starter.

Dave saves even these people. He offers more than enough computing performance in an HD 5670, which matches up nicely with what now seems to be the standard game-developer target of the 8600/9600/8800GT range of Nvidia cards. Apparently a monkey can do it. :p
 
If they can support ANY add-in card (unlikely), it won't run any games. You need at least a 4870 to run console ports at medium settings on a typical 1080p monitor, and that draws a lot of power.

Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer. I had to upgrade mine from a 500 to a 650 (a name-brand Antec, far higher quality than whatever no-name supply is in a Dell) just to run my 4890, which takes 200+ watts by itself. I was getting shutdowns during Crysis with the 500 watt Antec. My 4890 is hardly a cutting-edge card anymore, it's mid-range. The rest of my PC's specs are low end (Q6600, etc.).

I'd say for building any gaming rig today you need to start with a minimum 800 watt quality power supply; I wouldn't go with less.

My GTX260 (which is along the same lines as an HD4870) can run console ports maxed at 1920x1200 just fine.

A gaming system with an HD4870 uses only about 300 watts under load; if the 500 watt Antec was shutting down, it was probably because it was defective.

800 watts is both overkill and a waste of power (when a PSU is running at only a small fraction of its rated output, efficiency drops and more of the power drawn from the wall turns into heat rather than going to the PC's parts).
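To put rough numbers on that load estimate, here's a minimal Python sketch; every component wattage below is a ballpark assumption, not a measurement:

```python
# Rough load estimate for a mid-range gaming box (every figure is an assumption).
components_w = {
    "quad-core CPU under load": 95,
    "HD 4870-class GPU under load": 150,
    "motherboard + RAM": 40,
    "drives, optical, fans": 30,
}

dc_load = sum(components_w.values())   # what the parts actually draw: ~315 W
efficiency = 0.80                      # assumed PSU efficiency at this load
wall_draw = dc_load / efficiency       # what the wall outlet sees

print(f"Estimated DC load: {dc_load} W")
print(f"Draw at the wall:  {wall_draw:.0f} W")
for rated in (300, 500, 650, 800):
    print(f"  {rated:>3} W PSU would sit at {dc_load / rated:.0%} of its rating")
```

On those assumed numbers a decent 500 W unit sits around 60% load (usually the comfortable part of its efficiency curve), while an 800 W unit would mostly be idle headroom.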
 
Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer. I had to upgrade mine from a 500 to a 650 (a name-brand Antec, far higher quality than whatever no-name supply is in a Dell) just to run my 4890, which takes 200+ watts by itself. I was getting shutdowns during Crysis with the 500 watt Antec. My 4890 is hardly a cutting-edge card anymore, it's mid-range. The rest of my PC's specs are low end (Q6600, etc.).

Umm, I'm running a 4890 (higher load power than a 4870) on a 300 watt PSU right now with plenty of power to spare. :p That's also with multiple HDs. No crashes... The last game I tried on here was Splinter Cell Conviction, not exactly a title that takes it easy on the GPU with settings maxed.

And Dell generally has pretty good quality PSUs. The one I have in this system is only average. The system I built was restricted to a size that could fit into a rolling backpack. You don't get super quality M-ATX PSUs. :D So a Dell PSU would generally be better quality than what I have in here.

A 5870 uses even less power at load, so anyone who buys a Dell desktop could use any current-gen card in their system if the card will fit. Well, current gen minus Fermi, that is.

Regards,
SB
 
If they can support ANY add-in card (unlikely), it won't run any games. You need at least a 4870 to run console ports at medium settings on a typical 1080p monitor, and that draws a lot of power.

Considering you can basically get 4870 performance now passively cooled...

Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer.

300W is nothing close to a joke. 300W will more than run a 5770, which is many times the performance of any console and can be passively cooled.

I'd say for building any gaming rig today you need to start with a minimum 800 watt quality power supply; I wouldn't go with less.

This is absolute BS. Even going with a top-power CPU and a top-end GPU, 500W is more than enough.
 
If they can support ANY add-in card (unlikely), it won't run any games. You need at least a 4870 to run console ports at medium settings on a typical 1080p monitor, and that draws a lot of power.

So what?

That is far beyond what any console could do or OnLive could deliver to you, so... And by the way, such a card would do almost all console ports at high or highest settings at that res, unless you expect a pretty solid 60fps experience in all games. And the ones that don't run so well at high/highest at 1080p are games whose visuals go far beyond anything OnLive or the consoles would have, no matter the res.

So the point is a slower GPU would do the job. A budget, low-power-draw GTS 250 would outdo either, being a 9800GTX+ type GPU.



Looking at a low-end Dell at Best Buy, it has a 300 watt power supply, obviously a joke for any gamer. I had to upgrade mine from a 500 to a 650 (a name-brand Antec, far higher quality than whatever no-name supply is in a Dell) just to run my 4890, which takes 200+ watts by itself. I was getting shutdowns during Crysis with the 500 watt Antec.

I thought you said a few days ago that you had a 4870?

Anyway, 200+ watts for a 4890 sounds wrong. Your Antec 500W might have had other reasons for shutting down, like some of their older series having fatal design flaws, or simply overheating due to bad ventilation, or overloading certain rails, etc.

My 4890 is hardly a cutting-edge card anymore, it's mid-range. The rest of my PC's specs are low end (Q6600, etc.).

So I'll bet it will give you far better IQ than OnLive, and it's a no-brainer that it runs circles around the console versions/visuals. You're "over-speccing" the requirements; you don't need a tank to tear down rice-paper walls. :p

And a 4870 would do very well, which I know since I owned one some time ago and right now own a 4890. Heck, it does Warhead vanilla DX10 at very high, 1440x900 with 4xMSAA, at a pretty much solid 30+ fps. Don't believe me? Check out the screenshot thread, look at the most intensive levels like the snow/ice ones, and check the framerate counter. Or the multitude of other games, exclusives and multiplatform, with framerate counters, many at well beyond the highest menu settings, run on a 4870 or 4890 depending on when I took them.

I'd say for building any gaming rig today you need to start with a minimum 800 watt quality power supply; I wouldn't go with less.

Nonsense, as the others said. Why don't you take a look at some power draw measurements of different systems?

You'll be surprised, unless you were thinking of going CrossFire/SLI with potent GPUs. In that case you're bringing a Star Destroyer to a boxing match.
 
300W is nothing close to a joke. 300W will more than run a 5770, which is many times the performance of any console and can be passively cooled.
Well, there are no passively cooled 5770s yet, but there is a 5750 that draws less than 75W. The bigger issue seems to be the bulkiness of the PCs due to full-height PCI slots.
 
Guys, sorry but PC power supplies aren't really on the topic here IMHO. Let's get back to the actual issues...

For example, what's the reason behind the relatively low detail settings? Are they trying to save on hardware costs, or does the video compression run on the same system in the background, taking up resources? The entire point of the service was said to be that they'll provide the gaming experience of a high end system at a fraction of the cost, but at this time a 5-year old Xbox is beating them on both detail levels and general image quality...
 
Some fresh in-game shots -- note that this is not a beta-test anymore, but what would be the final product/service.

Ouch, I feel sorry for anyone who gets this with plans to use it on a large-screen TV.

But it's probably still good enough for hotel room rentals, maybe PPV (they'd be stupid not to be pursuing the cable and satellite TV providers to offer it as PPV rentals of games), and maybe people who spend half of every year travelling.

Hell, if they could partner with a cable provider to offer a premium "game playing channel" through the provider, they would have guaranteed bandwidth, and most cable providers have pretty decent bandwidth. I'm sure cable companies wouldn't mind not only having another premium channel to sell, but also a built-in way to upsell bandwidth to a customer. That might allow them to up the quality of their transmitted games, assuming it's purely a bandwidth limitation and they aren't already at the limits of real-time encoding of 720p video without spending more money than they would ever make back.

Then again, heavy game players might just hit their bandwidth cap in a matter of days/weeks. Although heavy game players would probably be more likely to just have a PC.

Regards,
SB
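To put rough numbers on that bandwidth-cap worry, here's a quick sketch; the 5 Mbps stream rate, play time, and 250 GB cap are all assumptions, not published OnLive or provider figures:

```python
# How quickly a streamed-game habit eats a monthly cap (all inputs are assumptions).
stream_mbps = 5.0        # assumed average bitrate of a 720p game stream
hours_per_day = 2.0      # assumed daily play time
cap_gb = 250.0           # assumed monthly bandwidth cap

gb_per_hour = stream_mbps * 3600 / 8 / 1000    # Mbit/s -> GB per hour (~2.25 GB)
gb_per_month = gb_per_hour * hours_per_day * 30

print(f"{gb_per_hour:.2f} GB per hour of play")
print(f"{gb_per_month:.0f} GB per month")
print(f"Share of cap used by gaming alone: {gb_per_month / cap_gb:.0%}")
```

On those assumed figures a heavier player (4-5 hours a day) would blow through the same cap from game streaming alone.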
 
For example, what's the reason behind the relatively low detail settings? Are they trying to save on hardware costs, or does the video compression run on the same system in the background, taking up resources? The entire point of the service was said to be that they'll provide the gaming experience of a high end system at a fraction of the cost, but at this time a 5-year old Xbox is beating them on both detail levels and general image quality...

Might be they're killing two birds with one stone. Since the image quality is poor, higher detail settings would mostly go unnoticed, so they can run more game instances on each server and lower costs. And given that image quality, the people using this service probably don't care much about graphics anyway.
 
Guys, sorry but PC power supplies aren't really on the topic here IMHO. Let's get back to the actual issues...

For example, what's the reason behind the relatively low detail settings? Are they trying to save on hardware costs, or does the video compression run on the same system in the background, taking up resources? The entire point of the service was said to be that they'll provide the gaming experience of a high end system at a fraction of the cost, but at this time a 5-year old Xbox is beating them on both detail levels and general image quality...

Either that or they'll need an additional array of servers each able to encode multiple video streams. Even with that I still can't see how they are expecting to make money.

So consider that they're trying to make it as cost-effective as possible. That means they can't keep upgrading the game machines every year, and the same goes for the video encoders, assuming it's not all in one box.

So in effect, they have to reduce settings to make this viable (playable on low/midrange hardware with the goal of a machine being able to play and render games for 2-4 years).

And then also lower the video encode quality, so they can either encode on the same machine the game is being played on (one machine/blade per online subscriber) or encode multiple video streams simultaneously on a dedicated encode server.

Regards,
SB
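To illustrate the shape of that cost argument, here's a back-of-envelope sketch; every figure (server cost, lifetime, subscription price) is an assumption chosen only to show how the math moves with sessions per box:

```python
# Can one game+encode box pay for itself? (all figures are assumptions)
server_cost = 2000.0          # assumed cost of one game/encode machine
lifetime_months = 36          # assumed useful life before it looks too dated (2-4 years)
sub_per_month = 15.0          # assumed subscription revenue per user, ignoring licensing

hw_per_month = server_cost / lifetime_months
print(f"Hardware cost per box: ${hw_per_month:.2f}/month")

# Lower detail settings let one box host several sessions, which changes the picture:
for sessions in (1, 2, 4):
    margin = sessions * sub_per_month - hw_per_month
    print(f"  {sessions} session(s) per box: ${margin:+.2f}/month "
          f"before bandwidth, power, and game licensing")
```

Under those assumptions a one-subscriber-per-box setup loses money on hardware alone, which is exactly why cutting settings to pack more sessions per machine looks attractive.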
 
Guys, sorry but PC power supplies aren't really on the topic here IMHO. Let's get back to the actual issues...

For example, what's the reason behind the relatively low detail settings? Are they trying to save on hardware costs, or does the video compression run on the same system in the background, taking up resources? The entire point of the service was said to be that they'll provide the gaming experience of a high end system at a fraction of the cost, but at this time a 5-year old Xbox is beating them on both detail levels and general image quality...

This one is easy... http://x264dev.multimedia.cx/?p=249

So you've got a total budget of, let's be generous and say, 200 ms. You can expect 100 ms to get eaten by the network. That leaves 100 ms for the encoding work. 100 ms is ~3 frames at 30 FPS, so no problem, right? Well, not so fast. All the top-flight encoders use pre-passes and large lookahead windows, which require latency, latency we don't have. So you end up having to do things in slightly less efficient ways, which means it takes longer per frame, and then you have to change how some of the I/B/P frame handling works, etc.

So it's more of a quality/latency/complexity tradeoff than anything. There is certainly a real-time quality limit that is well below the non-real-time level.
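For a concrete feel for that budget, here's a tiny sketch; the 200 ms total and 100 ms network share come from the post above, while the lookahead sizes are just illustrative assumptions:

```python
# Latency budget for streamed gaming, using the round-trip split described above.
total_budget_ms = 200.0            # generous end-to-end target
network_ms = 100.0                 # share eaten by the network
encode_budget_ms = total_budget_ms - network_ms

fps = 30
frame_ms = 1000.0 / fps            # ~33 ms per frame
print(f"Encoding budget: {encode_budget_ms:.0f} ms "
      f"(~{encode_budget_ms / frame_ms:.0f} frames at {fps} fps)")

# Quality-oriented encoders buffer a lookahead window of frames before deciding
# how to spend bits; even a modest window blows the whole budget:
for lookahead in (0, 3, 10, 20):
    buffered_ms = lookahead * frame_ms
    verdict = "fits" if buffered_ms <= encode_budget_ms else "over budget"
    print(f"  lookahead of {lookahead:>2} frames buffers {buffered_ms:5.1f} ms ({verdict})")
```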
 
What is the point of games like World of Goo on Onlive?

Any PC that could run the Onlive program could run World of Goo natively!

It seems even more pointless than other games on Onlive.


And how does Onlive work anyway?

What OS are the games running on?

Do the systems run more than one game?
And if they do, how do they handle more than one set of inputs, and how do the GPUs output the video?
 