The Next-gen Situation discussion *spawn

Even upping it to 250 or even 300W, you still won't get an ultra-high-end PC GPU in there.

Did you miss the following post:

I read a PCWorld review of that Falcon Northwest Tiki PC, and they had it consume 231W under full load.

http://www.pcworld.com/article/2583...eview_gaming_monster_in_a_petite_package.html

Launch 360 was doing around 190W, so the difference is actually not as big as I thought.


?

Although:

I don't know if PC games really stress the CPU though.


Is there anyone here who could elaborate a bit further on that? How much more than 231W would it draw under absolute full load?
 
If that were all that "next-gen" had to offer, it would be quite disappointing, wouldn't it :???:?
Isn't 29.97Hz just a typical American (USA, at least) TV frame rate, and doesn't it make sense to turn 60Hz 1080i video into 30Hz "1080p" for internet video (i.e., deinterlace it)? Not sure what that says about the frame rate of the game itself.
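
(As a quick aside on the arithmetic, a minimal sketch; the 1.001 divisor is the standard NTSC convention, not anything taken from that particular video:)

```python
# NTSC-derived rates are the nominal rate divided by 1.001.
nominal_field_rate = 60
ntsc_field_rate = nominal_field_rate / 1.001  # ~59.94 fields/s for 1080i

# Weave-deinterlacing pairs two fields into one progressive frame,
# halving the rate to ~29.97 frames/s (the "30Hz" of web video).
ntsc_frame_rate = ntsc_field_rate / 2

print(f"{ntsc_field_rate:.2f} fields/s -> {ntsc_frame_rate:.2f} frames/s")
```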

As for G80 in the PS3, just how delayed and expensive do you think the PS3 could have been relative to its competition? The G80 in the 8800GTS was 2.5x the die size and used ~70W more power than the G71 in the 7900GT (roughly equivalent to RSX). G84 (8600 GTS) is closer to the RSX in die size, power, and performance--but it came five months after G80.
 
Isn't 29.97Hz just a typical American (USA, at least) TV frame rate, and doesn't it make sense to turn 60Hz 1080i video into 30Hz "1080p" for internet video (i.e., deinterlace it)? Not sure what that says about the frame rate of the game itself.

Even if "next-gen" consoles would be able to offer such visuals at 60 fps, it still would be quite disappointing, wouldn't it? Those visuals just don't necessarily appear to have a "next-gen" feel to them, don't you think?

So, there has to be more than just that ;)?

As for G80 in the PS3, just how delayed and expensive do you think the PS3 could have been relative to its competition? The G80 in the 8800GTS was 2.5x the die size and used ~70W more power than the G71 in the 7900GT (roughly equivalent to RSX). G84 (8600 GTS) is closer to the RSX in die size, power, and performance--but it came five months after G80.

Ideally no delay at all :cool::mrgreen:?

And by the way, according to the following table:

http://en.wikipedia.org/wiki/PlayStation_3_launch#Release_Data_and_Pricing

the PS3 release date mentioned above (November 2006) actually appears to apply to the USA/Asia only? In Europe and some other territories it appears to have been released quite a bit later (March 2007 and beyond)?

And apart from that: would you really have cried if it had been a mere 70W more?

The "Uncharted" series, for example, would probably look much better if it had been able to utilize a G80 instead of a G70/G71, wouldn't it :D?

And "Crysis 1" / "Crysis: Warhead", for example, would probably have been possible from day one, wouldn't they :mrgreen:?

That probably would not have been such a bad thing after all, would it ;)?

And how much harder would it really have been for SCEI to deal with 70W more? Would it really have been THAT hard?

And would anyone really have cried just because the case might have had to be slightly larger, and so on?
 
More area = more expensive, more power = more heat.
RSX is more akin to a GF7800 AFAIR. (Been ages, so don't quote me on that.)

You have power/heat and area/money budgets, and you can't go around that.
There's no technology to improve them beyond what you have on a PC except at a significant cost, there's only so much people are willing to spend on a new console, and only so much loss a company is willing to handle. (They want to be profitable on hardware too, or at least break even; I don't think they'll go for a loss on hardware anymore, it cost them too much this gen.)
 
And by the way, according to the following table:

http://en.wikipedia.org/wiki/PlayStation_3_launch#Release_Data_and_Pricing

the PS3 release date mentioned above (November 2006) actually appears to apply to the USA/Asia only? In Europe and some other territories it appears to have been released quite a bit later (March 2007 and beyond)?
Meaning what?? PS3 in Europe should have come with a G80?? Your argument is all over the place and plain silly. PS3 was intended to launch many months before its actual release. It was designed with damned expensive hardware that was cutting edge for the time of its release. It got delayed, which is why it launched alongside G80, and you can't just throw in a new GPU at the last minute. Even if they could have added the G80 in time, it would have been even MORE expensive and hotter! G80 in PS3 was never going to happen. It's a ludicrous notion to entertain. It's also different from expecting whatever tech in next-gen.

The "Uncharted" series for example would probably look much better if it could have utilized a G80 instead of a G70/G71, wouldn't it :D?
Well duh! Why not put in dual G80s and have even BETTER graphics?! Because there are cost/power/heat/size targets to take to heart. Consoles aren't PCs. If you want the bestest of the bestest, get whatever size PC case you care for and stuff it with as much PC hardware as you care to buy. If you want to design a console, design a system for a given cost and size (power draw affects heat and cooling, and thus size and cost) to release at a given date, where you have to finalise it long before release so you have time to make the machines, write the OS, and so on.

And how much harder would it really have been for SCEI to deal with 70W more? Would it really have been THAT hard?
Your entire discussion format consists of replying to every query raised with "is it really that hard (question mark)". It's very aggravating and effectively dismisses the discussion. If you want to go with that argument, then stop now, because you're absolutely right. Everything could have been better. Truth is, every product ever made could be a little more. Designers have to draw the line somewhere to produce a product that'll sell and make them money. Instead of just wishing for more, a considered discussion about a future product should aim to be realistic, balancing the wants of the consumers with the wants of the business. That covers a wide range of products catering to each side to different degrees. A moderate-spec next-gen console at a great price could be a winner. Maybe instead the monster console will sell the best. Maybe the one with the fancy new tech. Maybe a GTX 680 will fit in there, or maybe they'll go with a tiny form factor. But whatever the possibilities, at least present your case with more than "it wouldn't be so bad, would it? :D" ad nauseam.
 
Is there anyone here who could elaborate a bit further on that? How much more than 231W would it draw under absolute full load?

The 560 Ti alone can hit 218W at max. And that's not at the wall; that's just the card, ignoring PSU efficiency.

http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_660/25.html

In general, consoles will run much closer to maximum utilisation of the hardware because months or years of work will be spent optimising just for that particular hardware.
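
To illustrate the at-the-wall distinction, a rough back-of-envelope sketch; only the 218W card figure comes from the link above, while the PSU efficiency and rest-of-system draw are assumptions for illustration:

```python
# DC power drawn by the components vs. AC power measured at the wall.
# Only the 218W card figure comes from the techpowerup link; the rest
# are assumed values for illustration.
gpu_draw_w = 218        # 560 Ti peak draw (techpowerup measurement)
rest_of_system_w = 120  # assumption: CPU, motherboard, drives, fans
psu_efficiency = 0.85   # assumption: a typical mid-range PSU

dc_total_w = gpu_draw_w + rest_of_system_w
wall_draw_w = dc_total_w / psu_efficiency  # PSU losses show up at the wall

print(f"DC load: {dc_total_w}W -> at the wall: {wall_draw_w:.0f}W")
```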

Why? If they really wanted to, they probably could have based it on the G80, couldn't they?

Yeah, if they'd wanted it to come out in 2007.

Edit: and also put them in an even deeper pit of red ink and despair.

Why should gaming consoles lag behind the very latest and greatest technology when they launch?

Because it takes time to modify the latest and greatest technology for use in a console. Unless you're prepared to pay for the vendor to prioritise your low-margin console parts ahead of the massively bigger PC space, there's a good chance that you'll get an existing architecture (or a modified version of it) some time after the PC, and even then you may be launching after newer parts are trickling out into the PC space.
 
(They want to be profitable on hardware too, or at least break even; I don't think they'll go for a loss on hardware anymore, it cost them too much this gen.)

I still believe this is an incorrect assumption.

Selling at a loss would still have been a boon for MS had it not been for the RRoD, and the PS3's problems lie with WHY and HOW MUCH of a loss they were absorbing, which was due to pushing Cell and BR. If not for those things (and launching after the 360, not having an established online presence, and being hugely more expensive while not being able to showcase any additional "power"), the strategy of selling the PS3 at an initial loss would have been successful again.

The tablet/smartphone games market shows two things. 1) The population of gamers is now larger than before. 2) The majority of "gamers" now want quick, easy games without putting a high priority on graphics.

You can draw a couple of conclusions from that, one being that it's okay not to build powerful systems because the gamers don't care.

Or, you can draw the conclusion that if you want to attract the new, larger market of gamers, you have to produce a product that is worth buying - it needs to have a clear and obvious "WOW" factor. It should also be a multi-purpose device (which means an increased focus, not a decreased focus, on apps, DVR features, etc.)

I think any company following the first theory (customers don't care) won't be selling many consoles. They won't attract "hardcore" gamers, and tablet/phone gamers won't see the point in moving off their current devices.

The only way a company doing that can survive this next generation is if ALL the companies follow down that same path, so the only alternative is the PC. Which actually would kill console gaming, probably for good.
 
Meaning what?? PS3 in Europe should have come with a G80??

What was meant was:

If Europe and so on was apparently delayed to March 2007 anyway, why not delay the USA/Asia to March 2007 as well and use that time to release worldwide with a G80-based GPU in March 2007, for example :mrgreen:?

Instead of releasing with a G70/G71-based GPU in the USA/Asia in November 2006 and in Europe and so on in March 2007.

It was designed with damned expensive hardware that was cutting edge for the time of its release.

You really considered G70/G71 to be "cutting edge" when there already was the (much better) G80 :oops::eek:?

Well duh! Why not put in dual G80s and have even BETTER graphics?!

Not necessarily, considering multi-GPU micro-stuttering is hardly an advantage :???:?

If you want the bestest of the bestest, get whatever size PC case you care for and stuff it with as much PC hardware as you care to buy.

That would only be an argument if there weren't any console-exclusive games and everything were on PC as well, don't you think?

Also, according to what a lot of users are posting here, the "bestest of the bestest" would actually be a gaming console equipped with the latest and greatest technology, and not a PC, because of the overhead...

So that somehow makes it an even less valid argument?

By the way, there was quite a ridiculous statement from "almighty" a few posts above:

And it wouldn't bother me what Sony put in a console as I don't own a PS3 or a 360.

As if the hardware choices for the upcoming "next-gen" consoles wouldn't have any impact on PC gaming for probably the upcoming five to ten years... :rolleyes:

Hint:

Current consoles are holding back PC versions. Even now that some of the newest PC games have decent DX11 support, it remains that the games are designed to run on a much lower base configuration. You can crank up computationally expensive effects to make even very powerful PCs struggle, but the performance penalty from those effects imo is far bigger than the bonus in eye candy we receive.


;)
 
Just an aside, I don't really "get" the whole sales point of the rumored consoles.

Underpowered aka Upgrades: Pitched as "upgrades" instead of "new generation experiences not possible on current hardware"

Uber-waggle: Doesn't work for major gaming genres, often offers a less connected experience in the games where it can work, and developers have shown very little ability to make it compelling (e.g. Kinect; I never did get a proper GAME that offered a good basic workout with basic motions; instead it was all silly waggle or "fitness" games which were no fun)

DLC: Crap-storm of more nickel-and-diming, DRM, contracts, etc.

I am not sure why we are even talking about new consoles when the whining about power and cost budgets funnels the entire experience into the above.

What is the sales point? Ewww aaahhhh cleaner IQ and a new shooter that plays like all my others? Do I really want to plop down $400 and a $10/mo contract, and have core parts of my game held hostage with DLC costs (just look at Madden!)?

I don't see the carrot.
 
What was meant was:

If Europe and so on was apparently delayed to March 2007 anyway, why not delay the USA/Asia to March 2007 as well and use that time to release worldwide with a G80-based GPU in March 2007, for example :mrgreen:?

Instead of releasing with a G70/G71-based GPU in the USA/Asia in November 2006 and in Europe and so on in March 2007.

What a stupid thing to suggest.

You really considered G70/G71 to be "cutting edge" when there already was the (much better) G80 :oops::eek:?

Shifty said DESIGNED, not released; you're confusing the two.

Not necessarily, considering multi-GPU micro-stuttering is hardly an advantage :???:?

You're not a PC gamer, so stop with the assumptions.

Also, according to what a lot of users are posting here, the "bestest of the bestest" would actually be a gaming console equipped with the latest and greatest technology, and not a PC, because of the overhead...

So that somehow makes it an even less valid argument?

Consoles will NEVER offer anywhere near the experience of a dedicated gaming PC.

By the way, there was quite a ridiculous statement from "almighty" a few posts above:

Your point being? I'm still making a lot more sense than you are, pal.

As if the hardware choices for the upcoming "next-gen" consoles wouldn't have any impact on PC gaming for probably the upcoming five to ten years... :rolleyes:

Hint:

;)

Better controls for FPS games, more players online, higher IQ, higher frame rates, more content, MODS, multi-monitor, cheaper games.... Just a few things us PC gamers enjoy without any input or influence from a console.
 
What is the sales point? Ewww aaahhhh cleaner IQ and a new shooter that plays like all my others? Do I really want to plop down $400 and a $10/mo contract, and have core parts of my game held hostage with DLC costs (just look at Madden!)?

I don't see the carrot.

But that's pretty much my point, Acert.

In order to make them attractive, money has to be spent to give them something special. Sure, the first defining bit can always be IQ, but then you have to take on apps, services, and other capabilities beyond gaming. And some of the gaming power has to be used to simply do new things, like fully destructible environments, open-world games where you can enter EVERY building, and games that can be played in teams of 64 or 82 or some hugely insane number now only frequently seen in the PC realm.

All those things take a lot of processing power, and a lot of memory. Oh, and Heaven forbid you want to do multiple things at once!

It's very annoying to be watching episodes of TV on Netflix on the 360 and decide I want to go play a round of Borderlands 2, only to have to exit the app, wait for it to load back to the dashboard, wait for BL2 to load, and then start playing. Even worse going back to Netflix to continue watching episodes.

No ability to pause and keep some parts of the program in memory, so it's almost like a 'tabbed' surfing experience?

These are the kinds of features that should be available next generation; these are the things that, IMO, make it worth spending the $400 on a new system.

A slight bump in IQ? No thanks.
 
user542745831 said:
Buying a product that would feature the latest and greatest technology on launch?

I'm guessing you were a Neo Geo owner and never lowered yourself to play on the plebeian Nintendo, Sega, Sony, or MS systems.

Instead of lobbying the console manufacturers of the masses to change their ways, perhaps it would be more fruitful to contact SNK and inquire when their next console is launching.
 
The 560 Ti alone can hit 218W at max. And that's not at the wall; that's just the card, ignoring PSU efficiency.

http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_660/25.html

Thanks for your reply. But why did you choose to mention that a GTX 560 Ti consumes 218W when, according to the figures you linked to, a GTX 670 (which is much faster than a GTX 560 Ti, isn't it?) only appears to consume 162W?

Your point being? I'm still making a lot more sense than you are, pal.

You think so? How can you say you would not care what hardware will be featured in the upcoming "next-gen" consoles if you consider yourself a PC gamer?

Can you explain how that makes any sense, when it is apparently quite obvious that consoles tend to hold back the visuals of PC game versions, as "Dr Evil" explained earlier?

As a PC gamer, you would want the "next-gen" gaming consoles to be equipped with the latest and greatest hardware available at launch, to hold back the visuals of upcoming PC game versions as little as possible, wouldn't you?

So how does saying you wouldn't care about the hardware of the upcoming "next-gen" gaming consoles make any sense, if you consider yourself a PC gamer?

And apart from that...

And it wouldn't bother me what Sony put in a console as I don't own a PS3 or a 360.

Then why do you post so often in this thread :p;)?

So much for "making sense"... ;)

And to "Rodéric": No, that is not a "troll" post, but just a reply to him.

Or, you can draw the conclusion that if you want to attract the new, larger market of gamers, you have to produce a product that is worth buying - it needs to have a clear and obvious "WOW" factor.

This.

A slight bump in IQ? No thanks.

This.
 
You think so? How can you say you would not care what hardware will be featured in the upcoming "next-gen" consoles if you consider yourself a PC gamer?

Can you explain how that makes any sense, when it is apparently quite obvious that consoles tend to hold back the visuals of PC game versions, as "Dr Evil" explained earlier?

As a PC gamer, you would want the "next-gen" gaming consoles to be equipped with the latest and greatest hardware available at launch, to hold back the visuals of upcoming PC game versions as little as possible, wouldn't you?

Going from 1280x720 @ 30fps with no AA and AF to 2560x1600 @ 60fps with anything up to 32xAA and 16xAF is, in my eyes, a generational leap.
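
Purely on raw pixel throughput, ignoring the AA/AF cost entirely, that jump is close to 9x; a quick back-of-envelope sketch:

```python
# Pixels rendered per second at each target, ignoring AA/AF and
# per-pixel shading cost entirely.
console_rate = 1280 * 720 * 30  # 720p at 30fps
pc_rate = 2560 * 1600 * 60      # 2560x1600 at 60fps

print(f"{pc_rate / console_rate:.1f}x the raw pixel throughput")  # ~8.9x
```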

We have had PC-exclusive games that have been built from the ground up and pushed the PC forward; the STALKER series comes to mind as an example.

And then there's the whole PC modding scene...

Skyrim on PC

[Skyrim screenshot: tesv2012-07-0517-47-1yfq17.jpg]

[Skyrim screenshot: tesv2012-07-0800-20-3wxuab.jpg]


Would that not be considered a generational leap? Because it is to my eyes....
 
Thanks for your reply. But why did you choose to mention that a GTX 560 Ti consumes 218W when, according to the figures you linked to, a GTX 670 (which is much faster than a GTX 560 Ti, isn't it?) only appears to consume 162W?

You were asking how much higher power consumption would go under full load. The entry-level Tiki ($1827) comes with a 560 Ti, which can draw significantly more power than PC games actually cause it to.

The 670 seems to throttle under very heavy load (as do most newer cards), and so it would have been a shitty example.

Consoles can't throttle, of course - they have to be able to handle everything you throw at them.

The 670 comes in the (lol, $2800) top-end system btw, and even that is probably drawing more power at max than (or about the same as) everything in the launch 360.
 
Going from 1280x720 @ 30fps with no AA and AF to 2560x1600 @ 60fps with anything up to 32xAA and 16xAF is, in my eyes, a generational leap.

We have had PC-exclusive games that have been built from the ground up and pushed the PC forward; the STALKER series comes to mind as an example.

And then there's the whole PC modding scene...

Those mods would look even better if the actual unmodded game looked like that from the get-go. Having half a dozen PC exclusives every year also doesn't push the bar as much as the release of a new console would. Developers themselves admit PC games could do much more than they actually do, but there is little incentive to go too far ahead technologically without a new generation of consoles out there.
 
Those mods would look even better if the actual unmodded game looked like that from the get-go. Having half a dozen PC exclusives every year also doesn't push the bar as much as the release of a new console would. Developers themselves admit PC games could do much more than they actually do, but there is little incentive to go too far ahead technologically without a new generation of consoles out there.

The games would benefit from that, but most PC gamers play on PC for the extra IQ...

If the next set of consoles were forced to native 1080p with at least some form of basic FXAA and 8xAF, then I might be tempted to get one.

But as it stands now, going from 24x Edge Detect AA and transparency supersampling in pretty much every game to little or no AA is too big of a drop for me in terms of IQ.

That's why I have 3 GPUs; turn the IQ up to the max!!!!
 
Would that not be considered a generational leap?

No, not necessarily. Wouldn't you rather be disappointed if those current-gen visuals were all that "next-gen" gaming consoles had to offer?

Going from 1280x720 @ 30fps with no AA and AF to 2560x1600 @ 60fps with anything up to 32xAA and 16xAF is, in my eyes, a generational leap.

Increasing the rendering resolution of a current-gen game makes it a "next-gen" game for you?

The 670 seems to throttle under very heavy load (as do most newer cards), and so it would have been a shitty example.

Just chose the GTX 670 as an example because it draws less power than a GTX 560 Ti (which you did mention) according to the figures in the link you provided, while actually being faster ;). Could have chosen the HD 7950 as well, for example, which, according to those figures, also draws less power than a GTX 560 Ti ;).

Was basically just wondering why you chose to mention the GTX 560 Ti when there are actually better and less power-hungry GPUs mentioned in the link you provided.

You were asking how much higher power consumption would go under full load.

Yeah, but that question was referring to the reviewed system "Dr Evil" was posting about over there:

I read a PCWorld review of that Falcon Northwest Tiki PC, and they had it consume 231W under full load.

http://www.pcworld.com/article/2583...eview_gaming_monster_in_a_petite_package.html

Launch 360 was doing around 190W, so the difference is actually not as big as I thought. I don't know if PC games really stress the CPU though. Let's hope they are still going for a system close to 200W.


And that system is apparently equipped with an overclocked Core i7-3770K and a GTX 680 according to that review, see:


pcworld.com said:
http://www.pcworld.com/article/2583...eview_gaming_monster_in_a_petite_package.html

[...]

Even under full gaming load, pushing the 4.3GHz overclocked Core i7-3770K and GTX 680 all out, the overall power consumption was only 231W.

[...]


;)

The entry-level Tiki ($1827) comes with a 560 Ti, which can draw significantly more power than PC games actually cause it to.

See above. The system in question appears to be equipped with a GTX 680.

The 670 comes in the (lol, $2800) top-end system btw

No, the review specifically mentions a GTX 680 :D.

So, to come back to the question:

Is there anyone here who could tell how much more than 231W the system in the review mentioned above (4.3GHz Core i7-3770K + GTX 680) would consume under absolute full load?

The review gives 231W for full gaming load. But what about absolute full load?
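
As a crude upper bound, one could sum the rated TDPs and divide by an assumed PSU efficiency; everything below apart from the nominal TDPs is a guess, so treat it as a sketch rather than an answer:

```python
# Crude upper bound for "absolute full load" (e.g. a synthetic stress
# test hammering CPU and GPU at once). The TDPs are the nominal specs;
# the overclock uplift, system overhead, and PSU efficiency are guesses.
gtx_680_tdp_w = 195    # nominal board TDP
i7_3770k_tdp_w = 77    # nominal TDP at stock clocks
oc_uplift = 1.4        # assumption: a 4.3GHz overclock exceeds stock TDP
rest_of_system_w = 30  # assumption: SSD, motherboard, fans
psu_efficiency = 0.85  # assumption

dc_load_w = gtx_680_tdp_w + i7_3770k_tdp_w * oc_uplift + rest_of_system_w
wall_draw_w = dc_load_w / psu_efficiency

print(f"~{wall_draw_w:.0f}W at the wall vs. the 231W measured while gaming")
```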
 
If the next set of consoles were forced to native 1080p with at least some form of basic FXAA and 8xAF, then I might be tempted to get one.

Forcing developers to meet some arbitrary pixel standard will automagically result in a better-looking end product, amirite?
 