Next-gen console > Current PC? (at launch)

Especially considering the rumored specs were only 4GB... nice to see they listened. I've always heard that was one of the developers' biggest complaints about most consoles: needs more RAM.

Anyway, now that we've gotten a taste of next-gen, are the opinions here still the same? I'd be really curious to see how well my system could play some of those titles, like Second Son, Killzone and Deep Down. Hell, even Knack had some really impressive lighting going on.
 
So "will the PS4 be out before graphics cards with 8GB of GDDR5?" is one of the questions that can answer the OP :p

Can I count professional cards?

But seriously, my system has 19GB of RAM, 3GB of which is GDDR5 at nearly 2x the speed of the PS4's.
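(For reference, GDDR5 bandwidth is just the effective transfer rate times the bus width. A quick back-of-the-envelope sketch in Python below; the PS4 figure is the announced 176 GB/s spec, while the 384-bit / 6 Gbps card is only an assumed stand-in for a current high-end GPU, so the ~1.6x multiplier it prints is illustrative rather than anyone's exact number.)

```python
def gddr5_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    # bytes/s = transfers/s * bits per transfer / 8, reported in GB/s
    return transfer_rate_mt_s * 1e6 * bus_width_bits / 8 / 1e9

ps4  = gddr5_bandwidth_gb_s(5500, 256)   # announced PS4 spec: 8GB GDDR5, 176 GB/s
card = gddr5_bandwidth_gb_s(6000, 384)   # assumed high-end card: 384-bit @ 6 Gbps

print(f"PS4:  {ps4:.0f} GB/s")                        # ~176 GB/s
print(f"Card: {card:.0f} GB/s ({card / ps4:.2f}x)")   # ~288 GB/s, ~1.64x
```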
 
So "will the PS4 be out before graphics cards with 8GB of GDDR5?" is one of the questions that can answer the OP :p

Even 4GB cards may be pushing it in some scenarios now, but I don't think 6GB will be an issue, given that both consoles will have a similar amount left after the OS reserve is taken out, and considering that the 8GB also serves as system RAM, which is of course a separate pool on PCs. I certainly do see this as rapidly driving up PC GPU RAM configurations though, so I'd say there's a fairly good chance that we'll see 8GB and maybe even 12GB configs when the 8xxx and 7xx series launch later this year.
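(To make the unified-vs-split-pool point concrete, here's a rough budget sketch in Python. The OS reserve and the PC configuration are placeholder assumptions, not confirmed figures, so the numbers are only illustrative.)

```python
# Rough memory-budget sketch -- the reserve and PC figures are assumptions only.
console_total_gb   = 8.0    # unified GDDR5 pool shared by OS, game data and graphics
console_reserve_gb = 1.0    # assumed OS/media reserve (not confirmed)
console_game_gb    = console_total_gb - console_reserve_gb  # code, data AND graphics

# On a PC the same data is split across two separate pools.
pc_system_ram_gb = 8.0      # DDR3 holding the CPU-side game data
pc_vram_gb       = 4.0      # GDDR5 holding the graphics assets

print(f"Console: {console_game_gb:.1f} GB shared between game data and graphics")
print(f"PC: {pc_vram_gb:.0f} GB VRAM + {pc_system_ram_gb:.0f} GB system RAM (separate pools)")
```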
 
What blows my mind is how far ahead of the Wii U it is. lol.

We're sure going to see a change in graphics quality in multiplatform games with this kind of power upgrade. So much for current video cards being desirable for long.
 
What blows my mind is how far ahead of the Wii U it is. lol.

We're sure going to see a change in graphics quality in multiplatform games with this kind of power upgrade. So much for current video cards being desirable for long.

I still don't see what's undesirable about a current (high end) video card as long as it's got plenty of RAM.

4GB should see us through the first couple of years without problems I'd have thought - even 3GB may be enough for that. Once the games really start optimising though and making the most of all that memory then PC gamers may be forced to pick up larger memory configurations.

I'm gonna go out on a limb and guess even my paltry 2GB will see me fine for the first year or so.
 
When Xbox 360 and PS3 launched with 256-512MB video RAM, we didn't stick with our 256MB cards for very long. Most PC people are currently on 512MB-1GB, because they haven't seen a reason to upgrade, and because >1GB has been high-end cards that aren't high-volume. You can still game pretty well on a 512MB card if you don't crank texture settings.

I can't wait to see what moving from 512MB to 8GB does for asset quality in games. But then there's also the problem of development costs for ultra high resolution assets.
 
When Xbox 360 and PS3 launched with 256-512MB video RAM, we didn't stick with our 256MB cards for very long. Most PC people are currently on 512MB-1GB, because they haven't seen a reason to upgrade, and because >1GB has been high-end cards that aren't high-volume. Hopefully PC hardware doesn't hold them back. I can't wait to see what moving from 512MB to 8GB does for asset quality in multiplatform games.

256MB cards were around for a good while after the current generation launched. The 8800GT for example launched almost a year later in a 256MB configuration and there were lots of 79xx and X19xx products doing the same. We did eventually have to move to 512MB, but for how long was a 256MB 8800GT able to outperform the current consoles?

And PC hardware would never hold back console gaming. If the PC can't keep up, it simply won't get the port. I agree it's about time the baseline was shifted though, and at least on the memory front the shift appears to be massive. This should spell the end for 32-bit OSes once and for all.
 
The <512MB G80/G92/RV670 cards were sometimes inadequate for games of 2007 and reviews usually spotted stuttering problems. I remember the 320MB G80 was particularly bad.
 
A driver update made the 8800GTS 320MB perform better. But the 8800GT 256MB was the biggest piece of crap of the G80/G92 line.
 
Here's how the 8800GT 256MB performed in some of the most demanding games at the end of 2007. At 1280x1024 (higher than console resolution) with 4xMSAA it still seems to be significantly faster across the board. That's 2 clear years after the 360 launched with half the overall memory.

http://www.anandtech.com/show/2396/5

Since the new systems seem to be reserving a lot more for the OS and media functionality, we can probably get away with a bit more this time round. How much is the new Xbox supposed to reserve again? 3.5GB? If true, then at least in relation to that console, that would be roughly equivalent to around 2.75GB lasting through the first 2 years of the console's life. That obviously equates to a 3GB GPU, which would probably be sufficient to keep up with the PS4 during that time period as well.
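(Spelling that scaling out, here's one way to read it as a Python sketch. The 3.5GB reserve is the rumour quoted above; everything else is an assumption, and the exact ratio behind the 2.75GB figure isn't stated, but both readings below land in the same 2-4GB ballpark as the 3GB conclusion.)

```python
# Last-gen analogy: the 8800GT 256MB had half the 360's total memory and still
# held up two years in. The 3.5GB reserve below is the rumour quoted above;
# the rest are assumptions for illustration.
xbox360_total_mb = 512
card_2007_mb     = 256
ratio            = card_2007_mb / xbox360_total_mb            # 0.5

nextbox_total_gb   = 8.0
nextbox_reserve_gb = 3.5                                      # rumoured OS/media reserve
game_visible_gb    = nextbox_total_gb - nextbox_reserve_gb    # 4.5 GB left for games

print(f"Half of total memory:        {nextbox_total_gb * ratio:.2f} GB")  # 4.00 GB
print(f"Half of game-visible memory: {game_visible_gb * ratio:.2f} GB")   # 2.25 GB
```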
 
The catch is that the graph is lower on the 256MB card probably because of an unstable frame rate: big drops mixed in with moments of speed equal to the 512MB card.

Like I've said somewhere here, I have been playing with a 512MB 4850 and the hitching is obvious in quite a number of games from the past few years. GPU-Z can monitor memory usage. But yeah, you can just drop some texture-related settings and/or forgo MSAA, or use shader AA. I figure a 1GB 4850 would be quite acceptable for almost any game right now, assuming 1680x1050 or lower.

What's unclear is how efficiently video memory is used on Windows compared to console. For example I've found that disabling desktop composition seems to improve gaming on a 512MB card, perhaps because Aero is entirely killed. On console they have total control whereas on Windows it's driver and API voodoo.
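(As an aside, the composition state can be queried and, on Vista/Win7, toggled through dwmapi. Below is only a minimal Windows-only Python/ctypes sketch of that; on Windows 8 the disable request is simply ignored, since composition can no longer be switched off. A fullscreen-exclusive game on Win7 ends up disabling composition automatically anyway, which is roughly the same effect.)

```python
import ctypes

dwmapi = ctypes.windll.dwmapi  # Windows-only

# Query whether desktop composition (Aero) is currently on.
enabled = ctypes.c_int(0)
dwmapi.DwmIsCompositionEnabled(ctypes.byref(enabled))
print("Composition enabled:", bool(enabled.value))

# Ask DWM to turn composition off for this session.
# Works on Vista/Win7; Windows 8 accepts the call but keeps composition on.
DWM_EC_DISABLECOMPOSITION = 0
dwmapi.DwmEnableComposition(DWM_EC_DISABLECOMPOSITION)
```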

3.5GB reserved for OS on Xbox 3 sounds wrong to me. Just think of how much background junk you'd need on Windows to do that.
 
What's unclear is how efficiently video memory is used on Windows compared to console. For example I've found that disabling desktop composition seems to improve gaming on a 512MB card, perhaps because Aero is entirely killed. On console they have total control whereas on Windows it's driver and API voodoo.

Hmmm, I hadn't really thought about it, but I wonder if that's why the overall desktop UI feels so much faster and more responsive in Win8 than it does in Win7. I'd test it on Win7 but I just realized I don't have any machines here that have Win7 on them anymore (other than WHS, which doesn't have Aero).

Not saying that the UI and everything was slow in Win7, but Win8 just feels so much faster and smoother when doing just about anything on the desktop.

Regards,
SB
 
Hmmm, I hadn't really thought about it, but I wonder if that's why the overall desktop UI feels so much faster and more responsive in Win8 than it does in Win7. I'd test it on Win7 but I just realized I don't have any machines here that have Win7 on them anymore (other than WHS, which doesn't have Aero).

Not saying that the UI and everything was slow in Win7, but Win8 just feels so much faster and smoother when doing just about anything on the desktop.

Regards,
SB

Windows 8 requires fewer resources than Windows 7. Windows 7 required the same as Vista. It's been 6 years since the minimum specs for Windows have gone up. And as I said, 8 uses even less than those OSes, so we are actually under 2006 resource usage for Windows.
 
I'm confused about what standard we are using for "Current PC". Are we talking 2 GeForce Titans, which are $2,000 together, or your average "mid-range" PC? The problem is that a console that's supposed to sell for $399-499 at launch will never be able to compete with someone willing to drop $2,000 on video cards alone.

As such, the actual question we need to ask is "Is a next-gen console going to be so laughably outdated in 6 years that buying a PC is a better deal?" And that really just comes down to die shrinks and what you're willing to pay for. Going below 28nm in the next 6 years will absolutely be harder than the transition from 90nm to 28nm we had in the last 6. I actually think the pace of new GPUs is going to slow down (it already has), because new process fabs are getting more expensive to build and node transitions are harder to pull off.
 
The OP suggests top-of-the-line PCs. Is it going to compete in price? Nope, but that probably doesn't matter to people who already have those PCs.
 
Generally speaking, a PC that can play a current cross-platform AAA console release (Dead Space, Assassin's Creed, Far Cry, Tomb Raider, etc) at 1080p60. This is more or less the target for a "hardcore" PC gamer these days. So a decent rig, but not necessarily top of the line.

My PC is an i7-2600K @ 3.4GHz, 16GB DDR3, and a GTX 680; I would consider it more powerful than that, since there's hardly a game I can't play at 1080p60. So maybe something like an i5, 4GB RAM, and a GTX 570, or something around there.

It's a little easier now that we've seen what the new system is actually capable of. Killzone was running at 1080p30; my general question was to figure out whether today's PCs can run games like that, and if so, how well. Can I pull off a full 60fps? Or will I be limited to around the same as the console version at 30fps? Will we still get things like better textures and shaders than the PS4 can do?
 
Generally speaking, a PC that can play a current cross-platform AAA console release (Dead Space, Assassin's Creed, Far Cry, Tomb Raider, etc) at 1080p60. This is more or less the target for a "hardcore" PC gamer these days. So a decent rig, but not necessarily top of the line.

My PC is an i7-2600K @ 3.4GHz, 16GB DDR3, and a GTX 680; I would consider it more powerful than that, since there's hardly a game I can't play at 1080p60. So maybe something like an i5, 4GB RAM, and a GTX 570, or something around there.

It's a little easier now that we've seen what the new system is actually capable of. Killzone was running at 1080p30; my general question was to figure out whether today's PCs can run games like that, and if so, how well. Can I pull off a full 60fps? Or will I be limited to around the same as the console version at 30fps? Will we still get things like better textures and shaders than the PS4 can do?

I really didn't see anything in the Killzone reveal that couldn't be done on your rig or even something with slightly lower specs at 1080p 60.

It was nice but isn't even a shadow of what Crysis 2 or 3 can do on PC.

Regards,
SB
 
So "will the PS4 be out before graphics cards with 8GB of GDDR5?" is one of the questions that can answer the OP :p

Irrelevant because the PS4 doesn't have 8GB of exclusive graphics RAM. The OS and other game-related data will shave off quite a bit of that, leaving the amount available for graphics lower than 8GB!
 
And even lower on Xbox 3 then...
And how about the VR goggles? It's rumored that they will come to Xbox 3 sometime in the future.

If that's true, then the PC with the Oculus Rift gets an earlier start than the console. But maybe that will help VR on Xbox 3 then...
 
Generally speaking, a PC that can play a current cross-platform AAA console release (Dead Space, Assassin's Creed, Far Cry, Tomb Raider, etc) at 1080p60. This is more or less the target for a "hardcore" PC gamer these days. So a decent rig, but not necessarily top of the line.

My PC is an i7-2600K @ 3.4GHz, 16GB DDR3, and a GTX 680; I would consider it more powerful than that, since there's hardly a game I can't play at 1080p60. So maybe something like an i5, 4GB RAM, and a GTX 570, or something around there.

It's a little easier now that we've seen what the new system is actually capable of. Killzone was running at 1080p30; my general question was to figure out whether today's PCs can run games like that, and if so, how well. Can I pull off a full 60fps? Or will I be limited to around the same as the console version at 30fps? Will we still get things like better textures and shaders than the PS4 can do?

The problem is that in this generation, and certainly the next, developers will be willing to sacrifice resolution and/or frame rate in order to put out better graphics. When your average console is hooked up to a 40" HDTV set 7-8 feet away, running a game at 1080p isn't as much of a priority as it is on a PC monitor, where the 1:1 pixel mapping makes the difference obvious. My proof of this is that the average person who owns both a 360 and a PS3 probably won't be aware that the home screen is 720p on the 360 but 1080p on the PS3 (and that's with text, where it's super easy to notice).
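(That viewing-distance argument is easy to sanity-check with a bit of trigonometry. A rough Python sketch below, using the common ~1 arcminute acuity rule of thumb; the TV size and distance are the example numbers from the post above, and the 24" desktop monitor at 2 feet is an assumed comparison point.)

```python
import math

def arcmin_per_pixel(diagonal_in, horizontal_pixels, distance_in, aspect=(16, 9)):
    # Panel width from its diagonal, then the visual angle one pixel
    # subtends at the given viewing distance, in arcminutes.
    width_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    pixel_in = width_in / horizontal_pixels
    return math.degrees(math.atan2(pixel_in, distance_in)) * 60

tv_distance_in = 7.5 * 12   # 7-8 feet away, in inches
print(f"40in TV, 1080p: {arcmin_per_pixel(40, 1920, tv_distance_in):.2f} arcmin/pixel")
print(f"40in TV,  720p: {arcmin_per_pixel(40, 1280, tv_distance_in):.2f} arcmin/pixel")
print(f"24in monitor, 1080p at 2ft: {arcmin_per_pixel(24, 1920, 24):.2f} arcmin/pixel")
# On the TV both values land around or below the ~1 arcmin acuity ballpark,
# while the desktop monitor's pixels sit well above it.
```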
 