Next-gen console > Current PC? (at launch)

I really didn't see anything in the Killzone reveal that couldn't be done on your rig, or even something with slightly lower specs, at 1080p60.

It was nice but isn't even a shadow of what Crysis 2 or 3 can do on PC.

Regards,
SB

Killzone was developed for 4GB or less, though. The game has been in development for 2.5 years and I understand it to be practically finished by now, but the developers only learned about the memory upgrade very recently.

I think the amount of RAM taken by the OS on the PS4 isn't going to be much more than 1GB, and graphics data is going to be the brunt of the memory used by any game. The PC definitely benefits from system RAM for audio and various other things, though, and can stream textures in from there a lot faster than from disc. ;)

My i7 PC, bought half a year ago for 849 euro, has 8GB of RAM (I think at about 25 GB/s) and 1GB of GDDR5. I think it will take longer than last gen for a sizeable group of PC owners to catch up to PS4 specs (with more money going to tablets and laptops besides), but at least everyone will benefit from a similar architecture.
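
For what it's worth, that 25 GB/s figure checks out as a rough estimate. A quick sanity check in Python, assuming dual-channel DDR3-1600, which is just my guess at what a machine like that shipped with:

    # Peak system RAM bandwidth, assuming dual-channel DDR3-1600
    # (an assumption; adjust for the actual memory kit).
    channels = 2
    bytes_per_transfer = 8      # 64-bit bus per channel
    transfers_per_sec = 1600e6  # 1600 MT/s

    peak = channels * bytes_per_transfer * transfers_per_sec
    print(f"{peak / 1e9:.1f} GB/s")  # -> 25.6 GB/s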
 
Problem is that in this generation, and certainly the next, developers will be willing to sacrifice resolution and/or frame rate in order to put out better graphics. When your average console is hooked up to a 40" HDTV set 7-8 feet away, running a game at 1080p isn't as much of a priority as it is on a PC monitor, where you sit close enough that the 1:1 pixel mapping really shows. My proof of this is that the average person who owns both a 360 and a PS3 probably won't be aware that the home screen is 720p on the 360 but 1080p on the PS3 (and that's with text, where it's super easy to notice).

I actually find the difference between 720p and 1080p on my 50" TV from around 10 feet to be very obvious (I need to run 720p on the PC through the TV for use with 3D). I do think your point stands, though: developers will obviously sacrifice resolution and framerates for greater core graphics. One of the benefits of gaming on a PC will be, as it has been this generation, the ability to run the same assets at higher resolutions / image quality. I'm certainly hoping for better assets/effects too though, hopefully much sooner in the generation than last, because of the relatively better position PCs will be in this generation.
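
A crude back-of-the-envelope actually supports that. Using the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree (just a rule of thumb, not a hard limit), a 50" 16:9 set at ten feet puts 720p right at the acuity threshold and 1080p well past it, which is exactly where aliasing would make the difference stand out:

    import math

    # Pixels per degree on a 50" 16:9 TV viewed from 10 feet,
    # compared with the ~60 px/deg rule of thumb for 20/20 acuity.
    diag_in, distance_in = 50.0, 120.0
    width_in = diag_in * 16 / math.hypot(16, 9)
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))

    for h_res in (1280, 1920):
        print(f"{h_res} wide: {h_res / fov_deg:.0f} px/deg (limit ~60)")
    # -> 1280 wide: 62 px/deg; 1920 wide: 93 px/deg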
 
I think it will take longer than last gen for a sizeable group of PC owners to catch up to PS4 specs (with more money going to tablets and laptops besides), but at least everyone will benefit from a similar architecture.

In memory size I've no doubt you're correct. It'll be a good while before, say, 6GB becomes the "standard" amount on PC GPUs. In terms of processing power, though, I think the opposite is true. So I guess it depends how things play out between those two factors.

How will having more processing power but less memory affect the balance of power for the "average" gaming PC? The high end will have both, I'm sure, but that's not the only game in town.
 
Generally speaking, a PC that can play a current cross-platform AAA console release (Dead Space, Assassin's Creed, Far Cry, Tomb Raider, etc.) at 1080p60. This is more or less the target for a "hardcore" PC gamer these days. So a decent rig, but not necessarily top of the line.

My PC is an i7-2600K @ 3.4GHz, 16GB DDR3, and a GTX 680; I'd consider it more powerful than that, since there's hardly a game I can't play at 1080p60. So maybe something like an i5, 4GB RAM, and a GTX 570 or something around there.

I think it's more accurate (as swaaye recently suggested) to picture the PC market as a bell curve, with power on the x axis and number of users on the y. Obviously discount any and all PCs not being used for core gaming, and then see where the consoles sit on that curve.

Then the question you need to ask is how many years it will be before the consoles move far enough to the left to be at the peak of the curve, and then compare that timeframe to the current generation. I can't be arsed with all that right now though lol.
 
Problem is that in this generation, and certainly the next, developers will be willing to sacrifice resolution and/or frame rate in order to put out better graphics. When your average console is hooked up to a 40" HDTV set 7-8 feet away, running a game at 1080p isn't as much of a priority as it is on a PC monitor, where you sit close enough that the 1:1 pixel mapping really shows. My proof of this is that the average person who owns both a 360 and a PS3 probably won't be aware that the home screen is 720p on the 360 but 1080p on the PS3 (and that's with text, where it's super easy to notice).
This is certainly true. I was always aware of the UI difference between the two, as the X360 interface always looked "fuzzy" next to the PS3 interface (I used to have both systems; I sold my Xbox a few months ago).

Unfortunately, I am one of the ones who can see the difference. Not so much with video content (1080p is just slightly sharper, but 720p doesn't look bad by any stretch), but definitely with aliased game content. Then again, I have a 60" TV viewed from about ten feet. 720p games look good, nice and sharp, but the pixels are definitely noticeable to me. 1080p is pretty much flawless, and you can get away with a much more meager AA solution, I think, since the individual pixels are much harder to see anyway.

As for PS4, the only one we know for sure about is Killzone, running at 1080p. I hope other developers follow suit, and finally give us the "True HD" gaming they promised us seven years ago.

Another big factor for me is 3D support. That big 3D TV of mine is only about a month old, and it's very disconcerting that Sony appears to be dumping 3D, or at least not pushing it on PS4 as much as they did with PS3; they're leaving it up to the developers whether to support the feature or not.

The reason I bring it up is that I also have a 3D monitor on my PC, and there are games I can play there in 3D that don't support 3D on the PlayStation. That alone could be a deciding factor when choosing which version of a game to get, if it's one that I think could benefit greatly from 3D and the support for that game is much stronger on PC than on console. For example, the upcoming Tomb Raider game, which I'm still on the fence about. I'm leaning more toward PC, partially because of what appears to be proper PC support, as well as the very likely chance that it will work well with 3D Vision, while the console versions don't appear to support 3D at all. That's actually one thing I'm waiting to find out in the next couple of weeks: how well the game works in 3D on PC.
 
As for my OP, I think at this point, we just need to take a "wait and see" approach. I think Watch Dogs is the only PC cross-platform game confirmed at this time? We'll just have to see how it compares between high-end PC and PS4.
 
For example, the upcoming Tomb Raider game, which I'm still on the fence about. I'm leaning more toward PC, partially because of what appears to be proper PC support, as well as the very likely chance that it will work well with 3D Vision, while the console versions don't appear to support 3D at all. That's actually one thing I'm waiting to find out in the next couple of weeks: how well the game works in 3D on PC.

If you're comparing to the current-generation consoles then surely that's an absolute no-brainer. The game will at best be native 720p/30fps on the consoles, which means for 3D you'll either get a crappy z-buffer-based implementation or a dire decrease in graphics quality to accommodate rendering double the frames.

Depending on your PC, you can potentially run the same game in 3D at 1080p/60fps with all the extra DX11 goodies on top.
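
Just to put rough numbers on the fill-rate side of that (back-of-the-envelope only, ignoring everything but raw pixel counts):

    # Pixels per second for each scenario; full stereo renders every frame twice.
    def mpx_per_sec(w, h, fps, views=1):
        return w * h * fps * views / 1e6

    console_2d = mpx_per_sec(1280, 720, 30)
    console_3d = mpx_per_sec(1280, 720, 30, views=2)
    pc_3d      = mpx_per_sec(1920, 1080, 60, views=2)

    print(f"console 720p/30 2D: {console_2d:.0f} Mpx/s")
    print(f"console 720p/30 3D: {console_3d:.0f} Mpx/s (double the fill)")
    print(f"PC 1080p/60 3D:     {pc_3d:.0f} Mpx/s "
          f"(~{pc_3d / console_2d:.0f}x the console 2D load)")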
 
We might actually see PC gamers reduce resolution in 2014 if the Oculus Rift is as good as impressions make it out to be. I believe the final specs call for 1920x1080. A lot of gamers might step back from multi-monitor gaming for these, the end result being ever more processing power aimed at lower resolutions.


As for RAM in next-gen consoles: the PS4 has 8 gigs. Subtract at least 1 gig for the OS and you're at 7, and that still has to hold all the code the CPU needs to run. On a PC it's easy and cheap to get 16 gigs of main memory plus a 3-6 gig video card, and that will only become more true at even lower price points as time marches on.

Both IHVs are rumored to have brand-new chips out by the end of the year, which will push the bar even higher.
 
The majority of PC gamers run at 1080p anyway. The ones you hear about, supersampling or running multi-monitor setups, are in the minority; it's just a very vocal minority, which gives the appearance that most PC gamers do this when they don't. Hell, I can't even get supersampling to work on my system at all; whenever I try to activate it, my card overrides it and goes back to default. It possibly has to do with the 3D monitor: rendering must occur at my native resolution, or the 3D effect won't work. The only game I have that supports it internally is Project CARS, and the final output is still 1080p as far as my system is concerned. That game doesn't support 3D, though.

I have zero faith that the Oculus Rift will change anything. No matter how impressive it may be, it's still a VR headset, and that's been done before. The OR may do it better than previous attempts, but it's still the same tech, with all the same problems. I seriously doubt it will ever reach more than maybe 0.5-1% of PC gamers. Most people complain about wearing glasses for normal 3D, they don't want to wear something even larger and heavier. I sure as hell don't.
 
I could see OR becoming popular for first person fans if it is a truly unique experience. Depending on price of course.
 
Headgear is cool for people that want to relive having to wear a helmet when they were little so they didn't hurt themselves.

Is that negative enough for you? It's always going to be niche. I'm glad they're working on the tech, but the reality is that most people won't be buying or using a wearable display.
 
Like most people are still on 2008 hardware too. It runs WoW and Minecraft fine, after all. Exciting.
 
Like most people are still on 2008 hardware too. It runs WoW and Minecraft fine, after all.

More like people who don't need glasses don't choose to wear them all the time. In fact, many people who do need glasses don't wear them all the time.

Buying something new and faster than what you have is normal. People need to be convinced to buy something different. If the adoption and use of 3D TVs is any indication of how VR will be adopted, they're in for really tough sledding.
 
I could see OR becoming popular for first person fans if it is a truly unique experience. Depending on price of course.

They want to get it down to under $300; Palmer has even mentioned perhaps $200 in one interview. Granted, I expect that to go up once they factor in the additional hardware needed to support translational/positional tracking.

The issue with adoption is really going to come from the software and the potentially extremely high performance demands. If (when) they end up going with a 120Hz screen, they're pretty well going to need stripped-down, custom-built games to deliver that minimum 120fps. Multi-GPU systems don't necessarily become a savior there either, as AFR is completely useless due to the latency.
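
To illustrate the AFR point with rough numbers (render latency only; sensor and display delays come on top):

    # Why AFR doesn't help VR: it doubles frame *throughput*, not frame *latency*.
    target_hz = 120
    present_interval_ms = 1000 / target_hz  # 8.33 ms between displayed frames

    # A single GPU hitting 120 fps finishes each frame in ~one interval.
    single_gpu_ms = present_interval_ms
    # With 2-way AFR at the same output rate, each GPU spends two intervals
    # on its frame, so the image you see reflects ~16.7 ms old input.
    afr_ms = 2 * present_interval_ms

    print(f"single GPU: ~{single_gpu_ms:.1f} ms render latency")
    print(f"2-way AFR:  ~{afr_ms:.1f} ms render latency (worse for head tracking)")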
 