Ahh yes of course, good catch.
If you're not using your own performance experience with the 680 as some kind of example of how much power you'll need to match the XB1 experience, the following statement was a pretty strange one to make:
As for the GPU you mention, that’s a hell of a GPU. A PC doing it justice would be much more expensive than the Xbox One.
But the actual life span of a PC is shorter than that of a console, especially now that the new generation of consoles blurs the line between console and PC performance. Games can now run at higher-than-HD resolutions and better frame rates.
The only console that could handle 1080p in the PS2/GC/Xbox era was the original Xbox. I wonder about the frame rate, but that’s a fact. Now we have consoles that are several times more powerful than those, and you need a PC roughly 200% more powerful to really notice a difference.
And now that the line is blurred, surpassing the capabilities of a console is a tough task.
What can make a PC look as if it has a longer life span is that you can upgrade it, right? You can even keep the same case, which creates the illusion that you are basically using the same rig...
But would it be the same rig you bought four years ago? Certainly not.
You just don't realise how much money you have spent on upgrading the rig to get it to play newer games at 30fps.
Decent gaming CPUs are $100+. Then there is the motherboard, and RAM (many PC gamers have the habit of filling all their DIMMs because it’s cool).
Then there is the graphics card; some GPUs are crazy expensive, like the Titan. Add a decent sound card and the 5.1 system lying around, plus SSDs… display… PSU… UPS…
Not to mention the cash spent on modding your rig with round cables, better fans and those cold cathode lights.
And then the software! Your OS… you do pay for all this stuff, don't you?
So once you sum up all the expenses, you realise how much you paid for your four-year-old rig (half the life span of the Xbox 360) to run Crysis 3 at 30fps today. I’d say the total is a lot more than a $500 console.
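To make the "sum it all up" point concrete, here is a rough tally; every price in it is an assumed placeholder for illustration, not a figure from the post or from any real build:

```python
# Rough, illustrative tally of what a gaming rig can cost over its life.
# All prices below are placeholder assumptions for the sake of the argument.
components = {
    "CPU": 120,
    "Motherboard": 90,
    "RAM": 80,
    "GPU": 250,
    "Sound card": 60,
    "SSD": 100,
    "Display": 150,
    "PSU / UPS": 120,
    "OS licence": 100,
    "Mid-life GPU upgrade": 200,
}

console_price = 500
total = sum(components.values())
print(f"PC total:   ${total}")
print(f"Console:    ${console_price}")
print(f"Difference: ${total - console_price}")
```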
At the pace hardware is improving these days, the PC Digital Foundry is using will become obsolete in no time, whereas the PS4, Wii and Xbox One will improve.
And the Xbox One is a console meant to be backwards compatible forever, which will only extend the console's life cycle.
By that I meant that I do enjoy sitting around with friends or siblings playing a game; it is a different type of multiplayer experience from being connected to a server hundreds of kilometres away that many people you’ve never laid eyes on also connect to.
Additionally, it is fun to sit in a comfortable seat and take turns playing a single-player game. That doesn't work the same way with most PC games.
Granted, but they did choose a PC with bottom-of-the-barrel sound hardware (i.e. onboard sound).
Just for the record, people have a hard time discerning differences in audio. Case in point, this blind test:
http://www.tomshardware.com/reviews/high-end-pc-audio,3733.html
...where, after listening blind to a range of PC audio gear costing anywhere from $2 to $2000, they came to this conclusion:
A $2 Codec Sounds (to us) like a $2000 Device
That's more a comment on the state of the "audiophile" market than on whether you can tell the difference between two different audioscapes.
This sounds to me like the 1080p vs less-than-1080p debate.
Curious. 792p is questionable, but a few posts later Lego at 1200p (to follow the convention of calling 960x1080 "1080p" with no caveats) is awesome because of the free supersample? Wouldn't the same supersampling praise hold true for 720p/768p displays and the inevitable upscale for higher-end displays? Or is there a floor for how far above HD a resolution has to be before the free supersample perk applies?
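For reference, the raw pixel counts behind those numbers work out like this (assuming 792p means a 1408x792 framebuffer and the half-width "1080p" is 960x1080, measured against an ordinary 1920x1080 display):

```python
# Pixel counts behind the resolution debate above. Resolutions follow the
# figures mentioned in the thread; the target is a standard 1080p display.
resolutions = {
    "native 1080p (1920x1080)": 1920 * 1080,
    "half-width '1080p' (960x1080)": 960 * 1080,
    "792p (1408x792)": 1408 * 792,
    "1200p (1920x1200)": 1920 * 1200,
    "720p (1280x720)": 1280 * 720,
}

display = 1920 * 1080
for name, pixels in resolutions.items():
    ratio = pixels / display
    kind = ("downscaled (free supersample)" if ratio > 1
            else "upscaled" if ratio < 1 else "native")
    print(f"{name}: {pixels:,} px, {ratio:.2f}x the display -> {kind}")
```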
I think the point here is that Ryse at 1080p would undeniably look sharper and somewhat better than it already does now.
Not when it would have to drop frame rate or eye candy, or both.
Yeah, but when we see two versions of the same game and one has the better resolution without sacrifices, which one would you pick?
one word
What is in play is a series of trade-offs in one game for one platform.
what happened?
Some new details on Respawn Entertainment’s Titanfall have come to light. The game runs at a solid 60fps as revealed by the developer and it is fully v-synced. That means you’ll never see a torn frame on the screen.
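As a rough illustration of what v-sync means for frame delivery (a simplified double-buffered case on a 60Hz display, not code from either game): a finished frame waits for the next refresh, so only whole frames are ever shown (no tearing) and render time is effectively rounded up to the next refresh interval.

```python
import math

# Illustrative arithmetic only: how v-sync quantises frame delivery at 60Hz.
REFRESH_MS = 1000 / 60  # ~16.7 ms between display refreshes

def displayed_for(render_time_ms: float) -> float:
    """A finished frame is held until the next refresh, so it stays on screen
    for a whole number of refresh intervals."""
    return math.ceil(render_time_ms / REFRESH_MS) * REFRESH_MS

for t in (14.0, 20.0, 30.0, 36.0):
    shown = displayed_for(t)
    print(f"render {t:5.1f} ms -> shown for {shown:5.1f} ms (~{1000 / shown:.0f} fps)")
```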
There's been some discussion about the performance of the game, particularly in terms of a fluctuating frame-rate, and what quickly becomes evident is that Sucker Punch has opted to continue the strategy it used on its PlayStation 3 titles: a solid v-sync working in combination with a completely unlocked frame-rate. The difference here is that while the previous titles in the series would frequently drop beneath the 30fps threshold, it takes a mass of action and GPU-intensive post-processing effects to truly impact inFamous: Second Son's performance. Bearing in mind the high levels of detail, and the overall complexity of the rendering pipeline, that's a stunning achievement.
Certainly, the game is a visual feast. Similar to Guerrilla Games' latest work, inFamous operates with a materials-based deferred renderer, which not only allows for a multitude of dynamic light-sources, but also lights the scene according to the physical properties of the objects present - for example, reflectivity and roughness. An energy-conserving model like the one used here treats light as energy, calculating how it spreads across the surface of a material according to those physical properties. The results can be absolutely beautiful to behold - reflections in particular (what looks like an expert blend of pre-baked and full real-time) can look sublime.
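As a loose illustration of what "energy-conserving" means here (a generic normalized Blinn-Phong sketch, not Sucker Punch's actual shading code): sharpening the specular highlight makes it brighter but narrower, so a material never reflects more energy than its reflectivity allows.

```python
import math

# Generic illustration of energy-conserving shading: normalized Blinn-Phong.
# 'reflectivity' and 'specular_power' map loosely onto the reflectivity and
# roughness properties mentioned above; this is not the game's actual code.
def normalized_blinn_phong(n_dot_h: float, n_dot_l: float,
                           specular_power: float, reflectivity: float) -> float:
    normalization = (specular_power + 8.0) / (8.0 * math.pi)
    return reflectivity * normalization * (n_dot_h ** specular_power) * n_dot_l

for power in (16.0, 256.0):
    peak = normalized_blinn_phong(1.0, 0.8, power, 0.04)   # highlight centre
    edge = normalized_blinn_phong(0.95, 0.8, power, 0.04)  # slightly off-centre
    # Higher power: brighter peak but much faster falloff, so the total
    # reflected energy stays bounded instead of growing.
    print(f"power {power:>5}: peak {peak:.4f}, off-centre {edge:.6f}")
```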
Also worthy of note is the implementation of state-of-the-art anti-aliasing, believed to be a variant of SMAA T2X, as found in Crysis 3. This is one of the best post-process anti-aliasing techniques we've seen, combining a new take on MLAA with a temporal element. Edge-smoothing is phenomenal, and while there is some ghosting, it is not any kind of real distraction during gameplay.
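For a sense of how the temporal element of a technique like SMAA T2X works (a generic sketch under common assumptions, not the actual implementation): the current pixel is blended with its reprojected value from the previous frame, and that history is clamped to the current neighbourhood, which is also why a little ghosting can slip through.

```python
# Generic sketch of the temporal half of a temporally-filtered AA pass.
def temporal_resolve(current: float, history: float,
                     neighborhood_min: float, neighborhood_max: float,
                     blend_toward_history: float = 0.9) -> float:
    # Clamp the reprojected history sample to the range of the current pixel's
    # neighbourhood to limit ghosting on moving or changing content.
    history = max(neighborhood_min, min(history, neighborhood_max))
    # Lean heavily on history so edges are smoothed by accumulation over time.
    return blend_toward_history * history + (1.0 - blend_toward_history) * current

# A stale bright history over a now-dark pixel is clamped before blending.
print(temporal_resolve(current=0.20, history=0.90,
                       neighborhood_min=0.15, neighborhood_max=0.35))
```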
I didn't question the trade-offs between resolution and visual detail, and I do recognize that Ryse achieved certain results thanks to its lower resolution. The point is these trade-offs would have been smaller if the XB1 had more performance.
Talk about theoretical situations all you want, but that's not the situation that's in play. What is in play is a series of trade-offs in one game for one platform.