120 fps at 4K can still look fake

babcat: I would love to see a dev take an Xbox One or PS4, target 720p, and make the best-looking game they can. However, I sit much further away from my TV than I do from my computer. On top of that, multiple screens bring a level of immersion I can't get from a TV, no matter the size.

So higher resolutions are still important on the PC side.
 
If a developer made a game at 720p and 30fps that appeared similar to a movie, I think it would be a hit.
Most gamers would run away screaming if such a game required just the slightest amount of interaction, due to the lag and the blur.
 
Higher resolutions are great, but most games today aren't suffering from a lack of pixels. They suffer from a lack of quality pixels. Environment detail and texture quality are universally terrible relative to the resolutions at which they are rendered.

It's simply a matter of cost. It's very easy to flip a switch and tell the hardware to create more pixels. It's much more expensive and time-consuming for artists to create higher-fidelity assets.
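For a rough sense of scale, here's a quick Python back-of-envelope comparing raw pixel counts (illustrative only; real GPU cost doesn't scale perfectly linearly with pixel count, but it shows why resolution is just a slider while asset fidelity is artist hours):

# Raw pixel counts per frame at common resolutions, relative to 720p.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>6}: {pixels:>9,} pixels per frame ({pixels / base:.1f}x 720p)")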
 
What is pathetic is that if I bought a GTX 980 today, installed Crysis 3, and set all settings to max (except useless AA), most of that power would be going to waste.

You are calling for photorealistic faces and consider AA useless :oops::cry: there are few things more anticlimactic than photorealistic jaggies.
 
This thread wrongly teases me about a 4K 120Hz monitor, which plainly doesn't exist yet and would hardly be supported by video cards. The GTX 980/970 would perhaps be the first Nvidia cards able to drive one, using a couple of DisplayPort cables to connect the monitor.
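A rough Python sanity check of the bandwidth involved (approximate numbers, ignoring blanking overhead, which only makes it worse):

import math
# Uncompressed 4K at 120 Hz, 24 bits per pixel (8 bits per RGB channel).
width, height, refresh, bpp = 3840, 2160, 120, 24
pixel_rate = width * height * refresh          # pixels per second
required_gbps = pixel_rate * bpp / 1e9         # raw pixel data only, Gbit/s
dp12_gbps = 4 * 5.4 * 8 / 10                   # DP 1.2: 4 lanes x 5.4 Gbit/s, 8b/10b coding
print(f"Needed : ~{required_gbps:.1f} Gbit/s (before blanking overhead)")
print(f"DP 1.2 : ~{dp12_gbps:.2f} Gbit/s effective per cable")
print(f"Cables : {math.ceil(required_gbps / dp12_gbps)}")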

I gamed for many years at 100Hz (even though it was just 1024x768, with 2x MSAA and later 4x MSAA); it's worth it and makes a tremendous difference. Even if your framerate doesn't keep up, or rather especially if your framerate doesn't keep up: no waiting for your monitor, and no hard choice between waiting even longer on the monitor or huge, long-lived tearing.
I never got very interested in the "new" way as a result: >2-core CPUs at >3GHz with a >1 TFLOPS GPU, all to play the diminishing-returns game (after S.T.A.L.K.E.R. and Crysis 1, new games look about the same in a broad sense) and still be stuck at 60Hz.
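A toy Python illustration of why the fast monitor still pays off under vsync even when the GPU can't keep up (the 18 ms render time is just an assumed example):

import math
# With vsync, a finished frame waits for the next refresh tick; ticks come
# more often on a faster monitor, so missing one costs less.
render_ms = 18.0
for refresh_hz in (60, 100):
    tick_ms = 1000.0 / refresh_hz
    displayed_ms = math.ceil(render_ms / tick_ms) * tick_ms  # wait for next tick
    print(f"{refresh_hz:>3} Hz: frame shown after {displayed_ms:.1f} ms "
          f"(~{1000.0 / displayed_ms:.0f} fps effective)")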
 
Games are slowly moving there; the difference is that developers who aim for realism aren't fixated on just one aspect (in your example, the character).
They have trees, cars, guns, etc. that need to match the character's level of realism.

You also need to take into account that making realistic games for people with GTX 980s is financially unviable.


And animations, interactions with the environment - including with people, and the consequences of doing something awry - that's all incredibly hard.
I would like a game where people ask you to stop when they're giving a presentation (or briefing) and you're constantly switching the lights off and on, or standing in front of the projector, or piling up chairs and junk on people's desks.
Just some reaction; they don't need to send the security guys every time, which would undoubtedly turn into needless murder. What would be awesome is some dynamic, context-sensitive composition of sentences spoken by the AI characters (even if your player character remains speechless, or has a choice of one or more lines at scripted events).

Of course, to create the perfect game you'll need at least five years and a really huge budget; it's becoming a little Manhattan Project.

Now, about the problem of limited audiences: the PS4 and Xbox One sort of solve that, if you're not unhappy with the results - keep in mind we'll have years of content creation, compute shaders, etc. maturing; and PC games will probably be the better for it.


------

Another way I'd like to see is a return of arcade gaming. If you want to make a ridiculously impressive game tailored to incredibly powerful hardware, why not do it as an arcade cabinet? (With PC hardware in it, of course, perhaps with a downclocked GPU to save on power.)
You could then have ridiculous realism, or something completely unrealistic but with ridiculously good lighting and IQ, looking like some older CGI reels.

The bonus would be (aside from not having to acquire and set up hardware, the game, and even a Microsoft/Sony/whatever account) that on an arcade game you wouldn't spend 80% of the time in tutorials, cut scenes and exploration.
But maybe the economic model is doubtful, or the game would be too expensive to play (I remember those rip-off racing games of the late 90s/early 00s: drop some serious money, like four times the cost of playing older arcade games, play for barely a minute, and it's game over for missing the second checkpoint).
 
IMHO, the OP is part of the reason why we don't see his wish come true - a chicken-and-egg problem.

Without sufficient graphics horsepower out in the wild (among other requirements), there won't be games optimized for this kind of visual experience. So refraining from buying accordingly specced hardware actually worsens the lack of content, since the target audience - the ones with monster machines who could actually run the proposed game at low res - stays small.
 
Another way I'd like to see is a return of arcade gaming. If you want to make a ridiculously impressive game tailored to incredibly powerful hardware, why not do it as an arcade cabinet? (With PC hardware in it, of course, perhaps with a downclocked GPU to save on power.)

This exists in Japan, where arcades are still alive, but almost all the machines in most arcades are PCs in a cabinet. There's an indie arcade scene in Japan that makes games for the arcades.

As a bonus those games sometimes make it onto Steam. :)

You won't find graphics powerhouses among those games, however. That costs a ton of money and resources - something arcades can no longer support.

Regards,
SB
 
TN panels don't count.


Why not? I've been an IPS snob for a long time, but it seems the latest TN panels are very good. The only real downside is viewing angles, but why do people care about those when you're going to be sitting dead center in front of the monitor anyway?
 
Why not? I've been an IPS snob for a long time, but it seems the latest TN panels are very good. The only real downside is viewing angles, but why do people care about those when you're going to be sitting dead center in front of the monitor anyway?

It's possible, but the last one I tried (on a 2013 laptop) was downright horrible.
 
Seeing all the recent rave reviews of fast, high-resolution TN monitors is giving me the upgrade itch, but quality control seems to be a major issue, so I'm trying to wait.

4k at 28" is an interesting option but I think you're limited to 60Hz on current offerings.
 
I'm afraid that the larger the screen, the more TN's technical drawbacks will be exacerbated. You will get color shifts even when staring straight at the screen, since the borders sit at a larger distance (and angle) from the screen's center.
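Rough geometry of that in Python (assuming 16:9 panels and a typical ~60 cm desk distance; illustrative only, since the actual color shift also depends on the panel):

import math
# Off-axis angle to the left/right screen edge when sitting dead center.
distance_cm = 60.0
for diag_in in (24, 28, 32, 40):
    width_cm = diag_in * 2.54 * 16 / math.hypot(16, 9)   # 16:9 panel width
    edge_angle = math.degrees(math.atan((width_cm / 2) / distance_cm))
    print(f'{diag_in}" 16:9: ~{edge_angle:.0f} degrees off-axis at the edges')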

But then again, non-TNs don't get a free pass either. I'm sitting in front of a ~1000 EUR 30" 16:10 IPS screen that flickers terribly when changing backlight intensity. And it scares me at night because it emits strange clicking sounds for an hour or more after being powered off (!). It's not even old tech - it's the Dell U3014 I'm talking about.
 
Also, read about Acer's annoying overdrive tech on their TN panels, which is supposed to improve response times. It actually causes ghosting and can't easily be disabled. A buddy just bought one of their big 4K LCDs and discovered this.

http://youtu.be/l7MrXr-fLbc

I have an ASUS G750JM notebook (GTX 860M). I've been trying out different panels in it. They're pretty cheap (for a reason). All have poor viewing angles, but you can get nice contrast and color out of a few, mainly the discontinued AUO B173HW01 V4. Apparently nobody makes a 17.3" IPS panel.

Something else - I still have a budget DTR eMachines M6805 notebook from 2005. Its 15.4" panel is TN, but the viewing angles are so good you'd think it's PVA or IPS. So I'm not sure I'd say TN has gotten better. Denser and cheaper, mostly, and bluer "thanks" to LED backlighting.
 
I'm afraid that the larger the screen, the more TN's technical drawbacks will be exacerbated. You will get color shifts even when staring straight at the screen, since the borders sit at a larger distance (and angle) from the screen's center.

Good point.

But then again, non-TNs don't get a free pass either. I'm sitting in front of a ~1000 EUR 30" 16:10 IPS screen that flickers terribly when changing backlight intensity. And it scares me at night because it emits strange clicking sounds for an hour or more after being powered off (!). It's not even old tech - it's the Dell U3014 I'm talking about.

That's pretty terrible :D
 
I see the OP's point, but would more realistic graphics at 720p, using the GPU's power to make the characters look as photoreal as possible, actually sell better than simply making games with dumbed-down graphics run at 1080p? That is the bottom line of it.
 
I see the OP's point, but would more realistic graphics at 720p, using the GPU's power to make the characters look as photoreal as possible, actually sell better than simply making games with dumbed-down graphics run at 1080p? That is the bottom line of it.

I would argue that it largely depends on the type of game you're playing.

If you're playing something like GTA, then yes, a physically accurate 720p path tracer is probably ideal, because it will increase your immersion in the (mostly urban) environment, especially with all the light sources at night and the reflective surfaces (cars, windows, etc.). If the low resolution makes things slightly blurry, it's not the end of the world.

But if you're playing a tactical FPS or a flight simulator, then it's a different story. Being able to spot an enemy plane as soon as possible matters a lot, and being able to tell whether it's a Ju-88 or an Fw-190 is crucial. When the plane gets close, having physically accurate light reflections on its wing would be a plus (one you'd barely have time to notice) but hardly essential. In that kind of game I'd take a good 4K rasterizer over a 720p path tracer any day.
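To put rough numbers on that trade-off, a toy Python comparison of per-pixel sample budgets at a fixed, purely hypothetical shading budget:

# At the same total shading throughput, 720p leaves roughly 9x more work
# per pixel than 4K (illustrative budget, not a real GPU figure).
budget_samples_per_frame = 100_000_000
for name, (w, h) in (("720p", (1280, 720)), ("4K", (3840, 2160))):
    pixels = w * h
    print(f"{name:>4}: {pixels:>9,} pixels -> "
          f"{budget_samples_per_frame / pixels:.0f} samples per pixel")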
 