Digital Foundry Article Technical Discussion Archive [2014]

I did, and I agree for the most part, but all things being equal, would you buy a phone with a low-res screen or a high-res screen?

I actually was faced with that exact decision just recently and went with the low-res phone. I was trying to decide between an HTC One M8 and the LG G3, where the HTC is 1920x1080 and the G3 is 2560x1440. Spec-wise all things were equal, which in turn made resolution a disadvantage for the G3. The problem is its GPU has to work harder to do the same thing the M8 does, and in turn it would stutter more during normal operation; coming from a Windows phone, stuttering is simply not acceptable for me, as I expect the phone to be 100% smooth at all times. The G3 also sucks down more battery to power that higher res, which was another negative. Finally, it looked like there were halos around text on the G3, which made its display appear worse, hence I ended up buying the M8. So in this case you had two phones where all things were, for the most part, equal spec-wise, but the extra resolution became a detriment to me.
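Just to put a number on that "work harder" point, here is a quick back-of-envelope pixel-count comparison; a minimal Python sketch, assuming per-frame GPU cost scales roughly linearly with pixels shaded (which ignores bandwidth and vertex cost):

# Rough pixel-count comparison between the two panels.
# Assumes per-frame GPU cost scales roughly linearly with pixels shaded.
g3_pixels = 2560 * 1440   # 3,686,400 pixels
m8_pixels = 1920 * 1080   # 2,073,600 pixels
print(g3_pixels / m8_pixels)   # ~1.78x more pixels to fill every frame on the G3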


The biggest difference being that consoles are a way to remove most of the aspects that encumber PC gaming.

Yup I agree, and likewise ultrabooks/tablets/phones are a way to remove most of the aspects that encumber console gaming. Who wants to be stuck only being allowed to game in one location? Why can't I take my games with me? Why can't I run my games on other devices? Why can't I share games with my family? Why can't I game privately? Why can't I game on the bus? And so on...


Plus the graphics performance gap between a console and PC is much smaller than the gap between a console and a tab.

It depends when you make that measurement. Today? Sure. How about next year? Or at year 2? Year 3? Year 6?


Tablets at most release people from the static position of their TVs, but consoles are in the midst of readily overcoming that reality.

Sort of, but they still enslave you to the main console with streaming and other such things. They aren't yet giving you true freedom like ultrabooks and tablets do right now.


It's easier for a console to slave a tab/phone screen than it is for a tab to receive modern console specs. By the time tabs catch up with the PS4 or XB1, console gamers will already be streaming console titles to their iPads/iPhones and Android tabs/phones.

I don't think a tablet needs modern console specs to be considered "on par" with a console though, certainly not for the non-pixel-peeping general gaming public.
 
Waiting 3 to 4 years to finally get "efficient" at using hardware in this day and age is dumb, it's a dated way to operate.

Developers are still stuck with the lowest common denominator if they want to sell software. Example: I go buy a shiny new iPad 5 and yet games are still being developed around the iPad 2. I still have to wait 3-4 years before developers target iPad 5 features, and by that time the iPad 8 is out. What's the difference? I have to wait either way.

...chained to a TV like consoles force on everyone.
It's chained to the TV because if you want any kind of modern visual fidelity you have to have powerful processors that in turn suck energy. If developers really wanted to go all out and maximize the potential of something like the A7, it would still produce graphics from generations past and (if you are lucky) 2 hours of play time before the user has to search for a wall outlet to continue playing. At that point it defeats the convenience of having an expensive mobile form factor just to briefly play the most demanding games on the go.
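That 2-hour figure is roughly what simple battery arithmetic predicts. A minimal Python sketch follows; both numbers below are assumptions chosen for illustration, not measurements of any particular device:

# Back-of-envelope battery estimate. Assumed figures: ~30 Wh is a typical
# large-tablet battery, ~14 W is a plausible sustained draw with the SoC
# and screen pushed hard.
battery_wh = 30.0
sustained_draw_w = 14.0
print(battery_wh / sustained_draw_w)   # ~2.1 hours of play before hunting for an outlet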
 
Except consoles can't use that power efficiency to improve visuals until another 7 years pass and they can launch a new console. Meanwhile all other devices can leverage such chip efficiency to bring out better hardware all the time. Naughty Dog won't make any difference here. They aren't better than other developers; they only gained notoriety because they were allowed to take the inordinate time required to make quirky hardware actually render pixels at a reasonable frame rate. Other developers that aren't bogged down by such minutiae get to actually spend brain cycles on the games themselves rather than on learning new hardware nuances, which is such a waste of time in this day and age.

Power and manufacturing efficiency is important from the vendor's cost perspective.
But there is also software efficiency (basically optimization and learning how to use h/w well).

You're contradicting yourself above. As long as a dedicated console developer like ND spends an inordinate amount of time on the same hardware, they _will_ be able to deliver better results. It may seem a waste of time to you, but there will also be fans.

If you show up to the race track in your perfectly tuned AMC Pacer that you worked on for 4 years, then at the end of the day it's still an AMC Pacer. Your 100% efficiently tuned car will be beat by just about everything else even if everyone else runs sloppy and inefficient. Waiting 3 to 4 years to finally get "efficient" at using hardware in this day and age is dumb, it's a dated way to operate.

Cross platform developers generally cater for the SKUs with the biggest market size. So the latest and greatest will run the code in a brute force manner. It won't be able to tap its max potential. The new/upcoming low level GPU API for PCs may or may not change this picture, I have no idea yet.

It's difficult to make a blanket "simplicity" statement over a plethora of h/w for cross-platform developers.
Tablets may not have enough fast RAM yet compared to home consoles.
New PCs may/should in fact be simpler to program in comparison, unless they screw up their new features?

The power difference between PC and console is great, but consoles still carved out their niche because of their simplicity. The same thing can work in favor of tablets and ultrabooks, with the added benefits of convenience and of not being chained to a TV like consoles force on everyone.

Simplicity in terms of developing similar things over and over on the same familiar hardware.
I don't think using async compute efficiently is necessarily simple. If it were, the Metro folks would have used it from day one. There are other challenges.

Ultimately resolution won't matter for games because on a tablet or phone they can still render at a lower resolution and just upscale to the device's native resolution.

I don't know why resolution becomes the focus here. We are talking about overall fidelity, efficiency and game quality.

Whether the user is chained to a TV will depend on the vendor's product roadmap. Back to power efficiency above: if they can get power consumption/wastage down enough, then they can use the same reference run-time on portable devices too -- with an optimized game library. But by that time, they may already have another gen of home console?
 
Developers are still stuck with the lowest common denominator if they want to sell software. Example: I go buy a shiny new iPad 5 and yet games are still being developed around the iPad 2. I still have to wait 3-4 years before developers target iPad 5 features, and by that time the iPad 8 is out. What's the difference? I have to wait either way.

That's a different problem though. Console game pricing started at the top and works its way down. iPad game pricing started the opposite way around: it began at 99 cents and is slowly making its way up over time. So the problem there is not supporting your iPad 5's hardware; the problem is the financial justification to do so given how low game prices are on tablet. That's a consequence of starting at the 99-cent price point. It's just expected that tablet games will be cheap, and it will take time to bring that price point up to where customers accept more expensive tablet games, at which point developers can be more comfortable using your iPad 5 fully and recouping their financial investment. Incidentally, backward compatibility and a consistent API will help dramatically with that as well.


It's chained to the TV because if you want any kind of modern visual fidelity you have to have powerful processors that in turn suck energy.

If that were 100% true, then wouldn't we still have arcades? In the arcade they could put crazy hardware into the games to get maximum visual fidelity. But sometimes other factors trump that, namely convenience. Why be chained to the TV if you can get good enough quality without that?


If developers really wanted to go all out and maximize the potential of something like the A7, it would still produce graphics from generations past and (if you are lucky) 2 hours of play time before the user has to search for a wall outlet to continue playing. At that point it defeats the convenience of having an expensive mobile form factor just to briefly play the most demanding games on the go.

Needing a wall outlet at worst puts it at parity with an existing console. It's not a negative, it's an option. You can still disconnect and go play in another room if you want, or once charged again go play without an outlet. I don't see how needing an outlet is a negative when you need one 100% of the time to game on a console anyway.


You're contradicting yourself above. As long as a dedicated console developer like ND spends an inordinate amount of time on the same hardware, they _will_ be able to deliver better results. It may seem a waste of time to you, but there will also be fans.

It's a waste of time pragmatically speaking. It's time spent "just to get shit working" rather than time spent improving the product itself. That's why it's a waste of time. To put it another way, say company X makes a product and they spend 80% of all their time sorting out a custom method to package it and distribute it. That leaves them just 20% of the time to make the product itself. Now maybe there are fans of their distribution methods and they came up with really neat packaging, but it's still a horrible waste of time when they were only able to devote 20% of their time to the actual product being sold. They would be better off spending 10% of their time on packaging and distribution using existing known methods, leaving them 90% of their time for what matters most, the actual product. This isn't about fans or patting oneself on the back for doing something clever, it's about making the best use of man hours where it matters most. Having engineers spend countless hours banging their heads against the wall to make odd hardware do something is a terrible use of man hours.


Cross platform developers generally cater for the SKUs with the biggest market size. So the latest and greatest will run the code in a brute force manner. It won't be able to tap its max potential. The new/upcoming low level GPU API for PCs may or may not change this picture, I have no idea yet.

Why does it matter if it can't tap its max potential in the most efficient manner possible? So long as you are seeing improvements, then where is the problem? I mean, look at all these console game remasters that are coming out, Tomb Raider, Metro, Last of Us, GTA, etc: they didn't rewrite the entire 360/PS3 game code to get "max potential" from the XB1/PS4, they brute forced their way on there. Does that make them all crap? Do you even care? I mean, they look better, right? And given how long it takes for new consoles to come out, one will be able to get good results from the much maligned brute force method. Brute force does not automatically equal bad.


I don't know why resolution becomes the focus here. We are talking about overall fidelity, efficiency and game quality.

Because as tablet display resolutions go up, it doesn't mean that GPU load has to go up at all. Given tablet size there is a sweet spot for resolution that you can stick with and let the hardware upscale to the native display res. While TVs get bigger and bigger and things like 4K get more important when 65" screens become the norm, tablet screens will always be small. That lets you always stick to relatively low resolutions on tablet, and hence you can get more from a tablet GPU than you could from a console GPU. The console GPU will always have to work harder because it's connected to ever-growing TV sizes, whereas tablet screen sizes remain consistent.
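As a rough illustration of that "render low, upscale to native" idea, here is a minimal Python sketch; the function names and the nearest-neighbour scaler below are illustrative stand-ins, not any real engine's API:

# Illustrative only: decouple the internal render resolution (what the GPU
# pays for) from the panel resolution (what the scaler fills).
import numpy as np

INTERNAL_RES = (1280, 720)   # fixed "sweet spot" render target

def render_scene(w, h):
    # Stand-in for the real renderer: just produce a w x h RGB image.
    return np.zeros((h, w, 3), dtype=np.uint8)

def upscale_nearest(img, out_w, out_h):
    # Nearest-neighbour upscale; hardware scalers do this (or better) for free.
    in_h, in_w = img.shape[:2]
    ys = (np.arange(out_h) * in_h) // out_h
    xs = (np.arange(out_w) * in_w) // out_w
    return img[ys][:, xs]

frame = render_scene(*INTERNAL_RES)           # GPU cost tracks 1280x720
output = upscale_nearest(frame, 2560, 1600)   # panel resolution barely matters

The point being that the GPU's per-frame cost follows the internal render target, not whatever resolution the panel happens to be.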
 
It's a waste of time pragmatically speaking. It's time spent "just to get shit working" rather than time spent improving the product itself. That's why it's a waste of time. To put it another way, say company X makes a product and they spend 80% of all their time sorting out a custom method to package it and distribute it. That leaves them just 20% of the time to make the product itself. Now maybe there are fans of their distribution methods and they came up with really neat packaging, but it's still a horrible waste of time when they were only able to devote 20% of their time to the actual product being sold. They would be better off spending 10% of their time on packaging and distribution using existing known methods, leaving them 90% of their time for what matters most, the actual product. This isn't about fans or patting oneself on the back for doing something clever, it's about making the best use of man hours where it matters most. Having engineers spend countless hours banging their heads against the wall to make odd hardware do something is a terrible use of man hours.

Nope. Good developers will use their time to implement and optimize for new features if the PS4 doesn't have GPU bottlenecks like the PS3 did. They will allocate their time and resource budget accordingly.

Why does it matter if it can't tap its max potential in the most efficient manner possible? So long as you are seeing improvements, then where is the problem? I mean, look at all these console game remasters that are coming out, Tomb Raider, Metro, Last of Us, GTA, etc: they didn't rewrite the entire 360/PS3 game code to get "max potential" from the XB1/PS4, they brute forced their way on there. Does that make them all crap? Do you even care? I mean, they look better, right? And given how long it takes for new consoles to come out, one will be able to get good results from the much maligned brute force method. Brute force does not automatically equal bad.

Efficiency matters because the resulting software will run better. End users can tell the difference.

As for newer hardware... like I said, if you just want to show off the gear, you can tweak parameters on PC/Mac to make it run better. You don't really have to wait 3-4 years. The faster they roll out PC GPUs, the less mature the drivers may be.

But it doesn't mean optimization for mainstream console SKUs will go to waste. Users will be able to tell the difference just because developers spend more time testing and ironing out the kinks under different use cases and paths.

Because as tablet display resolutions go up, it doesn't mean that GPU load has to go up at all. Given tablet size there is a sweet spot for resolution that you can stick with and let the hardware upscale to the native display res. While TVs get bigger and bigger and things like 4K get more important when 65" screens become the norm, tablet screens will always be small. That lets you always stick to relatively low resolutions on tablet, and hence you can get more from a tablet GPU than you could from a console GPU. The console GPU will always have to work harder.

Tablets may be limited by memory size and heat.

What you mentioned above seems to be decided by marketing rather than technical folks.
Given any segment of users and devices they want to target, they will decide what and how to deliver to home consoles, tablets and phones.

It doesn't have to be "the console GPU will always have to work harder" (vs. tablets).
 
...
Because as tablet display resolutions go up, it doesn't mean that GPU load has to go up at all. Given tablet size there is a sweet spot for resolution that you can stick with and let the hardware upscale to the native display res. While TVs get bigger and bigger and things like 4K get more important when 65" screens become the norm, tablet screens will always be small. That lets you always stick to relatively low resolutions on tablet, and hence you can get more from a tablet GPU than you could from a console GPU. The console GPU will always have to work harder because it's connected to ever-growing TV sizes, whereas tablet screen sizes remain consistent.

Don't jinx my dreams of a 32" tablet that has four legs that fold out like a table, and a special harness that allows me to carry it around on my back.

Unless something changes dramatically with the business to make pushing expensive hardware more profitable, I think hybrid tablets, tablets and cheap laptops will eventually make consoles a less attractive option for gaming. I'm willing to pay a little more for one of those devices with greater utility even if it offers slightly less performance. And with low-level APIs coming to all of those devices, the performance advantage will close even further. Consoles are going to become more like PCs to stay relevant, and those other devices will become more like consoles.

Either consoles have to shorten their hardware lifecycle significantly and be forwards compatible, or we admit that consoles with hardware that is already low-to-mid range provide "good enough" visuals and those other devices will be just as adequate.

This is a Digital Foundry thread? What happened?
 
... or they can just make home consoles work better with tablets, or combine them. :)
These are marketing decisions, not necessarily tech decisions though.
 
Nope. Good developers will use their time to implement and optimize for new features if the PS4 doesn't have GPU bottlenecks like the PS3 did. They will allocate their time and resource budget accordingly.

Efficiency matters because the resulting software will run better. End users can tell the difference.

As for newer hardware... like I said, if you just want to show off the gear, you can tweak parameters on PC/Mac to make it run better. You don't really have to wait 3-4 years. The faster they roll out PC GPUs, the less mature the drivers may be.

But it doesn't mean optimization for mainstream console SKUs will go to waste. Users will be able to tell the difference just because developers spend more time testing and ironing out the kinks under different use cases and paths.



Tablets may be limited by memory size and heat.

What you mentioned above seems to be decided by marketing rather than technical folks.
Given any segment of users and devices they want to target, they will decide what and how to deliver to home consoles, tablets and phones.

It doesn't have to be "the console GPU will always have to work harder" (vs. tablets).

I have to disagree. Obtuse hardware is a waste of time. If you can get 90% of the results in 60% of the time, you're ahead. You can spend that time making sure your game is not buggy, add more content, or you can use that time to start working on the next thing. Fighting with the 360 and PS3 is only worth the time because so many people have the devices. The market makes it worth the hassle. First-party devs are showcasing the system, so they'll spend time a third party might not.

Why is the hardware in PS4 and Xbox One so similar to the PC? Because devs don't want to deal with it anymore. Now it's easier than ever to port to PC and you're seeing games jump to PC that haven't been there in a while: Metal Gear, NBA 2K, etc.

Gamers can't see optimization. They can see relative results, usually coloured by brand loyalties.

DX12, Mantle, Metal (Apple), OpenGL 5(?) will lead to better performance on low-end devices, and should actually lead to more stable drivers. PC/mobile hardware will iterate faster than consoles, and the software layer is quickly catching up.

Intel Haswell, Skylake and onward are going to be very interesting to watch. They'll be very performant integrated solutions, jammed into pretty much every low-end to mid-range laptop around.
 
... or they can just make home consoles work better with tablets, or combine them. :)
These are marketing decisions, not necessarily tech decisions though.

If PS5 and Xbox Two were full-featured tablets that could be docked to a TV to allow for peripherals like VR, Kinect/Eye etc, I'd be happy.
 
Pure Pool sneaked up in the store; I hadn't heard a peep about it so far, so it was a nice surprise, as it is by the guys who did Hustle Kings, who did a great job on PS3 and also did some nice tech info sharing on their engine.

The game plays like butter and there's little to find fault with graphically; clearly native etc. (as you'd expect of course), and the table texture is clear and stays clear. More games should be this flawless in how smoothly and quickly they play! Only the menus are a little too old-school. But it's definitely recommended if you enjoy this type of game at all, and I am curious whether they've advanced their reflection tech since last time.
 
If PS5 and Xbox Two were full-featured tablets that could be docked to a TV to allow for peripherals like VR, Kinect/Eye etc, I'd be happy.

That would mean the SoCs could only use up to 15W, and performance would be crap compared to low-end PCs at the time.

No, thanks.
 
Digital Foundry: Hands-on with DriveClub

Article up on Eurogamer.

Digital Foundry said:
Having been capped at 30fps, this is one extra that helps to mitigate what some may see as an irredeemable backwards step. The visual pay-off is speaking for itself though. In DriveClub, Sony looks to finally have a powerful racing showcase to add to its PS4 stable.
 
That would mean the SoCs could only use up to 15W, and performance would be crap compared to low-end PCs at the time.

No, thanks.

Worse, there would be nothing to technically differentiate them from other tablets in the market, and they'd very soon be overtaken by them. Other tablets would naturally offer gamepad and docking support, and suddenly you'd have third-party games being released on every platform out there, with the only true "high end" gaming platform then being PCs, which would include everything from integrated chips upwards.
 
There's a new digitalfoundry article up. Dead Rising 3 on PC.

It reads like the Xbox One is punching way above its weight. Just like several of us have been stating here ;)
 
There's a new digitalfoundry article up. Dead Rising 3 on PC.

It reads like the Xbox One is punching way above its weight. Just like several of us have been stating here ;)

Oh great, can't we just keep the console vs PC stuff in the DF comments? I'll just say here what I said there: no, it's not.

It's only people (which seemingly includes yourself) who don't understand either the workload DF are trying to push on the PC or how to interpret the DF numbers themselves that are making this incorrect assumption.

1080p @ 60fps is 4.5x the workload of 720p @ 30fps, which the XBO runs at - and very inconsistently at that.

The GeForce 780 and even the Titan are nowhere near 4.5x faster than the XBO GPU, even on paper. So the fact that they can't consistently achieve 4.5x the performance of 720p @ 30fps - which the XBO also can't achieve consistently - is not at all surprising and in no way hints at the XBO "punching above its weight".
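For reference, here is where the 4.5x figure comes from; a quick Python check, assuming the workload scales linearly with pixels rendered per second (purely GPU-bound, ignoring CPU and bandwidth effects):

# Pixel-throughput ratio between the two targets.
xbo = 1280 * 720 * 30     # 720p at 30 fps  -> 27,648,000 pixels/s
pc  = 1920 * 1080 * 60    # 1080p at 60 fps -> 124,416,000 pixels/s
print(pc / xbo)           # 4.5x the pixels per second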

I understand your eagerness to believe that to be the case but a very quick and simple analysis of the numbers would probably have been in order before jumping to incorrect conclusions and then posting them here as some sort of "I told you so".
 
Get this: I tried the freaking game and it's unbelievably unoptimized. At 720p it only consumes 60% of the GPU because it's capped at 30fps; at 1080p it barely goes beyond 80%, and even then the utilization is fluctuating all over the place (50~60~70, etc.). It also doesn't support all aspect ratios, which is ridiculous. CPU utilization is also quite limited, but that is always explained by the API limitation. Still, more work could have been done in this area to increase CPU load. AMD CPUs run this game like total crap, to the point that a lowly i3 2100 beats the fastest AMD processor.
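That ~60% GPU figure is roughly what a frame-rate cap alone would produce; a minimal Python illustration, where the 20 ms render time is an assumed number chosen to match the reported utilization rather than a measurement:

# Utilization under a 30 fps cap: the GPU finishes early and then idles
# until the next frame interval.
cap_frame_time_ms = 1000.0 / 30    # ~33.3 ms budget per frame at 30 fps
gpu_render_time_ms = 20.0          # assumed time to render the 720p frame
print(gpu_render_time_ms / cap_frame_time_ms)   # ~0.60 -> ~60% GPU utilization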

The XO was running the game like crap if my memory serves me right, dropping to 20fps in most areas. So despite the PC port's shortcomings, the experience is far better on it than on the XO, as any mid-range GPU can sustain 1080p30, as opposed to 720p20!

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Dead_Rising_3/test/dr_3_proz_amd.jpg
 
It seems to scale very badly beyond 4 threads, which would hurt AMD. In fact, it doesn't even seem to be taking much advantage of more than 2 cores.
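That pattern is what Amdahl's law predicts when only part of the frame work runs in parallel; a minimal Python sketch, where the 60% parallel fraction is an assumed figure for illustration, not a measurement of this game:

# Amdahl's law: speedup = 1 / ((1 - p) + p / n) for parallel fraction p, n cores.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (1, 2, 4, 8):
    print(cores, round(speedup(0.6, cores), 2))
# 1 -> 1.0, 2 -> 1.43, 4 -> 1.82, 8 -> 2.11: diminishing returns past 4 threads,
# which hits many-weak-core CPUs harder than designs with fewer, faster cores.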
 