Digital Foundry Article Technical Discussion Archive [2014]

There is some juicy tech stuff in there

As far as the CPU optimisations went, yes we did have to cut back on some features due to the CPU not being powerful enough. As we originally feared, trying to support a detailed game running in HD put a lot of strain on the CPUs and we couldn't do as much as we would have liked. Cutting back on some of the features was an easy thing to do, but impacted the game as a whole. Code optimised for the PowerPC processors found in the Xbox 360 and PlayStation 3 wasn't always a good fit for the Wii U CPU, so while the chip has some interesting features that let the CPU punch above its weight, we couldn't fully take advantage of them. However, some code could see substantial improvements that did mitigate the lower clocks - anything up to a 4x boost owing to the removal of Load-Hit-Stores, and higher IPC (instructions per cycle) via the inclusion of out-of-order execution.

On the GPU side, the story was reversed. The GPU proved very capable and we ended up adding additional "polish" features as the GPU had capacity to do it. There was even some discussion on trying to utilise the GPU via compute shaders (GPGPU) to offload work from the CPU - exactly the approach I expect to see gain traction on the next-gen consoles - but with very limited development time and no examples or guidance from Nintendo, we didn't feel that we could risk attempting this work. If we had a larger development team or a longer timeframe, maybe we would have attempted it, but in hindsight we would have been limited as to what we could have done before we maxed out the GPU again. The GPU is better than on PS3 or Xbox 360, but leagues away from the graphics hardware in the PS4 or Xbox One.

I've also seen some concerns about the utilisation of DDR3 RAM on Wii U, and a bandwidth deficit compared to the PS3 and Xbox 360. This wasn't really a problem for us. The GPU could fetch data rapidly with minimal stalls (via the EDRAM) and we could efficiently pre-fetch, allowing the GPU to run at top speed.

Basically: weak CPU, GPU better than PS360's but not in the same league as PS4/One's. Bandwidth not being a problem is interesting.
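
As an aside on the load-hit-store point in that quote, here is a toy illustration (my own sketch, not anything from the developer's codebase) of the kind of pattern that hurt on the in-order PowerPC cores in the 360/PS3 and that a more forgiving core can largely hide:

#include <stdio.h>

/* On the in-order PowerPC cores in the 360/PS3 there is no direct move
   between the floating-point and integer register files, so each cast below
   compiles to "convert in an FPU register, store to the stack, reload into
   an integer register", and that reload stalls for dozens of cycles waiting
   on the store it just issued: a textbook load-hit-store. */
static unsigned pack_colour(float r, float g, float b)
{
    unsigned ri = (unsigned)(r * 255.0f);  /* FPR -> memory -> GPR round trip */
    unsigned gi = (unsigned)(g * 255.0f);
    unsigned bi = (unsigned)(b * 255.0f);
    return (ri << 16) | (gi << 8) | bi;
}

int main(void)
{
    /* prints 0xff7f3f */
    printf("0x%06x\n", pack_colour(1.0f, 0.5f, 0.25f));
    return 0;
}

On a CPU that doesn't pay that penalty the exact same code simply runs faster, which is presumably where the quoted "anything up to a 4x boost" for the worst-hit routines comes from; the usual workaround on 360/PS3 was to keep hot loops in a single register domain and convert in bulk.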
 
So, this guy worked on a western launch game that had MP support, and it was a well-received game.

So... AC3, FIFA 13, Black Ops 2... anything else?
 
After about a week of chasing, we heard back from the support team that they had received an answer from Japan, which they emailed to us. The reply was in the form of a few sentences of very broken English that didn't really answer the question that we had asked in the first place. So we went back to them asking for clarification, which took another week or so to come back. After the second delay we asked why it was taking so long for replies to come back from Japan; were they very busy? The local support team said no, it's just that any questions had to be sent off for translation into Japanese, then sent to the developers, who replied, and then the replies were translated back into English and sent back to us. With timezone differences and the delay in translating, this usually took a week!

This portion strikes me as hilarious. Really? Post-2010?

They could have just used Google Translate and done it in real time.


This was surprising to hear, as we would have thought that they had plenty of time to work on these features as it had been announced months before, so we probed a little deeper and asked how certain scenarios might work with the Mii friends and networking, all the time referencing how Xbox Live and PSN achieve the same thing. At some point in this conversation we were informed that it was no good referencing Live and PSN as nobody in their development teams used those systems (!) so could we provide more detailed explanations for them?

Wow, really...?
 
This portion strikes me as hilarious. Really? Post-2010?

They could have just used Google Translate and done it in real time.

Wow, really...?

Do you think if they HAD actually used those systems, Nintendo could honestly be proud of what they made? Online from them is a joke. It's worse than what even the 360 launched with 8 years ago.
 
I am more concerned that they made such a crappy general dev environment. They should have learned something regarding that as well.
 
Lego Marvel Superheroes Face off:

http://www.eurogamer.net/articles/d...arvel-super-heroes-next-gen-face-off#comments

PC and PS4/XB1 versions are graphically identical aside from the very strange lack of SSAO in the PC version (although I can't spot that difference in any of the comparison shots). We're even stuck with the same crappy post process AA filter.

Frame rate is the real surprise though. The game is locked to 30fps on both consoles. Not all that weird until you consider I'm pushing out around 150 fps with everything maxed out on my 670! What the hell is going on with the console versions to have such a poor frame rate?
 
Pretty sure this is due to the newness of the consoles and the 'good enough, ship it' mentality.

I'd like to see how the next Lego title performs. I believe Lego Hobbit is due this summer?
 
I would think the issue is just money, a quick and dirty job.

Wrt the Wii U article, it was an awesome read... especially the part about English. We all have some form of science or computing background here, and we all know that research in the sciences is published in English...
Now we learn that it takes Nintendo a week to answer a question, and the answer is actually off the mark... wtf
I mean, I'm close to assuming that some people at Nintendo HQ, in the technical/engineering groups, read fewer presentations (be it SIGGRAPH, Gamefest, material published by studios, etc.) than lots of members here... They might get their hands on translations, but more often than not translations don't cut it, especially as the gap between English and Japanese seems greater to me than, say, between French and English.

I suddenly feel better about bashing a lot of the decisions they took with the Wii U.

IMO that is awesome and it should be presented to investors, even though ultimately investors care about revenue/profit and cash. It says a lot about Nintendo's wasted potential.

With decent management I can only imagine the amount of money they could make. So much for "Nintendo does not care about my type of customers"; it is more that they are clueless. They could have had it both ways.

Shockingly bad...
 
Ummmm... are there even reference manuals in Japanese?
Actually, I would never want technical documentation in anything but its source language, and that happens to be English for pretty much everything.

I get very bad flashbacks every time I remember translated documentation. Often it's just cheap, which at least tells you something about the effort invested and the quality; other times it's grammatically correct but wrong, which is even worse...

I can only imagine the front line of "support" being some cheap students with a list of prepared answers, and you only get to real technicians once you've bugged them a couple of times. Pretty much how any support works these days. (Still a rather horrible thing.)
 
Lego Marvel Superheroes Face off:

http://www.eurogamer.net/articles/d...arvel-super-heroes-next-gen-face-off#comments

PC and PS4/XB1 versions are graphically identical aside from the very strange lack of SSAO in the PC version (although I can't spot that difference in any of the comparison shots). We're even stuck with the same crappy post process AA filter.

Frame rate is the real surprise though. The game is locked to 30fps on both consoles. Not all that weird until you consider I'm pushing out around 150 fps with everything maxed out on my 670! What the hell is going on with the console versions to have such a poor frame rate?

This is very sad. It seems to be CPU limited, from what you say.
 

Thanks for the link. We joke about companies living in a vacuum, but for Nintendo to actually be doing just that and not use PSN or Live is just unbelievable. That's insane, but it does totally explain why their online efforts are what they are. The communication issues are less surprising to me. A few friends who had worked at Nintendo USA back in the day used to tell me that westerners there were basically ignored. It was made clear to them that the Japanese opinion is what mattered and that they were at the bottom of the totem pole. I suspect the one-week delay in communication is not due to translation issues, but more that their questions, comments and opinions really are of little interest and/or low priority to Nintendo.
 
How can BF4 get the full-fat feel of 60fps on PS4 while most cross-gen ports can't get close? I know DICE said they were getting close to 90% utilization of the CPU, while I haven't heard anything like that from other devs, but... come on.

I guess you really do have to, as always, develop the game with 60fps in mind for the consoles from the start. I'll chalk it up to launch titles.
 
Eh, what?

On another note, I'm always flummoxed by people saying "this isn't next gen" about 30fps caps on PS4 and XB1. Are they saying that 30fps games weren't the majority last generation, or the generation before that, or even the one before that? With the 360 and PS3 in particular, 30fps was the standard from the beginning. Does that mean those weren't the next generation of consoles at the time either?

Console devs are going to aim for 30fps and maximize the eyecandy unless they really base their game on 60fps from the start. This has been the rule since almost as long as console 3D rendering has existed.
 
Console devs are going to aim for 30fps and maximize the eyecandy unless they really base their game on 60fps from the start. This has been the rule since almost as long as console 3D rendering has existed.
Now that I think about it, 60fps was a product of the CRT era, and the system was outputting an interlaced signal to get that 60fps, at 480i, not 480p. Some games could do progressive at 60fps, but most couldn't. 60fps has always been the exception, not the rule.

Frankly, I'm not sure where this sudden fascination with 60fps has come from, except from PC gamers who are used to it. A solid 30fps is perfectly fine for most titles, the operative word being "solid". Most of the "30fps" games that people complain about actually have framerates that fluctuate wildly, dipping as low as the teens and rarely actually hitting the 30fps target. I suppose some people get it into their heads that that's what 30fps is, and so they hate it.

But take something like Lego Marvel on PS4. The action is so smooth that I honestly had a hard time telling it was "only" 30fps instead of 60. I had to look for specific markers to tell me what was going on, something that 99.99% of gamers don't even know exists, much less would actually think to look for. They probably think the game runs at 60fps because of how smooth it is.
 
The more of your FOV the screen covers, the more important the fps is for any type of game with lots of camera panning. But if you sit relatively far from your display (like me), then the impact of higher fps is felt less.
Having said that, I do think 30fps is a bit low. The problem is that we have to choose between 30 and 60fps, with nothing in between. Is it a display limitation that we can't target something like 45fps? I know that with the NVIDIA/AMD adaptive sync stuff we could basically target any fps, but can it be done without it while keeping the image judder- and tear-free (provided the fps is constant/stable)?
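
To put some numbers on the 45fps question (a minimal sketch with assumed figures, not anything from the thread): with plain vsync on a fixed 60Hz panel a frame can only be swapped at a refresh boundary, so even a perfectly stable 45fps render rate ends up on screen with an uneven cadence.

#include <stdio.h>
#include <math.h>

int main(void)
{
    const double refresh_ms = 1000.0 / 60.0;  /* one 60Hz vblank interval      */
    const double frame_ms   = 1000.0 / 45.0;  /* a perfectly stable 45fps game */
    double prev_flip = 0.0;

    for (int i = 1; i <= 9; i++) {
        double ready = i * frame_ms;                                 /* frame finished rendering   */
        double flip  = ceil(ready / refresh_ms - 1e-9) * refresh_ms; /* next vblank it can swap on */
        printf("frame %d: ready %6.2f ms, shown %6.2f ms, gap since last flip %5.2f ms\n",
               i, ready, flip, flip - prev_flip);
        prev_flip = flip;
    }
    return 0; /* gaps cycle 33.3 / 16.7 / 16.7 ms rather than a steady 22.2 ms */
}

Adaptive sync (G-Sync/FreeSync) sidesteps this by letting the panel wait the full 22.2ms every time; without it, the only judder-free targets on a 60Hz display are the rates that divide evenly into 60, i.e. 60, 30, 20 and 15fps.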
 
Now that I think about it, 60fps was a product of the CRT era, and the system was outputting an interlaced signal to get that 60fps, at 480i, not 480p. Some games could do progressive at 60fps, but most couldn't.

Not true. Most early consoles, arcade machines, and home computers output a 60Hz, 262-line non-interlaced display with about 200 lines of usable pixels.

Early game makers did their absolute best to reach 60fps.
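
For anyone wanting the arithmetic behind that (my numbers, not the poster's): NTSC scans roughly 15,734 lines per second, and the non-interlaced trick repeats identical 262-line fields instead of alternating 262/263-line ones, so

15,734 lines/s ÷ 262 lines per field ≈ 60.05 progressive frames per second

which is the ~60Hz "240p" mode those machines actually ran in.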
 