Too bad we never saw a game 5 times better (visually) than Rogue Leader
I'm still convinced that they used blood magic to pull those visuals off. lol
Too bad we never saw a game 5 times better (visually) than Rogue Leader
Good question.
For professional reasons, I have had a long-term interest in benchmarking, and often no numbers at all are better than bad numbers.
Eyeballing the specs and estimating between thumb and forefinger, if done by someone who has been around for a while, gives a ballpark figure that sidesteps the corner cases and all the unknown potholes and wrinkles of any given architecture that may skew a specific benchmark result but tend to even out over a larger project. Estimation also sidesteps the can of worms that is the compiler (or, for something like a game console, the entire software stack).
Of course, lacking both technical specs and, in my case, intimate knowledge of the hardware demands of real-world graphics software, I'm not in a good position to estimate anything personally as far as the Switch goes, but I would guess a large number of developers could do a decent enough job of it.
Where such an analysis would get really interesting is of course the specific wrinkles of a design. How would you play to its strengths? What common techniques would you try to avoid, and use alternative methods to achieve similar results?
Using multiplatform title performance is the only option available to us console armchair experts (among whom I count myself). But quantifying the visual output into relative performance numbers for the underlying architectures? Impossible beyond the very broadest of strokes.
Speaking in hypotheticals, if one were to port a PS4 game to the Switch, a naive way would be to reduce frame rate targets from 60fps to 30fps, or resolution from 1080p to 720p (or variable), remove features that simulate optical/cinema artefacts (CA, film grain, DOF, motion blur), or simply reduce complexity such as the amount of ambient debris/grass/bushes and so on.

Of course, if you compared raw peak throughput, Switch might not be that impressive. But nobody came even close to maximum peak performance on last gen consoles, even if (= when) they spent half of the project's programming effort optimizing the code. Modern CPUs & GPUs allow developers to focus on the critical parts instead of having to optimize the whole code base.
Dynamic res for Zelda: Breath of the Wild. From 648p to 720p in portable mode, 810p to 900p docked.
http://www.eurogamer.net/articles/d...h-of-the-wild-uses-dynamic-resolution-scaling
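For anyone wondering what that looks like in practice, here's a minimal sketch of the usual dynamic-resolution heuristic (purely illustrative C++, not Nintendo's actual code; the thresholds and step sizes are made up): measure the GPU frame time and nudge the render scale between a floor and native.

// Illustrative dynamic resolution heuristic (not Nintendo's code).
// Nudge the render scale between a floor (~648p/810p) and native (720p/900p)
// based on how close the previous frame's GPU time was to the 33.3 ms budget.
#include <algorithm>

struct DynRes {
    float scale = 1.0f;  // 1.0 = native render height
};

void UpdateDynRes(DynRes& d, float gpuMs, float budgetMs = 33.3f)
{
    const float kMin = 0.9f;   // 648/720 and 810/900 both happen to be 0.9
    const float kMax = 1.0f;

    if (gpuMs > budgetMs)               // over budget -> drop a notch
        d.scale -= 0.02f;
    else if (gpuMs < budgetMs * 0.9f)   // comfortable headroom -> creep back up
        d.scale += 0.01f;

    d.scale = std::clamp(d.scale, kMin, kMax);
}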
Never really noticed it, so that's good. It seems like they didn't put a lot of effort into the port, though I guess considering the size of the game it's kind of understandable.
I wonder if Nintendo will still be using their old engines on Switch? For instance, I wonder if ARMS, Splatoon 2, and Xenoblade 2 are all going to use the old Wii U engines? I'd like to see what a native engine could do on the system, but I don't know how Nintendo does things in that regard.
That's the first thing that came into my mind as well when reading the DF article. Zelda is dropping frames too often for it to be intentional. Nintendo quality is better than that. Hopefully the firmware bug gets fixed soon.

I wonder if the firmware bug that causes a minor performance drain when using dynamic resolution (see the DF article about Fast RMX) could be a potential factor in the Zelda performance drops, then...?
You only need to miss the 1/30 s (30 Hz) frame budget by a fraction and, with vsync on a 60 Hz display, the frame time jumps to 1/20 s (20 Hz).
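Rough illustration of that quantisation (assuming double-buffered vsync on a 60 Hz panel; the frame times are just examples):

// With vsync, a frame can only be shown on a refresh boundary (16.67 ms at 60 Hz),
// so barely missing the 33.3 ms budget pushes the next present out to 50 ms (20 fps).
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;
    for (double workMs : {33.0, 33.4, 40.0}) {
        double shownMs = std::ceil(workMs / refreshMs) * refreshMs;
        std::printf("%.1f ms of work -> shown every %.1f ms (%.0f fps)\n",
                    workMs, shownMs, 1000.0 / shownMs);
    }
    // 33.0 ms of work -> shown every 33.3 ms (30 fps)
    // 33.4 ms of work -> shown every 50.0 ms (20 fps)
    // 40.0 ms of work -> shown every 50.0 ms (20 fps)
}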
I think you're talking about the PS3 and X360 only, because the Wii U had 3 OoO PowerPC 750 cores, though they were clocked rather low at 1.24GHz and carried an ancient instruction set.

Switch has a modern OoO CPU. Last gen CPU code was horrible. Lots of loop unrolling to avoid in-order bottlenecks (no register renaming). Lots of inlined functions, because there was no store forwarding hardware -> 40+ cycle stall for reading function arguments from the stack. No direct path between the int/float/vector register files (going through memory = LHS stall). Variable shift was microcoded (very slow). Integer multiply was very slow. There was no data prefetching hardware, so the developer had to manually insert prefetch instructions (even for linear array iteration), and there was a ~600 cycle stall on an L2 cache miss if prefetch wasn't used (and no OoO to hide any of it). Code was filled with hacks to avoid all these CPU bottlenecks.
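To make the prefetch point concrete, here's roughly what a babysat linear loop looked like on those CPUs. This is my own sketch, using GCC's __builtin_prefetch as a stand-in for the platform-specific dcbt-style intrinsics, and the prefetch distance is just an illustrative guess:

// Last-gen style: no hardware prefetcher, ~600 cycle stall on an L2 miss and
// no OoO to hide it, so even a plain linear walk had to prefetch ahead by hand.
void AddArrays(const float* a, const float* b, float* out, int count)
{
    const int kFloatsPerLine = 128 / sizeof(float);  // 128-byte cache lines
    const int kLinesAhead    = 8;                    // illustrative distance

    for (int i = 0; i < count; ++i) {
        if (i % kFloatsPerLine == 0) {               // once per cache line
            int p = i + kLinesAhead * kFloatsPerLine;
            if (p < count) {
                __builtin_prefetch(&a[p]);           // hint: start pulling these
                __builtin_prefetch(&b[p]);           // lines in before we need them
            }
        }
        out[i] = a[i] + b[i];  // on a modern OoO core with HW prefetch, the hints
    }                          // above are mostly redundant
}

On a modern OoO core with a hardware prefetcher you just write the plain loop and spend that effort elsewhere.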
Speaking in hypotheticals, if one were to port a PS4 game to the Switch, a naive way would be to reduce frame rate targets from 60fps to 30fps, or resolution from 1080p to 720p (or variable), remove features that simulate optical/cinema artefacts (CA, film grain, DOF, motion blur), or simply reduce complexity such as the amount of ambient debris/grass/bushes and so on.
But that is just cutting down.
Too bad we never saw a game 5 times better (visually) than Rogue Leader
WiiU had no SIMD (only packed singles), no SMT and very low clock rate. Xbox 360 CPU did 4d multiply+add per core per cycle at 3.2 GHz (over 5x peak flop advantage).

I think you're talking about the PS3 and X360 only, because the Wii U had 3 OoO PowerPC 750 cores, though they were clocked rather low at 1.24GHz and carried an ancient instruction set.
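Back-of-the-envelope on that 5x figure (my arithmetic, assuming one paired-single multiply+add per cycle on the Wii U cores):

Xbox 360: 3 cores x 4-wide madd x 2 flops x 3.2 GHz ≈ 76.8 GFLOPS
Wii U:    3 cores x 2-wide madd x 2 flops x 1.24 GHz ≈ 14.9 GFLOPS
76.8 / 14.9 ≈ 5.2x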
Mario Kart 8? Looked awesome and ran at a (lot) more stable 60 fps.
No AA, no AF, very basic tracks... It looks good because of "art", but that's it for me. No, I agree it's not fair because of the resolution differences, and the TV it was played on. And that the NGC was not blown away by the competition power-wise.
Not sure what you mean. If you mean the amount of detail on screen, games pushed waaay more than 5x Rogue Leader. Or do you mean just in terms of looks?

Too bad we never saw a game 5 times better (visually) than Rogue Leader
Yeah, I'd say MK8 is the Wii U's poster boy. That and Bayonetta 2. Both games didn't need much CPU power, so the Wii U could shine. Didn't have AA or AF though. Bandwidth is definitely the reason console games still don't have max AF.

Mario Kart 8? Looked awesome and ran at a (lot) more stable 60 fps.
Nope, I made no such assumption, you quoted my "Speaking in hypotheticals..." yourself!

You're assuming geometry could be exactly the same but it most probably can't, and you're assuming these multiplatform games run at 60FPS in the standard PS4 version, whereas I think the great majority of them run at 30 FPS.
The Maxwell architecture's geometry performance is much higher than GCN's, but here we're talking about just 2 "PolyMorph" geometry engines that have to run at ~300MHz. So while the rest can probably be LOD'ed down, geometry cannot, and asset developers may need to go back to the drawing board for this.
And then there's CPU performance. While the A57's performance per clock may be similar to Jaguar's (according to Geekbench results at least), game devs have access to at least 6 cores at 1.6GHz on PS4/XBone and only 3 cores at 1GHz on Switch.
That's over 3x the total number of instructions per second. And while graphics may be scaled down relatively easily, bridging such a big difference in CPU performance may be even harder.
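Spelled out with those clocks and core counts (and assuming roughly comparable per-clock throughput):

PS4/XB1: 6 cores x 1.6 GHz = 9.6 core-GHz available to games
Switch:  3 cores x 1.0 GHz = 3.0 core-GHz
9.6 / 3.0 = 3.2x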