"Cutting Edge" *cut-off thread*

Simple lighting on Vita, no shadows, missing particle and post-processing effects, overcompressed very low res textures, poor texture filtering

From Digital Foundry's article on Killzone Mercenary:

Virtually every post-process effect from KZ3 was ported over. We have a full depth-based colour correction system, depth-of-field, film grain, bloom, a new volumetric fog system and more. On top of this we also implemented tone-mapping and local contrast adaptation, both of which work really well with our HDR lighting model.

From Uncharted Golden Abyss:

everything contained within that lower-resolution window looks very impressive, and worth making concessions for. The lighting in particular is outstanding, with a level of global illumination being applied across large jungle environments that adds greatly to the feeling of a living, breathing place of nature.


On the other hand, Vita hardware wasn't anywhere near as advanced as DX11 GPUs back then.
Featureset parity between ULP GPUs and desktop GPUs didn't happen until much later. In 2011 the Vita's PowerVR 5XT had unified shaders and full OpenCL 1.1 + OpenGL ES 2.0 compliance. Tegra 3 that launched at the same time had a GPU with separate pixel and vertex shaders, and no compute capabilities.
The Vita launched with the best ULP GPU available, with an unprecedented adoption of TSVs for Wide I/O that gave the GPU more bandwidth than anything else at the time, and one of the first adoptions of a quad-core Cortex A9 module with NEON.

The Switch launched at the same time as the first Snapdragon 835 phones, which can sustain almost as much performance in mobile mode as the active-cooled Shield TV plugged into the wall.
This means the S835 phones perform about 3x faster than the Switch in mobile mode.
 
The Switch launched at the same time as the first Snapdragon 835 phones, which can sustain almost as much performance in mobile mode as the active-cooled Shield TV plugged into the wall.
This means the S835 phones perform about 3x faster than the Switch in mobile mode.

Any game comparisons to make that point, or even how you reach that conclusion with regards to the Kryo CPU and Adreno 540?
It is made complex as both can adjust performance depending on mobile/charging state; I agree, though, that the X1 has potential for much higher TDP/watts relative to a phone/small tablet.
 
Simple lighting on Vita, no shadows, missing particle and post-processing effects, overcompressed very low res textures, poor texture filtering and then this: "Downgrades in everything from shadows, effects, textures and even its co-op player count are unfortunate, but stand for nothing when a game still runs in the 10-20fps range - with stuttering dragging it down further".

Ok, now let's try to imagine Origins on Switch... or you can try to calculate the difference in pixel count between 1080p/60fps and 600p/30fps. And that's when the console is docked, in Doom's case, because the compromises are even larger when undocked...
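
If anyone actually wants to do that pixel-count exercise, here's a rough sketch (assuming 16:9 frames, so "600p" is taken as roughly 1066x600; Doom on Switch actually uses dynamic resolution, so this is only a ballpark):

```python
# Rough pixel-throughput comparison: Doom on PS4 (1080p/60) vs. docked Switch (~600p/30).
# Assumption: "600p" means a 16:9 frame of about 1066x600; dynamic resolution makes the
# real gap vary from scene to scene.

ps4_pixels_per_sec    = 1920 * 1080 * 60   # ~124.4 million pixels/s
switch_pixels_per_sec = 1066 * 600 * 30    # ~19.2 million pixels/s

print(f"PS4 pushes ~{ps4_pixels_per_sec / switch_pixels_per_sec:.1f}x the pixels per second")  # ~6.5x
```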

Borderlands 2, it's 720p 20/30fps vs 544p 10/20fps, which is honorable. Obviously, the game is almost unplayable, but the downgrades seem to be less important from a pure technical standpoint.

Especially in comparison with this

Indeed... the Switch can't even run an indie game at the same framerate as the PS4/XB1.

In comparison, the Vita had no problem running less ambitious games at the same framerate:

 
That's not completely true. I don't see any signs of motion blur or MLAA, and these are the heaviest post-processing effects in Killzone 3. Obviously, unlike the Cell with its 256 KB scratchpads or modern GPUs with compute shaders and shared memory per multiprocessor, the Vita GPU simply didn't have enough flops and bandwidth to handle these PP effects.

GI via probes or lightmaps costs next to nothing, so nothing special here

Featureset parity between ULP GPUs and desktop GPUs didn't happen until much later
There is no featureset parity even now; mobile chipsets (except for the TX1) don't support the latest DX12 features.

Tegra 3 that launched at the same time had a GPU with separate pixel and vertex shaders, and no compute capabilities.
This doesn't matter since we are talking about actual console hardware.

The Vita launched with the best ULP GPU available
The same with Switch. Only the latest Adreno 630 has barely surpassed TX1 in Car Chase test, and I'm not sure whether it will manage it in the GFXBench 5 Aztec Ruins test, since mobile GPUs tend to fall short in newer benchmarks; they have never been intended for the modern features in new engines, unlike desktop architectures like Maxwell.

The Switch launched at the same time as the first Snapdragon 835 phones, which can sustain almost as much performance in mobile mode as the active-cooled Shield TV plugged into the wall
No, they can't sustain as much performance in mobile mode, nor can they sustain this performance for any significant amount of time without thermal throttling in smartphone form factors.

This means the S835 phones perform about 3x faster than the Switch in mobile mode.
That's simply BS; please stop spreading it.
 
And that's when the console is docked, in Doom's case, because the compromises are even larger when undocked...
I don't think DOOM is an example of a good port. There is nothing special about the Doom engine either (in fact, it's quite standard), and it doesn't produce a better picture than Outlast 2.

Obviously, the game is almost unplayable, but the downgrades seem to be less important from a pure technical standpoint.
Obviously, it's a piece of junk in comparison with the Outlast 2 port, yet it manages to keep at least 30 FPS most of the time, unlike the Borderlands 2 port on Vita.

the Switch can't even run an indie game at the same framerate as the PS4/XB1.
This indie game has all the modern features like PBS, reflections (thankfully, planar and billboard ones), TAA, DOF with bokeh, dynamic shadowmaps, SSAO, etc., and uses these features smarter than most modern AAA games.
And the fact that the game was ported by a small indie team makes it even more of an achievement.

In comparison, the Vita had no problem running less ambitious games at the same framerate:
That's a good example of a game with much more simplistic graphics in comparison with other games of that time. The Switch does the same quite easily; take a look at One Piece: Unlimited World Red or Dragon Ball Xenoverse 2, for example.
 
I don't think DOOM is an example of a good port. There is nothing special about the Doom engine either (in fact, it's quite standard), and it doesn't produce a better picture than Outlast 2.

The simple reality is that Doom is a very demanding game, unlike Outlast 2... unless you consider a very linear game with nothing happening on screen an immense feat.

Obviously, it's a piece of junk in comparison with the Outlast 2 port, yet it manages to keep at least 30 FPS most of the time, unlike the Borderlands 2 port on Vita.

You're comparing an AAA game with an indie game with nothing happening on screen.

That's a good example of a game with much more simplistic graphics in comparison with other games of that time. The Switch does the same quite easily; take a look at One Piece: Unlimited World Red or Dragon Ball Xenoverse 2, for example.

Yeah, Xenoverse 2... 1080p/60fps on XB1, yet 900p/30fps in docked mode + downgraded graphics... for instance, you can compare that to Virtua Tennis 4 on Vita...

 
https://dictionary.cambridge.org/dictionary/english/cutting-edge
Cutting-edge: "the most modern stage of development in a particular type of work or activity"

I wonder what you think of the "original i7" with a 130W TDP beating the latest Core i7-7Y75 in both single and multi-core sustained performance. Gosh what have those guys been doing for the past 10 years?!

I think that's a poor example, as there have been proper 4-core laptop parts from Intel for ages. Unless you're trying to label Jaguar as a mobile chip first and foremost, even though in tablet form it only hit 1GHz in its devices, and that's with 4 cores instead of 8.
No, it does not.
Sea Islands introduces a greater number of ACEs (8 in the PS4 and XBone, 2 on Tahiti) for improved compute resource allocation, new instructions and significant gains in geometry performance (see R9 290X vs HD7970 with equal number of geometry processors):

Well, fair enough, I had forgotten, but the Xbox One only has 2. So it's straight-off-the-shelf Bonaire for the Xbox One. And 8 ACEs don't mean the PS4 chip came anywhere near even the desktop 7870, let alone the 7970. Both consoles were cheap and underpowered; the only impressive thing as I see it was the PS4 getting 8GB of GDDR5. However, even then the bandwidth is merely adequate, as evidenced by such low AF in many titles, for starters.


Reports even say that devs had to chime in and ask for more RAM to get things working; otherwise Nintendo would have been content with just ripping out most of the Shield TV's PCB with 3GB and throwing it inside a tablet.

So you don't know about MS wanting to ship the 360 with only 256MB of RAM before Epic stepped in? This is completely normal, btw. All hardware manufacturers work with developers and ask them for feedback on hardware: what they'd like to see.

"I'll just bother replying to the tiny bits that are (barely) tangibly worthy of a reply."

No, your bias is palpable, and frankly you're too rude and emotional to carry on further discussion. You got triggered by me daring to compare Nintendo's efforts to the PS4 and Xbox.
 
It's clear Nintendo could have done more, but I just have to laugh whenever someone acts like the current offerings from Sony and MS were ever state of the art monsters, and not cost cutting boxes.

What Nintendo is able to do with the switch in games like Odyssey is simply impressive, and I know the hardware isn't tapped just yet. I'm eagerly anticipating Metroid Prime 4 and Bayonetta 3 in particular.
 
Any game comparisons to make that point, or even how you reach that conclusion with regards to the Kryo CPU and Adreno 540?
The best thing we have is synthetic benchmark results.

We know that S835's Adreno 540 has a sustained performance that's really close to its peak performance (9-19% drop):

[chart: GFXBench peak vs. sustained performance for various Snapdragon 835 phones]


This probably has to do with thermal dissipation in each model. It's safe to assume that a hypothetical console with the S835 and the same size/thickness as the Switch could get a simple heatspreader and/or heatpipe along the back, and it would behave like the Galaxy S8 with a 9% drop, or most probably better than that. And the closest we have to that is the Razer Phone, with a heatpipe along the back panel.

This is a comparison between the Razer Phone and the Shield TV with active cooling and plugged into the wall:

[chart: GFXBench offscreen results, Razer Phone vs. Shield TV]



The difference between the two in offscreen tests goes from 4% in the T-Rex score to 20% in Manhattan 3.1.1.
Taking 9% off each of the Razer Phone's scores widens the difference to 13% in T-Rex and 27% in Manhattan 3.1.1.

The Switch in mobile mode has its GPU clocked at 300MHz, so 30% of the clocks sustained by the Shield TV. Assuming 30% of the performance, we're now down to 39.2 FPS in T-Rex (worst case comparison) and 8.5 FPS in Manhattan 3.1.1 (best case comparison). This is against the S835 in the Razer Phone with sustained 102.9 FPS in T-Rex and 18.6 FPS in Manhattan 3.1.1.

S835 performance over mobile Switch in T-Rex = 102.9/39.2 = ~2.6x
S835 performance over mobile Switch in Manhattan 3.1.1 = 18.6/8.5 = ~2.2x

This is without taking into account that a S835 could be tweaked to have much less power/heat dedicated to the CPU, to increase GPU sustained clocks. Not to mention a heatsink with a fan.
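
For anyone who wants to check the arithmetic, here's the same chain of steps as a small Python sketch; the FPS figures are just the ones quoted above, and it keeps the same (admittedly simplistic) assumption that performance scales linearly with GPU clock:

```python
# GFXBench offscreen FPS figures quoted in this post.
razer_sustained = {"T-Rex": 102.9, "Manhattan 3.1.1": 18.6}  # Razer Phone after the ~9% sustained drop
switch_handheld = {"T-Rex": 39.2,  "Manhattan 3.1.1": 8.5}   # estimated as 30% of the Shield TV's scores

for test, razer_fps in razer_sustained.items():
    advantage = razer_fps / switch_handheld[test]
    print(f"{test}: estimated S835 advantage over handheld Switch ≈ {advantage:.1f}x")
# T-Rex: ~2.6x, Manhattan 3.1.1: ~2.2x
```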



No, they can't sustain as much performance in mobile mode, nor can they sustain this performance for any significant amount of time without thermal throttling in smartphone form factors.
Answered above. The difference for the S835 is about 9%, and it would probably be zero in a tablet with a fan + heatsink like the Switch.

That's simply BS; please stop spreading it.
See above.
 
I think that's a poor example, as there have been proper 4-core laptop parts from Intel for ages.
But insistently comparing the Jaguar to a 130W Nehalem is a good example, right?


Unless you're trying to label Jaguar as a mobile chip first and foremost

[image: AMD slide positioning Jaguar as a low-power/mobile core]


Yes, I am. Me and AMD, probably plus a few others.


You got triggered by me daring to compare Nintendo's efforts to the PS4 and Xbox.
You didn't compare, you stated they were both cutting edge. Which is objectively false, and I listed a small number of facts saying why that's false.
Anyone is free to see who's actually being "rude and emotional" here.


It's clear Nintendo could have done more, but I just have to laugh whenever someone acts like the current offerings from Sony and MS were ever state of the art monsters, and not cost cutting boxes.
Not monsters, but they were state-of-the-art at launch.
Unlike the Switch.
:)

That's not completely true. I don't see (...)
Yes well, I'll trust the statements from the games' developers before your subjective perception.

The same with Switch. Only the latest Adreno 630 has barely surpassed TX1 in Car Chase test
Which TX1? The TX1 in the set-top box plugged into the wall, consuming over 19W? The TX1 in the 10" Pixel C tablet, with a GPU clocked at ~800MHz, that gets ~80% of the Shield TV's performance (before throttling)?
The TX1 in the docked Switch, which is clocked even lower than the Pixel C?
 
Taking 9% off each of the Razer Phone's scores widens the difference to 13% in T-Rex and 27% in Manhattan 3.1.1.
20.4 * 0.91 (-9%) = 18.56 for the Razer Phone in Manhattan 3.1.1 offscreen.
25.4 / 18.56 = 1.369, so the TX1 at 1 GHz is 37% faster in this benchmark; not sure how you got 27%.

The Switch in mobile mode has its GPU clocked at 300MHz
It's clocked at 384 MHz, and performance certainly doesn't scale linearly with FLOPS when bandwidth is constant or only slightly higher.
That's why many Switch games are 900p in docked mode and 720p in mobile.
900p has 1.5625x more pixels. If geometry processing takes 0.5x of total frame time, there are 3x more FLOPS for pixel shading in docked mode, yet many graphics-intensive Switch games don't scale beyond 900p.
I doubt the Switch would be even 2 times slower in handheld mode in comparison with the SHIELD TV.
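
To make that budget argument explicit, here's a quick sketch of it, assuming the 384 MHz handheld / 768 MHz docked clocks (so 2x the FLOPS when docked) and a resolution-independent geometry cost of half the handheld frame time:

```python
# Pixel counts: 900p vs. 720p.
print((1600 * 900) / (1280 * 720))   # 1.5625x more pixels at 900p

# GPU-time budget in arbitrary units: handheld frame = 1.0 unit, docked = 2.0 units (2x clocks).
geometry_cost = 0.5                              # fixed, resolution-independent
pixel_budget_handheld = 1.0 - geometry_cost      # 0.5 units left for pixel shading
pixel_budget_docked   = 2.0 - geometry_cost      # 1.5 units left for pixel shading
print(pixel_budget_docked / pixel_budget_handheld)   # 3x the shading budget when docked
```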

This is without taking into account that a S835 could be tweaked to have much less power/heat dedicated to the CPU
Mobile benchmarks don't use the CPU at all; it would be the opposite in the real world.

See above
Unconvincing
 
Judging the Switch by how well it fares in the case of Doom (i.e. a port of one of the most impressive and technically advanced games we've seen this gen, and done by a tiny-ass studio no less) while judging the Vita on the basis of Virtua Tennis and PlayStation All-Stars (which was created specifically for both the Vita and the PS3 by Bluepoint Games using the Bluepoint engine) is a rather weird choice.

The Vita was pretty good at running some of the less demanding PS3 games, but when I put my nostalgia glasses away for a second I really cannot think of a great many occasions on which the handheld wasn't shit out of luck trying to run just about anything that got a little more complex. Borderlands isn't exactly the only game that springs to mind here. Ever played RE Revelations 2 on the thing? It's scary bad looking. Even with some of the better ports, heavy sacrifices had to be made. Visually, Street Fighter X Tekken on the Vita is a mere shadow of its original self, and it's still not running anywhere close to the desired 60fps refresh rate. Dragon's Crown has near-constant and truly nasty frame drops despite running at a quarter of the PS3 version's resolution.

The exclusives sure looked nice on the small screen, but in terms of sheer complexity, the likes of Golden Abyss and Killzone Mercenary came nowhere close to the standards of their big PS3 brethren. Or let's have a look at Wipeout. Studio Liverpool was generous enough to port the entire Wipeout HD/Fury content over to the Vita, and the two games look almost like-for-like at first glance. That said, it's a 1080p/60 game on the PS3 and a 540p/30 game with some really heavy resolution scaling (mostly with the HD/Fury tracks) on the Vita. The PS3 is pushing 8x the amount of pixels per second here.
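
For reference, the "8x" figure checks out if you take 540p to mean the Vita's native 960x544 (the heavy dynamic scaling mentioned above only widens the gap further):

```python
# Pixels per second: Wipeout HD/Fury on PS3 vs. Wipeout 2048 (HD/Fury tracks) on Vita.
ps3_pixels_per_sec  = 1920 * 1080 * 60   # 1080p/60
vita_pixels_per_sec = 960 * 544 * 30     # 544p/30, before dynamic resolution scaling

print(f"PS3 pushes ~{ps3_pixels_per_sec / vita_pixels_per_sec:.1f}x the pixels per second")  # ~7.9x
```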
 
This is without taking into account that a S835
The real question is: what would the S835 bring to the table?
Lower cost -- no. Higher performance in docked mode -- no. Would Nintendo use it at higher frequencies in handheld mode -- no; there was no sense in putting a higher-than-720p 6-inch screen in the Switch.
Would it be fast outside of a single mobile benchmark which was made specifically for mobile chips -- most likely NO.
Would it support DX12 level features -- no.
Would it have good performance in modern engines?
What would Nintendo get from using the S835? The answer is nothing. The Switch would not have been any more state-of-the-art with the S835; in fact, it would have lost many graphics features without gaining anything in terms of product and package.
 
20.4 * 0.91 (-9%) = 18.56 for the Razer Phone in Manhattan 3.1.1 offscreen.
25.4 / 18.56 = 1.369, so the TX1 at 1 GHz is 37% faster in this benchmark; not sure how you got 27%.


It's clocked at 384 MHz, and performance certainly doesn't scale linearly with FLOPS when bandwidth is constant or only slightly higher.
That's why many Switch games are 900p in docked mode and 720p in mobile.
900p has 1.5625x more pixels. If geometry processing takes 0.5x of total frame time, there are 3x more FLOPS for pixel shading in docked mode, yet many graphics-intensive Switch games don't scale beyond 900p.
I doubt the Switch would be even 2 times slower in handheld mode in comparison with the SHIELD TV.

Mobile benchmarks don't use the CPU at all; it would be the opposite in the real world.


Unconvincing
Yeah, you definitely cannot take the clock rate as a linear performance indicator with GPUs.
Also as you say the Switch performance was further unlocked later on: https://www.eurogamer.net/articles/...-boosts-handheld-switch-clocks-by-25-per-cent

Anyway, while this is a dGPU, it shows why one cannot just look at performance and then calculate a fraction when clocks/TDP are massively changed.

[chart: GTX 1060 performance vs. power consumption across clock speeds]


I had to use this chart as it is the one with FPS, but the facts below still apply.
That chart goes from 1500MHz to 2100MHz; meaning for the 1060, 1500MHz-->2100MHz is a 40% increase in clocks, while 131fps-->152fps at 2100MHz is a 16% increase in performance.
1080p is more relevant than the more demanding higher resolutions.
Of course, even this cannot be used truly accurately, because we are looking at envelopes optimised for whatever window the GPU is meant to operate within, but it does help to show that one cannot correlate clock changes to performance in a linear way.
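
Putting rough numbers on that non-linearity, using the 1080p figures read off that chart:

```python
# GTX 1060: clock increase vs. FPS increase at 1080p (values as read from the chart above).
clock_low, clock_high = 1500, 2100   # MHz
fps_low, fps_high = 131, 152

clock_gain = clock_high / clock_low - 1   # +40% clocks
fps_gain   = fps_high / fps_low - 1       # +16% FPS

print(f"+{clock_gain:.0%} clocks -> +{fps_gain:.0%} FPS "
      f"(scaling efficiency ≈ {fps_gain / clock_gain:.0%})")
```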
 
20.4 * 0.91 (-9%) = 18.56 for the Razer Phone in Manhattan 3.1.1 offscreen.
25.4 / 18.56 = 1.369, so the TX1 at 1 GHz is 37% faster in this benchmark; not sure how you got 27%.
I used the term difference, not faster.
Using the TX1 as reference (TX1 = 100%) means you divide the S835 by the TX1 reference. 18.56/25.4 = 0.73, so a 27% difference.
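
In other words, it's the same two numbers framed two different ways:

```python
# Manhattan 3.1.1 offscreen figures used above.
s835 = 18.56   # Razer Phone minus the ~9% sustained drop
tx1  = 25.4    # Shield TV

print(f"S835 relative to TX1: {s835 / tx1:.2f} -> a {1 - s835 / tx1:.0%} difference (TX1 as reference)")
print(f"TX1 relative to S835: {tx1 / s835:.2f} -> TX1 is {tx1 / s835 - 1:.0%} faster")
```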


It's clocked at 384 MHz
That's "boost mode". Digitalfoundry themselves are unaware of how many (if any) games are using that frequency, given that it may require turning things off like WiFi.
Not that it would make a huge difference in the ~2.5x performance advantage that S835 already has in mobile mode without optimizing it for gaming (e.g. lower CPU clocks).


There was nothing state of the art in Vita, no Cortex A15, no Rogue, old tech process
Every single component you mentioned in the Vita was state-of-the-art.

1 - It was the first-ever application of a quad-core Cortex A9 module with NEON. All other mobile SoC makers only used dual-core Cortex A9s until they had access to 32nm/28nm. The first Cortex A15 implementation only appeared a year later.

2 - It was the first-ever PowerVR SGX 543MP4 implementation. The only similar implementation appeared about 6 months later with Apple's A5X. Rogue GPUs only appeared 2 years after that.

3 - It was initially made on Samsung's 45nm process, which was state-of-the-art for all intents and purposes. Samsung only had 32nm available for their own SoCs one year later.

Not to mention the Wide I/O adoption for VRAM, getting it an estimated 12.8GB/s, plus the 6.4GB/s from the LPDDR2.
That's 3x more bandwidth than any other 2011 SoC had at the time.

6 years later and the Switch only has ~30% more bandwidth than the Vita.
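
Rough numbers, assuming the commonly cited 25.6GB/s LPDDR4 figure for the Switch (handheld mode clocks the memory a bit lower, which would narrow the gap further):

```python
# Estimated memory bandwidth, in GB/s.
vita_total    = 12.8 + 6.4   # Wide I/O VRAM estimate + LPDDR2 = 19.2 GB/s
switch_lpddr4 = 25.6         # 64-bit LPDDR4 @ 1600 MHz (docked / maximum)

print(f"Switch / Vita bandwidth ≈ {switch_lpddr4 / vita_total:.2f}x")  # ~1.33x, i.e. ~30% more
```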


Unconvincing
IMO much more convincing than the links, numbers and calculations you have provided so far, though.


Mobile benchmarks don't use CPU at all, it would be opposite in real world.
So now you're trying to suggest the 10LPE quad-Kryo 280 at ~1GHz wouldn't consume a lot less and perform better than the 20nm Cortex A57s in the TX1?


Lower cost -- no
You don't know.

higher performance in docked mode -- no
Why not? Is there any data claiming you can't get more out of a S835 if it's actively cooled and given a 12-15W TDP?

would Nintendo use it at higher frequencies in handheld mode - no
Why would they? Performance in S835 smartphones is already so much better than the Switch in handheld mode.


Would it be fast outside of a single mobile benchmark which was made specifically for mobile chips - most likely NO
Several benchmarks, actually, which do serve the purpose of comparing platforms in gaming scenarios.
It's not like GFXBench and 3DMark results paint a completely different picture from most games on the PC.


Would it support DX12 level features - no

Irrelevant because Nintendo won't be using DX12, but false nonetheless:
[image: DX12 feature support listed for the Adreno 540]



What would Nintendo get from using the S835? The answer is

1 - Immensely more CPU power at lower power consumption. An actually functioning LITTLE module that could run the OS sipping some 100mW leaving a full quad-core big module for the devs.
2 - Well over 2x better GPU performance in handheld mode, probably similar advantage in docked mode
3 - LPDDR4X for 20% higher memory bandwidth at lower power consumption (more battery autonomy)
4 - An architecture that continued to be upgraded for use in future handheld designs, instead of a dead end that eventually became focused on 300mm^2 chips meant for automotive. This would be convenient for a Switch 2 or a Switch+ down the road.
5 - Embedded modem for a 4G model.
6 - Embedded WiFi+BT for significantly lower power consumption in wireless comms (more battery autonomy)


And this is only assuming Nintendo would simply be purchasing an existing chip, instead of ordering a custom chip for a gaming device (more focused on low-clocked cores, wider GPU, wider memory, etc.), which is obviously what they should have done.
 
That chart goes from 1500MHz to 2100MHz; meaning for the 1060, 1500MHz-->2100MHz is a 40% increase in clocks, while 131fps-->152fps at 2100MHz is a 16% increase in performance.

At 131 FPS in a 5-year-old title you're CPU bound, and you know that...
 
At 131 FPS in a 5-year-old title you're CPU bound, and you know that...
No, it is not, because the fps kept increasing all the way up to that clock speed...
If the fps had plateaued I would agree, but it clearly has not.
The same pattern can be seen at 4K; clocks have a greater impact there, but the relationship is still far from a linear clock-performance one. 4K should show the most noticeable impact, and even then it is a 24% performance difference vs a 40% clock increase.
But 1080p is more relevant because the Switch is operating below that, with even less resolution overhead.

Still, this behaviour is very relevant when trying to use clocks alone, in a theoretical way, to work out GPU performance relative to the Adreno 540, specifically for gaming.
Anyway, the 4K results show the same trend, only stronger, for the reasons I mention; 4K also has the highest TDP, if you're interested in watts while gaming in general.
 
Judging the Switch by how well it fares in the case of Doom (i.e. a port of one of the most impressive and technically advanced games we've seen this gen, and done by a tiny-ass studio no less) while judging the Vita on the basis of Virtua Tennis and PlayStation All-Stars (which was created specifically for both the Vita and the PS3 by Bluepoint Games using the Bluepoint engine) is a rather weird choice.

I compared Doom to Borderlands 2...

The exclusives sure looked nice on the small screen, but in terms of sheer complexity, the likes of Golden Abyss and Killzone Mercenary came nowhere close to the standards of their big PS3 brethren. Or let's have a look at Wipeout. Studio Liverpool was generous enough to port the entire Wipeout HD/Fury content over to the Vita, and the two games look almost like-for-like at first glance. That said, it's a 1080p/60 game on the PS3 and a 540p/30 game with some really heavy resolution scaling (mostly with the HD/Fury tracks) on the Vita. The PS3 is pushing 8x the amount of pixels per second here.

Because 600p/30fps (docked) is more impressive? Btw, Wipeout has a very aggressive dynamic resolution on PS3 (it almost never runs at 1080p), unlike Doom on PS4.

On Vita, it's 544p/30fps with 4xMSAA, with almost no downgrades and even some improvements: https://www.eurogamer.net/articles/digitalfoundry-wipeout-2048-tech-interview
 