Nintendo Switch Tech Speculation discussion

Status
Not open for further replies.
Yeah, but who said the Switch won't have a core dedicated to the OS too (and some RAM as well)? On PC I have a 5820K, and I see some games using more than 4 cores. Moreover, on console you may have a better API and better optimisations, so more cores get used in the end.
For me, it's the same situation as the Wii U (it can run 360/PS3 games, but porting takes effort, so nobody did it). Maybe the Switch can run some ports, but do you think many devs will rework their games/assets/etc. to make them run on weaker hardware? And without a hard drive to install them on? Man, I doubt it... even more so with the PS4 Pro and Scorpio. Devs are shooting for more and more power. I'd guess the "base" power level will be the Xbox One, not the Switch...

nVidia seems confident, yes, but they're not going to say "eh, it sucks, but Nintendo didn't care, so we're good!"
 
You mean 7 Jaguar cores for PS4 and XB1 that are available to games.

Do they have access to the entire 7th core? I thought Microsoft only opened up 50% of the 7th core? Either way, my statement stands: games do not currently spread the workload evenly over lots of cores. You could have 16 Jaguar cores, and it still wouldn't hold a candle to a quad-core i7.

@Rootax

Don't get me wrong, games can use lots of cores, but generally you'll still find that 2-3 cores get maxed out, and the rest see less and less usage with every additional core.
 
Do they have access to the entire 7th core? I thought Microsoft only opened up 50% of the 7th core? Either way, my statement stands: games do not currently spread the workload evenly over lots of cores.
That may be true, but it's not supported by evidence AFAIK. You are referencing PC - consoles are very different. You need to get some profiles of recent games to see what they're doing. That 50% utilisation of your extra PC cores might be ~90% utilisation on consoles because they aren't that strong, meaning the whole Jaguar CPU is used ~90%.
 
That may be true, but it's not supported by evidence AFAIK. You are referencing PC - consoles are very different. You need to get some profiles of recent games to see what they're doing. That 50% utilisation of your extra PC cores might be ~90% utilisation on consoles because they aren't that strong, meaning the whole Jaguar CPU is used ~90%.

It's possible, but we aren't going to get any hard evidence of this. There simply aren't benchmarks for games on consoles that would give us a clear-cut answer to the question. I do agree that, out of necessity, developers have worked hard to improve multithread utilization in their games.
 
Things are changing very quickly, especially due to the current-gen consoles.
On PC, DX11 is not multithread-friendly.
Engines were designed to make use of strong-IPC cores.
All of that is changing a lot, and quickly (relatively speaking). Otherwise console games just wouldn't be able to perform the way they do.
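To illustrate the shift being described, here's a minimal Python sketch of the job-based approach engines are moving towards (the workload and names are made up, not from any real engine): the frame is split into many small independent jobs fed to a shared worker pool, so the load spreads over however many cores are available instead of saturating a couple of dedicated subsystem threads.

```python
from concurrent.futures import ThreadPoolExecutor

def update_entity(i):
    # stand-in for per-entity work (animation, AI, physics step, etc.)
    return i * i

def run_frame(num_entities, num_workers):
    # Every worker pulls small jobs from the shared pool, so work spreads
    # across all cores instead of maxing out 2-3 "subsystem" threads.
    # (Python's GIL means this is structural illustration, not a speedup demo.)
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return sum(pool.map(update_entity, range(num_entities)))

print(run_frame(1000, 4))  # same result regardless of worker count
```

The key property is that the result doesn't depend on the worker count, only the throughput does, which is what lets the same engine scale from a few strong cores to many weak Jaguar cores.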

edit: Also, there have been a few engine presentations/breakdowns that have shown how much they're now multithreading.
 
There are actually some DX11 games with low CPU utilization and a rather even distribution across all cores.

For example Tomb Raider 2013:

[Image: per-core CPU utilization graph for Tomb Raider 2013]
 
It's possible, but we aren't going to get any hard evidence of this. There simply aren't benchmarks for games on consoles that would give us a clear-cut answer to the question. I do agree that, out of necessity, developers have worked hard to improve multithread utilization in their games.
http://twvideo01.ubm-us.net/o1/vaul...rling_Christian_Parallelizing_The_Naughty.pdf

Includes lots of timing captures and analysis. Must read if you want to understand multithreading on a modern console game engine.
 
It shows a well-coded multithreaded engine, even with DX11's issues regarding threading.

There's no getting around the industry having to multithread their games.
It's necessary on console: Jaguar hasn't got the highest IPC, and it isn't clocked that high either, which adds to the problem.
 
http://twvideo01.ubm-us.net/o1/vaul...rling_Christian_Parallelizing_The_Naughty.pdf

Includes lots of timing captures and analysis. Must read if you want to understand multithreading on a modern console game engine.

Am I wrong to conclude that this technique puts the image on screen 3 frames behind game logic? With my caveman understanding of this article, it seems like they were really finding a solution to stalls between the CPU and GPU. I'm also not certain this couldn't have been, or wasn't, done on the 360. Basically each core is working on a different frame; the PS3 SPUs may have sucked at game logic, but the 360 had three functional cores. I'm also not certain whether the ability to yield a task, letting other tasks pass by, is something new thanks to the newer CPUs, or something that could have been implemented on the 360/PS3. I'm assuming the ability to yield jobs is different from the out-of-order execution the newer CPUs do natively.
 
Am I wrong to conclude that this technique puts the image on screen 3 frames behind game logic?
They interleaved the CPU game logic with the previous frame's CPU rendering. This adds one frame of latency. On top of that, the GPU always runs asynchronously to the CPU. On PC, the GPU can be up to 3 frames late (the DirectX maximum). In console games, the GPU usually runs around 0.5 frames late (to ensure that the GPU always has work to do). Thus the end of the GPU frame overlaps with the beginning of the next CPU frame. This has implications for resource management: you can't free GPU resources while they are in use. Console engines usually delay freeing GPU resources by one or two frames, to be sure that the GPU has finished (there might be spikes).
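A toy model of that pipelining, simplified to whole-frame offsets (the real GPU offset is closer to the ~0.5 frames mentioned above, and the numbers here are illustrative, not from the presentation): while the CPU simulates frame N, it submits rendering for frame N-1, and the GPU is consuming frame N-2.

```python
def pipeline_schedule(num_frames):
    # Each row records which frame every pipeline stage touches at a
    # given step: (step, sim_frame, render_frame, gpu_frame).
    timeline = []
    for step in range(num_frames):
        sim = step                      # game logic for frame N
        render = step - 1               # CPU render submission, one frame behind
        gpu = step - 2                  # GPU execution, roughly another frame behind
        timeline.append((step, sim,
                         render if render >= 0 else None,
                         gpu if gpu >= 0 else None))
    return timeline

for row in pipeline_schedule(5):
    print(row)
```

This also makes the resource-management point concrete: at any step, the GPU is still touching buffers from two frames ago, which is why freeing them has to be deferred.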
I'm also not certain whether the ability to yield a task, letting other tasks pass by, is something new thanks to the newer CPUs, or something that could have been implemented on the 360/PS3. I'm assuming the ability to yield jobs is different from the out-of-order execution the newer CPUs do natively.
Fibers were available on previous-gen consoles too (on the main CPUs). But the original TLoU was a PS3 exclusive and used the Cell SPUs heavily. This presentation discusses the PS4 port. Jaguar is a modern multicore out-of-order x86 processor, so the presentation applies well to modern console and PC development.
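A toy sketch of the yielding behaviour in question, using Python generators to stand in for fibers (the job names and scheduler are invented for illustration, not taken from the presentation): a job that hits an unmet dependency yields back to the scheduler and is requeued, so other jobs pass it by; this is cooperative scheduling, distinct from the hardware-level out-of-order execution inside the CPU core.

```python
from collections import deque

def scheduler(jobs):
    # jobs: list of (name, generator). A generator yields the name of the
    # job it depends on (or None), then finishes.
    finished = set()
    order = []
    queue = deque(jobs)
    while queue:
        name, gen = queue.popleft()
        try:
            needed = next(gen)  # job "yields the fiber", reporting its dependency
            if needed is not None and needed not in finished:
                queue.append((name, gen))      # blocked: requeue, let others pass
            else:
                queue.appendleft((name, gen))  # ready: resume on next iteration
        except StopIteration:
            order.append(name)
            finished.add(name)                 # finished jobs unblock dependents
    return order

def job(dep=None):
    yield dep  # wait until `dep` has finished (None = no dependency)

# "animation" was queued first but depends on "physics", so it yields
# and "physics" completes ahead of it.
print(scheduler([("animation", job("physics")), ("physics", job())]))
```

The same pattern on a real engine uses fibers so the yielded job's full stack is preserved, rather than the explicit generator state used here.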
 
Isn't it expected that the Switch will perform significantly better than the PS3 and 360?
http://www.eurogamer.net/articles/d...-mobile-games-machine-powered-by-nvidia-tegra
The Shield did seem to run some games better than the PS3 and 360, and that's a Tegra X1. The Switch is supposed to have a better GPU, and most likely it will be a Tegra X2.
If that's the case, I expect people to be pretty satisfied with it. As a handheld, it won't need resolutions above 720p and will show games on a small screen, so it will have extra resources for some pretty sweet graphics.
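For what it's worth, the pixel arithmetic behind that 720p argument is simple: rendering at 720p shades 2.25x fewer pixels than 1080p, and that headroom can go into effects instead.

```python
# Pixel counts per frame at the two common target resolutions.
pixels_720p = 1280 * 720    # handheld target
pixels_1080p = 1920 * 1080  # typical home-console target

print(pixels_720p)                   # 921600
print(pixels_1080p / pixels_720p)    # 2.25
```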
 
Eurogamer is reporting a rumor of a Pokémon Stars game coming to the Switch next year. One thing is certain: we are going to find out just how far Nintendo's IPs can take the Switch, because the first 12 months will be loaded. I'm also hearing of a Rabbids and Mario crossover game from Ubisoft. Western third-party support will likely be limited, but that didn't stop any of Nintendo's handhelds from being massively successful. If Nintendo can secure some third-party exclusives (Beyond Good and Evil, for example), combined with the full arsenal of Nintendo's internal studios, the library of games could be very attractive, and different enough from what the other consoles offer that consumers will be more likely to want one, thanks to it not being a "me too" product. If you think of the Switch as a 3DS successor with the added perk of being able to play on the TV, the device looks much more favorable.

At the very least, I would think the Tegra chip in the Switch is matched to a 128-bit memory bus, fabricated on 16nm FinFET, and clocked north of 1GHz. That would be one powerful portable console.
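Back-of-the-envelope bandwidth for that speculated 128-bit bus, assuming LPDDR4-3200 (the memory speed is an assumption here, not a confirmed spec):

```python
def bandwidth_gb_s(bus_bits, transfers_per_s):
    # bytes per transfer (bus width / 8) times transfer rate, in GB/s
    return bus_bits / 8 * transfers_per_s / 1e9

print(bandwidth_gb_s(128, 3200e6))  # 51.2 GB/s on a 128-bit bus
print(bandwidth_gb_s(64, 3200e6))   # 25.6 GB/s if the bus were only 64-bit
```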
 