Nintendo Switch Tech Speculation discussion

Pre-reveal, maybe, but now that the 720p screen is known it's a given. Nobody needs to restate that every time they mention performance.

No. It has to be stated and qualified each time, because the device can be hooked up to a TV that allows for resolutions greater than 720p.
 
Any time anyone has said the portable device will be at Xbox One level of performance without placing qualifiers on it, such as a lower resolution.
This is my stance too, and I'm the biggest dreamer here, lol. I've said it before: it can be pretty close to XB1 but noticeably weaker. Expect 720p versions of XB1/PS4 1080p titles, and possibly upscaling/checkerboard rendering/some heavy dynamic resolution to hit higher resolutions. It is not going to match XB1 in raw numbers, but it's obvious its lowest performance floor is the Tegra X1 and its highest is ~750 GFLOPS (Pascal Tegra).
 
This is my stance too, and I'm the biggest dreamer here, lol. I've said it before: it can be pretty close to XB1 but noticeably weaker. Expect 720p versions of XB1/PS4 1080p titles, and possibly upscaling/checkerboard rendering/some heavy dynamic resolution to hit higher resolutions. It is not going to match XB1 in raw numbers, but it's obvious its lowest performance floor is the Tegra X1 and its highest is ~750 GFLOPS (Pascal Tegra).
That's my stance as well. Assuming there are publishers who port their AAA games to Switch, the target resolution will most certainly be 720p. Who knows, perhaps some of Nvidia's work with the tools and API makes it very easy for developers to implement checkerboard rendering. I think the tools and API from Nvidia will ultimately be more significant than whether the Tegra is 512 GFLOPS or 750 GFLOPS. Being easy to develop for, with native engine support, goes a long way.
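For what it's worth, the 720p expectation is mostly simple pixel-count arithmetic. A rough sketch, using the thread's own speculative ~750 GFLOPS figure and the Xbox One's ~1.31 TFLOPS (neither is a confirmed Switch spec):

```python
# Back-of-the-envelope: does a ~750 GFLOPS budget cover 720p versions
# of ~1310 GFLOPS / 1080p titles? The Switch number here is the thread's
# speculation, not a confirmed spec.

pixels_1080p = 1920 * 1080          # 2,073,600
pixels_720p  = 1280 * 720           #   921,600

xb1_gflops    = 1310                # Xbox One GPU, peak FP32
switch_gflops = 750                 # speculated docked Pascal Tegra upper bound

pixel_ratio = pixels_720p / pixels_1080p    # ~0.44
flops_ratio = switch_gflops / xb1_gflops    # ~0.57

print(f"720p is {pixel_ratio:.0%} of the 1080p pixel count")
print(f"750 GFLOPS is {flops_ratio:.0%} of the Xbox One's 1310 GFLOPS")
# If per-pixel cost stays roughly constant, a ~0.57x GPU budget covers a
# ~0.44x pixel load with a little headroom -- hence the 720p-port talk.
```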

 
A 2~5 W 20/16 nm chip isn't going to achieve two thirds of what a 50 W 16 nm AMD chip does.

Even Intel aren't remotely close to that.

I feel that despite the heavy burdens of both history and physics, expectations may be running away with themselves amongst the faithful.
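To put rough numbers on that gap (the 1310 GFLOPS figure below is an assumption that the 50 W chip is in Xbox One territory):

```python
# Rough perf-per-watt gap implied by "two thirds of a 50 W AMD chip in
# a 2-5 W envelope". The 1310 GFLOPS figure is an assumption.

amd_gflops, amd_watts = 1310.0, 50.0
handheld_watts = 5.0                 # generous end of the 2-5 W range
target_gflops  = amd_gflops * 2 / 3  # ~873 GFLOPS

amd_eff      = amd_gflops / amd_watts          # ~26 GFLOPS/W
required_eff = target_gflops / handheld_watts  # ~175 GFLOPS/W

print(f"50 W chip: {amd_eff:.0f} GFLOPS/W")
print(f"required:  {required_eff:.0f} GFLOPS/W (~{required_eff / amd_eff:.1f}x better perf/W)")
```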
 
My stance is that Nintendo failed so badly with the Wii U that I have permanently lost faith in anything they produce. When they released a console less than half as powerful as my then-realistic expectation of a 500 GFLOPS GPU, I can only expect them to do worse than anyone could possibly imagine. I mean, the Wii U is so weak that to this day people will still deny it, because it didn't make any sense.

I wouldn't be surprised if it was <200 GFLOPS again.
 
A 2~5 W 20/16 nm chip isn't going to achieve two thirds of what a 50 W 16 nm AMD chip does.

Even Intel aren't remotely close to that.
I don't know about Intel, but Nvidia are planning on releasing a 16nm chip with roughly 4x the performance of that AMD chip at 20w in 2017.

Now, I don't think that chip or even architecture is going into Switch (it is for Summit, Sierra, and apparently a single chip replacement for the drive PX2 system), but it does exist....
 
I don't know about Intel, but Nvidia are planning on releasing a 16nm chip with roughly 4x the performance of that AMD chip at 20w in 2017.

Now, I don't think that chip or even architecture is going into Switch (it is for Summit, Sierra, and apparently a single chip replacement for the drive PX2 system), but it does exist....
Nvidia lie about those numbers all the time.
 
I don't know about Intel, but Nvidia are planning on releasing a 16nm chip with roughly 4x the performance of that AMD chip at 20w in 2017.

Nvidia are releasing a chip with 4 x [8 Jaguar cores @ 1.75 GHz] worth of CPU power and as much GPU horsepower as an RX 480 .... at 20 W ... on 16 nm .... next year....?

Yeah, I'm just going to come straight out and call "bullshit" on that one.

Now, I don't think that chip or even architecture is going into Switch (it is for Summit, Sierra, and apparently a single chip replacement for the drive PX2 system), but it does exist....

The chip might exist, but those performance characteristics don't.
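For reference, the efficiency jump being doubted here, sketched with assumed figures (RX 480 at ~5.8 TFLOPS / 150 W board power, and ~4 TFLOPS FP32 read into the Xavier claims):

```python
# Perf/W comparison behind the scepticism above. Both the RX 480 figures
# and the ~4 TFLOPS FP32 reading of Xavier are assumptions.

rx480_gflops, rx480_watts   = 5800.0, 150.0   # reference board power
xavier_gflops, xavier_watts = 4000.0, 20.0    # claimed envelope

print(f"RX 480: {rx480_gflops / rx480_watts:.0f} GFLOPS/W")    # ~39
print(f"Xavier: {xavier_gflops / xavier_watts:.0f} GFLOPS/W")  # ~200
# Roughly a 5x perf/W gap on the same 16 nm node is what's in dispute.
```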
 
The chip might exist, but those performance characteristics don't.
[Attached images: Xavier.png, Volta.png]


Nvidia are releasing a chip with 4 x [8 Jaguar cores @ 1.75 GHz] worth of CPU power and as much GPU horsepower as an RX 480 .... at 20 W ... on 16 nm .... next year....?
The CPU side probably isn't 4x (more like 2-3x I'd guess), otherwise yeah....
 
A 2~5 W 20/16 nm chip isn't going to achieve two thirds of what a 50 W 16 nm AMD chip does.

Even Intel aren't remotely close to that.

I feel that despite the heavy burdens of both history and physics, expectations may be running away with themselves amongst the faithful.

We aren't talking about a 5 watt part, we are talking about a ~12 watt part when the thing is docked. The X1 SoC did 850 MHz / 435 GFLOPS in the Pixel C at 20 nm on 8 watts, in a 10-inch tablet running the Manhattan benchmark. Why would 750 GFLOPS be impossible, without the screen, on 16 nm FinFET at 10-12 W?
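A sketch of that extrapolation (Tegra X1 core count and FMA throughput are public; the 16 nm clocks below are assumptions, and the wider 384-core part is purely hypothetical):

```python
# Extrapolating the Pixel C data point. Tegra X1: 256 CUDA cores,
# 2 FP32 FLOPs per core per clock (FMA). The 16 nm figures are assumptions.

def fp32_gflops(cores, clock_ghz, flops_per_clock=2):
    return cores * flops_per_clock * clock_ghz

print(fp32_gflops(256, 0.85))   # ~435 GFLOPS -- Pixel C at 850 MHz, ~8 W, 20 nm
print(fp32_gflops(256, 1.00))   # ~512 GFLOPS -- X1 at its rated 1 GHz

# Hitting ~750 GFLOPS needs either a higher clock on 256 cores...
print(750 / (256 * 2))          # ~1.46 GHz required
# ...or a wider (hypothetical) part at a tamer clock on 16 nm FinFET:
print(fp32_gflops(384, 1.00))   # ~768 GFLOPS
```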
 
There's way too much speculation and too many wild dreams in this thread for it to remain in the Console Technology section, so back into the general Console Industry forum it goes.
 
OK, how's this: 512 x 4 x 2 x 1 GHz = 4 TFLOPS FP32. Then you have a chip with >750 GB/s of bandwidth; is that sufficient? Can you connect the dots from there? So damn sad...
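Making that arithmetic explicit (the contested factor is FLOPs per CUDA core per clock: the figure above assumes 8, i.e. 4 x 2, where a conventional FMA pipe gives 2 per FP32 core):

```python
# Peak FP32 under two per-core-throughput assumptions for a 512-core GPU
# at 1 GHz. Which assumption applies to Xavier/Volta is exactly the dispute.

def peak_tflops(cores, flops_per_core_per_clock, clock_ghz):
    return cores * flops_per_core_per_clock * clock_ghz / 1000.0

print(peak_tflops(512, 8, 1.0))   # 4.096 TFLOPS -- the "512 x 4 x 2" reading
print(peak_tflops(512, 2, 1.0))   # 1.024 TFLOPS -- conventional FMA assumption
```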
 
OK, how's this: 512 x 4 x 2 x 1 GHz = 4 TFLOPS FP32. Then you have a chip with >750 GB/s of bandwidth; is that sufficient? Can you connect the dots from there? So damn sad...

But where are you getting the projected clock speeds from? And where are you getting 4 x [8 Jags @ 1.75] from?

You might find it damn sad, but some actual, comprehensive answers would help prevent the sadness from spreading.

Honestly didn't know that Volta was 8 SP FLOPs per CUDA core per cycle, btw. Couldn't find anything to say that Xavier was shipping to customers in 2017, either.

Edit: Contrary to the image above suggesting Drive PX 2 draws 80 W, AnandTech seem to be pretty sure it draws much more - more like 250 W: http://www.anandtech.com/show/9903/nvidia-announces-drive-px-2-pascal-power-for-selfdriving-cars

"What isn’t in doubt though are the power requirements for PX 2. PX 2 will consume 250W of power – equivalent to today’s GTX 980 Ti and GTX Titan X cards – and will require liquid cooling. NVIDIA’s justification for the design, besides the fact that this much computing power is necessary, is that a liquid cooling system ensures that the PX 2 will receive sufficient cooling in all environmental conditions."

So at this time, I continue to be happy calling bullshit on that 20W figure.
 