Nintendo Switch Tech Speculation discussion

It makes more sense that Nvidia misled Sony than that ATI signed an exclusivity deal.
Why? nVidia had no idea what the performance of AMD's parts was, and neither did AMD know how their option compared to what nVidia were making. And what evidence do you have that nVidia lied about how awesome their hardware was but AMD didn't? Why didn't AMD make a whole load of fake claims to win the deal? To my mind it's far more likely part of MS's deal secured exclusivity because it gave them a competitive advantage, rather than nVidia lying and misleading Sony into a poor product in a way Sony's engineers weren't able to see through, while AMD were all holy-Joe, gave Sony the honest facts, and got out-manoeuvred by nVidia's dirty dealings. Legally and morally, nVidia are innocent until proven guilty in what's amounting to slander here. ;)
 
If someone claimed that the original 3DS was tested at 480MHz, would that sound outrageous? What if someone said the PSVita was tested at 800MHz?
During full production, yes and yes IMO. This isn't late prototype stage, it's production stage.
Again, I'll bend the knee if someone here has ever participated in SoC QA tests during production stage and tells me this is perfectly normal.

I feel like a broken record here, but it was pretty standard for old Nintendo handhelds to be capable of running at clock speeds far in excess of what they normally ran at.
It seems with the DS it was only the Lite that could be overclocked like that. Maybe the chip was made on a newer process than the original DS's, so it was clocked the same to maintain full compatibility and gain battery life. The GB Advance and previous consoles are so far back that I won't comment on how SoCs were made at the time.


2,000/2,000,000 = 0.1% of the production volume anticipated for the first month, seems pretty uncommon to me
With an embedded screen but unable to work without being plugged in, it's obviously a devkit. 2000 devkits isn't a rare amount.
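For reference, a quick sketch of the share being argued over, treating the quoted 2,000,000-unit first-month production figure as the unverified assumption it is:

```python
# Share of anticipated first-month production represented by the ~2,000 units spotted.
# Both figures come from the quoted post and are assumptions, not confirmed numbers.
units_spotted = 2_000
first_month_production = 2_000_000

share = units_spotted / first_month_production
print(f"{share:.2%} of anticipated first-month volume")  # -> 0.10%
```

Either way, 2,000 units is well within the size of a devkit run rather than retail stock.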


Really? Based on developer comments, Switch has been a dream to develop for.
IIRC the ones claiming the Switch is very easy to develop for are indie developers making ports of Android games. Makes sense given where Tegra chips have been so far.


So what is the pattern of behaviour that constitutes 'an nVidia' and how is this going to impact Switch?
1 - Xbox dealings gone wrong leading to class action suits
2 - Sony getting an old G70 GPU in a console that released 3 days before G80 graphics cards hit the market. By comparison, AMD has made their latest GPU tech available for their semi-custom console chips a lot earlier than they do for their own graphics cards.
3 (if it comes true) - Nintendo getting a 2-year-old SoC based on an old manufacturing process.

Sure, one can make up excuses for each one of those points and they may all be very valid and even true, but the pattern of general lack of goodwill and commitment is still there.
Numbers 1 and 2 have led to both manufacturers never dealing with nvidia again for semi-custom/custom parts, so far. For example, given how Scorpio even got rid of the EDRAM and how everything seems to be running on virtual machines, there's little reason to believe its SoC couldn't be made using Denver 2 cores and a Pascal GPU, maybe getting higher performance-per-watt than AMD's solution.
So if you argue the Xbone and PS4 used AMD SoCs because "there was no other possible solution", that's probably not true for Scorpio.


The fact MS continues to use nVidia parts in MS products shows there's no sour relationship there, because business is largely impersonal and it'd be financially damaging to select partners based on past contracts.
Surface and Xbox are different divisions with different chains of command (with a known lack of communication between divisions that Microsoft has been trying to improve).
And even if it were the same division, it's one thing to choose a graphics chip off the shelf and integrate it into a motherboard (like dozens of other OEMs do). It's a completely different thing to trust a company to design a videogame console's semi-custom SoC, whose place within the state-of-the-art cadence is absolutely crucial to its success.



Legally and morally, nVidia are innocent until proven guilty in what's amounting to slander here. ;)
This is true.
Though we're not law-enforcement or morals-enforcement officials of any kind. No one here is able to apply any kind of sanction except maybe with wallet votes that matter very little.
We're just forum users with an opinion.
 
In terms of personal experience, I can vouch for 'bumpgate' having burnt a lot of bridges for nvidia in the chipset business. Of course, with Intel moving the memory controller on-die that business died anyway, but they did themselves no favours with their pretence that everything was fine while dumping huge service costs on their customers.

The RSX debacle had many parents. Nvidia was brought on late to that project, as Sony was for a considerable time considering shipping a Cell-CPU-only PS3 configuration until saner heads prevailed. So Nvidia offered what they had, but AMD simply had better tech available at that time. Over the cycle the limitations of the RSX stood out more and more, hurting its rep, but for the time it was a fine if unremarkable chip.

With regard to the OG Xbox, NV was absolutely entitled to demand fees for access to their NV2A tech, as the Xbox team at that time was clearly just getting started and seems to have negotiated a contract with significant gaps. They were unwilling to do any extra work to get to a putative Xbox Slim, they demanded fees for NV2A emulation on 360, and they generally maximised their short-term returns from that contract. That's fine and legal, but build you a solid reputation as a good partner for future projects it does not.
 
And what evidence do you have that nVidia lied about how awesome their hardware was but AMD didn't? Why didn't AMD make a whole load of fake claims to win the deal?

Never said they didn't, but they didn't need to because their tech was superior.
To my mind it's far more likely part of MS's deal secured exclusivity because it gave them a competitive advantage,

When has that ever happened, and why would it? Why wouldn't manufacturers want their tech in as many different devices as possible? It's a load of bollocks.

Regardless of their PS3 dealings, Nvidia has always been more scummy than AMD, GameWorks being a prime example.
 
Sony were originally planning (or claimed to be planning) an early 2006 launch. Something around Easter, iirc. Back in 2004 (or whenever) G80 wouldn't have been an option. G80 was far too big for PS3, and heavy modification at short notice was not necessarily possible.

MS managed to get an early version of GF4 for the OG Xbox, back when the PC only had GF3, so it's not like Nvidia haven't ever given console vendors the latest stuff.
 
Sony were originally planning (or claimed to be planning) an early 2006 launch. Something around Easter, iirc. Back in 2004 (or whenever) G80 wouldn't have been an option. G80 was far too big for PS3, and heavy modification at short notice was not necessarily possible.

MS managed to get an early version of GF4 for the OG Xbox, back when the PC only had GF3, so it's not like Nvidia haven't ever given console vendors the latest stuff.

Yes, their 8800 series was out of the question. I'm not saying Nvidia didn't give them their best (except that it's bandwidth-starved, but so is the Xenos in the Xbox 360, so that's just Sony and MS cutting corners); I just have suspicions about why Sony chose them in the first place.

But at least this time, I'm sure Nvidia gave Nintendo the best they had.
 
there's little reason to believe its SoC couldn't be made using Denver 2 cores and a Pascal GPU, maybe getting higher performance-per-watt than AMD's solution.
So if you argue the Xbone and PS4 used AMD SoCs because "there was no other possible solution", that's probably not true for Scorpio.


Tegra Pascal clearly wasn't ready for the time frame Nintendo wanted, if Nvidia's newest Shield TV isn't even using it.
 
Never said they didn't, but they didn't need to because their tech was superior.

You have to remember that Xenos was a scuttled PC GPU architecture (not deemed competitive in the PC space and it never appeared there) that MS worked on with ATI to bring to console.

MS had a lot of experience with PC graphics and GPUs; Sony had absolutely none. Sony probably weren't in a place to do what MS did at that time - to assess the potential of an architecture not ready for market anywhere else, to customise it, split it over two dies (still not done by anyone else) and source fabs.

Sony needed a complete GPU that was ready to go and that's what they got. And in terms of perf/mm^2 and perf/watt they got something that was definitely competitive with ATI's off-the-shelf parts.

When has that ever happened, and why would it? Why wouldn't manufacturers want their tech in as many different devices as possible? It's a load of bollocks.

As Xenos wasn't "just" a slight modification of an off-the-shelf part, MS would have funded a good amount of work on it and would have protected their investment. ATI probably couldn't have presented a Xenos derivative to Sony - Sony would need to have done what MS did: seen how PC GPU workloads were shifting, looked at PC APIs, and started working in 2003 to build a GPU based on ATI's early work with unified shaders.

MS and the DirectX team were in daily contact with ATI and knew what was coming, as they were working on DX10 and constantly profiling PC games. Sony OTOH were still riding high on PS2 at that time and had an enormous boner for Cell.

You can't blame Nvidia for how PS3 ended up. RSX was Sony's best option at the point they realised they needed a PC derived GPU.
 

You can't blame Nvidia for how PS3 ended up. RSX was Sony's best option at the point they realised they needed a PC derived GPU.

Then again RSX had a bugged scaler, so I guess Nvidia should at least take an L for that.
 
Back in 2004 (or whenever) G80 wouldn't have been an option. G80 was far too big for PS3, and heavy modification at short notice was not necessarily possible.
This is as much of an assumption as anything else. A whole G80 GPU didn't need to be inside the RSX. Save for video codecs (which would be handled by the Cell anyway), the architecture was just as modular as its predecessors and successors.

MS managed to get an early version of GF4 for the OG Xbox, back when the PC only had GF3, so it's not like Nvidia haven't ever given console vendors the latest stuff.
GF3, GF4 Ti and NV2A are all part of the NV20 family. The only thing NV2A got over the existing Geforce 3 Ti500 was a second vertex shader. The rest was pretty much alike. IIRC the performance difference between a GF3 Ti500 and a GF4 Ti at the same clocks was lower than 10% in most titles.
The only truly innovative thing going into the Xbox from nvidia was Soundstorm.


Tegra Pascal clearly wasn't ready for the time frame Nintendo wanted, if Nvidia's newest Shield TV isn't even using it.
The Drive PX2 with a Tegra Pascal has been going into Tesla cars since October. The Switch's production reportedly started in November.
What you're calling "Nvidia's newest Shield TV" isn't anything other than a shameless relaunch of the 2-year-old Shield TV with a different enclosure. Actually, the model with an HDD even has the exact same enclosure as the 2-year-old one.


MS had a lot of experience with PC graphics and GPUs; Sony had absolutely none.
Back in 2004 Microsoft had designed zero GPUs, whereas Sony's internal teams had developed both the PS1 GPU and the PS2's Graphics Synthesizer, plus a co-processor inside each console's CPU dedicated to geometry transformation and lighting (T&L coprocessors in practice).



You can't blame Nvidia for how PS3 ended up. RSX was Sony's best option at the point they realised they needed a PC derived GPU.
Assumptions...
No one except some people at nvidia can know for sure if a G80 derivative could or could not be included in the PS3. That point is moot.
All we know is nvidia put G80 graphics cards on the shelves 3 days after the PS3 was launched, and SCE hasn't done business with nvidia ever since.

Would I believe nvidia said to Sony "oh gosh we have a brand new and much better architecture coming up but it's like totally impossible to put it in your console in time" during some meeting? Sure I would.
Doesn't automatically make it 100% true, though.
 
It also didn't help that the PS3's launch was delayed a year in most territories. Had it launched everywhere a year later, it could have contained a more recent and competitive GPU. More than that though, it needed at least another 256MB of VRAM.
 
All we know is nvidia put G80 graphics cards on the shelves 3 days after the PS3 was launched, and SCE hasn't done business with nvidia ever since.
OMG, the rejection! Think of the thousands and hundreds and tens...and...well two products since PS3 that Sony has released that may merit nVidia involvement. Where one was a portable and nVidia didn't have a competitive part. So one product, PS4, that nVidia could have been a contender for. Obviously they were rejected because they withheld G80 from PS3. Had nothing to do with AMD offering an SOC or better deal. Can't have done - betrayed once and we'll never do business again! :runaway:
 
The Drive PX2 with a Tegra Pascal has been going into Tesla cars since October. The Switch's production reportedly started in November.
What you're calling "Nvidia's newest Shield TV" isn't anything other than a shameless relaunch of the 2-year-old Shield TV with a different enclosure. Actually, the model with an HDD even has the exact same enclosure as the 2-year-old one.


On the one hand you have a product costing $300, on the other $50,000. Pretty huge market difference there. And besides, that's a mere month before production. Game consoles aren't designed in a day; you can't just scrap all previous hardware choices that close to production without a big delay. Which, in Sony's case, is why they couldn't have used G80.

Another thing is that Nintendo actually has precedent for wanting to use dated tech, unlike Sony, even though they've also made cutting-edge designs like the GameCube and N64. They've already denied Nvidia with the 3DS, so I see no reason why they'd let Nvidia give them underperforming/dated tech unless they wanted it.
 
All we know is nvidia put G80 graphics cards on the shelves 3 days after the PS3 was launched, and SCE hasn't done business with nvidia ever since.
Bolting on an XDR I/O onto an existing and well known IP (G7x) probably didn't take nearly as much manpower & time as it would have if they had to go through a bunch more testing on the to-be-released G8x. One might consider that the bulk of the relevant resources was out to push the massive G80, as the rest of the line-up such as G84/86 wasn't launched until April 2007.
 
OMG, the rejection! Think of the thousands and hundreds and tens...and...well two products since PS3 that Sony has released that may merit nVidia involvement. Where one was a portable and nVidia didn't have a competitive part. So one product, PS4, that nVidia could have been a contender for. Obviously they were rejected because they withheld G80 from PS3. Had nothing to do with AMD offering an SOC or better deal. Can't have done - betrayed once and we'll never do business again! :runaway:

I just stated facts. Your hyperbole is just yours :p
We can also simply assert that the last 4 consoles from Sony and Microsoft were AMD design wins.
Also, nvidia has never achieved more than one design win with any console maker, ever. AMD has achieved that with Sony (2 in a row), Microsoft (3 in a row) and Nintendo (3 in a row if we count ArtX then ATi).


On the one hand you have a product costing $300, on the other $50,000.
First you state a Tegra Pascal wouldn't be ready for Switch's production. After being proven wrong, you're making it about cost.
Moving goalposts...



Nevertheless, bolting on an XDR I/O onto an existing and well known IP (G7x) probably didn't take nearly as much manpower & time as it would have if they had to go through a bunch more testing on the to-be-released G8x. One might consider that the bulk of the relevant resources was out to push the massive G80, as G86 wasn't launched until April 2007.
This is true. It's not a definite conclusion, though. Perhaps it would have been impossible. Perhaps putting G80 into the PS3 meant delaying the 8800 GTX release for a quarter or so, and nvidia just wasn't willing to do that because they saw the PC market as more important. Perhaps nvidia just felt threatened by how much these consoles were going to hurt their main PC AIB business and wanted to make the PC's advantage clear from the start.

All these maybes don't change the facts, though.
 
This is as much of an assumption as anything else. A whole G80 GPU didn't need to be inside the RSX. Save for video codecs (which would be handled by the Cell anyway), the architecture was just as modular as its predecessors and successors.

PS3 was originally planned to launch 6 months (or more) before G80 appeared. Despite the modularity of GPUs, smaller GPUs based on the G80 architecture did not appear until well into 2007.

You're suggesting that it would be realistic for Nvidia to accelerate their unified program by at least 6 months (!?), and to do so for a massive-volume, low-margin part, to the detriment of their first unified chip in their core market, which was a massive-margin, trickle-volume GPU.

We have seen that neither Nvidia nor ATI/AMD have ever introduced a radical new architecture for consoles 6+ months before the PC space. And Xenos was not an example of that either - it was not an "early" 2900, it was a spinoff from an abandoned PC GPU.

To suggest that a G80 spinoff for PS3 was as realistic an option as RSX is not reasonable.

GF3, GF4 Ti and NV2A are all part of the NV20 family. The only thing NV2A got over the existing Geforce 3 Ti500 was a second vertex shader. The rest was pretty much alike. IIRC the performance difference between a GF3 Ti500 and a GF4 Ti at the same clocks was lower than 10% in most titles.
The only truly innovative thing going into the Xbox from nvidia was Soundstorm.

PC games designed for machines with limited vertex processing abilities are a poor way to judge how much "faster" GF4 could be in heavily vertex bottlenecked games due to its second vertex shader. Unsurprisingly, games with low vertex counts won't benefit greatly from a huge increase in vertex processing capability.

You'll note that even NV2A was a spinoff from the GF3/GF4 line, and not an early introduction of a radical new architecture.

Back in 2004 Microsoft had designed zero GPUs, whereas Sony's internal teams had developed both the PS1 GPU and the PS2's Graphics Synthesizer, plus a co-processor inside each console's CPU dedicated to geometry transformation and lighting (T&L coprocessors in practice).

I'm not sure if you're really missing the point, or simply being obtuse.

MS had a mass of experience with graphics. They were responsible for DX, were working on DX10, and would have been working with Nvidia, ATI, and makers of PC engines and middleware on an ongoing basis in order to understand and plan for evolving technologies.

PS1 and PS2 graphics chips gave Sony zero visibility on where the PC was going and had nothing at all to do with Xenos or RSX. They were isolated - in terms of both hardware and software evolution - from the PC space.

Assumptions...
No one except some people at nvidia can know for sure if a G80 derivative could or could not be included in the PS3. That point is moot.
All we know is nvidia put G80 graphics cards on the shelves 3 days after the PS3 was launched, and SCE hasn't done business with nvidia ever since.

That's not all we know, but it seems it's all you're willing to concede.

RSX would have needed to be in mass production before G80 started to trickle off the line, and it was originally intended for PS3 to launch many months earlier than it did. There was no unified shader part in RSX's performance and power segment until 18 months [edit: nope, 12 months] and one node change after PS3 was originally planned to launch.

It's also possible a 90 nm G80 derivative would have performed worse given the same area and power. Some developers here actually argued that point in the past.

You'll note that in games of the time, perf/mm^2 was lower for G80 than for the 7900 GTX, and that this was especially marked at lower resolutions:

https://www.techpowerup.com/reviews/NVIDIA/G80/6.html

Yep, you heard right. Especially in light of a vertex cull monster like Cell, it's quite possible that a similarly sized G80 derivative would have been less ideal than RSX.
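Just to put rough numbers on the perf/mm^2 point, here's a back-of-the-envelope sketch. The die areas are the commonly cited figures for the two 90 nm chips; the ~1.5x relative-performance number is an assumption eyeballed from period reviews at lower resolutions (like the one linked above), not a measured value:

```python
# Rough perf-per-area comparison between 7900 GTX (G71) and 8800 GTX (G80),
# both 90 nm parts. The relative-performance figure is an assumed ~1.5x
# advantage for G80 in games of the time at lower resolutions.
chips = {
    "7900 GTX (G71)": {"die_area_mm2": 196, "relative_perf": 1.0},
    "8800 GTX (G80)": {"die_area_mm2": 484, "relative_perf": 1.5},
}

for name, c in chips.items():
    perf_per_mm2 = c["relative_perf"] / c["die_area_mm2"]
    print(f"{name}: {perf_per_mm2:.4f} relative perf per mm^2")
# -> roughly 0.0051 for G71 vs 0.0031 for G80, i.e. under those (assumed)
#    conditions G80 delivers less performance per unit of die area.
```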

I'll reiterate - even if a G80 derivative had been possible (and it probably wasn't) it may have been a worse fit for PS3.

But this is all getting very off topic. Suffice to say, Nintendo and Nvidia will both have their reasons for what's in Switch, and they're probably very good reasons.
 
Suffice to say, Nintendo and Nvidia will both have their reasons for what's in Switch, and they're probably very good reasons.

I have to wonder if the tablet prototype that AMD came up with a couple years back (I guess they didn't Mull over it too long :cool:) was actually a prototype choice offered to Nintendo.
 
Let's nix this idea that Pascal is 'ready'; it's in cars attached to batteries storing kWh worth of energy. If that thing worked in a mobile device it would be there, as they would sell a rake-load more of them in almost any mobile device than Elon Musk is going to sell pricey motors. Maybe Pascal is super ready and Nintendo chose the X1 to spite themselves, but I think a safer bet (especially as NV aren't pitching Pascal as a mobile chip) is to assume the power-efficient Pascal parts are several months away at the soonest.
 
Not to add fuel to the fire, but I am pretty sure that the validation process for the automobile industry is a tad stricter than for an ordinary consumer product.

So, any noteworthy news about the Switch? :)
 