NVIDIA Tegra Architecture

Or they just conveniently kept the 2:1 PS to VS Vec4 ALU ratio T3 introduced. Vertex rate demand isn't going to increase six-fold within just one year in mobile games.
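For reference, a rough sketch of where that 6x figure could come from, assuming the commonly reported shader configurations (Tegra 3: 8 pixel + 4 vertex Vec4 ALUs; Tegra 4: 48 + 24; those counts are an assumption on my part, not something stated in this thread):

```python
# Assumed (commonly reported) Vec4 ALU configurations, not from this thread:
t3_ps, t3_vs = 8, 4     # Tegra 3: pixel / vertex shader ALUs
t4_ps, t4_vs = 48, 24   # Tegra 4: pixel / vertex shader ALUs

print(f"T3 PS:VS ratio: {t3_ps // t3_vs}:1")         # 2:1
print(f"T4 PS:VS ratio: {t4_ps // t4_vs}:1")         # 2:1, ratio kept
print(f"VS ALU growth T3 -> T4: {t4_vs // t3_vs}x")  # the 6x in question
```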
That's probably true, but do you know if T3 reaches its peak setup rate in many non-pixel-limited situations? Maybe T3 is more VS bound than setup bound in games, so some of the 6x increase could be balancing the architecture and some accounting for more complex games.
 
That's probably true, but do you know if T3 reaches its peak setup rate in many non-pixel-limited situations?

Since it's probably true, why would I as a layman bother to over-analyze it, given that T4 isn't going to set any sales records or mobile games development examples worth mentioning? While it's a legitimate and good question, I doubt even NV's marketers would have a clue how to answer something that complex unless you prepared them for it :p

Maybe T3 is more VS bound than setup bound in games, so some of the 6x increase could be balancing the architecture and some accounting for more complex games.

They didn't bother to upgrade the Z units either compared to T3, yet both T3 and the early T4 signs point at excellent performance in GLB2.7. The trick question here would be whether you'd prefer a hardware designer to create a GPU pipeline that is as balanced as possible across as many aspects as possible, or one who just skips updating several parts of it because of "good enough" predictions for future game development scenarios.

Since the introduction of KAYLA the majority of heads have turned in that direction, and for good reason. In fact I've been thinking these days that NV might be considering giving ULV Haswell variants a run for their money with the highest Logan variant. A brilliant idea if true, and no, in that case any worries about the setup units being too small are obviously redundant.
 
5.7 inch display, huh. I guess they're going big for more battery space and thermal mass.

5.7" display with a resolution of only 1280*720.
This is still some 260 dpi so it's not bad, but it's just not flagship material anymore.
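The arithmetic checks out; a minimal sketch (pixel density is just the diagonal pixel count divided by the diagonal size in inches):

```python
import math

# 5.7" diagonal, 1280x720 panel: ppi = diagonal pixels / diagonal inches
w_px, h_px, diag_in = 1280, 720, 5.7
ppi = math.hypot(w_px, h_px) / diag_in
print(f"{ppi:.0f} ppi")  # ~258, i.e. the "some 260 dpi" mentioned above
```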
 
Interesting. NVIDIA has actually concentrated on its OpenGL drivers to such an extent that they are better than its Windows drivers; Qualcomm has quite a way to go on that front.
http://gfxbench.com/compare.jsp?D1=Microsoft+Surface+RT&D2=NVidia+Wayne&D3=NVidia+Shield&D4=Apple+iPad+4&D5=HTC+One&D6=HTC+One+X+%28EndeavorU%29&D7=HTC+One+X+%28Evita%29&D8=HTC+Windows+Phone+8X&cols=8

No, the NVIDIA Wayne "Covington" device appears to be running at a much lower GPU clock operating frequency compared to any Tegra 4 production device such as Shield. FYI, there is actually a design verification engineer at Microsoft by the name of Covington, so this is probably just a test platform for Tegra 4 on Windows. Strangely enough, the OS is being reported in GFXBench as Windows 8 rather than Windows RT, but that could be a simple mistake in the GFXBench database. Assuming that the NVIDIA Wayne Windows test platform is running at half the GPU operating frequency of Tegra 4 production devices, then the GFXB 2.5 performance in Windows should be virtually identical to the GFXB 2.5 performance in Android.

P.S. Even though Shield has a heatsink/fan, the purpose there was not to crank up max operating frequencies to improve benchmark performance vs. thin fanless Tegra 4 tablets/smartphones. The purpose of using a heatsink/fan was to allow gamers to play for hours on end without having to worry about thermal throttling reducing performance over time, and without having to worry about the device becoming warm or hot to the touch over time.
 

I don't understand what the "no" was in relation to, ams? I agree the GPU frequency was not at 670 MHz or anywhere near it; I was just pointing out the driver optimisation difference between Windows and OpenGL.

Besides, saying a heatsink/fan has nothing to do with allowing higher frequencies sounds a little off. I would have thought they are intrinsically linked to one another.

I'm willing to bet 100 rupees :) that we won't see NVIDIA Shield performance in any smartphone. Period. Also, the heatsink and fan have everything to do with why the Shield can deliver it.
 
Logically speaking, it is much more likely that the GFXB 2.5 score in Windows is based on a Wayne platform operating at about half the GPU clock operating frequency of Tegra 4 production devices, rather than some mysterious Android driver optimizations that would give them a 2x performance boost vs. Windows drivers.
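A back-of-the-envelope way to see this, assuming the benchmark is GPU-limited and scales roughly linearly with GPU clock (the fps number below is a made-up placeholder, not a measured score):

```python
# Hypothetical numbers, purely to illustrate the normalization logic:
android_fps = 30.0   # placeholder GFXB 2.5 score of a T4 production device
clock_ratio = 0.5    # Wayne Windows platform at ~half the GPU clock

expected_windows_fps = android_fps * clock_ratio
print(f"Expected Windows score at half clock: {expected_windows_fps:.1f} fps")
# If the measured Windows score lands near this, no mysterious 2x Android
# driver advantage is needed to explain the gap.
```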

In theory one could boost CPU/GPU operating frequencies when using a heatsink/fan while staying within certain thermal threshold limits, but the power consumption would go up and the battery life would go down (possibly in a non-linear fashion too relative to the frequency increase), so that is not necessarily a good tradeoff. And considering that Shield volumes sold are an unknown even to NVIDIA, it wouldn't make sense to use any specially binned T4 SoC. So it should be obvious that a heatsink/fan will allow for longer periods of gaming without reaching thermal threshold limits and without having a chassis that becomes warm or hot to the touch. So the performance benefit of using a heatsink/fan will not necessarily show up in the form of higher maximum fps, but rather higher sustained fps over time due to reduced thermal throttling.
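To illustrate the non-linear part, here is a minimal sketch using the usual first-order CMOS dynamic power model (P ≈ C·V²·f); the voltage bump assumed for the higher clock is an illustrative guess, not a T4 datum:

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f (C treated as constant).
# The +10% voltage needed to sustain a +20% clock is an illustrative guess.
base_v, base_f   = 1.00, 1.00   # normalized voltage and frequency
boost_v, boost_f = 1.10, 1.20   # hypothetical boosted operating point

power_ratio = (boost_v**2 * boost_f) / (base_v**2 * base_f)
print(f"+{boost_f - 1:.0%} clock -> +{power_ratio - 1:.0%} power")
# ~45% more power for 20% more frequency: battery life drops faster
# than performance rises.
```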
 
@ams I KNOW... we're talking about different things. My driver reference was just a general comment, not linked to Wayne.

I agree with others, including you, that it is clocked lower; that's not under debate and never was.

About the heatsink and fan (and larger battery): whilst you're correct that it would allow longer playing time, I'm also quite sure it would allow them to use higher clocks... as I said, we won't be seeing that performance in a smartphone. No way.
 
http://gfxbench.com/device.jsp?benchmark=gfx27&D=NVidia+Shield

The first GLB2.7 results at what seems to be full frequency. If those don't increase and the earlier leaked Adreno 330 results were for real, then the latter might end up about 20% faster.
I doubt that GLB results have much to do with real in-game performance considering thermal throttling, which should be higher for Adreno 330 compared to Adreno 320 (especially in real devices).
 
I doubt that GLB results have much to do with real in-game performance considering thermal throttling, which should be higher for Adreno 330 compared to Adreno 320 (especially in real devices).

So you think Tegra 4 is not going to throttle in a smartphone, or that Adreno 330/320 will continue to throttle in a tablet-sized device?
 
I doubt that GLB results have much to do with real in-game performance considering thermal throttling, which should be higher for Adreno 330 compared to Adreno 320 (especially in real devices).

I haven't read about any throttling in Anand's recent reviews of Adreno 320 devices. Do you have any facts or first-hand experience that speak for it?

Let alone that S800/Adreno 330 hasn't been publicly tested yet by independent sources, so there's little basis to speculate about any thermal throttling. The very same goes, of course, for Wayne/Shield, which, as Jubei points out above, hasn't been tested in a final device by an independent 3rd party either.

The Adreno 330 results that appeared briefly in the Kishonti database indicated beyond 1200 frames. Now mark the "ifs" in my former post on that, both for Wayne and for S800.

So you think Tegra 4 is not going to throttle in a smartphone, or that Adreno 330/320 will continue to throttle in a tablet-sized device?

As I said, I haven't read or heard so far about any severe throttling of the S600 in the Galaxy S4, and there both the CPU and GPU are clocked higher than usual. In that benchmark, a smartphone design is at the moment just a tiny bit below Shield, a tablet design.

Now Wayne/Shield's performance could still increase through driver optimisations, since devices (except probably Shield?) haven't shipped yet. But until then it remains that QCOM probably was spot on with its claims for its current and upcoming SoCs.
 
Don't forget the S800 is on a new HKMG HP process over at TSMC; according to Qualcomm, this allows the Adreno 330 to consume half the power of what they quoted as "previous generation".

Now Qualcomm is usually the most accurate at performance forecasts, although to be honest this is the only time I've had doubts about that being true.
 
The performance differences in GFXBench 2.7 are largely academic, because no current or near-future mobile handheld device is capable of smooth and playable framerates at the 1080p Offscreen settings. That said, the Tegra 4-powered Shield gets a 50% increase in GFXBench 2.7 performance when moving from 1080p (Offscreen) to 720p (Onscreen) resolution, enough to give Shield a 2.2x advantage in average frames per second at native resolution compared to the iPad 4! So Shield really is built to deliver smoother framerates and less throttling during extended gaming compared to ultra-high-res smartphones and tablets, with some additional benefits coming from the console-grade controller and relatively high battery capacity.
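For what it's worth, a quick sketch of that scaling, using the standard 1080p/720p pixel counts and the 1.5x fps figure above:

```python
# Standard pixel counts plus the 1.5x fps gain quoted above.
px_1080p = 1920 * 1080          # offscreen render resolution
px_720p  = 1280 * 720           # Shield's native onscreen resolution
fps_gain = 1.5                  # 50% fps increase going 1080p -> 720p

print(f"Pixel-count drop 1080p -> 720p: {px_1080p / px_720p:.2f}x")  # 2.25x
print(f"Observed fps gain:              {fps_gain:.2f}x")
# fps rises only 1.5x despite 2.25x fewer pixels, so at 720p the bottleneck
# has shifted partly away from pure pixel/fill work.
```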
 