NVIDIA Tegra Architecture

Yup, depends on what you consider your primary usage, or what is more important. And, of course, what your own personal preference is.

16:9 and 16:10 are great for watching video, not so great, IMO, for reading text in portrait. For that, 4:3 or 3:2 is preferable, again IMO.

Regards,
SB
 
It's a Bluetooth pedal controller that you can use either on your desktop to spam spellcasts in a World of Warcraft kind of game, or to try to control your character in an FPS game you play on the bus, on a Tegra device with a touch screen.
 
It's a new software API called nFlight that simulates flocks of birds, flies, mosquitoes, etc. in games. It relies on PhysX for the physics and on new proprietary CUDA-based software for the AI. You need at least a GeForce GTX 650 Ti for it.
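For anyone curious what "simulating flocks" boils down to computationally, it's essentially the classic boids model: per agent, steer toward the group (cohesion), match your neighbours' velocity (alignment), and keep some distance (separation). A toy CPU sketch in plain C, purely illustrative; there are no actual nFlight, PhysX or CUDA calls in here:

```c
/* Minimal boids-style flocking step, plain C99. Illustrative only. */
#include <stdio.h>
#include <stddef.h>

typedef struct { float x, y; } Vec2;
typedef struct { Vec2 pos, vel; } Boid;

static Vec2 add(Vec2 a, Vec2 b)    { return (Vec2){ a.x + b.x, a.y + b.y }; }
static Vec2 sub(Vec2 a, Vec2 b)    { return (Vec2){ a.x - b.x, a.y - b.y }; }
static Vec2 scale(Vec2 a, float s) { return (Vec2){ a.x * s, a.y * s }; }

/* One step: each boid steers toward the flock centre (cohesion), matches
 * its neighbours' velocity (alignment) and keeps some distance (separation).
 * Positions are updated in place, which is good enough for a sketch. */
static void flock_step(Boid *b, size_t n, float dt)
{
    if (n < 2) return;
    for (size_t i = 0; i < n; ++i) {
        Vec2 centre = {0, 0}, avg_vel = {0, 0}, repel = {0, 0};
        for (size_t j = 0; j < n; ++j) {
            if (j == i) continue;
            centre  = add(centre, b[j].pos);
            avg_vel = add(avg_vel, b[j].vel);
            Vec2 d = sub(b[i].pos, b[j].pos);
            if (d.x * d.x + d.y * d.y < 1.0f)   /* too close: push apart */
                repel = add(repel, d);
        }
        float inv = 1.0f / (float)(n - 1);
        Vec2 steer = add(add(scale(sub(scale(centre, inv), b[i].pos), 0.01f),
                             scale(sub(scale(avg_vel, inv), b[i].vel), 0.05f)),
                         scale(repel, 0.10f));
        b[i].vel = add(b[i].vel, steer);
        b[i].pos = add(b[i].pos, scale(b[i].vel, dt));
    }
}

int main(void)
{
    Boid flock[3] = { {{0, 0}, {1, 0}}, {{2, 1}, {0, 1}}, {{-1, 2}, {1, 1}} };
    for (int step = 0; step < 100; ++step)
        flock_step(flock, 3, 0.016f);
    printf("boid 0 ended up at (%.2f, %.2f)\n", flock[0].pos.x, flock[0].pos.y);
    return 0;
}
```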
 
It's a new software API called nFlight that simulates flocks of birds, flies, mosquitoes, etc. in games. It relies on PhysX for the physics and on new proprietary CUDA-based software for the AI. You need at least a GeForce GTX 650 Ti for it.
Oh I had hoped for TressFX support. How disappointed I am.
 
Isn't TressFX based on Direct Compute? In other words, it's already supported by NVIDIA, though from what I hear it could use some performance optimization.

Another thing, what does this have to do with Handheld Technology? :p
 
It's a new software API called nFlight that simulates flocks of birds, flies, mosquitoes, etc. in games. It relies on PhysX for the physics and on new proprietary CUDA-based software for the AI. You need at least a GeForce GTX 650 Ti for it.

This would actually be useful in Doom 4. I hope they implement persistent corpses, so flies could eventually fly around them (duh, you had some in Quake 2). And of course, to raise the limit of Lost Souls spawned by Pain Elementals on supported systems :D. Here's hoping Doom 4 does have Pain Elementals. Managing the fireballs thrown at you by a horde of 100 slow-moving Imps, too.

Of course, you ought to be able to encounter more than one Pain Elemental at once. But have them die in a horrible way, turned into a pile of brown slime, so that the Arch-Vile cannot resuscitate them.
 
Isn't TressFX based on Direct Compute? In other words, it's already supported by NVIDIA, though from what I hear it could use some performance optimization.
Oh, it's terrible! I hope AMD will soon announce that they have breaking news to share, and that this news will be that they have ported TressFX to OpenCL.

Another thing, what does this have to do with Handheld Technology? :p
Humor fits everywhere :D
 
Congrats to Nvidia on the Surface 2 design win.
It was expected; let's see if sales take off with this new model (IMHO, I don't think so, the price is set too high).

Another major design win for T4 with CoolPad, a top-10 smartphone vendor in China:
http://www.engadget.com/2013/09/23/coolpad-magview-4/

 
Congrats to Nvidia on the Surface 2 design win.

Yes, although Nvidia must be getting peeved with Microsoft for not optimising W8.1 for its power-saving "shadow core"... Seriously, if you're not going to enable its main selling point, why not go for an asynchronous SoC, which would be a better solution? Rather odd if you ask me; not the Tegra 4 SoC choice, but rather the implementation of it.
 
Yes, although Nvidia must be getting peeved with Microsoft for not optimising W8.1 for its power-saving "shadow core"... Seriously, if you're not going to enable its main selling point, why not go for an asynchronous SoC, which would be a better solution? Rather odd if you ask me; not the Tegra 4 SoC choice, but rather the implementation of it.

nVidia can't flick a switch to get asynchronous CPU cores here - as long as they're using Cortex-A15 it's simply not an option. And even if it was, it'd still mean bringing in more power rails and probably a different/more complex PMIC to support that.
 
nVidia can't flick a switch to get asynchronous CPU cores here - as long as they're using Cortex-A15 it's simply not an option. And even if it was, it'd still mean bringing in more power rails and probably a different/more complex PMIC to support that.

I get the feeling that French Toast meant that if Nvidia can't use its power-saving core with 8.1 RT, then Microsoft should have chosen a different SoC, such as an S800, instead of giving us an imperfect T4 implementation.

Tegra 4 must have offered them a really great price, but the Surface 2 isn't a budget slate, so did MS really need to save a few $$?
 
I've heard that Nvidia won the deal because of the quality of their Windows drivers, much better than Qualcomm's.
Is it true? What's the state of Qualcomm's DX drivers? Do we have a Win8 tablet with a Snapdragon SoC?
 
I've heard that Nvidia won the deal because of the quality of their Windows drivers, much better than Qualcomm's.
Is it true? What's the state of Qualcomm's DX drivers? Do we have a Win8 tablet with a Snapdragon SoC?

The Samsung ATIV Tab uses an older Qualcomm APQ8060A; AnandTech preferred it to the Tegra 3 devices.

Qualcomm powers all the Windows Phone 8 devices, which use the same Windows NT kernel, AFAIK, as their Windows RT siblings, so they must be up to speed in terms of software, but drivers may possibly be an issue.

http://www.anandtech.com/show/6528/samsung-ativ-tab-review-qualcomms-first-windows-rt-tablet/6
 
Yeah, that's what I'm trying to say: Microsoft didn't implement the feature on Tegra 3 either, so it's either a feature that Microsoft's internal testing found didn't work, or (most likely) Microsoft was too lazy to code for it.
Google worked hard to get Nvidia's feature working from the start, whereas Microsoft has had two generations of hardware and software and still no breakthrough.
It's obviously important, because Tegra 4's cores can't be clocked independently and are high-clocked A15s, so they are very inefficient in this crude/broken setup that Microsoft has implemented.
I know some of the cores can be power-gated... I'm not sure if that's all cores in any order, or a minimum of say two on at once... perhaps someone could fill me in there (see the sketch just below).
The 1080p screens, merely matching Q4 2012 smartphones, are also a letdown... but that's obviously for another thread :)
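For what it's worth, on the Android/Linux side you can poke at this from userspace: the kernel exposes per-core hotplug state and cpufreq readings through sysfs. A rough sketch in plain C, assuming the standard /sys/devices/system/cpu layout (exact paths and core count vary per kernel and device):

```c
/* Rough userspace sketch (Linux/Android, standard sysfs layout assumed):
 * prints which cores are currently online and what frequency each core's
 * cpufreq policy reports. Run on the device itself. */
#include <stdio.h>

static long read_long(const char *path, long fallback)
{
    long v = fallback;
    FILE *f = fopen(path, "r");
    if (f) {
        if (fscanf(f, "%ld", &v) != 1)
            v = fallback;
        fclose(f);
    }
    return v;
}

int main(void)
{
    char path[128];
    for (int cpu = 0; cpu < 4; ++cpu) {            /* Tegra 4: four main A15 cores */
        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu%d/online", cpu);
        long online = read_long(path, 1);          /* cpu0 often has no "online" file */

        snprintf(path, sizeof path,
                 "/sys/devices/system/cpu/cpu%d/cpufreq/scaling_cur_freq", cpu);
        long khz = read_long(path, -1);

        printf("cpu%d: online=%ld freq=%ld kHz\n", cpu, online, khz);
    }
    return 0;
}
```

If the four main cores really do share one clock, every online core reports the same scaling_cur_freq, while a core that has been hot-unplugged (and typically power-gated) reads back online=0.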
 
It's kind of confusing that they say they don't know how good PowerVR's support is because it doesn't support OpenGL ES 3.0 yet, but they can claim to know that NVIDIA has excellent support for OpenGL ES 3.0. AFAIK, Tegra doesn't support OpenGL ES 3.0. Is it correct of them to assume that, since the GeForce cards have pretty good support, Tegra will as well? Especially as they have no experience with Tegra.
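FWIW, what a given device actually exposes is easy to check at runtime: try to create an ES 3.0 context and read back GL_VERSION. A rough sketch in plain C with EGL, using the old pre-EGL-1.5 approach of requesting client version 3 on an ES2-capable config (details vary by platform, error handling mostly omitted):

```c
/* Quick runtime check: does this device's driver give us an OpenGL ES 3.0
 * context? Sketch only -- 1x1 pbuffer, minimal error handling. */
#include <stdio.h>
#include <EGL/egl.h>
#include <GLES2/gl2.h>   /* glGetString / GL_VERSION */

int main(void)
{
    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    EGLint cfg_attr[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                          EGL_SURFACE_TYPE,    EGL_PBUFFER_BIT,
                          EGL_NONE };
    EGLConfig cfg;
    EGLint num = 0;
    if (!eglChooseConfig(dpy, cfg_attr, &cfg, 1, &num) || num < 1) {
        printf("No usable EGL config found.\n");
        return 1;
    }

    /* Ask for a client version 3 context; drivers without ES 3.0 refuse. */
    EGLint ctx_attr[] = { EGL_CONTEXT_CLIENT_VERSION, 3, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attr);

    if (ctx == EGL_NO_CONTEXT) {
        printf("No OpenGL ES 3.0 context available on this device.\n");
    } else {
        EGLint pb_attr[] = { EGL_WIDTH, 1, EGL_HEIGHT, 1, EGL_NONE };
        EGLSurface surf = eglCreatePbufferSurface(dpy, cfg, pb_attr);
        eglMakeCurrent(dpy, surf, surf, ctx);
        printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));
    }
    eglTerminate(dpy);
    return 0;
}
```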
 