NVIDIA Tegra Architecture

It would be more competitive if it had LTE.
Then it would end up being $100 more, and as you said, the price is already limiting the volume.

For programming it's a nice device, but there are lower-priced alternatives for mainstream users.
 
AFAIK, the Nexus 9 ended up being a pretty low-volume device too, thanks to its inflated price.
 
AFAIK, the Nexus 9 ended up being a pretty low-volume device too, thanks to its inflated price.

So where is NVIDIA mostly selling? If not phones or tablets, can Shield sales plus sales of some car devices be enough to continue development?

Other SoC architectures are being fueled by big sales and profits. How does NVIDIA keep up against Samsung, Apple and Qualcomm?
 
According to NVIDIA, the automotive market has very high margins, so that is supposed to drive their SoCs, along with whatever they can get from Shield consoles and high-end tablets.

Moreover, NVIDIA has stated that the development of the GeForce ULP range was the main factor driving Maxwell's huge boost in energy efficiency and thermals. So at least part of the increase in market share for their discrete graphics cards came from Tegra's R&D efforts as well.

And Denver is supposed to be coming to their discrete GPUs as well.
 
So where is NVIDIA mostly selling? If not phones or tablets, can Shield sales plus sales of some car devices be enough to continue development?

Other SoC architectures are being fueled by big sales and profits. How does NVIDIA keep up against Samsung, Apple and Qualcomm?

As far as I know, the Tegra division has been losing money quarter after quarter for years. I guess NVIDIA is hoping that cars and other products will make it profitable eventually.
 
According to NVIDIA, the automotive market has very high margins, so that is supposed to drive their SoCs, along with whatever they can get from Shield consoles and high-end tablets.

So far the division hasn't shown any worthwhile profit; it doesn't really have to, as it might be their ticket to other emerging markets in the future.

Moreover, NVIDIA has stated that the development of the GeForce ULP range was the main factor driving Maxwell's huge boost in energy efficiency and thermals. So at least part of the increase in market share for their discrete graphics cards came from Tegra's R&D efforts as well.

If you burst that marketing bubble:
1. You probably still can't fit the X1 GPU at 1GHz into an ultra-thin tablet.
2. None of the desktop GPUs clock at "just" 1GHz, and heck, you'll have a hard time finding a SKU running at NV's default frequencies anyway, and most definitely NOT one at the peak consumption values NV states for the default frequencies.

It's not that the influence isn't real, but there's a shitload of marketing involved as well. See what kind of GTX 970s are available for sale right now and how many of them actually have a 145W max power rating.

Not power- but design-related: the X1 GPU has FP16, just as the upcoming desktop Pascal will. A chicken-and-egg dilemma too, since FP16 on upcoming desktop GPUs isn't exactly unique either, or without benefits. The point being that while there are distinct differences between ULP mobile and desktop, there are obviously also aspects, call them "needs", that run in parallel. If perf/W is not THE defining factor for HPC solutions these days, then what is?
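
To make the FP16 point concrete: on sm_53 (the X1's compute capability) CUDA exposes packed half2 intrinsics, so one 32-bit register holds two FP16 values and a single instruction performs two adds. A minimal, hypothetical sketch of that, with arbitrary sizes and values; build with nvcc -arch=sm_53:

```cuda
#include <cuda_fp16.h>
#include <cstdio>

// Fill packed half2 values on the device; host-side FP16 helpers vary
// across CUDA versions, so the conversion is done in a kernel.
__global__ void fill(__half2* p, int n, float v) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] = __float2half2_rn(v);  // replicate v into both 16-bit lanes
}

// Packed FP16 add: two 16-bit additions per instruction, which is where
// the doubled FP16 throughput on X1 (and later Pascal) comes from.
__global__ void add2(const __half2* a, const __half2* b, __half2* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = __hadd2(a[i], b[i]);
}

int main() {
    const int n = 1 << 20;                // 1M half2 elements = 2M FP16 values
    __half2 *a, *b, *c;
    cudaMalloc(&a, n * sizeof(__half2));
    cudaMalloc(&b, n * sizeof(__half2));
    cudaMalloc(&c, n * sizeof(__half2));
    const int blocks = (n + 255) / 256;
    fill<<<blocks, 256>>>(a, n, 1.5f);
    fill<<<blocks, 256>>>(b, n, 2.5f);
    add2<<<blocks, 256>>>(a, b, c, n);
    __half out[2];                        // the first half2 holds two results
    cudaMemcpy(out, c, sizeof(out), cudaMemcpyDeviceToHost);
    printf("%f %f\n", __half2float(out[0]), __half2float(out[1]));  // expect 4.0 4.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```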

And Denver is supposed to be coming to their discrete GPUs as well.

I don't think so... o_O And that's exactly the spot where it bounces back to the first paragraph above. How much of the original plans/purposes is actually left today, or was it all just a weird bunch of rumors based on wrong assumptions?
 
Many people were waiting for this one, the Jetson TX1 dev module:
http://www.anandtech.com/show/9779/nvidia-announces-jetson-tx1-tegra-x1-module-development-kit
[Image: Jetson TX1 module]

[Image: Jetson TX1 carrier board]
 
http://www.anandtech.com/show/9779/nvidia-announces-jetson-tx1-tegra-x1-module-development-kit

I wonder if the OS distro/drivers will work on a rooted Shield TV. I know people who have gotten Linux working on it, but so far without any kind of GPU acceleration. A standard Linux dev environment with OpenGL 4+ and OpenGL ES 3+ would make Shield TV very attractive, while being much cheaper and having some more benefits than this board.
 
Doesn't look like they have it yet, but I got my TK1s from Avionic in Germany -- http://www.avionic-design.de/ -- because of the North America-only preorder last time around. I'll give them a nudge and ask them if they plan to do the new version too.
 
I wonder if the OS distro/drivers will work on a rooted Shield TV. I know people who have gotten Linux working on it, but so far without any kind of GPU acceleration. A standard Linux dev environment with OpenGL 4+ and OpenGL ES 3+ would make Shield TV very attractive, while being much cheaper and having some more benefits than this board.

Yes, that would be very nice.
The Shield device is much more suited as a software development computer than this board, which is more targeted towards embedded hardware integration of their X1 module.
 
I have no idea about the companies that want to use this thing, but isn't $299 extremely expensive? Who would buy this at that price?
 
I have no idea about the companies that want to use this thing, but isn't $299 extremely expensive? Who would buy this at that price?

NVIDIA unveiled a small-sized module that harnesses the power of machine learning to enable a new generation of smart, autonomous machines that can learn. The NVIDIA Jetson TX1 module addresses the challenge of creating a new wave of millions of smart devices -- drones that don't just fly by remote control, but navigate their way through a forest for search and rescue.

Compact security surveillance systems that don't just scan crowds, but identify suspicious activity; and robots that don't just perform tasks, but tailor them to individuals' habits -- by incorporating capabilities such as machine learning, computer vision, navigation and more. Jetson TX1 is the first embedded computer designed to process deep neural networks -- computer software that can learn to recognize objects or interpret information. This new approach to programming computers is called machine learning and can be used to perform complex tasks such as recognizing images, processing conversational speech, or analyzing a room full of furniture and finding a path to navigate across it. Machine learning is a groundbreaking technology that will give autonomous devices a giant leap in capability.
...
Jeff Bier, president of Berkeley Design Technology, Inc., said: "Based on BDTI's independent analysis, the Jetson TX1 stands out in three respects. First, developing applications on the Jetson TX1 feels more like developing on a PC than like developing on a typical embedded board. Second, the JetPackTX1 installer makes it easy to install a system image on the board. Third, support for CUDA enables developers to use the GPU to accelerate their applications without having to delve into the complexities of GPU programming."

http://www.guru3d.com/news-story/nv...bring-deep-learning-to-robots-and-drones.html
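
For a sense of what BDTI means by development "feeling like a PC": a minimal, generic CUDA sketch (hypothetical, nothing Jetson-specific about it). With unified memory there's very little device plumbing, and it builds with a plain nvcc invocation:

```cuda
#include <cstdio>

// Hypothetical SAXPY kernel: one thread per element, y = a*x + y.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory: one pointer visible to both CPU and GPU, so there
    // is no explicit host/device copy to manage.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();              // wait before the CPU reads results
    printf("y[0] = %f\n", y[0]);          // expect 4.0
    cudaFree(x); cudaFree(y);
    return 0;
}
```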
 
I think Anandtech gave the far more reasonable answer to that question:

http://www.anandtech.com/show/9779/nvidia-announces-jetson-tx1-tegra-x1-module-development-kit

However since it was a full COTS implementation of Tegra K1, something unexpected happened for NVIDIA: developers started using Jetson TK1 outright as a production board. For small developers doing similarly small product runs, or just projects that didn’t require a highly integrated solution (e.g. embedded systems as opposed to mobile devices), some developers would just stick with Jetson since it meant they could skip system hardware development and focus on software and/or peripherals.

In that case $599 is dirt cheap IMHO. In other news I like the heatsink/fan: http://images.anandtech.com/doci/9779/Jetson_TX1_Press_Deck_Final-page-017.jpg :runaway:
 
So I noticed that the Cortex-A53 cluster is disabled in Jetson TX1, or at least it's not mentioned in any of the material for it. AFAIK this was also the case with Shield TV. But with Shield TV it wasn't a big deal, since it was always going to be in a big, wall-powered and well-cooled box. The TX1 module, on the other hand, could realistically be placed in a small battery-powered custom mobile device given its modest form factor.

Is there any device out there that enables the A53s on a Tegra X1? Is it possible that it's just flat-out broken, or somehow compromised to the point where it may as well be?
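
If anyone with X1 hardware wants to check, the kernel reports which cores it actually brought up under sysfs. A quick host-side sketch (standard Linux interface, nothing Tegra-specific; compiles as plain C++, no GPU needed):

```cuda
#include <cstdio>

// On Linux, /sys/devices/system/cpu/online lists the cores the kernel
// actually brought up, e.g. "0-3". If the A53 cluster were enabled on a
// Tegra X1, you would expect to see eight CPUs ("0-7") rather than just
// the four A57s.
int main() {
    FILE* f = fopen("/sys/devices/system/cpu/online", "r");
    if (!f) { perror("fopen"); return 1; }
    char buf[64];
    if (fgets(buf, sizeof buf, f))
        printf("online CPUs: %s", buf);   // "0-3" would mean the A53s stay off
    fclose(f);
    return 0;
}
```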
 