Nvidia shows signs in [2018]

I think what Carsten meant is:

Cut: an official price cut from NVIDIA across all partners and AIBs.
Drop: an unofficial price cut transmitted through the back channels and distributors.
Exactly. When Nvidia officially cuts prices, early adopters will probably become infuriated; when prices drop due to, say, better supply, Nvidia can shed some of the blame. *cough*
 
So it seems NVIDIA will continue to sell Pascal cards alongside Turing until the holiday season, which means the overstock of Pascal is real.
One doesn’t need to be a consequence of the other?

With Turing being as large as it is, it’s not at all impossible that they will continue to produce Pascal.
 
Leading Japanese Companies Select Jetson AGX Xavier for Next-Generation Autonomous Machines
Sept. 12, 2018
Speaking at the GPU Technology Conference in Japan, NVIDIA founder and CEO Jensen Huang announced that FANUC, Komatsu, Musashi Seimitsu and Kawada Technologies will adopt Jetson AGX Xavier in their next-generation autonomous machines.

Starting with a small computer module capable of up to 32 TOPS (trillion operations per second), it delivers the processing capability of a powerful workstation yet fits in the palm of your hand. With multiple operating modes at 10W, 15W and 30W, Jetson AGX Xavier has greater than 10x the energy efficiency of its predecessor.
https://nvidianews.nvidia.com/news/...avier-for-next-generation-autonomous-machines
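Those three power modes lend themselves to a simple selection rule. A toy Python sketch of picking the highest-power mode that fits a given power budget; the mode IDs here are illustrative stand-ins, not the official `nvpmodel` table for the board:

```python
# Toy sketch (not NVIDIA's API): choosing a Jetson AGX Xavier power mode
# by budget. On real hardware this is configured with the `nvpmodel`
# tool; the mode IDs below are illustrative, not the official table.
MODES = {
    1: 10,  # 10 W mode
    2: 15,  # 15 W mode
    3: 30,  # 30 W mode
}

def pick_mode(budget_watts: float) -> int:
    """Return the highest-power mode that fits within the budget."""
    fitting = [m for m, w in MODES.items() if w <= budget_watts]
    if not fitting:
        raise ValueError(f"no mode fits within {budget_watts} W")
    return max(fitting, key=lambda m: MODES[m])

print(pick_mode(20))  # -> 2 (the 15 W mode)
```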

Japan Shifts Autonomous Driving Industry into High Gear with NVIDIA DRIVE AGX
September 13, 2018
Toyota first announced its collaboration with NVIDIA at the 2017 GPU Technology Conference in Silicon Valley. Now the automaker is well underway in incorporating NVIDIA DRIVE AGX Xavier as the AI brain in its production cars beginning in 2020.

Autonomous trucks will also play a large role in making Japan’s roadways safer and less congested. As one of Japan’s leading truck makers, Isuzu Motors is developing autonomous vehicles with NVIDIA DRIVE AGX, starting with 360-degree surround perception, lane keeping and adaptive cruise control features, then moving to platooning and, ultimately, highly automated and fully autonomous vehicles.

Startup Tier IV is using the DRIVE AGX compute platform to develop software systems for urban driverless vehicles, and has already logged more than 6,000 miles in autonomous driving pilots with various operators, including Japan Post. As it moves toward level 5 robotaxis and delivery vehicles, Tier IV is using NVIDIA DRIVE AGX Pegasus for its next-generation vehicles.

The high-performance, energy-efficient NVIDIA DRIVE AGX platform also integrates key sensor manufacturers. Sony’s 8-megapixel automotive camera, Panasonic’s depth-sensing camera, and automotive electronics supplier Omron’s 3D lidar sensor can now all operate seamlessly with the NVIDIA DRIVE platform. The higher resolution and improved quality of these next-generation sensors enables self-driving cars to see farther and with better clarity, even in challenging lighting conditions.
https://blogs.nvidia.com/blog/2018/09/13/japan-partners-nvidia-drive-ecosystem/
 
EETimes: Nvidia Going All Robot, All the Time
9/14/2018

Most AI platform suppliers have been obsessed lately with autonomous vehicles. This week, Nvidia escalated the obsession by spreading the epidemic to “autonomous machines.”

Phil Magney, founder and principal advisor at VSI Labs, called Nvidia “shrewd” to extend the reach of the architecture, since most competitors are focusing exclusively on automated cars. “As we know, there are lots of human-driven machines out there where removing the operator is the goal. Nvidia’s new partners in Japan have their bases covered with these announcements.”

Indeed, Huang announced that Yamaha Motor Co. has selected Nvidia Jetson AGX Xavier to develop the system for autonomous machines that will include unmanned agriculture vehicles, marine products and “last-mile” vehicles.

Among an industry chorus singing Nvidia’s robotic tune and committing to the Jetson AGX Xavier system are FANUC, Komatsu, Musashi Seimitsu and Kawada Technologies.

Magney added, “Even factory automation is covered with FANUC’s plan to apply AGX Xavier to its factory automation solutions.”

Partnerships with these big names in Japan lend Nvidia not only credibility but also momentum in robotics and AI applications and development.

What’s so special about AGX Xavier?
The big difference with the new platform is that it’s “based on the Xavier SoC, while the previous Drive PX was based on the Parker SoC,” said Magney. “In addition, the DevKit comes in single or dual SoC versions (Pegasus).”
...
Nvidia’s efforts in developing these AI infrastructure building blocks are paying off nicely as Nvidia expands its ecosystem for development. Magney said, “You get the hardware, software tools, access to libraries, and network training tools. Of all the suppliers, Nvidia’s advantage is in the diversity of the DevKit as no one has this collection of hardware, software and development support.”

Yamaha deal
Nvidia’s deal with Yamaha caught the attention of industry analysts like Magney. He called the announcement interesting, “because they will develop solutions for various off-road vehicles, commercial or recreation.” He speculated that Yamaha might even apply it to boats.

“For manufacturers of recreational and commercial ground vehicles, the development of automated platforms is absolutely necessary,” Magney said. “For a company like Yamaha (who must support a variety of different platforms) it is better to unify around a common architecture that is scalable and can be applied to different vehicle platforms.”

Sensor partnerships
Nvidia also boasted that its DRIVE AGX platform integrates key sensor manufacturers. “Sony’s 8-megapixel automotive camera, Panasonic’s depth-sensing camera, and automotive electronics supplier Omron’s 3D lidar sensor” can now operate seamlessly with the Nvidia DRIVE platform, Nvidia claimed.

This offers “greater diversity for various types and brands of sensors,” Magney noted.


The big takeaway is that Nvidia now has “driver support for these devices within its SDK.” He said, “This is one of the key elements of a development kit — driver support for various types and brands of sensors.” Building drivers takes time and effort. By supporting these devices, Nvidia saves developers time otherwise spent on writing device drivers, Magney observed.
https://www.eetimes.com/document.asp?doc_id=1333734&_mc=RSS_EET_EDT
 
NVIDIA RTX Platform and Turing GPU Architecture Take Home Advanced Imaging Society Lumiere Technology Award
The NVIDIA RTX platform and the NVIDIA Turing GPU architecture have been recognized with a technology award from the Advanced Imaging Society (AIS). The award was first reported last week by the Hollywood Reporter.
...
“The AIS Technology Award annually acknowledges and celebrates technologies and processes demonstrating both innovation and impact in advancing the future of the entertainment and media industries. The AIS committee felt it important to recognize NVIDIA for its new Turing architecture and RTX platform and the advancements they bring to ray tracing,” said Jim Chabin, president of the Advanced Imaging Society. “This technology allows the lighting of virtual environments to mimic the real world — making our virtual worlds more realistic than ever.”
...
Industry partners and software providers like Adobe, Allegorithmic, Autodesk, Blackmagic Design, Chaos Group, Isotropix, Otoy, Pixar Renderman, REDSHIFT and others representing many of the most important applications for the film industry are praising NVIDIA RTX.
...
The AIS was formed in 2009 by The Walt Disney Studios Motion Pictures, DreamWorks Animation (DWA), Sony, Paramount, IMAX, Dolby and others to advance the creative arts and sciences of stereoscopic 3D.
https://blogs.nvidia.com/blog/2018/...ced-imaging-society-lumiere-technology-award/
 
Rambus Renews Patent License With NVIDIA
Rambus Inc. today announced it has renewed a patent license agreement with NVIDIA. The agreement allows NVIDIA’s use of innovations in the Rambus patent portfolio, including those covering memory controllers and serial links. Specific terms of the agreement are confidential.
...
The Rambus Memory and Interfaces Division develops products and services that solve the power, performance, and capacity challenges of the communications and data center computing markets. Rambus’ enhanced standards-compatible and custom memory and serial link solutions include chips, architectures, memory and SerDes interfaces, IP validation tools, and system and IC design services. Developed through our system-aware design methodology, Rambus products deliver improved time-to-market and first-time-right quality.
https://www.businesswire.com/news/home/20181005005039/en/Rambus-Renews-Patent-License-NVIDIA

Likely similar to the 2010 agreement they had regarding memory/memory controllers.

Edit:
Micron, Rambus, Northwest Logic and Avery Design to Deliver a Comprehensive GDDR6 Solution for Next-Generation Applications
This comprehensive solution brings together the unique contributions of each company to solve that problem, extending the reach and benefit of GDDR6 well beyond its traditional graphics market. The solution would include:
https://www.rambus.com/micron-rambus-northwest-logic-avery-design-deliver-gddr6-solution/
 
Nvidia Enters ADAS Market via AI-Based Xavier
Nvidia is in Munich this week to declare that it is coming after the advanced driver assistance system (ADAS) market. The GPU company is now pushing its AI-based Nvidia Drive AGX Xavier System — originally designed for Level 4 autonomous vehicles — down to Level 2+ cars.

In a competitive landscape already crowded with ADAS solutions provided by rival chip vendors such as NXP, Renesas, and Intel/Mobileye, Nvidia is boasting that its GPU-based automotive SoC isn’t just a “development platform” for OEMs to prototype their self-driving vehicles.
...
At the company’s own GPU Technology Conference (GTC) in Europe, Nvidia announced that Volvo cars will be using the Nvidia Drive AGX Xavier for its next generation of ADAS vehicles, with production starting in the early 2020s.
By Level 2+, Shapiro (Nvidia’s senior director of automotive) means that Volvo will be integrating “360° surround perception and a driver monitoring system” in addition to a conventional adaptive cruise control (ACC) system and automated emergency braking (AEB) system.

Nvidia added that its platform will enable Volvo to “implement new connectivity services, energy management technology, in-car personalization options, and autonomous drive technology.”

It remains unclear if car OEMs designing ADAS vehicles are all that eager for AI-based Drive AGX Xavier, which is hardly cheap. Shapiro said that if any car OEMs or Tier Ones are serious about developing autonomous vehicles, taking an approach that “unifies ADAS and autonomous vehicle development” makes sense. The move allows carmakers to develop software algorithms on a single platform. “They will end up saving cost,” he said.

Phil Magney, founder and principal at VSI Labs, agreed. “The key here is that this is the architecture that can be applied to any level of automation.” He said, “The processes involved in L2 and L4 applications are largely the same. The difference is that L4 would require more sensors, more redundancy, and more software to assure that the system is safe enough even for robo-taxis, where you don’t have a driver to pass control to when the vehicle encounters a scenario that it cannot handle.”

Another argument for the use of AGX for L2+ is that the alternative requires the use of multiple discrete ECUs. Magney said, “An active ADAS system (such as lane keeping, adaptive cruise, or automatic emergency braking) requires a number of cores fundamental to automation. Each of these tasks requires a pretty sophisticated hardware/software stack.” He asked, “Why not consolidate them instead of having discrete ECUs for each function?”

Scalability is another factor. Magney rationalized, “A developer could choose AGX Xavier to handle all these applications. On the other hand, if you want to develop a robo-taxi, you need more sensors, more software, more redundancy, and higher processor performance … so you could choose AGX Pegasus for this.”

https://www.eetimes.com/document.asp?doc_id=1333851&_mc=RSS_EET_EDT
 
NVIDIA GPUs to Power New Berkeley National Lab Supercomputer, Accelerating Scientific Discoveries
October 30, 2018
NVIDIA GPUs will power a next-generation supercomputer at Lawrence Berkeley National Laboratory, announced today by U.S. Energy Secretary Rick Perry and supercomputer manufacturer Cray.

Perlmutter, a pre-exascale system coming in 2020 to the DOE’s National Energy Research Scientific Computing Center (NERSC), will feature NVIDIA Tesla GPUs. The system is expected to deliver three times the computational power currently available on the Cori supercomputer at NERSC.
...
Optimized for science, the supercomputer will support NERSC’s community of more than 7,000 researchers. These scientists rely on high performance computing to build AI models, run complex simulations and perform data analytics. GPUs can speed up all three of these tasks.

Nearly half the workload running at NERSC is poised to take advantage of GPU acceleration, a recent NERSC study found. Between now and 2020, other applications that are algorithmically similar to GPU-accelerated kernels could also make the shift to GPU-ready code — enabling scientists to hit the ground running as soon as Perlmutter comes online.

https://blogs.nvidia.com/blog/2018/10/30/gpus-nersc-perlmutter-berkeley-national-lab-supercomputer/
 
No mention of the GPU architecture. Is it safe to assume post-Volta or post-Turing?
It links to the Tesla page, and Tesla is currently a Volta-based GPU. I doubt it would be Turing, since that architecture is focused more on rendering acceleration, so unless they're planning on releasing a new architecture in 2019/20 for Tesla, I assume it's going to be Volta-based.
 
There could be some Turing GPUs in the mix from the "data analytics" aspect, but I'm not sure. There was mention of RAPIDS, which uses Turing GPUs for end-to-end data analytics acceleration.
Edit: RAPIDS (the GPU-accelerated platform) will work with any Pascal GPU and higher.
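For reference, "Pascal and higher" corresponds to CUDA compute capability 6.0+. A minimal sketch of that gate; the helper name and the (abbreviated) architecture table are mine, not RAPIDS code:

```python
# Sketch of the "Pascal or newer" check RAPIDS implies.
# CUDA compute capability: Maxwell = 5.x, Pascal = 6.x,
# Volta = 7.0, Turing = 7.5. Table abbreviated for illustration.
COMPUTE_CAPABILITY = {
    "Maxwell": (5, 2),
    "Pascal": (6, 1),
    "Volta": (7, 0),
    "Turing": (7, 5),
}

def rapids_supported(arch: str) -> bool:
    """RAPIDS requires compute capability 6.0 or higher."""
    return COMPUTE_CAPABILITY[arch] >= (6, 0)

for arch in COMPUTE_CAPABILITY:
    print(arch, rapids_supported(arch))
```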
 
Images generated by SDR (structured domain randomization) can be used to train a neural network for perception tasks such as object detection on real images.
October 30, 2018
When generating these synthetic scenes, the team randomizes the objects created in the scene, including lanes, cars, pedestrians, road signs, and sidewalks. For each object, its position, texture, shape, and color are randomized, but only within realistic ranges. The technique also randomizes lighting parameters such as the time of day and image saturation.
...
The network detects objects in a scene on both videos and in still images with high confidence. This is remarkable because the network has never seen a real image during training.


https://news.developer.nvidia.com/s...mization-makes-deep-learning-more-accessible/
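The per-object and global-lighting randomization described above can be sketched roughly like this; the parameter names and ranges are illustrative assumptions, not the paper's actual values:

```python
# Toy sketch of the randomization described above: each object's
# parameters are drawn within realistic ranges, as are global lighting
# parameters. Names and ranges are illustrative, not the paper's.
import random

LIGHTING_RANGES = {
    "time_of_day_h": (0.0, 24.0),
    "saturation": (0.5, 1.5),
}

OBJECT_RANGES = {
    "x_m": (-10.0, 10.0),   # lateral position in the scene
    "hue": (0.0, 1.0),      # color randomization
    "scale": (0.8, 1.2),    # shape/size jitter within realistic bounds
}

def sample(ranges, rng):
    """Draw one value uniformly from each (lo, hi) range."""
    return {k: rng.uniform(lo, hi) for k, (lo, hi) in ranges.items()}

def random_scene(n_objects, seed=0):
    """Build one synthetic scene: global lighting plus randomized objects."""
    rng = random.Random(seed)
    return {
        "lighting": sample(LIGHTING_RANGES, rng),
        "objects": [{"class": rng.choice(["car", "pedestrian", "sign"]),
                     **sample(OBJECT_RANGES, rng)}
                    for _ in range(n_objects)],
    }

scene = random_scene(3)
print(len(scene["objects"]))  # -> 3
```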
 
Yesterday's annual supercomputing conference stream from Nvidia makes it pretty clear what Nvidia's current offering to datacenters is, from Volta to Turing.

 
Only 2 hours to watch, a small price to pay for 'clarity' ;)

I guess the short version of the video is HGX-2, a 16-Volta machine for training/heavy lifting, where each GPU can talk directly to any other GPU via NVLink and NVSwitch: https://www.nvidia.com/en-us/data-center/hgx/

The Turing T4 machine is for inference. By the looks of it, each board has two GPUs and the boards are hot-swappable, with many boards per machine and each T4 around 70 W.

Also, Google already has the T4 integrated into their cloud.

https://www.datacenterdynamics.com/news/google-cloud-first-offer-nvidia-t4-gpus/
 