Two years ago I knew nothing about the company NVIDIA or even how to pronounce the firm’s name. Then my granddaughter received an internship followed by a full time offer when she finishes college.
But if you want to see where computing-networked technology is taking us, this may be a place to start.
The company built its reputation as the creator of graphics chips used in computer games. But as reported in the Wall Street Journal: “Its chips are now being snapped up for everything from game consoles to data centers to self-driving cars. … Nvidia told analysts Tuesday (March 22) that it now sees a total addressable market of $1 trillion for its growing line of chips and related software.”
CEO Jensen Huang’s full talk runs about 90 minutes and covers the firm’s recent product releases and latest innovations. The theme of the presentation is the creation and production of artificial intelligence (AI) “factories.”
These fully automated intelligent processes will change the way most businesses operate, from driverless cars to digital biomedicine to warehouse management.
I was unable to follow many of the technical capabilities NVIDIA says it will enable. However, Huang recaps his talk near the end, so you may wish to start with that overview before diving into the entire speech.
His opening described NVIDIA’s four layers (integrated stacks) for the networked world, designed to “accelerate computing,” resulting in “the creation and production of intelligence.” One example of a critical technical component is NVIDIA’s H100, an 80-billion-transistor chip.
The Omniverse: Creating Digital Twins for Real World Events
The platform that combines all of these new technical designs and interfaces is NVIDIA Omniverse, defined as an “easily extensible platform for 3D design collaboration and scalable multi-GPU, real-time, true-to-reality simulation.”
You can hear the Omniverse presentation by starting the keynote stream here. The session begins with the Apollo 13 “almost disaster” to demonstrate the power of a “replica,” or what Huang describes as today’s virtual equivalent, a “digital twin.”
As I understand it, this simulation engine (Omniverse) enables the creation of a digital model that exactly duplicates the physical world. The two realities work together, the real alongside the virtual version. Companies across industries now have the power of “enhanced predictive analysis and software and process automation that maximize productivity and help maintain faultless operation.”
Earlier AI was based on pattern-recognition technology running on pre-programmed robotic platforms. The new digital world provides autonomous platforms for applications such as driving, monitoring climate change, and even cell and molecular research.
The digital twin duplicates the real world’s physical environment, events, and task management. The virtual model can then create new simulations that go beyond real-world experience. AI becomes a self-learning technology that can then be used to manage actual events.
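To make the idea concrete, the sync-then-simulate loop behind a digital twin can be sketched as a toy Python class. This is purely illustrative: the class, its methods, and the warehouse example are my own invention, not Omniverse’s actual API.

```python
# Toy sketch of the digital-twin loop described above -- NOT NVIDIA's
# Omniverse API. All names here are hypothetical illustrations.

class DigitalTwin:
    """A virtual model kept in sync with a real-world system."""

    def __init__(self, state):
        # Mirrored copy of the physical system's state.
        self.state = dict(state)

    def sync(self, sensor_readings):
        """Update the virtual model from real-world measurements."""
        self.state.update(sensor_readings)

    def simulate(self, steps, rule):
        """Run the model forward beyond observed reality.

        `rule` is any function mapping state -> next state; in a real
        system this would be a physics or learned simulation engine.
        """
        projected = dict(self.state)
        for _ in range(steps):
            projected = rule(projected)
        return projected


# Hypothetical example: a warehouse conveyor whose load is observed
# at 120 units and projected to grow 10% per simulated time step.
twin = DigitalTwin({"load": 100.0})
twin.sync({"load": 120.0})  # latest sensor reading from the real world
forecast = twin.simulate(3, lambda s: {"load": s["load"] * 1.1})
```

The point of the sketch is the division of labor: `sync` keeps the virtual model faithful to the physical one, while `simulate` explores futures the real system has not yet experienced, which is what makes predictive use of the twin possible.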
One example: NVIDIA’s Earth-2 project is building a supercomputer dedicated to predicting climate change through simulation. This should encourage changes in human activity now, rather than waiting until the events actually occur.
Huang referenced several big companies using Omniverse Enterprise to create digital twins of their own operations, including Amazon, the German railway operator DB Netze, Kroger, Lowe’s, and PepsiCo. NVIDIA’s integrated platform “builds physically accurate digital twins or develops realistic immersive experiences for customers.”
The Impact on Financial Services
Data and information management is the core of financial services. Today the use of AI-inspired technology focuses on automated voice or chatbot responses, some enhanced data analytics, and 24/7 connectivity.
Currently, NVIDIA’s financial service efforts include the use of deep learning, machine learning, and natural language processing to “boost risk management, improve data-backed decisions and security, and enhance customer experiences.”
I have no idea how this Omniverse capability will affect financial institutions beyond the known task functions for which it is used today.
If you want a simple future example, however: CEO Jensen Huang carries on a live, unscripted Q&A with his digital double (avatar) named Toy Jensen. I’m not sure which speaker was more credible!