In an announcement at CES today, Nvidia unveiled an update to its computer chip targeted specifically at powering self-driving cars, called the Drive PX 2. The new chip is loaded with processing power, able to run 24 trillion deep learning operations per second. Deep learning is the industry's most in-vogue approach to artificial intelligence, in which information is processed by many layers of mathematical equations. Nvidia claims that the Drive PX 2 is about as powerful as 150 MacBook Pros. All that power is used to handle up to 12 video camera inputs, as well as LiDAR, radar, and ultrasonic sensors. Then there's the computation itself: the computer has to analyze all of that data, dozens to hundreds of times per second, and decide what move to make next. To keep the chip from overheating, the system is water-cooled.

If Nvidia wants to sell these chips, it makes sense to make the Drive PX 2 the easiest computer to build an autonomous car around. Along with the chip, Nvidia is releasing DriveWorks, a suite of software tools and modules meant to facilitate testing and core car functionality. Developers will be able to use this software to help stitch together images from the 12 potential video feeds, synchronize data inputs, and calibrate sensors.

As the name suggests, this is the second chip Nvidia has pitched at autonomous cars. The company claims that 50 car manufacturers, developers, and research institutions have started using its first model, last year's Drive PX. Facebook also recently announced that it's using Nvidia GPUs in the servers that power its widespread artificial intelligence operations.

Nvidia says the system will be generally available in late 2016, but special partners will get early access.
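For readers curious what "many layers of mathematical equations" actually looks like, here is a minimal sketch in Python (using NumPy) of a toy three-layer network. The layer sizes, the ReLU nonlinearity, and the made-up output labels are illustrative assumptions only; this is not Nvidia's software or any real driving model.

```python
# A toy illustration of deep learning's "layers of mathematical equations":
# each layer is just a matrix multiply followed by a simple nonlinearity,
# and layers are stacked so each one feeds the next. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def layer(x, weights, bias):
    # One layer: linear transform, then a ReLU nonlinearity.
    return np.maximum(0.0, weights @ x + bias)

# Pretend input: a flattened 16x16 grayscale patch from a camera frame.
x = rng.random(256)

# Three stacked layers with arbitrary (assumed) sizes.
w1, b1 = rng.standard_normal((128, 256)) * 0.1, np.zeros(128)
w2, b2 = rng.standard_normal((64, 128)) * 0.1, np.zeros(64)
w3, b3 = rng.standard_normal((3, 64)) * 0.1, np.zeros(3)

h = layer(x, w1, b1)
h = layer(h, w2, b2)
scores = w3 @ h + b3  # e.g. hypothetical scores for "pedestrian", "car", "clear road"

print(scores)
```

A production driving network would have far more layers and learned (not random) weights, but the basic arithmetic repeated billions of times is what those "24 trillion deep learning operations per second" are counting.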
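To give a sense of what "synchronize data inputs" involves, below is a rough, hypothetical sketch of lining up camera, radar, and LiDAR readings by timestamp so the driving stack can reason about one consistent moment in time. The sensor names, rates, and matching logic are assumptions for illustration; this does not represent the actual DriveWorks API.

```python
# Hypothetical sensor-synchronization sketch: pick, for each sensor, the most
# recent reading at or before a given instant. Not Nvidia's code or algorithm.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str       # "camera", "radar", "lidar", ...
    timestamp: float  # seconds since start
    data: object      # raw frame / sweep / point cloud

def latest_before(readings, t):
    """Return the freshest reading from each sensor at or before time t."""
    best = {}
    for r in readings:
        if r.timestamp <= t and (r.sensor not in best or r.timestamp > best[r.sensor].timestamp):
            best[r.sensor] = r
    return best

# Fake stream: sensors report at different rates and slightly different times.
stream = [
    Reading("camera", 0.000, "frame0"), Reading("radar", 0.005, "sweep0"),
    Reading("lidar",  0.012, "cloud0"), Reading("camera", 0.033, "frame1"),
    Reading("radar",  0.038, "sweep1"), Reading("camera", 0.066, "frame2"),
]

# The car re-evaluates the world dozens of times per second; one "tick" at
# t = 0.040 s grabs the freshest data available from every sensor.
snapshot = latest_before(stream, 0.040)
for sensor, reading in snapshot.items():
    print(sensor, reading.timestamp, reading.data)
```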