This Week in Engineering is a multi-segment show that explores the latest innovations and tech trends in engineering from academia, government and industry.
Episode Summary:
The U.S. Air Force has announced that Eielson Air Force Base in Alaska will be the site of the service’s first micro-nuclear reactor. The vendor has not yet been announced, but the selection represents the first firm commitment to a production-type transportable nuclear power system for base use. In parallel, the Pentagon is operating a joint force development program for micro-reactors called Project Pele, with an eye toward a mass-producible reactor design by 2022. If successful, the programs may create a new generation of shipping-container-sized, truck- and air-transportable, intrinsically safe power systems in the single-digit-megawatt output range.
The road to self-driving has proven much more difficult than many developers in the computer industry originally thought. Tesla is using a unique camera-only approach; the absence of radar or lidar means its systems must understand video images with a very high level of reliability. To achieve this, Tesla has developed an in-house artificial intelligence supercomputer called Dojo. The company has just released a white paper describing a new framework for coding in Dojo that replaces an older IEEE standard that was notoriously heavy on processor and memory capacity. Google Brain, Google’s similar in-house AI unit, has its own standard. This could be a VHS-versus-Beta moment.
Access all episodes of This Week in Engineering on engineering.com TV along with all of our other series.
Transcript of this week’s show:
To see any graphs, charts, graphics, images, and/or videos to which the transcript may be referring, watch the above video.
Segment 1: While nuclear fusion remains the long-range goal for the nuclear industry, several advanced fission technologies are not only practical, but can be deployed in meaningful numbers fast enough to make a real impact on global fossil fuel use. While fission reactors have historically been large civil engineering projects on the scale of conventional thermal power plants, a new generation of compact micro-reactors promises to make the technology portable and usable in locations where coal-, gas- or diesel-powered generators are the primary power sources.
The U.S. Air Force has selected Eielson Air Force Base in Alaska as the location for the first small-scale reactor in military service in five decades. Early transportable reactor concepts were used in Greenland and Antarctica, but proved unreliable and more expensive to operate than conventional generators. New-generation reactors operate with simpler, more reliable cycles than the conventional coolant-loop systems used in traditional designs derived from naval reactor technology. Several competing designs are under development, some from startups such as Radiant Nuclear and Ultra Safe Nuclear, and others from industry heavyweights like Westinghouse.
The key to small scale is intrinsically safe design, which does away with both the need for large, complex safety systems with multiple redundancy and the heavy containment structure. Micro-reactors are generally sized to be truck-transportable in shipping container formats, with typical power outputs on the order of single-digit megawatts. Transportable reactor technology is naturally of interest to all military services, and the Pentagon started its own research program, called Project Pele, in 2020, with contracts issued to Westinghouse, BWXT Advanced Technologies and X-energy for design work.
Workable designs are expected in 2022. With multiple companies vying to supply micro-reactors for military base use, the technology could reach a critical mass where it can be deployed in remote communities, at mining and industrial sites, and as emergency power in disaster areas. But US military applications alone should have a significant impact: the US Defense Department estimates its power consumption at 30 terawatt-hours of electricity per year and over 10 million gallons of liquid fuel per day.
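For a rough sense of the scale those figures imply, here is a back-of-envelope sketch in Python. The 30 terawatt-hour annual consumption figure comes from the segment above; the 5 MW per-unit output is an assumed value at the top of the single-digit-megawatt range, so the result is purely illustrative, not a deployment estimate.

# Back-of-envelope arithmetic for the DoD figures quoted above.
# Assumption: each micro-reactor delivers 5 MW, the top of the
# "single-digit megawatt" range mentioned in the segment.
annual_consumption_wh = 30e12          # 30 TWh per year, from the transcript
hours_per_year = 8760
average_load_mw = annual_consumption_wh / hours_per_year / 1e6
reactor_output_mw = 5                  # assumed unit output
units_to_cover_load = average_load_mw / reactor_output_mw

print(f"Average DoD electrical load: {average_load_mw:,.0f} MW")       # about 3,400 MW
print(f"5 MW micro-reactors to match it: {units_to_cover_load:,.0f}")  # roughly 680 units

Even covering a fraction of that load with transportable units would mean hundreds of reactors, which is the kind of volume that could make the technology viable for civilian remote-power markets as well.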
Segment 2: Artificial intelligence is on everyone’s mind today, but the number of real-world applications of AI is surprisingly small. It’s expensive and difficult to develop, and with current technology it is reserved for the toughest computational problems, notably self-driving systems in the automotive industry. Tesla has tackled the self-driving problem with a unique camera-only approach, which places very high demands on hardware and software to make sense of the environment around the vehicle and deliver error-free driving.
Tesla calls its system Full Self-Driving, and it currently operates at SAE Level Two, although Level Four or Five driving has been anticipated for several years. To close the gap, Tesla has developed its own in-house supercomputer called Dojo, and in an unusual move has just published a technical paper on the technology.
Entitled “Tesla Dojo Technology, a Guide to Tesla’s Configurable Floating-Point Formats and Arithmetic”, the paper explains how Tesla has developed new formats and methods for 8-bit and 16-bit binary floating-point arithmetic for deep learning neural network training. Why is this important? The original standard, IEEE 754-1985, dates to the mid-1980s and was designed to ensure identical results whether tasks were run in software, hardware or a combination of both.
This single-precision standard was supplemented in a later revision with a newer half-precision standard to address the fundamental problem of the original concept: very heavy demands on both processors and memory. By going its own way with an in-house standard, Tesla expects to decrease the hardware load even further, addressing a major issue in the widespread implementation of self-driving: heavy computational loads that limit the ability of intelligent systems to process data and simulate in reasonable time frames and at acceptable cost. Alphabet’s Google Brain AI unit has been working on the problem with its own in-house solution, and Tesla may create a unifying standard, although without the expectation of full interoperability of code between platforms. Is this the AI equivalent of VHS versus Beta, or Apple versus Android?
Given the small number of supercomputer platforms in use worldwide, probably not, but as supercomputer costs drop and an increasing number of universities and research institutions adopt them, formats and standards begin to matter greatly, especially in a market where high-level coder talent is in very short supply. It’s about self-driving now, but it’s clear that Tesla is developing AI systems capable of solving problems in other regimes. It’s going to be interesting to watch.
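To make the format question concrete, here is a minimal sketch in NumPy comparing the memory footprint of IEEE 754 single precision with the later half-precision format for one block of neural-network weights. Tesla’s configurable 8-bit and 16-bit Dojo formats are not available in standard libraries, so this only illustrates the general trade-off, not Dojo’s actual arithmetic.

import numpy as np

# Illustrative only: standard IEEE 754 single precision (float32) versus
# half precision (float16) for one layer of neural-network weights.
# Tesla's own configurable formats are not implemented here.
weights_fp32 = np.random.randn(1024, 1024).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

print(f"float32 weights: {weights_fp32.nbytes / 1e6:.1f} MB")  # ~4.2 MB
print(f"float16 weights: {weights_fp16.nbytes / 1e6:.1f} MB")  # ~2.1 MB

# Halving the bit width halves memory traffic, and an 8-bit format halves
# it again, at the cost of numeric range and precision. That trade-off is
# why formats that can be tuned per workload are attractive for training
# deep neural networks.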