posted on June 26, 2013
Holograms are not a mysterious technology. We know how to make them. There are no theoretical obstacles. And yet here you are, getting your engineering news through a two-dimensional screen. What gives?
The answer, of course, is money. But thanks to new developments at MIT’s Media Lab, holographic displays might become much less expensive. In a paper recently published in Nature, Daniel Smalley outlines his idea for a standard definition, full color, holographic video display.
Key to Smalley’s invention is a $10 optical chip he developed at the Media Lab. The chief breakthrough is the way it directs light through the crystal that forms the display’s holographic image: in Smalley’s display, a tiny lithium niobate crystal sits at the center of the operation. Its underside is stippled with microscopic channels called waveguides, which confine and direct the light that will eventually make up the display’s image. Each waveguide is also embedded with an electrode that generates an acoustic wave.
As beams of red, green, and blue light travel through the crystal’s waveguides, the electrodes embedded within them launch acoustic waves that filter out the colors that aren’t needed in the image. In the end, the combined channeling done by the waveguides and the acoustic frequencies driven through them allows Smalley to produce a quality holographic video signal, all for roughly the same price as my lunch yesterday.
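To get a rough feel for why an acoustic wave can act as a color filter, here is a minimal back-of-the-envelope sketch. It assumes textbook-style values for lithium niobate (a surface-acoustic-wave speed near 3488 m/s and an effective-index difference of about 0.08 between the interacting optical modes); these numbers and the simple phase-matching formula are illustrative assumptions, not figures taken from Smalley’s paper.

```python
# Illustrative sketch: wavelength-selective acousto-optic filtering.
# ASSUMED parameters (not from Smalley's Nature paper):
SAW_VELOCITY = 3488.0   # m/s, assumed surface-acoustic-wave speed in LiNbO3
DELTA_N = 0.08          # assumed effective-index difference between modes

def matching_frequency_hz(wavelength_m):
    """Acoustic frequency that phase-matches a given optical wavelength.

    Phase matching requires the acoustic wavelength v/f to equal the
    optical beat length lambda / delta_n, giving f = v * delta_n / lambda.
    """
    return SAW_VELOCITY * DELTA_N / wavelength_m

# Each color responds to a different drive frequency, so driving the
# electrode at one frequency picks out one color while leaving the others.
for color, wavelength in [("red", 633e-9), ("green", 532e-9), ("blue", 473e-9)]:
    f = matching_frequency_hz(wavelength)
    print(f"{color:>5}: {f / 1e6:.0f} MHz")
```

Because shorter wavelengths need higher acoustic frequencies, each color in the beam can be addressed independently by the frequency applied to the electrode, which is the intuition behind the filtering described above.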
Impressively, Smalley’s display could become even cheaper. “Everything else in there costs more than the chip,” says Smalley’s thesis advisor, Michael Bove. “The power supplies in there cost more than the chip. The plastic costs more than the chip.”
Currently, a number of companies are actively pursuing holographic technology, and in the coming years we’ll likely see holographic TVs pop up in R&D labs across the globe. Built on MIT’s new chip, those first models might turn into cheap consumer displays.
Image Courtesy of MIT