Yesterday’s Lytro event at NAB had the atmosphere of excitement Danny Boyle captured in “Steve Jobs” — a packed audience ready to be wowed, one that ultimately left the venue wondering whether what it had just witnessed was a historic, tectonic shift in technology. As announced a week earlier, Lytro has designed a camera that also captures three-dimensional information about the intensity and direction of light fields, which allows filmmakers to make creative choices that were previously reserved for green screen and computer-animated filmmaking.
The event started with the premiere of “Life,” a short film shot on the Lytro camera by Robert Stromberg, who directed “The Martian VR Experience” and was the visual effects designer behind numerous Hollywood blockbusters. The film itself was not revolutionary — a glossy, often schmaltzy tale of family that featured live-action photography, animated backdrops and digital effects — but the post-screening breakdown of how it was made was jaw-dropping.
The first shot the Lytro team lifted the veil on was a simple image of a boy holding a baseball. Because of the 3D light field information captured by the camera, Lytro can create a “virtual lens” not possible in the real world. With a simple click and slide in the Lytro software, filmmakers can radically adjust the image’s depth of field. The team demonstrated this by shifting a shallow focus from the boy’s face to the ball, then opening the image up to a shot with tremendous depth of field.
The company also showed how in one scene, which was purposefully shot in an ugly parking lot, Lytro frees filmmakers and actors from the constraints of a green screen studio environment. In the shot, the film’s two lead characters are removed from their parking lot backdrop and set against a blue sky. It was remarkable to see how quickly the Lytro software isolated the couple from the backdrop and how believably they were merged into their new, sunny setting.
The final demonstration was a scene with the film’s protagonists at the altar getting married. The shot featured many problems: crew members and equipment in frame, confetti tossed in one take but not the other, and the camera shaking as it dollied away from the couple. Those issues allowed Lytro to show off its full range of capabilities: the couple was removed from the busy sound stage and dropped into a magical wedding, the confetti from one take was quickly isolated and dropped into a different one, and the camera movement was smoothed without any of the artifacting that normally comes from stabilization filters.
The implications of this technology for film production are obvious: blockbusters no longer need to be tied to time-consuming green screen sets, certain creative decisions no longer need to be made on set, and problems and mistakes can be solved quickly and believably.
Ultimately, though, it is the future application of the Lytro camera to the growing world of virtual reality that is likely the company’s long-term play, and that’s what had most of the tech-savvy crowd at NAB excited. Lytro said it was actively working on ways to capture a fuller set of volumetric data that would allow VR makers to move objects in live-action photography and create in a 360-degree environment much as animators do. After the presentation, Phil Lelyveld, who manages the Entertainment Technology Center (ETC) at USC and is the keynote speaker and curator at NAB’s Virtual Summit, told IndieWire that Lytro had the potential to be the VR world’s equivalent of the move from the typewriter to the word processor.
In only 18 months, Lytro built the camera prototype, wrote its software, and completed production of “Life.” The only reason the company was able to move this quickly and effectively is the dozens of heavy hitters in the tech world — including Google, Nvidia, and The Foundry — who climbed aboard the project. The key to Lytro’s continued growth is whether it can find the same partnerships in Hollywood, so that the Lytro camera can be integrated into film production as quickly as possible.
One reason Hollywood may be willing to experiment with Lytro is the company’s partnership with Google, which helped design a cloud-based solution so that Lytro could avoid potentially enormous workflow issues on set and in post-production. As was demonstrated yesterday, filmmakers are able to access and manipulate footage without it ever being housed on a computer. Without naming names, Lytro CEO Jason Rosenthal indicated in his closing remarks that, once the camera has been fully tested, the company expected it to be used by a major Hollywood production by next winter.
In the meantime, Lytro went to great lengths yesterday to make clear it is well aware that the current prototype camera — the size of a small car and a reminder of the early Technicolor cameras of the 1930s — is much too large to be practical in the long term. The company is actively working to make it smaller, but has no exact timeline for when the next model will be ready.