Despite “major” annual updates, progress can be incremental in the world of iPhone cinematography and photography. And Apple events feature an avalanche of impressive specs and gimmicky features geared toward making consumers feel like the latest and greatest will make them a professional shooter.
To get past the hype, IndieWire spoke with Filmic Pro CTO Chris Cohen. He shared the stage with filmmaker Sean Baker at the big Apple unveiling, and it’s Cohen’s app that lets serious filmmakers, from Baker to Steven Soderbergh, use the iPhone like a professional camera. We also talked to the iPhone experts at Moment, a five-year-old company that creates apps and tools for professional iPhone shooters.
Here are the five actual breakthrough camera advances in the iPhone 11 that should have filmmakers excited.
1. The Ultra Wide Lens
If you’ve ever shot anything on an iPhone, you’ll notice that switching from photo to video mode tightens the image to create a more limited field of view. To widen that view, filmmakers have relied on third-party lens attachments: Soderbergh used Moment’s 18mm on “Unsane,” while Baker used the anamorphic Moondog lens on “Tangerine.” With the new iPhone 11, Apple’s “Ultra Wide” lens solves this problem.
“It looks to sit right around a 13mm,” said Caleb Babcock, chief content creator at Moment. “Which is perfect, because any wider on the iPhone and you start to get that fish-eye look.”
Director Rian Johnson (“Star Wars: The Last Jedi,” “Knives Out”) experimented with an early iPhone 11 Pro, shooting footage in Paris that features some of the first shots we’ve seen from the new ultra wide lens, which he tweeted was “a real game changer.” The optics look solid while sitting, as Babcock speculated, right on the edge of being too wide.
iPhone cinematography will likely remain most effective when shooting subjects relatively close to the camera. The iPhone still struggles to capture detail in images with too much scope, which makes the ability to get wider and see more in intimate situations an incredibly important feature.
2. The Selfie Camera
Until the iPhone 11, the user-facing camera commonly used for FaceTime and selfies had not been a pro tool, lacking the optics and sensor of the back-facing lenses.
“We’ve always discouraged it to our users,” said Cohen. “We’ve even had internal conversations about whether we should even let users use the front-facing lens, because the quality was just poor.”
Apple’s TrueDepth user-facing camera received one of the most significant upgrades in the new camera system: it is now 12 megapixels, has a significantly wider lens, and can capture 4K video at up to 60 frames per second. Cohen, who got early access to the camera in order to build the new software used in Baker’s demo with jazz musicians, said everyone at Filmic Pro was “blown away” by the massive upgrade, adding, “It’s a worthy addition to the lens kit now.”
3. Shot Reverse Shot
Much attention has been placed on the iPhone 11’s ability to simultaneously record two video streams from the back-facing cameras — a great feature for photographers, less so for filmmakers. To seamlessly cut together multi-camera coverage, and avoid jump cuts, the two shots need both a different image size (which the iPhone can now do), and a change of angle (which the iPhone still can’t do).
One of the only ways to make two such shots cut together is a straight-ahead, perfectly centered, symmetrical frame — think Stanley Kubrick or Wes Anderson. So while those real-world applications are limited, there’s a lot more potential in the new shot-reverse-shot capabilities.
“As a filmmaker, there’s some really practical use cases for it,” said Babcock. “If someone wanted to record a podcast, you’re sitting across the desk from someone, one camera in the middle, and you’re getting both angles. That goes for documentary use as well.”
In fact, when Apple first invited Filmic Pro in to look at the technology and asked how they could best represent its capabilities to users, Cohen and his team suggested an interview demo.
“That was the first version of the pitch: A news reporter conducting an interview, with shot-reverse-shot, and in the end they wanted something more artsy,” said Cohen. “But that’s how we envisioned this feature. We wanted to empower storytellers, and those will be our early adopters with this feature.”
4. Camera + Super Computer
Smartphone companies love to hype the power of their newest processing chips, and eye rolls from software engineers usually follow. “We always joke, ‘Great, all this power, I wonder how fast this will throttle. 30 seconds? 40 seconds?’” said Cohen. “Because even though there is a lot of peak performance on tap with the processors Apple has been making, they’re sandwiched between two pieces of glass, so for a high-performance application like Filmic Pro that has a computational imaging pipeline, we can only really tap into about 30 percent of that maximum potential before the system fails.”
However, the new A13 chip in all iPhone 11s is another matter. At one point, while building the demo app using an iPhone 11 prototype, Cohen’s Filmic team had six composites showing at once. “This thing wasn’t even getting hot to the touch,” said Cohen. “It’s a breakthrough in terms of sustained performance, and that’s going to have huge implications for what we do.”
Phil Pasqual, the head of Moment’s app team, agrees. “These phones are extremely powerful and the benchmarks on the chips in them are not far off from a laptop computer,” said Pasqual. “You’re basically pairing a camera with a supercomputer.”
Pasqual said the camera’s ability to take multiple photos simultaneously, combined with an algorithm that can merge them intelligently and in real time, is “a paradigm shift.” “The next two years are going to be very interesting,” said Cohen. “You’re going to see things with real time imaging software that’s going to blow you away.”
One of the most important professional iPhone advances of the last two years was Filmic Pro’s Log V2, which gave cinematographers the ability to record video that preserves maximum dynamic range information, simulating the process of recording in Log or Raw on professional cameras. Those images could then be graded in a professional post-production color suite.
“I would say Log V2 was as far as we could push it on previous versions of the hardware,” said Cohen. “Now, our heads are spinning. We have a lot of things on the road map that we weren’t planning to tackle for the next two or three years. Now we are seriously considering fast-tracking them, because the sustained performance is so good.”
5. It’s Not Just the iPhone 11 Pro
For professional cinematographers, the focus has been on the most expensive Pro model. However, most of the camera advances are in all the new iPhone 11 models. The Pro does have a third telephoto lens on the back, extra battery power, and a matte finish. Most importantly, all iPhone 11s have the A13 chip, the ultra wide lens, the upgraded user-facing camera, and the newest capture sensors, which increase the native dynamic range of the iPhone.
“Apple, to their credit, could have arbitrarily made the Pro artificially superior to the other models, but they did not do that,” said Cohen.
The Local Tone Mapping Problem: Soderbergh and others have pleaded with Apple to fix, or at least allow users to turn off, the iPhone’s local tone mapping, which can adjust the exposure of a portion of the frame in the middle of a shot. It would appear that issue will become more manageable with the iPhone 11.
“I’m not in a position to speak for Apple,” said Cohen. “What I am going to say is that issue looks like it — I’m going to use my words carefully here — I don’t think it’ll be such a problem.”
When Can We Expect the New Filmic Pro App?: “We have never been beholden to hard deadlines because of our internal process,” said Cohen. “We give early access to filmmakers and educators and, with their feedback, we go to market or we may re-tool. We’re just saying the end of the year. That said, we do reserve the right to go beyond that if parts of the user experience need to improve.”
And will some of the features shown with Baker at the Apple launch event be accessible, through updates, before then? Cohen declined to answer.
Is a Composite Zoom Through All Three Pro Lenses Possible?: “It’s possible to zoom through all the focal lengths using a combination of digital zoom and lens switching,” said Cohen. “It comes with some caveats. Switching between lenses, you are going to have different effective apertures. You’re also going to have different characteristics of lens compression. If you were to do, let’s call it a ‘composite, multi-cam zoom,’ you wouldn’t notice it if the zoom was relatively fast, but you would notice it if it was very, very slow.”