2013 is arguably the year when phone cameras, and specifically the iPhone camera, got as good as regular cameras. A DSLR or awesome mirrorless camera will still give you technically better photos, but the iPhone is way more convenient, and will give most folks better results in most instances.
Even in the days of film, convenience could win over quality. Only an enthusiast or a pro would go anywhere near an SLR. In those days, most people used a compact camera with fixed focus (AF crept in in the 1980s), and the real cheapskates opted for crappy 110 or Disc cameras, which used tiny films — the equivalent of small sensors these days.
I own probably the best camera I’ve ever used, the Fujifilm X100S, and I’ve all but given up taking it out with me, saving it for portrait work where it really shines. For everything else, I use the iPhone. So what’s changed to make it so compelling?
The iPhone Got Awesomer
First, the iPhone itself is just a better camera. Hardware-wise it gained a color-matching flash (whatever), an improved sensor (better in low light), along with the biggest new camera feature: the A7 chip.
The super-powerful new chip that powers the iPhone is what makes time travel possible. No, wait. That’s the Flux Capacitor. The A7 chip is what makes burst mode and slo-mo possible.
The Slo-Mo feature of the iPhone 5S is the flashiest aspect of the new hardware, letting you seamlessly slow down a stretch of your video footage, but it’s the burst mode that demonstrates why the iPhone is a better camera than the one in your camera bag: software. All “real” cameras have shitty software. The iPhone not only has great software, but that software works in combo with the hardware to make things like burst mode possible. We have been able to “spray-n-pray” for years, but only now is it just as easy to pick the keepers and delete the rest. Hell, the camera even picks out likely candidates for you.
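Apple hasn’t said how the 5S actually scores burst frames, but a toy sketch gives the flavor of what a camera/computer combo can do. This hypothetical Python example (the frame format and scoring are my assumptions, not Apple’s) keeps the frame with the strongest edges, since blurry shots have weak edges:

```python
def sharpness(frame):
    # frame: 2D list of grayscale pixel values. Score the frame by summed
    # horizontal gradient energy -- sharper frames have stronger edges.
    return sum((row[i + 1] - row[i]) ** 2
               for row in frame
               for i in range(len(row) - 1))

def pick_keeper(burst):
    # Return the index of the sharpest frame in a burst of frames.
    return max(range(len(burst)), key=lambda i: sharpness(burst[i]))
```

A real implementation would weigh faces, blinks and motion too, but even this crude version shows why the heavy lifting belongs in software rather than in a dedicated camera’s firmware.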
This year, iPhone camera accessories also got great. Olloclip updated its best-in-class 3-IN-1 lens to the 4-IN-1, not only adding a lens but also improving the optical quality of the lenses. Now, for well under $200, you can kit yourself out with two macro lenses, a telephoto, a polarizer, plus fisheye and wide-angle lenses. They’re not as good as Nikon primes, but they’re good enough, and they’re way cheaper and — this is important — they all go almost unnoticed into a jacket pocket. Just like the iPhone.
There are specialty lenses too, like the anamorphic movie lens announced on Kickstarter this month, and ring flash adapters built into iPhone cases. You can even buy “depth-of-field” adapters, which use a ground-glass screen like the one inside an SLR viewfinder to give you the neat out-of-focus backgrounds you’d get from a full-frame camera body.
But the things that really set the iPhone apart from every other kind of camera are its software, and the fact that it’s always connected to the internet.
Software enables some amazing apps (which we’ll see more of in another post this week), but what it really does is let anyone who has a neat idea turn that idea into an actual camera. You want a crazy-specialized time-lapse camera? Easy. You want a camera that shoots high-quality TIFF files and offers manual control over almost every aspect of the picture-taking process? Just download it.
And then there’s sharing. Wi-Fi in SLRs is fine and all, but not really very useful. With the iPhone I can snap some shots of my vacation, edit them over a coffee and have them in a shared photo stream a few minutes after taking them. Or I can have every photo uploaded (privately) to Flickr as soon as I shoot it. The beauty is that this sharing is seamless. You can tether your SLR to your iPhone, or rig up some other janky workaround, but until you can just hit share on the camera and have it uploaded over a 3G or LTE connection, the iPhone will have the edge.
And right now, the iPhone’s camera is improving faster than camera makers are adding cellular connections to their devices. Plus, who wants to add yet another data plan to their monthly bills?
“Real” cameras still have the edge when it comes to image quality, though, and sometimes in UI: the iPhone has a hardware shutter button and that’s it. So can the iPhone improve even further? Sure:
A Bigger Sensor
This one’s tricky, but it would make a huge difference to both image quality and low-light capability (thanks to the bigger pixels). It’s also the key to shallower depth of field, aka blurry backgrounds behind sharp subjects. But a bigger sensor means a bigger lens, and that lens needs to sit further from the sensor. Still, if anyone can come up with a clever way to marry smart design with new technologies and manufacture them by the millions, it’s Apple.
RAW Capture
If camera apps had access to the raw data spat out by the sensor, they could do all kinds of neat things, including lossless reworking of white balance, big exposure tweaks and more. Apple could open up the RAW data to developers, but I don’t see it happening any time soon. Why? Because I suspect the RAW data from the iPhone sensor isn’t that great, and that the iPhone does a whole lot of processing on that data before turning it into a JPEG. And while Apple could send this post-processed data to an app, that would kinda defeat the point of RAW capture.
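To see why RAW access matters for white balance, here’s a toy Python sketch (the numbers and functions are illustrative assumptions, not how iOS actually processes images). RAW values are linear measurements of light with headroom to spare, so a per-channel gain is a clean rework; JPEG values are already baked down and clipped to 8 bits, so the same correction destroys detail:

```python
def wb_raw(raw_rgb, gains):
    # RAW pixel values are linear light with headroom, so per-channel
    # gains rework the white balance without losing anything.
    return [value * gain for value, gain in zip(raw_rgb, gains)]

def wb_jpeg(jpeg_rgb, gains):
    # JPEG values are already processed and clipped to 0-255, so any
    # channel that overflows gets crushed -- that detail is gone for good.
    return [min(255, round(value * gain))
            for value, gain in zip(jpeg_rgb, gains)]
```

Warm a pixel up in both worlds and the JPEG version clips: `wb_jpeg([230, 128, 160], [1.0, 1.0, 2.0])` returns `[230, 128, 255]`, flattening blue-channel detail that the RAW version keeps intact.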
For now then, “real” cameras win this round.
Physical Controls
One of several things I love about my X100S is the array of knobs and dials. Focus and aperture are set with rings around the lens, and pretty much every function I need access to while shooting has a button or dial to let me set it with the camera still at my eye. The iPhone lets you use the volume button to snap a picture.
There have been a few attempts at hardware buttons for the iPhone camera, but they bring the same trouble as carrying a separate camera — they’re not convenient. But what about a bumper case that sank a plug into the Lightning port and had buttons and maybe sliders (in lieu of bulky dials) around the edges? One that was small enough to stay on all the time? Even Bluetooth would be OK, I guess, if it meant you could charge the iPhone with the bumper in place.
The iPhone will only improve as a camera. If ever you needed a good example of the way Apple makes vast improvements in its products while pundits look on, seeing only the small year-to-year incremental tweaks, then the camera is it. The camera was terrible up until the iPhone 3G, and has gotten better every year since. And it seems that Apple is concentrating on crazy new ways to do things that only a camera/computer combo can manage. Burst is one. Live filters are another. Who knows what’s next?