Over the last three years, Google’s Pixel phones have earned a well-deserved reputation for photographic strength. With the Pixel 4 and 4 XL, the company is flexing new camera hardware and software muscles.
The new flagship Android smartphone, which the search giant unveiled Tuesday, gets a second 12-megapixel camera, a key component in an overhauled portrait mode that focuses your attention on the subject by artificially blurring the background. The new portrait mode works more accurately and now handles more subjects and more compositional styles.
The additional camera, a feature Google itself leaked, is just one of the photography advances in the Pixel 4. Many of the others stem from the company’s prowess in computational photography technology, including better zooming, live-view HDR+ for fine-tuning your shots and the extension of Night Sight to astrophotography.
The new features are the surest way Google can stand out in the ruthless, crowded smartphone market. Google knows a lot is riding on the phones. They’re a blip in the marketplace compared with models from smartphone superpowers Samsung and Apple. In June, Google improved its prospects with the low-priced Pixel 3A. But to succeed, Google also needs better alliances with carriers and other retail partners that can steer customers to a Pixel over a Samsung Galaxy.
Improving photography is something Google can do on its own, and photography is important. We’re taking more and more photos as we record our lives and share moments with friends. No wonder Google employs a handful of full-time professional photographers to evaluate its products. So I sat down with the Pixel 4’s camera leaders — Google distinguished engineer Marc Levoy and Pixel camera product manager Isaac Reynolds — to learn how the phone takes advantage of all the new technology.
Levoy himself revealed the computational photography features at the Pixel 4 launch event, even sharing some of the math behind the technology. “It’s not mad science, it’s just simple physics,” he said in a bit of a jab at Apple’s description of its own iPhone 11 computational photography tricks.
Two ways to see three dimensions
To distinguish a close subject from a distant background, the Pixel 4’s portrait mode sees in 3D using an approach borrowed from our own stereoscopic vision. Humans reconstruct spatial information by comparing the different views from our two eyes.
The Pixel 4 makes two such comparisons: one across the short 1mm gap from one side of its tiny lens to the other, and one across a gap about 10 times longer between its two cameras. These dual gaps of different lengths, an industry first, let the camera judge depth for both close and distant subjects.
“You get to use the best of each. When one is weak, the other one kicks in,” Reynolds said.
Those two gaps are oriented perpendicularly, too, which means one method can judge up-down differences while the other judges left-right differences. That should improve 3D accuracy, especially with things like fences with lots of vertical lines.
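The geometry behind this is ordinary triangulation. Here's a toy sketch of how the two baselines complement each other; the focal length, disparity values and fusion rule are illustrative assumptions, not Google's actual pipeline.

```python
# Toy sketch of dual-baseline depth estimation. Stereo depth follows
# depth = focal_length * baseline / disparity, so a longer baseline
# produces a larger, easier-to-measure disparity for far subjects.

FOCAL_LENGTH_PX = 3000.0   # hypothetical focal length, in pixels
SHORT_BASELINE_MM = 1.0    # the gap across the main lens
LONG_BASELINE_MM = 10.0    # the gap between the two cameras

def depth_mm(disparity_px, baseline_mm):
    """Triangulate depth from a measured pixel disparity."""
    return FOCAL_LENGTH_PX * baseline_mm / disparity_px

def fused_depth(disp_short, disp_long, min_reliable_disp=2.0):
    """Prefer the long-baseline estimate when its disparity is large
    enough to measure reliably; otherwise fall back to the short one."""
    if disp_long >= min_reliable_disp:
        return depth_mm(disp_long, LONG_BASELINE_MM)
    return depth_mm(disp_short, SHORT_BASELINE_MM)

# A distant subject: a 6-pixel long-baseline disparity triangulates
# to 5,000mm, or 5 meters away.
print(fused_depth(disp_short=0.6, disp_long=6.0))
```

When the long-baseline disparity gets too small to measure, the short baseline "kicks in," as Reynolds put it.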
Levoy, sitting at Google’s Mountain View, California, headquarters, flipped through photos on his MacBook Pro to show results. In one shot, a motorcycle in its full mechanical glory spans the full width of a shot. In another, a man stands far enough from the camera that you can see him head to toe. The smoothly blurred background in both shots would have been impossible with the Pixel 3 portrait mode.
Google wants you to think of the Pixel 4’s dual cameras as a single unit with a traditional camera’s continuous zoom flexibility. The telephoto camera's focal length is 1.85x that of the main camera, but the Pixel 4 will digitally zoom up to 3X with the same quality as optical zoom.
That’s because of Google’s technology called Super Res Zoom that cleverly transforms shaky hands from a problem into an asset. Small wobbles let the camera collect more detailed scene data so the phone can magnify the photo better.
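The principle is easy to demonstrate in one dimension. In this simplified sketch, which is not Google's algorithm, two frames offset by half a pixel jointly sample detail that neither captures alone:

```python
import numpy as np

# Toy 1D illustration of multi-frame super-resolution: frames shifted
# by a sub-pixel amount sample scene positions a single frame misses,
# so interleaving them recovers a finer-resolution signal.

def capture(scene, offset_px, step=2):
    """Sample a finely detailed 'scene' every `step` samples,
    starting at `offset_px` -- a stand-in for one low-res frame."""
    return scene[offset_px::step]

scene = np.arange(16.0)       # stand-in for fine scene detail
frame_a = capture(scene, 0)   # hand perfectly still
frame_b = capture(scene, 1)   # hand wobbled by one fine-grid sample

# Interleave the two shifted low-res frames onto a grid twice as fine.
merged = np.empty(frame_a.size + frame_b.size)
merged[0::2] = frame_a
merged[1::2] = frame_b

print(np.array_equal(merged, scene))  # the fine detail is recovered
```

The real system must estimate each frame's shift from the image content and handle motion in the scene, but the wobble-as-asset idea is the same.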
“I regularly use it up to 4X, 5X or 6X and don’t even think about it,” Levoy said.
The iPhone 11 has an ultrawide camera that the Pixel 4 lacks. But Levoy said he’d rather zoom in than zoom out. “Wide angle can be fun, but we think telephoto is more important,” he said at the Pixel 4 launch.
HDR+ view as you compose photos
HDR+ is Google’s high dynamic range technology to capture details in both bright and dark areas. It works by blending up to nine heavily underexposed shots taken in rapid succession into a single photo — a computationally intense process that until now took place only after the photo was taken. The Pixel 4, however, applies HDR+ to the scene you see as you’re composing a photo.
That gives you a better idea of what you’ll get so you don’t need to worry about tapping on the screen to set exposure, Levoy said.
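The underlying burst-merge idea can be sketched simply, assuming the frames are already aligned; the real pipeline also aligns frames and rejects ghosting from moving subjects:

```python
import numpy as np

# Minimal sketch of burst merging: average several deliberately
# underexposed shots to cut random noise, then apply gain and a simple
# gamma curve to lift shadows without clipping highlights.

rng = np.random.default_rng(0)
true_scene = np.full(1000, 0.1)                  # dark scene values
burst = [true_scene + rng.normal(0, 0.02, 1000)  # 9 noisy underexposed frames
         for _ in range(9)]

merged = np.mean(burst, axis=0)                  # noise falls ~1/sqrt(9)
tonemapped = np.clip(merged * 4, 0, 1) ** (1 / 2.2)  # gain + gamma

noise_single = np.std(burst[0] - true_scene)
noise_merged = np.std(merged - true_scene)
print(noise_merged < noise_single / 2)           # roughly a 3x reduction
```

Running that merge at preview frame rates, rather than once after the shutter press, is what makes the live HDR+ viewfinder computationally demanding.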
Separate camera controls for bright and dark
Live HDR+ lets Google offer better camera controls. Instead of just a single exposure slider to brighten or darken the photo, the Pixel 4 offers separate sliders for bright and dark regions.
That means you can show a shadowed face in the foreground without worrying you’ll wash out the sky behind. Or you can show details both on a white wedding dress and a dark tuxedo.
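Conceptually, this amounts to blending two gains per pixel based on luminance. The helper below is only an illustration of the idea, not Google's implementation:

```python
import numpy as np

# Hypothetical dual-slider sketch: split the image into shadow and
# highlight regions by luminance and apply independent gains, so
# brightening a dark face leaves a bright sky alone.

def dual_exposure(image, shadow_gain=1.0, highlight_gain=1.0, threshold=0.5):
    """Blend per-pixel between two gains based on luminance."""
    weight = np.clip(image / threshold, 0, 1)  # 0 = shadow, 1 = highlight
    gain = shadow_gain * (1 - weight) + highlight_gain * weight
    return np.clip(image * gain, 0, 1)

scene = np.array([0.1, 0.9])  # a shadowed face, a bright sky
out = dual_exposure(scene, shadow_gain=3.0, highlight_gain=1.0)
print(out)  # shadows lifted, highlights untouched
```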
The dual-control approach is unique, and not just among smartphones, Levoy says. “There’s no camera that’s got live control over two variables of exposure like that,” he said.
Shoot the stars with astrophotography
In 2018, Google extended HDR+ with Night Sight, a path-breaking ability to shoot in dim restaurants and on urban streets by night. On a clear night, the Pixel 4 can go a step further with a special astrophotography mode for stars.
The phone takes 16 quarter-minute shots for a 4-minute total exposure time, reduces sensor noise, then marries the images together into one shot.
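The arithmetic explains the design: each frame is short enough that stars don't visibly trail as the sky rotates, while averaging 16 frames still cuts random sensor noise by roughly a factor of sqrt(16), or 4. A rough sketch of the stacking step, with made-up noise figures:

```python
import numpy as np

# Why split one 4-minute exposure into 16 fifteen-second frames?
# Short frames avoid star trails; averaging them recovers most of the
# noise benefit of the long exposure.

EXPOSURE_S = 15
NUM_FRAMES = 16
total_s = EXPOSURE_S * NUM_FRAMES
print(total_s)  # 240 seconds, i.e. 4 minutes

rng = np.random.default_rng(1)
sky = np.full(500, 0.02)                        # a very dark scene
frames = [sky + rng.normal(0, 0.05, 500) for _ in range(NUM_FRAMES)]
stacked = np.mean(frames, axis=0)

# Noise ratio of one frame vs. the stack: roughly 4x improvement.
print(np.std(frames[0] - sky) / np.std(stacked - sky))
```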
AI color correction
Digital cameras try to compensate for color casts like blue shade, yellow streetlights and orange candle light that can mar photos. The Pixel 4 now makes this adjustment, called white balance, based in part on AI software trained on countless real-world photos.
Levoy showed me an example where it makes a difference, a photo of a woman whose face had natural skin tones even though she stood in a richly blue ice cave.
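For contrast, the classical baseline is the "gray-world" assumption: that the scene averages out to neutral gray. That assumption breaks down in a scene like a blue ice cave, which is exactly where a learned model earns its keep:

```python
import numpy as np

# Classical gray-world white balance: scale each color channel so its
# mean matches the overall mean. A scene that genuinely is mostly blue
# gets wrongly desaturated -- the failure case learned models address.

def gray_world(image_rgb):
    """Scale each channel so per-channel means match the overall mean."""
    channel_means = image_rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(image_rgb * gains, 0, 1)

# A neutral gray wall photographed under a strong blue cast.
cast = np.full((2, 2, 3), [0.3, 0.35, 0.6])
corrected = gray_world(cast)
print(corrected[0, 0])  # channels pulled back toward equal
```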
Better bokeh
The character of out-of-focus regions is called bokeh in photography circles, and with the Pixel 4 it’s improved to be more like what an SLR would produce. That’s because more of the portrait mode math now operates on raw image data, which preserves more information for the calculations. Point sources of light now produce white discs in the bokeh, not gray, for example.
Depth data for better editing
The Pixel 4 adds the ability to record the 3D scene information called a depth map in every photo. That opens powerful editing abilities for tools like Adobe Lightroom, which can handle depth maps in iPhone photos.
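As an illustration of what a stored depth map enables, an editor could scale blur per pixel by distance from the focus plane. The helper below is hypothetical, not Lightroom's actual API:

```python
import numpy as np

# Hypothetical editing helper: compute a per-pixel blur weight from a
# depth map -- zero at the focus plane, growing with distance from it.

def depth_blur_strength(depth_map, focus_depth, max_strength=1.0):
    """Per-pixel blur weight: 0 at the focus plane, capped at max_strength."""
    return np.clip(np.abs(depth_map - focus_depth), 0, max_strength)

# Depths in meters: a nearby subject, and a wall far behind it.
depth = np.array([[1.0, 1.0],
                  [5.0, 9.0]])
strength = depth_blur_strength(depth, focus_depth=1.0)
print(strength)  # subject stays sharp; the background blurs most
```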
All these features represent a massive investment in computational photography — one Apple is mirroring with its own Night Mode, Smart HDR and Deep Fusion. Google has to “run faster and breathe deeper in order to stay ahead,” Levoy acknowledged.
But Apple also brings more attention to Google’s work. “If Apple follows us, that’s a form of flattery,” Levoy said.
Originally published Oct. 15, 7:49 a.m. PT. Updates, 8:08 a.m., 8:24 a.m. and 10 a.m.: Adds detail about Google’s new computational photography abilities.
Published on www.cnet.com, Oct. 15, 2019.