Pixel 7 Pro Cameras' AI and Zoom Could Make You Rethink That iPhone - CNET



Exclusive: New camera hardware and the Tensor G2 AI brain in Google's $899 phone help you zoom, focus, unblur faces and shoot in the dark.

Stephen Shankland has been a reporter at CNET since 1998 and writes about processors, digital photography, AI, quantum computing, computer science, materials science, supercomputers, drones, browsers, 3D printing, USB, and new computing technology in general. He has a soft spot in his heart for standards groups and I/O interfaces. His first big scoop was about radioactive cat poop.

With new hardware, software and artificial intelligence technology, Google promises significant photography improvements in its Pixel 7 Pro. And CNET has the exclusive deep dive into exactly what the company has done with its smartphone camera.

In part because of that photography technology, Google positions its $899 flagship Android phone as a direct competitor to Apple's $1,099 iPhone 14 Pro Max, though the smaller $999 iPhone 14 Pro has the same camera hardware.

Google's Pixel phones haven't sold well compared with models from Samsung and Apple. But they've earned high marks for photography year after year. And if anything is going to win customers over, it'll be camera technology.

Last year's Pixel 6 Pro introduced a "camera bar" housing three rear cameras -- its 50-megapixel main camera, a 0.7x ultrawide angle and a 4x telephoto. The Pixel 7 models keep the same 50-megapixel main camera and f1.85 aperture but house it in a restyled camera bar. On the Pixel 7 Pro, the ultrawide has the same sensor as last year but gets a macro mode, f2.2 aperture, autofocus and a wider 0.5x zoom. The 7 Pro's telephoto zoom extends to 5x with an f3.5 aperture, and a new 48-megapixel telephoto sensor enables a 10x zoom mode without using any digital magnification tricks. Both phones get a new front-facing selfie camera.

But the Pixel 7 Pro's improved hardware foundation is only part of the story. New computational photography technology enabled by new AI algorithms and Google's new Tensor G2 processor speeds up Night Sight, unblurs faces, stabilizes video better and merges data from multiple cameras to improve image quality for intermediate zoom levels like 3x. That's more like how a traditional camera behaves.

"What we really tried to do is give you a 12mm-240mm camera," said Pixel camera hardware leader Alexander Schiffhauer, translating the Pixel 7 Pro's 0.5x to 10x magnification range into traditional 35mm camera terms. "That's like the Holy Grail travel lens," an all-purpose setup that photo enthusiasts have long enjoyed for portability and flexibility.

Here's a deeper look into what Google is up to.

High-end smartphones now come with at least three rear-facing cameras so you can capture a wider range of shots. Ultrawide angle cameras are good for photographing people crammed into a room and interesting buildings. Telephoto cameras are better for portraits and more distant subjects.

But having a big gap between zoom levels can be a problem. The Pixel 7 Pro's 50-megapixel main camera and 48-megapixel telephoto cameras offer something of a fix.

Ordinarily, those cameras use a technique called pixel binning to turn a 2x2 grid of pixels into a single larger pixel. That makes 12-megapixel shots with better color and a wider dynamic range of dark and light tones.

But by skipping pixel binning and using only the central 12 megapixels, the 1x main camera can take a 2x shot, and the 5x telephoto can take a 10x shot. The smaller pixels mean image quality isn't as high, but it's still useful, and Google applies its Super Res Zoom technology to improve color and sharpness, too. (The 2x mode works on Google's Pixel 7, too.) 
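To picture the difference between the two readout modes, here's a minimal sketch in Python with NumPy. The sensor size, function names and grayscale data are illustrative assumptions, not Google's actual imaging pipeline: one function averages 2x2 blocks the way binning does, the other keeps only the central quarter of the frame for a 2x shot.

```python
import numpy as np

def bin_2x2(sensor: np.ndarray) -> np.ndarray:
    """Pixel binning: average each 2x2 block into one larger 'pixel',
    trading resolution for better color and dynamic range."""
    h, w = sensor.shape
    return sensor.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def central_crop(sensor: np.ndarray) -> np.ndarray:
    """Skip binning and keep only the central quarter of the frame,
    which doubles the effective zoom at the sensor's native pixel pitch."""
    h, w = sensor.shape
    return sensor[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

# Scaled-down stand-in for raw sensor data (a real sensor reads out a
# color filter array, not a plain grayscale grid).
sensor = np.random.rand(800, 600)

one_x = bin_2x2(sensor)          # binned: a quarter of the pixels, 1x field of view
two_x = central_crop(sensor)     # cropped: a quarter of the pixels, 2x field of view
print(one_x.shape, two_x.shape)  # both (400, 300)
```

Either way you end up with the same pixel count, which is why both modes produce roughly 12-megapixel photos on a 50-megapixel sensor.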

Apple took the exact same approach with its iPhone 14 Pro phones, but only with the main 1x camera.

The Pixel 7 Pro's 1x camera can take photos zoomed to 2x with its central pixels, and its 5x camera can take 10x photos.

Where Apple went further is in letting you shoot full 48-megapixel photos with the iPhone 14 Pro's 1x camera. At 1x, Google always uses pixel binning and thus captures only 12 megapixels.

Read more: Pixel 7, Pixel 7 Pro and Pixel Watch: Everything Google Just Announced

When shooting between 2.5x and 5x zoom, the Pixel 7 Pro blends image data from the wide and telephoto cameras to improve image quality. That improves photos compared with just digitally upscaling a photo from the main camera, Schiffhauer said.

But that's difficult. The phone has to reconcile the two cameras' slightly different perspectives, in which foreground objects block background objects differently. You can see this for yourself by covering first one eye and then the other to see how a scene changes. The two cameras also focus differently because of their different focal lengths.

To avoid discontinuities, Google uses artificial intelligence, also called machine learning, and other processing techniques to figure out which portions of each image to include or reject.

Zoom fusion takes place after other processing methods. Those include HDR+, which merges several frames into one image for better dynamic range, and an AI algorithm that monitors hand shake to take photos when the camera is most stable.

Technology called zoom fusion improves the quality of photos taken between 2.5x and 5x zoom by adding pixels from the 5x camera to the central portion of the image. AI helps align the two views and reconcile differences.
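Here's a rough sketch of the fusion idea in Python with NumPy: paste aligned telephoto detail into the center of the digitally upscaled main-camera frame, weighted by a per-pixel confidence that stands in for the AI-driven decision about which camera's data to trust. The sizes, alignment and weighting are illustrative assumptions, not Google's implementation.

```python
import numpy as np

def fuse_zoom(main_upscaled: np.ndarray, tele_aligned: np.ndarray,
              confidence: np.ndarray) -> np.ndarray:
    """Blend an aligned telephoto crop into the center of the upscaled
    main-camera image. `confidence` (0..1) stands in for the learned
    per-pixel decision about which camera to trust, e.g. low where
    parallax or focus differences cause mismatches."""
    h, w = main_upscaled.shape[:2]
    th, tw = tele_aligned.shape[:2]
    y0, x0 = (h - th) // 2, (w - tw) // 2     # telephoto covers only the center
    fused = main_upscaled.copy()
    center = fused[y0:y0 + th, x0:x0 + tw]
    fused[y0:y0 + th, x0:x0 + tw] = (confidence * tele_aligned
                                     + (1.0 - confidence) * center)
    return fused

# Toy example at an intermediate zoom like 3x: the main image is upscaled to
# the target framing, and telephoto detail exists only for the central region.
main_upscaled = np.random.rand(1200, 1600)
tele_aligned = np.random.rand(600, 800)
confidence = np.full((600, 800), 0.8)         # mostly trust the sharper telephoto

result = fuse_zoom(main_upscaled, tele_aligned, confidence)
print(result.shape)   # (1200, 1600)
```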

Unfortunately for those who like to shoot raw photos, an image format that offers higher quality and more editing flexibility, zoom fusion isn't an option there. You'll get full 12-megapixel raw images only at the Pixel 7 Pro's fixed zoom levels of 0.5x, 1x, 2x, 5x and 10x.

Google introduced technology in 2021 to marry data from the main and ultrawide cameras to cope with motion blur in faces that can spoil photos. This face unblur technology now kicks in three times more often, said Pixel camera software leader Isaac Reynolds.

Specifically, it'll work more often in ordinary light, activate more often when it's dim, and function even on faces that aren't in focus.
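As a rough illustration of the capture-time idea, which amounts to blending the sharper of two aligned face crops into the final photo, here's a toy sketch in Python. The sharpness metric, blend weight and the assumption that the crops are already aligned and equally sized are all simplifications, not Google's algorithm.

```python
import numpy as np

def sharpness(patch: np.ndarray) -> float:
    """Crude sharpness score: variance of the image gradients.
    (A stand-in for whatever blur metric the real pipeline uses.)"""
    gy, gx = np.gradient(patch)
    return float(np.var(gx) + np.var(gy))

def unblur_face(main_face: np.ndarray, second_face: np.ndarray,
                blend: float = 0.7) -> np.ndarray:
    """If the second camera's face crop is sharper, blend its detail
    into the main camera's face region; otherwise keep the original."""
    if sharpness(second_face) > sharpness(main_face):
        return blend * second_face + (1.0 - blend) * main_face
    return main_face

# Toy usage with random 128x128 "face crops" standing in for real image data.
main_face = np.random.rand(128, 128)
second_face = np.random.rand(128, 128)
print(unblur_face(main_face, second_face).shape)  # (128, 128)
```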

For processing photos after the fact, the Google Photos app gets a new tool to unblur shots. It works even with digitized film photos from the bygone age of analog photography.

By the way, if you want to hear more from Reynolds himself, you can listen to Google's podcast about the Pixel 7 camera technology.

The Pixel 7 Pro has several autofocus improvements, starting with the addition of autofocus hardware on the ultrawide camera. For all the cameras, though, Google now uses an AI algorithm to process focusing data from the image sensor.

The camera also will be able to spot eyes, not just faces, for autofocus that works more like that found in high-end cameras from companies like Sony, Nikon and Canon.

New AI technology also can focus better as people move in a scene. "If someone turns their head away from the camera or walks away, we can maintain focus on their head," Reynolds said.

The Pixel 7 phones also do better when faces are hard to recognize, like with big hats or really large dark sunglasses.

And new AI-based autofocus technology makes it much faster to switch to telephoto shooting. The Pixel 6 Pro often pauses when activating its telephoto camera. 

Read more: Pixel 7 vs. Pixel 6: Comparing Google's Flagship Phones

From 5x to 10x zoom, the Pixel 7 Pro uses the central pixels of the 5x camera to take a 12-megapixel photo.

With digital magnification methods, though, the camera can reach up to 30x zoom, up from 20x in the Pixel 6 Pro. Google developed a method called Super Res Zoom that takes advantage of hand shake to gather more detailed data about the subject and zoom better.

The Pixel 7 Pro has three rear-facing cameras: the main 1x camera with a 50-megapixel sensor, a 0.5x ultrawide camera that now also can take close-up macro photos, and a "periscope" style 5x telephoto camera that reflects light inside the phone body to accommodate the longer optics required.

Google's digital zoom also can use AI techniques to magnify images. This year, Google trained its AI to predict new pixels better. The phone also calculates a scene attribute called an anisotropic kernel, which models how a pixel's values change toward its neighbors, to fill in new data more convincingly while magnifying.
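The core of multiframe super-resolution can be sketched in a few lines of NumPy: scatter samples from several slightly shifted frames onto a finer grid and average them. In this sketch the shifts are supplied directly and the averaging is uniform; those are placeholders for the alignment from hand shake and the learned anisotropic kernel weighting Google describes, not the actual Super Res Zoom pipeline.

```python
import numpy as np

def super_res_merge(frames, shifts, scale=2):
    """Merge several low-res frames onto a finer grid. Each frame is offset
    by a known sub-pixel shift (in reality the shifts come from natural
    hand shake and must be estimated by alignment). Unfilled fine-grid
    positions simply stay at zero in this simplified version."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        # Map each low-res sample to its (shifted) position on the fine grid.
        fy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        fx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(weight, (fy, fx), 1.0)
    return acc / np.maximum(weight, 1e-6)

# Four slightly shifted captures of the same toy scene, as hand shake provides.
rng = np.random.default_rng(0)
frames = [rng.random((100, 100)) for _ in range(4)]
shifts = [(0.0, 0.0), (0.25, 0.5), (0.5, 0.25), (0.75, 0.75)]
print(super_res_merge(frames, shifts).shape)   # (200, 200)
```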

"Obviously, the quality of 30x isn't going to be quite what it is at 10x," Schiffhauer said. "But you still get these really beautiful photos that you can share."

Night Sight, the pioneering and now widely copied technology to take better shots when it's dim or dark, is now twice as fast. That's because Google uses image frames it collects from before you tap the shutter release button. (That's possible because the camera continuously collects imagery, stashing it in memory but only keeping it if you actually take a photo.)

"Users are waiting half as much time to capture a Night Sight photo," Reynolds said. "They're getting sharper results, and they're paying no penalty on noise."

Google's Tensor G2 processor doubles Night Sight photo speeds on the Pixel 7 and Pixel 7 Pro, capturing some frames for the photo earlier and reducing noise with more effective AI processing.
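The pre-shutter trick amounts to keeping a small ring buffer of viewfinder frames so the burst is partly finished before you tap. Here's a toy sketch in Python; the class, frame counts and merge step are illustrative stand-ins, not the Pixel's actual capture pipeline.

```python
from collections import deque

class NightSightBuffer:
    """Sketch of the pre-shutter idea: the viewfinder keeps streaming frames
    into a small ring buffer, so when the shutter is pressed, some of the
    frames needed for the long Night Sight burst are already in memory."""

    def __init__(self, prebuffer_size=8):
        self.ring = deque(maxlen=prebuffer_size)   # old frames fall off the end

    def on_viewfinder_frame(self, frame):
        self.ring.append(frame)                    # discarded unless you shoot

    def capture(self, camera, frames_needed=15):
        pre = list(self.ring)                      # already-captured frames
        remaining = max(0, frames_needed - len(pre))
        post = [camera() for _ in range(remaining)]  # only wait for the rest
        return merge(pre + post)

def merge(frames):
    """Stand-in for the HDR+/Night Sight merge (align, denoise, tone-map)."""
    return sum(frames) / len(frames)

# Toy usage: 'camera' returns a number standing in for a frame.
buf = NightSightBuffer()
for _ in range(8):
    buf.on_viewfinder_frame(1.0)                   # viewfinder streams frames
photo = buf.capture(camera=lambda: 1.0)            # shutter press waits for fewer frames
print(photo)
```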

For the Pixel line's astrophotography mode -- the extreme version of Night Sight -- new AI technology for removing noise speckles now preserves stars better.

Google also overhauled the Pixel 7 cameras' video, a weak point compared with iPhones in many reviewers' eyes. For starters, all Pixel 7 Pro cameras now shoot up to 60 frames per second at 4K resolution, and there's a lot more beyond that.

Together, the improvements show Google is fighting hard to maintain its leading smartphone camera technology. "We're pushing hardware, software and machine learning as far as you can go," Schiffhauer said.

Correction, Oct. 7: An earlier version of this story incorrectly described the Pixel 7 and 7 Pro phones' main 50-megapixel camera. The camera is the same as the one used in the Pixel 6 models.