Google Explains Magic Behind Pixel 4 Astrophotography, Night Sight Camera Modes

Photography is an art form in every sense, and taking great-looking photos can be challenging even when lighting conditions are ideal. Bad lighting will really test a photographer's skills, and night photography is especially tricky. That's what makes Google's Night Sight technology on its Pixel phones so awesome: it allows casual users to snap great-looking photos of the night sky and other dark scenes. How exactly does the magic happen, though?

Google provided some insight into its Night Sight feature in a new blog post, which explains how astrophotography with Night Sight on Pixel phones is accomplished. This is not a brand-new feature, of course, but as Google explains, this year's implementation "pushes the boundaries of low-light photography with phone cameras."

"By allowing exposures up to 4 minutes on Pixel 4, and 1 minute on Pixel 3 and 3a, the latest version makes it possible to take sharp and clear pictures of the stars in the night sky or of nighttime landscapes without any artificial light," Google says.

As Google goes on to explain, the amount of light that a camera's image sensor detects inherently has some uncertainty. This is called shot noise, and it's the reason why low-light photos get riddled with grain (or noise). The more light a sensor collects, the higher the signal relative to that noise, and the cleaner the resulting image.
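To get a feel for why collecting more light helps, note that photon arrivals follow Poisson statistics, so the noise in a pixel's count grows only as the square root of the signal. The toy simulation below is just a NumPy sketch of that statistical fact, not anything from Google's pipeline:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Photon arrivals follow Poisson statistics: a pixel that expects
# `mean_photons` photons sees a standard deviation of sqrt(mean_photons).
# Signal-to-noise ratio (SNR) therefore grows with the square root of
# the light collected -- which is why longer effective exposures help.
for mean_photons in (10, 100, 1_000, 10_000):
    counts = rng.poisson(lam=mean_photons, size=100_000)
    snr = counts.mean() / counts.std()
    print(f"{mean_photons:>6} photons -> SNR ~ {snr:6.1f} "
          f"(sqrt(N) = {np.sqrt(mean_photons):6.1f})")
```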

Night Sight Blur
Two-minute exposure resulting in motion-blurred stars

"Extending the exposure time for a photo increases the total amount of light captured, but if the exposure is long, motion in the scene being photographed and unsteadiness of the handheld camera can cause blur. To overcome this, Night Sight splits the exposure into a sequence of multiple frames with shorter exposure times and correspondingly less motion blur," Google explains.

This process entails aligning the frames to compensate for both camera shake and in-scene motion (since subjects tend to move), then averaging them. Individual frames can be riddled with noise, but when combined, a cleaner-looking image emerges. We can attest to this, as we took night shots when comparing the cameras on the Pixel 4 XL to the Galaxy Note 10.
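As a rough illustration of that align-and-average step (Google's blog notes the Pixel 4 splits a long exposure into up to 15 frames of at most 16 seconds each), here is a minimal NumPy sketch. The phase-correlation alignment is a crude stand-in for the tile-based alignment the real HDR+/Night Sight merge performs, and `estimate_shift` and `align_and_average` are hypothetical helper names:

```python
import numpy as np

def estimate_shift(reference, frame):
    """Estimate the integer (dy, dx) translation between two grayscale
    frames via phase correlation. The peak of the inverse-transformed
    cross-power spectrum sits at the offset that best aligns the pair."""
    cross = np.fft.fft2(reference) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = reference.shape
    # Wrap shifts larger than half the frame into negative offsets.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def align_and_average(frames):
    """Shift every frame onto the first one, then average the stack.
    Averaging N aligned frames cuts shot noise by roughly sqrt(N)."""
    stack = [frames[0].astype(np.float64)]
    for frame in frames[1:]:
        dy, dx = estimate_shift(frames[0], frame)
        stack.append(np.roll(frame, (dy, dx), axis=(0, 1)).astype(np.float64))
    return np.mean(stack, axis=0)
```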

Night Sight Sky
No sky processing (left) compared to sky darkening (right)

Night Sight also takes into consideration that a brightened night photo viewed on a screen can appear much lighter than the scene actually looked, making night feel like day. To counter this effect and keep night skies looking dark, Night Sight selectively darkens the sky, using machine learning to detect which parts of an image represent it.

"An on-device convolutional neural network, trained on over 100,000 images that were manually labeled by tracing the outlines of sky regions, identifies each pixel in a photograph as 'sky' or 'not sky'. Sky detection also makes it possible to perform sky-specific noise reduction, and to selectively increase contrast to make features like clouds, color gradients, or the Milky Way more prominent," Google says.

There's more to it than that, and if you have a coffee break to spare, it's worth reading the blog post in its entirety (hit the link in the Via field below).

Image source: Google