The camera on the Google Pixel and Pixel XL phones has been rated as the best of any smartphone by DxOMark. Other reviews have also shown the camera to compare favorably with other industry-leading smartphones, such as the Samsung Galaxy S7 and iPhone 7. Based on hardware specifications alone, this might seem surprising, given that the Pixel camera has the same sensor size (1/2.3"), megapixel count (12.3) and aperture (f/2.0) as the cameras in the Nexus 5X and Nexus 6P. The Pixel cameras do add phase detection autofocus over the Nexus devices, but most of the improvement comes from new software rather than hardware.
HDR (High Dynamic Range) is a digital photography technique for dealing with scenes that include both bright and dark areas, such as an object in shadow in front of a bright sky. A single standard exposure can retain detail in either the bright areas or the dark areas, but not both. HDR combines multiple images taken in quick succession, traditionally with each image taken at a different level of exposure. The combined image can then be manipulated to show a high level of detail across the whole frame. Most smartphone cameras over the last few years have included HDR functionality, but Google has now taken the software in a different direction with the Pixel phones.
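For a rough sense of how traditional exposure bracketing works, here is a minimal sketch in Python. It is not any camera's actual implementation; the function name and the simple mid-tone weighting are illustrative assumptions, and a real pipeline would also align the frames and tone map the result.

```python
import numpy as np

def merge_exposure_bracket(frames, exposures):
    """Merge a bracketed stack (e.g. low/medium/high exposure) into one
    high-dynamic-range radiance estimate, assuming a linear sensor response.

    frames:    list of float arrays scaled to [0, 1], all the same shape
    exposures: relative exposure time for each frame
    """
    numerator = np.zeros_like(frames[0], dtype=np.float64)
    denominator = np.zeros_like(frames[0], dtype=np.float64)
    for frame, t in zip(frames, exposures):
        # Trust mid-tone pixels most; clipped highlights and crushed
        # shadows carry little usable detail.
        weight = 1.0 - np.abs(frame - 0.5) * 2.0
        numerator += weight * frame / t
        denominator += weight
    radiance = numerator / np.maximum(denominator, 1e-6)
    # The merged radiance map would then be tone mapped back down
    # to a displayable range.
    return radiance / max(radiance.max(), 1e-6)
```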
There are two main differences in how the Pixel and Pixel XL handle HDR images. Firstly, the camera captures images even before you press the shutter button; secondly, every image is taken at a low exposure, rather than bracketing high, medium and low exposures. The first point is critical to the speed of the camera. Combined with a fast processor, it means that the Pixel can use the button press as a timestamp and combine images it already has available, minimizing any lag. Google argues that by combining multiple low-exposure images it can better retain color information and also avoid unwanted artifacts that can arise with traditional HDR techniques.
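The following sketch shows the general shape of that capture flow. It is an illustration of the idea described above, not Google's code: the class and function names are invented, and frame alignment, demosaicing and proper tone mapping are all omitted.

```python
import collections
import numpy as np

class ZeroShutterLagCamera:
    """Illustrative sketch: frames are captured continuously into a ring
    buffer, all deliberately underexposed, and the shutter press merely
    marks which already-captured frames to merge.
    """

    def __init__(self, buffer_size=9):
        self.ring_buffer = collections.deque(maxlen=buffer_size)

    def on_new_frame(self, raw_frame):
        # Called for every frame the sensor produces, even before the
        # user presses the shutter button.
        self.ring_buffer.append(raw_frame.astype(np.float64))

    def on_shutter_press(self):
        # Merge the frames already sitting in the buffer. Averaging
        # several short, underexposed frames suppresses noise (roughly
        # by the square root of the frame count), so shadows can be
        # brightened afterwards without the highlights ever clipping.
        stack = np.stack(list(self.ring_buffer))
        merged = stack.mean(axis=0)
        return brighten_shadows(merged)

def brighten_shadows(image, gamma=0.5):
    # Placeholder tone curve: lift the dark regions of the merged frame.
    normalized = image / max(image.max(), 1e-6)
    return normalized ** gamma
```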
Many have commented on the lack of Optical Image Stabilization (OIS) in the Pixel cameras, which could lead to blurry images in low light. But Google argues that its HDR software means each image can be taken at a fast shutter speed, eliminating, or at least reducing, the need for hardware stabilization. HDR+ is turned on by default on the Pixel phones, although it can be switched off, and Google has designed it to be used for all types of images.
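To see why shutter speed matters here, a rough back-of-the-envelope calculation shows how quickly handshake blur shrinks as the exposure gets shorter. All of the numbers below are assumptions chosen for illustration, not Pixel specifications.

```python
# Assumed handheld wobble, field of view and sensor width for illustration.
shake_rate_deg_per_s = 1.0   # slow handheld shake (assumed)
horizontal_fov_deg = 67.0    # rough wide-angle phone camera FOV (assumed)
sensor_width_px = 4048       # approximate width of a 12.3 MP, 4:3 sensor

for exposure_s in (1 / 10, 1 / 30, 1 / 100, 1 / 500):
    blur_deg = shake_rate_deg_per_s * exposure_s
    blur_px = blur_deg / horizontal_fov_deg * sensor_width_px
    print(f"1/{round(1 / exposure_s)} s exposure -> ~{blur_px:.1f} px of motion blur")
```

Under these assumed numbers, a 1/10 s exposure smears the image by several pixels, while a 1/500 s exposure keeps the blur well under a pixel; the underexposure that results from such short shutter times is then recovered by merging the stack of frames.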