When a new device is unveiled at a keynote, many details get left out, either because they're of secondary importance or simply due to time constraints. Now that first impressions have settled, here are 5 things Google has not made obvious about the second-generation Pixels.
1. There’s a 'dark' theme
While digging into the system files of new Android updates, modders have repeatedly come across references to a dark mode as a secondary theme for Google devices. But the feature had never been active until now.
On the new Pixel 2 and Pixel 2 XL there is a dark mode that activates automatically when a mainly dark wallpaper is set on the home screen. The theme does not extend to the Settings menu, system apps or third-party apps.
The only noticeable difference is in the Pixel Launcher, which adopts a dark theme in the app drawer. Pull down the notification shade and you will also see the quick settings panel in black. Folders darken too, which is especially convenient since the two new Pixels use OLED and AMOLED display technologies, known for their true blacks.
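Google hasn't documented exactly how the launcher decides a wallpaper is "mainly dark," but the general idea can be approximated with an average-luminance check. The sketch below is a hypothetical illustration in Python, not Google's actual algorithm; the 0.5 threshold and the sample pixel values are assumptions.

```python
# Hypothetical sketch: decide whether a wallpaper is "mainly dark"
# by averaging the relative luminance of its pixels.
# The 0.5 threshold is an assumption, not Google's documented behavior.

def relative_luminance(rgb):
    """Rec. 709 luma for an (r, g, b) tuple with 0-255 channels, scaled to 0-1."""
    r, g, b = rgb
    return (0.2126 * r + 0.7152 * g + 0.0722 * b) / 255.0

def is_mainly_dark(pixels, threshold=0.5):
    """True if the average luminance of the pixels falls below the threshold."""
    avg = sum(relative_luminance(p) for p in pixels) / len(pixels)
    return avg < threshold

# A mostly dark wallpaper with a few bright pixels still counts as dark:
night_sky = [(10, 10, 30)] * 95 + [(255, 255, 255)] * 5
print(is_mainly_dark(night_sky))  # True
```

A real launcher would of course sample the actual wallpaper bitmap rather than a pixel list, but the decision itself reduces to a simple brightness statistic like this.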
2. Google Assistant also works with the display off. Sort of...
One of the new features Google introduced on its new smartphones is the ability to call up the Assistant with a simple squeeze of the phone’s edges—ready to answer your questions or commands. Few people know, however, that this also works with the display off. Kind of.
By squeezing the edges of a Pixel 2, you can call up the Assistant and dictate commands even with the screen off. But if you have secure unlock enabled (fingerprint, PIN, pattern or anything else), you will be asked to unlock the phone before it continues. Quite annoying, considering the Assistant is clearly already working in the background.
3. Google Lens works only in Google Photos
Google Lens is a new image search feature, but on the new Pixels it works only in a limited fashion so far. You can't yet point the camera at something and ask the Assistant for information.
To use it, you need to take a picture with the regular camera app, open Google Photos, find the newly captured image, open it, and only then tap the Google Lens button. Hopefully Google will improve the integration with its OS soon.
4. Offline music recognition is limited
Google has introduced a very interesting feature, called Now Playing, as part of its Ambient Services. It can recognize some songs playing in the surrounding environment and show the song and artist name on the Always On Display without you asking for anything. All totally offline.
This seems like black magic, but in reality the phone draws on an on-device database of about 50 MB that contains information on only around 10,000 songs. Anything that is not on the list will not be recognized offline.
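Those two numbers imply a very tight storage budget per track. A quick back-of-the-envelope calculation (the exact fingerprint format is not public, so this is only an order-of-magnitude estimate):

```python
# Back-of-the-envelope: storage budget per recognized song.
# Figures come from the stated limits: a ~50 MB database covering ~10,000 songs.
db_bytes = 50 * 1024 * 1024   # 50 MB in bytes
songs = 10_000
per_song_kb = db_bytes / songs / 1024
print(f"about {per_song_kb:.0f} KB per song")  # about 5 KB per song
```

Roughly 5 KB per song is far too small to store any audio, which is why the database can only hold compact identifying data and why everything outside those 10,000 tracks goes unrecognized.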
5. In portrait mode, you cannot control the level of blur
Another novelty introduced by Google is the ability to take pictures with the (now somewhat overused) bokeh effect using a single camera, in contrast to the competition. The camera uses artificial intelligence to recognize the type of scene, pick out the foreground subject and intelligently blur the background based on distance.
What Google has not told us, however, is that the level of blur cannot be adjusted (as it can with competitors); instead it is decided autonomously by the AI. All you can choose in Google Photos is whether or not to apply the bokeh effect. Better than nothing, but not great for users who want more control.
What do you think of the new Pixel phones? Are they everything you hoped for?