Although the Google Photos app is only 10 months old, the service has already amassed more than 100 million monthly active users. The reason Google’s cloud-based photo management software has gained so many users so quickly? It makes organizing photos dead easy. It automatically backs up all your images from all of your devices into one central location, plus it collects pictures of the same people or objects into groups, and helps you find images in your archive with text searches. No wonder it’s already become an integral part of photo organization for so many people.

One of its best features is the Assistant, which is sort of a little robotic helper that keeps your photos in check when you just don’t have the time. It combs through your photos, stitches together GIFs and collages, and suggests enhancements and edits to improve your shots.

Today, the Assistant will start doing something else: auto-creating albums and selecting your best shots without any user input whatsoever. It should be especially handy for vacation photos, as it'll also chart your trips on a map, and automatically recognize and tag famous places in each frame. These new Assistant-created albums are actually a souped-up version of the existing “Stories” feature, which created little pan-and-scan montages out of related photos. Google says Albums—both manual and Assistant-created—will replace Stories.

Auto Focus

In the case of an out-of-town trip, Google Photos will recognize the start and end dates of the journey, identify the “best” images, create an album from them, and add a basic Google Map of your journey as a title card. From there, you can edit which photos appear in the album, fine-tune the locations in the map, and add captions to the album. Once you’re done, you can share the album with your contacts and even make it a collaborative effort.

But how, exactly, does an algorithm pick your “best” photos? And does turning off location services and geotagging hamper the accuracy of Google Photos’ vacation-tracking ways? Not at all, say two Googlers involved in the project.

“There’s a bunch of ways we can select the ‘best’ photos,” says Google Photos product manager Francois de Halleux. “We use a lot of machine learning to detect the elements in a photo that make it of better quality than another. We also eliminate duplicates.”

In many cases, the system pegs photos with landmarks in them as the “best” shots. And it can identify those famous places without using your location data at all, although geotagging does help with some fact-checking. In the most useful cases, it will identify and name landmarks, helping you remember where you went or simply spell things correctly.

“We can detect landmarks, we have 255,000 landmarks that we automatically recognize,” says de Halleux. “It’s a combination of both computer vision and geotags. Even without the geotags, we’d be able to recognize a landmark.”

In those cases, Google’s system recognizes a landmark, then double-checks that recognition against the photo’s geotag. That cross-check can also help the system tell the real deal from a replica.

“If we see a photo of the Eiffel Tower, we know the person is in Paris… or in Las Vegas,” explains Google Photos product lead David Lieb.
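To make the Eiffel Tower example concrete, here's a toy Python sketch of how a vision label and a geotag could be combined. This is purely our illustration, not Google's code: the landmark list, coordinates, and function names are all made up for the example.

```python
from math import radians, sin, cos, asin, sqrt

# Hypothetical landmark database: name -> (lat, lon) of the site.
# The replica idea: if the photo's geotag sits far from a landmark's
# true location, another candidate site may be the better match.
LANDMARKS = {
    "Eiffel Tower (Paris)": (48.8584, 2.2945),
    "Eiffel Tower replica (Las Vegas)": (36.1126, -115.1726),
}

def haversine_km(a, b):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(h))

def disambiguate(vision_label, geotag):
    """Pick whichever candidate site lies closest to the photo's geotag."""
    candidates = {k: v for k, v in LANDMARKS.items() if vision_label in k}
    if geotag is None or not candidates:
        # No geotag (or unknown landmark): fall back to vision alone,
        # which matches Google's claim that geotags are optional.
        return vision_label
    return min(candidates, key=lambda name: haversine_km(LANDMARKS[name], geotag))
```

With a Las Vegas geotag, `disambiguate("Eiffel Tower", (36.11, -115.17))` resolves to the replica; with a Paris geotag it resolves to the real thing; with no geotag it simply keeps the vision label.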

Vacations aren’t the only use case for the new Assistant-created albums, but they’re likely to be the most common one. Lieb and de Halleux say the Assistant looks at how far you’ve traveled from home as a trigger for album auto-creation, but it also pays attention to how many pictures you’ve taken in a short period of time and whether it’s a national holiday or other significant day.
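The three triggers described above can be pictured as a simple rule. Here's a rough sketch, again our own illustration with invented thresholds and an invented holiday list, not Google's actual logic:

```python
from datetime import date

# Illustrative list of significant days; Google's real set is unknown.
HOLIDAYS = {date(2016, 7, 4), date(2016, 12, 25)}

def should_suggest_album(km_from_home, photos_taken_today, day,
                         travel_km=100, burst=30):
    """Toy heuristic: suggest an album if any reported signal fires."""
    if km_from_home >= travel_km:       # far from home: looks like a trip
        return True
    if photos_taken_today >= burst:     # lots of shots in a short window
        return True
    return day in HOLIDAYS              # national holiday or significant day
```

A real system would weigh these signals together rather than checking them one by one, but the sketch captures the reported inputs: distance, shooting volume, and the calendar.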

Facing Forward

Outside of those situations, you may still have some albums created and suggested by the Assistant. In those cases, it looks for group shots with the faces of people it deems “important” to you—faces that show up regularly in your other pictures. It also picks the ones where everyone has their eyes open or is smiling. Ideally, both.
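The "eyes open, smiling, ideally both" preference amounts to scoring each candidate shot. A minimal sketch of that idea (our invention, with made-up attribute names, not Google's model):

```python
def group_shot_score(faces):
    """Score a photo by averaging per-face quality.

    faces: list of dicts like {"eyes_open": True, "smiling": False},
    as a hypothetical face detector might report. Each face contributes
    0.5 for open eyes and 0.5 for a smile, so a shot where everyone
    has both scores a perfect 1.0.
    """
    if not faces:
        return 0.0
    return sum((f["eyes_open"] + f["smiling"]) / 2 for f in faces) / len(faces)
```

Picking the album cover is then just `max(photos, key=group_shot_score)` over the candidate group shots.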

While recognizing faces in photos and grouping them together in the “People” section of the app is as impressive as it is creepy, Google says its machine learning stops short of assigning real-world identities to those faces. Users can tag a person as “Mom” or “Grandpa,” for example, but they’re private tags for sorting and organizational purposes within your own photo roll. The Assistant recognizes if a person is important to you based on the frequency of their appearance in your pictures.
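The frequency idea is easy to sketch: given the app's anonymous face groupings (cluster IDs, not real identities), a person is "important" if they show up in enough of your photos. This is a toy illustration with an invented threshold, not Google's implementation:

```python
from collections import Counter

def important_faces(photos, threshold=0.5):
    """Return the face-cluster IDs appearing in at least `threshold`
    of all photos. `photos` is a list of per-photo lists of IDs;
    set() deduplicates so a face counts once per photo."""
    counts = Counter(face for photo in photos for face in set(photo))
    return {face for face, n in counts.items() if n / len(photos) >= threshold}
```

For example, a face tagged in four of five photos clears a 0.5 threshold, while one-off faces in the background do not.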

“We think it’s a way to get all the benefit of this face-grouping stuff without any of the creepiness or problems that might ensue from it,” says Lieb. “We think it’s the right place to be on that privacy spectrum.”

The new Assistant-created albums will roll out today on the Google Photos app for Android and iOS, as well as the web version of Photos. You’ll likely need to wait for the Assistant to work its magic, but to see if it has created any album suggestions for you, tap the “Assistant” tab at the bottom of the main screen. You can also create your own albums by tapping the “+” button at the top of the app and selecting “Create new Album.”

