Is your iPhone sharing photos with Apple by default?

Sure enough, when I checked my iPhone 15 Pro this morning, the toggle was switched to on. You can find it for yourself by going to Settings > Photos (or System Settings > Photos on a Mac). Enhanced Visual Search lets you look up landmarks you’ve taken pictures of or search for those images using the names of those landmarks.

To see what it enables in the Photos app, swipe up on a picture you’ve taken of a building and select “Look up Landmark,” and a card will appear that ideally identifies it. Here are a couple of examples from my phone:

That’s definitely Austin’s Cathedral of Saint Mary on the left, but the image on the right, which Photos identified as a Trappist monastery, is actually the Dubuque, Iowa city hall building.
Screenshots: Apple Photos

On its face, it’s a convenient expansion of Visual Look Up, a Photos feature Apple introduced in iOS 15 that lets you identify plants or, say, find out what the symbols on a laundry tag mean. But Visual Look Up doesn’t need special permission to share data with Apple, and this does.

A description under the toggle says you’re giving Apple permission to “privately match places in your photos with a global index maintained by Apple.” As for how, there are details in an Apple machine-learning research blog about Enhanced Visual Search that Johnson links to:

The process starts with an on-device ML model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.

According to the blog, that vector embedding is then encrypted and sent to Apple to compare with its database. The company offers a very technical explanation of vector embeddings in a research paper, but IBM put it more simply, writing that embeddings transform “a data point, such as a word, sentence or image, into an n-dimensional array of numbers representing that data point’s characteristics.”
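To make the embedding idea concrete, here is a minimal, purely illustrative sketch in Python. The vectors and "index entries" below are made up, and Apple's actual pipeline runs encrypted comparisons at a much higher dimension; this just shows how comparing two n-dimensional arrays of numbers can signal whether two images depict the same thing.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors:
    # values near 1.0 suggest the vectors describe similar content.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional embeddings; a real landmark model would
# produce vectors with hundreds or thousands of dimensions.
photo_roi = [0.9, 0.1, 0.3, 0.0]   # hypothetical embedding of a photo's region of interest
cathedral = [0.8, 0.2, 0.4, 0.1]   # hypothetical index entry for a cathedral
city_hall = [0.1, 0.9, 0.0, 0.5]   # hypothetical index entry for a city hall

# The closer match wins: here the photo scores much higher
# against the cathedral entry than the city hall entry.
print(cosine_similarity(photo_roi, cathedral))
print(cosine_similarity(photo_roi, city_hall))
```

A misidentification like the Trappist monastery example above would correspond to the wrong index entry happening to score highest against the photo's embedding.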

Like Johnson, I don’t fully understand Apple’s research blogs and Apple didn’t immediately respond to our request for comment about Johnson’s concerns. It seems as though the company went to great lengths to keep the data private, in part by condensing image data into a format that’s legible to an ML model.

Even so, making the toggle opt-in, like those for sharing analytics data or recordings of Siri interactions, rather than something users have to discover, seems like it would have been a better option.
