
Apple’s Update Mistake—New Warning For iPhone Users
Created on 3 January, 2025 • News • 4 minutes read
When it comes to protecting the privacy and security of its more than one billion iPhone users, Apple does not make many mistakes. But when it does, it does so emphatically. And so this week has brought a startling uproar, with users suddenly asking whether what happens on their iPhones still stays on their iPhones.
We are talking, of course, about photos, and about the complex Enhanced Visual Search tool into which Apple appears to have "auto-opted" all its users, "violating" its famous privacy promise in the process. As I reported when this first came to light, Apple is capturing, masking and then centrally processing parts of user photos to flag and then geolocate landmarks against a central dataset. If it works as advertised, there is no real risk to user privacy, but very few people will understand the technical details, and so it becomes a leap of faith.
Is this actually an invasion of your privacy? Not really, but it is not nothing either. Apple has two serious problems here. The first is one of optics: when you build a brand inside a privacy bubble, you cannot afford to prick it with a pin at any time. The second is a thin-end-of-the-wedge issue. This kind of hybrid device/cloud photo scanning has landed Apple in trouble before, with its ill-fated CSAM plans in 2021.
That child safety update was intended to screen photos on-device against a hashed database of known, illegal CSAM imagery and then, once multiple images had been flagged, to send them for human review. As was pointed out at the time, the concern was not the CSAM screening per se, but rather the door it opened to screening for other material, whether religious, sexual or political, depending on local laws and regulations. As I said at the time, the robust defense that such screening is technically impossible would suddenly fall away.
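To illustrate the shape of that mechanism, and only the shape: the names below are hypothetical stand-ins, since Apple's actual design used a perceptual "NeuralHash" and cryptographic threshold secret sharing rather than a plain hash set and counter. The on-device logic amounted to matching each photo against a known hash database and escalating only once a threshold of matches was crossed:

```python
import hashlib

# Hypothetical sketch of threshold-gated, on-device hash matching.
# SHA-256 stands in for Apple's perceptual NeuralHash, and a simple
# counter stands in for its threshold secret-sharing scheme.

MATCH_THRESHOLD = 30  # several matches required before any report

def photo_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

def should_escalate(photos: list[bytes], known_hashes: set[str]) -> bool:
    # Count photos whose hash appears in the known database; nothing
    # leaves the device unless the threshold is crossed.
    matches = sum(1 for p in photos if photo_hash(p) in known_hashes)
    return matches >= MATCH_THRESHOLD
```

The wedge argument was that nothing in this structure is specific to CSAM: swap in a different hash database and the same machinery screens for any material a government cares to list.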
And so it is here, possibly. The idea that a user's photos might be screened against a cloud dataset for any reason is unpalatable to many, at least to those who see their iPhones as a for-their-eyes-only vault. As privacy expert Matthew Green commented on Bluesky, "It's very frustrating when you learn about a service two days before New Year's and you find that it's already been enabled on your phone."
Apple says Enhanced Visual Search "allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides your IP address. This prevents Apple from learning about the information in your photos."
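In rough outline, and strictly as a conceptual sketch rather than Apple's actual implementation (every function below is a hypothetical stand-in), the client-side flow Apple describes looks something like this: embed the candidate landmark on-device, perturb the result for differential privacy, encrypt it so the server can match it without reading it, and send it through a relay that hides who sent it:

```python
import random

# Conceptual, simplified sketch of the client-side flow Apple describes.
# All of these functions are hypothetical stand-ins: the real pipeline
# uses a learned embedding model, genuine homomorphic encryption (so the
# server computes similarity on ciphertext it cannot decrypt), and an
# OHTTP relay operated by a third party.

def embed_region(photo_region: bytes) -> list[float]:
    # Stand-in for the on-device ML model that maps a region of a
    # photo (a possible landmark) to a fixed-length vector.
    return [b / 255.0 for b in photo_region[:16]]

def add_dp_noise(vector: list[float], epsilon: float = 1.0) -> list[float]:
    # Differential privacy: add calibrated noise so any single query
    # leaks little about the exact photo content.
    return [v + random.gauss(0.0, 1.0 / epsilon) for v in vector]

def encrypt_for_server(vector: list[float]) -> bytes:
    # Stand-in for homomorphic encryption; a real scheme would let the
    # server score the query against its landmark index without ever
    # seeing the plaintext vector.
    return repr(vector).encode()

def query_landmark_index(photo_region: bytes) -> bytes:
    ciphertext = encrypt_for_server(add_dp_noise(embed_region(photo_region)))
    # In the real system this is sent through an OHTTP relay, which
    # strips the client's IP address before the query reaches Apple;
    # only the device can decrypt the server's (also encrypted) answer.
    return ciphertext
```

If each layer works as claimed, Apple learns neither who asked nor what was asked about, which is the basis of its "no privacy risk" position.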
But as Jeff Johnson, the blogger who ignited this furor, cautions, "I don't understand most of the technical details of Apple's blog post. I have no way to directly judge the correctness of Apple's implementation of Enhanced Visual Search... Computing privacy is simple: if something happens entirely on my computer, then it's private, whereas if my computer sends data to the manufacturer of the computer, then it's not private, or at least not entirely private."
Apple's total control over on-device versus off-device is already being challenged by the new cloud-based AI services it is rolling out. Much work has gone into designing and then marketing Apple's "groundbreaking" Private Cloud Compute, which effectively offers a cloud extension of the device's secure enclave, allowing central processing inside a user's private zone. Where processing leaves that enclave, as with ChatGPT, Apple clearly flags it to the user. Contrast that with the quiet nature of this update. As Green puts it, "it was 'discovered' not announced by Apple."
And that is the problem here: the lack of transparency. It is made worse because, as Michael Tsai cautions, "not only is it not opt-in, but you can't effectively opt out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you've previously opted out of uploading your photos to iCloud... I don't think the company is living up to its values here."
That is the real issue here: optics and perception. And it is an unforced error. Had Apple communicated this more clearly, there would have been little if any controversy, and most users would not have opted out. But elsewhere the iMaker has gone to such lengths to compel opt-ins for any off-device data collection that this stands out as an outlier. I would not be surprised to see a U-turn or a retrospective opt-in of some kind.
Apple notes that "you can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and navigate to Settings > General." I have approached the company for comment on this controversy; nothing so far.