Apple creates Visual Intelligence, a proprietary version of Google Lens

Apple appears to be responding to Google Lens with its new feature, Visual Intelligence, which debuted with the iPhone 16. Unveiled at the company's September 2024 event, Visual Intelligence aims to enable more intelligent interactions between users and their surroundings.

The feature is activated with the Camera Control button, a new touch-sensitive button on the right side of the device. With a single click, Visual Intelligence can recognize objects, provide details, and suggest actions based on whatever the camera is pointed at. Point it at a restaurant, for example, and it will display the menu, hours, and reviews; snap a picture of an event flyer and the event is added straight to your calendar. You can point it at a dog to identify the breed, or click on a product to find out where to buy it online.

According to Apple’s announcement, Camera Control will also serve later this year as a gateway to third-party tools with specialized domain knowledge. Users will be able to turn to Google for product searches or ChatGPT for problem-solving, for example, while retaining control over when these tools are used and what information is shared with them. Apple stressed that the feature was designed with privacy in mind, meaning users’ searches and identifications remain hidden from the company.

Because the data is processed on the device itself, Apple argues, Visual Intelligence protects user privacy: the company never learns what you clicked on.