Apple has announced a new visual search feature for the iPhone 16, powered by its suite of AI capabilities, "Apple Intelligence." Accessible through the Camera Control button on the iPhone 16 and 16 Plus, the feature combines reverse image search with text recognition, allowing users to search for information directly from images.
One of the highlighted applications of this visual search feature is finding restaurants. By pointing your camera at a restaurant sign or menu, you can instantly access information about the restaurant, including its hours, ratings, menu options, and the ability to make a reservation.
The applications of visual search extend beyond restaurants. Apple showcased other examples, such as using it to quickly add event information to your calendar, or to gather information about a product or landmark.
Apple's visual search feature utilizes Google Search for information retrieval. However, the company also announced a partnership with OpenAI, allowing users to send visual queries to ChatGPT through the Camera Control button.
Apple emphasizes that its visual search feature respects user privacy and does not store images. The feature is set to launch in beta in October for U.S. English users, with broader availability expected in December and into early 2025.
The iPhone 16's visual search feature represents a significant step forward in how we interact with information. By combining reverse image search with text recognition and AI, users can quickly access a wealth of information directly from their camera, and as the technology evolves and expands, we can expect even more innovative applications of this capability.