Until now, Snapchat's camera has been associated mainly with disappearing messages and basic AR effects. With an update to its Scan feature, Snapchat is expanding the app's visual search capabilities.
Scan has been part of the popular messaging app for a while, helping users identify songs or solve math problems. It can also identify objects in the real world, such as clothes or dog breeds.
Thursday's Scan update on iOS
On Thursday, Scan will be upgraded and placed front and center in the app's camera.
Snapchat's prominent placement of Scan signals that it is gradually becoming more than just a messaging app.
Scan also addresses another growing challenge for Snapchat users: finding AR effects, or Lenses, made by community creators. Scan will suggest Lenses based on what you're looking at, which should encourage Snapchat users to make more AR content.

Google Lens' new competitor?
Visual search isn't a new concept. Google launched Lens in 2017, allowing users to scan items with their phone camera and identify them using Google's search results. Google Lens is integrated into Pixel and other Android phones, as well as Google's mobile app. Today, Google Lens is the predominant visual search engine.
Additionally, Pinterest offers a visual search feature called Lens that shows similar images based on what you scan in the app.
Snap has a real chance of taking visual search mainstream. Since the camera is Snapchat's main activity and entry point, any change directly affects how 300 million daily users interact with the app.
Snap says more than 170 million people were already using Scan at least once a month, even before Snapchat put it front and center on the camera.
“We think scanning will be one of the priorities for [Snapchat's] camera going forward,” Snap's head of camera product, Eva Zhan, told The Verge in an exclusive interview. “We envision the camera doing much more than it can do today.”
Snap's Scan partners: Shazam, Amazon, Allrecipes, Photomath, Vivino
Snap began working on Scan several years ago after observing how Snapchat users scanned QR codes to add friends to the app. Snap added the ability to identify items available for sale on Amazon after initially collaborating with Shazam to identify songs and with Photomath to solve math problems at a glance.
Snap previewed this latest Scan version at its developer conference earlier this year. It adds detection for dog breeds, plants, wine, cars, and food nutrition information. Other companies power most of Scan's features; the wine scanning feature, for example, is powered by Vivino.
Soon, Allrecipes will offer a Scan feature that suggests recipes based on a specific food ingredient. Snap plans to continue adding capabilities to Scan over time using outside partners and its in-house technology.

The shopping aspect: Screenshop
Scan's most significant addition is a shopping feature built by Snap and enabled by its acquisition of Screenshop, an app that lets you upload screenshots of clothing to shop for similar items. Using Scan, you can discover clothes similar to what you're looking at and buy them.
Eventually, Snap will also add Scan's shopping feature to Snapchat's Memories section, where users can shop for clothes based on screenshots or pictures they have saved.
Camera shortcuts and Spotlight
Another aspect of Scan is its camera shortcuts. This feature suggests a combination of the camera mode, soundtrack, and Lens. The feature will launch on the iOS version of Snapchat today and roll out to Android by the end of the year.
For example, if you point the camera at the sky, Lenses designed to work with the sky will show up along with a song clip and a color filter, so you can make all the changes at once.
Moreover, Snap said it is looking into adding camera shortcuts to its TikTok rival Spotlight, allowing users to quickly jump into their camera with the same configuration used to create the video they just watched.
Limitations
Scan's camera shortcuts are limited to only a few situations: sky shots, human feet, dogs, and dancing. Snapchat will eventually expand the situations in which camera shortcuts are helpful, and the inclusion of Spotlight suggests how shortcuts could become a more integral part of video creation.
Scan and AR gear
Snap plans to use Scan to introduce users to AR lenses in the future. The company recently started allowing Lens creators to tag their Lenses with relevant keywords to help Scan suggest Lenses based on what the camera sees.
Looking further ahead, scanning becomes even more compelling when paired with AR glasses such as Snap's Spectacles.
The new Snap Spectacles have a dedicated Scan button on the frame, which triggers Lenses depending on what the wearer is looking at.
Even though Scan is fairly bare-bones today, it shows how Snap is evolving the camera's use cases.