SmartLens, as seen on the company’s website, is aimed at turning the iPhone camera into a quick search device. (SmartLens screen grab)

Michael Royzen is at it again. The 18-year-old Seattle software developer and entrepreneur has created another company and today released another new app for iOS — his seventh.

SmartLens is supposed to turn your smartphone into a search box, according to its App Store description. Royzen told GeekWire he likes to think of it as a “Shazam for the visual world,” referencing the app that can identify songs after hearing just a snippet of music.

“My parents and I have always loved going on hikes and trying to identify various plants and animals,” said Royzen, a senior at Seattle’s Bush School and a former GeekWire Geek of the Week. “However, there were no apps at the time that could identify objects offline. About a year ago, as I was starting to pursue AI, I decided to make an app that could do just this. My goal was to make an app that could identify anything (products, animals, landmarks, etc.), with the ability to recognize many of these objects offline.”

Michael Royzen has been building apps since age 11. (Photo courtesy of Michael Royzen)

SmartLens relies on multiple convolutional neural networks, and Royzen said that running them on the device allows for significantly faster recognition than cloud-based models. The app can recognize exactly 17,527 objects offline, from furniture to obscure animals. To keep the app’s download size in check, he said, product recognition requires an internet connection.

If SmartLens recognizes a product, it will enable you to buy it in one tap. If it identifies an animal, landmark, food, or painting, it will show you a Wikipedia description. If it recognizes a business, it will show you reviews.
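The split Royzen describes — a compact on-device model for general objects, with product lookups and per-category follow-ups layered on top — can be sketched roughly as below. This is a minimal illustration of the routing logic, not SmartLens’s actual code; all labels, categories, and function names are hypothetical.

```python
# Illustrative sketch: general objects resolve against an on-device label set,
# while product identification requires a network call. Each recognized
# category maps to the follow-up action the article describes.

# Stand-in for the ~17,500 labels the offline model covers (hypothetical).
OFFLINE_LABELS = {
    "bernese mountain dog": "animal",
    "bell pepper": "food",
    "fork": "object",
}

def action_for(category: str) -> str:
    """Map a recognized category to the app's follow-up action."""
    return {
        "product": "show one-tap purchase links",
        "animal": "show Wikipedia description",
        "landmark": "show Wikipedia description",
        "food": "show Wikipedia description",
        "painting": "show Wikipedia description",
        "business": "show reviews",
    }.get(category, "show label only")

def recognize(label: str, online: bool) -> str:
    """Route a candidate label through offline recognition or the network."""
    if label in OFFLINE_LABELS:          # fast on-device hit
        return action_for(OFFLINE_LABELS[label])
    if online:                           # e.g. packaged products
        return action_for("product")
    return "needs internet connection"   # products aren't in the offline set
```

Under this sketch, `recognize("bernese mountain dog", online=False)` returns the Wikipedia action with no network, while a granola-bar box only resolves to purchase links when a connection is available.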

The SmartLens app logo.

Royzen built the app himself — neural networks, interface design, icons and more — while friends helped him test it and offered advice on the interface. He spent a lot of time reading computer vision research papers, which inspired his own patent-pending models. All told, Royzen has put more than 1,000 hours into the project.

Since he started making apps at the age of 11, he’s made a bunch:

  • ASprit4Mars (a platform game)
  • Top Verse (an app that displays top song, movie and TV show information)
  • Arcade Ninja (another game)
  • NoCrash (a driving assistant)
  • RecipeReadr (an app that reads recipes aloud)
  • Ryde (an app that tells people when they need to leave for commutes).

This fall he’ll continue to be busy, as he’ll be enrolling in the Turing Scholars Program for computer science undergraduates at the University of Texas at Austin.

Human, human, fork! SmartLens takes a shot at a bobblehead, GeekWire’s Clare McGrane, and a utensil. (GeekWire screen grabs)

I downloaded a beta version of SmartLens for my iPhone X last week and gave it a test ride around the office, pointing it at everything from houseplants and tissue boxes to bobblehead toys and TV remotes. SmartLens did a great job on plant life — it immediately offered up the variety of plant my co-worker has sitting on her desk, and identified some sunflowers in another office.

The app also correctly identified a fork and another co-worker as a “human.” But it also called a Russell Wilson bobblehead a human, which technically is sort of accurate.

A box of Nature Valley granola bars passed the product ID test, and by swiping up on the image generated by my iPhone’s camera, I was given options for how to purchase more of the product, including an Amazon link.

From left, SmartLens’s accurate hit on a sunflower, and the description available to users, as well as a product identification and how to purchase those granola bars. (GeekWire screen grabs)

SmartLens was definitely quick, though it sometimes alternated between what it thought was the correct “answer” for certain objects, depending on angle or lighting. It struggled with some electronic devices around the office, including a TV remote, which it thought was a phone, and an Amazon Echo, which it identified as an oil filter.

“Currently, SmartLens can give useful information about a recognized object more than 90 percent of the time — but it varies with the quality of the image and lighting, as well as the type of object being recognized,” Royzen said. “While this isn’t perfect, it’s far better than Google Lens (which can’t recognize basic things such as chairs and bell peppers). In particular, SmartLens is very good at recognizing books, packaged products, animals, and flora. An area for improvement is recognizing specific technological devices; it will always know whether something is a laptop or smartphone, but sometimes it gets the brand wrong.”

SmartLens misidentifies an Echo device as an oil filter and just misses on the breed of a GeekWire pup — a mini Aussie that definitely has Bernese coloring. (GeekWire screen grabs)

When another co-worker showed up with her dog, it provided another good test for the app, to see if it could nail the breed. Scout is a miniature Australian shepherd, but SmartLens called her a Bernese mountain dog.

To the app’s credit, Scout gets that a lot from humans, too.

“My goal is to get SmartLens’ success rate as close to 100 percent as possible,” Royzen said. “I have designed a new version of its model architecture which will further increase its speed and accuracy. In the future, customers will be able to correct the app if it is wrong and, with their consent, upload the correct image for future model training.”


