Lose It!, one of the earliest food tracking apps, announced the beta launch of a new feature this morning: Snap It, which allows users to log food by taking a picture of it (then selecting the correct food from a list).
“We’ve been thinking about it for years and years and years, and the question has been, when will be the time when the technology is close enough that anyone will ever use it? And we think it’s finally at the point,” Charles Teague, CEO and founder of Lose It!, told MobiHealthNews.
Other companies have been working on such a solution for a long time too. Notable efforts have emerged from Google, DailyBurn, researchers at Cornell and UCLA, and Massive Health, whose acquisition by Jawbone suggested for a while that the wearable company might pursue the space. But Lose It! believes it has improved on even the best past efforts.
“To test the efficacy of the Snap It Framework, Lose It! used it to build a model for the Food101 dataset, which defines 101 food categories and contains 101,000 images that map to them,” the company explained in a fact sheet about the launch. “It is the go-to dataset for food image analysis and allows direct performance comparison for predictive models. FoodDist, the model developed by researchers at Cornell and UCLA, has, until now, held the highest prediction accuracy rate of 83.1 percent for top 1 accuracy and 95.8 percent for top 5 accuracy. Snap It’s model improves upon that benchmark, achieving an accuracy rate of 87.3 percent for top 1 accuracy and 97.1 percent for top 5 accuracy.”
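For readers unfamiliar with the jargon, top 1 and top 5 accuracy are the standard ways image classifiers are scored: the former counts a prediction as correct only when the model's single best guess matches the true food, the latter when the true food appears anywhere in its five best guesses. The short Python sketch below illustrates the metric in general terms; it is not Lose It!'s code, and the arrays here are random stand-ins rather than real Food101 predictions.

```python
import numpy as np

def top_k_accuracy(probs, labels, k):
    """Fraction of images whose true label appears among the
    model's k highest-scoring predictions."""
    # probs: (n_images, n_classes) predicted scores per food class
    # labels: (n_images,) integer index of the true food class
    top_k = np.argsort(probs, axis=1)[:, -k:]          # k best guesses per image
    hits = np.any(top_k == labels[:, None], axis=1)    # is the true label among them?
    return hits.mean()

# Toy example over the 101 Food101 classes with random scores
rng = np.random.default_rng(0)
probs = rng.random((1000, 101))
labels = rng.integers(0, 101, size=1000)

print("top-1 accuracy:", top_k_accuracy(probs, labels, 1))
print("top-5 accuracy:", top_k_accuracy(probs, labels, 5))
```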
Teague said that though the eventual goal is to have a totally automated product, Snap It Beta will simply make a guess and display the top few foods it thinks might be in the picture. Then users can choose the right one.
“The experience people have when it works is going to be mind-blowing, and when it doesn’t, it won’t be any worse than what Lose It! is today,” Teague said.
Like many machine learning apps, Snap It will get smarter the more people use it.
“I think in the long run, if enough people take a picture of a Dunkin Donuts cup with a lid on it and log coffee, the system will figure out that even though that’s not a brown liquid, that contains coffee,” Teague said.
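The mechanism Teague is describing is the familiar feedback loop for this kind of classifier: each time a user corrects the model's guess, the photo and the food they actually selected become a new labeled training example. The sketch below shows that general pattern under assumed names (FeedbackStore, log_correction, training_batch); it is an illustration of the technique, not Lose It!'s pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackStore:
    """Collects (photo, user-chosen food) pairs for later retraining."""
    examples: list = field(default_factory=list)

    def log_correction(self, photo_id: str, predicted: str, chosen: str):
        # Even when the top guess was wrong (a lidded Dunkin Donuts cup
        # that doesn't look like brown liquid), the user's selection is
        # a trustworthy label for that photo.
        self.examples.append(
            {"photo": photo_id, "label": chosen, "model_guess": predicted}
        )

    def training_batch(self):
        # Periodically fold the corrections back into the training set
        # so the classifier improves as more people use the feature.
        return [(ex["photo"], ex["label"]) for ex in self.examples]

store = FeedbackStore()
store.log_correction("img_0042.jpg", predicted="cola", chosen="iced coffee")
print(store.training_batch())
```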
The app is more than just a cool toy because it addresses the biggest barrier to adoption for food tracking, which has a pretty good track record as weight loss methods go. One of the biggest reasons people don’t track food, or stop tracking it, is that the process is so cumbersome, so efforts to streamline it are common in mobile health. Food tracking with the camera could someday make tracking calories in as simple as fitness wearables have made tracking calories out.
“We’re just at the start of working on that problem,” Teague said. “It’s a much harder problem, but I actually think the benefit of solving it is quite a bit larger. Because what we eat has the tendency to overwhelm the exercise we do pretty quickly.”