Prediction: A Shazam app for heartbeats

By: Brian Dolan | May 10, 2012        


Soon enough, people will be able to record a snippet of their own heart sounds, upload it to an analytics engine through a smartphone app, and receive a diagnosis. Sort of like a Shazam app for heart sounds.

That is, at least, according to Steinar Pedersen, founder of Tromso Telemedicine Consult, who made the prediction — among many others — during his closing keynote Monday afternoon at the HIMSS mHealth Symposium in Copenhagen, Denmark. (I was honored to kick things off with an opening presentation on the mobile health landscape, but steered clear of any prognostications.)

Shazam is a smartphone app that helps people identify the music they hear; it boasts more than 200 million downloads and claims 1.5 million more each week. The idea is simple: hear a song at the mall or in the car, use Shazam to record a short piece of it, and get back the artist, song, and album name. The app makes money a few ways, but primarily by connecting users to online music stores to buy the music they discover.

A Shazam app for heart sounds would work in a similar way, except that instead of the smartphone’s microphone, users would use the smartphone’s camera, which can record heart rates via apps like Azumio’s Instant Heart Rate, or a peripheral medical device like AliveCor’s (not yet FDA cleared) iPhoneECG. Unlike Shazam’s database of (almost) all the world’s music, the repository of identified heart sounds that a Shazam for the heart would require does not yet exist.
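To make the camera idea concrete: blood flow subtly changes how much light the skin reflects, so the dominant frequency of the frame-to-frame brightness signal tracks the pulse. The sketch below is an illustrative reconstruction of that principle, not any vendor’s actual algorithm; the synthetic data stands in for real video frames.

```python
import numpy as np

def heart_rate_from_frames(green_means, fps):
    """Estimate pulse (bpm) from the mean green-channel brightness of each frame."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                # remove the DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)         # plausible pulse: 42-240 bpm
    peak = freqs[band][np.argmax(power[band])]
    return 60.0 * peak

# Synthetic stand-in for 10 seconds of 30 fps video with a 72 bpm pulse
rng = np.random.default_rng(0)
t = np.arange(300) / 30.0
frames = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.005 * rng.standard_normal(300)
print(round(heart_rate_from_frames(frames, fps=30)))  # → 72
```

Restricting the search to a physiologically plausible band is what keeps slow lighting drift or camera shake from being mistaken for a heartbeat.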

Interestingly, cardiologist Dr. Leslie Saxon, who heads up USC’s Center for Body Computing, announced an initiative last month that aims to create just such a database. In a sense, her hoped-for database, dubbed EveryHeartBeat.org, turns the Shazam model on its head. People will record their heart sounds and upload them to EveryHeartBeat, but early adopters may not receive much in return at first. Saxon told FastCompany that over time “there could be all sorts of abnormalities that EveryHeartBeat could pick up with relatively simple algorithms,” including atrial fibrillation, which typically produces no symptoms. Saxon said the platform could enable “unbelievably predictive analytics across populations.”
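As a thought experiment, the “relatively simple algorithms” Saxon describes could work the way audio fingerprinting does: reduce each recording to a compact spectral signature and return the nearest labeled match in the repository. Everything in the sketch below — the fingerprint scheme, the labels, the toy two-entry library — is invented for illustration.

```python
import numpy as np

def fingerprint(signal, n_bands=8, max_bin=40):
    """Crude spectral signature: normalized energy in coarse low-frequency bands."""
    spectrum = np.abs(np.fft.rfft(signal))[:max_bin]
    bands = np.array_split(spectrum, n_bands)
    energy = np.array([b.sum() for b in bands])
    return energy / energy.sum()   # normalize so recording volume doesn't matter

def classify(signal, library):
    """Return the label whose stored fingerprint is nearest the new recording."""
    fp = fingerprint(signal)
    return min(library, key=lambda label: np.linalg.norm(fp - library[label]))

# Toy "repository": signatures of synthetic 60 bpm and 150 bpm beat patterns
t = np.linspace(0, 4, 4000, endpoint=False)   # 4 s sampled at 1 kHz
library = {
    "normal (~60 bpm)": fingerprint(np.sin(2 * np.pi * 1.0 * t)),
    "fast (~150 bpm)": fingerprint(np.sin(2 * np.pi * 2.5 * t)),
}

# A new recording at ~144 bpm matches the faster stored pattern
print(classify(np.sin(2 * np.pi * 2.4 * t), library))  # → fast (~150 bpm)
```

The hard part, as the article notes, is not the matching step but assembling a large, clinically labeled library to match against.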

During her TEDMED talk, Saxon referred to EveryHeartBeat as a “Facebook of Medicine”, but it seems much more like Shazam based on its core value proposition: What is this I’m hearing?

Saxon told MobiHealthNews during an interview this week that her team is in “super stealth mode” and cannot discuss details about EveryHeartBeat yet, but she has been floored by the amount of interest the project has already garnered — and the video of her TEDMED talk hasn’t even hit the Internet yet. She aims to launch the platform sometime in 2013.

A few years ago another group of researchers made headlines for discussing the potential of another big data project funded by a grant from the Bill & Melinda Gates Foundation. As we wrote way back in November 2009:

“Health workers know the difference between a wet cough and a dry cough; between a productive and a non-productive cough; and between a voluntary and an involuntary cough. If one Bedford, MA-based start-up, STAR Analytical Services, succeeds, soon health workers will be able to use their smartphones to diagnose patients by recording and automatically analyzing their coughs instead. STAR’s software would record a patient’s cough and compare it to a database of pre-recorded coughs that include the sounds of respiratory diseases from people of both sexes, and various ages, weights, etc. The database only has several dozen patients’ coughs on record right now, but they estimate they will need about 1,000 before the software becomes reliable.”

More recently, apps that promise to help users determine whether moles and other skin growths are potentially harmful have hit app stores. Last summer we wrote about a startup in Romania, led by two dermatologists, that created just such an app, called Skin Scan:

“Skin Scan works by first taking a picture of a mole using the iPhone camera; the app then uses a proprietary algorithm to analyze the fractal-like shapes which exist in human skin. The algorithm then decides if the shape of the mole is developing normally, or abnormally into a potentially cancerous melanoma. Abnormal growth is noted to the user, and there is a feature to search for nearby doctors within the app.”
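Skin Scan’s algorithm is proprietary, but a classic way to quantify “fractal-like” shape irregularity is the box-counting dimension: count how many grid cells of shrinking size the shape’s border touches, and fit the slope on a log-log plot. A smooth border scores near 1, while a ragged, melanoma-like border scores higher. The sketch below applies the idea to a synthetic mole; any clinical cutoff for “abnormal” would be a separate, invented assumption.

```python
import numpy as np

def box_count(mask, box):
    """Count box-by-box grid cells containing at least one foreground pixel."""
    h, w = mask.shape
    h, w = h - h % box, w - w % box            # trim so the grid divides evenly
    cells = mask[:h, :w].reshape(h // box, box, w // box, box)
    return cells.any(axis=(1, 3)).sum()

def fractal_dimension(mask, boxes=(2, 4, 8, 16)):
    """Box-counting dimension: slope of log N(box) against log(1/box)."""
    counts = [box_count(mask, b) for b in boxes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(boxes)), np.log(counts), 1)
    return slope

# Synthetic "mole": a filled disk, reduced to its one-pixel border
yy, xx = np.mgrid[:128, :128]
disk = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2
interior = disk[:-2, 1:-1] & disk[2:, 1:-1] & disk[1:-1, :-2] & disk[1:-1, 2:]
border = disk[1:-1, 1:-1] & ~interior

# A smooth circular border should score close to 1
print(round(fractal_dimension(border), 2))
```

Tracking how that score drifts across repeated photos of the same mole, rather than judging a single snapshot, is the kind of monitoring such an app would need to be useful.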

Overall, this is a promising and long-anticipated trend: Big data and analytics are just starting to pair with the ever-increasing number of mobile health devices, apps, and sensors coming online.

  • Alex Shaykevich

    I do some academic work in this field, though I’m not an expert. I really doubt camera-based HR gives enough resolution for the kinds of ambitions they are proposing. An ECG or a commercial Polar or ANT+ monitor, definitely, but not a camera.

  • Radostina

    I agree with Alex. While I am not an expert, I work with experts. We know that while the camera sensor can give you a quick read, it is definitely not up to medical quality. Polar and ANT+ sensors, which work with SweetWater Health’s SweetBeat app, are great for moving individuals; HeartMath’s emWave technology is great for stationary ones; and maybe even AliveCor’s new product will prove stable enough. While an EKG can give you information about your heart’s health, you will have to delve deeper to learn about types of diseases and how far into them you are. I mentioned SweetBeat earlier; they use heart rate variability to measure stress. This measurement could be used diagnostically and might be in the future of companies like this.

  • http://geoffclapp.blogspot.com/ Geoffrey Clapp

    Since there are some questions about clinical research (a few comments that sum up to “I’m not an expert, but I don’t see how a camera could work”) — here’s a peer-reviewed journal article evaluating a camera-based approach.

    I’m not saying it’s better, I’m pointing out there’s a good set of research out there we can use to discuss the facts and debate the research, and apply them to use-cases. 

    I think the use-case basis for any debate is the key – in some cases, AliveCor makes more sense, others, a camera, some, high-end diagnostics. I don’t think it’s a black/white selection issue.

    Here’s one:
    http://www.opticsinfobase.org/oe/abstract.cfm?uri=oe-18-10-10762

    And some related work from the same author(s):

    M.Z. Poh, D.J. McDuff and R.W. Picard, “Advancements in non-contact, multiparameter physiological measurements using a webcam,” IEEE Trans Biomed Eng, vol. 58, no. 1, pp. 7-11, 2011.

    M.Z. Poh, K.H. Kim, A.D. Goessling, N.C. Swenson and R.W. Picard, “Cardiovascular monitoring using earphones and a mobile device,” IEEE Pervasive Computing, to appear, 2010.

  • Chosenleader

    Ladies and gentlemen, at this time we are requesting that you return to your seat and make sure your seat belt is securely fastened as we begin our initial descent into 1984!