is in terms of analytic sensitivity. If you
test something using a phone, what’s
its absolute accuracy? But context is incredibly valuable. If I’m coughing a lot in
Seattle versus in a high-risk tuberculosis
region in South Africa, it means something very different, and in fact, physicians already use this context indirectly.
“Are you at risk for something? Where
have you been? What’s your family history? What region do you reside in?”
Those things aren’t always built into a
blood draw. So, the blood draw gives you
one number, but now, machine learning
can incorporate all this additional information and maybe even be more indicative of what’s happening.
How have health providers reacted?
I think clinicians and clinical scientists are moving in that direction. They
understand it and they see that it’s
where the field is heading. But health
practitioners still have to think about
the near term—they have to physically
see patients, determine the best course
of treatment, and so on. It’s a challenge
to bridge that gap. If you’re a general
practitioner who’s taking care of 1,000
patients, how are you going to deal with
1,000 hemoglobin readings each day?
It’s just not possible, and that’s why a
lot of these mobile and home health
technologies have not really been successful. If you can’t figure out how to
integrate your tools into the system
we have now, the treatments are never going to adapt to whatever new sensing
techniques you’ve created.
In addition to being a professor at the
University of Washington, you also
spend time at Google, where you direct
a health technologies group. Is there
anything you can say about the work?
A lot of it is looking at new opportunities for machine learning and sensors
in the healthcare space. It’s still early in
our explorations, but one of the exciting things about it is the opportunity
to start thinking about scale. I was able
to validate and prototype a lot of things
in the academic world, but at Google,
we can start to look at disseminating
it more broadly—that’s all I can say for
now, but that’s the high-level goal.
Leah Hoffmann is a technology writer based in Piermont, NY, USA.
© 2019 ACM 0001-0782/19/9 $15.00
Yet capturing the right data to get ahead of a major health issue is incredibly difficult.
Most people see a doctor every one
to two years. There can be a lot of indicators that could help you get ahead of
a problem well before you are symptomatic. The challenge is, we don’t have
access to that information. With the intersection of new sensing techniques and sensors that are lower in cost—not to mention more capable phones, AI, and machine learning—we are at a time when this can actually start to work. We
can automate a lot of the work, triage it
using machine learning, and escalate
the cases that look like they’re emergent.
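The automate-triage-escalate idea described here can be sketched in a few lines. The rule-based check below is an illustrative stand-in for a learned classifier, and the hemoglobin reference range is an assumed example, not a clinical value:

```python
# Sketch of ML-assisted triage: rather than sending a clinician thousands of
# raw readings a day, automatically flag only the ones that look emergent.
# The out-of-range rule stands in for a trained model; the reference range
# below is illustrative, not a clinical standard.

NORMAL_HEMOGLOBIN = (12.0, 17.5)  # g/dL, assumed adult reference range

def triage(readings):
    """Split patient readings into routine vs. escalate-to-clinician."""
    lo, hi = NORMAL_HEMOGLOBIN
    escalate = {p: v for p, v in readings.items() if not lo <= v <= hi}
    routine = {p: v for p, v in readings.items() if lo <= v <= hi}
    return routine, escalate

readings = {"patient-001": 14.2, "patient-002": 8.9, "patient-003": 13.1}
routine, escalate = triage(readings)
print(sorted(escalate))  # → ['patient-002']
```

Only the escalated cases ever reach the clinician; routine readings are logged automatically.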
In healthcare, as with energy usage, it
turns out that feedback loops are an
incredibly powerful way to change people’s behavior—giving them relevant
information about what’s happening
at a time when they can actually do
something about it.
Mobile phones give you both a computational platform for the interface
and feedback on the device itself. At the
same time, people have a huge affinity
for their phones, so compliance is inherently higher. You already have this thing
with you for primary reasons, so healthcare becomes a secondary use-case.
I understand that it has been an adventure to get some of this work approved by the Food and Drug Administration.
The regulatory landscape is evolving.
The way the FDA looks at diagnostic tools
on the power lines. It turns out if you zoom into that noise source, it tells you a lot about what's happening. So our approach was to listen to all the electrical interference on the power line, then use machine learning to classify and pattern-match to a specific device.
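The pattern-matching step can be illustrated with a minimal sketch. The device names and spectral-energy "fingerprints" below are invented for illustration; a real system would learn them from measured power-line noise:

```python
import math
import random

# Hypothetical sketch: each appliance leaves a characteristic electrical-noise
# fingerprint on the power line. A fingerprint here is a coarse spectral-energy
# vector; a new observation is matched to the closest known device
# (nearest-centroid classification). All values are invented for illustration.
SIGNATURES = {
    "CFL lamp":      [0.10, 0.90, 0.30, 0.05],
    "TV":            [0.70, 0.20, 0.60, 0.40],
    "phone charger": [0.05, 0.10, 0.20, 0.95],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(observation):
    """Return the known device whose noise signature is closest."""
    return min(SIGNATURES, key=lambda d: distance(SIGNATURES[d], observation))

# A slightly noisy observation of the CFL lamp's fingerprint.
random.seed(0)
obs = [v + random.uniform(-0.02, 0.02) for v in SIGNATURES["CFL lamp"]]
print(identify(obs))  # → CFL lamp
```

A real deployment would extract these features from high-frequency sampling of line voltage and use a trained classifier rather than a hand-built lookup.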
As it turns out, that’s analogous to
the water domain. When you flush the
toilet or use the shower, you disrupt the
water flow the moment you open and
close that valve. And if you have a pressure sensor at any location, you can see
a pressure wave that’s indicative of the
kind of valve that you just closed.
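The water-domain analogy works the same way: detect a pressure transient, then classify it by its shape. A toy sketch, with thresholds and fixture mappings that are assumed for illustration rather than calibrated:

```python
# Hypothetical sketch: a single pressure sensor sees a transient whenever a
# valve opens or closes, and the magnitude of that pressure wave hints at
# which fixture it was. All thresholds are illustrative, not calibrated.

def detect_events(trace, threshold=5.0):
    """Flag samples where pressure changes faster than `threshold` per step."""
    return [i for i in range(1, len(trace))
            if abs(trace[i] - trace[i - 1]) > threshold]

def classify_event(drop):
    """Map pressure-drop magnitude to a fixture (assumed toy values)."""
    if drop > 15:
        return "toilet"
    if drop > 8:
        return "shower"
    return "faucet"

# Steady 50 psi, then a flush valve opens: a sharp 20 psi transient,
# followed by a gradual recovery as the tank refills.
trace = [50, 50, 50, 30, 32, 33, 36, 40, 45, 48, 50]
events = detect_events(trace)
print([classify_event(abs(trace[i] - trace[i - 1])) for i in events])  # → ['toilet']
```

The appeal of the approach is that one sensor at any tap sees events from the whole house, which is what makes it practical compared to instrumenting every fixture.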
Given that most of us can’t yet afford
to live in “the home of the future” or
put sensors onto all our fixtures and
appliances, it’s a refreshingly practical approach.
Sometimes, you have a scientific
question where you are trying to address an algorithmic problem or find
a more efficient way to do things, but
at the same time you are also trying
to think about how to apply it. If you
come from a purely applied standpoint and you are solving an interesting problem, a lot of the scientific contributions follow, because you are now
discovering new use-cases that you
may not have discovered otherwise.
More recently, you have been working in healthcare, using commodity devices like mobile phones to do longitudinal and physiological sensing.
We have looked at using the microphones to help people monitor respiratory ailments—so instead of using
a dedicated device like a spirometer,
say, you use machine learning and
audio processing on the microphone
to detect if something’s happening
in your respiratory system. We have
also used the camera and flash to do
non-invasive blood screening. You
might take a picture of a baby to figure out how much bilirubin is in the
blood and whether jaundice is a concern. You can’t do a blood draw every
single day, so having a non-invasive
screening tool can be a really effective way to tell you when you should
get to the next level of screening or
diagnostics, and then you can intervene much sooner.
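The screening logic can be sketched abstractly: map a color feature from the photo to a bilirubin estimate, then decide whether to escalate. The yellowness index, coefficients, and threshold below are all invented for illustration and are not clinical values:

```python
# Hypothetical sketch of camera-based jaundice screening: estimate bilirubin
# from a "yellowness" index of a skin patch in a photo, then decide whether
# to escalate to a blood draw. Coefficients and threshold are invented for
# illustration, not clinically validated values.

def yellowness(rgb):
    """Crude yellowness index: strong red+green relative to blue."""
    r, g, b = rgb
    return (r + g) / 2 - b

def estimate_bilirubin(rgb, slope=0.08, intercept=1.0):
    """Toy linear model from yellowness to mg/dL (assumed coefficients)."""
    return intercept + slope * yellowness(rgb)

def needs_followup(rgb, threshold_mg_dl=12.0):
    """Screening rule: escalate to a blood draw above the threshold."""
    return estimate_bilirubin(rgb) >= threshold_mg_dl

print(needs_followup((210, 190, 60)))   # noticeably yellow skin patch → True
print(needs_followup((200, 180, 160)))  # typical skin tone → False
```

The point of the sketch is the screening pattern itself: a cheap, non-invasive estimate that only triggers the invasive test when warranted.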