off-the-shelf devices (e.g., MUSE) only capture the brain
signal to tell how well users are meditating. Unlike them,
LIBS additionally examines the eye and muscle signals to
analyze the users' level of relaxation more accurately. As
a result, LIBS can help users improve their meditation performance.
Eating habit monitoring. Eating habits can provide critical evidence for various diseases.8 As LIBS captures the
muscle signal very clearly, this information can be used
to infer how often, how fast, and how intensely users
chew. From that, LIBS can predict what foods they
are eating as well as how much they are eating. As a result,
LIBS can guide users to correct bad habits on their own
or to visit a doctor if necessary.
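As a rough illustration of how chewing events might be inferred from the muscle (EMG) channel, the sketch below counts activity bursts in a rectified, smoothed signal. The sampling rate, smoothing window, and threshold here are illustrative assumptions, not parameters from our system.

```python
import numpy as np

def count_chews(emg, fs=100, smooth_win=0.2, thresh_ratio=3.0):
    """Count chewing bursts in an EMG trace.

    Rectify the signal, smooth it with a moving average to get an
    envelope, and count upward crossings of a threshold set relative
    to the baseline envelope level.
    """
    rectified = np.abs(emg - np.mean(emg))
    n = max(1, int(smooth_win * fs))
    envelope = np.convolve(rectified, np.ones(n) / n, mode="same")
    thresh = thresh_ratio * np.median(envelope)
    above = envelope > thresh
    # One chew = one rising edge of the above-threshold mask.
    rising = np.flatnonzero(~above[:-1] & above[1:])
    return len(rising)

# Synthetic example: 10 s of low-level noise with 8 chewing bursts.
rng = np.random.default_rng(0)
fs = 100
t = np.arange(10 * fs)
emg = 0.05 * rng.standard_normal(t.size)
for k in range(8):
    start = k * fs + fs // 4  # one burst per second
    emg[start:start + fs // 4] += np.sin(2 * np.pi * 10 * t[:fs // 4] / fs)
print(count_chews(emg, fs=fs))  # → 8
```

The chewing rate then follows directly from the burst count divided by the recording duration; intensity could be estimated from the envelope amplitude within each burst.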
7.2. Non-health applications
LIBS can also benefit applications and systems in other domains,
such as improving hearing aid devices, improving driver
safety, and helping parents assess their child's interests early on.
Autonomous audio steering. This application addresses
a classical problem in hearing aids known as the cocktail
party problem. State-of-the-art hearing aid devices
amplify the sounds coming from the area with the largest
amplitude, which is assumed to be a human voice. Consequently, the hearing aid fails to support wearers when a group of people behind them talks very loudly,
even though they are not the people the wearers want to talk to. Using
the eye signal LIBS captures, our technology can potentially detect the area to which users are paying attention. Furthermore, by combining it with the brain signal, LIBS
can predict how pleased the wearers are with the output sound their hearing aid is producing. With that in
mind, LIBS can steer the hearing aid and improve its amplification quality so that it delivers high-quality sound coming from the right source to the users.
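One simple way a gaze estimate could drive such steering is a delay-and-sum beamformer pointed at the gaze azimuth. The two-microphone geometry, spacing, and sample rate below are illustrative assumptions, not LIBS internals.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def steer_delay(azimuth_deg, mic_spacing=0.15):
    """Inter-microphone delay (s) for a plane wave from `azimuth_deg`.

    0 deg is straight ahead (no delay); positive angles are to the
    right, where the sound reaches the right microphone first.
    """
    return mic_spacing * np.sin(np.radians(azimuth_deg)) / SPEED_OF_SOUND

def delay_and_sum(left, right, azimuth_deg, fs=16000, mic_spacing=0.15):
    """Steer a two-microphone array toward the given azimuth.

    Delays the early (right) channel by the steering delay, rounded
    to whole samples, then averages the channels so sound from that
    direction adds coherently.
    """
    shift = int(round(steer_delay(azimuth_deg, mic_spacing) * fs))
    return 0.5 * (left + np.roll(right, shift))

# A 2 kHz tone arriving from 30 deg right: the right mic hears it early.
fs = 16000
t = np.arange(fs) / fs
lead = steer_delay(30.0)
left_mic = np.sin(2 * np.pi * 2000 * t)
right_mic = np.sin(2 * np.pi * 2000 * (t + lead))
steered = delay_and_sum(left_mic, right_mic, azimuth_deg=30.0, fs=fs)
unsteered = 0.5 * (left_mic + right_mic)
```

Summing the unaligned channels partially cancels the off-axis tone, while steering toward the gaze direction restores it, which is the effect a gaze-aware hearing aid would exploit.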
Distraction and drowsiness detection. Distraction and
drowsiness are serious risk factors in driving. Specifically,
when people feel drowsy, their brain signal enters the alpha
state, their eyes close, and their chin muscle tone
becomes relaxed.20 Also, distraction can be detected by localizing eye positions through changes in the eye signal. Hence, LIBS, with its three
separated brain, eye, and muscle signals, should be able to
determine the driver's drowsiness level or distraction and send an alert to help avoid road accidents.
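A minimal sketch of the alpha-state cue described above: estimate the fraction of EEG power in the 8–12 Hz alpha band with an FFT and flag the segment when that fraction crosses a threshold. The sampling rate and threshold are assumptions for illustration, not values from our system.

```python
import numpy as np

def alpha_ratio(eeg, fs=256):
    """Fraction of EEG power falling in the 8-12 Hz alpha band."""
    spectrum = np.abs(np.fft.rfft(eeg - np.mean(eeg))) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    total = spectrum.sum()
    return spectrum[band].sum() / total if total > 0 else 0.0

def is_drowsy(eeg, fs=256, threshold=0.5):
    """Flag a segment as drowsy when alpha dominates the spectrum."""
    return alpha_ratio(eeg, fs) > threshold

# Synthetic 2 s segments: a 10 Hz (alpha) wave vs. a 20 Hz (beta) wave.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(1)
drowsy_eeg = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
alert_eeg = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
print(is_drowsy(drowsy_eeg), is_drowsy(alert_eeg))  # → True False
```

A full detector would fuse this cue with the eye channel (blink/closure detection) and the chin EMG tone before raising an alert.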
every 30-sec segment. For all studies, the sleeping environment was set up to be quiet, dark, and cool.
Statistically, we extracted features from 4313 30-sec
segments using the original mixed signal as well as the three
separated signals. Training and test datasets were randomly
selected from the same subject pool. Figure 11 displays the
results of the sleep stage classification in comparison with the
hypnogram of the test data scored by the gold-standard
PSG. From this, we observe that the dynamics of the hypnogram are almost completely preserved in the predicted
scores. Moreover, our results show that the end-to-end sleep
staging system achieves 95% accuracy on average.
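To illustrate the general shape of such a segment-wise staging pipeline (not our actual feature set or classifier), the sketch below computes relative band powers per 30-sec segment and assigns stages with a toy nearest-centroid classifier; the bands, sampling rate, and stage labels used here are standard but assumed.

```python
import numpy as np

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_powers(segment, fs=100):
    """Relative power per EEG band for one 30-sec segment."""
    spectrum = np.abs(np.fft.rfft(segment - np.mean(segment))) ** 2
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    total = spectrum.sum() + 1e-12
    return np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() / total
                     for lo, hi in BANDS.values()])

class NearestCentroid:
    """Toy stage classifier: assign each segment to the closest class mean."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.labels_[np.argmin(d, axis=1)]

# Synthetic training data: deep sleep = delta-dominated, REM-like = theta.
fs, seg_len = 100, 30 * 100
t = np.arange(seg_len) / fs
rng = np.random.default_rng(2)

def make_segment(freq):
    return np.sin(2 * np.pi * freq * t) + 0.2 * rng.standard_normal(seg_len)

X = np.array([band_powers(make_segment(f), fs) for f in [2, 2, 2, 6, 6, 6]])
y = np.array(["N3", "N3", "N3", "REM", "REM", "REM"])
clf = NearestCentroid().fit(X, y)
test_feats = band_powers(make_segment(2), fs).reshape(1, -1)
print(clf.predict(test_feats))  # → ['N3']
```

Stringing the predicted labels over consecutive segments yields a hypnogram like the one in Figure 11.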
We refer readers to Nguyen et al.13 for more
detailed validation of signal acquisition and separation,
their comparison with the signals recorded by the gold-standard device, and our user study.
7. THE POTENTIAL OF LIBS
We envision LIBS to be an enabling platform for not only
healthcare applications but also those from other domains.
Figure 12 illustrates eight potential applications: in-home sleep monitoring, autism onset detection,
meditation training, eating habit monitoring, autonomous
audio steering, distraction and drowsiness detection, child's
interest assessment, and human-computer interaction. We
discuss these exemplary applications below.
7.1. Healthcare applications
We propose three applications that LIBS can be extended to
serve in healthcare: autism onset detection, meditation
training, and eating habit monitoring.
Autism onset detection. Thanks to its ability to capture
muscle tension, eye movements, and brain activity, LIBS
has the potential to become a wearable for detecting and
predicting autism act-out onsets. In particular, people with autism can have very
sensitive sensory (e.g., visual, auditory, and tactile) functions.
When any of these sensory functions becomes overloaded, their
brain signal, facial muscle activity, and eye movements are expected to
change significantly.5 We hope to exploit this phenomenon
to find the relationship between these three signals and the
onset event, from which a prediction model can be developed.
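One plausible starting point, shown as a sketch below, is multichannel change detection: summarize each of the three channels by one feature per window, then flag the first window whose combined deviation from a personal baseline exceeds a threshold. The window layout, baseline length, and threshold are assumptions for illustration.

```python
import numpy as np

def onset_score(window, baseline_mean, baseline_std):
    """Combined deviation of brain/eye/muscle features from baseline.

    `window` holds one feature per channel; the score is the mean
    absolute z-score across the three channels.
    """
    z = (window - baseline_mean) / baseline_std
    return np.mean(np.abs(z))

def detect_onset(features, baseline_len=20, threshold=3.0):
    """Flag the first window whose combined z-score crosses `threshold`.

    `features` is (n_windows, 3): one feature each for the brain, eye,
    and muscle channels. Returns the window index, or -1 if none.
    """
    base = features[:baseline_len]
    mu, sigma = base.mean(axis=0), base.std(axis=0) + 1e-9
    for i in range(baseline_len, len(features)):
        if onset_score(features[i], mu, sigma) > threshold:
            return i
    return -1

# Synthetic run: calm baseline, then all three channels shift at window 30.
rng = np.random.default_rng(3)
feats = rng.standard_normal((40, 3))
feats[30:] += 6.0
print(detect_onset(feats))  # → 30
```

A learned prediction model would replace the fixed threshold, but the same baseline-versus-deviation structure would likely remain.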
Meditation training. Meditation can improve
physical and mental well-being when done properly. Hence, it is necessary to understand people's mindfulness level during meditation in order to
provide more effective instruction. Existing commercial
Figure 11. A hypnogram of 30 min of data produced by our classification algorithm (proposed algorithm vs. ground truth; x-axis: segment sequence number).