device validates the response; since
the challenge now incorporates the
command bytes B, the read/write command is also authenticated. The device authenticates the server and the
incoming command before executing
the latter. These might be commands
that allow certain configuration bits
to be written into the device’s on-chip
memory, or certain data to be read. The
device can send back data in a similar
manner. The server can then authenticate both the source of the data (entity
authentication) and that the data from
the device hasn’t been modified in
transit (data authentication).
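The flow described above can be sketched in a few lines. As a simplification, a keyed MAC (HMAC) stands in for the device's PUF-derived response function, and all names are illustrative; a real PUF device derives its per-device secret from physical variation rather than from stored key material.

```python
import hashlib
import hmac
import os

def respond(device_key: bytes, challenge: bytes, command: bytes) -> bytes:
    # The response covers both the challenge and the command bytes B,
    # so a valid response authenticates the command as well as the server.
    return hmac.new(device_key, challenge + command, hashlib.sha256).digest()

def device_execute(device_key: bytes, challenge: bytes,
                   command: bytes, tag: bytes) -> bool:
    # The device recomputes the expected response and executes the command
    # only on a match (server and command both authenticated).
    expected = respond(device_key, challenge, command)
    return hmac.compare_digest(expected, tag)

key = os.urandom(32)          # stands in for the device-unique PUF secret
challenge = os.urandom(16)    # fresh per transaction
command = b"WRITE cfg_bit=1"  # the command bytes B

tag = respond(key, challenge, command)
assert device_execute(key, challenge, command, tag)
# An altered command invalidates the tag and is rejected:
assert not device_execute(key, challenge, b"WRITE cfg_bit=0", tag)
```

The same construction run in the other direction, with the device MACing the data it returns, gives the server its entity- and data-authentication check.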
Authorization. In many applications,
the verifier is a public employee who
obtains access to a private person’s database entry from a cloud server in order to perform a more comprehensive
authentication of an individual. The
sensitive information may include
security questions or other personal
information that can be verbally validated. A private person’s PUF-NFC ID
card can be used to limit database access by a public employee so that such
an employee cannot arbitrarily pilfer
sensitive private data. The employee
is authorized to obtain database access to certain sensitive and personal
information only when a particular
PUF-NFC ID card from a private person
is physically present and produces a
proper response to a server’s challenge.
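The gating step can be sketched as follows. The names are hypothetical, and HMAC again stands in for the physical function, whose challenge-response pairs the server would have enrolled in advance from the card.

```python
import hashlib
import hmac
import os

def enroll(puf, challenges):
    # Server-side CRP table built at enrollment: challenge -> expected response.
    return {c: puf(c) for c in challenges}

def authorize(crp_table, record, challenge, response):
    # Release the sensitive record only while the card is physically present
    # and answers the server's challenge correctly; otherwise deny access.
    expected = crp_table.get(challenge)
    if expected is not None and hmac.compare_digest(expected, response):
        return record
    return None

key = os.urandom(32)
puf = lambda c: hmac.new(key, c, hashlib.sha256).digest()  # stand-in for the PUF
challenges = [os.urandom(16) for _ in range(3)]
table = enroll(puf, challenges)

record = {"security_question": "name of first school"}
c = challenges[0]
assert authorize(table, record, c, puf(c)) is not None   # card present: access granted
assert authorize(table, record, c, os.urandom(32)) is None  # wrong card: denied
```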
In many of today's RFID use cases, it is also common to store certain information associated with a tagged product on the RFID device itself, which incurs a silicon area overhead and an increase in manufacturing cost associated with larger on-chip nonvolatile storage. Since a conventional RFID device does not offer dynamic authentication (it emits only a static public identifier), the locally stored information cannot be safely moved into the cloud: another RFID programmed with the same serial number would be associated with that data record in the cloud. For example, the data record can be the maintenance trail of an airplane part or the supply-chain provenance trail of a pharmaceutical product. If the RFID/NFC device, however, is used to offer authorization (for reading, or for both reading and writing) to access a particular database entry in the cloud, then the data that otherwise would be stored locally on the RFID device can be more safely moved into the cloud. This minimizes the need for large storage local to the RFID/NFC device. An individual on the ground is authorized to access the cloud data record only when the PUF-NFC device is physically present. This assumes that reader devices are cloud-connected, which is increasingly the trend.
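The difference between the two access models can be made concrete. In the hypothetical sketch below, a static serial-number lookup grants a cloned tag access to the genuine cloud record, while a challenge-response check (HMAC standing in for the PUF) rejects it:

```python
import hashlib
import hmac
import os

# Cloud-side record keyed by the tag's serial number (illustrative data).
db = {b"TAG-00042": "maintenance trail: part inspected 2016-04-01"}

genuine_key = os.urandom(32)  # device-unique secret a PUF embodies physically
clone_key = os.urandom(32)    # a clone cannot reproduce the physical function

respond = lambda key, c: hmac.new(key, c, hashlib.sha256).digest()

def lookup_static(serial):
    # Conventional RFID: whoever presents the serial number gets the record.
    return db.get(serial)

def lookup_dynamic(serial, challenge, response):
    # PUF-backed access: the record is released only after a fresh
    # challenge-response check, so a reprogrammed clone fails.
    expected = respond(genuine_key, challenge)  # server-side expected response
    if hmac.compare_digest(expected, response):
        return db.get(serial)
    return None

c = os.urandom(16)
assert lookup_static(b"TAG-00042") is not None  # clone passes the static check
assert lookup_dynamic(b"TAG-00042", c, respond(genuine_key, c)) is not None
assert lookup_dynamic(b"TAG-00042", c, respond(clone_key, c)) is None
```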
Also, in the age of big data and cloud computing, data is worth more when it is aggregated in the cloud than when it is stored separately in each tag. In the former case, analytics can be performed to uncover unauthorized activities or to gather other forms of business intelligence. The PUF serves
to bind the tag to a particular database
record in the cloud by providing access
authorization in a manner that a static
public identifier cannot.
For more than a decade, silicon PUFs have attracted enormous interest in applications ranging from product authentication to secure processors. There have been commercial deployments and complete integration with authentication servers and consumer-grade, off-the-shelf smartphones to give the power of authentication to the ordinary person.
People know much more about
PUFs and how to use them, including
vulnerabilities and countermeasures,
than they did a few years ago. As the
PUF field becomes well established,
more attacks and countermeasures
are expected to be published to further
vet the security properties of PUFs.
Such a cycle has also been seen in the
cryptographic world—for example, the
AES-ECB algorithm was subject to the ECB penguin problem,17 and the plain RSA algorithm is subject to existential forgery,2 both of which can be addressed by using the fundamental primitives in a more secure construction.
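The RSA case is easy to reproduce: with textbook (unpadded) RSA, any value s verifies as a valid signature on m = s^e mod n, even though the signer never signed m. A toy sketch with tiny illustrative parameters (real deployments use large moduli and a padding scheme such as RSASSA-PSS):

```python
# Textbook RSA with toy parameters (illustrative only).
p, q = 61, 53
n = p * q                          # modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (Python 3.8+)

def sign(m):
    return pow(m, d, n)

def verify(m, s):
    return pow(s, e, n) == m

# Legitimate use: a signed message verifies.
assert verify(42, sign(42))

# Existential forgery: pick any s, then m = s^e mod n verifies,
# although the signer never produced this signature.
s_forged = 7
m_forged = pow(s_forged, e, n)
assert verify(m_forged, s_forged)
```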
1. Becker, G. The gap between promise and reality: On
the insecurity of XOR arbiter PUFs. International
Workshop on Cryptographic Hardware and Embedded
Systems (2015), 535–555.
2. Boneh, D., Joux, A. and Nguyen, P. Why textbook
elgamal and RSA encryption are insecure. Advances in
Cryptology (2000), 30–43.
3. Counterfeiting and piracy: stamping it out. The
Economist. April 23, 2016.
4. Delvaux, J., Peeters, R., Gu, D. and Verbauwhede, I.
A survey on entity authentication with strong PUFs.
ACM Computing Surveys 48, 2 (2015), 26:1–26:42.
5. Ganji, F., Tajik, S. and Seifert, J.-P. Why attackers win:
on the learnability of XOR arbiter PUFs. International
Conference on Trust and Trustworthy Computing
6. Gassend, B., Clarke, D., van Dijk, M. and Devadas, S.
Silicon physical random functions. ACM Conference on
Computer and Communication Security (2002).
7. Lim, D. Extracting secret keys from integrated circuits.
Master’s thesis, MIT, 2004.
8. Majzoobi, M., Rostami, M., Koushanfar, F., Wallach, D. and Devadas, S. SlenderPUF: A lightweight, robust and secure strong PUF by substring matching. IEEE International Workshop on Trustworthy Embedded Devices (2012).
9. Quadir, S. E., Chen, J., Forte, D., Asadizanjani, N.,
Shahbazmohamadi, S., Wang, L., Chandy, J. and
Tehranipoor, M. A survey on chip-to-system reverse
engineering. ACM Journal on Emerging Technologies
in Computing Systems 13, 1 (2016).
10. Quinn, G. and Grother, P. IREX III: Supplement I:
Failure Analysis. NIST Interagency Report 7853 (2012).
11. Rührmair, U., Sehnke, F., Sölter, J., Dror, G., Devadas,
S. and Schmidhuber, J. Modeling attacks on physical
unclonable functions. ACM Conference on Computer
and Communication Security (2010).
12. Rührmair, U., Sölter, J., Sehnke, F., Xu, X., Mahmoud, A.,
Stoyanova, V., Dror, G., Schmidhuber, J., Burleson, W.
and Devadas, S. PUF modeling attacks on simulated
and silicon data. IEEE Transactions on Information
Forensics and Security 8, 11 (2013), 1876–1891.
13. Schneier, B. Sensible authentication. ACM Queue 1, 10 (2004).
14. Suh, G.E. and Devadas, S. Physical unclonable functions
for device authentication and secret key generation.
Design Automation Conference (2007), 9–14.
15. Suh, G.E. AEGIS: A single-chip secure processor. Ph.D.
thesis. Electrical Engineering and Computer Science
Dept., MIT, 2005.
16. Valiant, L. A theory of the learnable. Commun. ACM
27, 11 (1984), 1134–1142.
17. Valsorda, F. The ECB penguin, 2013; https://blog.
18. Wilson, C., Hicklin, R., Bone, M., Korves, H., Grother,
P., Ulery, B., Micheals, R., Zoepfl, M., Otto, S. and
Watson, C. Fingerprint vendor technology evaluation
2003: summary of results and analysis report. NIST
Internal Report 7123 (2004).
19. Xilinx Inc. Xilinx addresses rigorous security demands
at 5th Annual Working Group for Broad Range of
Applications, 2016; http://www.prnewswire.com/
20. Yu, M., Hiller, M., Delvaux, J., Sowell, R., Devadas,
S. and Verbauwhede, I. A lockdown technique to
prevent machine learning on PUFs for lightweight
authentication. IEEE Transactions on Multi-Scale
Computing Systems 2, 3 (2016), 146–159.
21. Yu, M., M’Raïhi, D., Verbauwhede, I. and Devadas, S.
A noise bifurcation architecture for linear additive
physical functions. IEEE International Symposium
on Hardware Oriented Security and Trust (2014).
Meng-Day (Mandel) Yu is the chief scientist at Verayo
Inc., a research affiliate for CSAIL/MIT, and is pursuing
a Ph.D. based on a research career with COSIC/KU
Leuven. He was manager of R&D engineering at TSI and
developed a secure digital baseband radio.
Srinivas Devadas is the Webster professor of electrical
engineering and computer science at MIT, where he has
been since 1988. He served as associate head of EECS
from 2005 to 2011. He is a Fellow of the ACM and IEEE.
Copyright held by owner(s)/authors.
Publication rights licensed to ACM. $15.00.