The Dangers of Automating Social Programs

Is it possible to keep bias out of a social program driven by one or more algorithms?

Society | DOI: 10.1145/3264627  Esther Shein
ASK POVERTY ATTORNEY Joanna Green Brown for an example of a client who fell through the cracks and lost social services benefits they may have been eligible for because of a program driven by artificial intelligence (AI), and you will get an earful.

There was the “highly educated and capable” client who had had heart failure and was on a heart and lung transplant wait list. The questions he was presented in a Social Security benefits application “didn’t encapsulate his issue,” and his child subsequently did not receive benefits.
“It’s almost impossible for an AI system to anticipate issues related to the nuance of timing,” Green Brown says.

Then there’s the client who had to apply for a Medicaid recertification, but misread a question and received a denial a month later. “Suddenly, Medicaid has ended and you’re not getting oxygen delivered. This happens to old people frequently,” she says.
Another client died of cancer that Green Brown says was preventable, but the woman did not know social service programs existed, did not have an education, and did not speak English. “I can’t say it was AI-related,” she notes, “but she didn’t use a computer, so how is she going to get access to services?”

Such cautionary tales illustrate what can happen when systems become automated, the human element is removed, and a person in need lacks a support system to help them navigate the murky waters of applying for government assistance programs like Social Security and Medicaid.
So many factors go into an application or appeals process for social services that many people just give up, Green Brown says. They can also lose benefits when a line of questioning ends in the system without telling their whole story. “The art of actual conversation is what teases out information,” she says. A human can tell something isn’t right simply by observing a person for a few minutes: determining why they are uncomfortable, for example, and whether it is because of a hearing problem, or a cognitive or psychological issue.
“The stakes are high when it comes to trying to save time and money versus trying to understand a person’s unique circumstances,” Green Brown says. “Data is great at understanding who the outliers are; it can show fraud and show a person isn’t necessarily getting all benefits they need, but it doesn’t necessarily mean it’s correct information, and it’s not always indicative of eligibility of benefits.”
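Her point about outliers can be made concrete with a small sketch. The example below is hypothetical: the field name, threshold, and data are all invented, and it is not based on any real benefits system. It flags statistically unusual applications with a simple z-score and routes them to a person for review rather than treating the flag as a determination.

from statistics import mean, stdev

def flag_outliers(records, field, z_threshold=3.0):
    """Return (record, z-score) pairs whose field value sits far from the mean.
    A flag is a prompt for human review, not an eligibility determination."""
    values = [r[field] for r in records]
    mu, sigma = mean(values), stdev(values)
    flagged = []
    for r in records:
        z = (r[field] - mu) / sigma if sigma else 0.0
        if abs(z) > z_threshold:
            flagged.append((r, z))
    return flagged

# Invented sample data: 30 clustered incomes plus one anomaly.
records = [{"id": i, "reported_income": 30_000 + 500 * i} for i in range(30)]
records.append({"id": 99, "reported_income": 250_000})

for record, z in flag_outliers(records, "reported_income"):
    # Route to a caseworker instead of auto-denying: the anomaly may reflect
    # a misread question or unusual circumstances, not fraud or ineligibility.
    print(f"flag case {record['id']} for review (z = {z:.1f})")

The design choice mirrors Green Brown’s warning: the statistic identifies whom to look at, while a human decides what it means.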
There are well-documented examples of bias in automated systems used to provide guidelines in sentencing criminals, predicting the likelihood of someone committing a future crime, setting credit scores, and in facial recognition systems. As automated systems relying on AI and machine learning become more prevalent, the trick, of course, is finding a way to ensure they are neutral in their decision-making. Experts have mixed views on whether they can be.

AI-based technologies can undoubtedly play a positive role in helping human services agencies cut costs, significantly reduce labor, and deliver faster and better services. Yet taking the human element out of the equation can be dangerous, acknowledges the 2017 Deloitte report “AI-augmented human services: Using cognitive technologies to transform program delivery.”

“AI can augment the work of caseworkers by automating paperwork, while machine learning can help caseworkers know which cases need urgent attention. But ultimately, humans are the users of AI systems, and these sys-
[Infographic from deloitte.com/insights: Applications of Robotic Process Automation (RPA) and cognitive technologies across the life cycle of a human services case. AI technologies shown: chatbot, RPA, and machine learning.]

1. Client intake (client, via chatbot):
• Scheduling appointments for human services programs
• Addressing queries
• Auto-filling application forms

2. Case planning (caseworker):
• Automating application screening
• Automating verification
• Predicting high-risk cases
• Automating eligibility determinations

3. Service delivery and case management (client, caseworker, service provider):
• Fraud detection
• Prioritizing resources/inspections
• Predicting/preventing delinquency
• Remote diagnosis
• Service provider recommendations
• Addressing queries
• Automating client follow-up and documentation
• Automating redetermination of eligibility
• Personalizing service delivery