RESULTS AND DISCUSSION OF ASSESSMENT
Most of the cost in the assessment structure comes in the early stages of constructing learning objective mappings. A mapping of knowledge topics to courses and student learning outcomes needs to be created. Appropriate quiz questions need
to be written, vetted, and entered into a CMS quiz database. A
database system for uploading course grades, uploading quiz
results, and automatically generating standard reports has to
be deployed. This level of effort is consistent with the efforts
associated with the construction of program self-studies for accreditation visits.
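The topic-to-course-to-outcome mapping described above might be represented as follows. This is a minimal illustrative sketch, not the paper's actual schema; the class, course, and outcome names are hypothetical.

```python
# Hypothetical sketch of a knowledge-topic record mapped to courses and
# ABET student learning outcomes. All names below are illustrative.
from dataclasses import dataclass, field

@dataclass
class KnowledgeTopic:
    name: str
    knowledge_unit: str                            # e.g. "Basic Analysis"
    knowledge_area: str                            # e.g. "Algorithms and Complexity"
    courses: list = field(default_factory=list)    # courses that cover this topic
    outcomes: list = field(default_factory=list)   # mapped student learning outcomes

# One example mapping entry (course number and outcome label are made up).
topic = KnowledgeTopic(
    name="Asymptotic analysis",
    knowledge_unit="Basic Analysis",
    knowledge_area="Algorithms and Complexity",
    courses=["CS 3100"],
    outcomes=["(b) analyze a problem"],
)
```

In practice these records would live in the assessment database, with quiz questions keyed to topics so that results can later be rolled up by course or outcome.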
Once the initial cost of infrastructure development and deployment is complete, however, the term-to-term assessment
process takes minimal overhead. Instructors merely need to instruct students to take the quiz. Administrative staff can ensure
that the quizzes are available, upload the quiz results into
the assessment database, and provide the automated reports to
the appropriate faculty and curriculum committees for review.
Ongoing maintenance consists of keeping the mapping between knowledge topics, student learning outcomes, and courses up to date. Some alterations to quizzes may be needed
to reflect these changes.
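The per-term pipeline described above (collect quiz results, then generate a standard report) can be sketched as follows. The input format and function name are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative sketch of the per-term step: ingest quiz results exported
# from the CMS and emit a per-question summary. The (question_id, correct)
# tuple format is an assumption.
from collections import defaultdict

def summarize(results):
    """results: iterable of (question_id, correct) pairs from a CMS export."""
    totals = defaultdict(lambda: [0, 0])   # question_id -> [correct, attempts]
    for qid, correct in results:
        totals[qid][1] += 1
        if correct:
            totals[qid][0] += 1
    # Percent of students answering each question correctly.
    return {qid: 100.0 * c / n for qid, (c, n) in totals.items()}

report = summarize([("Q1", True), ("Q1", False), ("Q2", True), ("Q2", True)])
```

A standard report of this shape is what administrative staff would hand to the curriculum committee each term.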
The assessment structure presented in this paper provides a
way to take continuous, periodic, direct measurements of retained relevant knowledge throughout a computer science or
computer engineering curriculum. The assessment gives immediate, valuable feedback to students, faculty, and program
reviewers. The online features that allow the assessment
quizzes to be given outside of class, together with the regular generation
of automated reports, minimize the burden of completing the assessment.
This work was supported in part by a Wright State University Teaching Innovation Award
and by the AAC&U Teaching to Increase Diversity and Equity in STEM (TIDES) program.
Kathleen Timmerman Travis Doom
Department of Computer Science and Engineering
Wright State University
3600 Colonel Glen Hwy, Dayton, OH USA
© 2017 ACM. DOI: http://dx.doi.org/10.1145/3017680.3017738
can be broken down to look at knowledge areas, knowledge
units, knowledge topics, or individual questions. This report is
reviewed by the curriculum committee each term to determine
if further action should be taken.
ACROSS PROGRAM REPORT
Each term, an automated report is generated to assess the entire program (See Figure 5). Each quiz question, regardless of
course, is used to determine students' ability to complete
each of the ABET student learning outcomes. This allows
strengths and weaknesses across the entire program to be
examined. Within a single course report, a weakness in a specific
ABET student learning outcome may be masked by strengths
in other questions. This report is also reviewed by the curriculum committee each term. Like the Every Course Report
(See Section 5.4.1), this report can then be broken down further: the student learning outcomes can be broken down by
courses, knowledge areas, knowledge units, knowledge topics,
or individual questions. This aids in locating possible problems in the program.
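The drill-down described above (program-wide rates per outcome, then a breakdown of a weak outcome by course, knowledge area, and so on) can be sketched as follows. The record fields and example data are hypothetical, not drawn from the paper's database.

```python
# Hedged sketch of the drill-down: aggregate question-level results by
# ABET outcome, then break one outcome down by course. Field names
# ("outcome", "course", "topic", "correct") are assumptions.
from collections import defaultdict

def pass_rate_by(records, key):
    """Return percent-correct grouped by the given record field."""
    buckets = defaultdict(lambda: [0, 0])   # group -> [correct, attempts]
    for r in records:
        buckets[r[key]][1] += 1
        if r["correct"]:
            buckets[r[key]][0] += 1
    return {k: 100.0 * c / n for k, (c, n) in buckets.items()}

records = [
    {"outcome": "(b)", "course": "CS 1180", "topic": "recursion", "correct": True},
    {"outcome": "(b)", "course": "CS 3100", "topic": "graphs",    "correct": False},
    {"outcome": "(c)", "course": "CS 1180", "topic": "design",    "correct": True},
]
by_outcome = pass_rate_by(records, "outcome")                     # program-wide view
weak = [r for r in records if r["outcome"] == "(b)"]
by_course = pass_rate_by(weak, "course")                          # drill into one outcome
```

The same grouping function can be reused with "topic" or a knowledge-area field to reach the finer-grained views the report supports.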
Figure 4: A basic course report
Figure 5: A portion of the ABET Student Learning Outcome summary