UB - University at Buffalo, The State University of New York Chemical and Biological Engineering

Process for Reviewing Achievement of Program Outcomes

Assessment and evaluation of our PO are represented by Loop III of our continuous improvement process. During the summer of 2006 we significantly revised a series of web-based tools that enable us to manage course information and to collect and analyze program assessment data in an efficient and integrated manner, ultimately providing a vehicle for continuous improvement of our program. Faculty began using these assessment tools during the Fall 2006 semester. In what follows, we first describe the web-based tools that have been developed, then describe the procedure used to complete assessments of student performance for a particular assignment (e.g. an exam or homework problem, project, laboratory report, presentation, co-op, etc.), and finally illustrate how these assessments are collectively used to evaluate the extent to which students attain each of the ABET Outcomes.

Web-based tools

We now describe several of the web-based tools that have been developed. Although not required, perhaps the best way for a reader of this document to become familiar with these tools is through interactive use at our website. Many of the necessary links are housed within a password-protected section of our undergraduate improvement website. Please contact Jeffrey Errington or David Kofke to request access to this site. After arriving at the website indicated above, select the Access CBE protected content (requires an account) link and enter a suitable username and password. The links referenced below are accessed from the page that follows.

The following tools have been developed:

  • Topic tree management. The topic tree plays a key role in the specification of our PO. At the outset of the project the tree was populated largely through faculty input, following the PO established prior to our 2002 ABET review. Various resources, including textbooks and web documents, were also used to generate the initial data. The topic tree is considered a “living” entity – as the profession evolves, so will the tree. Any faculty member can propose changes to the topic tree; these requests are sent to the Director of Undergraduate Studies for approval. (We note that the tree itself does not constitute the PO; instead, it defines part of an outcomes space on which we define the PO.) The topic tree can be explored from the View topic tree link on the main protected page, and proposals for modification can be submitted using the tool provided by the Request changes to the topic tree (restricted) link.
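To make the structure concrete, a minimal sketch of how such a hierarchical topic tree might be represented follows. The branch names are hypothetical illustrations, not the actual CBE tree:

```python
# A sketch of a hierarchical topic tree; each branch maps to sub-branches.
# The topics shown here are invented for illustration only.
topic_tree = {
    "Thermodynamics": {
        "First Law": {"Open Systems": {}, "Closed Systems": {}},
        "Phase Equilibria": {},
    },
    "Transport": {
        "Fluid Mechanics": {},
        "Heat Transfer": {},
    },
}

def find_path(tree, topic, path=()):
    """Return the tuple of branch names leading to `topic`, or None if absent."""
    for name, subtree in tree.items():
        new_path = path + (name,)
        if name == topic:
            return new_path
        found = find_path(subtree, topic, new_path)
        if found:
            return found
    return None
```

In this sketch a rubric step's topic label is simply a path into the tree (e.g. `find_path(topic_tree, "Open Systems")` yields the branch `Thermodynamics → First Law → Open Systems`), which is what lets assessments tagged anywhere in the curriculum be grouped by topic later.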
  • Course Management. This tool enables one to view and/or modify information related to a course. Course pages provide a means to examine how CE Program Outcomes are distributed amongst the various courses and serve as a resource for faculty to become familiar with the structure of courses in the curriculum with which they are not directly involved (e.g. courses that serve as prerequisites to one that they teach). Modification of course information follows an approach similar to that used for the topic tree – faculty members propose changes to the course pages, which are forwarded to the Director of Undergraduate Studies for approval. Course pages can be viewed by following the View course information link and modified with the tool provided by the Edit course information (restricted) link. The course page for our Transport Processes I (CE 317) course provides a good example.
  • Assessment tool. We have developed a means to electronically submit data related to student performance for a particular assignment (e.g. exam or homework problem, project, laboratory report, presentation, co-op, etc.). The structure of these individual assessments will be described in detail below. Organizing assessment data electronically within a database enables us to gather and analyze the information in a manner that would otherwise be difficult.
  • ABET notebooks. These provide a comprehensive view of the course and assessment data related to a given ABET Outcome (e.g. (e) an ability to identify, formulate, and solve engineering problems), and are the primary resource used by faculty to evaluate the extent to which students achieve a specified ABET Outcome. These will be discussed further below – for a preview follow the ABET Notebooks (restricted) link and select one of the notebooks.

Assessment procedure

The objective of our assessment procedure is to acquire information that will enable us to evaluate the extent to which students are proficient in the PO at graduation. Our process involves the collection of direct and indirect assessment data that are initially stored within a database and later analyzed to gauge student performance across the program with respect to a given PO. The following items are used within our assessment practices.


Every semester, instructors of required CE courses are each asked to select three benchmark “problems” with which to assess student performance. These problems encompass traditional homework and exam problems as well as individual or team projects, presentations, and laboratory reports. For each problem the instructor completes the following tasks:

  1. Provides a solution, including mathematical and conceptual operations
  2. Breaks the problem down into key “rubric” steps, and labels each with a PO, consisting of an ABET Outcome, topic, and Bloom level combination. As described above, the topic points to a branch of the topic tree and the Bloom level corresponds to a cognitive learning level as established by Bloom’s taxonomy or a generic “attitude” label for skills that fall within Bloom’s affective learning domain.
  3. Randomly selects 5-10 examples of student work, and for each student provides coarse-grained 0-3 scores (0: highly deficient, 1: somewhat deficient, 2: somewhat proficient, 3: highly proficient) for each of the rubric steps.
  4. Submits information to our undergraduate improvement website, including the problem statement, solution, notes, rubric steps and scores, examples of student work, course grade information, and a summary of student performance.
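The record produced by these steps can be sketched as a small data structure. The field names and the scoring helper below are hypothetical, intended only to illustrate how a rubric step pairs a PO (ABET Outcome, topic, Bloom level) with coarse-grained 0-3 scores:

```python
from dataclasses import dataclass, field

@dataclass
class RubricStep:
    """One rubric step of a benchmark problem (illustrative layout)."""
    abet_outcome: str   # e.g. "e" (identify, formulate, solve engineering problems)
    topic: str          # branch of the topic tree
    bloom_level: str    # Bloom cognitive level, or "attitude" for affective skills
    scores: list = field(default_factory=list)  # 0-3 per sampled student; None = N/A

def step_average(step):
    """Average the coarse-grained 0-3 scores, skipping N/A (None) entries."""
    graded = [s for s in step.scores if s is not None]
    return sum(graded) / len(graded) if graded else None

# Five sampled students; one did not reach this step (N/A).
step = RubricStep("e", "Energy balances", "Apply", scores=[3, 2, None, 1, 3])
```

Aggregating such records across courses is what later allows a program-wide view of any one outcome.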

The schematic below provides an example of how one might complete Step 2 for a thermodynamics problem. The problem statement reads as follows:

Argon steadily enters an adiabatic compressor with an inlet area of 0.1 m² at 100 kPa, 20 °C, and 150 m/s and leaves at 800 kPa and 200 °C. Determine the power input to the compressor.

Five key rubric steps are identified and associated with an ABET Outcome (A), topic (T), and Bloom level (B). Four of these rubric steps point to specific aspects of the problem while the “arithmetic” component provides a general assessment of a student’s ability to complete simple mathematical operations.
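For reference, the calculation underlying this problem can be sketched in a few lines, assuming ideal-gas argon (R ≈ 0.2081 kJ/kg·K, cp ≈ 0.5203 kJ/kg·K) and, since no exit area is given, negligible exit kinetic energy:

```python
# Sketch of the compressor problem's solution, assuming ideal-gas argon
# and negligible exit kinetic energy (the exit area/velocity is not given).
R_ARGON = 0.2081    # gas constant, kJ/(kg*K)
CP_ARGON = 0.5203   # constant-pressure specific heat, kJ/(kg*K)

P1, T1, V1, A1 = 100.0, 293.15, 150.0, 0.1   # kPa, K, m/s, m^2
T2 = 473.15                                   # K (200 C)

rho1 = P1 / (R_ARGON * T1)           # ideal-gas inlet density, kg/m^3
m_dot = rho1 * A1 * V1               # steady mass flow rate, kg/s

dh = CP_ARGON * (T2 - T1)            # enthalpy rise, kJ/kg
dke = (0.0 - V1 ** 2) / 2.0 / 1000.0 # kinetic-energy change, kJ/kg
W_in = m_dot * (dh + dke)            # power input, kW
```

Under these assumptions the sketch gives a mass flow rate near 24.6 kg/s and a power input of roughly 2 MW; the rubric steps would presumably credit the density evaluation, mass balance, energy balance, and arithmetic separately.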

Once the rubric steps have been established, the instructor grades 5-10 students on a coarse-grained 0-3 scale. A score of N/A can also be used to indicate that the student did not reach a point in the problem that permits meaningful assessment of that rubric step. This information is then uploaded to our undergraduate improvement website. The web version of this assessment is captured here (restricted). The “Graded scores” provided after the rubric table indicate the numerical scores students were given on the problem, and are typically the scores used by instructors to assign course letter grades. The information is provided to aid in establishing a link between rubric-based and grade-based scores. The “class average” and “class standard deviation” provide statistics for the entire class, and assist in mapping performance from the small sample set of students to that of the full class. Each assessment page contains a complete description of the problem – problem statements and solutions and examples of student work are all provided via links.
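One plausible use of the reported class statistics is to check how representative the sampled students are of the full class; a sketch with invented graded scores follows:

```python
import statistics

# Hypothetical graded (point) scores for the 5 sampled students, and the
# class-wide statistics reported alongside the assessment (invented values).
sample_graded = [18, 14, 9, 16, 12]   # points out of 20 for the sampled work
class_avg, class_std = 13.5, 3.2      # average and standard deviation, full class

sample_mean = statistics.mean(sample_graded)
# Express the sample mean as a z-score against the full-class distribution;
# a value near zero suggests the sample is representative of the class.
z = (sample_mean - class_avg) / class_std
```

A near-zero z-score supports extrapolating the rubric-based proficiency scores from the small sample to the class as a whole.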

To assist in assessing common educational activities that appear in multiple courses (e.g. written and oral communication), we have developed “rubric templates” that contain a series of rubric steps with predefined PO (ABET Outcome, topic, and Bloom level). For each of the rubric steps a detailed definition of the 0-3 scores is also provided. These allow for consistent evaluation of common instructional activities across the curriculum. The following templates have been developed to aid in the assessment of coursework:

  • Oral Communication
  • Written Communication
  • Teamwork
  • Ethics, Safety, Society, Environment
  • Laboratory Report
  • Plant Design Project

These templates can be viewed by following the List of rubric templates (restricted) link from the main protected page described above. For an illustrative example, consider viewing the rubric steps for the oral communication template (restricted) and associated step definitions (restricted).

These course-based assessments provide instructors immediate feedback regarding student performance. Such information is used by individual instructors to modify their teaching approach to resolve course-specific issues. Below, we describe how these assessments are used to perform program-wide evaluations of student performance with respect to a given ABET Outcome. These evaluations serve as the basis for modification of the educational program.

Extracurricular activities

Many of our students participate in “extracurricular” activities such as undergraduate research, industrial co-op experiences, a Chem-E-Car team, etc. Each of these activities requires students to utilize technical and interpersonal skills that are typically not employed within the traditional classroom/laboratory setting. For example, within a co-op experience students typically work within interdisciplinary teams, which consist of individuals with diverse educational backgrounds and levels, such as engineers, scientists, technicians, and support staff. Undergraduate research experiences also place students in settings that involve a broad range of people, including graduate students and post-doctoral researchers from various disciplines. Students typically complete co-op experiences during the summer semester between the junior and senior years. Enrollments span between 5 and 10 students per year (10 to 25% of a graduating class). Our students actively participate in research experiences – over the past three years, on average 14 students per semester have enrolled in our CE 498 Undergraduate Research and Creative Activity course.

We have incorporated assessments of student performance in undergraduate research and industrial co-op experiences within our PO evaluation procedures. In each case, student performance is monitored using a rubric template similar to those described above. Templates consist of a number of predefined rubric steps with associated ABET Outcome, topic, and Bloom level. The industrial co-op template (restricted) mimics an evaluation form completed by industrial supervisors (restricted) at the end of the co-op experience – the faculty advisor simply maps the responses provided by the industrial supervisor onto rubric scores and loads the information into our database. Free-form comments provided by industrial supervisors are also recorded for future analysis.

Similar to the coursework templates described above, the undergraduate research template (restricted) was developed in-house by CBE faculty, and provides detailed score definitions (restricted) for each rubric step. Within our undergraduate improvement website these templates are grouped with the coursework templates, and can be viewed by following the List of rubric templates (restricted) link from the main protected page.

Faculty / student survey

Our assessment practices also utilize indirect measures to monitor student performance. Specifically, we periodically survey students and/or faculty to solicit their opinions regarding performance within a given area. For example, during the 2007-2008 academic year we asked junior- and senior-level students to evaluate the performance of their laboratory groups with respect to several teamwork issues. These self-evaluations can be helpful in assessing the abilities of students for activities that are completed “away” from faculty.

To summarize, direct assessments of student performance within coursework and extracurricular activities are completed each semester. Periodically indirect assessments (e.g. surveys) are also completed to monitor performance. These data are collected within a database for future evaluation. All of the assessments that have been completed can be viewed by following the List of assessment problems (restricted) link from the main protected webpage. As of June 2008, over 75 assessments had been completed (assessments began during the Fall 2006 semester).

Program Outcome evaluation

PO are evaluated via analysis of assessments collected across the entire program. In what follows we demonstrate how we evaluate student performance with respect to the ABET Outcomes. Essentially, we query the database for all instances of a given ABET Outcome and assemble the course and assessment information that follows into an “ABET (x) Notebook”, where x corresponds to one of a-k. A schematic of this operation is provided below.
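The query-and-assemble step can be sketched as follows. The record layout, course numbers, and problem labels below are hypothetical, chosen only to show how stored assessments might be filtered by outcome letter:

```python
# Minimal sketch of assembling an "ABET (x) Notebook": collect every rubric
# step in the assessment database tagged with a given outcome letter.
# All records below are invented for illustration.
assessments = [
    {"course": "CE 317", "problem": "Exam 1, Problem 2",
     "steps": [{"abet": "e", "topic": "Momentum balances", "bloom": "Apply"},
               {"abet": "a", "topic": "Differential equations", "bloom": "Apply"}]},
    {"course": "CE 304", "problem": "Homework 5, Problem 1",
     "steps": [{"abet": "e", "topic": "Energy balances", "bloom": "Analyze"}]},
]

def build_notebook(outcome):
    """Return every (course, problem, step) instance of one ABET outcome."""
    return [(a["course"], a["problem"], s)
            for a in assessments
            for s in a["steps"]
            if s["abet"] == outcome]

notebook_e = build_notebook("e")   # would feed the "ABET (e) Notebook"
```

Because each rubric step carries its outcome tag, a single pass over the database yields the course and assessment material for any one of the a-k notebooks.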

These notebooks, which can be viewed by following the ABET Notebooks (restricted) link from the main page and selecting the a-k of interest, serve as the primary resources used by faculty to evaluate the extent to which students achieve a specified ABET Outcome. Every two years, faculty subcommittees are asked to review the ABET Notebooks (restricted) and provide an evaluation of student performance. Specifically, three-member committees are tasked with providing the following (see schematic below):

  1. Summary of CE student performance with respect to ABET (x).
  2. Description of the strengths of CE students with respect to ABET (x).
  3. Description of the weaknesses of CE students with respect to ABET (x).
  4. Suggestions (e.g. course modification, curriculum reform, new initiatives) to improve the performance of CE students with respect to ABET (x).

The documents that stem from this process are referred to as “ABET (x) Reports”. These reports provide the primary input for assessment and evaluation of the attainment of ABET Outcomes.