Assessment and evaluation of our PO are represented by Loop III of our continuous improvement process. During the summer of 2006 we significantly revised a series of web-based tools that enable us to manage course information and to collect and analyze program assessment data in an efficient and integrated manner, and that ultimately provide a vehicle for continuous improvement of our program. Faculty began using these assessment tools during the Fall 2006 semester. In what follows, we first describe the web-based tools that have been developed, then describe the procedure used to complete assessments of student performance for a particular assignment (e.g. an exam or homework problem, project, laboratory report, presentation, or co-op), and finally illustrate how these assessments are collectively used to evaluate the extent to which students attain each of the ABET Outcomes.
We now describe several of the web-based tools that have been developed. Although it is not necessary, perhaps the best way for a reader of this document to become familiar with these tools is via interactive use at our website. Many of the necessary links are housed within a password-protected section of our undergraduate improvement website. Please contact Jeffrey Errington or David Kofke to request access to this site. After arriving at the website indicated above, select the Access CBE protected content (requires an account) link and enter a suitable username and password. The links referenced below are accessed from the page that follows.
The following tools have been developed:
The objective of our assessment procedure is to acquire information that will enable us to evaluate the extent to which students are proficient in the PO at graduation. Our process involves the collection of direct and indirect assessment data that are initially stored within a database and later analyzed to gauge student performance across the program with respect to a given PO. The following items are used within our assessment practices.
Every semester, instructors of required CE courses are each asked to select three benchmark “problems” to assess student performance. These problems encompass traditional homework and exam problems as well as individual or team projects, presentations, and laboratory reports. For each problem, an instructor completes the following tasks:
The schematic below provides an example of how one might complete Step 2 for a thermodynamics problem. The problem statement reads as follows:
Argon steadily enters an adiabatic compressor with an inlet area of 0.1 m² at 100 kPa, 20 °C, and 150 m/s and leaves at 800 kPa and 200 °C. Determine the power input to the compressor.
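For reference, the problem above reduces to a steady-flow energy balance on the compressor. The sketch below is our own hedged illustration, not the instructor's solution: the argon property values and the neglect of exit kinetic energy (no exit area or velocity is given) are assumptions on our part.

```python
# Steady-flow energy balance for the adiabatic argon compressor.
# Assumptions (ours, not from the source): argon is an ideal gas with
# constant c_p, and the exit kinetic energy is negligible because no
# exit area or velocity is stated.

R_AR = 8.314 / 39.948   # gas constant for argon, kJ/(kg*K)
CP_AR = 0.5203          # specific heat at constant pressure, kJ/(kg*K)

# Inlet state and exit temperature
P1, T1, V1, A1 = 100.0, 20.0 + 273.15, 150.0, 0.1  # kPa, K, m/s, m^2
T2 = 200.0 + 273.15                                # K

rho1 = P1 / (R_AR * T1)   # ideal-gas density at the inlet, kg/m^3
mdot = rho1 * A1 * V1     # mass flow rate, kg/s

# W_in = mdot * [c_p*(T2 - T1) + (V2^2 - V1^2)/2], with V2 taken as ~0.
w_in = CP_AR * (T2 - T1) - V1**2 / 2000.0   # kJ/kg (J/kg -> kJ/kg)
W_in = mdot * w_in                          # kW

print(f"mass flow rate = {mdot:.1f} kg/s, power input = {W_in:.0f} kW")
```

Under these assumptions the mass flow rate comes out near 25 kg/s and the power input on the order of 2 MW; the compressor-work and kinetic-energy terms correspond to the kinds of rubric steps discussed next.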
Five key rubric steps are identified and associated with an ABET Outcome (A), topic (T), and Bloom level (B). Four of these rubric steps point to specific aspects of the problem while the “arithmetic” component provides a general assessment of a student’s ability to complete simple mathematical operations.
Once the rubric steps have been established, the instructor grades 5-10 students on a coarse-grained 0-3 scale. A score of N/A can also be used to indicate that a student did not reach a point in the problem that permits meaningful assessment of that rubric step. This information is then uploaded to our undergraduate improvement website. The web version of this assessment is captured here (restricted). The “Graded scores” provided after the rubric table indicate the numerical scores students received on the problem, which are typically the scores instructors use to assign course letter grades; this information aids in establishing a link between rubric-based and grade-based scores. The “class average” and “class standard deviation” provide statistics for the entire class and assist in mapping performance from the small sample of students to that of the full class. Each assessment page contains a complete description of the problem: problem statements, solutions, and examples of student work are all provided via links.
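The summary statistics described above can be sketched as follows. The rubric-step names and scores here are hypothetical, invented purely to illustrate how N/A entries are excluded from the per-step statistics.

```python
# Sketch of summarizing coarse-grained rubric scores for one benchmark
# problem. Scores are on the 0-3 scale described above; None stands in
# for an N/A score and is excluded from the statistics. All data are
# hypothetical, for illustration only.
from statistics import mean, stdev

# rubric step -> scores for the 5-10 sampled students
scores = {
    "mass balance":   [3, 2, 3, 1, 2],
    "energy balance": [2, 2, 3, None, 1],  # one student never reached this step
    "arithmetic":     [3, 3, 2, 2, 3],
}

for step, raw in scores.items():
    graded = [s for s in raw if s is not None]   # drop N/A entries
    print(f"{step}: mean={mean(graded):.2f}, stdev={stdev(graded):.2f}, "
          f"n={len(graded)} of {len(raw)}")
```

Reporting how many of the sampled students were gradable on each step (n of total) preserves the distinction between a low score and an N/A.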
To assist in assessing common educational activities that appear in multiple courses (e.g. written and oral communication), we have developed “rubric templates” that contain a series of rubric steps with predefined attributes (ABET Outcome, topic, and Bloom level). For each of the rubric steps a detailed definition of the 0-3 scores is also provided. These templates allow for consistent evaluation of common instructional activities across the curriculum. The following templates have been developed to aid in the assessment of coursework:
These templates can be viewed by following the List of rubric templates (restricted) link from the main protected page described above. For an illustrative example, consider viewing the rubric steps for the oral communication template (restricted) and associated step definitions (restricted).
These course-based assessments provide instructors immediate feedback regarding student performance. Such information is used by individual instructors to modify their teaching approach to resolve course-specific issues. Below, we describe how these assessments are used to perform program-wide evaluations of student performance with respect to a given ABET Outcome. These evaluations serve as the basis for modification of the educational program.
Many of our students participate in “extracurricular” activities such as undergraduate research, industrial co-op experiences, a Chem-E-Car team, etc. Each of these activities requires students to utilize technical and interpersonal skills that are typically not employed within the traditional classroom/laboratory setting. For example, within a co-op experience students typically work within interdisciplinary teams consisting of individuals with diverse educational backgrounds and levels, such as engineers, scientists, technicians, and support staff. Undergraduate research experiences also place students in settings that involve a broad range of people, including graduate students and post-doctoral researchers from various disciplines. Students typically complete co-op experiences during the Summer semester between the junior and senior year. Enrollments range from 5 to 10 students per year (10 to 25% of a graduating class). Our students actively participate in research experiences – over the past three years, an average of 14 students per semester have enrolled in our CE 498 Undergraduate Research and Creative Activity course. We have incorporated assessments of student performance in undergraduate research and industrial co-op experiences within our PO evaluation procedures. In each case, student performance is monitored using a rubric template similar to those described above. Templates consist of a number of predefined rubric steps with associated ABET Outcome, topic, and Bloom level. The industrial co-op template (restricted) mimics an evaluation form completed by industrial supervisors (restricted) at the end of the co-op experience – the faculty advisor simply maps responses provided by the industrial supervisor onto rubric scores and loads the information into our database. Free-form comments provided by industrial supervisors are also recorded for future analysis.
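The mapping step performed by the faculty advisor can be sketched as below. Since the actual evaluation form is restricted, the category names and the rating-to-score mapping here are our assumptions, included only to show the shape of the translation.

```python
# Hypothetical sketch of mapping an industrial supervisor's categorical
# ratings onto the 0-3 rubric scale. The rating categories and mapping
# are assumptions; the actual supervisor form is restricted.

RATING_TO_SCORE = {
    "unsatisfactory": 0,
    "marginal": 1,
    "good": 2,
    "excellent": 3,
}

def map_supervisor_form(responses):
    """Translate one supervisor form into rubric scores, keeping the
    free-form comments separate for later analysis."""
    scores = {step: RATING_TO_SCORE[rating]
              for step, rating in responses["ratings"].items()}
    return {"scores": scores, "comments": responses.get("comments", "")}

form = {
    "ratings": {"teamwork": "excellent", "communication": "good"},
    "comments": "Worked well with technicians and support staff.",
}
record = map_supervisor_form(form)
print(record["scores"])   # {'teamwork': 3, 'communication': 2}
```

Keeping the comments field alongside the numeric scores mirrors the practice described above of recording supervisors' free-form remarks for future analysis.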
Similar to the coursework templates described above, the undergraduate research template (restricted) was developed in-house by CBE faculty, and provides detailed score definitions (restricted) for each rubric step. Within our undergraduate improvement website these templates are grouped with the coursework templates, and can be viewed by following the List of rubric templates (restricted) link from the main protected page.
Our assessment practices also utilize indirect measures to monitor student performance. Specifically, we periodically survey students and/or faculty to solicit their opinions regarding performance within a given area. For example, during the 2007-2008 academic year we asked junior- and senior-level students to evaluate the performance of their laboratory groups with respect to several teamwork issues. These self-evaluations by students can be helpful in assessing the abilities of students in activities that are completed “away” from faculty.
To summarize, direct assessments of student performance within coursework and extracurricular activities are completed each semester. Indirect assessments (e.g. surveys) are also completed periodically to monitor performance. These data are collected within a database for future evaluation. All of the assessments that have been completed can be viewed by following the List of assessment problems (restricted) link from the main protected webpage. As of June 2008, over 75 assessments had been completed (assessments began during the Fall 2006 semester).
PO are evaluated via analysis of assessments collected across the entire program. In what follows we demonstrate how we evaluate student performance with respect to the ABET Outcomes. Essentially, we query the database for all instances of a given ABET Outcome and assemble the associated course and assessment information into an “ABET (x) Notebook”, where x corresponds to one of a-k. A schematic of this operation is provided below.
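The query step just described can be sketched as a simple database operation. The table layout, column names, and course entries below are hypothetical (the actual schema is not described here); the sketch only illustrates selecting every assessment tagged with one ABET Outcome.

```python
# Minimal sketch of the notebook-assembly query: pull every assessment
# rubric step tagged with a given ABET Outcome, grouped by course.
# Table name, columns, and sample rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE rubric_steps
                (course TEXT, problem TEXT, outcome TEXT, score REAL)""")
conn.executemany(
    "INSERT INTO rubric_steps VALUES (?, ?, ?, ?)",
    [("CE 304", "exam 1, prob 2", "a", 2.4),
     ("CE 318", "project report", "g", 2.1),
     ("CE 404", "exam 2, prob 1", "a", 1.9)],
)

def assemble_notebook(outcome):
    """Collect all assessments tagged with one ABET Outcome (a-k)."""
    rows = conn.execute(
        "SELECT course, problem, score FROM rubric_steps "
        "WHERE outcome = ? ORDER BY course", (outcome,))
    return rows.fetchall()

notebook_a = assemble_notebook("a")
print(notebook_a)
# [('CE 304', 'exam 1, prob 2', 2.4), ('CE 404', 'exam 2, prob 1', 1.9)]
```

Running the same query once per outcome letter yields the eleven per-outcome collections that are bound into the ABET Notebooks discussed next.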
These notebooks can be viewed by following the ABET Notebooks (restricted) link from the main page and selecting the a-k of interest. These notebooks serve as the primary resources used by faculty to evaluate the extent to which students achieve a specified ABET Outcome. Every two years faculty subcommittees are asked to review the ABET Notebooks (restricted) and provide an evaluation of student performance. Specifically, three-member committees are tasked to provide the following (see schematic below):
The documents that stem from this process are referred to as “ABET (x) Reports”. These reports provide the primary input for assessment and evaluation of the attainment of ABET Outcomes.