Description of Educational Effectiveness Approach

Through all of the growth, the construction, the recruitments, we must take care to preserve those important qualities that we have come to call the "Davis advantages." —Larry N. Vanderhoef, "Growth: Meeting the Challenge in the Next Decade," September 27, 2000

Even a cursory examination of the speeches and articles delivered by the UC Davis leadership in recent years will reveal that growth--growth in students, staff, faculty and facilities--has dominated all campus planning endeavors, from the long range development plan to this WASC self-study. Our Institutional Portfolio acknowledges that Tidal Wave II growth influenced much of our approach to the evidence assembled for the fall 2002 preparatory review campus visit by the WASC team. In their exit interview comments, the external team indicated that, both throughout our capacity report and in the short period of time they spent on campus, they could readily observe our dedication to the Davis advantages.

The goal of this final stage of the reaccreditation process is to demonstrate that UC Davis's commitment to educational effectiveness can be discerned in the areas identified by the campus as particularly important to our mission. The faculty consistently and enthusiastically maintains that student engagement in research ought to be fostered throughout this growth period. As an additional way to enhance the quality of instruction, the faculty and administration are also eager to make the most effective and pedagogically sound use of the tools of educational technology.

In this first essay, we offer a general overview of our approach to educational effectiveness. For a much more specific description at the level of institutional readiness, we urge the reader to consult the summaries and evidence available in Criteria 1.2, 2.1-2.7, 2.10, 3.2-3.4, and 4.3-4.7 of our Institutional Portfolio. The two following essays on student research and educational technology are in-depth analyses of educational effectiveness in those two self-study topics.

The self-study is a snapshot that catches our institution in transition in many areas but most particularly in the evaluation of educational effectiveness. We are beginning to move away from a fairly traditional approach and toward procedures that use direct evidence of student learning. It is too soon to predict where this evolution will lead us.

Design and approach to assuring quality in teaching and learning

There are many institutional procedures that ensure the quality of instruction. Faculty hiring at a research university is based on the assumption that students benefit from taking classes offered by successful researchers who can transmit a passion for their chosen disciplines. The UC academic personnel system privileges the hiring and promotion of faculty who distinguish themselves both in their research and in the delivery of high-quality classes. At the time that faculty members are hired, communication skills as they might bear upon the candidate's potential as an instructor are a consideration. After joining the faculty, an individual participates in the academic personnel process and is reviewed for a merit action or a promotion every two years or, for full professors, every three years. To be successful, the file for each personnel action must include evidence of high-quality instruction, including student evaluations, which are required for every class. The file is reviewed through both faculty Senate and administrative channels.

Through the Teaching Resources Center and numerous other resources described in our capacity and self-study reports, we offer opportunities for faculty to evaluate and improve their teaching. Whether they are exploring new methods to enhance already excellent instruction or addressing difficulties identified in student or peer evaluations, many faculty take advantage of these services. When it is warranted, department chairs and deans can and do direct faculty to seek assistance.

Quality in instruction also requires a quality curriculum. The faculty takes the lead in selecting and reviewing the curriculum. Proposals to establish new academic units, majors, and courses are painstakingly reviewed at several Senate and administrative levels.

Each department is subject to separate, regular reviews of its graduate and undergraduate instructional programs. Now that we have formalized educational objectives for students, the guidelines for program review are being revised by the Undergraduate Council. The new guidelines will require units to go further than they have in the past to articulate program level educational objectives and develop measures of student learning in relation to the program and campus objectives. Through its ABET accreditation process, the College of Engineering offers a model of a careful review cycle centered on objectives, measures of student learning, and feedback for improvement.

Some of our educational technology projects and special programs to involve undergraduate students in research have used evaluations of educational effectiveness that collect direct evidence of student learning and have control groups. These will be detailed in the self-study essays.

For the most part, UC Davis has used the traditional and well-established indicators of student learning. These include student performance in classes and the success that students have after graduation in finding suitable employment and in admission to graduate and professional schools. The post-graduation outcomes are collected through our Student Affairs Research and Information (SARI) surveys. By these measures of student outcomes, we are very pleased with the success of our students and the indications of excellent educational effectiveness they provide.

There are also SARI survey data on student learning experiences. Our ability to collect and use this kind of data has increased in the last few years. Some examples related to student research and educational technology are described in the following essays.

As mentioned above, there are specific projects in the areas of educational technology and student research that have collected more direct measures of student learning. These are described in detail in the two main self-study essays that follow.

We recognize that some of the approaches that have been used in our most careful studies and in the College of Engineering need to become more widely adopted on the campus. As mentioned already, the Undergraduate Council is incorporating them into the new program review guidelines.

Other types of evidence of cumulative student learning are portfolios and student work in capstone courses. These are areas to which we are giving increasing attention, and again there are examples available in the essays and in the capacity report.

Use of evidence

Although the evidence we accumulate is used, it is sometimes difficult to identify a distinct, formal procedure. Generally speaking, the existing bureaucracy of resource allocation is strongly influenced by the evidence of success that programs can present. Where real problems are identified, corrective action is taken, and where successes are documented, additional support can be generated. For example, some of the documented successes of early educational technology projects led to the more ambitious Mellon project and to the online pre- and post-lab for Chemistry 2C. Similarly the programs to encourage undergraduate student participation in research were able to generate institutional support following their external start-up funding by demonstrating real success. On the other hand, data from the spring 2001 SARI survey revealed an uneven picture of student understanding of the scope and benefits of research. Thus our self-study includes the recommendation that we improve our communication with students in this area.

Our most institutionalized procedure for ensuring educational effectiveness is program review. In the self-study part of a review, the unit collects and presents evidence to indicate the quality of teaching in its courses. The results of a review lead to recommendations to the program faculty and to the responsible deans. The connection between a review's recommendations and the actions that follow, however, is not always clear. One of the aims of the new program review guidelines is to strengthen that connection. An example of a tighter feedback loop is the ABET process for program review. It is now taking root in the College of Engineering, and we will be able to learn from that experience how some of its features might be disseminated to other colleges.

While it is clear that new assessment processes will cost money in the short run, the long-run savings and improvements in learning are not yet evident to all our campus constituencies. A significant new commitment of resources will be a very hard sell as the University faces a severe and worsening budget shortfall. Nevertheless, we hope that by promoting discussion of and attention to the issue, and by making some recommendations, we will lay a foundation on which more progress can be made when times improve.

Educational Effectiveness Self-Study

The two educational effectiveness self-study topics that we described in our Institutional Proposal are undergraduate involvement in research and educational technology. The next two essays explore those subjects in detail. Both organize parts of the discussion around the chronology of experiences a typical undergraduate might have. They draw upon and add to evidence contained in our capacity report.

The integrative essay concludes this self-study. It summarizes our experiences and recommendations from both the self-study and, more broadly, the entire reaccreditation process. 

Integrative Essay