                           PART THREE
                 PRACTICAL ASSESSMENT RESOURCES

                           CHAPTER SIX
                    VOICES FROM THE CAMPUSES
                        BY SUZANNE HYERS


                        IN THE BEGINNING

When Caryn McTighe Musil first telephoned to invite programs to
participate in "The Courage to Question." their initial responses
were much the same as Caryn's was to the FIPSE program officer:
"You're asking us to do what?" After that, however, the responses
varied. Some viewed the project as "timely." Lewis and Clark
College "welcomed the opportunity to pause and focus on student
learning." Some, such as Old Dominion University, viewed the
project with excitement, a challenge for their program that had
broad participation and support: "From the beginning, assessment of
the Women's Studies Program...was a collaborative and hands-on
learning project."

Other responses were not so positive. At the University of
Missouri, for example, "faculty members had negative feelings about
assessment," in part because of their experience with
state-mandated assessment that created competition among Missouri
colleges and universities: "At this institution...assessment was
politicized in such a way that many faculty members saw [it]
primarily as a weapon to be used against them." Faculty members at
the University of Colorado also had their perceptions shaded by a
state-mandated program. In Colorado, they "regarded assessment as
one more bureaucratic requirement for evaluation that impinged on
[their] time."

                        A SECOND LOOK...

"The Courage to Question," however, provided a very different
framework for program evaluation. In response to the National
Assessment Team's encouragement "to take a more comprehensive look
at assessment, its purposes, and its possibilities for
self-reflection," the University of Colorado, for example,
experienced a significant change, moving from state-mandated
procedures to those of feminist assessment. For them, the change
meant "the setting for our process was supportive and
intellectually exciting. The audience for our reports was not a
state bureaucrat but other women's studies programs and educators
interested in assessment."

For the University of Missouri, "The Courage to Question" provided
an alternative to the state's "rigid, quantitative, 'value-added'
approach." Although the project coincided with a difficult time of
transition and rearticulation of program goals at Missouri, faculty
members were clear on one thing: They had a "real passion for
teaching and a long-term commitment [to] exploring feminist
pedagogy." Missouri followed the National Assessment Team's
recommendation to listen to such strong statements: "Rather than
developing a plan that would be imposed on the faculty members...we
worked toward a model of assessment grounded in the activities
faculty members were already carrying out in their classes.... We
talked in terms of 'faculty development' instead of 'assessment,'
believing that a good assessment project would, in fact, contribute
to better teaching." Missouri was able to pursue this project in
the midst of such difficulty because the assessment goals were
parallel to individual and programmatic goals.

Ironically, the resistance of faculty members to assessment was
similar to the resistance of some students to material presented in
women's studies classes. However, more information--as well as
time, reflection, and experience--resulted in a greater
understanding and general acceptance of the process of
assessment--if not the word itself. ("Assessment" continues to have
as problematic a reputation as the word "feminist": Many who come
to believe in its principles continue to reject the language.)

The overall approach of the campuses to this project was a familiar
one for women's studies programs: make one task simultaneously
tackle three situations. As described in the introduction to The
Courage to Question: 

     With long experience administering programs without sufficient
     support, the women's studies faculty members and
     administrators in the project drew on that history to create
     assessment instruments that were embedded in what they already
     do; weave data analysis into student research projects; create
     methods that could also have a life beyond the grant such as
     alumnae/i questionnaires and interviews; and make use of the
     project to further women's studies programmatic goals.... 

Consequently, not only did the project accomplish its goals through
creative structuring, but the layers of meaning uncovered through
assessment also became woven into the fabric of the programs
themselves. As the programs continue to assign research projects
and administer questionnaires and course evaluations, they will
interpret the results with the knowledge gained through "The
Courage to Question."


                         WHERE TO START

                "Begin with what you do already"
In every case each institution started by defining its program's
goals and objectives. The University of Missouri, as noted, began
the project with a simple yet strong acknowledgment of the
faculty's passion for teaching. Old Dominion University had two
basic reasons for participation: They wanted to find out "just what
we were teaching our students and what they were learning"; and
they "wanted to create stronger connections" among members of their
Women's Studies Advisory Council. Wellesley College--the only
women's college among the seven participating institutions--asked
"what makes women's studies at a women's liberal arts college
different?" 

The first item on the "assessment agenda," then, should be to
determine what your program needs to know. Assessment is not a
true/false test. It is a series of open-ended questions.

      "The best we can hope for is to ask better questions:
    What matters in women's studies? What do we care about?"

                  WHAT WORKED AND WHAT DID NOT

HUNTER COLLEGE

             "Assessment is not final but ongoing. "

Hunter College used course syllabi, exams, paper assignments,
informal classroom writings, and a survey of introductory women's
studies classes with open-ended questions that explored the value
of the course overall: "If you had to describe this course to a
friend, what three adjectives would you use?" "Was there a balance
between the survey-scope of the course and some more in-depth
investigation? Please explain." The questions explored whether a
sense of community was built in the classroom, whether the course
met student expectations, and how women's studies compared to
other introductory courses. (See page 95.) Hunter believes all of
these methods yielded invaluable material.

Hunter also investigated how effectively the program accomplished
its complex goal of multiculturalism by focusing on three areas:
curriculum, scholarship, and "collective conversations" with
students, which were organized by a women's studies student. The
voices of these students are at the center of Hunter's report,
creating a particularly strong portrait not only of the program
itself but also of the diversity, passion, and spirit of its
students. As noted in Hunter's report, "Students valued being
consulted regarding the assessment project. It became a concrete
way of enacting the empowerment and cultural thinking the project
itself hoped to investigate."

The project also changed faculty members' attitudes toward
assessment: "For a group of faculty, assessment has lost its
negative overtones of coercion from outside forces." Hunter also
used the project to place the women's studies program at the center
of institutional discussions, such as the college's emphasis on
reaching students with nontraditional backgrounds. Through this
project, Hunter created a core advocacy group for assessment which
has had "an impact university-wide in terms of Freshman Year
Initiative, work done on Undergraduate Course of Study Committee,
the Committee on Remediation, the Provost's Advisory Committee on
Remedial and Developmental Programs, and within the Faculty
Delegate Assembly and University Faculty Senate." The women's
studies program is playing a role in other campus discussions as
well. "The project has focused our attention on the relationship
between women's studies and the liberal arts curriculum.. . at
Hunter College...there is an ongoing debate about whether to
include a pluralism and diversity requirement in basic education
requirement."

UNIVERSITY OF COLORADO

        "What are the passionate questions for students?"

The University of Colorado initially planned to use its
participation in "The Courage to Question" to revise previously
established state-mandated assessment procedures. The year
immediately preceding the FIPSE project, Colorado had complied with
the state-mandated assessment program by selecting "one knowledge
goal and two skills goals to assess," using a required feminist
theory course as the source of information. The investigation went
according to plan, but "the outcome...was not especially
illuminating." As a result, Colorado was especially poised to use
our assessment project as a means of reevaluating its assessment
process.

     We were dissatisfied with the process we had developed for
     several reasons. First, the state mandate created an
     atmosphere that encouraged compliance rather than enthusiasm.
     Our selection of knowledge and skills goals as well as the 
     methods of assessment emerged from a desire for efficiency....
     [O]ur goals and the process of assessing them looked very much
     like standard academic fare: one couldn't tell much difference
     between the women's studies assessment plan and those of
     traditional arts and sciences disciplines. We were resigned to
     the process; we didn't "own" it; and we didn't learn much
     about ourselves as teachers and learners.... We had selected
     particular goals not simply because they might be important,
     but also because they were convenient....

According to its report, Colorado then "stopped asking, 'What do we
want to accomplish?' and began to ask, 'From the perspective of
student learning, what are we actually doing?'" Faculty members
went to the students directly, as other campuses did, through a
series of informal meetings such as potluck dinners to seek their
opinions. Following those discussions, they came up with three
categories for investigation--course content, course structure, and
classroom dynamics--and were interested in two questions: "(1) Were
all three of these categories equally important in fostering active
learning or was one component more important than the others? and
(2) Was the active learning experience that our students identified
with their women's studies courses unique, or could it be found in
other classes?" Using illuminative evaluation for its
investigation, Colorado administered questionnaires, analyzed
syllabi, and chronicled classroom observations.

According to Colorado's report, "Our experience with 'The Courage
to Question' has led us to abandon our previous approach and to
adopt a portfolio method. Our approach rejects a method whereby
faculty alone measure student learning and proceeds from the
assumption of an equal partnership between students and faculty in
assessing student learning."

OLD DOMINION UNIVERSITY (ODU)

           "Focus on improving rather than proving. "

Refusing to be limited to the four areas suggested by the project
(knowledge base, learning skills, feminist pedagogy, and personal
growth), ODU established a fifth area to assess--the impact of
women's studies on faculty members. ODU also examined the role of
students' friendships in their investigation of personal growth.
Project members created specific subcommittees to examine these
five areas; the subcommittees worked well and resulted in "lively
conversations and debate."

In its investigations of the knowledge base, ODU attempted to
identify the five main concepts instructors sought to convey to
students. The method selected was a pre- and post-test administered
at the beginning and end of the semester. The tests were used in
more than a dozen classes over two semesters. More than six hundred
students were given the pre-test; more than five hundred took the
post-test. In spite of the amount of information received from the
tests, they were not considered a successful assessment method:

     While these tests were the most efficient way to take a
     reading of students' awareness of some key points for each
     course, they were not a refined instrument in ascertaining
     what students understood. It was not always easy to
     distinguish between wrong answers based on students' lack of
     knowledge and those that were a function of imprecise or
     confusing questions.... Much more time consuming, but more
     useful, was the analysis of final exams for a few courses. In
     retrospect, this may have been the single most valuable
     instrument for knowledge base objectives. (Italics added.) 

ODU also was disappointed in the information resulting from a
series of interviews with graduating minors and alumnae who were
asked to identify "the three most important concepts that they had
learned in women's studies courses." Project participants felt
these interviews resulted in "somewhat general answers which were
only moderately instructive." In addition, the alumnae
questionnaire required a considerable commitment of time to
complete, which they believe was a key factor in the low return
rate. According to Anita Clair Fellman, one of the authors of ODU's
report, "Closer analysis of a few pieces of good data is more
useful than a large amount of less bounteous data."

More successful for ODU were investigations regarding students'
friendships and the
impact of women's studies on faculty members. Again, for the
friendship investigation ODU used questionnaires administered at
the beginning and end of semesters: "Does the instructor recommend
or require group discussion or group projects? Currently how many
students in class are friends? How did being in class together
change (if it did) your relationship with this person?" These
questions also appeared on the minors' exit interviews and on the
alumnae questionnaire, all of which provided ODU with information
about students' friendships. (See page 91.)

To assess the impact of women's studies on faculty members, ODU
faculty members interviewed each other--which not only generated
new data but also encouraged both internal and external dialogues
among faculty members about the influence of women's studies: "This
was the first time we had faced one another and asked, 'What are
our goals?'" 

ODU had a distinctly positive experience throughout this assessment
project. They had a large number of faculty members and students
(more than two dozen) who were involved in the project from the
early discussions of assessment to the design to final
interpretation of the results. According to project participants,
"While inclusiveness can be cumbersome, its virtues are the
richness of diverse opinions and perspectives and the commitment of
the participants." The conclusion to ODU's report notes the impact
the project had on the program overall:

     [F]or the first time we have on paper a comprehensive and
     clear statement about what we are doing in women's studies, a
     description of our women's studies program goals that we can
     share with others interested in developing women's studies
     courses in their departments. It was a validating and
     reassuring experience to discover that each of us does have a
     clear picture of what she is trying to communicate to students
     and that, when put together, these individual views reveal a
     shared vision of what the Women's Studies Program is about. We
     have found words to describe what we are trying to do in our
     classroom, and we have discovered in one another resources,
     knowledge, and skills that previously we may have overlooked.

OBERLIN COLLEGE

 "Consider assessment as a movie--not a snapshot with different
       angles, different cameras, and reviewed over time." 

Participants at Oberlin College designed a series of
self-statements given to students in more than fifteen courses.
Through these self-statements, which were administered three times
during one semester, Oberlin was able to measure (and note changes
in) students' perspectives over a period of time. For example, one
question asked (somewhat differently) throughout the semester was:
"Do you expect this class to address questions of race?" (asked at
the beginning of the semester); "Does this class address questions
of race? How?" (asked at mid-semester); and "Has this class
addressed questions of race? How?" (asked at the end of the
semester).

In addition to the self-statements, Oberlin used interviews with
women's studies majors organized by other women's studies majors;
faculty and alumnae questionnaires; and a series of student
interviews conducted by a women's studies student. Like Hunter,
Oberlin emphasized multicultural learning: 

     The shape of the assessment plan...reflect[s] the growing
     national debate about multiculturalism and the questions asked
     about women's studies programs in terms of this debate: What
     fosters student learning and self-empowerment? How can courses
     encourage a relational understanding of gender, race, class,
     and sexuality? Does feminist pedagogy differ from other types?
     How do women's studies courses affect students' lives and life
     choices? 

Oberlin forwarded questionnaires to faculty members campus-wide to
ascertain the program's acceptance. Although results were generally
supportive, the questionnaire did prompt the most critical
responses heard throughout the project--most often from professors
who had never taught a women's studies course. Those comments
ranged from describing women's studies as "one big counseling
session" to saying the program has "politicized and ideologized
students instead of promoting objectivity in education...."
Questions asked of Oberlin faculty members included: "What
significant learning experiences do you think women's studies
courses offer students?"; "Do you believe that women's studies
courses differ in pedagogy from non-women's studies courses?"; and
"Do you ever approach your subject with an integrative analysis of
gender, race, class, and sexuality?" (See page 97.)

While Oberlin's report does not specifically evaluate the methods
used, faculty members have incorporated assessment into their
internal examination of the women's studies program and consider
the process an ongoing one. They do, however, acknowledge that
"assessment doesn't occur in a politically neutral space."

LEWIS AND CLARK COLLEGE

       "Use multiple methods and sources of information."

Lewis and Clark College designed an ambitious assessment plan for
its gender studies program that relied on three principal sources
of data: questionnaires, students' papers and projects, and
selected course syllabi. However, the project team also drew upon
data available from their annual four-day Gender Symposium papers
and projects, computer conversations, students' journals and
diaries, students' honors projects, practice reports, and other
previously collected material. Faculty members' engagement in
assessing student learning was nourished by the overall
institutional climate, which invests significantly in faculty
development and places a high priority on maintaining a quality,
student-centered undergraduate education. The fact that Lewis and
Clark honors such investigations of the curriculum, campus climate,
teaching, and student learning was an important factor in the
project's success.

Lewis and Clark wanted to answer three questions: How effectively
do students learn and apply gender analysis? What impact has gender
studies had on the classroom and institutional climate? What impact
has gender studies had on the personal growth of students and
alumnae? As its central organizing group, the program relied on a
collaborative team of one student, one staff member, and two
faculty members. Drawing on extensive campus consultation with
faculty members, students, staff members, and alumnae/i, the four
worked together to oversee the data collection, analyze it, and
write the final report. Like Old Dominion University, they
found multiple perspectives and mutually supportive collaboration
enhanced their work.

A questionnaire was sent to students, faculty members, and alumnae.
(See page 85.) It eventually provided both quantitative and
qualitative data--a combination that Wellesley College points out
is especially illuminating, since numbers alone do not reveal the
full meaning of a particular response. The student questionnaire
was sent to a random sampling stratified by distribution of majors,
while the faculty questionnaire was sent to all faculty members
teaching undergraduates. The alumnae/i questionnaire was sent to
all alumnae/i who had participated in Lewis and Clark's Gender
Symposium during the previous five years. The return rates of 48
percent, 46 percent, and 48 percent, respectively, were unusually
high.

Self-reporting in the questionnaires could be verified by the next
major data collection: student papers and projects. In order to
determine how well students were able to use gender analysis in
their courses, the gender studies program developed a list of eight
basic concepts--referred to as knowledge plots--which became the
basis of the score sheet used to do a content analysis of papers
and projects. (See page 89.) Faculty members then collected papers
and projects from selected gender studies courses and compared them
with a similar set of materials from core curriculum courses, in
both cases using longitudinal materials such as student journals or
final portfolios where possible. These proved especially
illuminating in recording the process of students' intellectual
development. The student work was scored independently by two
readers; if there was disagreement, a third reader was brought in.

For the third major source of data, Lewis and Clark relied on
syllabi from discipline-based, non-gender studies
courses to determine how much gender integration had been
incorporated into classes outside the gender studies program. The
comparative syllabi also allowed project participants to examine
what kinds of subject areas were being covered only through gender
studies. The initial student questionnaires once again generated
baseline information for further inquiry. In this case, students
collectively named more than one hundred courses that they claimed
incorporated gender perspectives. Trimming the list to a more
manageable number, faculty members in the gender studies
program selected twenty courses, divided proportionately among the
three divisions of the College of Arts and Sciences and between
male and female professors. A score sheet was created to measure
content based on Mary Kay Thompson Tetreault's "feminist
phase theory," again scored independently.

Lewis and Clark's assessment plan was labor intensive. Halfway
through the project, participants felt overwhelmed by the mountains
of data they had collected. Ultimately, however, they chose to use
only material that illuminated their three basic questions, knowing
they could return at another time to pose additional questions.
They were sustained through the process by the belief that their
research would be valued on their campus, by the mutually
supportive working team they had established, and by the rich
information they knew would shape their program's future. Like many
of the participating campuses, they developed documents from their
research that they used internally in various forms for various
audiences, allowing the work to be applied both nationally and
locally to improve undergraduate education.

UNIVERSITY OF MISSOURI

                    "Pick a plan you can do."

The University of Missouri had a relatively difficult time in the
preliminary stages of the project. Not only did faculty members
have negative feelings toward assessment because of past
experiences with state-mandated assessment, but there also was a
lack of cohesiveness within the women's studies program due to
significant staffing and administrative transitions. If defining
one's goals is the first step, Missouri had difficulty from the
beginning. Many women's studies programs are experiencing similar
situations: 

     We were discovering that goals and processes clearly
     articulated in the early eighties no longer had consensus
     backing from members of the committee. The second half of the
     1980s had been a period of consolidation and
     institutionalization for the program. Departments began hiring
     faculty with expertise in women's studies, greatly expanding
     the course offerings as well as participation in the program.
     Yet these women had not been involved in the development of
     the program and did not necessarily share the perspectives of
     those who had.

As described in other chapters in this volume, the clear definition
of goals and objectives is central to the assessment project. At
Missouri, participants felt "there [were] inherent difficulties in
the process of formulating goals.... [C]onsensus processing
requires shared interests and a long time frame; it was not clear
that we had either."

Faculty members at Missouri did come to agreement on their
commitment to teaching and feminist pedagogy and decided to make
that the starting point for assessment. The University of Missouri
used its first faculty development workshop, led by Pat Hutchings,
to discuss how materials regularly incorporated into women's
studies classes--journals, projects, and papers--could form a basis
for assessment. As the project progressed, Missouri realized that
there were other sources of information easily available, such as
course evaluations collected in all women's studies classes.

They also attempted to retrieve other valuable data regularly
collected elsewhere on campus but ran into problems with access.
They noted that, even if they had had access to data, they did not
have the resources necessary to successfully analyze such
data--limitations true for other project sites as well. The
Missouri report is particularly straightforward in this regard: 

     We were not very successful in executing the quantitative part
     of our project, and we want to note here the sheer difficulty
     we had getting information from "already existing sources."
     Quantitative data, such as the kind the registrar has about
     all students, would have been very useful, but we found it
     virtually inaccessible. Assessment projects...might do well
     to think about their own record keeping.... We also
     underestimated the difficulty of analyzing data....

WELLESLEY COLLEGE

        "Stay close to your own strategies and beliefs."

Wellesley College was the only women's college of the participating
campuses and focused its project on that difference, asking, "What
makes women's studies at a women's liberal arts college different?"

     [D]id [women's studies] change or affect students' personal
     lives, their intellectual life, or their political beliefs?
     Did students feel pressure to give 'politically correct'
     answers and to only identify with 'feminist' ideas.... We were
     interested in the quality of debate among students and
     whether or not discussion and learning continued outside the
     classroom, and if so, with whom.

Wellesley designed an open-ended questionnaire incorporating these
items: "Has this course changed or affected your personal life? Has
this course affected your intellectual life? Did it change your
political beliefs? If so, how?" (See page 93.) In order to examine
the difference women's studies courses make, Wellesley administered
questionnaires to students in women's studies courses and closely
corresponding non-women's studies courses (control courses) and
administered them near the end of the semester so students would
have more information. Wellesley based its findings on a return of
441 questionnaires--68 percent from women's studies classes and 32
percent from the control courses (only 4 percent of the surveys
were from women's studies majors). Wellesley also used an interview
guide for majors and alumnae of the women's studies program, and a
random sample of alumnae were interviewed by telephone. Both
quantitative and qualitative data were collected. However,
according to Wellesley's report:

     [O]ur findings demonstrate the limitations of relying on
     quantitative evaluative data and the ways they "flatten" human
     experiences. Even when the quantitative answers were
     statistically similar between the women's studies and control
     courses, careful reading of the actual answers suggest[s] the
     meanings of the answers varied widely between the women's
     studies and control courses. Thus, the qualitative answers
     told us much more about what was really happening in the
     courses and gave us a deeper sense of how we might begin to
     "count" the meanings of our students' responses.

As at the other campuses, the project had a significant effect on
the Wellesley program. Wellesley's report claimed the project made it
possible "to make self-conscious what is for many of us
unconscious.... [W]e discovered joint problems...in the classrooms,
expressed concern about both silences and pressures, and became
particularly aware of the difficulties facing our colleagues of
color." In addition, project participants learned that "the
pressure of the student evaluation questionnaires [has] kept
faculty, especially junior faculty, fearful of innovation and
controversy in their classrooms."

                           CONCLUSION

Wellesley College's report included the following quote from a
women's studies student: "I will continue to question my beliefs
and I will continue to try to educate myself." After their
three-year experience with this assessment project, the seven
institutions would probably express something similar. As Oberlin
College concluded: 

     As we continue our discussions regarding long-range planning
     and the future of the Women's Studies Program...we will build
     our future based on insights generated by ["The Courage to
     Question"]. In our original assessment design, we claimed that
     we intended to investigate 'some of the distinctions and
     tensions, as well as the commonalities, among students and
     faculty of diverse racial, ethnic, class, gender and sexual
     identities.' Three years later, this statement continues to
     challenge and engage.

POINTS TO REMEMBER

The research, contributions, and perspectives of members of the
National Assessment Team (NATs) are well documented throughout
this book. The "practical tips" below are brief and informal. They
are meant simply as reminders of what is stated in much more detail
elsewhere.

BEFORE YOU BEGIN ASSESSMENT

* Begin with what you do already.
* Let students participate in the process.
* Determine your community's "passionate questions."
* Take time to conceptualize what you want to know.
* Be sure the process involves diverse campus/student voices, and
     give voice to those who may not otherwise be heard. 
* Use surveys and documents developed by people involved.
* Use multiple measures in gathering data.
* Pick and choose among diverse methods, and do what you have time
     for.
* Aim for unobtrusive ways to evaluate.
* Look for alternative ways to do analysis--narrative,
     conversation, dialogue.
* Not all assessment techniques are appropriate to all situations
     or all institutions.
* Think about longitudinal studies: students who graduated, faculty
     members who have been there a long time, oral histories, and
     so on.
* Pay attention to how information will be used and who the
     audience is.
* Remember to think about the variety of places where learning
     occurs. Learning takes place outside the classroom as well as
     in it.
* Ground your exploration in feminist perspectives, and stay close
     to your own strategies and beliefs.
* Be clear in your mind that assessment is not final but ongoing.

ONCE THE ASSESSMENT PROJECT HAS BEGUN

* Think about creative use of staff time--a senior project for a
     major, a graduate student project, an internship, and so on.
* Pick a plan you can do.
* Keep the scale of the project consonant with your resources.
* Rely on data already there or that you can obtain easily.
* Remember: You do not have to answer every question you ask.
* Return to excess data later as time and staffing permit.
* Interpret data from several viewpoints over time.
* Consider assessment as a movie--not a snapshot--with different
     angles, different cameras, and reviewed over time.

CONSIDER THE FOLLOWING AS SOURCES OF INFORMATION FOR ASSESSMENT

* Journals, papers, reports, diaries
* Major committee reports
* Syllabi, mid-term exams, final exams, course evaluations
* Enrollment trends
* Classroom observations
* Attendance at optional events
* Library checkout/reserve lists
* Faculty appointment books
* Student newspapers
* Program newsletters
* Brochures, prizes, awards                                       
* Audio/visual tapes of classes
* Faculty publications
* Minutes from meetings
* Letters of complaints, grievances, thanks
* Student publications
* Student presentations
* Annual reports
* Faculty searches
* Grant proposals