                          CHAPTER FOUR

         ASSESSMENT DESIGNS AND THE COURAGE TO INNOVATE
     BY JILL MATTUCK TARULE AND MARY KAY THOMPSON TETREAULT

Resistance, whether psychological or political, can be viewed as
obstructive or informative, as a statement of refusal or an
assertion of a different view, a position, a "standpoint," an
emerging theory.[1] As the "Courage to Question" project met over
the three years, the forms of resistance to assessment--as a word,
a set of practices, or a tool of the majority culture--flourished
in our conversations. We paid attention to the resistance, naming
it as a source for ideas. We understood that new insight,
invention, even wisdom, often reside (albeit hidden and silent) at
the core of resistance.

We talked about resistance as we expressed feelings of aversion or
dislike or simple disinterest toward assessment, and together we
began to narrate for ourselves a set of alternative views,
attempting to imagine how the process of assessment could serve
each campus and the general project goals productively. Though
perhaps not consciously, the process in the twice-yearly project
meetings of members of the National Assessment Team intertwined
with the process at each site and even, to some extent, in the
individual assessment activities. In conversation, we began to know
what mattered. The learning, as Bruffee so aptly says, was not in
the conversation; it was the conversation.[2]

     "Narrate" is from the Latin narrare (to tell) which is akin to
     the Latin gnarus ("knowing," "acquainted with," "expert in"),
     both derivative from the Indo-European root gna ("to know")...
     Narrative is, it would seem, rather an appropriate term for a
     reflexive activity which seeks to "know"...antecedent events
     and the meaning of those events.... [3]

In recent years, studies and research models have turned to
narrative as a way to explore events and their meaning, as a way to
examine diverse "objects" of study such as individuals, classrooms,
institutions, and cultures. At the heart of narrative lies
conversation and language, whether in interviews, journals, focus
groups, or public meetings. Talking is narrative; stories are
narrative. All the meetings of the "Courage to Question" project
were narrative-rich dialogues between people that shaped and
defined the project. In short, they became the project. For that is
the other aspect of narrative. It is a way of constructing
knowledge in social contexts. It assumes a relationship between
people in a community.

Thus, narrative is the medium of choice for the study of
relationships. The "Courage to Question" project began to define
assessment as an attempt to explore relationships between people
(teacher/student; teacher/teacher; student/student), between people
and ideas or activities (student/class; student/women's studies
programs) and between systems (course/program;program/institution).
Assessment reflects the recent work on the primacy of relationships
in human development and learning and a parallel focus on the
importance of narrative as both epistemology and method. It can be
seen as a systematic (and systemic) narrating of a story about a
particular set of relationships within a given institutional
context.

Understanding assessment this way rescues it from too-rapid,
unconsidered conventional approaches to research, approaches shaped
by the empiricist model of distance, separation, and
logical-deductive proofs as the route to both interpretation and
understanding. As Carolyn Matalese puts it: "The interpretive act
is replacing the objective gaze."[4] Narrative as socially
constructed knowing is an interpretive act. Assessment grounded in
narrative is thus repositioned as a "reflexive activity that seeks
to know."

                     CONTEXT AND BEGINNINGS 

FINDING THE QUESTIONS
Assessment usually begins with some set of questions, some inquiry
that promises to provide increased expertise in a given
conversation or set of conversations. It is not always easy,
however, to get to the questions. The questions for this project
gradually developed over the first year. A set of four principles
to explore (knowledge base, critical skills, feminist pedagogy, and
personal growth) was transformed in these conversations.

The project began locating questions, using free writing as a
technique for narrating both resistance and what people wanted to
know. Each participant did a "free write" on questions of personal
concern. Understanding free writes as a process for narrating one's
own thinking, participants generated questions to pursue. The
resulting material demonstrated that this was a narrative "way of
knowing" that, although informal, securely located our work
together in a conversational, narrative mode.

Three questions summarize what needed to be asked at that
particular moment in the process of developing assessment designs:
* What do I want to know?
* Why do I want to know it?
* Who is the audience for my assessment?

We struggled to find a way to address these questions that crossed
the boundary from the general to the particular, that admitted
value-laden and program specific concerns. This is a critical
moment in creating innovative assessment. "Passionate questions"
can get fished out of the dominant discourse, whether that
discourse is about how "good" assessment proceeds or about whether
a marginal academic program can withstand comparison with other
discipline programs and the mainstream curriculum as a whole, or
about what assessment questions themselves have to be.  

At best, this moment for locating questions of genuine concern
allows the questioner to position herself in relationship to the
process about to begin. For many academics, assessment is a
troublesome issue swimming somewhere in the backwaters of the
academy's true purposes: scholarship, teaching, maybe service.
Recent calls for new definitions of scholarship aside,[5] the most
frequent resistance to program assessment was that it was not only
an uninteresting activity but that it also was quite unrelated to
the reward system of academia. In fact, assessment often is as
unrelated to those rewards as is participation in women's studies.

Two general purposes can be served by the "free write" approach.
First, it is a chance to locate questions of genuine concern.
Second, it is a way to begin a process of locating the assessment
effort and design in some relationship to the dominant culture.
Such a location can identify boundaries as clarity emerges about
which conversations in the academy the questions will address and
which they will not. The University of Colorado found that in this
process their inquiry had a new purpose: "more descriptive than
evaluative."[6] Previously, their women's studies program had
dutifully responded to a state-wide mandate for assessment with
"compliance rather than enthusiasm." In contrast, their new
conversations brought them to an inquiry that intrigued them
because it promised a conversation about things that meant
something to them.

Similarly, the University of Missouri was subject to state-mandated
assessment. Their faculty members viewed assessment "primarily as
a weapon to be used against them." Missouri's free write helped
them realize that pedagogy was at the heart of their inquiries--
which, in turn, helped them to initiate this assessment design with
a series of faculty development workshops.
 
Free writes were not the only question-locating activity. The
University of Colorado's conversations had begun at potluck dinners
with students to discuss what they learned. Old Dominion
University, Hunter College, the University of Missouri, and others
organized women's studies retreats or faculty development days
which became forums for a similar narrative activity. In each,
people seeking to understand something about their program gathered
to explore in conversation what that something was, understanding
that the process of voicing and discussing what mattered would be
a process of socially constructing a set of concerns to explore. In
these meetings an inchoate transformation of the assessment process
began, one reflected in the very etymology of the word. The goal
became much more to assess in the sense of sitting beside
(assession) than in the word's more dominant sense of fixing or
apportioning value.[7]

What began to emerge at each site were descriptions of particular
inquiries within each of the institutional contexts. What the
participating programs wanted to know took on a flavor of
particularity and context-specific concerns. For example, Colorado
found a core inquiry: "From the standpoint of student learning,
what do we actually do?" With this question, they "located" their
concern, detailing both a particular perspective from which to view
their program and a set of activities to examine. Oberlin College,
on the other hand, wanted to
look at "some of the distinctions and tensions, as well as the
commonalities, among students and faculty members of diverse
racial, ethnic, class, gender, and sexual identities." They
therefore emphasized "positionalities from which learning and
teaching occur." Lewis and Clark College and Old Dominion
University had fundamental questions about the knowledge students
acquire in gender studies and women's studies classes. Each
program's questions grew in a way that was appropriate and
manageable, shaped by a narrative peculiar to the culture,
concerns, and constraints of that institution and program while
still relevant to the larger conversation of the project.

It is this final point about beginning narratives as
question-locating activities that must be stressed. If narrative as
a way to know and become expert is solidly grounded in relationship
and in socially constructed discourse communities, it will always
bear the mark of individuality and specificity and frequently, as
one faculty participant observed, will seem messy--undetailed and
not amenable to easy generalization.

EXPANDING THE QUESTIONS TO A DESIGN
The mess itself can seem overwhelming: too disordered, too complex,
and too embedded in a dialogue among the convinced. A second set of
questions, which the projects themselves turned toward as part of
their conversations, can lead out of this morass. Colorado's
question about student experience developed in response to
questions they had generated. Old Dominion, having located
particular areas of learning to question, developed small
discussion groups to bring greater detail to those questions.
Generally, for all sites, the questions at this point were:

* How can we find out what we want to know?
* Who are our best informants?
* Who is the audience for our assessment?

Often briefer than the preceding dialogues, the
conversations addressing these questions begin the process of
zeroing in on significant ideas, on who can help to develop those
specific ideas, and on an imagined conversation those ideas can
promote. In addition, the second question prompts a specific turn
to assessment methods.

By this point, the ongoing narratives had moved beyond the typical
questions of research validity, reliability, and universality to
critical moments of individuation. The programs all found their
concerns turning toward their own developing narratives and toward
what had emerged as meaningful for them to explore. To some extent,
this left behind many of the previous concerns, especially
conversations about their programs' marginality. This move helped
to diminish the idea that assessment would lead toward some
definitive response to the majority culture.

There are many ways to understand the nature of this critical
juncture. It can be seen as what Audre Lorde so aptly describes as
the fact that one can't dismantle the master's house using the
master's tools.[8] Or it can be seen as a particular stage in a
process of feminist phases of curriculum development, where the
epistemological challenge to dominant ideologies and practices is
explicit and worthy of development on its own and as a route to
transforming epistemological frameworks altogether. Or it can be
understood as a time when the narrative process is uniquely
balanced with the process of listening, when the questions
themselves essentially create their own context and other contexts
grow paler, less figural, more background. Finally, it can be
understood as an essential moment in which each program left behind
defining feminist assessment and instead took up actually doing it.
They turned their attention to creating designs that were
appropriate and to understanding that, in some way, what made those
designs feminist was that they were being done by feminists, with
processes they identified as feminist.

Regardless of which analysis fits best, one general outcome of this
moment is that the concern with audience seems to abate. That is,
the answer to the audience question became more
like "Let's wait to see what we have to say before we decide who we
want to talk to." Conversation turns to narrating what is of real
concern. "We welcomed the opportunity to pause and focus on student
learning," explains Lewis and Clark. It seems likely that without
this change in the conversation, most innovation, experimentation,
or maybe any assessment dies on the way to being born, silenced by
imaginary critics before any data have been collected. The audience
has been considered an integral part of the conversations to this
point. Now the audience must be ignored if one is successfully to
get to the question--"How can we find out what we want to
know?"--and create responses to that methods question in a way that
honors the preceding conversations.

Thus, innovative methods do not spring full-blown from a set of
goals or objectives. Achieving unique or innovative ways of inquiry
requires creating conditions that support the endeavor. Notably,
those conditions echo the principles detailed in Joan Poliner
Shapiro's chapter, "What is Feminist Assessment?" Grounded always
in narrative exploration, this approach is participatory. It
redefines the relationship between the researcher and the
researched. It is value-laden, relational, context-specific,
collaborative, and concerned with process. Innovative methods not
only emerge from the dialogue, the narrative of each program, and
its concerns, they also enhance that dialogue. In so doing, they
often lead to a revised relationship between not only subject and
object but also process and product. The assessment designs that
emerged manifested these revisions.

                     THE ASSESSMENT PROCESS

HOW CAN WE FIND OUT WHAT WE WANT TO KNOW?
There is an old adage floating around among certain researchers,
particularly those devoted to narrative, that goes something like,
"Don't collect more data than you want to analyze." One of the
important things learned from the "Courage to Question" project had
to do with the courage to question conventional research design and
analysis strategies. Growing out of a general concern with
narrative as an epistemological framework and a productive process,
both the development of designs and the analyses of data took on
principles of narrative-based inquiry: a concern with stories, an
assumption of the relevance of experience, and a willingness or
courage to examine both the nature and the outcome of conversation.
At a time when the case study (and the portfolio as a specific,
individualized case study) is beginning to be viewed as a
productive line of inquiry and research, the project joined those
efforts either in fact or in spirit--with a concern for locating
ways to examine what, in Patton's terrific phrase, "is actually
going on."

This section explores what some of those innovative designs were
and the analyses they supported. However, a word of reminder is
required. Adopting any of these approaches without reference to
context and without the supportive and explorative conversations
that preceded them will, at best, limit any salutary outcome
and, at worst, diminish the value of the approach and render the
analysis irrelevant or damaging.

METHODS AND INSTRUMENTS
As the campuses turned to the question of how to find out what they
wanted to know, they turned to an array of "instruments" for that
purpose. With a set of defined concerns--referred to as goals or
questions or areas to explore--each women's studies and gender
studies program began to examine and locate effective and
parsimonious ways to collect data. The questions at this point echo
some of the earlier ones but now with a different end in mind:

* What do I want to know?
* Who are the best informants?
* How will I find out?

On the whole, the data collection instruments of the "Courage to
Question" project do not, in and of themselves, represent radical
departures from conventional practice; given the preceding work,
however, the meaning of both their use and the processes for
analysis hang close to narrative principles. New uses of standard
practice can count as innovation.

Standard practice would dictate the use of designed instruments
such as questionnaires and interview protocols for either
individual or focus group interviews, as well as unobtrusive
measures such as observation or the use of existing documents for
analysis. All of these were used in the project, but the
combinations of them within the particular contexts created
distinct assessment designs in each case. Moreover, the usual
distinctions between designed or unobtrusive measures, as well as
between quantitative and qualitative analysis, diminished. Most
sites did some of both, and all found unique ways to collect and
analyze data. In each case, the narrative approach flourished as
student experience was placed at the center of the inquiry. While
other people's experience--particularly faculty members'--was
examined, all seven campus-based projects focused their inquiry on
some variation of the core question about what students themselves
experienced in the women's studies and gender studies programs. 

Many projects developed their own questionnaires. Wellesley, Old
Dominion, and Lewis and Clark created questionnaires to explore
alumnae/i's experience of the program and their learning in it.
Questionnaires also were developed for students in the program, and
sometimes for control groups of students not in the women's studies
program, to inquire how they learned, the particular nature of the
knowledge gleaned, and what kind of classroom dynamics were
operating.

Old Dominion, for example, created a pre- and post-test for a group
of classes to question what knowledge-based learning was occurring
on particular vectors, such as the social construction of gender,
interlocking oppression of women, women's varied relations to
patriarchy, social construction of knowledge, or women's power and
empowerment. Like many of the other campuses, Old Dominion not only
pursued these questions through the tests and questionnaires but
also began looking at unobtrusive measures--data generated in the
paper-laden, test-laden, routine processes of the academy. In these
cases, the generation of data is not a problem. It occurs in the
daily processes of the classes and programs. Creating ways to
access that data and analyze it is the challenge.

This was a challenge that many took on, in some cases choosing
unobtrusive measures that were especially creative. At least three
of the campuses examined existing processes for honoring student
work and analyzed those works for varied purposes ranging from
content to critical reasoning to knowledge base questions.
Questions often were embedded in course exams. Colorado students
were asked to compile a portfolio of their work in the women's
studies program and conduct an analysis of what they had learned.
Old Dominion asked a question about students' friendships, both at
the beginning and end of a semester of classes. A number of
programs analyzed student papers or journals to determine the
nature of knowledge demonstrated. This attention to students'
language and the naturalistic narratives they produce in the
course of a semester or a year reflected an ongoing concern with
paying attention to the meaning being constructed by students and
the kind of learning that meaning represented.

Another approach to locating and listening to students' meaning
emerged in focus-group interviews. At Hunter College, a graduating
senior conducted these interviews with a cross-section of women's
studies students and graduates as her final project. As mentioned
above, a number of sites used essentially focus-group interview
strategies to generate questions to pursue or to find out how
students or faculty members were experiencing the program.
Wellesley conducted both telephone and face-to-face interviews with
students, alumnae, and faculty members to explore in greater depth
what their open-ended questionnaire asked.

Observation also can be viewed as a relatively unobtrusive measure.
Also involving a student as primary investigator, Colorado
undertook to observe both women's studies classes and comparable
classes in the humanities and social sciences. Grounded in the
narrative-friendly approach of "illuminative evaluation," in which
there is a process of "progressive focusing" on the questions at
hand, the classroom observations included three components:
content, structure, and dynamics. Hunter undertook a less formal
observational approach by examining existing practices and their
use. Looking at recently awarded student prizes, students'
choice of internship sites, student fairs, and student
organizations, they evaluated the extent to which their goal of
achieving a multicultural emphasis in the women's studies program
was apparent in these activities.

Overall, what seemed particularly significant in the data
collection phase of the process was that each program found a way
to collect information that was minimally disruptive and not too
time consuming. Given busy schedules and the demands of teaching,
this was absolutely necessary. Each program was quite capable of
creating a highly complex assessment design that could have been
accomplished only with unlimited time and funds. What is
exemplary and laudatory is that the data actually collected seemed
both negligibly intrusive and unimaginably rich.

MAKING MEANING: THE PROCESS OF ANALYZING RESULTS
While most of the data collection activities were essentially
narrative in practice, reliant upon conversations and writing, the
narrative process for constructing meaning flourishes as the data
analysis begins. At the heart of data analysis lies a process of
making meaning, of looking at a set of complex or confusing
materials and beginning to discern nuggets of insight, a sense of
what matters and what is happening and ideas for further research.

It is difficult to convey the richness of the ideas that emerged
without reading the reports of the seven individual projects in
their entirety. As attention turned to data analysis, there often was
again a rich conversation to support it. Preliminary reports,
themselves frequently written collaboratively, typically were taken
back to a group of faculty members or faculty members and 
students. Such collaboration among analyzers was noted by many as
an opportunity to further both professional work and personal
relationships as well as to refine program designs, curriculum, or
pedagogy.

In some cases, data analysis was seamless with data collection.
Undertaking a fairly recent innovation in the coding of qualitative
data, many did "multiple readings" of the same set of data, so that
one set of materials could be viewed from multiple
perspectives.[9] Wellesley, for instance, collected quantitative
data comparing women's studies and non-women's studies courses.
They then went back to the student questionnaires to reinterpret
the numbers on the basis of the more extended short-answer
narratives students included as part of their response. Lewis and
Clark gathered journals, papers, exams, and questionnaires to
examine the intellectual progression through which students
incorporated a set of knowledge plots in gender studies. They then
reviewed those same sources to look at the pattern of responses
from male students versus that of female students. As in all good
conversations, multiple readings recognize that meaning is
multilayered; only the opportunity to "replay" the conversation,
listening for the different themes, both captures and honors the
complexity.

What seems most significant in the process of data analysis is
twofold. First, there is a process of bringing certain
questions to the fore in looking at any materials. A number of the
campuses began with some quantitative analysis--often as a starting
point and particularly as a way to frame comparative questions
about women's studies courses or students in contrast to
non-women's studies courses or students. Some developed a way to
code data for particular components. For example, Lewis and Clark
developed a series of coding sheets to inquire of class syllabi how
much gender-based, socially constructed knowledge was integrated
into the courses. Similarly, they "scored" student papers for
knowledge base, as did Old Dominion with student exams and papers.
But even if data analysis did not start with a simple or
single analytic technique, all of the sites moved toward
illuminating particular questions of concern--toward examining the
material with some a priori questions and some that emerged as the
analysis progressed.

The second aspect of this kind of analysis is that narrative
inquiry again becomes both a salient and informative procedure.
Just as the early conversations constructed meaning in context, so
does data analysis construct meaning from the data/narratives. The
process is dialectic, emergent, exploratory, and sometimes
described as "soft" as opposed to "hard." By staying close to one's
questions and the material, a clearer picture or fuller story
begins to emerge.
 
RETURNING TO THE CONVERSATION
It is those compelling stories that are told in the individual
reports. Each campus found it had particular things to say about
the strengths and challenges in its program. For some, the data
analysis moved them back into a conversation with a particular
literature. Colorado's report examines the contribution of women's
studies content, structure, and dynamics in the context of the
current literature on the quality of undergraduate programs.
Wellesley suggests how some of their findings reverberate with
"different voice theory" of women's development and learning, while
Old Dominion examines their students' cognitive developmental
position and the impact it has on interpreting not only the
students' experience but also the data they examined. In short, the
interpretive act is powerfully foregrounded in all the
analyses--sometimes in confident statements grounded in data,
sometimes in further questions to pursue.

But the final reports are not, like women's conversational style,
peppered with tag questions and open-ended, hypothesis-generating
statements. Clearly the inquiry has led to significant
recommendations in all cases: recommendations for further study to
be sure, but also specific recommendations such as sequencing or
prerequisites for particular courses of study; re-envisioning
involvement of students; and pedagogical refinements to ensure more
connected learning, in terms of both active involvement and
personalized learning. Recommendations are made that address all
levels of the academic project: teaching, curriculum, and
co-curricular activities.

In addition, there is a series of outcomes which, in a more
conventional approach to assessment, might be ignored. Yet we would
argue they are critical to the health and well-being of the
institutions, the participants, and the assessment process itself.
A number of institutions observed that the mere intention of
undertaking assessment and the energy put toward it spawned a
renewed vitality. Student groups that had been dormant revived and
began operating again. In some cases the process of identifying
alumnae/i led to a revival or creation of an alumnae/i group,
though the process was not always straightforward or simple (on one
campus it took a year to get the registrar's office to state
definitively that it had no way to access information about
graduates' majors). Most notably, a number of places that did not
have active conversations among faculty members found that the
assessment project fostered lively and ongoing discussions; those
faculty members vowed to maintain and continue both the
conversation and the assessment.

In its best form, assessment contributes to the life of an academy
in a way that promotes further thought, interpersonal and
professional connections, and enhanced student development
opportunities. To begin any assessment project is to enter into a
conversation about all the important issues in education: What are
we hoping students learn (goals)? How do we arrange things so they
learn that (curriculum, pedagogy, requirements)? Do we think it is
happening, and, if not, how might it happen better (evaluation)?

If the recent calls for renewed vigor and attention to teaching are
to be taken seriously, the move to assessment must support that
effort. The means of assessment will always be shaped by the ends
it is intended to accomplish or address. When it is grounded in a
conversation, and when that conversation starts with having the
courage to question not only what we do but also what we think we
do, it will become a rich dialogue about the nature of learning,
about the nature of knowledge, and particularly about the insights
that programs struggling on the margins can offer about the
limits of practice at the center. As the "Courage to Question"
project participants came to the end of their reports, it was clear
that they were prepared--even eager--to rejoin the conversation.
The audience for their insights had become clear, though different
at each site, and without a doubt they will continue to have rich
narratives and important contributions to make in that dialogue.



1. See Carol Gilligan, Annie G. Rogers, and Deborah L. Tolman,
Women, Girls, and Psychotherapy: Reframing Resistance (New York:
Haworth Press, 1991); Sandra Harding, The Science Question in
Feminism (Ithaca, N.Y.: Cornell University Press, 1986); Frances
Maher and Mary Kay Thompson Tetreault, Inside Women's Classrooms:
Mastery, Voice, Authority, and Positionality (forthcoming);
Elizabeth V. Spelman, Inessential Woman: Problems of Exclusion in
Feminist Thought (Boston: Beacon Press, 1988).
2. Kenneth A. Bruffee, "On Not Listening in Order to Hear:
Collaborative Learning and the Rewards of Classroom Research,"
Journal of Basic Writing 7 (1988): 12.
3. V. Turner, "Social Dramas and Stories about Them," in On
Narrative, W. J. T. Mitchell, ed. (Chicago: University of Chicago
Press, 1981), 164.
4. Carolyn Matalese, "Feminist Pedagogy and Assessment"
(presentation at the American Association for Higher Education 1992
Assessment Conference, Miami, June 1992).
5. Ernest L. Boyer, Scholarship Reconsidered: Priorities of the
Professoriate (Princeton, N.J.: The Carnegie Foundation for the
Advancement of Teaching, 1990).
6. This and subsequent quotations describing the project come from
the campus reports. 
7. See the Oxford English Dictionary.
8. Audre Lorde, Sister Outsider (Trumansburg, N.Y.: The Crossing
Press, 1984), 112.
9. Lyn Mikel Brown, ed., A Guide to Reading Narratives of Moral
Conflict and Choice for Self and Social Work (Cambridge: Harvard
University Graduate School of Education, 1988).