CHI 97 Electronic Publications: Papers

Participatory Analysis: Shared Development of Requirements from Scenarios

George Chin Jr., Mary Beth Rosson and John M. Carroll
Computer Science Department
Virginia Polytechnic Institute and State University
Blacksburg, VA 24061-0106 USA
+1 540 231 6931
chin@csgrad.cs.vt.edu; rosson@vt.edu; carroll@cs.vt.edu

ABSTRACT

Participatory design typically focuses on envisionment and evaluation activities. We explored a method for pushing the participatory activities further "upstream" in the design process, to the initial analysis of requirements. We used a variant of the task-artifact framework, carrying out a participatory claims analysis during a design workshop for a project addressing collaborative science education. The analysis used videotaped classroom sessions as source material. The participant-teachers were highly engaged by the analysis process and contributed significantly to the analysis results. We conclude that the method has promise as a technique for evoking self-reflection and analysis in a participatory design setting.

Keywords

Participatory analysis, participatory design, scenarios, task-artifact framework.

© Copyright ACM 1997



INTRODUCTION

System developers employ a variety of techniques to elicit requirements from users; these include interviews, observations, and artifact analysis [8]. In these techniques, users typically play the role of informants, not initiators: they provide information, but do not do the analysis. In this sense, traditional forms of requirements analysis involve but fail to empower the user during the most formative stages of software development. This can limit both the quality of the requirements analysis itself and the effectiveness of user-developer collaboration over the entire course of a project. We are exploring participatory analysis techniques aimed at increasing user participation from the earliest stages of requirements analysis.

Our analysis approach is built on the philosophies and objectives of participatory design. These objectives include mutual learning between user and designer, application of design tools familiar to users, envisionment of future work situations, and grounding of analysis in the practice of the user [3].

CAN TEACHERS WRITE CLAIMS?

We are working with a scenario-based design framework called the task-artifact framework (TAF, [2]). In this method, system development starts from an analysis of implicit requirements embodied in user tasks. The TAF describes an iterative model of analysis and design: the analysis of requirements scenarios guides the development of envisionment scenarios. In this framework, the question of direct user participation in requirements analysis becomes an investigation of users participating in the analysis of their own usage scenarios. Can users contribute to the identification of implicit requirements embodied in their own current practices, and if so, how?

Scenario-based design generally facilitates user participation in system development: scenarios are informal, evocative, work-oriented, can be sketchy or highly detailed, and are equally accessible to various stakeholders in a design. Perhaps most important, the scenarios belong to the users; they describe and exemplify the users' own practices. This transforms the user from the recipient or consumer of the system development process into an expert participant.

A variety of scenario-based design techniques have emerged that allow users to participate directly in the design and formative evaluation of early prototypes. In Kyng's [5] cooperative method, users and developers collaboratively specify and refine design details as users simulate work activities by stepping through interaction scenarios using mock-ups and prototypes. In PICTIVE (Plastic Interface for Collaborative Technology Initiatives through Video Exploration, [7]), users and developers collaboratively construct informal prototypes using low-tech accessories such as markers, Post-It notes, scissors, and tape.

In general, participatory design techniques have focused on system development activities that presuppose initial prototypes [5] or that address prototype development very concretely (for example, as user interface design, Muller et al. [7]). Little research has been directed at techniques for user participation in earlier phases of design, such as requirements analysis. Muller et al. recognize this deficiency as they write, "the major failing of PICTIVE is that it often encourages detailed design, without supporting a critical participatory analysis of higher-level issues."

Although scenarios are most frequently used for envisioning designs, it has been argued that they can play an integrating role throughout the software development process [1], including requirements analysis:

People using current technology can be directly observed to build a scenario description of the state-of-the-art and to ground a scenario analysis on what subsequent technology might be appropriate: the requirements scenarios embody the needs apparent in current work practice. [2, p. 7]

The TAF tries to push scenario analysis to Muller's higher level. It attempts to articulate implicit relationships between features in a situation and consequences for users in that situation as "claims": for example, "a blinking icon quickly captures the attention of the user in the event of an emergency condition." The feature of this situation is a blinking icon; the consequence is quick notification.
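As an illustration only (not part of the published TAF), a claim of this kind can be modeled as a small record pairing a feature with its hypothesized consequences; the con listed below is our own hypothetical extrapolation, not from the paper:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Claim:
    """A claim links a feature of a situation to consequences for users."""
    feature: str
    pros: List[str] = field(default_factory=list)  # desirable consequences
    cons: List[str] = field(default_factory=list)  # undesirable consequences

# The blinking-icon example from the text, with a hypothetical downside added:
alarm = Claim(
    feature="blinking icon",
    pros=["quickly captures the user's attention in an emergency condition"],
    cons=["may distract the user during routine, non-emergency work"],
)
```

Organizing claims this way makes the later design step concrete: each con is a candidate problem for a new feature to mitigate, and each pro is a property the redesign should preserve.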

This paper describes a case study that applies participatory analysis in the design of a collaborative learning environment for middle and high school students. Thus, our central question is whether public school teachers can participate in claims analysis.

PARTICIPATORY ANALYSIS IN THE TASK-ARTIFACT FRAMEWORK

A basic notion of the TAF is that any particular system development project takes place in a wider context of technology development in which human artifacts and human tasks co-evolve. The artifacts-in-use of current technology embody affordances and constraints for human activity; at the same time, the tasks people engage in embody requirements for further technology development. Understanding artifacts in terms of the scenarios of use they enable and obstruct makes it possible to more deliberately manage the co-evolution of tasks and artifacts.

The TAF has four stages (see Figure 1).


Figure 1. Iterative analysis and design in the TAF.

The TAF is a cyclic integration of the processes of analysis and design. Claims analysis "seeks to get at just how the artifact suggests to users that they do something one way or another, how it supports and fails to support their efforts, and how it signals progress and error [2]." The products of claims analysis (i.e., claim schemas) organize analysis data in a way that is conducive to interpretation and design reasoning.

Design occurs through the envisionment of features and new scenarios. During design, we scrutinize our claims and reason about how to create new features or refine our original features to improve the consequences. For integration and validation of features, the envisioned features are then attached to particular instances of use as they are inserted into envisionment scenarios. The results of the design process are envisioned features and scenarios which may again undergo claims analysis.

We extend the TAF by fully involving users in the analysis and design process. We wanted to investigate whether the TAF is accessible to persons who are not human-computer interaction (HCI) specialists. More generally, we wanted to investigate whether requirements analysis, working from scenario and claims representations, could be carried out as a participatory design activity.

We reasoned that the TAF is just an incrementally more systematic variant of the decision processes people engage in day-to-day. When people shop, they perform claims analysis as they assess the features of products: how will this automobile do in the ice scenario? Are the tires wide enough for the beach? Claims analysis merely asks that the reasons why a specific feature is good or bad be explicitly elaborated.

Extending the TAF to a participatory approach had many ramifications. We needed to codify "raw" user practice to make it more discussible by developers and users. Thus, we collected a substantial sample of videotaped observations of classroom situations, but made no a priori identification of artifacts and features. We wanted to defer the analysis of features and consequences in order to take full advantage of the special expertise of those who understand the work context, namely the users.

CLASSROOM CASE STUDY

Our research is performed as part of a larger educational technology project funded by the National Science Foundation's Networked Infrastructure for Education (NIE) program. The project is coordinated by Virginia Tech and Montgomery County (VA) Public Schools. A primary objective of the project is to develop and evaluate computer-based, collaborative learning tools and environments in support of middle and high school physics education. Participants in the project include teachers and students from four Montgomery County schools and computer science and education researchers from Virginia Tech. We decided to adopt a participatory design approach, since we knew from prior work that the teachers we are working with have a great amount of domain knowledge that no one else on the project team has [6][9].

The requirements analysis case study is drawn from a two-week summer workshop which included all project participants. Activities in the workshop included demonstrations and discussions of commercial and research learning environments, consideration of various proposed user interface metaphors, and planning discussions for baseline and post-intervention evaluation of student skills and for classroom projects and activities for the 1996-97 school year. Time was specifically allocated during the workshop for participatory analysis and design.

Participatory analysis and design took place over three sessions. Participatory analysis occurred over two half-day sessions, in which we analyzed videotape recordings of classroom situations gathered during the preceding semester. Participatory design occurred for two hours in a third session. In the following, participatory analysis includes scenario generation and claims analysis, while participatory design refers to feature and scenario envisionment (see Figure 1).

Participant Roles

Our project team divided into three constituencies, which corresponded to three roles. The team included four teachers (one was not present during the first analysis session). The teacher role contributed understanding of classroom activities and context. The teacher could assess how elements of current and envisioned situations matched classroom needs.

The team also included four technologists, computer science researchers who were responsible for the technical infrastructure for the project (one of these team members was not present for the final design session). The technologist role contributed an understanding of and interest in computer technology and system development, including new technology not previously applied in a classroom setting. The technologist could assess what features of a design could realistically be developed and under what constraints.

Finally, the team included four HCI designers (one HCI designer was also not present for the final design session). The HCI designer role contributed an understanding of user interface technology, and of the TAF method we were applying. The HCI designers were able to facilitate the participation of both teachers and technologists as well as to participate themselves. Also present were two students who provided videotaping and recording support and occasional contributions to the discussion; a public school administrator joined us for part of one analysis session.

In general, teachers played the teacher role, system developers played the technologist role, and HCI researchers played the HCI designer role. During the course of the analysis, however, participants sometimes changed their roles. One of the teachers was particularly interested in computer technology, and sometimes played the technologist role. Some of the system developers were also experienced university professors, and provided input into the analysis process as teachers. HCI researchers were both college professors and system developers; they contributed within those additional roles during the analysis.

We considered this fluidity of role assignment to be a positive factor. We had been concerned going into this investigation about potential inequalities among the roles: it is often alleged within the CHI community that HCI designers are sometimes subordinated to technologists in design disputes. We worried that this might be even more acute in collisions between teachers and technologists. None of this materialized. Perhaps because many of the people on our project team played multiple roles, there was obvious deference to the principal roles: the teachers were the final authority on pedagogical activities, the technologists were the final authority on technology, and the HCI researchers were the final authority on user interface and design methodology.

Data Collection and Scenario Selection

The classroom scenarios used as source material in the participatory analysis were extracted from videotaped observations of classroom activities. Since our aim is to develop a collaborative learning environment for science experimentation, the focus of our observations was students working on experiments together in groups. We also recorded other activities related to the experiments including teachers' lectures, students forming into groups, class discussions, and group presentations.

We supplemented videotaped observations with other forms of data including fieldnotes taken during observations, videotaped interviews of students and teachers, and classroom artifacts such as textbooks, lab instructions, lab assignment sheets, data worksheets, lesson plans, and homework problem sets. In the interviews, we asked students and teachers general, open-ended questions on collaboration, experimentation, and pedagogy. The interviewees were encouraged to elaborate their views and to take the discussion in any direction they wished. Classroom observations, interviews, and artifacts were collected over a period of three months from the four schools participating in the project.

Over the period of our data collection, we observed several lessons in each teacher's classroom. From these observations, we identified key phases of science lessons at the middle and high school levels. In the middle schools, a science lesson often consisted of many small, diverse learning activities such as brainstorming, physical experiments, demonstrations, class discussions, roleplaying, and student presentations. In the high schools, the science lessons were typically focused on one complex, physical experiment. There, the lesson phases were geared towards the stages of an experiment such as equipment assembly and execution, data collection and analysis, and lab reports.

We used these lesson phases as genres. For each genre, we searched for rich episodes of interaction among the teacher and students. We also examined the videotaped interviews to find topics and themes that were important to teachers and students, and then tried to find occurrences of these themes in the classroom videotapes. For example, one teacher expressed a strong commitment to discovery learning: we looked for and found evidence of this teaching style in the videos and selected the relevant episodes.

There were also episodes that we simply found interesting and worthy of analysis. For example, during a high school experiment, a confrontation occurred between two male lab partners vying for control over the same piece of apparatus. A female lab partner resolved the conflict by suggesting and enforcing a turn-taking policy. We felt that this episode was rich in the issues of gender, leadership, and conflict, and would incite good discussion and analysis.

Once a set of episodes was selected for participatory analysis, we transcribed the selected episodes. We referred to our collected artifacts to fill in the details of the scenarios such as the specific instructions that students were given for an experiment, the homework and lab problems that students were to address, and the worksheets students used to organize their experiments.

Ultimately, we prepared two classroom scenario sets. One set described activities surrounding a middle school experiment on waves (Figure 2a). The other described activities from a high school experiment on inelastic collisions (Figure 2b). Each scenario set consisted of one overview scenario and several more focused scenarios. The overview scenario described the chronological teaching activities of the lesson as well as its pedagogical context and goals. Focused scenarios narrated specific interactions, collaborations, and events occurring during the lesson.

A typical concern of designers using any scenario-based design approach is whether the selected scenarios provide reasonable coverage of the tasks and artifacts that occur within a system or situation. In these initial sessions, coverage was not a major concern. Our goal was to get the teachers interested and involved in the analysis, so we selected scenarios that were motivating and revealing, that contained interesting interactions and activities, and that would serve as a good source of discussion among the teachers and technologists. We will evaluate scenario coverage in future analysis sessions.

Participatory Analysis of Classroom Scenarios

The classroom scenarios were presented to participants in two different forms. Prior to the participatory analysis sessions, the scenarios were transcribed from the collected data and delivered to all participants in a textual form. We asked all participants to review the written scenarios in preparation for the analysis sessions. We also prepared video presentations of the scenarios from our classroom videotapes. These video scenarios were presented to participants at the beginning of the analysis sessions.
a) Middle school lesson on waves
  • An overview of activities for learning about waves is presented; these include teacher-student brainstorming, a physical experiment, group presentations, roleplaying, and a teacher demonstration.
  • While a group of students is running an experiment, the teacher provides conceptual guidance.
  • Students in a group assume different roles during an experiment. The group experiences difficulty in coordinating among the different tasks and roles.
  • During group presentations, the teacher leads a hesitant group through its presentation.
  • Teachers and students enter into class discussion. The discussion centers around a ball floating on the ocean.
  • Students roleplay as particles in a liquid and in a solid.
  • The teacher demonstrates the speed of sound through different media using musical tuning forks.
b) High school lab on inelastic collision
  • An overview of activities associated with a lesson on inelastic collisions is presented. Activities include a lecture, a physical experiment, and a lab quiz.
  • Members of a group coordinate to assemble the lab equipment prior to the execution of the experiment.
  • Through trial and error, a group makes several attempts at running the experiment before achieving a successful run.
  • Collectively, a group diagnoses a mechanical problem it encounters during a run of the experiment.
  • A largely uninvolved group member forcefully raises an issue regarding the experiment with the rest of the group.
  • In the midst of the experiment, group members vie for control over the direction of the experiment.
Figure 2. Classroom scenarios.

We had several motivations for using the video scenarios. First, we wanted to provide a concise record of the scenarios to jog participants' memories. Second, in the (inevitable) event that a participant did not review the textual scenarios, the video excerpts would quickly familiarize the participant with the scenarios so that s/he would be able to contribute to the discussion. Third, we wished to remind teachers that the scenarios were simply the activities of their everyday lives, not something foreign that had been analytically derived for the purpose of design.

When scenarios are transcribed from videotapes or from notes, they become abstractions of the raw data. The transcriber cannot capture all the details present in the setting. As a result, written scenarios seem more abstract and symbolic, and less real than video scenarios. When teachers are presented with a video scenario, they see the scenario in the same form in which they experienced the scenario in their working lives. The scenario becomes more tangible and more vivid.

At the beginning of the participatory analysis sessions, we gave participants a brief introduction on how to perform claims analysis. We asked the participants to identify any features of the classroom scenarios they found interesting. We described a 'feature' as something that captures the observer's attention or something that affects the way a person teaches or learns. For each feature, we further asked participants to elaborate both what is good or desirable ("pros") and what is bad or undesirable ("cons") about that feature. Finally, we asked participants to extrapolate beyond the context of the particular scenario when brainstorming about the pros and cons of a feature.

During the introduction and throughout the participatory analysis sessions, we used terminology that was familiar to all participants. For instance, we refrained from using technical terms such as "claims analysis," "claims," and "consequences" and instead used more generic terms such as "features" and "pros and cons."

We urged two ground rules to the participants. First, we asked that the teachers "take center stage" in identifying the features and the pros and cons from the classroom scenarios. Since the scenarios essentially recorded activities of the teachers and their students, the teachers were in the best position to understand and to reflect on the scenarios. Second, we asked that participants limit critiquing of ideas contributed by others. We wanted the analysis to be a brainstorming session with a free-flowing exchange of ideas and opinions. We wanted participants to think broadly and divergently about their activities and requirements. Issues of feasibility, criticality, and tradeoffs were to be addressed during the envisionment stage later in the design process. By fostering an open environment where all ideas were equally valuable, we hoped to encourage high engagement and involvement from all, but especially from the teachers.

As features and consequences were generated during the analysis, they were transcribed onto large paper sheets. Each claim schema was transcribed onto a separate sheet of paper. When the discussion of a claim schema was seemingly complete, the paper sheet containing the schema was taped to a wall of the room. Often, the group would return to a claim schema to modify or extend it based on discussions evoked by other claims. Sometimes, the group developed several related claim schemas simultaneously if the issue under discussion was sufficiently complex or broad. By the end of the analysis, the walls of the room were covered with paper sheets listing claim schemas.

From the outset of the participatory analysis sessions, the teachers were immediately able to generate claims. They naturally and openly spoke about the features and consequences that captured their attention in the classroom scenarios. In some cases, the teachers explicitly identified the features and consequences. In other cases, the teachers would enter into discussion about pedagogical issues such as whether discovery learning is a valuable approach for science or how groups should be formed to best support individual and group learning. The HCI designers (facilitators) allowed such discussion to take place, but tried to structure the discussion in the direction of generating claims. For example, the facilitators would ask questions such as "what are the features of discovery learning?" or "how do groups form in the classroom?" Lengthy, open discussions of this kind stimulated the creation of many related claims and claim schemas.

We illustrate the participatory analysis process through an example taken directly from the analysis sessions. Participants were shown a video scenario of a group of students collecting equipment and organizing themselves at a workbench. In the scenario, three of six students collaborated to assemble the equipment while the other three casually conversed amongst themselves. A feature we extracted from the scenario was labeled, "large group size" (see Figure 3). Eight of thirteen participants attending the first analysis session contributed to this claim schema. The cons listed for the large group size feature were directly identified from studying the video scenario. The pros, however, were not evident in the video scenario, but rather were developed by extrapolating beyond the scenario.

The two participatory analysis sessions produced 32 claim schemas: 12 for the middle school scenarios and 20 for the high school scenarios. The schemas fell into the general categories of science, lessons, experiments, physical setting, groups, student roles, learning styles, and student personalities. Discussion of one feature or claim often led to the discovery of others. For example, discussion of the "large group size" claim schema led to the discovery of claims concerning "self-selected groups" and "persistent group composition."

We videotaped the participatory analysis sessions. From the videotape, we associated each claim and consequence with the participant who originally identified it. A total of 32 features and 220 consequences were generated over the two analysis sessions. Of all the claim features generated, 48% originated from teachers (see Figure 4). Of all the consequences generated, 48% also originated from teachers. The average time spent on the analysis of a claim schema was approximately 13 minutes.
Feature: Large Group Size
Pros:
  • (p1) may provide greater input and knowledge than smaller groups
  • (p2) may handle more challenging and complex experiments
  • (p3) requires fewer workstations and equipment
  • (p4) requires less grading by teacher if group is graded as a whole
  • (p5) may allow teacher to grade in greater depth since there are fewer groups
  • (p6) is easier for teacher to provide guidance to each group since there are fewer groups
Cons:
  • (c1) not everyone in the group may be engaged (group may be too large to keep everyone interested)
  • (c2) is easier for the dominant personalities to simply take control of the experiment
  • (c3) demands greater accessibility to the equipment since the equipment must be shared among a larger number of students
  • (c4) tends to produce subgroups - one subgroup may simply take control of the experiment while the others idly stand by
Figure 3. Claims for large group size.



Figure 4. Relative contributions to generation of claims and consequences by teachers, technologists and HCI designers (contributions by student assistants and school administrator are not shown).

We noticed that teachers were likely to contribute to the discussion of a claim schema even if a teacher did not originate the claim. We examined the videotape to see for which features and consequences teachers had "significant input." We considered a participant to have significant input if s/he originated, modified, extended, or elaborated a feature or consequence. In the discussion of a lab notebook feature, for example, each teacher provided significant input by describing how students use lab notebooks in his/her class. General discussions would often lead to a richer and more comprehensive view of a feature or consequence. From our evaluation, we found that teachers provided significant input to 88% of the features and 68% of the consequences.

Given that only three of thirteen participants on the first day and four of thirteen participants on the second day were teachers, we were encouraged by the amount of teacher participation occurring in the requirements analysis. The level of participation we observed is good evidence that teachers can actively contribute to requirements analysis for new educational technology, and, more specifically, that they can create and work with claims.

Envisionment of New Technology

We were pleased with the degree of participation in the analysis sessions. However, it was also important to us to assess how useful these sessions had been. We wanted to know first whether the results of the analysis, the claim schemas, would be useful in driving design reasoning (see Figure 1). We also wanted to determine whether teachers would remain engaged and involved when we shifted the group's focus from their current classroom activities to the design of new activities. The ensuing envisionment session allowed us to evaluate these questions.

At the beginning of the envisionment session, we provided the following set of guidelines.

We asked the participants to envision two types of features: technology and pedagogy. Technology features comprise those associated with a computer or user interface. Pedagogy features comprise those associated with teaching. When designing systems, we must consider the implications that the system has for the activities it supports. The activity and the system should not be designed in isolation from one another [6].

The teachers understood this dependency between activity and system. During a discussion on student roles during participatory analysis, one teacher told the group that his students typically fell into one of four roles during an experiment: a doer, an equipment manager, a leader, and a recorder. Later, during participatory design, a "defined role" technology feature was suggested. This feature was intended to fix roles or tasks that students perform during a science simulation. The teacher who originally described the four student roles realized that the roles were less applicable for experiments run on computers. After some thought, the teacher came up with three new student roles, "keyboarder," "mouser," and "observer." The new roles related to the sharing of the input devices to a computer.

Referring back to Figure 3, we may demonstrate how reasoning about the "large group size" claim contributed to a feature envisioned during participatory design. Based on other claims concerning group organization, a participant suggested a new feature, a "group selection framework." This technology feature was described as a computer-mediated grouping tool that would allow students to organize into groups with some teacher control. By examining the pros and cons of the "large group size" feature, the group began to elaborate the capabilities or functions of the proposed feature. For example, to mitigate the con that a large group size (con 'c2' in Figure 3) may encourage dominant personalities to take over, a participant suggested that one function of the group selection framework might be an ability to form groups based on student personality profiles. This way, the teacher could define groups such that a dominant personality is never paired with a weak personality.
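The personality constraint described above can be sketched as a simple validity check over proposed groupings; the profile labels, function name, and filtering scheme here are hypothetical illustrations, not features of the actual tool:

```python
def group_is_valid(profiles):
    """Reject a proposed group that pairs a 'dominant' student with a 'weak'
    one, reflecting the teacher-defined constraint discussed in the session."""
    return not ("dominant" in profiles and "weak" in profiles)

# A teacher-configured grouping tool might filter candidate groups like this:
candidates = [["dominant", "average"], ["dominant", "weak"], ["weak", "average"]]
acceptable = [g for g in candidates if group_is_valid(g)]
```

In a real grouping tool, such checks would presumably be one of several teacher-configurable rules alongside skill-based and size-based constraints.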

Details of all proposed features were discussed and elaborated in this way. Figure 5 summarizes the results of envisioning the group selection feature. Seven of the thirteen team members present for the design session contributed to this envisionment.

A total of 17 features and 24 elaborations were envisioned during the two-hour design session. Of the new features, 41% were proposed by teachers (see Figure 6); of the feature elaborations, 71% originated with the teachers.
Feature: Group Selection Framework
  • That is based on the individual's skills (ability groupings) and personalities
  • That allows for anonymous selection of or negotiation for group members
  • That allows students some flexibility in selecting group members
  • That may be student-controlled
  • That allows students from remote locations to collaborate (may want to hide the fact that they're remote)
  • That is teacher-configurable
Figure 5. Envisionment of group selection framework.
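To make the mapping from claims to envisioned features concrete, the bookkeeping behind Figures 4 and 5 can be sketched as a small data structure. The sketch below is purely illustrative: the class names, the sample pro, and the `unmitigated_cons` helper are our own invention for exposition, not part of the workshop method itself.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Claim:
    """A claims-analysis record: a feature of current practice together
    with its hypothesized pros and cons (cons keyed by id, e.g. 'c2')."""
    feature: str
    pros: List[str] = field(default_factory=list)
    cons: Dict[str, str] = field(default_factory=dict)

@dataclass
class Elaboration:
    """One capability of an envisioned feature, optionally linked to the
    con in the source claim that it is meant to mitigate."""
    description: str
    mitigates: Optional[str] = None  # id of a con, or None

@dataclass
class EnvisionedFeature:
    name: str
    source_claim: Claim
    elaborations: List[Elaboration] = field(default_factory=list)

    def unmitigated_cons(self) -> List[str]:
        """Cons of the source claim not yet addressed by any elaboration."""
        addressed = {e.mitigates for e in self.elaborations if e.mitigates}
        return [cid for cid in self.source_claim.cons if cid not in addressed]

# Worked example based on the "large group size" claim (Figure 4) and
# the group selection framework (Figure 5). The pro shown is invented
# for illustration; Figure 4's pros are not reproduced here.
large_groups = Claim(
    feature="large group size",
    pros=["exposes students to more points of view"],
    cons={"c2": "may encourage dominant personalities to take over"},
)
framework = EnvisionedFeature(
    name="group selection framework",
    source_claim=large_groups,
    elaborations=[
        Elaboration("form groups from student personality profiles",
                    mitigates="c2"),
        Elaboration("allow anonymous selection of group members"),
    ],
)
```

Linking each elaboration to the con it mitigates lets a design team check, claim by claim, which negative consequences an envisioned feature has not yet addressed.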



Figure 6. Relative contributions to feature envisionment and elaboration by teachers, technologists, and HCI designers (contributions by student assistants and school administrator are not shown).

The average time spent envisioning a feature was approximately six minutes. As with the claims analysis, we examined the videotapes to count the feature discussions in which one or more teachers participated: we found that teachers provided significant input to 90% of the features and 88% of the elaborations.

Four of the thirteen participants attending the participatory design session were teachers. Again, we are encouraged by the amount of teacher participation in the design activities. The level of participation demonstrates that teachers are quite capable of envisioning new features from claims.

In fact we were surprised to see an apparent increase in teacher participation from analysis to design. We expected lower teacher participation during the design phase because the other participants had greater experience in both using and designing technology, which we assumed would aid them in the envisionment of new technology.

One contributing factor may have been changes in group composition from analysis to design. One of the teachers who was relatively "vocal" was absent from the first of the analysis sessions but present during the design session. In contrast, one of the technologists who made many contributions during analysis was not present during the design session. These changes alone could have produced the apparent increase in the relative degree of teacher participation. Nonetheless, it seems clear that the teachers participated in design envisionment at least as effectively as they had in the analysis of their own classroom context.

DISCUSSION

We have learned that participatory design is far more involved than merely inviting users to a design meeting. The users must feel engaged; they must have a stake. They must have effective access to relevant information. They must have status, power, and scope of action sufficient to allow them to take positions and contribute to decisions [4]. And of course much about the context of technology development militates against any of this.

The principal insight (thus far) in our investigation is the demonstration that the teachers were able to participate fully in the initial requirements analysis - not as informants, or subjects of analysis, but as analysts.

We believe that several factors contributed to this favorable quality of participatory interaction. The teachers contributed to activities and decision-making from the very start. We had spent months developing a working relationship with them, recruiting them during the original preparation of the grant supporting the project. The videotaped scenarios we used as raw material for the requirements analysis were collected from the teachers' own classrooms, which meant that team members had already spent much time in an ethnographer's role, observing and recording classroom activities and interviewing teachers and students about class goals and experiences. Thus the teachers had been an integral part of the months of pre-structuring activity that preceded the analysis sessions.

Effective participatory design requires a common environment - shared media of analysis and design as well as shared terminology - in which users and developers participate as equals. For our method the shared media included scenarios, claims, and the new features inspired by the claims. We found that users and developers were equally capable of working with these media.

The shared terminology that emerges in participatory analysis often relies extensively on the language of the user, because s/he is most able to identify, label and confirm features and consequences extracted from usage scenarios. For example, recall the four student roles of a doer, an equipment manager, a leader, and a recorder. The names of these roles became elements in our analysis. Today these terms have become part of the project's shared vocabulary.

As designers, we also contribute to the shared terminology, when we introduce concepts associated with our analysis and design methodologies. As part of our participatory stance, we are careful to replace technical terms with more generic ones (we use "features," "pros," and "cons" rather than "artifacts," "claims," and "consequences"). Despite our caution, we now find that the teachers use words such as "scenarios" and "envisionment" in normal conversation. In a conversation toward the end of the workshop, when a developer raised a question about an as-yet-undiscussed feature, a teacher quickly responded, "I don't know about that, I'll have to see if I can come up with a scenario."

Teachers equated scenarios with classroom activities. This definition was easy for them to apply, as they naturally talk about the activities that occur in their work. As we moved from analysis to design, the teachers' view of scenarios expanded to include envisioned classroom activities as well.

Laughton described the problems of achieving symmetry in collaboration between the designer and the user [6]. We believe that our method successfully shifted power in the direction of the user. Classroom scenarios established the classroom context as the focus of analysis and design, and teachers were explicitly recognized as the owners of the classroom activities. Because introducing new technology necessarily modifies the underlying teaching activity, the developers were placed in the position of "seeking permission" from the teachers to make these changes.

During the analysis sessions, the developers sometimes seemed to respond to this shift in power by focusing first on the possible consequences of a technology feature they had in mind. For example, one developer repeatedly stressed the negative consequence that certain teaching activities require the teacher and students to be co-located; the technology feature the developer had in mind was video teleconferencing. As a participant in our analysis framework, however, the developer did not propose this simply as an interesting feature, but emphasized the rationale in the current situation that motivated it. In this sense the developer's role seemed to shift somewhat, toward that of a salesperson trying to persuade the teachers that a new and improved feature would make their lives easier or better.

Participatory analysis relies on usage scenarios that describe the natural setting, enabling users to immediately connect the scenarios to their own image of their own work. Users develop a greater sense of ownership of the analysis and design because these activities are grounded in authentic activities occurring in the user's world, rather than in abstract manifestations of those activities hidden in the workings of a computer system. As expert participants in the work setting, the users naturally own the real-world activities. From this initial sense of ownership of the classroom scenarios, the teachers were able to establish a more general sense of ownership throughout the design process as their practice was analyzed and re-designed.

Early in the project, the teachers thought of requirements analysis and system design as processes that computer scientists perform in developing computer systems. We tried to induce a more user-centered view by connecting analysis and design to the classroom activities that teachers conduct. In doing so, the teachers saw how analysis and design would impact and modify their own work. Grounding the analysis and design in classroom scenarios motivated the teachers to participate in the software design process; establishing them as the owners of the classroom scenarios gave them the power and the right to participate.

Thus far, we have carried out a single phase of participatory analysis followed by feature envisionment, to develop an initial set of requirements for the collaborative learning situation. Subsequent work will build in more detail on the claims analyses produced, generating new scenarios that exercise the features envisioned. Continuing within the task-artifact framework, these scenarios will in turn be subjected to participatory analysis and further envisionment, as part of the ongoing evolution of the teachers' classroom activities.

ACKNOWLEDGMENTS

This research was supported in part by the National Science Foundation, under grants REC-9554206 and CDA-9303152.

REFERENCES

  1. Carroll, J.M. (1995). Introduction: The Scenario Perspective on System Development. In Carroll, J.M. (Ed.), Scenario-Based Design: Envisioning Work and Technology in System Development, J. Wiley, NY, pp. 1-17.
  2. Carroll, J.M. and Rosson, M.B. (1992). Getting around the task-artifact cycle: How to make claims and design by scenario. ACM Transactions on Information Systems, 10(2), pp. 181-212.
  3. Greenbaum, J. and Kyng M. (1991). Introduction: Situated Design. In Greenbaum, J. and Kyng M. (Eds.), Design at Work: Cooperative Design of Computer Systems, Lawrence Erlbaum Associates, Hillsdale, NJ, pp. 1-24.
  4. Kensing, F. and Munk-Madsen, A. (1993). PD: Structure in the Toolbox. Communications of the ACM, 36(4), pp. 78-85.
  5. Kyng, M. (1995). Creating Context for Design. In Carroll, J.M. (Ed.), Scenario-Based Design: Envisioning Work and Technology in System Development, J. Wiley, NY, pp. 85-107.
  6. Laughton, S. (1996). The Design and Use of Internet-Mediated Communication Applications in Education: An Ethnographic Study. Ph.D. dissertation, Virginia Tech, Blacksburg, VA.
  7. Muller, M.J., Wildman, D.M., and White, E.A. (1993). 'Equal Opportunity' PD Using PICTIVE. Communications of the ACM, 36(4), pp. 54-66.
  8. Preece, J. (1994). Human-Computer Interaction, Addison-Wesley, Wokingham, England.
  9. Williams, M. (1994). Enabling schoolteachers to participate in the design of educational software. Proceedings of PDC'94: Participatory Design (Chapel Hill, NC, October), pp. 153-157.
