Our analysis approach is built on the philosophies and objectives of participatory design. These objectives include mutual learning between user and designer, application of design tools familiar to users, envisionment of future work situations, and grounding of analysis in the practice of the user [3].
Scenario-based design generally facilitates user participation in system development: scenarios are informal, evocative, work-oriented, can be sketchy or highly detailed, and are equally accessible to various stakeholders in a design. Perhaps most important, the scenarios belong to the users: they describe and exemplify the users' own practices. This transforms the user from the recipient or consumer of the system development process into an expert participant.
A variety of scenario-based design techniques have emerged that allow users to participate directly in the design and formative evaluation of early prototypes. In Kyng's [5] cooperative method, users and developers collaboratively specify and refine design details as users simulate work activities by stepping through interaction scenarios using mock-ups and prototypes. In PICTIVE (Plastic Interface for Collaborative Technology Initiatives through Video Exploration, [7]), users and developers collaboratively construct informal prototypes using low-tech accessories such as markers, Post-It notes, scissors, and tape.
In general, participatory design techniques have focused on system development activities that presuppose initial prototypes [5] or that address prototype development very concretely (for example, as user interface design, Muller et al. [7]). Little research has been directed at techniques for user participation in earlier phases of design, such as requirements analysis. Muller et al. recognize this deficiency as they write, "the major failing of PICTIVE is that it often encourages detailed design, without supporting a critical participatory analysis of higher-level issues."
Although scenarios are most frequently used for envisioning designs, it has been argued that they can play an integrating role throughout the software development process [1], including requirements analysis:
People using current technology can be directly observed to build a scenario description of the state-of-the-art and to ground a scenario analysis on what subsequent technology might be appropriate: the requirements scenarios embody the needs apparent in current work practice. [2, p. 7]
The task-artifact framework (TAF) tries to push scenario analysis to Muller's higher level. It attempts to articulate implicit relationships between features in a situation and consequences for users in that situation as "claims": for example, "a blinking icon quickly captures the attention of the user in the event of an emergency condition." The feature of this situation is a blinking icon; the consequence is quick notification.
This paper describes a case study that applies participatory analysis in the design of a collaborative learning environment for middle and high school students. Thus, our central question is whether public school teachers can participate in claims analysis.
The TAF has four stages (see Figure 1).
The TAF is a cyclic integration of the processes of analysis and design. Claims analysis "seeks to get at just how the artifact suggests to users that they do something one way or another, how it supports and fails to support their efforts, and how it signals progress and error [2]." The products of claims analysis (i.e., claim schemas) organize analysis data in a way that is conducive to interpretation and design reasoning.
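To make the shape of a claim schema concrete, it can be thought of as a simple record: a feature of a situation paired with its desirable and undesirable consequences. The following sketch is our own illustration, not a representation prescribed by the TAF; the class and field names are assumptions, and the single con shown is a hypothetical counterexample rather than a claim from the study.

```python
from dataclasses import dataclass, field

@dataclass
class ClaimSchema:
    """One claim schema: a feature of a situation and its consequences.

    Illustrative only; the TAF prescribes no particular machine
    representation of claims.
    """
    feature: str                                    # e.g. "blinking icon"
    pros: list[str] = field(default_factory=list)   # desirable consequences
    cons: list[str] = field(default_factory=list)   # undesirable consequences

# The blinking-icon example from the text, recorded as a schema
# (the con listed here is hypothetical):
alert_claim = ClaimSchema(
    feature="blinking icon",
    pros=["quickly captures the user's attention in an emergency condition"],
    cons=["may distract the user during routine work"],
)
```

Organizing claims this way makes the later design step natural: each con is a prompt to refine the feature, and each pro is a property the refinement should preserve.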
Design occurs through the envisionment of features and new scenarios. During design, we scrutinize our claims and reason about how to create new features or refine our original features to improve the consequences. For integration and validation of features, the envisioned features are then attached to particular instances of use as they are inserted into envisionment scenarios. The results of the design process are envisioned features and scenarios which may again undergo claims analysis.
We extend the TAF by fully involving users in the analysis and design process. We wanted to investigate whether the TAF is accessible to persons who are not human-computer interaction (HCI) specialists. More generally, we wanted to investigate whether requirements analysis, working from scenarios and claims representations, could be carried out as a participatory design activity.
We argued to ourselves that the TAF is just an incrementally more systematic variant of the decision processes people engage in day to day. When people shop, they perform claims analysis as they assess the features of products: how will this automobile do in the ice scenario? Are the tires wide enough for the beach? Claims analysis merely asks that the reasons why a specific feature is good or bad be explicitly elaborated.
Extending the TAF to a participatory approach had many ramifications. We needed to codify "raw" user practice to make it more discussible by developers and users. Thus, we collected a substantial sample of videotaped observations of classroom situations, but made no a priori identification of artifacts and features. We wanted to defer the analysis of features and consequences in order to take full advantage of the special expertise of those who understand the work context, namely the users.
The requirements analysis case study is drawn from a two-week summer workshop which included all project participants. Activities in the workshop included demonstrations and discussions of commercial and research learning environments, consideration of various proposed user interface metaphors, and planning discussions for baseline and post-intervention evaluation of student skills and for classroom projects and activities for the 1996-97 school year. Time was specifically allocated during the workshop for participatory analysis and design.
Participatory analysis and design took place over three sessions. Participatory analysis occurred over two half-day sessions, in which we analyzed videotape recordings of classroom situations gathered during the preceding semester. Participatory design occurred for two hours in a third session. In the following, participatory analysis includes scenario generation and claims analysis, while participatory design refers to feature and scenario envisionment (see Figure 1).
The team also included four technologists, computer science researchers who were responsible for the technical infrastructure for the project (one of these team members was not present for the final design session). The technologist role contributed an understanding of and interest in computer technology and system development, including new technology not previously applied in a classroom setting. The technologist could assess what features of a design could realistically be developed and under what constraints.
Finally, the team included four HCI designers (one HCI designer was also not present for the final design session). The HCI designer role contributed an understanding of user interface technology, and of the TAF method we were applying. The HCI designers were able to facilitate the participation of both teachers and technologists as well as to participate themselves. Also present were two students who provided videotaping and recording support and occasional contributions to the discussion; a public school administrator joined us for part of one analysis session.
In general, teachers played the teacher role, system developers played the technologist role, and HCI researchers played the HCI designer role. During the course of the analysis, however, participants sometimes changed their roles. One of the teachers was particularly interested in computer technology, and sometimes played the technologist role. Some of the system developers were also experienced university professors, and provided input into the analysis process as teachers. HCI researchers were both college professors and system developers; they contributed within those additional roles during the analysis.
We considered this fluidity of role assignment to be a positive factor. We had been concerned going into this investigation about potential inequalities among the roles: it is often alleged within the CHI community that HCI designers are sometimes subordinated to technologists in design disputes. We worried that this might be even more acute in collisions between teachers and technologists. None of this materialized. Perhaps because many of the people on our project team played multiple roles, there was obvious deference to principal roles: the teachers were the final authority on pedagogical activities, the technologists were the final authority on technology, and the HCI researchers were the final authority on user interface and design methodology.
We supplemented videotaped observations with other forms of data including fieldnotes taken during observations, videotaped interviews of students and teachers, and classroom artifacts such as textbooks, lab instructions, lab assignment sheets, data worksheets, lesson plans, and homework problem sets. In the interviews, we asked students and teachers general, open-ended questions on collaboration, experimentation, and pedagogy. The interviewees were encouraged to elaborate their views and to take the discussion in any direction they wished. Classroom observations, interviews, and artifacts were collected over a period of three months from the four schools participating in the project.
Over the period of our data collection, we observed several lessons in each teacher's classroom. From these observations, we identified key phases of science lessons at the middle and high school levels. In the middle schools, a science lesson often consisted of many small, diverse learning activities such as brainstorming, physical experiments, demonstrations, class discussions, roleplaying, and student presentations. In the high schools, the science lessons were typically focused on one complex, physical experiment. There, the lesson phases were geared towards the stages of an experiment such as equipment assembly and execution, data collection and analysis, and lab reports.
We used these lesson phases as genres. For each genre, we searched for rich episodes of interaction among the teacher and students. We also examined the videotaped interviews to find topics and themes that were important to teachers and students, and then tried to find occurrences of these themes in the classroom videotapes. For example, one teacher expressed a strong commitment to discovery learning: we looked for and found evidence of this teaching style in the videos and selected the relevant episodes.
There were also episodes that we simply found interesting and worthy of analysis. For example, during a high school experiment, a confrontation occurred between two male lab partners vying for control over the same piece of apparatus. A female lab partner resolved the conflict by suggesting and enforcing a turn-taking policy. We felt that this episode was rich in the issues of gender, leadership, and conflict, and would incite good discussion and analysis.
Once a set of episodes was selected for participatory analysis, we transcribed the selected episodes. We referred to our collected artifacts to fill in the details of the scenarios such as the specific instructions that students were given for an experiment, the homework and lab problems that students were to address, and the worksheets students used to organize their experiments.
Ultimately, we prepared two classroom scenario sets. One set described activities surrounding a middle school experiment on waves (Figure 2a). The other described activities from a high school experiment on inelastic collisions (Figure 2b). Each scenario set consisted of one overview scenario and several more focused scenarios. The overview scenario described the chronological teaching activities of the lesson as well as its pedagogical context and goals. Focused scenarios narrated specific interactions, collaborations, and events occurring during the lesson.
A typical concern of designers using any scenario-based design approach is whether selected scenarios provide reasonable coverage of the tasks and artifacts that occur within a system or situation. In these initial sessions, coverage was not a major concern. Our goal was to get the teachers interested and involved in the analysis by selecting scenarios that were motivating and revealing. We selected scenarios containing interesting interactions and activities that would serve as a good source of discussion among the teachers and technologists. We will evaluate scenario coverage in future analysis sessions.
We had several motivations for using the video scenarios. First, we wanted to provide a concise record of the scenarios to jog participants' memories. Second, in the (inevitable) event that a participant did not review the textual scenarios, the video excerpts would quickly familiarize the participant with the scenarios so that s/he would be able to contribute to the discussion. Third, we wished to remind teachers that scenarios were simply the activities of their everyday lives, not something foreign or analytically derived for the purpose of design.
When scenarios are transcribed from videotapes or from notes, they become abstractions of the raw data. The transcriber cannot capture all the details present in the setting. As a result, written scenarios seem more abstract and symbolic, and less real than video scenarios. When teachers are presented with a video scenario, they see the scenario in the same form in which they experienced the scenario in their working lives. The scenario becomes more tangible and more vivid.
At the beginning of the participatory analysis sessions, we gave participants a brief introduction on how to perform claims analysis. We asked the participants to identify any features of the classroom scenarios they found interesting. We described 'feature' as something that captures the observer's attention or something that affects the way a person teaches or learns. For each feature, we further asked participants to elaborate both what is good or desirable ("pros") and bad or undesirable ("cons") about that feature. Finally, we asked participants to extrapolate beyond the context of the particular scenario when brainstorming about the pros and cons of a feature.
During the introduction and throughout the participatory analysis sessions, we used terminology that was familiar to all participants. For instance, we refrained from using technical terms such as "claims analysis," "claims," and "consequences" and instead used more generic terms such as "features" and "pros and cons."
We urged participants to follow two ground rules. First, we asked that the teachers "take center stage" in identifying the features and the pros and cons from the classroom scenarios. Since the scenarios essentially recorded activities of the teachers and their students, the teachers were in the best position to understand and to reflect on the scenarios. Second, we asked that participants limit critiquing of ideas contributed by others. We wanted the analysis to be a brainstorming session with a free-flowing exchange of ideas and opinions. We wanted participants to think broadly and divergently about their activities and requirements. Issues of feasibility, criticality, and tradeoffs were to be addressed during the envisionment stage later in the design process. By fostering an open environment where all ideas were equally valuable, we hoped to encourage high engagement and involvement from all, but especially from the teachers.
As features and consequences were generated during the analysis, they were transcribed onto large paper sheets. Each claim schema was transcribed onto a separate sheet of paper. When the discussion of a claim schema was seemingly complete, the paper sheet containing the schema was taped to a wall of the room. Often, the group would return to a claim schema to modify or extend it based on discussions evoked by other claims. Sometimes, the group developed several related claim schemas simultaneously if the issue under discussion was sufficiently complex or broad. By the end of the analysis, the walls of the room were covered with paper sheets listing claim schemas.
From the outset of the participatory analysis session, the teachers were immediately able to generate claims. They naturally and openly spoke about the features and consequences that captured their attention in the classroom scenarios. In some cases, the teachers explicitly identified the features and consequences. In other cases, the teachers would enter into discussion about pedagogical issues such as whether discovery learning is a valuable approach for science or how groups should be formed to best support individual and group learning. The HCI designers (facilitators) allowed such discussion to take place, but tried to structure the discussion in the direction of generating claims. For example, the facilitators would ask questions such as "what are the features of discovery learning?" or "how do groups form in the classroom?" Lengthy, open discussions of this kind stimulated the creation of many related claims and claim schemas.
We illustrate the participatory analysis process through an example taken directly from the analysis sessions. Participants were shown a video scenario of a group of students collecting equipment and organizing themselves at a workbench. In the scenario, three of six students collaborated to assemble the equipment while the other three casually conversed amongst themselves. A feature we extracted from the scenario was labeled "large group size" (see Figure 3). Eight of thirteen participants attending the first analysis session contributed to this claim schema. The cons listed for the large group size feature were directly identified from studying the video scenario. The pros, however, were not evident in the video scenario, but rather were developed by extrapolating beyond the scenario.
The two participatory analysis sessions produced 32 claim schemas, 12 for the middle school scenarios, 20 for the high school scenarios. The schemas fell into the general categories of science, lessons, experiments, physical setting, groups, student roles, learning styles, and student personalities. Discussion of one feature or claim often led to the discovery of others. For example, discussion of the "large group size" claim schema led to the discovery of claims concerning "self-selected groups" and "persistent group composition."
We videotaped the participatory analysis sessions. From the videotape, we associated each claim and consequence with the participant who originally identified it. A total of 32 features and 220 consequences were generated over the two analysis sessions. Of all the claim features generated, 48% originated from teachers (see Figure 4). Of all the consequences generated, 48% also originated from teachers. The average time spent on the analysis of a claim schema was approximately 13 minutes.
Figure 4. Relative contributions to generation of claims and consequences by teachers, technologists and HCI designers (contributions by student assistants and school administrator are not shown).
We noticed that teachers were likely to contribute to the discussion of a claim schema even if a teacher did not originate the claim. We examined the videotape to see for which features and consequences teachers had "significant input." We considered a participant to have significant input if s/he originated, modified, extended, or elaborated a feature or consequence. In the discussion of a lab notebook feature, for example, each teacher provided significant input by describing how students use lab notebooks in his/her class. General discussions would often lead to a richer and more comprehensive view of the feature or consequence. From our evaluation, we found that teachers provided significant input to 88% of the features and 68% of the consequences.
Given that only three of thirteen participants on the first day and four of thirteen participants on the second day were teachers, we were encouraged by the amount of teacher participation occurring in the requirements analysis. The level of participation we observed is good evidence that teachers can actively contribute to requirements analysis for new educational technology, and, more specifically, that they can create and work with claims.
At the beginning of the envisionment session, we provided the following set of guidelines.
We asked the participants to envision two types of features: technology and pedagogy. Technology features comprise those associated with a computer or user interface. Pedagogy features comprise those associated with teaching. When designing systems, we must consider the implications that the system has on the activities it supports. The activity and the system should not be designed in isolation from one another [6].
The teachers understood this dependency between activity and system. During a discussion on student roles during participatory analysis, one teacher told the group that his students typically fell into one of four roles during an experiment: a doer, an equipment manager, a leader, and a recorder. Later, during participatory design, a "defined role" technology feature was suggested. This feature was intended to fix roles or tasks that students perform during a science simulation. The teacher who originally described the four student roles realized that the roles were less applicable for experiments run on computers. After some thought, the teacher came up with three new student roles, "keyboarder," "mouser," and "observer." The new roles related to the sharing of the input devices to a computer.
Referring back to Figure 3, we may demonstrate how reasoning about the "large group size" claim contributed to a feature envisioned during participatory design. Based on other claims concerning group organization, a participant suggested a new feature, a "group selection framework." This technology feature was described as a computer-mediated grouping tool that would allow students to organize into groups with some teacher control. By examining the pros and cons of the "large group size" feature, the group began to elaborate the capabilities or functions of the proposed feature. For example, to mitigate the con that a large group size (con 'c2' in Figure 3) may encourage dominant personalities to take over, a participant suggested that one function of the group selection framework might be an ability to form groups based on student personality profiles. This way, the teacher could define groups such that a dominant personality is never paired with a weak personality.
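As a thought experiment, the personality-profile function of the envisioned group selection framework might be sketched as a simple constraint on group formation. Nothing of the kind was implemented in the project; the profile labels, the greedy assignment strategy, and all names below are our own illustrative assumptions.

```python
from itertools import combinations

# Hypothetical personality profiles a teacher might assign.
DOMINANT, NEUTRAL, WEAK = "dominant", "neutral", "weak"

def valid_group(profiles):
    """A group is acceptable if no dominant student is paired with a weak one."""
    return not any(
        {a, b} == {DOMINANT, WEAK} for a, b in combinations(profiles, 2)
    )

def form_groups(students, size):
    """Greedily assign students to groups of at most `size`, skipping any
    assignment that would pair a dominant personality with a weak one.
    `students` is a list of (name, profile) pairs."""
    groups = []
    pool = list(students)
    while pool:
        group = [pool.pop(0)]
        for s in list(pool):
            if len(group) == size:
                break
            if valid_group([p for _, p in group] + [s[1]]):
                group.append(s)
                pool.remove(s)
        groups.append(group)
    return groups

roster = [("Ana", DOMINANT), ("Ben", WEAK), ("Cal", NEUTRAL), ("Dee", NEUTRAL)]
groups = form_groups(roster, size=2)
# Every resulting group respects the constraint:
assert all(valid_group([p for _, p in g]) for g in groups)
```

A greedy pass like this can leave undersized groups for adversarial rosters; a real grouping tool would presumably let the teacher adjust assignments by hand, consistent with the "some teacher control" the participants described.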
Details of all proposed features were discussed and elaborated in this way. Figure 5 summarizes the results of envisioning the group selection feature. Seven of the thirteen team members present for the design session contributed to this envisionment.
A total of 17 features and 24 elaborations were envisioned during the two hour design session. Of all new features, 41% were proposed by teachers (see Figure 6). Of the feature elaborations, 71% originated from the teachers.
Figure 6. Relative contributions to feature envisionment and elaboration by teachers, technologists, and HCI designers (contributions by student assistants and school administrator are not shown).
The average time spent on the envisionment of a feature was approximately 6 minutes. As with the claims analysis, we examined the videotapes to count the number of feature discussions in which one or more teachers participated: we found that teachers provided significant input to 90% of the features and 88% of the elaborations.
Four of the thirteen participants attending the participatory design session were teachers. Again, we are encouraged by the amount of teacher participation in the design activities. The level of participation demonstrates that teachers are quite capable of envisioning new features from claims.
In fact we were surprised to see an apparent increase in teacher participation from analysis to design. We expected lower teacher participation during the design phase because the other participants had greater experience in both using and designing technology, which we assumed would aid them in the envisionment of new technology.
One contributing factor may have been changes in group composition from analysis to design. One of the teachers who was relatively "vocal" was not present for the first of the analysis sessions, but was present during the design session. In contrast, one of the technologists who made many contributions during analysis was not present during the design session. These changes alone could have produced the apparent increase in the relative degree of teacher participation. However, it seems clear that teachers participated in design envisionment at least as effectively as they had in analysis of their own classroom context.
The principal insight (thus far) in our investigation is the demonstration that the teachers were able to participate fully in the initial requirements analysis - not as informants, or subjects of analysis, but as analysts.
We believe that several factors contributed to this favorable quality of participatory interaction. The teachers contributed to activities and decision-making from the very start. We had spent months developing a working relationship with these teachers, recruiting them during the original preparation of the grant supporting the project. The videotaped scenarios we used as raw material for our requirements analysis were collected from the teachers' classrooms. This meant that team members had already spent much time in an ethnographer's role, observing and recording classroom activities and interviewing teachers and students about class goals and experiences. Thus the teachers had been an integral part of the months of pre-structuring activities that took place before the analysis sessions.
Effective participatory design requires a common environment, with shared media of analysis and design as well as shared terminology, in which the user and the developer participate equally. For our method, the shared media included scenarios, claims, and the new features inspired by the claims. We found that users and developers were equally capable of working with these media.
The shared terminology that emerges in participatory analysis often relies extensively on the language of the user, because s/he is most able to identify, label and confirm features and consequences extracted from usage scenarios. For example, recall the four student roles of a doer, an equipment manager, a leader, and a recorder. The names of these roles became elements in our analysis. Today these terms have become part of the project's shared vocabulary.
As designers, we also contribute to the shared terminology, when we introduce concepts associated with our analysis and design methodologies. As part of our participatory stance, we are careful to replace technical terms with more generic ones (we use "features," "pros," and "cons" rather than "artifacts," "claims," and "consequences"). Despite our caution, we now find that the teachers use words such as "scenarios" and "envisionment" in normal conversation. In a conversation toward the end of the workshop, when a developer raised a question about an as-yet-undiscussed feature, a teacher quickly responded, "I don't know about that, I'll have to see if I can come up with a scenario."
Teachers equated scenarios to classroom activities. This definition was easy for teachers to apply as they talked naturally about the activities that occur in their work. As we moved from analysis to design, the teachers' view of scenarios evolved to include envisioned classroom activities as well.
Laughton described the problems of achieving symmetry in collaboration between the designer and the user [6]. We believe that our method successfully shifted power in the direction of the user. Classroom scenarios established the classroom context as the focus of analysis and design, and teachers were explicitly recognized as the owners of the classroom activities. Because introducing new technology necessarily modifies the underlying teaching activity, the developers were placed in the position of "seeking permission" from the teachers to make these changes.
During the analysis sessions, the developers sometimes seemed to respond to this shift in power by focusing first on the possible consequences of a technology feature of interest. For example, one developer repeatedly stressed a particular negative consequence that specific teaching activities require the teacher and students to be co-located. The technology feature that the developer had in mind was video teleconferencing. As a participant in our analysis framework, however, the developer did not propose this simply as an interesting feature, but rather emphasized the rationale in the current situation that motivated it. In this sense the role of the developer seemed to change somewhat, to that of a salesperson who wanted to persuade the teachers that a new and improved feature would make their lives easier or better.
Participatory analysis relies on usage scenarios describing the natural setting. This enables the users to immediately connect the scenarios to their own image of their own work. Users develop a greater sense of ownership of the analysis and design because the activities are grounded in authentic activities occurring in the user's world (rather than in abstract manifestations of those activities hidden in the workings of a computer system). As expert participants in the work setting, the users naturally own the real-world activities. From this initial sense of ownership of the classroom scenarios, the teachers were able to establish a more general sense of ownership throughout the design process as their practice was analyzed and re-designed.
Early on in the project, the teachers thought of requirements analysis and system design as processes that computer scientists performed in developing computer systems. We tried to induce a more user-centered view by connecting analysis and design to the classroom activities that teachers conduct and perform. In doing this, the teachers saw how analysis and design would impact and modify their own work. Grounding the analysis and design on classroom scenarios motivated the teachers to participate in the software design process. Establishing the teachers as the owners of the classroom scenarios gave them the power and right to participate.
Thus far, we have carried out a single phase of participatory analysis followed by feature envisionment, to develop an initial set of requirements for the collaborative learning situation. Subsequent work will build in more detail on the claims analyses produced, generating new scenarios that exercise the features envisioned. Continuing within the TAF, these scenarios will be subjected to participatory analysis and further envisionment, as part of the ongoing evolution of the teachers' classroom activities.