Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science, 28(2), 115–152.

The study examined a graduate-level educational psychology course that used online discussion as a core graded activity. The researchers analyzed:

  1. student participation rates
  2. electronic participation patterns (what form of interaction takes place when led by students? does it change over time?)
  3. social cues within the messages (e.g., “it’s my birthday”)
  4. cognitive & metacognitive components of student messages
  5. depth of processing - surface or deep - within message posts

While we were ultimately interested in how a community of learning can be built using online discussion, this study was more specifically focused on the social and cognitive processes exhibited in the electronic transcripts as well as the interactivity patterns among the students.

Content analysis was used to analyze the online discussion: “[since] this particular study is more concerned with analysis and categorization of text than with the process of communication or specific speech acts, as in discourse analysis, it primarily relies on content analysis methodology.”

As indicated, Henri (1992)[1] proposes an analytical framework to categorize five dimensions of the learning process evident in electronic messages: student participation, interaction patterns, social cues, cognitive skills and depth of processing, and metacognitive skills and knowledge.

By combining Henri’s criteria related to message interactivity (i.e., explicit, implicit, and independent commenting) and Howell-Richardson and Mellar’s visual representation of message interaction, we created weekly conference activity graphs illustrating the associations between online messages. Quantitative data, such as the number and length of student messages, were also collected.

DN: This combination of methods meant researchers could focus on content analysis while also looking at interaction patterns. Straight discourse analysis would have abstracted the content away. I need to think about how to set this up. I think discourse analysis (speech acts, interaction types) would get at what I’m looking for, but maybe a layer of content analysis is needed too…

Since any message could conceivably contain several ideas, the base “unit” of analysis was not the message but the paragraph.
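As a rough sketch of that unitization step (assuming plain-text messages with blank-line paragraph breaks; the function name and sample message are invented, not from the study):

```python
# Hypothetical sketch: split each message into paragraph-level coding units,
# since the study's unit of analysis was the paragraph, not the message.

def unitize(message: str) -> list[str]:
    """Split a message into blank-line-delimited paragraphs, dropping empties."""
    return [p.strip() for p in message.split("\n\n") if p.strip()]

message = "First idea about the reading.\n\nA second, separate idea."
units = unitize(message)
print(len(units))  # 2 coding units from one message
```

Each resulting unit would then be coded independently, so a single long post can contribute several data points.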

Quantitative data:

  • researchers examined server logs for the frequency of posts, the total number of posts, and weekly posting activity.
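A minimal sketch of that kind of log tally (the `(student, week)` records below are invented for illustration; the study's actual data came from FirstClass server logs):

```python
# Hypothetical sketch: tally posting activity from server-log-style records.
from collections import Counter

posts = [("s1", 2), ("s2", 2), ("s1", 3), ("s3", 2), ("s2", 4)]

total_posts = len(posts)
per_student = Counter(student for student, _ in posts)  # posts per student
per_week = Counter(week for _, week in posts)           # posts per week

print(total_posts)        # 5
print(per_student["s1"])  # 2
print(per_week[2])        # 3
```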

Qualitative data:

  • interaction patterns in the computer-mediated conferencing were mapped out (explicit interaction, implicit interaction, or independent statement).
  • social cues apparent in the FirstClass dialogue were coded (a social cue being defined as a “statement or part of a statement not related to formal content of subject matter”).
  • both the cognitive and metacognitive skills embedded in these electronic conversations were analyzed to better understand the mental processes involved in the discussions (using a framework based on Bloom’s Taxonomy).
  • each message was evaluated for the depth of processing, surface or deep.
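One way to picture how the four coding schemes above might combine for a single unit (the field names and example are invented; the category labels paraphrase the study):

```python
# Hypothetical sketch: a per-unit coding record a human coder might fill in.
from dataclasses import dataclass

INTERACTION = {"explicit", "implicit", "independent"}
DEPTH = {"surface", "deep", "both"}

@dataclass
class CodedUnit:
    text: str
    interaction: str      # explicit / implicit / independent
    social_cue: bool      # statement unrelated to formal course content?
    cognitive_skill: str  # e.g. a Bloom's-taxonomy level
    depth: str            # surface / deep / both

unit = CodedUnit("Happy birthday, Sam!", "explicit", True, "knowledge", "surface")
assert unit.interaction in INTERACTION and unit.depth in DEPTH
```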

Findings:
  • student-centred - students dominated the discussions, with relatively little contribution from instructor
  • most students posted only the one entry required per week, but those posts were long and substantive.
  • several unique patterns of interaction emerged:
    1. the second week had “starter-centered” interaction;
    2. the fourth week had “scattered” interaction, in part, because no one assumed the role of the starter in the discussions that took place that week;
    3. the eighth week had “synergistic” interaction (i.e., it had a cohesive feel with the interaction of the participants creating a combined effect that perhaps was greater than the sum of the individual efforts); and
    4. the tenth week had “explicit” interaction.
  • social cue findings: “In this graduate level course, the number of social cues decreased as the semester progressed. Moreover, student messages gradually became less formal. These findings might be attributed to the fact that students felt more comfortable with each other as the semester continued (Kang, 1998).”[2]
  • cognitive skill findings: “in this particular research project, most of the messages were fairly deep in terms of information processing. Of the four weeks of detailed analysis, 33 percent of student messages were at the surface level, 55 percent were at an in-depth level of processing, and an additional 12 percent contained aspects of both surface and deep processing.”


“It appears that by structuring electronic learning activity, students will have more time to reflect on course content and make in-depth cognitive and social contributions to a college class than would be possible in a traditional classroom setting.”

“Not only did students share knowledge, but content analyses indicated that students were processing course information at a fairly high cognitive level. Social cues took a back seat to student judgment, inferencing, and clarification.”

  1. Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative Learning Through Computer Conferencing: The Najaden Papers (pp. 115–136). New York: Springer.

  2. Kang, I. (1998). The use of computer-mediated communication: Electronic collaboration and interactivity. In C. J. Bonk & K. S. King (Eds.), Electronic Collaborators: Learner-Centered Technologies for Literacy, Apprenticeship, and Discourse (pp. 315–337). Mahwah, NJ: Erlbaum.