Qualitative Methods

Qualitative research includes "any kind of research that produces findings not arrived at by means of statistical procedures or other means of quantification" (Strauss and Corbin, Basics of Qualitative Research: Grounded Theory Procedures and Techniques, 1990). While quantitative researchers, whose methods we'll address at the end of the course, seek "causal determination, prediction, and generalization of findings," qualitative researchers seek "illumination, understanding, and extrapolation to similar situations" (Marie C. Hoepfl, "Choosing Qualitative Research: A Primer for Technology Education Researchers," Journal of Technology Education 9:1 (Fall 1997)). Qualitative and quantitative research thus differ epistemologically and methodologically.

But which is better? This question has fueled a long debate among researchers.

But it's the wrong question.

According to Jensen, qualitative research is characterized by (1) its interest in the concept of meaning (and meaning's connection to action); (2) its conviction that "meaningful actions should be studied, as far as possible, in their naturalistic contexts"; and (3) the researcher's role as an interpretive subject (236). Qualitative research, unlike much (not all!) quantitative research, takes into consideration the subjectivity of the researcher, and recognizes that the researcher is always engaged in interpretation; interpretation isn't something that the researcher reserves until the last step, after he or she has collected all the data.

Quantitative research, by contrast, uses experimental methods and quantitative measures to test hypotheses. The two methods are based on different sets of assumptions, represent different paradigms, characterize differently the role of the researcher, and produce different kinds of "knowledge." There is room -- and need -- for multiple ways of knowing.

There is a kind of continuum that moves from the fictional that is "true"—the novel for example—to the highly controlled and quantitatively described scientific experiment. Work at either end of this continuum has the capacity to inform significantly. Qualitative research and evaluation are located toward the fictive end of the continuum without being fictional in the narrow sense of the term (Elliot Eisner, The Enlightened Eye: Qualitative Inquiry and the Enhancement of Educational Practice, 1991, 30-31).

Qualitative and quantitative methods needn't be presented in opposition to one another. MQ Patton recommends that we think instead of a "paradigm of choices" that allows for "situational responsiveness" and seeks "methodological appropriateness as the primary criterion for judging methodological quality" (Qualitative Evaluation and Research Methods, 2nd ed., 1990, 39). Some researchers even combine qualitative and quantitative methods in the same project.

So when would qualitative methods be "situationally responsive" or "methodologically appropriate"? Hoepfl (an optimistic surname, yes?), drawing on Strauss and Corbin, identifies several reasons we might opt for a qualitative approach:

  • "to better understand a phenomenon about which little is known"
  • "to gain new perspectives on things about which much is already known..."
  • "...or to gain more in-depth information that may be difficult to convey quantitatively" -- or when the researcher has determined that "quantitative measures cannot adequately describe or interpret a subject"
  • "to identify the variables that might later be tested quantitatively"

Burke Johnson and Larry Christensen, authors of Education Research: Quantitative, Qualitative and Mixed Approaches (Longman), offer a chart that distills the different emphases of quantitative and qualitative approaches.

Hoepfl, again drawing on Patton, Eisner, Bogdan & Biklen (Qualitative Research for Education: An Introduction to Theory and Methods, 1982), and Lincoln & Guba (Naturalistic Inquiry, 1985), adds to Jensen's list of the characteristics that distinguish qualitative research:

  1. She elaborates on qualitative research's "naturalistic setting": "Several writers have identified what they consider to be the prominent characteristics of qualitative, or naturalistic, research (see, for example: Bogdan and Biklen, 1982; Lincoln and Guba, 1985; Patton, 1990; Eisner, 1991). The list that follows represents a synthesis of these authors' descriptions of qualitative research: ... The researcher attempts to observe, describe and interpret settings as they are." When we get to our discussion of quantitative research, you'll better understand how vastly the research "context" differs between the two approaches.
  2. She provides another way of thinking about the researcher as "interpretive subject" -- that is, as the "human instrument" of data collection.
  3. Qualitative research reports are descriptive and expressive and allow subjectivity to creep in.
  4. "Qualitative researchers pay attention to the idiosyncratic as well as the pervasive, seeking the uniqueness of each case."
  5. "Qualitative research has an emergent (as opposed to predetermined) design, and researchers focus on this emerging process as well as the outcomes or product of the research." Because, as Jensen stressed, the qualitative researcher observes and interprets meanings in context, that context necessitates flexibility in the research design. This is not to say that there is no need for a research plan before one embarks on a project; primary questions and plans for data collection should be spelled out at the beginning.

Jensen offers another way of thinking about the "emergent" nature of qualitative research:

Qualitative researchers tend to conceive of their studies, most generally, as an iterative or repeated process, which allows for the flexible application of theoretical concepts and analytical procedures to a wide variety of empirical domains (Jensen 236).

Qualitative Interpretation

It's important for us to understand this process of interpretation -- and, particularly, how we approach our research subjects both from the perspective of the subjects, or population, we're examining, and from our own perspectives as researchers representing various disciplines, as documentarians, as photographers or videographers, etc. Anthropologist Kenneth Pike claims that we approach the study of a culture through two perspectives: the emic and the etic.

Emic:

  • internal perspective

  • how a native speaker would understand a language (phonemic)

  • that which has meaning to members of the culture being studied

Etic:

  • external perspective

  • how a sound engineer would understand a language (phonetic)

  • that which has meaning to those studying the culture

According to Pike, we use the etic to get at the emic; but according to anthropologist Marvin Harris, etics are an end in themselves. Anthropologist James Lett discusses the debate:

From Pike’s point of view, the etic approach is useful for penetrating, discovering, and elucidating emic systems, but etic claims to knowledge have no necessary priority over competing emic claims. From Harris’s perspective, the etic approach is useful in making objective determinations of fact, and etic claims to knowledge are necessarily superior to competing emic claims. Pike believes that objective knowledge is an illusion, and that all claims to knowledge are ultimately subjective; Harris believes that objective knowledge is at least potentially obtainable, and that the pursuit of such knowledge is essential for a discipline that aspires to be a science.

The central questions regarding the relationship between the emic and the etic are obviously questions of epistemology and ontology -- concepts we've explored in relation to basic and applied research (see the chart) and to various theoretical frameworks and methodologies.

Defining Concepts and Sampling

You may recall from our lesson on "Surveying the Field" that we discussed the difference between "constitutive definitions" and "operational definitions." Qualitative research starts with identifying your key concepts, which are informed by your theoretical framework, and the "portion of reality" -- the field -- you plan to study. You then have to determine how to operationalize those concepts by specifying procedures that enable you to observe or measure them. How do you take something as nebulous as "gender roles" or "negotiated meanings" and turn it into something observable and measurable -- something empirical?
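One toy way to see the jump from constitutive to operational: a concept stays abstract until you specify a procedure that yields an observation. The Python sketch below invents a crude operational definition of "heavy media use"; the self-report measure and the 240-minute threshold are assumptions for illustration, not a defensible instrument.

```python
# Constitutive definition: "heavy media use" = media occupying a large
# share of a person's leisure time. Still abstract -- nothing to observe.

# Operational definition (invented for illustration): a person counts as a
# "heavy user" if their self-reported daily screen time exceeds 240 minutes.
def is_heavy_user(daily_screen_minutes: int) -> bool:
    return daily_screen_minutes > 240

print(is_heavy_user(300))  # True
print(is_heavy_user(90))   # False
```

The point isn't the threshold; it's that the procedure -- ask, record minutes, compare -- is what makes the concept empirical at all.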

But even if you're addressing only a "portion of reality," you can't possibly observe that entire portion. If you're studying video game use among teenage girls, you can't possibly observe all the teenage gamers in the world. If you're studying married female romance novel readers (remember Radway from our first lesson?), you can't possibly interview every member of the population. You need to sample -- and do it in such a way that you can generalize from the "empirical microcosm" to the macrocosm. How can you make larger, population- or universe-wide claims based on what you discovered in the sample you studied? How will you have something to say about all adolescent girls based on the conclusions you drew from the ten you observed or interviewed?

In qualitative research, our research subjects aren't always people -- so, in some cases, our sample might consist of other units of analysis, like settings, activities, events, etc. Furthermore, according to Jensen, in qualitative research, sampling is often multi-step: we first determine the relevant context of meaningful events or appropriate populations, and then sample from within that group. Sampling is part of the interpretive process: "The qualitative research process amounts to a continuous operationalization and refinement of theoretical concepts with reference to empirical evidence generated through several analytical stages" (238).

Here are some common sampling procedures (Patton 1990):

  1. Extreme or Deviant Case: learning from unusual manifestations of the phenomenon under examination, e.g., outstanding successes, notable failures, crises, etc.
  2. Intensity: information-rich cases that manifest the phenomenon intensively, but not to an abnormal extreme, e.g., good students, above/below average.
  3. Maximum Variation: purposefully picking a wide range of variation on dimensions of interest; this method enables you to document unique or diverse variations that emerge in different conditions, and to identify patterns that cut across variations.
  4. Homogeneous: reduces variation, simplifies analysis, facilitates group interviewing.
  5. Typical Case: illustrates or highlights what's typical, normal, average.
  6. Stratified Purposeful: illustrates characteristics of particular subgroups of interest; facilitates comparisons.
  7. Critical Case: permits logical generalizations and maximum application to other cases because if it's true here, it's true anywhere.
  8. Snowball or Chain: identifies cases of interest from people who know people who know people who know good research subjects.
  9. Criterion: picking all cases that meet some criterion, such as all children without a computer at home.
  10. Theory-Based or Operational Construct: finding manifestations of a theoretical construct of interest so as to elaborate and examine the construct.
  11. Confirming or Disconfirming: elaborating and deepening initial analysis, seeking exceptions, testing variation.
  12. Opportunistic: following new leads during fieldwork, taking advantage of the unexpected.
  13. Random Purposeful: (still small sample size) adds credibility to the sample when the potential purposeful sample is larger than one can handle; reduces judgment within a purposeful category.
  14. Politically Important Cases: attracts attention to the study (or avoids attracting undesired attention by purposefully eliminating from the sample politically sensitive cases).
  15. Convenience: saves time, money, and effort; poorest rationale; lowest credibility.
  16. Combination or Mixed Purposeful: triangulation, flexibility, meets multiple interests and needs.
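As a deliberately toy illustration of how two of these procedures can chain together, here's a Python sketch of criterion sampling (#9) followed by random purposeful sampling (#13): build a purposeful pool by criterion, then draw randomly within it when the pool is still too big to handle. The pool of "gamers" is invented.

```python
import random

# Hypothetical pool of potential subjects: (name, age, gender) tuples.
gamers = [
    ("Ana", 14, "F"), ("Bea", 17, "F"), ("Cal", 15, "M"),
    ("Dee", 13, "F"), ("Eve", 19, "F"), ("Fay", 16, "F"),
]

# Criterion sampling (#9): keep every case that meets the study's
# criterion -- here, teenage girls.
pool = [g for g in gamers if g[2] == "F" and 13 <= g[1] <= 19]

# Random purposeful sampling (#13): the purposeful pool is still larger
# than we can interview, so draw a small random subsample from it.
random.seed(42)  # seeded only so the toy run is reproducible
sample = random.sample(pool, k=3)
print(sample)
```

Note that the randomness here adds credibility to the selection within the purposeful category; it does not make the tiny sample statistically generalizable.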

RP: Which of the sampling procedures above might be most appropriate for your research project? How might different sampling procedures produce different results?

Much information from the following discussion of qualitative methods is drawn from Roger D. Wimmer & Joseph R. Dominick, Mass Media Research: An Introduction, 7th ed. (Thomson, 2003).

Case Studies

Sampling is of utmost importance in case studies because this method involves choosing and providing a detailed contextual analysis of a limited number of events, people, organizations, settings, etc. The sample is extremely small -- it may even consist of a single case -- so its selection is a primary concern.

Yin (Case Study Research, 3rd ed., Sage, 1994) defined a case study as "an empirical inquiry that uses multiple sources of evidence to investigate a contemporary phenomenon within its real-life context, in which the boundaries between the phenomenon and its context are not clearly evident" (paraphrased in Wimmer & Dominick 129). This method is frequently used in medicine, clinical psychology, anthropology, management science, and history.

Merriam (Case Study Research in Education, Jossey-Bass, 1988) lists four essential characteristics of a case study:

  1. Particularistic: focuses on a particular situation, event, program, phenomenon
  2. Descriptive: final product is a detailed description
  3. Heuristic: new interpretation, perspectives, and meanings are all goals of a case study
  4. Inductive: principles and generalizations emerge through examining the data (W&D 129)

Advantages of Case Studies:

  • provide tremendous detail
  • may be appropriate when a researcher doesn't know exactly what he or she is looking for
  • useful for a researcher who is looking for clues and ideas for future research
  • useful for revealing the "overdetermined" nature of an event or phenomenon by describing the multiple social forces that have an impact on the subject under examination
  • allow researchers to work with multiple forms of data

Disadvantages of Case Studies:

  • often lack scientific rigor
  • don't allow for generalization
  • may produce massive quantities of data that are difficult to summarize

Conducting a Case Study:


  • What to ask: most appropriate for questions that begin with "why" and "how"
  • What to analyze: what constitutes a "case"? a specific decision, a particular organization at a particular time, a program, a discrete event?

Pilot Study:

  • Construct a study protocol: overview of the project; procedures necessary for gaining access to site or study subjects; methods of accessing records; schedule for data collection; discussion of logistical challenges; questions central to the inquiry and sources of information that will be used to answer those questions
  • Conduct a pilot study: used to refine the research design and field procedures

Data Collection:

  • Sources of evidence: documents, archival records, interviews, observation or participant observation (discussed later), physical artifacts
  • Use of multiple sources is recommended for triangulation

Data Analysis: (see below for a discussion of Qualitative Data Analysis)

  • pattern matching: "an empirically based pattern is compared with one or more predicted patterns"
  • explanation building: "researcher tries to construct an explanation about the case by making statements about the cause or causes of the phenomenon under study" (see "analytic induction strategy" in the "data analysis" section below)
  • time series analysis: "the investigator tries to compare a series of data points to some theoretic trend that was predicted before the research, or to some alternative trend" (W&D 132)

Field Observation

Field observation can take many forms -- and it varies according to (1) the degree of the researcher's participation in the behavior under observation and (2) the degree to which the observation is concealed. The choice -- to participate or not, to make your purpose overt or not -- depends on the research question, the degree of cooperation from the group or individual observed, and ethical considerations. RP: Will making your purpose known, or joining in on their activities, influence your subjects' behavior? Will keeping your purpose secret constitute deception?

Examples: Gieber's study of gatekeeping in the newsroom (1956); Epstein's description of network news operations (1974); Lemish's study of television viewing by infants and toddlers (1987); Durham's observation of peer group interactions among adolescent girls in an attempt to study the impact of media on their sexuality (1999). RP: Can you think of any examples of observation used in applied research?

Advantages of Field Observation:

  • "often helps the researcher to define basic background information necessary to frame a hypothesis and to isolate independent and dependent variables" (we'll discuss these in our lesson on quantitative research) (W&D 117)

  • "since the data are gathered first-hand, observation is not dependent on the subjects' ability or willingness to report their behavior" (W&D 117)
  • "an observer is able to empathically become the central instrument of research, relying on several sensory registers and on diverse media information" (Jensen 242)
  • may provide access to groups who would be difficult to reach through any other method
  • usually inexpensive
  • takes place in a natural setting

Disadvantages of Field Observation:

  • difficult to generalize and validate data
  • "since field observation relies heavily on a researcher's preconceived notions about the material under study, experimenter bias may favor specific preconceptions of results, while observations to the contrary are ignored or distorted" (W&D 117)
  • reactivity: the process of being observed may influence subjects' behavior

Doing Field Observation:

Choosing the Research Site:

  • "efforts should be focused on a smaller field which is explored both for relevant phenomena and for descriptive categories" (Jensen 243)
  • the behavior under observation should occur with sufficient frequency
  • the setting should fit the recording forms and instruments you plan to use
  • you should select two or three potential sites and "hang around" to discover their advantages and disadvantages
  • avoid choosing a site where you are well known or have some involvement

Gaining Access:

  • Lindlof offers three suggestions: (1) identify the scene's gatekeeper and attempt to persuade him/her of the project's relevance; (2) find a sponsor who can vouch for the usefulness of the project and can help locate participants; and (3) negotiate an agreement with participants (quoted in W&D 119)

Sampling:

  • See the sampling procedures described above; for observation, maximum variation, snowball, extreme case, politically important case, and typical case sampling are well suited.

Collecting Data:

Field notes & Research Diary: try to separate notes of what happened and what was said (field notes) from personal impressions, feelings, interpretations (research diary). Practice Geertz's "thick description": explain the context of the practices and discourse that take place within a society, such that these practices become meaningful to an "outsider." RP: Could we say that Geertz is advising us to make the emic etic?

Darren Newbury has written an article on "Diaries and Fieldnotes in the Research Process" for Research Issues in Art & Design. Please download and skim. Public Interest Anthropology @ Penn also provides some advice for creating fieldnotes.

[The image to the left shows the field notes of William Clark. That's the "Clark" of Lewis & Clark.]

Researchers are now using photography and video to document their observations, and we'll be discussing the use of media as research instruments in our next two lessons.

Other methods for collecting data include asking the research subjects to keep their own diaries, or providing them with cameras and asking them to make photo essays or keep photographic diaries.

With a little creativity, you might also think of some unobtrusive means of collecting data. Wimmer & Dominick provide one example: if you wanted to determine the most popular radio station in a particular era, you might ask auto mechanics to keep track of the dial positions of cars in their shops. Webb, Campbell, Schwartz, and Sechrest (Unobtrusive Measures, Rand McNally, 1968) identify two other unobtrusive measures. Erosion measures wear & tear: if, for instance, you wanted to determine which textbooks are most heavily used, you might look at the highlighting and dog-eared pages in used textbooks. Accretion examines deposits, such as the amount of dust that accumulates on the screen of an unused television set. Of course, the internal validity of these measures is questionable; you're never quite sure whether "dustiness" is an appropriate operationalization of the "lack of use" variable. RP: What other unobtrusive ways might you gather data about your research subjects?

Analyzing Data:

  • See our discussion below on "qualitative data analysis." As Jensen notes, analyzing data gathered during field research is characterized by "differentiated and staggered process of analysis, interpretation, and self-reflexivity" (Jensen 243). Field research, like much qualitative research, has the "...qualitative ambition of searching out one's analytical categories in the field itself -- even though research questions and purposes inevitably orient a study" (Jensen 243). We'll discuss this on-site theory-building in the "data analysis" section below.

Leaving the Scene:

  • You need a plan for leaving the scene -- and must do all that you can to prevent any kind of harm to those under study. RP: Why is this even a concern? How might your leaving the field or scene cause problems?

Here's Family Health International's "Participant Observation" guide, which includes info on the strengths and weaknesses of the method, ethics, logistics, documentation, etc.

Intensive Interviews

We all know what an interview is, so I won't insult your intelligence by providing a definition.

Advantages of Intensive Interviews (W&D 127):

  • They provide detailed background about the reasons why respondents give specific answers -- and, in comparison with other qualitative methods, intensive interviews provide more accurate responses on sensitive issues. The rapport between interviewer and interviewee makes it easier to broach certain topics that would be inappropriate in other contexts.
  • They allow for lengthy observation of respondents' nonverbal responses.
  • They are usually very long; some may last several hours or several sessions. The length allows for great depth -- but is, of course, also a disadvantage.

Disadvantages of Interviews:

  • It is difficult to generalize from interview data.
  • Interviewing is generally done with a nonrandom sample (we'll discuss this in our lesson on quantitative methods).
  • Since interviews are nonstandardized, each interviewee may answer slightly or dramatically different questions.
  • People don't always say what they think.
  • They are sensitive to interviewer bias. And, as a result...
  • They present problems for data analysis -- especially if the person coding the data is not the person who conducted the interview.

You may recall from our lesson on historical research that interviews are often used to collect oral histories. Interviewing is also common in production research. Many students who are researching "the making of" a television show or film, or who are studying media ownership or the organization of a specific media corporation, decide early on that an interview is imperative -- and they shoot straight for the top of the corporate ladder. It should come as no surprise when the CEO of News Corp doesn't return your phone call -- and, in most cases, it's really no loss.

Before you go to the trouble of scheduling an interview with a high-profile filmmaker or media executive, ask yourself: Is it really worth it? Jane Stokes, author of How to Do Media and Cultural Studies, says that “You might find that the information you need is readily available from the public relations or marketing department of a company…. Interviews should be used…only for eliciting personal attitudes and opinions. So you should embark on an interview study only if your primary object of analysis is the words of your interviewees….” (Stokes 118).

Steps in Interviewing:

Stokes identifies several (rather obvious) steps of the interview process:

  1. Select your interviewees carefully: “Interview as few people as necessary to conduct your study…. Don’t assume that you have to interview the chief executive officer of a company to get reliable information” (Stokes 118) [RP: Think about your own studies; what underrepresented or unrecognized voices might contribute a new perspective in your research?];
  2. Decide on how you are going to conduct the interviews (email, phone, letter?) [RP: How might your choice of medium impact the structure or content of the interview?];
  3. Conduct background research: going into the interview well-prepared, and demonstrating that you've done your homework, increases your credibility and often makes your subjects more willing to speak more candidly and about more sophisticated topics [duh.];
  4. Plan the interview: create a list of questions and topics; practice;
  5. Conduct the interview: record it and, at the same time, take notes on key comments [RP: Although this, too, might sound insultingly obvious, "how to record" was actually a very difficult decision for me in my field research. I started out taping my interviews, then realized that my discussants were guarded and self-conscious -- so I eventually scrapped the tape and settled for the occasional scribbled note. What's the trade off? Do any of you have personal experiences to share?];
  6. Transcribe your tapes: you can hire a transcription service to do this for you; include the transcript as an appendix to your research report;
  7. Reflect

UT Austin provides a helpful guide for the interview process, with advice for all stages -- from deciding what type of interview to conduct, to drafting questions, to transcribing data. And Canada's International Development Research Centre offers a guide for interviews and questionnaires; there's a lot of great information about creating effective, unbiased questions. The Qualitative Report also features several papers, including this one, about using interviews in qualitative research; topics include (1) types of interviews, (2) interviewer tasks and skills, (3) sources of error and bias, (4) preparing and conducting the interview, and (5) handling interview data.

Sound Portraits is a production company "dedicated to telling stories that bring neglected American voices to a national audience." Interviewing is a primary methodology in producing their radio projects. Producer David Isay even provides an "Interview Guide" for producing your own radio documentary. And while you're at it: stop by StoryCorps, too, and NPR's Recording America.

Other Resources:

Public Interest Anthropology @ Penn, "Methods: Interviews"

Delphi Face-to-Face Interviewing

Focus Groups

Please see Melinda Lewis's "Focus Group Interviews in Qualitative Research: A Review of the Literature" and Anita Gibbs's "Focus Groups" for an overview. Johanna Moscoso's paper, which you reviewed a few weeks ago, also discusses the use of focus groups in applied research: specifically, in MTV's development research. We can also see a modified focus group format in Eric Zimmerman's game design research. RP: Why are focus groups such a popular production research methodology -- particularly in applied research? What are their drawbacks?

Action Research

There is also a movement to use the qualitative research process itself to solve real-world problems. Action researchers work with and for people rather than simply conducting research on them. Here are some sites that focus on action research:

Action Research Open Web, University of Sydney
Action Research Electronic Reader, Ian Hughes, Ed., Southern Cross University, Australia
Action Research Resources, Southern Cross University, Australia

RP: Is there, or could there be, an action research dimension to your work? What real-world problems are media and communication studies best prepared to tackle? With what other disciplines might we partner to address these issues?

Qualitative Data Analysis

Remember that one of the distinguishing characteristics of qualitative research is that the key concepts are defined and redefined as part of the research process; methods, theoretical concepts, and analytical procedures are all emergent. Interpretation and analysis therefore aren't reserved until the data collection is complete; to the contrary, the researcher engages in both processes throughout the research process.

Furthermore, like most other steps in the qualitative research process, data analysis is iterative. As Jensen notes, "the movement from fieldnotes, tapes, and transcripts to final research reports comprises several steps -- memoing, modeling, drafting, recontextualizing -- each of which lends itself to documentation" (Jensen 246).

In quantitative research, data analysis is deductive: researchers develop hypotheses, then collect relevant data and analyze them to determine whether or not they can confirm the hypothesis. Qualitative data analysis is inductive: data are collected and grouped into meaningful categories. Explanations emerge from the data themselves. Yet another manifestation of emergence in the qualitative research process!

Researchers might begin by organizing their data chronologically according to the sequence of events that took place during the research process. Then the data are organized into a preliminary category system. These categories might be informed by theory or prior research, or they might arise from the data themselves (more emergence!).

Johnson and Christensen identify several additional processes of data analysis:



  • Interim Analysis: reiterate analysis to get a deeper understanding of the data
  • Memoing: reflective notes to keep track of ideas from the data
  • Data Entry: put data into a more easily analyzable form
  • Segmenting: divide up the data in meaningful ways
  • Coding: to remember themes, concepts, etc., present in the data
  • Enumeration: quantification of data to look for the frequency of patterns or themes
  • Creating Hierarchical Category System: to organize the knowledge so that patterns and themes can be linked with other patterns and themes
  • Showing Relations Between Categories: to figure out relationships between coding categories; to look for patterns and themes
  • Drawing Diagrams: to understand and communicate complex systems
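A few of these processes are mechanical enough to sketch in code. Here's a toy Python illustration of enumeration -- tallying how often each theme code appears across coded segments. The segments and codes are invented, and nothing here is prescribed by Johnson and Christensen; it just shows what "quantification of data to look for the frequency of patterns" can mean in practice.

```python
from collections import Counter

# Hypothetical coded segments: each interview excerpt has been tagged
# with one or more theme codes during the coding step.
coded_segments = [
    {"text": "I only play after homework", "codes": ["parental rules"]},
    {"text": "My friends all play the same game", "codes": ["peer influence"]},
    {"text": "Mom checks the ratings first", "codes": ["parental rules"]},
    {"text": "We trade tips at lunch", "codes": ["peer influence", "expertise"]},
]

# Enumeration: count how frequently each theme appears across all segments.
counts = Counter(code for seg in coded_segments for code in seg["codes"])
print(counts.most_common())
```

The counts don't interpret anything by themselves; they just tell you which emergent categories are carrying the most data, which can guide the next round of analysis.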

Researchers often develop typologies, indices, and graphic representations, like models and figures, to help them organize and find meaningful patterns among their data.

Glaser and Strauss (1967) and Lincoln and Guba (1985) describe one of the most common analytical induction techniques: the constant comparative technique:

  1. Incidents are assigned to categories by comparing them to other units already assigned. If some units don't fit any preexisting category, you may have to create new classifications.
  2. While refining categories, you write rules or propositions that describe the underlying meaning that defines the category. These rules help to refine your categorization and help you to "explore the theoretical dimensions of the emerging categorization system."
  3. You search for relationships and patterns across categories, and examine your propositional statements and look for connections.
  4. Finally, you simplify and integrate your data into a coherent theoretical structure. (Wimmer & Dominick, 2003, 112-3)
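To make step 1 concrete, here's a toy Python sketch of the comparison-and-assignment loop. It substitutes shared keywords for the researcher's interpretive judgment -- a crude assumption no real project would make -- but it shows the mechanic: compare each incident to units already assigned, and create a new classification when nothing fits.

```python
# Toy constant comparison: assign each incident to the existing category
# whose members share the most words with it; if nothing overlaps,
# create a new category (step 1 above).

def words(incident):
    return set(incident.lower().split())

def assign(incidents):
    categories = []  # each category is a list of incidents
    for incident in incidents:
        best, best_overlap = None, 0
        for cat in categories:
            # compare the new unit to units already assigned
            overlap = max(len(words(incident) & words(member)) for member in cat)
            if overlap > best_overlap:
                best, best_overlap = cat, overlap
        if best is None:
            categories.append([incident])  # no fit: create a new classification
        else:
            best.append(incident)
    return categories

notes = [
    "mother limits screen time",
    "father limits console time",
    "friends recommend new games",
]
print(assign(notes))
```

Steps 2-4 -- writing rules for each category, relating categories, integrating them into theory -- are exactly the parts that resist mechanization, which is why the researcher remains the "human instrument."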

This all might sound intimidating, but, when you get right down to it, it's not much different from the method I sometimes use -- and you might use -- to write a paper. I begin by typing all of my notes, carefully noting sources for each, in a text document. Then, once I reach a point of redundancy in my research -- I start seeing the same information repeatedly in my sources -- I know it's time to stop collecting "data" and to start analyzing it. I print out my notes and read through them two or three times, making mental notes of recurring themes.

And this is where the fun begins: I get out the colored markers! Wheee! I develop a legend, with each color representing a theme, and proceed to color-code all the notes. (Okay, sure -- maybe I am a bit obsessive. You got a problem with that?) If some notes don't fit into one of my color-defined categories, I realize I have to revisit my categorization scheme -- perhaps add a new category. If particular notes could fit in either of two categories, I consider merging them or refining their boundaries.

I then return to the Word document and cut and paste everything so that all the notes are reorganized by theme -- and then reorganize all the notes within each thematic section. Through this iterative process, the organization -- and, eventually, my argument -- emerge. Iterative and emergent: there they are again! If we were on Pee-wee's Playhouse (seriously -- you've got to follow this link), I'd make "iterative" and "emergent" the words of the day! Aaaaaaauuugggghhh!!!!

According to Jensen, though, there is such a thing as too much emergence. Grounded theory "tends to assume that theory can be 'found' in the field, if the research activity is sufficiently 'grounded' in the categories of that field" (Jensen 247). In this case, though, emergence doesn't just refer to the data analysis stage; it begins with data collection -- or even before: in defining the research question. The grounded theorist assumes that through several iterations of sampling, analyzing, "memoing" (see Johnson & Christensen's chart above), and interpreting data, and by coding the data at varying levels of abstraction, we'll eventually reach a point of "theoretical saturation" -- we'll achieve an "equilibrium between empirical evidence and explanatory concepts" -- and voilà! (hence the genie): a theory appears! (Jensen 247). Grounded theory, Jensen says, is often used to justify an "inductive" approach -- which is common in qualitative research -- but, in actuality, the title is often little more than an excuse for leaping into the field theoretically naked -- with no theoretical presuppositions. The shortcomings of grounded theory, Jensen says, "...may testify to the general weakness of some earlier qualitative data analysis, and to the need to further develop such approaches as thematic coding and discourse analysis" (Jensen 248).

The constant comparative technique is not lacking in rigor. But perhaps Mr. Jensen would be more satisfied with something a little less "emergent": perhaps the analytic induction strategy? Hmmmm?

  1. Define a topic of interest and develop a hypothesis.
  2. Study a case to see whether the hypothesis works. If it doesn't work, reformulate it.
  3. Study other cases until the hypothesis is in refined form.
  4. Look for 'negative cases' that might disprove the hypothesis. Reformulate again.
  5. Continue until the hypothesis is adequately tested. (Wimmer & Dominick 113)

In this method, the explanation is generated at the beginning of the study, while in the constant comparative technique, the explanation emerges at the end.
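The five steps of analytic induction can be sketched as a loop: test the hypothesis case by case, reformulate on any negative case, and repeat until no negative cases remain. The toy example below is purely illustrative -- a real hypothesis is a substantive claim about meaning, not a numeric threshold -- but the control flow mirrors the strategy.

```python
def analytic_induction(cases, hypothesis, reformulate):
    """Test a hypothesis against each case; on a negative case,
    reformulate it and start over. Return the hypothesis once it
    survives every case."""
    while True:
        for case in cases:
            if not hypothesis(case):                    # a negative case
                hypothesis = reformulate(hypothesis, case)
                break                                   # retest from the top
        else:
            return hypothesis                           # adequately tested

# Hypothetical stand-in: "heavy users watch more than t hours."
# Each negative case lowers the threshold until all cases fit.
cases = [5.0, 4.5, 3.5]
make_h = lambda t: lambda hours: hours > t
threshold = [4.0]

def reformulate(h, case):
    threshold[0] = case - 0.5
    return make_h(threshold[0])

final = analytic_induction(cases, make_h(4.0), reformulate)
```

Note where the work happens: unlike the constant comparative technique, the explanation exists before the first case is examined, and the data's job is to break it.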

John Seidel of Qualis Research, creators of a program intended to facilitate qualitative data analysis, provides a useful 15-page booklet on qualitative data analysis, which I urge you to download and skim.

And here are some screen shots from Ethnograph, Qualis's program. Although I can't vouch for the quality of the software, I think these screen shots might give you a better sense of what researchers do when they're coding data. Click on the image to go to the Ethnograph website, and choose the "screen shots" button in the bar on the left.

Activity: Search the New School Library’s electronic resources – particularly Communication Abstracts and Communication and Mass Media Complete in the Periodical Databases – to find some articles that feature research using these methodologies. All you have to do is search for "interview" or "focus group" or "participant observation." Of course, if your search yields too many "popular" titles, like AdAge and Media Week, try adding a search term -- like "methodology" -- that you're likely to find in scholarly journals and not in the popular press. Read the abstracts (and skim full-text, if available) of a few of these articles to get a sense of what questions researchers are posing, how they’re putting qualitative methodologies to use, and what conclusions these methods allow them to draw. Here are just a few examples -- some potentially interesting, some that are good for a laugh. Note not only the variety of research topics and the methods used to address them, but also the range of journal titles; there are innumerable venues for publication in our field.

Denise Sevick Bortree, "Presentation of Self on the Web: An Ethnographic Study of Teenage Girls' Weblogs" Education, Communication & Information 5:1 (March 2005): 25+. Abstract: Through their use of weblogs, teenage girls are bridging their offline and online relationships. As the girls use this medium to construct themselves and their relationships, they must address the dual nature of weblogs as a tool for interpersonal communication and mass communication. This study examines two aspects of teen girls’ blog use: (1) challenges and hazards of conducting interpersonal communication in a mass medium, and (2) self-presentation strategies used to negotiate a dual audience. Methodology for the study included an ethnographic study of 40 weblogs, an in-depth analysis of six weblogs and a set of 13 in-depth interviews. (We'll learn more about ethnography in upcoming lessons.)

Chris Atton & Emma Wickenden, "Sourcing Routineness and Representation in Alternative Journalism: A Case Study Approach" Journalism Studies 6:3 (Aug. 2005): 347-359. Abstract: This study is a first attempt to examine how the alternative media select, represent and deploy their news sources. The literature suggests that, in contrast to mainstream sourcing routines, the alternative media privilege “ordinary”, non-elite sources for their news and, through what has been termed native reporting (Atton, 2002a), offer such sources a platform to speak directly to audiences. The primary research examines a single publication, the UK activist newspaper SchNEWS. The paper's sourcing routines are examined through a triangulated approach that combines interviews with content and discourse analysis. Superficially the findings confirm what the literature argues: that the paper does indeed privilege “ordinary” sources above elite sources. However, the depth of the study reveals nuances that are absent from the literature. In particular, the findings of the study suggest that a counter-elite dominates sourcing practices at SchNEWS, and the deployment of these sources is just as reliant on expertise, authoritativeness and legitimacy as are mainstream sourcing routines. Strikingly, the paper's use of “ordinary” citizens (that is, those not explicitly politicised through grassroots activism) is very low, suggesting that the paper's counter-elite sourcing practice is determined more by its own political ideology than by any radical media philosophy. [ABSTRACT FROM AUTHOR]

Aeron Davis, "Media Effects and the Active Elite Audience: A Study of Communications in the London Stock Exchange" European Journal of Communication 20:3 (Sept. 2005): 303-326. Abstract: This article explores the impact of communications on investor behaviour and trading patterns in the London Stock Exchange (LSE). The significance of the work is two-fold. First, for many observers, the wild trading patterns that regularly occur in stock markets suggest the presence of 'strong' media effects in action; a finding in conflict with mainstream audience research. Second, the audience investigated is an elite one that most 'actively' consumes its media and is rarely observed in studies of media effects. The research thus attempts to identify in what ways the media are implicated in stock market movements and, at the same time, how exactly this active, elite audience makes use of its media. The empirical material presented here is primarily that gained from interviews with elite fund managers at the London Stock Exchange, Europe's largest financial centre. [ABSTRACT FROM AUTHOR]

S. Humphreys, "Productive Players: Online Computer Games' Challenge to Conventional Media Forms" Communication and Critical/Cultural Studies 2:1 (2005): 37-51. Abstract: Computer games are important media to study for a number of reasons. Markets and revenues rival Hollywood, and the player population is a demanding one that has pushed the development of innovations in both technical and interface areas. If we view games as a remarkably successful set of applications in the realm of new media, then understanding how they work becomes a project important for a much broader field of study. The online multiuser game is an exemplar of the emergent structures of interactive media. Social relationships and community networks are formed, and developer/player relationships are negotiated around ongoing development of the game features and player-created content. The line between production and consumption of the text has become blurred, and the lines between social and economic relationships must be redrawn. This article explores these relationships using EverQuest as a case study. It suggests that the dynamic, mutable, and emergent qualities of the online multiplayer game exceed the limits of the reifying processes embodied by copyright law and content regulation regimes.

Z. Wang, Z. Liu & S. Fore, "Facing the Challenge: Chinese Television in the New Media Era" Media International Australia 114 (2005): 135-146. Abstract: Since the mid-1990s, China has been on a fast track to becoming the world's largest market and powerhouse for digital media, communications technologies, and the Internet. Although in the West the Internet economy has experienced difficulties from the late 1990s, China has seen phenomenal success in both technological progress and market development. By the end of 2003, China had more than 400 million phone users, of which 200 million were mobile phone users. The Internet now covers more than 240 counties and regions in China, and it is expected that by 2005 more than 130 million people will have access to the Internet. More recently, China has begun unveiling its next-generation Internet. This paper examines current developments in new media and Chinese television. In particular, a case study of the Spring Festival Eve Gala 2002, sponsored by China Central Television Station, is presented. Despite the rapid development of digital technology and new media in recent years, Chinese television is unlikely to be transformed quickly. It is proposed that coevolution and convergence with new media offer the most effective strategy for the future development of Chinese television. The case study indicates that the current progress in media and communications technologies has set the stage for a gradual and incremental transformation of Chinese television.

G. M. Smith & P. Wilson, "Country Cookin' and Cross-Dressin': Television, Southern White Masculinities, and Hierarchies of Cultural Taste" Television & New Media 5:3 (2004): 175-195. Abstract: For decades, chefs have offered television viewers the spectacle of preparing haute cuisine flawlessly and flamboyantly, making dishes with exotic ingredients and demonstrating a range of specialized equipment and techniques. More recently, a new wave of shows has taken over the airwaves with campy, energetic hosts dishing out gourmet international fare in a spectacle of performative showmanship. And then there is the show discussed in this article, Cookin' Cheap, a regionally produced, nationally available cooking show that mixes Southern humor with overtly cheap recipe preparation. "Cheapness" (of food and cooking technique) is positioned as both nostalgia for simpler "country" values and as differentiation from slick television norms. By showing the hosts engaging in time-consuming, often clumsy food preparation, the show pokes fun at the flawless professionalism of television cooks, reemphasizing the chaos of the domestic kitchen. Based on viewer letters, textual analysis, and ethnographic participant observation, this article discusses the way Cookin' Cheap makes a place for the viewer in the text, reappropriating strategies of earlier television. When the show's hosts impersonate their aunts in unconvincing drag, they also emphasize the passing of tradition from matriarchal figures to an underexamined form of masculinity: the feminized Southern working-class man.

B. J. Guzzetti & M. Gamboa, "Zines for Social Justice: Adolescent Girls Writing on Their Own" Reading Research Quarterly 39:4 (2004): 408-436. Abstract: Despite the popularity of self-published teen zines, few studies have been conducted of the adolescent girls who write and read them. Past research on teens' reading and writing shows that adolescents read and write along stereotypical or gendered lines. This study explores the out-of-school literacy practices of three adolescent girls who write and publish their own zines by writing against gender, race, and class stereotypes. The study identifies what motivates and enables these girls in writing differently on their own and describes how young women use and develop their literacy skills to enable them to form and express their identities. Methods of participant observation were used to address these questions. Findings have implications for student-centered instruction by identifying relevant ways to engage adolescents in literacy activity.


If you are interested in using qualitative methods to study computer-mediated communication, see the following:

  • Thomas R. Lindlof and Bryan C. Taylor, “Qualitative Research and Computer-Mediated Communication” Qualitative Communication Research Methods (Thousand Oaks, CA: Sage, 2002): 247-278.
  • More Information on Internet Research

For more on qualitative methods, see the following:

  • Qualitative Research Panel @ the University of Auckland: panelists discuss such topics as interview methods, equipment, and protocol; data analysis; self-reflexivity; participant involvement; focus groups; relationships and power
  • Thomas R. Lindlof and Bryan C. Taylor, “Asking, Listening, and Telling” Qualitative Communication Research Methods (Thousand Oaks, CA: Sage, 2002): 170-208.
  • George Gaskell, “Individual and Group Interviewing” In George Gaskell & Martin Bauer, Eds., Qualitative Researching With Text, Image and Sound: A Practical Handbook for Social Research (Thousand Oaks, CA: Sage, 2000): 38-56.
  • Sandra Jovchelovitch and Martin W. Bauer, “Narrative Interviewing” In George Gaskell & Martin Bauer, Eds., Qualitative Researching With Text, Image and Sound: A Practical Handbook for Social Research (Thousand Oaks, CA: Sage, 2000): 57-74.
  • Uwe Flick, “Episodic Interviewing” In George Gaskell & Martin Bauer, Eds., Qualitative Researching With Text, Image and Sound: A Practical Handbook for Social Research (Thousand Oaks, CA: Sage, 2000): 75-92.
  • Colin Robson, “Flexible Designs” In Real World Research: A Resource for Social Scientists and Practitioner Researchers, 2nd ed. (Malden, MA: Blackwell, 1993): 163-200.
  • Colin Robson, “Interviews” In Real World Research: A Resource for Social Scientists and Practitioner Researchers, 2nd ed. (Malden, MA: Blackwell, 1993): 269-291.
  • Colin Robson, “Observational Methods” In Real World Research: A Resource for Social Scientists and Practitioner Researchers, 2nd ed. (Malden, MA: Blackwell, 1993): 309-345.
  • Thomas R. Lindlof and Bryan C. Taylor, “Observing, Learning, and Reporting” In Qualitative Communication Research Methods (Thousand Oaks, CA: Sage, 2002): 132-169.
  • Colin Robson, “The Analysis of Qualitative Data” In Real World Research: A Resource for Social Scientists and Practitioner Researchers, 2nd ed. (Malden, MA: Blackwell, 1993): 455-499.
  • Thomas R. Lindlof and Bryan C. Taylor, “Qualitative Analysis and Interpretation” Qualitative Communication Research Methods (Thousand Oaks, CA: Sage, 2002): 209-246.
  • Thomas R. Lindlof and Bryan C. Taylor, “Human Subject Protections in Qualitative Research” Qualitative Communication Research Methods (Thousand Oaks, CA: Sage, 2002): 90-97.
  • And then there's "Sense-Making": Make of it what you will.
  • The Qualitative Research and Health Working Group, Liverpool School of Tropical Medicine, "Glossary of Qualitative Research Terms" (2003).
  • Lots of qualitative research links...and more...and more...and more...and more. Sheesh! They're everywhere!

RP: Which of these methodologies might be appropriate for your project? Why or why not?