One of the defining characteristics of grounded theory methodology is the cyclic nature of the research process. Grounded theory methodology provides the framework and tools for data collection and analysis, which occur concurrently and thus remain closely linked as the research progresses in successive stages. In this way, theory emerges in close relationship to data collection and analysis. Grounded theory is underpinned by the following seven key characteristics: theoretical sensitivity, theoretical sampling, constant comparative analysis, coding and categorising the data, theoretical memos and diagrams, use of the literature, and integration of the theory (McCann & Clark, 2003). A brief summary of their main elements follows; where variations arise from the different versions of grounded theory, the differences are identified.
3.5.2.1 Theoretical sensitivity
Corbin and Strauss (2008) define sensitivity as “the ability to pick up on subtle nuances and cues in the data that infer or point to meaning” (p.19). Sensitivity is thus a quality of the researcher’s approach to the investigatory process. The researcher can demonstrate sensitivity in a number of ways: according to how she interacts with participants during interviews, the questions she asks, and how she responds to the data. Theoretical sensitivity is commonly referred to as a starting point of seeking some clarity or insight
43 Predictability relates to transferability of a study’s findings, which is discussed in Chapter 4.7.3.
about the phenomenon of interest (Strauss & Corbin, 1998). It is a beginning point in the analytic process because it allows the researcher to move beyond a surface understanding and gain access to the meanings in the data at a deeper level. It also allows the data to speak for itself and not be driven by the researcher’s assumptions or preconceived ideas.
Theoretical sensitivity tends to be discussed in terms of its contrasting relationship to objectivity. There is an ongoing debate within the literature concerning sensitivity versus objectivity, and how the researcher influences the research situation (Holloway &
Wheeler, 1996). In order to gain rich data, the researcher becomes immersed in the research setting yet must maintain some analytic distance. Strauss and Corbin (1998) advocate a balance between sensitivity and objectivity: a situation where the researcher is reflexive and aware of her own assumptions, in order to avoid bias and be truly open to uncovering participants’ perspectives and the new.44
Theoretical sensitivity is a research skill that the researcher can develop by being aware of her assumptions concerning the area of interest and by adopting a mindset that is open to research participants’ views and experiences. The literature, professional background, and personal experience can be sensitising sources if used in ways that give meaning to events without imposing extant ideas onto the data. Theoretical sensitivity continues to be developed as the research progresses through the use of the analytic tools of grounded theory, such as asking questions, making theoretical comparisons, theoretical sampling, and coding techniques. These tools stimulate the inductive process and further sensitise the researcher to discovering the relevant properties and dimensions of emerging categories and concepts (Strauss & Corbin, 1998).
3.5.2.2 Theoretical sampling
Theoretical sampling is “a method of data collection based on concepts/themes derived from data” (Corbin & Strauss, 2008, p.143). It differs from other purposive sampling techniques in that it pertains only to the ongoing development of concepts and theory. It is the process of obtaining selective data to refine, check, and expand major categories (Charmaz, 2006). Initially the researcher collects data from a wide range of sources and, as analysis takes place and concepts arise, the emerging theory influences further decisions regarding participants and the nature of new data to be collected. Sampling is cumulative and no longer predetermined as at the start of the project. It builds upon previous data collection and analysis and, with time, becomes more specific. Sampling becomes based
44 See Chapter 4.4 for discussion of the researcher’s expectations and assumptions.
on the concepts that have emerged from data analysis, and its purpose is to further develop these concepts in terms of their properties and dimensions (Strauss & Corbin, 1998), until no new properties emerge. This point is referred to as data saturation: when no new information is forthcoming and the researcher determines that, for the purpose of the study, each category has been sufficiently well developed in terms of its properties and dimensions, and under different conditions (Corbin & Strauss, 2008).
Theoretical sampling provides direction for gathering more focused data in order to answer analytic questions, fill conceptual gaps, and help the researcher to fit the emerging theory with the data (Charmaz, 2006).
3.5.2.3 Constant comparative analysis
Constant comparative analysis is a basic procedure in grounded theory that continues throughout the analytic process in conjunction with other, more defined analytic tools, such as theoretical sampling and coding. Its purpose is to find similarities and differences in the data. Constant comparative analysis involves asking questions that will lead to greater theoretical understanding of the phenomenon of interest. Questions are also directed at identifying variations in patterns found in the data, through comparison of their properties and dimensions under different circumstances (Strauss & Corbin, 1998).
Glaser’s classical grounded theory approach favours constant comparison of event to event and event to emerging concept as a technique that enhances natural emergence of theory (Grbich, 2003). Doing constant comparative analysis is described as an art (Charmaz, 2006; Strauss & Corbin, 1998), as the researcher’s creativity, line of questioning, and use of comparison are what yield rich data and innovative analyses. This technique aims to stimulate thinking about the research question and the data in a variety of ways: by enhancing sensitivity, guiding the research direction and theoretical sampling, and developing abstract concepts that ultimately generate theory.
3.5.2.4 Coding and categorising the data
Coding and categorising is the process of defining and making analytic interpretation of the data; it involves “abstracting, reducing, and relating” events, acts, and outcomes (Strauss & Corbin, 1998, p.66). Coding starts by breaking down or fracturing the data to open it up, uncover ideas, look for patterns, and develop concepts. It involves labelling segments of the data in a way that simultaneously summarises and accounts for each piece of data (Charmaz, 2006). All data are coded but initial codes tend to be modified or transformed as analysis progresses (Holloway & Wheeler, 1996). Codes with a similar meaning are linked and redefined as categories, which have a higher level of abstraction.
Categories are defined as “concepts that stand for phenomenon” (Strauss & Corbin, 1998,
p.101). As the relationships between categories are further developed by coding techniques, sub-categories, or “concepts that pertain to a category, giving it further clarification and specification” (Strauss & Corbin, 1998, p.101), are also developed. The researcher moves back and forth between the different coding techniques in response to the direction and flow of the research pathway. Strauss and Corbin (1998) identify three levels of coding: open, axial, and selective. Similar levels of coding are found in classical and constructivist grounded theory, but with some differences, the main one being the absence of axial coding.
Open coding is defined as “the analytic process through which concepts are identified and their properties and dimensions are discovered in data” (Strauss & Corbin, 1998, p.101).
Open coding is used at the beginning of analysis to break down the data into discrete parts, in order to name and give meaning to it (Bluff, 2005; McCann & Clark, 2003).
Microanalysis, which involves very careful examination of the raw data, such as line-by-line analysis, is used initially and at times when new events occur in the data. It is a tool to assist in conceptualising data and giving labels to patterns found within it. Microanalysis forces the researcher to focus on the data, without the undue influence of preconceived ideas. When codifying, or assigning names to concepts, the researcher needs to think carefully about her choice of words to describe what takes place in the field of study. The purpose of labelling a phenomenon is to make it recognisable and able to be grouped according to its particular properties or attributes (Strauss & Corbin, 1998).
As codes arise, they can be labelled in two ways: as in vivo codes or as sociological constructs (McCann & Clark, 2003). The first, in vivo codes, use participants’ own words rather than those of the researcher. Various meanings can be attached to a word or phrase, and exploring every possible meaning behind participants’ use of a particular word or phrase is an analytical strategy that avoids the problem of the researcher assigning her own interpretation (Corbin & Strauss, 2008). In vivo codes tend to be found frequently in the data and often contain a sense of imagery, which closely reflects participants’ experiences (Holloway & Wheeler, 1996). Using in vivo codes has the advantage of keeping the analysis close to the data and more accurately representing participants’ meanings. The second way of codifying data is to use the researcher’s words, which are based on a consideration of theoretical knowledge, expertise, and what appears in the data. Sociological constructs have the advantage of providing a more abstract, scholarly perspective but can lack the vivid imagery of in vivo codes.
Strauss and Corbin (1998) view conceptual labelling as a creative research skill because it involves emphasising a phenomenon in terms of its link to the conditional background in
which it is located and on the basis that its name should evoke a sense of imagery that captures a particular action or quality.
Axial coding follows open coding. It is termed ‘axial’ because coding occurs around the axis of a category, as the nature of each category’s properties and dimensions are defined and relationships between categories are further explored. As links between categories and subcategories are made, the data are “reassembled” (Strauss & Corbin, 1998, p.103) or put back together in a different form (Holloway & Wheeler, 1996), as a ‘coherent whole’
according to the emerging analysis (Charmaz, 2006, p.60), and as an important start to building theory. The process draws on the same tools of grounded theory analysis, such as asking questions and making comparisons, using inductive and deductive thinking (Bluff, 2005; McCann & Clark, 2003), which overall, requires a more focused and abstract approach than that needed for open coding.
One of the differences between Glaser and Strauss’s original version of grounded theory and Strauss and Corbin’s later version relates to the addition of axial coding and the specific technique of using a prescribed coding paradigm. Strauss and Corbin introduced these analytic devices intending to help the researcher pose relational questions about the data so that it could be reconstituted in new ways. Glaser (1992) viewed this approach as a distortion of the original ideas of grounded theory, which held that meaning was discovered within the data and allowed to emerge rather than made to fit, or forced into, a predetermined structure (Bluff, 2005).
Selective coding is “the process of integrating and refining the theory” (Strauss & Corbin, 1998, p.143). Coding at this stage becomes more sophisticated as major categories, which contain developing theoretical ideas, are constructed. Selective codes represent recurrent themes and are more abstract, general, and “analytically incisive” than the initial codes that they subsume (Charmaz, 2003b, p.322). Selective coding involves the linking of all categories around a core, also described as the basic social-psychological process or “essence of the study” (Holloway & Wheeler, 1996, p.106). The aim is to discover the core or overriding category, to integrate it with other categories and validate the links between them. The processes used at this stage are theoretical coding, theoretical sorting and saturation, memos, and diagrams.
3.5.2.5 Theoretical memos and diagrams
Theoretical memos and diagrams are tools for conceptualising the data and the relationships between the codes and categories, and for ongoing theory development.
Memos are “written records of analysis” (Corbin & Strauss, 2008, p.117). Essentially, they
consist of exploratory, creative, and spontaneous notes that reflect the researcher’s interaction with the data at that point in time. They can range in style, from free-flowing ideas and questions to well-constructed analytic statements. Memos link coding of the data to the report writing process, and Charmaz (2003b) describes them as “the crucial intermediate step that moves analysis forward” (p.322). Charmaz (2003b) identifies six ways that memo writing helps the researcher: to think about the data, stimulate ideas to check in further interviews, discover gaps in earlier interviews, treat qualitative codes as categories to analyse, clarify and define categories, and make explicit comparisons (p.323). In the current study, when writing memos, I found it helpful to include excerpts from participants’ interviews or observations as the basis around which memos were written. In this way, memos maintain the link between data, the original analysis of the data, and report writing. Memo writing is a tool for taking codes apart analytically, before raising them conceptually to form categories and then delineating categories in terms of their properties and relationships with each other. In this way, memos provide the foundation for the written drafts of segments of chapters of the thesis and the final written theoretical construct.
Diagrams are “visual devices that depict relationships between analytic concepts” (Corbin
& Strauss, 2008, p.117). Diagrams arise from analysis and provide a visual representation of thought. They develop in complexity and clarity as the research progresses and, with time, become more integrative. I found diagrams particularly useful for organising data, capturing the relationships between multiple concepts, and providing a broad overview of the study’s findings. Diagrams force the researcher to work with and present her findings at once, in a systematic way, “in a manner that reduces the data to their essence” (Corbin
& Strauss, 2008, p.125). Together, memos and diagrams work in different but complementary ways to enrich, bring to life, and make explicit the various meanings found in the data.
3.5.2.6 Literature
The place of the literature in grounded theory studies can be confusing. In classical grounded theory, Glaser and Strauss (1967) advocate delaying a review of the literature until after analysis is completed; a practice thought to encourage the researcher to articulate original ideas rather than view the data through the lens of known theoretical constructs. In principle, Glaser (1992) continues to adopt this stance, but Strauss and Corbin have moved away from this original idea on the basis that it can be impractical and ambiguous.
In general, grounded theorists recognise that the researcher already brings a professional
body of knowledge and disciplinary literature with her (Morse et al., 2009) and a review of the literature can be used as a methodological tool in different and specific ways.
A preliminary review of the literature serves a particular purpose. It justifies the need for the study, develops sensitising concepts, and creates a background to the study (McCann
& Clark, 2003). A second literature review is used as an analytical resource. The most common concern about the use of literature is the potential for it to limit creativity and impose existing ideas onto the data. Most grounded theorists agree that what matters is how the researcher uses the literature in a grounded theory study. Some of the advantages of using the literature, noted in previous discussions, relate to its ability to enhance theoretical sensitivity and theoretical sampling. It is contended that the researcher can look to the literature to relate extant theories to the developing theory without necessarily imposing ideas onto the analysis and theoretical findings (Stern, 1985). Charmaz (2006) contends that a thorough and critical review of the literature can lay the foundation for a scholarly discussion in a substantive area and strengthen the study’s findings, which in turn enhances its credibility. McCann and Clark (2003) summarise five key benefits of using the literature: enhancing theoretical sensitivity, providing a useful secondary source of data, giving rise to questions about the data, providing an important means of theoretical sampling, and offering an approach to validating the data. In summary, the literature can be used by the researcher in a number of ways as an analytical device to assist with exploring ideas, thinking about the data, evaluating the trustworthiness of findings, and ultimately enhancing conceptualisation.
3.5.2.7 Integration of theory
Integration is the last step of analysis, where all the research threads are pulled together to create a plausible explanatory framework (Corbin & Strauss, 2008). The categories, which are well developed in terms of their variations, are linked around a core category and the final theoretical construction is refined. Integration of theory is achieved by linking data collection and analysis along the way; a process that continues until a theory with sufficient detail and abstraction is generated. At this stage of the analysis, the researcher might search for a negative case, which often represents a variational extreme or exception that provides a richer conceptual view of a phenomenon. In the present study, the data yielded a broad dimensional range, and while I did not search for a negative case, dimensional extremes were evident in some instances. For example, there were situations where breastfeeding and the baby’s behaviour were deemed to be either significantly improved or worse after osteopathic treatment. This situation, particularly in the latter case, had consequential effects for how the osteopath proceeded at the next visit.
Exploring the processes involved in dealing with this situation added depth to the analysis.
Other strategies likely to be useful for integration include theoretical sampling, memos and diagrams, and selective sampling of the literature and data. These strategies continue to uncover properties of the main categories and to deduce and check hypotheses. The emphasis shifts from exploring to summarising, checking, and filling in logical gaps to assist with conceptualisation, fine-tuning, and integration of the final theory.