This two-part article aims to address a problem I experienced in learning and practicing GTM: after I have read as many methodology articles and chapters as I can, how do I REALLY do it? One glaring gap was the lack of discussion and examples of how the researcher, as a person, engages with the data through GTM: how can a person, with all our subjective perceptions, function well as a tool for data analysis?
Preface: Since the creation of “grounded theory” by Glaser & Strauss in 1967, there have been various developments of the GTM to align with different epistemological and ontological bases in research, resulting in different GTM approaches. I use a critical realist GTM (Hoddy, 2019); a short explainer on the philosophy of critical realism can be found here. As the basic GTM phases are largely the same across GTM approaches (except perhaps for the prior literature review), the article is organised along these phases. ‘Phases’ seems to connote a sequential process: far from it! GTM works best for those who are okay with straddling the in-between of phases and the uncertainty and frustration of iterating through them. (But hey, isn’t that like life? I believe I have grown as a person through practicing GTM.)
Part 1 (link to post) will cover: Literature Integration -> Open Coding -> Axial Coding; while
Part 2 will cover: Theoretical Sampling <-> Interviewing -> Saturation -> Abstraction
Theoretical Sampling
A phrase oft-used, but one of the most challenging to execute in practice, especially for a small-fry PhD researcher who might not have many networks through which to access the sample. (My supervisor also did not have the networks, as our research areas were quite different.)
How did you gain access to the sample?
- Building networks through contributing: I have to caveat that my research topic is not a sensitive one, nor does it involve vulnerable populations. That made access easier in a way. However, I still needed to know people and let people know about my research, and beyond that, to build intangible trust. I joined and contributed to networks related to my research (bonus: I learnt so much and enjoyed it too!). These ‘people’ could also be gatekeepers to the actual sample. I felt a bit paiseh (embarrassed) to write this point, as it felt like I was ‘making use of people’. But one way of reframing it was that there is nothing wrong with making requests; I was also respectful of boundaries and genuine in seeing how I could contribute in return.
- Cold emails: More rejections than acceptances, but still worth it as I had the opportunity to interview people with characteristics that the theoretical sampling required. (Forever grateful to those participants who graciously accepted my request 💛)
- Have confidence in your research: Contributing to the sentiment that ‘I am making use of people’ was the fact that I thought my research sucked. When I could quell this sentiment, I was able to see where my research could contribute to the networks that I was part of. This made it easier to share the research with people who could help.
How do I maintain reflexivity in theoretical sampling?
I imagine this would not be a top-priority question in the GTM process, especially when it is already so hard to get the sample (like, beggars can’t be choosers). But it should be! I would be a hypocrite if I said that I was able to keep a dispassionate stance on the progress of my sampling.
Instead, I propose that reflexivity about the theoretical element of the sampling can be pursued jointly with securing ‘sufficient’ interview data.
- Memo-writing: Jot down theoretical considerations about what characteristics you want in the next few interviewees. These memos can inform updates to the interview guide questions to fill the ‘theoretical gaps’ in the emerging theory. They are also important for demonstrating the rigour of your GTM practice at the write-up stage.
- Constant Review of Codes: (The assumption is that you are collecting and analysing data synchronously.) Parallel to the above point, constantly reviewing the codes and the data within them helps you to: (1) whittle down theoretically irrelevant codes, (2) combine or rename codes to better reflect the fundamental similarities between the data, and (3) compare new data against existing codes through the constant comparative method (see Boeije, 2002). A minimal bookkeeping sketch of this kind of review follows below.
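For readers who keep their codebook in software rather than on paper, here is a minimal, purely illustrative sketch of the bookkeeping just described. Everything in it (the Code class, the review_codebook function, the example codes) is hypothetical and not part of any GTM tool or of my own procedure; it simply shows one way of keeping excerpts, memos, and relevance flags together so that codes can be whittled down and merged on each review pass.

```python
# Hypothetical bookkeeping for constant review of codes (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Code:
    name: str
    excerpts: list = field(default_factory=list)   # interview data filed under this code
    memos: list = field(default_factory=list)      # theoretical notes about the code
    theoretically_relevant: bool = True            # flag set during review

def review_codebook(codes):
    """Drop codes flagged as theoretically irrelevant and merge codes
    that have been renamed to the same label."""
    merged = {}
    for code in codes:
        if not code.theoretically_relevant:
            continue
        if code.name in merged:
            merged[code.name].excerpts.extend(code.excerpts)
            merged[code.name].memos.extend(code.memos)
        else:
            merged[code.name] = code
    return list(merged.values())

# Usage (hypothetical example): after each interview, file new excerpts under
# existing or new codes, memo what the next interviewee should help clarify,
# then re-run the review.
codes = [
    Code("belonging", ["I felt at home there"], ["Compare across career stages"]),
    Code("venue logistics", ["The venue was far"], [], theoretically_relevant=False),
]
for c in review_codebook(codes):
    print(c.name, "-", len(c.excerpts), "excerpt(s)")
```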
Interviewing
Though by no means the only data collection tool for GTM, interviewing is the most commonly used one. Participant observation is another tool. While I am wholly unfamiliar with it, the same principle should be followed: the philosophical base informs the GTM approach, which in turn informs how the tool is applied. Berthelsen and colleagues (2017) discuss how participant observation is applied differently in two GTM approaches.
How did the philosophical base and GTM approach shape my interviewing approach?
- Conceptualisation of interviewing: This sounds strange, but if researchers share their answers to the following questions about the interview process, I suspect each will give slightly nuanced responses. (1) What is the nature of the interview data with respect to knowledge production? (2) What is the role of the interviewer in this process?
- My answers to the above questions, in short: From the critical realist ontology and epistemology, I see interviewees as experts in their experiences of the researched phenomenon. The interview data represent individual realities. Concurrently, as a social science researcher, I have the responsibility to use my analytic lens to pare down these realities and understand the essence behind their fundamental similarities. This especially includes structural influences, which individual interviewees may or may not be reflexively conscious of in their experiences. I can abstract social-milieu themes from these individual realities because they are intersubjective.
Okay, so how did you actually conduct the interviewing?
- Preparation of the researcher as the tool: I found that a mental rehearsal of questions and possible responses is helpful preparation. It serves two purposes: practicing asking the questions and the follow-ups to the interviewee’s answers, and detecting any preconceived notions I might hold about the interviewee through this visualisation of the interview.
- Interview guide: After the first iteration of the interview guide, questioning gets harder in a way. One has to think about how to communicate the emergent theoretical concepts in lay terms, inviting interviewees to respond on whether these fit with their experiences.
- It was helpful to have my emergent categories (I created diagrams for them) printed together with the questions, so that the interview process was ‘theoretically driven’. Additionally, as I reached the Saturation phase, my interview guide had notes on specific gaps in the emergent theory that required more data. This again requires constant review of your codes (categories) and the data.
Saturation
Saturation was introduced by Glaser & Strauss (1967) but is now used across qualitative research. In simple terms, it is when further data input yields no further theoretical output.
How do you really know that you have reached saturation?
- Confident, but humble self-evaluation: First off, I believe a self-check is needed. Bearing in mind the impostor syndrome that commonly afflicts PhD researchers, perhaps I will place more emphasis on being confident. Are you in that sweet spot of being able to be critical about your work without undermining your skills? The emergent theory will not address all facets of the research topic, but you have to know where the boundaries of your emergent theory lie, and be convinced that it is a theoretical innovation.
- Interview Process: I recall reading somewhere that if you find yourself getting ‘bored’ in an interview, that is a sign that you are reaching saturation. To concretise this experience, I think it is when you find yourself listening attentively, yet able to connect most of what is said back to the emergent theory. A crude illustration of the underlying idea follows below.
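To make “data input yields no further theoretical output” slightly more tangible, here is a crude, hypothetical illustration of my own (an assumption, not a procedure from this article): tracking how many genuinely new codes each successive interview contributes. A tapering count is one signal, but it is no substitute for the self-evaluation described above.

```python
# Hypothetical saturation signal: count codes not seen in any earlier interview.
def new_codes_per_interview(interviews):
    """For each successive interview (a set of code names), count codes
    not encountered in any earlier interview."""
    seen = set()
    counts = []
    for codes in interviews:
        counts.append(len(codes - seen))
        seen |= codes
    return counts

# Usage: code names per interview, in interview order (made-up example data).
interviews = [
    {"belonging", "role strain", "mentorship"},
    {"belonging", "mentorship", "identity work"},
    {"identity work", "belonging"},
    {"belonging"},
]
print(new_codes_per_interview(interviews))  # [3, 1, 0, 0] -> tapering toward saturation
```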
Abstraction
I have called this last phase abstraction rather than situatedness (Lo, 2016), the latter being the positioning of your emergent theory within the existing literature. I believe that beyond situatedness, abstraction of the emergent findings is needed before they can be meaningfully situated in the literature.
- Internal Debate: Imagine three debaters in your mind: Literature, Data, and Emergent Theory. How would their arguments stand against each other’s? At this stage, the Data and the Emergent Theory should largely align. However, there will be divergences between the Emergent Theory and the Literature, and between the Data and the Literature. How one reconciles these divergences can lead to higher abstraction of the emergent theory.
- For example, the social identity approach is a major theoretical framework used for my study phenomenon. The data collected, however, pointed to a wholly different explanation. Yet by interfacing the Emergent Theory with the Literature (the social identity approach), I discovered that the emergent theoretical framework could be integrated into the social identity approach. The core category of my grounded theory thus represents a theoretical innovation for the social identity approach.
That’s all from me about practicing the grounded theory methodology! Hope you found it helpful 😄