
Assessing the ‘Quality’ of Qualitative Research

One of the questions that comes up regularly in training courses on qualitative methods is how we should assess the quality of a qualitative study. At some point in their research career, qualitative researchers will inevitably experience the ‘apples versus oranges’ phenomenon, whereby our qualitative research is evaluated against quantitative principles and criteria rather than qualitative ones. The quality standards used in quantitative research do not directly translate to qualitative studies.

Should We Use Standardized Criteria to Evaluate Qualitative Research?

Over the years, many qualitative scholars have proposed frameworks and criteria for assessing qualitative research (see Guba and Lincoln 1989; Lather 1993; Schwandt 1996; Bochner 2000; Ritchie et al. 2003; Tracy 2010; Altheide and Johnson 2011). Some have also argued that standardized criteria are unhelpful in qualitative inquiry (e.g. Schwandt 1996; Altheide and Johnson 2011). For example, Bochner (2000) argues that ‘traditional empiricist criteria’ are ‘unhelpful’ when applied to new ethnographic approaches (cited in Tracy 2010: 838). As Altheide and Johnson (2011: 582) argue:

“There are many ways to use, practice, promote, and claim qualitative research, and in each there is a proposed or claimed relationship between some field of human experience, a form of representation, and an audience. Researchers and scholars in each of these areas have been grappling with issues of truth, validity, verisimilitude, credibility, trustworthiness, dependability, confirmability, and so on. What is valid for clinical studies or policy studies may not be adequate or relevant for ethnography or autoethnography or performance ethnography.”

Qualitative research is conducted within different research paradigms, which complicates the assessment of the quality of a particular study.

As Tracy (2010) notes, many of these critiques have resulted in the development of new quality standards and criteria for evaluating qualitative inquiry, which are seen as more flexible than quantitative standards and more sensitive to the context-bound nature of qualitative research. Below, we explore the main criteria proposed for assessing qualitative research.

Criteria for Assessing Qualitative Research

  1. Trustworthiness

In the 1980s, Guba and Lincoln (1989; see also Krefting 1991) developed criteria which can be used to determine rigor in a qualitative inquiry. Instead of ‘rigor’, they focus on the development of trustworthiness in qualitative inquiry through determining: credibility, transferability, dependability and confirmability.

Qualitative criteria | Quantitative criteria
Credibility | Internal validity
Transferability | External validity or generalizability
Dependability | Reliability
Confirmability | Objectivity

  2. Credibility

Credibility asks us to consider if the research findings are plausible and convincing. Questions to consider include:

  • How well does the study capture and portray the world it is trying to describe?
  • How well backed up are the claims made by the research?
  • What is the evidential base for the research?
  • How plausible are the findings?

As Stenfors et al. (2020) point out, there should be alignment between ‘theory, research question, data collection, analysis and results’ while the ‘sampling strategy, the depth and volume of data, and the analytical steps taken’ must be appropriate within that framework.

  3. Transferability

Here, we are interested in how clear the basis is for drawing wider inference (Ritchie et al. 2003) from our study. Can the findings of our study be transferred to another group, context or setting?

As Ritchie et al. (2003) argue, the findings of qualitative research can be generalized, but the framework within which this can occur needs greater clarification. Hence, we refer to the transferability of findings in a qualitative study. In an empirical sense: can findings from qualitative research studies be applied to populations or settings beyond the particular sample of the study? We can also explore the generation of theoretical concepts or propositions from a qualitative study which are deemed to be of wider, or universal, application.

When attempting to extrapolate from a qualitative study we should be conscious that meanings and behaviours are context-bound. Therefore, extrapolation may be possible if offered as a working hypothesis to help us make sense of findings in other contexts.

Questions to consider include:

  • Sample coverage: did the sample frame contain any known bias; were the criteria used for selection inclusive of the constituencies thought to be of importance?
  • Capture of the phenomena: were the environment and the quality of questioning effective in enabling participants to fully express their views?
  • Identification or labelling: have the phenomena been identified, categorised and named in ways that reflect the meanings assigned by participants?
  • Interpretation: is there sufficient internal evidence for the explanatory accounts that have been developed?
  • Display: have the findings been portrayed in a way that remains true to the original data and allows others to see the analytic constructions which have occurred? (see Ritchie et al. 2003)

  4. Dependability

Dependability is ‘the extent to which the research could be replicated in similar conditions’ (Stenfors et al. 2020). The researcher should provide enough information on the design and conduct of their study that another researcher could follow and repeat the same steps. Given the context-specific nature of qualitative research, it can be difficult to demonstrate which features of the qualitative data should be expected to be consistent, dependable or reliable.

Questions to consider for reliability include:

  • Was the sample design/selection without bias, ‘symbolically’ representative of the target population, comprehensive of all known constituencies; was there any known feature of non-response or attrition within the sample?
  • Was the fieldwork carried out consistently, did it allow respondents sufficient opportunities to cover relevant ground, to portray their experiences?
  • Was the analysis carried out systematically and comprehensively; were classifications and typologies confirmed by multiple assessment?
  • Is the interpretation well supported by the evidence?
  • Did the design/conduct allow equal opportunity for all perspectives to be identified or were there features that led to selective, or missing, coverage? (see Ritchie et al. 2003).

  5. Confirmability

Here, we are looking for a clear link between the data and the findings. For example, researchers should evidence their claims with quotes/excerpts from the data. Qualitative researchers should avoid the temptation to quantify findings with claims such as ‘70% of participants felt that xxx…’. It is also important in the Discussion to demonstrate how the research findings relate to the wider body of literature and answer the research question. Any limitations of the study should also be flagged up.

  6. Reflexivity

Stenfors et al. (2020) draw attention to reflexivity as another important criterion in assessing qualitative inquiry. For Guba and Lincoln (1989) the reflexive journal is a further means of helping to assess qualitative inquiry. A reflexive approach helps us to be aware of the social, ethical and political impact of our research; the central, fluid and changing nature/s of power relations (with participants, gatekeepers, research funders, etc.); and our relationships with the researched (Lumsden 2019).

We can ask whether the researcher has stepped back and critically reflected on their role in the research process, their relationships with the researched, and their social position. It should be clear how reflexivity has been embedded in the research process (Stenfors et al. 2020). As Altheide and Johnson (2011: 581) write:

‘Good qualitative research—and particularly ethnographies—shows the hand of the ethnographer. The effort may not always be successful, but there should be clear “tracks” that the attempt has been made.’

Additional Criteria: Ethics

Tracy (2010) also provides a useful overview of eight key criteria for excellent qualitative research: worthy topic, rich rigor, sincerity, credibility, resonance, significant contribution, ethical, meaningful coherence (p.840). These overlap with the criteria above, and some elements could be said to be subsumed in the discussion already, so I will not delve into them all here. However, it is important to draw attention to ethical considerations in qualitative studies. As Tracy notes, the researcher should consider:

  • Procedural ethics (such as human subjects);
  • Situational and culturally specific ethics;
  • Relational ethics;
  • Exiting ethics (leaving the scene and sharing the research) (see Tracy 2010: 840).

Strategies for Determining Trustworthiness (Rigor)

The strategies adopted in order to determine the trustworthiness of a qualitative study depend on a variety of factors including: the research paradigm, the specifics of each research design, the research methods utilised (e.g. interviews, ethnography, observation, focus groups, creative methods, visual methods, secondary data analysis, narratives, etc.) and the type of qualitative analysis being conducted.

Morse (2015) provides a useful evaluation of various strategies for ensuring rigor in qualitative studies. Strategies which she evaluates as typically used in attempts to ensure validity and reliability include:

  • Prolonged engagement in ethnographic research, via time spent in the field, to reduce researcher effect;
  • Prolonged observation in ethnographic research to reduce researcher effect;
  • Thick description;
  • Triangulation;
  • Development of a coding system and inter-rater reliability in semi-structured interviews;
  • Identification of researcher bias;
  • Negative case analysis;
  • Peer review debriefing (in team research);
  • Member checks;
  • External audits (viewed as problematic and not routinely used) (see pages 1217-1220).

She provides a useful evaluation of the appropriateness and success of these strategies for ensuring rigor, for those who wish to explore this further. Interestingly, through her critique of these strategies, Morse also suggests that ‘qualitative researchers return to the terminology of social sciences, using rigor, reliability, validity, and generalizability’ (p.1212) instead of the terms proposed in the 1980s by Guba and Lincoln (1989).


Awareness of the criteria used when assessing the quality of qualitative research is key for anyone conducting qualitative research. As we have seen, these criteria typically include: trustworthiness, credibility, transferability, dependability, confirmability, reflexivity and ethics.

However, the strategies which each researcher adopts in order to ensure the trustworthiness (rigor) of their study will depend on a variety of factors specific to each qualitative research project, including the research method and the research paradigm adopted. As Morse (2015: 1219) writes: ‘…rigor, comprising both validity and reliability, is achieved primarily by researchers in the process of data collection and analysis’. In addition, assessment criteria which are valid for fields such as clinical studies may not be relevant for those working in areas such as ethnography or narrative studies (see Altheide and Johnson 2011). There is no easy route or ‘one size fits all’ approach for assessing the quality of qualitative research, but the above criteria give us a good starting point to refer to when designing and conducting our qualitative inquiries.

References and further reading

Altheide, D.L. and Johnson, J.M. (2011) ‘Reflections on Interpretive Adequacy in Qualitative Research.’ In N.K. Denzin and Y.S. Lincoln (eds) The SAGE Handbook of Qualitative Research, Fourth Edition (pp. 581-594). London: Sage.

Bochner, A. (2000) ‘Criteria Against Ourselves.’ Qualitative Inquiry, 6: 266-272.

Braun, V. and Clarke, V. (2013) Successful Qualitative Research. London: Sage.

Guba, E. and Lincoln, Y. (1989) Fourth Generation Evaluation. Newbury Park, CA: Sage.

Krefting, L. (1991) ‘Rigor in Qualitative Research: The Assessment of Trustworthiness.’ American Journal of Occupational Therapy, 45: 214–222.

Lather, P. (1993) ‘Fertile Obsession: Validity after Poststructuralism.’ Sociological Quarterly, 34: 673-693.

Lingard L. (2015) ‘Joining a Conversation: The Problem/Gap/Hook Heuristic.’ Perspectives on Medical Education, 4(5): 252–253.

Lumsden, K. (2019) Reflexivity: Theory, Method and Practice. London: Routledge.

Morse, J.M. (2015) ‘Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry.’ Qualitative Health Research, 25(9): 1212-1222.

Schwandt, T.A. (1996) ‘Farewell to Criteriology.’ Qualitative Inquiry, 2: 58-72.

Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003) Quality in Qualitative Evaluation: A Framework for Assessing Research Evidence. GCSRO. Available at: www.policyhub.gov.uk/publications

Stenfors, T., Kajamaa, A. and Bennett, D. (2020) ‘How to… Assess the Quality of Qualitative Research.’ The Clinical Teacher, https://doi.org/10.1111/tct.13242

Tracy, S.J. (2010) ‘Qualitative Quality: Eight “Big-Tent” Criteria for Excellent Qualitative Research.’ Qualitative Inquiry, 16: 837–851.


Resources for doing online qualitative research during Covid-19

I’ve put together a list of readings and resources on conducting qualitative research remotely during Covid-19.

This includes general readings on online qualitative research, online interviewing, online focus groups, online ethnography, diaries, narratives and storytelling, participatory approaches, visual methods, and ethical guidance.

If you notice any readings missing from the list which you would like to share please get in touch and I will add them: karen@qualitativetraining.com

Links to books are via Amazon Affiliate. Earnings from these help me to maintain my website.

List last updated on 18/11/21.

General readings on doing qualitative research online (and/or during Covid-19)

boyd, D. (2009) ‘How can qualitative Internet researchers define the boundaries of their projects?’ In: A. Markham and N. Baym (eds) Internet Inquiry: Conversations About Method, 26-32. London: Sage. Buy at Amazon UK.

Braun, V., Clarke, V. and Gray, D. (eds) (2017) Collecting Qualitative Data. Cambridge: Cambridge University Press. Buy at Amazon UK.

Coulson, N. (2015) Online Research Methods for Psychologists. London: Palgrave Macmillan. Buy at Amazon UK.


Covid Realities Project: https://covidrealities.org/

Dodds, S. and Hess, A.C. (2021) ‘Adapting research methodology during COVID-19: Lessons for transformative service research.’ Journal of Service Management, 32(2): 203-216.

ESRC (2020) Internet-mediated Research: https://esrc.ukri.org/funding/guidance-for-applicants/research-ethics/frequently-raised-topics/internet-mediated-research/

Gaiser, T.J. and Schreiner, A.E. (2009) A Guide to Conducting Online Research. London: Sage. Buy at Amazon UK.

Glassmeyer, D.M. and Dibbs, R.A. (2012) ‘Researching from a distance: Using live web conferencing to mediate data collection.’ International Journal of Qualitative Methods, 11(3): 292-302. https://doi.org/10.1177/160940691201100308

Granello, D.H. and Wheaton, J.E. (2004) ‘Online data collection: Strategies for research.’ Journal of Counseling & Development, 82(4): 387-393.

Gregory, K. (2018) ‘Online communication settings and the qualitative research process: Acclimating students and novice researchers.​’ Qualitative Health Research, 28(10): 1610-1620.

Hanna, P. (2012) ‘Using internet technologies (such as Skype) as a research medium: A research note.’ Qualitative Research, 12(2): 239-242. https://doi.org/10.1177/1468794111426607

Hargittai, E. (ed.) (2021) Research Exposed: How Empirical Social Science Gets Done in the Digital Age. New York, NY: Columbia University Press. Buy at Amazon UK.

Hawkins, K., Amegee, J. and Steege, R. (2020) Remote Research Methods to Use During the COVID-19 Pandemic. Available at: http://www.ariseconsortium.org/remote-research-methods-to-use-during-the-covid-19-pandemic/

Herbert, M.B. (2020) ‘Creatively adapting research methods during COVID-19.’ International Journal of Social Research Methodology. Available at: https://ijsrm.org/2020/08/06/creatively-adapting-research-methods-during-covid-19/

Hewson, C. (ed.) (2015) Internet Research Methods, 2nd edition. London: Sage. Buy at Amazon UK.

Hine, C. (2013) Virtual Research Methods (Four Volume Set). London: Sage. Buy at Amazon UK.

Hine, C. (2013) The Internet: Understanding Qualitative Research. Oxford: Oxford University Press. Buy at Amazon UK.

Hine, C., Snee, H., Morey, Y., Roberts, S. and Watson, H. (2015) Digital Methods for Social Science: An Interdisciplinary Guide to Research Innovation. Basingstoke: Palgrave Macmillan. Buy at Amazon UK.

Howlett, M. (2021) ‘Looking at the “field” through a Zoom lens: Methodological reflections on conducting online research during a global pandemic.’ Qualitative Research. iFirst. https://doi.org/10.1177/1468794120985691

Karpf, D. (2012) ‘Social science research methods in Internet time.’ Information, Communication & Society, 15(5): 639-661.

Lee, R.M., Fielding, N.G. and Blank, G. (2008) ‘The internet as a research medium.’ In: N.G. Fielding, R.M. Lee and G. Blank (eds) The SAGE Handbook of Online Research Methods, 4-5. London: Sage. Buy at Amazon UK.

Lee, R.M., Fielding, N.G. and Blank, G. (2017) ‘Online research methods in the social sciences: An editorial introduction.’ In: N.G. Fielding, R.M. Lee and G. Blank (eds) The SAGE Handbook of Online Research Methods, 2nd edition, 3-16. London: Sage. Buy at Amazon UK.

Lobe, B., Morgan, D. and Hoffman, K.A. (2020) ‘Qualitative data collection in an era of social distancing.’ International Journal of Qualitative Methods, 19: 1-8.

Lupton, D. (ed.) (2020) Doing fieldwork in a pandemic (crowd-sourced document). Available at: https://docs.google.com/document/d/1clGjGABB2h2qbduTgfqribHmog9B6P0NvMgVuiHZCl8/edit?ts=5e88ae0a#

Małachowska, A., Jones, N., Abu Hamad, B., Al Abbadi, T., Al Almaireh, W., Alheiwidi, S., Bani Odeh, K., Iyasu, A., Gebre, Y., Gezahegne, K., Guglielmi, S., Mitu, K., Pincock, K., Rashid, S., Saleh, M., Sultan, M., Tilahun, K. and Workneh, F. (2020) GAGE Virtual Research Toolkit: Qualitative Research with Young People on their Covid-19 Experiences. London: Gender and Adolescence: Global Evidence.

Mann, C. and Stewart, F. (2000) Internet Communication and Qualitative Research. London: Sage. Buy at Amazon UK.

Markham, A. and Baym, N. (eds) (2009) Internet Inquiry: Conversations About Method. London: Sage. Buy at Amazon UK.

Meskell, P., Houghton, C. and Biesty, L. (2021) ‘Opening windows behind closed doors: Reflections on working qualitatively during a pandemic.’ International Journal of Qualitative Methods, 20: https://doi.org/10.1177/16094069211008313

Miller, D. and Slater, D. (2000) The Internet: An Ethnographic Approach. Oxford: Berg. Buy at Amazon UK.

Mwambari, D., Purdeková, A. and Bisoka, A.N. (2021) ‘Covid-19 and research in conflict-affected contexts: Distanced methods and the digitalisation of suffering.’ Qualitative Research. iFirst https://doi.org/10.1177/1468794121999014

Nind, M., Coverdale, A. and Meckin, R. (2021) Changing Social Research Practices in the Context of Covid-19: Rapid Evidence Review. Project Report. NCRM. http://eprints.ncrm.ac.uk/4398/

O’Dochartaigh, N. (2001) The Internet Research Handbook. London: Sage.

Orgad, S. (2009) ‘How can researchers make sense of the issues involved in collecting and interpreting online and offline data?’ In: A. Markham and N. Baym (eds) Internet Inquiry: Conversations About Method, 33-53. London: Sage.

Rahman, S.A., Tuckerman, L., Vorley, T. et al. (2021) ‘Resilient research in the field: Insights and lessons from adapting qualitative research projects during the Covid-19 pandemic.’ International Journal of Qualitative Methods, 20: 1-16. https://doi.org/10.1177/16094069211016106

Roberts, J.K., Pavlakis, A.E. and Richards, M.P. (2021) ‘It’s more complicated than it seems: Virtual qualitative research in the Covid-19 era.’ International Journal of Qualitative Methods, 20. https://doi.org/10.1177/16094069211002959

Rogers, R. (2019) Doing Digital Methods. London: Sage. Buy at Amazon UK.

Russell, B. and Purcell, J. (2009) Online Research Essentials. USA: Josey-Bass. Buy at Amazon UK.

Salmons, J. (2016) Doing Qualitative Research Online. London: Sage. Buy at Amazon UK.

Sipes, J.B.A., Roberts, L.D. and Mullan, B. (2019) ‘Voice-only Skype for use in researching sensitive topics: A research note.’ Qualitative Research in Psychology, 1-17. https://doi.org/10.1080/14780887.2019.1577518

Tremblay, S., Castiglione, S., Audet, L. et al. (2021) ‘Conducting qualitative research to respond to Covid-19 challenges: Reflections for the present and beyond.’ International Journal of Qualitative Methods, 20. https://doi.org/10.1177/16094069211009679

Venturini, T., Bounegru, L., Gray, J. and Rogers, R. (2018) ‘A reality check(list) for digital methods.’ New Media & Society, 20(11): 4195-4217.

Whiting, R. and Pritchard, K. (2020) Collecting Qualitative Data Using Digital Methods. London: Sage. Buy at Amazon UK.

Yadete, W. and Youssef, S. (2020) GAGE Virtual Research Toolkit: Qualitative Research with Young People on their Covid-19 Experiences. London: Gender and Adolescence: Global Evidence. Available at: https://www.gage.odi.org/wp-content/uploads/2020/06/GAGE-virtual-research-toolkit-qualitative-research-with-young-people-on-their-covid-19-experiences.pdf

Online qualitative interviewing

Adams-Hutcheson, G. and Longhurst, R. (2017) ‘At least in person there would have been a cup of tea’: Interviewing via Skype. Area, 49(2): 148-155. https://doi.org/10.1111/area.12306

Alkhateeb, M. (2018) ‘Using Skype as a qualitative interview medium within the context of Saudi Arabia: A research note.’ Qualitative Report, 23(10): 2253-2260. https://doi.org/10.46743/2160-3715/2018.2805

Archibald, M.M., Ambagtsheer, R.C., Casey, M.G. and Lawless, M. (2019) ‘Using Zoom videoconferencing for qualitative data collection: Perceptions and experiences of researcher and participants.’ International Journal of Qualitative Methods, 18. https://doi.org/10.1177/1609406919874596

Ayling, R. and Mewse, A.J. (2009) ‘Evaluating internet interviews with gay men.’ Qualitative Health Research, 19(4): 566-576.

Bampton, R. and Cowton, C.J. (2002) ‘The e-interview.’ Forum: Qualitative Social Research, 3(2): Art. 9. https://doi.org/10.17169/fqs-3.2.848

Bampton, R., Cowton, C. J. and Downs, Y. (2013) ‘The e-interview in qualitative research.’ In: N. Sappleton (Ed.), Advancing Research Methods with New Technologies (pp. 329–343). IGI Global.

Bauman, A. (2015) ‘Qualitative online interviews: Strategies, design, and skills.’ Qualitative Research in Organizations and Management, 10(2): 201-202.

Burns, E. (2010) ‘Developing email interview practices in qualitative research.’ Sociological Research Online, 15(4): 24–35.

Curasi, C.F. (2001) ‘A critical exploration of face-to-face interviewing vs. computer-mediated interviewing.’ International Journal of Market Research, 43(4): 361-375.

Davies, L., LeClair, C.L., Bagley, P., Blunt, H., Hinton, L., Ryan S. and Ziebland, S. (2020) ‘Face-to-face compared with online collected accounts of health and illness experiences: A scoping review.’ Qualitative Health Research, 30(13): 2092-2102.

Deakin, H. and Wakefield, K. (2014) ‘Skype interviewing: Reflections of two PhD researchers.’ Qualitative Research, 14(5): 603-616.

Dowling, R., Lloyd, K. and Suchet-Pearson, S. (2015) ‘Qualitative methods 1: Enriching the interview.’ Progress in Human Geography, 40(5): 679-686. https://doi.org/10.1177/0309132515596880

Gray, L.M., Wong-Wylie, G., Rempel, G.R. and Cook, K. (2020) ‘Expanding qualitative research interviewing strategies: Zoom video communication.’ The Qualitative Report, 5(8): 1292-1301.

Hinchcliffe, V. and Gavin, H. (2009) ‘Social and virtual networks: Evaluating synchronous online interviewing using Instant Messenger.’ The Qualitative Report, 14(2): 318-340.

Irani, E. (2018) ‘The use of videoconferencing for qualitative interviewing: Opportunities, challenges and considerations.’ Clinical Nursing Research, 28(1): 3-8.

Irvine, A. (2011) ‘Duration, dominance and depth in telephone and face-to-face interviews: A comparative exploration.’ International Journal of Qualitative Methods, 10(3): 202–220.

James, N. and Busher, H. (2009) Online Interviewing. London: Sage.

Janghorban, R., Roudsari, R.L. and Taghipour, A. (2014) ‘Skype interviewing: The new generation of online synchronous interview in qualitative research.’ International Journal of Qualitative Studies on Health and Well-being, 9(1): 241-252.

Jenner, B.M. and Myers, K.C. (2019) ‘Intimacy, rapport, and exceptional disclosure: A comparison of in-person and mediated interview contexts.’ International Journal of Social Research Methodology, 22(2): 165-177. https://doi.org/10.1080/13645579.2018.1512694

Johnson, D.R., Scheitle, C.P. and Ecklund, E.H. (2019) ‘Beyond the in-person interview? How interview quality varies across in-person, telephone and Skype interviews.’ Social Science Computer Review. https://doi.org/10.1177/0894439319893612

Krouwel, M., Jolly, K. and Greenfield, S. (2019) ‘Comparing Skype (video calling) and in-person qualitative interview modes in a study of people with irritable bowel syndrome. An exploratory comparative analysis.’ BMC Medical Research Methodology, 19: 219. https://doi.org/10.1186/s12874-019-0867-9

Lawrence, L. (2020) ‘Conducting cross-cultural qualitative interviews with mainland Chinese participants during COVID: Lessons from the field.’ Qualitative Research, iFirst. https://doi.org/10.1177/1468794120974157

Linabary, J.R. and Hamel, S.A. (2017) ‘Feminist online interviewing: Engaging issues of power, resistance and reflexivity in practice.’ Feminist Review, 115(1): 97-113. https://doi.org/10.1057/s41305-017-0041-3

Livingstone, S. and Locatelli, E. (2012) ‘Ethical dilemmas in qualitative research with youth on/offline.’ International Journal of Learning and Media, 4(2): 67-75.

Lobe, B., Morgan, D. and Hoffman, K.A. (2020) ‘Qualitative Data Collection in an Era of Social Distancing.’ International Journal of Qualitative Methods, 19, https://doi.org/10.1177/1609406920937875

Lo Iacono, V., Symonds, P. and Brown, D.H.K. (2016) ‘Skype as a tool for qualitative research interviews.’ Sociological Research Online, 21(2): 103-117. https://doi.org/10.5153/sro.3952

McCoyd, J.L.M. and Kerson, T.S. (2006) ‘Conducting intensive interviews using email: A serendipitous comparative opportunity.’ Qualitative Social Work, 5(3): 389-406. https://doi.org/10.1177/1473325006067367

Melis, G., Sala, E. and Zaccaria, D. (2021) ‘Remote recruiting and video-interviewing older people: A research note on a qualitative case study carried out in the first Covid-19 Red Zone in Europe.’ International Journal of Social Research Methodology, iFirst https://doi.org/10.1080/13645579.2021.1913921

Mirick, R.G. and Wladkowski, S.P. (2019) ‘Skype in qualitative interviews: participant and researcher perspectives.’ The Qualitative Report, 24(12): 3061-3072.

Oates, J. (2015) ‘Use of Skype in interviews: The impact of the medium in a study of mental health nurses.’ Nurse Researcher, 22(4): 13-17.

Rowe, M., Rosenheck, R., Stern, E. and Bellamy, C. (2014) ‘Video conferencing technology in research on schizophrenia: A qualitative study of site research staff.’ Psychiatry, 77(1): 98-102. https://doi.org/10.1521/psyc.2014.77.1.98

Salmons, J. (2010) Online Interviews in Real Time. Thousand Oaks, CA: Sage.

Salmons, J. (ed.) (2012) Cases in Online Interview Research. Thousand Oaks, CA: Sage.

Salmons, J. (2015) Qualitative Online Interviews. Thousand Oaks, CA: Sage.

Sedgwick, M. and Spiers, J. (2009) ‘The use of videoconferencing as a medium for the qualitative interview.’ International Journal of Qualitative Methods, 8(1): 1-11. https://doi.org/10.1177/160940690900800101

Seitz, S.D. (2020) ‘Skype interviewing.’ In P. Atkinson, S. Delamont, A. Cernat, J.W. Sakshaug and R.A. Williams (eds) SAGE Research Methods Foundations. doi: 10.4135/9781526421036782463. Available at: http://methods.sagepub.com/foundations

Seitz, S. (2015) ‘Pixilated partnerships, overcoming obstacles in qualitative interviews via Skype: A research note.’ Qualitative Research, 16(2): 229-235. https://doi.org/10.1177/1468794115577011

Shapka, J.D., Domene, J.F., Khan, S. and Yang, L.M. (2016) ‘Online versus in-person interviews with adolescents: An exploration of data equivalence.’ Computers in Human Behavior, 58: 361-367. https://doi.org/10.1016/j.chb.2016.01.016

Thunberg, S. and Arnell, L. (2021) ‘Pioneering the use of technologies in qualitative research: A research review of the use of digital interviews.’ International Journal of Social Research Methodology, iFirst. https://doi.org/10.1080/13645579.2021.1935565

Topping, M., Douglas, J. and Winkler, D. (2021) ‘General considerations for conducting online qualitative research and practice implications for interviewing people with acquired brain injury.’ International Journal of Qualitative Methods, 20: 1-15. https://doi.org/10.1177/16094069211019615

Weinmann, T., Thomas, S., Brilmayer, S., Heinrich, S. and Radon, K. (2012) ‘Testing Skype as an interview method in epidemiologic research: response and feasibility.’ International Journal of Public Health, 57(6): 959-961. https://doi.org/10.1007/s00038-012-0404-7

Weller, S. (2017) ‘Using internet video calls in qualitative (longitudinal) interviews: Some implications for rapport.’ International Journal of Social Research Methodology, 20(6): 613-625.

Online focus groups and workshops

Abrams, K., Wang, Z., Sing, Y. and Galindo-Gonzalez, S. (2015) ‘Data richness trade-offs between face-to-face, online audiovisual, and online text-only focus groups.’ Social Science Computer Review, 33(1): 80-96.

Archibald, M.M., Ambagtsheer, R., Casey, M.G. and Lawless, M. (2019) ‘Using Zoom videoconferencing for qualitative data collection: Perceptions and experiences of researchers and participants.’ International Journal of Qualitative Methods, 18: 1-8.

Chen, J. and Neo, P. (2019) ‘Texting the waters: An assessment of focus groups conducted via the WhatsApp smartphone messaging application.’ Methodological Innovations, 12(3): https://journals.sagepub.com/doi/full/10.1177/2059799119884276

Colom, A. (2021) ‘Using WhatsApp for focus group discussions: Ecological validity, inclusion and deliberation.’ Qualitative Research, iFirst. https://doi.org/10.1177/1468794120986074

Daniels, N., Gillen, P., Casson, K. and Wilson, I. (2019) ‘STEER: Factors to consider when designing online focus groups using audiovisual technology in health research.’ International Journal of Qualitative Methods, 18(1): 1-11.

Deggs, D., et al. (2010) ‘Using Message Boards to Conduct Online Focus Groups.’ The Qualitative Report, 15(4): 1027-1036.

Egid, B., Ozano, K., Hegel, G., Zimmerman, E., Lopez, Y., Roura, M., Sheikhattari, P., Jones, L., Dias, S. and Wallerstein, N. (2021) ‘Can everyone hear me? Reflections on the use of global online workshops for promoting inclusive knowledge generation.’ Qualitative Research, iFirst. https://doi.org/10.1177/14687941211019585

Flynn, R., Albrecht, L. and Scott, S. (2018) ‘Two approaches to focus group data collection for qualitative health research: Maximizing resources and data quality.’ International Journal of Qualitative Methods, 17(1): https://doi.org/10.1177/1609406917750781

Fox, F.E., Morris, M. and Rumsey, N. (2007) ‘Doing synchronous online focus groups with young people: Methodological reflections.’ Qualitative Health Research, 17(4): 539-547. https://doi.org/10.1177/1049732306298754

Gaiser, T. (2017) ‘Online focus groups.’ In: N. Fielding, R.M. Lee and G. Blank (eds) The SAGE Handbook of Online Research Methods, 2nd edition, 290–306. London: Sage.

Glassmeyer, D. and Dibbs, R. (2012) ‘Researching from a distance: Using live web conferencing to mediate data collection.’ International Journal of Qualitative Methods, 11: 292-302.

Gratton, M.F. and O’Donnell, S. (2011) ‘Communication technologies for focus groups with remote communities: A case study of research with First Nations in Canada.’ Qualitative Research, 11(2): 159-175.

Harmsen, I.A., Mollema, L., Ruiter, R.A.C., Paulussen, T.G.W., de Melker, H.E. and Kok, G. (2013) ‘Why parents refuse childhood vaccination: A qualitative study using online focus groups.’ BMC Public Health, 13: 1183. http://www.biomedcentral.com/1471-2458/13/1183

Hinkes, C. (2020) ‘Key aspects to consider when conducting synchronous text-based online focus groups: A research note.’ International Journal of Social Research Methodology. Available at: https://doi.org/10.1080/13645579.2020.1801277

Im, E.O. and Chee, W. (2012) ‘Practical guidelines for qualitative research using online forums.’ Computers, Informatics, Nursing, 30(11): 604-611.

Kramish, C.M., Meier, A., Carr, C., Enga, Z., James, A.S., Reedy, J. and Zheng, B. (2001) ‘Health behavior changes after colon cancer: A comparison of findings from face-to-face and on-line focus groups.’ Family and Community Health, 24(3): 88-103.

Lathen, L. (2021) ‘Reflections on online focus group research with low socio-economic status African American adults during Covid-19.’ International Journal of Qualitative Methods, iFirst. https://doi.org/10.1177/16094069211021713

Lowenthal, P.R. and Dunlap, J.C. (2020) ‘Social presence and online discussions: A mixed method investigation.’ Distance Education, 41(4): 490-514.

MacNamara, N., Mackle, D., Trew, J.D., Pierson, C. and Bloomer, F. (2020) ‘Reflecting on asynchronous internet mediated focus groups for researching culturally sensitive issues.’ International Journal of Social Research Methodology, iFirst. https://doi.org/10.1080/13645579.2020.1857969

Monolescu, D. and Schifter, C. (2000) ‘Online Focus Group: A Tool to Evaluate Online Students’ Course Experience.’ The Internet and Higher Education, 2(2-3): 171-176.

Moore, T., McKee, K. and McLoughlin, P. (2015) ‘Online focus groups and qualitative research in the social sciences: Their merits and limitations in a study of housing and youth.’ People, Place and Policy, 9(1): 17-28.

Morgan, D. L. (2019) Basic and Advanced Focus Groups. Thousand Oaks, CA: Sage.

Morgan, D. L. and Lobe, B. (2011) ‘Online focus groups.’ In: S.N. Hesse-Biber (ed.) The Handbook of Emergent Technologies in Social Sciences. New York: Oxford University Press.

Nicholas, D.B., Lach, L., King, G., Scott, M., Boydell, K., Sawatzky, B.J., Reisman, J., Schippel, E. and Young, N.L. (2010) ‘Contrasting Internet and face-to-face focus groups for children with chronic health conditions: Outcomes and participant experiences.’ International Journal of Qualitative Methods, 9(1): 105-121.

Nobrega, S., El Ghaziri, M., Giacobbe, L., Rice, S., Punnett, L. and Edwards, K. (2021) ‘Feasibility of virtual focus groups in program impact evaluation.’ International Journal of Qualitative Methods, iFirst. https://doi.org/10.1177/16094069211019896

Oringderf, J. (2004) ‘”My Way”: Piloting an Online Focus Group.’ International Journal of Qualitative Methods, 3(3): 69-75.

Rezabek, R. (2000) ‘Online Focus Groups: Electronic Discussions for Research.’ Forum: Qualitative Social Research, 1:1-18. http://www.qualitative-research.net/fqs/

Richard, B., Sivo, S.A., Ford, R.C., Murphy, J., Boote, D.N. and Orlowski, E.W.M. (2021) ‘A guide to conducting online focus groups via Reddit.’ International Journal of Qualitative Methods, 20. https://doi.org/10.1177/16094069211012217

Ross, L.E., Stroud, L.A., Rose, S.W. and Jorgensen, C.M. (2006) ‘Using Telephone Focus Groups Methodology to Examine the Prostate Cancer Screening Practices of African-American Primary Care Physicians.’ Journal of the National Medical Association 98: 1296.

Rupert, D.J., Poehlman, J.A., Hayes, J.J., Ray, S.E. and Moultrie, R.R. (2017) ‘Virtual versus in-person focus groups: Comparison of costs, recruitment, and participant logistics.’ Journal of Medical Internet Research, 19(3): Article e80. https://doi.org/10.2196/jmir.6980

Schneider, S.J., Kerwin, J., Frechtling, J. and Vivari, B.A. (2002) ‘Characteristics of the discussion in online and face-to-face focus groups.’ Social Science Computer Review, 20(1): 31-42.

Seale, C., Charteris-Black, J., MacFarlane, A. and McPherson, A. (2010) ‘Interviews and internet forums: A comparison of two sources of qualitative data.’ Qualitative Health Research, 20(5): 595–606.

Smith, T. (2014) ‘Experiences of therapists and occupational therapy students using video conferencing in conduction of focus groups.’ The Qualitative Report, 19: 1-13.

Stancanelli, J. (2010) ‘Conducting an Online Focus Group.’ The Qualitative Report, 15(3): 761-765.

Stewart, K. and Williams, M. (2005) ‘Researching Online Populations: The Use of Online Focus Groups for Social Research.’ Qualitative Research, 5(4): 395-416.

Sturges, J. and Hanrahan, K.J. (2004) ‘Comparing telephone and face-to-face qualitative interviewing: A research note.’ Qualitative Research, 4(1): 107–118.

Sweet, C. (2001) ‘Designing and Conducting Virtual Focus Groups.’ Qualitative Market Research, 4(3): 130-135.

Synnot, A., Hill, S., Summers, M. and Taylor, M. (2014) ‘Comparing face-to-face and online qualitative research with people with multiple sclerosis.’ Qualitative Health Research, 24(3): 431–438.

Tates, K. et al. (2009) ‘Online Focus Groups as a Tool to Collect Data in Hard-to-Include Populations: Examples from Pediatric Oncology.’ BMC Medical Research Methodology, 9(15): http://www.biomedcentral.com/1471-2288/9/15

Thrul, J., Belohlavek, A., Hambrick, D., Kaur, M. and Ramo, D.E. (2017) ‘Conducting online focus groups on Facebook to inform health behavior change interventions: Two case studies and lessons learned.’ Internet Interventions, 9: 106-111.

Turney, L. and Pocknee, C. (2005) ‘Virtual Focus Groups: New Frontiers in Social Research.’ International Journal of Qualitative Methods, 4(2): 32-43.

Tuttas, C. (2015) ‘Lessons learned using web conference technology for online focus group interviews.’ Qualitative Health Research, 25: 122-133.

Vicsek, L. (2016) ‘Improving data quality and avoiding pitfalls of online text-based focus groups: A practical guide.’ Qualitative Report, 21(7): 1232–1242.

Walker, D.M. (2013) ‘The Internet as a Medium for Health Services Research: Part 2.’ Nurse Researcher, 20(5): http://journals.rcni.com/doi/abs/10.7748/nr2013.

Watson, M. et al. (2006) ‘The Analysis of Interaction in Online Focus Groups.’ International Journal of Therapy and Rehabilitation, 13(12): 551-557.

Wilkerson, J.M., Iantaffi, A., Grey, J.A., Bockting, W.O. and Rosser, B.S. (2014) ‘Recommendations for internet-based qualitative health research with hard-to-reach populations.’ Qualitative Health Research, 24: 561-574.

Williams, S., Clausen, M.G., Robertson, A., Peacock, S. and McPherson, K. (2012) ‘Methodological reflections on the use of asynchronous online focus groups in health research.’ International Journal of Qualitative Methods, 11(4): 368–383.

Wirtz, A.L. et al. (2019) ‘Computer-Mediated Communication to Facilitate Synchronous Online Focus Group Discussions: Feasibility Study for Qualitative HIV Research Among Transgender Women Across the United States.’ Journal of Medical Internet Research, 21(3): https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6460306/

Woodyatt, C.R. (2016) ‘In-person versus online focus group discussions: A comparative analysis of data quality.’ Qualitative Health Research, 26(6): 741-749.

Visual methods

Bates, E.A., McCann, J.J., Kaye, L.K. and Taylor, J.C. (2017) ‘”Beyond words”: A researcher’s guide to using photo-elicitation in psychology.’ Qualitative Research in Psychology, 14(4): 459-481.

Bell, P., Williams, D., Phillips, R., Sanders, J., Edwards, A., Choy, E. and Grant, A. (2020) ‘Using visual timelines in telephone interviews: Reflections and lessons learned from the Star family study.’ International Journal of Qualitative Methods, 19: 1-11. https://doi.org/10.1177/1609406920913675

Harper, D. (2002) ‘Talking about pictures: A case for photo-elicitation.’ Visual Studies, 17(1): 13-26.

Heng, T. (2017) Visual Methods in the Field: Photography for the Social Sciences. London: Routledge.

Highfield, T. and Leaver, T. (2016) ‘Instagrammatics and digital methods: Studying visual social media, from selfies and GIFs to memes and emoji.’ Communication, Research and Practice, 2(1): 47-62.

Mannay, D. (2016) Visual, Narrative and Creative Methods: Application, Reflection and Ethics. London: Routledge.

Pink, S. and Leder Mackley, K. (2016) ‘Moving, making and atmosphere: Routines of home as sites for mundane improvisation.’ Mobilities, 11(2): 171-187.

Pink, S. and Leder Mackley, K. (2014) ‘Reenactment methodologies for everyday life research: Art therapy insights for video ethnography.’ Visual Studies, 29(2): 146-154.

Pink, S. (2014) ‘Digital-visual-sensory-design anthropology: Ethnography, imagination and intervention.’ Arts and Humanities in Higher Education, 13(4):412-427.

Pink, S., Fors, V. and Glöss, M. (2017) ‘Automated futures and the mobile present: In-car video ethnographies.’ Ethnography, 20(1): 88-107.

Pritchard, K. and Whiting, R. (2015) ‘Taking stock: A visual analysis of gendered ageing.’ Gender, Work & Organization, 22(5): 510-528.

Pritchard, K. and Whiting, R. (2017) ‘Analysing web images.’ In: C. Cassell, A.L. Cunliffe and G. Grandy (eds) The SAGE Handbook of Qualitative Business and Management Research Methods, 282-297. London: Sage.


Diaries

Ahlin, T. and Li, F. (2019) ‘From field sites to field events: Creating the field with information and communication technologies (ICTs).’ Medicine, Anthropology and Theory, 6(2): 1-24.

Alaszewski, A. (2006) Using Diaries for Social Research. London: Sage.

Bartlett, R. (2012) ‘Modifying the diary interview method to research the lives of people with dementia.’ Qualitative Health Research, 22(12): 1717-1726.

Crozier, S.E. and Cassell, C.M. (2016) ‘Methodological considerations in the use of audio diaries in work psychology: Adding to the qualitative toolkit.’ Journal of Occupational and Organizational Psychology, 89(2): 396-419.

Hyers, L.L. (2018) Diary Methods. New York: Oxford University Press.

Jones, N., Pincock, K., Hamad, B.A., Malachowska, A., Youssef, S., Alheiwidi, S. and Odeh, K.B. (2020) ‘Ensuring no voices are left behind: The use of digital storytelling and diary writing in times of crisis.’ In: H. Kara and S.M. Khoo (eds) Researching in the Age of COVID-19. Volume 2: Care and Resilience. Bristol: Policy Press.

Mendoza, J., Seguin, M.L., Lasco, G., Palileo-Villanueva, L.M., Amit, A., Renedo, A., McKee, M., Palafox, B. and Balabanova, D. (2021) ‘Strengths and weaknesses of digital diaries as a means to study patient pathways: Experiences with a study of hypertension in the Philippines.’ International Journal of Qualitative Methods, iFirst. https://doi.org/10.1177/16094069211002746

Saltzman, L.Y., Terzis, L.D., Hansel, T.C., Blakey, J.M., Logan, D. and Bordnick, P.S. (2021) ‘Harnessing technology for research during the Covid-19 pandemic: A mixed methods study diary protocol.’ International Journal of Qualitative Methods, 20: https://doi.org/10.1177/1609406920986043

Volpe, C. (2019) ‘Digital diaries: New uses of PhotoVoice in participatory research with young people.’ Children’s Geographies, 17(3): 361-370.

Zimmerman, D.H. and Wieder, D.L. (1977) ‘The diary: Diary-interview method.’ Urban Life, 5(4): 479-498.

Narratives and storytelling

Auðardóttir, A.M. and Rúdólfsdóttir, A.G. (2020) ‘Chaos ruined the children’s sleep, diet and behaviour: Gendered discourses on family life in pandemic times.’ Gender, Work & Organization, 28(S1): 168-182. https://doi.org/10.1111/gwao.12519

Bell, E. and Leonard, P. (2018) ‘Digital organizational storytelling on YouTube: Constructing plausibility through network protocols of amateurism, affinity, and authenticity.’ Journal of Management Inquiry, 27(3): 339-351.

Clarke, V., Braun, V., Frith, H. and Moller, N. (2019) ‘Editorial introduction to the special issue: Using story completion methods in qualitative research.’ Qualitative Research in Psychology, 16(1): 1-20. https://doi.org/10.1080/14780887.2018.1536378

Gravett, K. (2019) ‘Story completion: Storying as a method of meaning-making and discursive discovery.’ International Journal of Qualitative Methods, 18: https://doi.org/10.1177/1609406919893155

Törrönen, J. (2021) ‘Covid-19 as a generator of pending narratives: Developing an empirical tool to analyze narrative practices in constructing futures.’ International Journal of Qualitative Methods, 20: https://doi.org/10.1177/1609406921996855

Participatory research approaches

Hall, J., Gaved, M. and Sargent, J. (2021) ‘Participatory research approaches in times of Covid-19: A narrative literature review.’ International Journal of Qualitative Methods, 20: https://doi.org/10.1177/16094069211010087

NCRM (2021) The NCRM Wayfinder Guide to Adapting Participatory Methods for Covid-19. Southampton, UK: NCRM. Available at: http://eprints.ncrm.ac.uk/4400/1/NCRM%20Wayfinder%20guide%20participatory%20methods.pdf

Mobile methods and social media

Airoldi, M. (2018) ‘Ethnography and the digital fields of social media.’ International Journal of Social Research Methodology, 21(6): 661-673.

Boase, J. and Humphreys, L. (2018) ‘Mobile methods: Explorations, innovations, and reflections.’ Mobile Media & Communication, 6(2): 153-162.

Evans, A. (2017) Tinder as a Methodological Tool. AllegraLab. Available at: http://allegralaboratory.net/tinder-as-a-methodological-tool/

Kaufmann, K. and Peil, C. (2019) ‘The mobile instant messaging interview (MIMI): Using WhatsApp to enhance self-reporting and explore media usage in situ.’ Mobile Media & Communication, iFirst. https://doi.org/10.1177/2050157919852392

Lange, P.G. (2019) Thanks for Watching: An Anthropological Study of Video Sharing on YouTube. Louisville, CO: University Press of Colorado.

Latzko-Toth, G., Bonneau, C. and Millette, M. (2017) ‘Small data, thick data: Thickening strategies for trace-based social media research.’ In: L. Sloan and A. Quan-Haase (eds) The SAGE Handbook of Social Media Research Methods, 199-214. London: Sage.

Reich, J.A. (2015) ‘Old methods and new technologies: Social media and shifts in power in qualitative research.’ Ethnography, 16(4): 394–415.

Sloan, L. and Quan-Haase, A. (eds) (2017) The SAGE Handbook of Social Media Research Methods. London: Sage.

Stewart, B. (2017) ‘Twitter as method: Using Twitter as a tool to conduct research.’ In: L. Sloan and A. Quan-Haase (eds) SAGE Handbook of Social Media Research, 251– 265. London: Sage.

Sugie, N.F. (2018) ‘Utilizing smartphones to study disadvantaged and hard-to-reach groups.’ Sociological Methods & Research, 47(3): 458-491.


Blogs

Kurtz, L.C., Trainer, S., Beresford, M., Wutich, A. and Brewis, A. (2017) ‘Blogs as elusive ethnographic texts: Methodological and ethical challenges in qualitative online research.’ International Journal of Qualitative Methods, 16(1): https://doi.org/10.1177/1609406917705796

Owen, M.I., Braun, L.T., Hamilton, R.J., Grady, K.L., Gary, R.A. and Quest, T.E. (2017) ‘Weblogs: A complex data source for qualitative research.’ Journal of Cardiac Failure, 23(11): 826-827.

Online ethnography

Abidin, C. (2020) ‘Somewhere between here and there: Negotiating researcher visibility in a digital ethnography of the influencer industry.’ Journal of Digital Social Research, 2(1): 56-76.

Abidin, C. (2016) ‘“Aren’t these just young, rich women doing vain things online?”: Influencer selfies as subversive frivolity.’ Social Media + Society, 2(2): https://doi.org/10.1177/2056305116641342

Abidin, C. and de Seta, G. (eds) (2020) ‘Special issue: Doing digital ethnography: Messages from the field.’ Journal of Digital Social Research, 2(1): https://jdsr.se/ojs/index.php/jdsr/issue/view/3

Barratt, M.J. and Maddox, A. (2016) ‘Active engagement with stigmatised communities through digital ethnography.’ Qualitative Research, 16(6): 701-719.

Beneito-Montagut, R., Begueria, A. and Cassián, N. (2017) ‘Doing digital team ethnography: Being there together and digital social data.’ Qualitative Research, 17(6): 664-682.

Boellstorff, T. (2008) Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton, USA: Princeton University Press.

Boellstorff, T., Nardi, B., Pearce, C. and Taylor, T.L. (2012) Ethnography and Virtual Worlds: A Handbook of Method. Princeton: Princeton University Press.

boyd, D. (2016) ‘Making sense of teen life: Strategies for capturing ethnographic data in a networked era.’ In: E. Hargittai and C. Sandvig (eds) Digital Research Confidential: The Secrets of Studying Behavior Online, 79-103. Massachusetts, USA: MIT Press.

Burrell, J. (2009) ‘The field site as a network: A strategy for locating ethnographic research.’ Field Methods, 21(2): 181-199.

Caliandro, A. (2018) ‘Digital methods for ethnography: Analytical concepts for ethnographers exploring social media environments.’ Journal of Contemporary Ethnography, 47(5): 551-578.

Clark, L.S. (2004) ‘Ethnographic interviews on the digital divide.’ New Media & Society, 6(4): 529-547.

Costello, L., McDermott, M.L. and Wallace, R. (2017) ‘Netnography: Range of practices, misperceptions, and missed opportunities.’ International Journal of Qualitative Methods, 16: 1-12.

Coleman, G. (2010) ‘Ethnographic approaches to digital media.’ Annual Review of Anthropology, 39: 487-505.

Cousineau, L.S., Oakes, H. and Johnson, C.W. (2019) ‘Appnography: Modifying ethnography for app-based culture.’ In: D.C. Parry, C.W. Johnson and S. Fullagar (eds) Digital Dilemmas: Transforming Gender Identities and Power Relations in Everyday Life, 95-117. Basingstoke: Palgrave.

Forsey, M., Breidenstein, G., Krüger, O. and Roch, A. (2015) ‘Ethnography at a distance: Globally mobile parents choosing international schools.’ International Journal of Qualitative Studies in Education, 28(9): 1112-1128.

Hallett, R.E. and Barber, K. (2014) ‘Ethnographic research in a cyber era.’ Journal of Contemporary Ethnography, 43(3): 306-330.

Hine, C. (2000) Virtual Ethnography. London: Sage.

Hine, C. (2008) ‘Virtual ethnography: Modes, varieties, affordances.’ In: N. Fielding, R.M. Lee and G. Blank (eds) The SAGE Handbook of Online Research Methods, 257-270. London: Sage.

Hine, C. (2015) ​Ethnography for the Internet: Embedded, Embodied and Everyday.​ London: Bloomsbury Academic.

Hjorth, L., Horst, H., Galloway, A. and Bell, G. (eds) (2017) The Routledge Companion to Digital Ethnography. London: Routledge.

Kendall, L. (2002) Hanging out in the Virtual Pub: Masculinities and Relationships Online. Berkeley, CA: University of California Press.

Kozinets, R. (2002) ‘The field behind the screen: Using netnography for marketing research in online communities.’ Journal of Marketing Research, 39(1): 61-72.

Kozinets, R.V. (2019) Netnography: The Essential Guide to Qualitative Social Media Research, 3rd edition. London: Sage.

Lee, M. (2017) ‘Don’t give up! A cyber-ethnography and discourse analysis of an online infertility patient forum.’ Culture, Medicine, Psychiatry, 41(3): 341-367.

Markham, A.N. (1998) Life Online: Researching Real Experience in Virtual Space. Walnut Creek, CA: AltaMira Press.

Markham, A. (2005) ‘The methods, politics and ethics of representation in online ethnography.’ In: N. Denzin and Y. Lincoln (eds) The Sage Handbook of Qualitative Research, 3rd edition, 793-820. London: Sage.

Markham, A.N. (2013) ‘Fieldwork in social media: What would Malinowski do?’ Qualitative Communication Research, 2(4): 434-446.

Møller, K. and Robards, B. (2019) ‘Walking through, going along and scrolling back: Ephemeral mobilities in digital ethnography.’ Nordicom Review, 40(s1): 95-109. https://www.degruyter.com/downloadpdf/j/nor.2019.40.issue-s1/nor-2019-0016/nor-2019-0016.pdf

Murthy, D. (2008) ‘Digital ethnography: An examination of the use of new technologies for social research.’ Sociology, 42(5): 837-855.

Pentzold, C. (2017) ‘“What are these researchers doing in my Wikipedia?” Ethical premises and practical judgment in internet-based ethnography.’ Ethics and Information Technology, 19(2): 143–155.

Pink, S., Horst, H., Postill, J., Hjorth, L., Lewis, T. and Tacchi, J. (2016) Digital Ethnography: Principles and Practice. London: Sage.

Postill, J. and Pink, S. (2012) ‘Social media ethnography: The digital researcher in a messy web.’ Media International Australia, 145: 123-134.

Reagle, J. (2014) ‘Verklempt: Historically informed digital ethnography.’ Ethnography Matters. Available at: https://ethnographymatters.net/blog/2014/06/10/verklempt-historically-informed-digital-ethnography/

Shumar, W. and Madison, N. (2013) ‘Ethnography in a virtual world.’ Ethnography and Education, 8(2): 255-272.

Tummons, J., MacLeod, A. and Kits, O. (2015) ‘Ethnographies across virtual and physical spaces: A reflexive commentary on a live Canadian/UK ethnography of distributed medical education.’ Ethnography and Education, 10(1): 107-120.

Underberg, N.M. and Zorn, E. (2014) Digital Ethnography: Anthropology, Narrative, and New Media. Austin: University of Texas Press.

Walton, S. (2018) ‘Remote ethnography, virtual presence: Exploring digital-visual methods for anthropological research on the web.’ In: C. Costa and J. Condie (eds) Doing Research in and on the Digital: Research Methods across Fields of Inquiry. London: Routledge.

Wilson, B. (2006) ‘Ethnography, the Internet, and youth culture: Strategies for examining social resistance and “online-offline” relationships.’ Canadian Journal of Education, 29(1): 307–328.

Surveys and questionnaires

Simsek, Z. and Veiga, J.F. (2001) ‘A primer on Internet organizational surveys.’ Organizational Research Methods, 4(3): 218-235.

Yun, G.W. and Trumbo, C.W. (2000) ‘Comparative response to a survey executed by post, e-mail, & web form.’ Journal of Computer-Mediated Communication, 6(1): https://doi.org/10.1111/j.1083-6101.2000.tb00112.x

Ethics guidance

ASA (2011) Ethical Guidelines (PDF). Available at: https://www.theasa.org/downloads/ASA%20ethics%20guidelines%202011.pdf

American Anthropological Association (2009) Code of Ethics of the American Anthropological Association. Washington D.C.: American Anthropological Association. Available at: http://s3.amazonaws.com/rdcms-aaa/files/production/public/FileDownloads/pdfs/issues/policy-advocacy/upload/AAA-Ethics-Code-2009.pdf

Association of Internet Researchers (2019) Internet Research: Ethical Guidelines 3.0. Available at: https://aoir.org/reports/ethics3.pdf

Association of Internet Researchers (2012) Ethical Decision-Making and Internet Research: Recommendations from the AoIR Ethics Working Committee (Version 2.0): https://aoir.org/reports/ethics2.pdf

British Psychological Society (2017) Ethics Guidelines for Internet-Mediated Research 2017: https://www.bps.org.uk/news-and-policy/ethics-guidelines-internet-mediated-research-2017

British Sociological Association (date unknown). Ethics Guidelines and Collated Resources for Digital Research: Statement of Ethical Practice Annexe: https://www.britsoc.co.uk/media/24309/bsa_statement_of_ethical_practice_annexe.pdf

Markham, A. and Buchanan, E. (2012) ​Ethical Decision-Making and Internet Research: Recommendations from the AoIR Ethics Working Committee Report (Version 2.0).​ Available at: https://aoir.org/reports/ethics2.pdf

Markham, A.N. (2012) ‘Fabrication as ethical practice: Qualitative inquiry in ambiguous internet contexts.’ Information, Communication and Society, 15(3): 334-353. http://dx.doi.org/10.1080/1369118X.2011.641993

Markham, A., Tiidenberg, K. and Herman, A. (2018) ‘Ethics as methods: Doing ethics in the era of big data research – Introduction.’ Social Media + Society, 4(2): iFirst. https://doi.org/10.1177/2056305118784502

Sveningsson Elm, M. (2009) ‘How do various notions of privacy influence decisions in qualitative internet research?’ In: A. Markham and N. Baym (eds) Internet Inquiry: Dialogue among Researchers, 69-87. Thousand Oaks: Sage.

Tiidenberg, K. (2018) ‘Ethics in digital research.’ In: U. Flick (ed.) Handbook of Qualitative Data Collection, Chapter 30. London: Sage.

Tiidenberg K. (2018) ‘Research ethics, vulnerability, and trust on the internet.’ In: J. Hunsinger, L. Klastrup and M. Allen (eds) Second International Handbook of Internet Research. Dordrecht: Springer. Available at: https://doi.org/10.1007/978-94-024-1202-4_55-1

Qualitative research

A kick in the teeth?: The problems with “hierarchies of qualitative research” for policy-making and evidence-based policing

The problem with “gold standard” hierarchies of evidence

Four years ago I co-authored a paper on the rise of the “evidence base” (EB) in policing which drew attention to misguided attempts to replicate the “gold standard” hierarchies of evidence used in medicine, health, social care, education and elsewhere in the context of policing (see Lumsden and Goode 2016). The paper addressed the rise of evidence-based policy and practice as a dominant discourse in policing in the UK, and the implications this has for social scientists conducting research in this area, and for police officers and staff. Our paper was intended as an exploration and conversation starter, to draw attention to the dangers and risks of a narrow focus on research ultimately driven by government and policy-makers’ constructions of what counts as evidence (see also Lumsden 2017). Other scholars have made similar observations, drawing our attention to the false premise that we should even attempt an evidence-based approach in policing (see Thacher 2001).

The development of an evidence-base in policing has largely been driven by the Maryland Scale of Scientific Methods, imported from the USA to the UK and adopted and promoted by its proponents (e.g. Sherman). The scale places systematic reviews, RCTs and other positivist scientific methods in the top layers of the hierarchy, with qualitative methods at the bottom. Evidence-based policing (EBP) was also later developed in the form of the “Evidence Based Policing Matrix” devised by Lum and colleagues (see Lum, Koper and Telep 2011): “…a research-to-practice translation tool which organizes moderate to very rigorous evaluations of police interventions visually, allowing agencies and researchers to view the field of research in this area”, and which in that sense is more sympathetic to qualitative research.

We argued that the adoption of evidence-based policing (EBP) and the related “gold standards” used to evaluate research (such as those measurable on the Maryland Scale) act as a “technology of power” (Foucault 1988) to draw boundaries (Gieryn 1983; Styhre 2011) around which methodologies and forms of knowledge are legitimate and useful for policing. We also drew attention to the risk that decades of seminal policing research will be lost to researchers entering the field if its utility for informing policing and criminal justice is judged using the “gold standard” criteria defined by the evidence-based movement more broadly.

Qualitative methods and the evidence-base

The general disregard of qualitative methods in evidence-based policy is not new, and the debate has been well trodden and rehearsed in social care, education, medicine, and health care from the 1990s onwards (e.g. see Avby, Nilsen and Dahlgren 2014; Dixon-Woods, Bonas and Booth 2006). It is worth noting that the recent College of Policing definition of evidence-based policing in the UK has widened to refer to the “best available evidence from appropriate methods”, and highlights the need for a clear theoretical basis and context of research:

“The ‘best available’ evidence will use appropriate research methods and sources for the question being asked. Research should be carefully conducted, peer reviewed and transparent about its methods, limitations, and how its conclusions were reached. The theoretical basis and context of the research should also be made clear. Where there is little or no formal research, other evidence such as professional consensus and peer review, may be regarded as the ‘best available’, if gathered and documented in a careful and transparent way.” (College of Policing 2017)

What I now want to address in this piece are the main problems concerning the use of hierarchies of evidence for judging the merits and rigour of qualitative research. Although I’m sympathetic to attempts to ensure qualitative research is included in EBP policy-making, and accept that guidance is needed for policy-makers, “hierarchies of qualitative research” are misleading and do not assist policy-makers in conducting a robust, fair and inclusive evaluation of qualitative studies, or in assessing whether they are “promising” enough to be included in the policing evidence-base.

First, I’ll respond to the very notion of “hierarchies of qualitative research”. Then, I’ll turn to some of the claims made in a related blog on the hierarchy (see Huey 2019) which discusses the problems with “pseudo-scientific” “junky” qualitative research.

A “Hierarchy of Qualitative Research” for policy-making


In a response to tweets concerning the above example of a “hierarchy for qualitative research” for evidence-based policing (EBP), it was claimed that “rigour” is built into the hierarchy at every step. However, the very notion of a “Hierarchy of Qualitative Research” is in itself misleading and only serves to legitimise certain forms of knowledge, types of qualitative research and methods while dismissing others. It engages in boundary work (as noted above) around which particular methodologies, forms of knowledge and “voices” are viewed as legitimate and useful for policing (Gieryn 1983). We cannot use hierarchies to judge qualitative studies.

An abbreviated (and by no means complete) list of some of the problems I see with the use of hierarchies for judging qualitative research in EBP follows:

1. Systematic reviews are at the top of the hierarchy (not surprising, given it largely follows the tenets of previous hierarchies such as the Maryland Scale). Are these qualitative systematic reviews per se, or the use of qualitative data within a systematic review? Are the criteria for a “what works” systematic review still determined by quantitative principles (e.g. see Booth 2001)?

2. As stated in the blog which introduces us to the model, it makes one big assumption – “that the study being used was well-designed and well-executed” – although a “new category for 0: studies that [are] manifestly poorly designed and executed” was also added. The question, then, is how policy-makers are to know, before they apply the model, that a study is “poorly designed and executed”. Are we assuming they have the skills to make these judgement calls in evaluating qualitative research before applying the model, which then becomes step two of the process? If so, the model is not really assisting them to evaluate qualitative research.

3. It fails to acknowledge the relationship between methods and methodologies…

4. …and fails to acknowledge how methodology and methods are always tied to the research question being posed. Will the method used help answer the question/s posed? More methods (i.e. “quadrangled studies”) do not ensure quality, as methods are ultimately tied to the research question and aims. As Murphy et al. write in their guidance on using qualitative methods in health technology assessment (HTA):

“The goal of all research in HTA should be to establish knowledge about which we can be reasonably confident, and to provide findings that are relevant to policy makers and practitioners. Therefore, decisions about whether qualitative or quantitative methods (or a combination of both) are most appropriate to a particular research problem should be made on the basis of which approach is likely to answer the question most effectively and efficiently.” (1998: iii, author’s emphasis)

5. Quantity over quality: the model gives the impression that more methods are better – i.e. “quadrangled studies” are “very promising”, using “4 or more different methods or data sources”, which could include interviews, focus groups, field observations and media analysis. (Also see point 10 below.)

6. “Mixed methods” studies can involve combining various forms of qualitative methods, as well as combining quantitative and qualitative methods. Yet in the hierarchy a “very promising” study is effectively also mixed methods, even though it consists only of qualitative methods, whereas quant/qual mixed methods studies are rated only as “what’s promising”.

7. It doesn’t acknowledge how different qualitative methods also have specific standards which guide their design and use, e.g. focus groups versus participant observation (as well as the general standards for qualitative research).

8. Context is key: both in qualitative research, and in the study of policing and crime. Policing and crime is not the same field as, for example, health-based EBP (although health research has its own issues concerning context and individual differences in the design and implementation of various initiatives).

9. Case studies are situated at the bottom of the hierarchy alongside “anecdotes” and “expert opinion”, when in fact case studies vary widely and can involve the use of multiple methods (qualitative and/or quantitative) to explore research question/s (e.g. see the use of clinical case studies in nursing and health care).

10. It uses the standards of quantitative research to judge qualitative research (also see point 5 above). The overall message, whether intended or not, is that the more qualitative methods the better, the more interviewees the better, etc., as you will then be able to “generalise” (when in fact generalisation is not necessarily the main aim of qualitative studies). (For a discussion of the number of participants in qualitative studies, see the insightful report “How many interviews is enough?” (Baker and Edwards 2013).)

In sum: undoubtedly, models such as this will appeal to some police and policy-makers who are looking for a “quick fix”, but they do not help them to evaluate “good” versus “bad” qualitative research: this is more complex, as reflected, for example, in the various checklists and tools used in health and medicine to evaluate qualitative studies. Like the EBP models before it, the hierarchy will result in the privileging of certain (qualitative) methods, types of research, and therefore forms of knowledge, at the expense of others. Ann Oakley wrote about the need to dissolve the “methods war” in relation to education and the evidence-base in 2002, arguing that:

“A main danger ahead is that large areas of research, traditionally important in education, escape the evidence net either because no one can reach any consensus about how to sort out the reliable from the unreliable, or (which might be worse) because a new orthodoxy sets in according to which qualitative research is simply a world apart—nothing to do with evidence at all.” (Oakley 2002: 284)

I’ve been training non-academic groups and professionals in qualitative methods for the past five years (including police officers, NGOs, policy makers, government, local authorities, health care, charities etc.) and what is clear is that there is a desire to learn the intricacies of qualitative methods so they can evaluate research studies and strengthen their own in-house work. There is also a need for training on how to evaluate qualitative research. Crucially, this includes training in how we should not use the principles of quantitative research to judge qualitative research. These hierarchies do not adequately equip policy-makers to be able to assess the quality of qualitative research.

“Hierarchies of qualitative research” also risk reifying dominant discourses of the evidence-base in policing. There is a risk of creating expectations that police-researchers, “pracademics”, and academics must mould their research so that it “fits” the model – to be, for example, “very promising” – so that it will then be used to build the evidence-base, rather than their research being robustly linked to questions of epistemology, ontology, methodology and the research question they wish to answer. For example, in the UK the Research Excellence Framework (REF) and the proposed Knowledge Exchange Framework (KEF) put pressure on researchers to demonstrate the “usefulness” of their work, and EBP aligns with this impetus (Lumsden and Goode 2018).

Claims of “pseudoscientific” “junky qualitative research”

This also relates to a point made in the blog launching the hierarchy, which describes an instance of what the author calls “pseudoscientific” and “junky qualitative research”. The argument is that the authors of one particular study, selected as an example of poor quality research, made grander claims than were appropriate from a small sample of only 13 participants, including making national policy recommendations in relation to criminal justice policy. (There is no reference to the particular report in the blog.) I agree with the more general point about not making national policy recommendations from such a small study (and with the other criticisms); however, the hierarchy does not help us address these issues. For example:

1. In the blog it is claimed that there has been “deep resistance among qualitative researchers to the idea of trying to set standards for their work”. Perhaps this is the case in Canada and the US, where this model originates, but it hasn’t been my experience. There are multiple examples of ongoing work and discussion in the social sciences, psychology, etc. regarding the need to improve transparency and set standards. One example is the influential work of Braun and Clarke on thematic analysis in New Zealand/the UK, their particular approach which they call “reflexive thematic analysis”, and their related calls for qualitative researchers to be more transparent about how they have analysed data. Their website also includes “guidelines for reviewers and editors evaluating thematic analysis manuscripts”. The point here is that qualitative research does not occur in a vacuum where “anything goes”.

2. There is a whole continuum of qualitative research styles and inquiry, with, for example, arts-based inquiries sitting at one end of the qualitative continuum. Some arts-based projects and inquiries have had impact in relation to policy-making in, for example, the UK, and policy-researchers are also increasingly using these methods. Life history and narrative approaches might include only a small number of participants, but for good reasons, related to a host of factors including different disciplines, methodologies and philosophies of social science.

3. A lack of training in qualitative methods for graduate students. This is one point I do agree with: there is a need for training in both methodologies and methods. I’d also add training in the politics of the evidence-base, evaluation methods, and awareness of the impact agenda.

Final thoughts

Researchers have a responsibility not to make grand claims from their research, and we do need standards for judging qualitative research. However, the hierarchy does not address these issues; it only exacerbates them. We can’t solve them by privileging studies which use “more” methods. The issue is more complex. Hierarchies of qualitative research, like those before them, home in on what their creators see as the low standards in the field (cf. Lumsden and Goode 2016) and risk “disciplining” qualitative research(ers) (Denzin, Lincoln and Giardina 2006).

We might also look to, and learn from, the debates and work previously conducted in fields such as health and medical research in terms of evaluating qualitative research for policy and judging what is “good qualitative research”. As Barbour (2001) writes in relation to “qualitative research checklists” in medical research: “Reducing qualitative research to a list of technical procedures … is overly prescriptive and results in ‘the tail wagging the dog’”. These checklists “can strengthen the rigour of qualitative research only if embedded in a broader understanding of qualitative research design and data analysis” (2001: 322, author’s emphasis).

Therefore, any practitioner-focused framework which aims to assess the rigour of qualitative research must attempt to be inclusive of a whole host of epistemological and ontological standpoints, and their related methodologies and methods. Transparency in how we conduct our research is key, but evidence-based policing also needs to be inclusive rather than exclusive, and not kick qualitative research in the teeth.


Avby G, Nilsen P and Dahlgren MA (2014) Ways of understanding evidence-based practice in social work: A qualitative study. British Journal of Social Work 44: 1366–1383.
Baker SE and Edwards R (2013) How many qualitative interviews is enough? NCRM Review Paper. Accessed July 2019: http://eprints.ncrm.ac.uk/2273/4/how_many_interviews.pdf
Barbour RS (2001) Checklists for improving rigour in qualitative research: a case of the tail wagging the dog? British Medical Journal 322: 115–117.
Denzin NK, Lincoln YS and Giardina MD (2006) Disciplining qualitative research. International Journal of Qualitative Studies in Education 19(6): 769–782.
Dixon-Woods M, Bonas S, Booth A, et al. (2006) How can systematic reviews incorporate qualitative research? A critical perspective. Qualitative Research 6(1): 27–44.
Foucault M (1988) Technologies of the self. In: Martin LH, Gutman H and Hutton PH (eds) Technologies of the Self. Amherst, MA: University of Massachusetts Press, 16–49.
Gieryn TF (1983) Boundary-work and the demarcation of science from non-science: strains and interests in professional ideologies of scientists. American Sociological Review 48(6): 781–795.
Huey L (2019) If we’re going to use qualitative research for public policy, then we need better standards. 22 July 2019. Accessed July 2019: https://www.lhuey.net/single-post/2019/07/22/If-We’re-Going-to-Use-Qualitative-Research-for-Public-Policy-then-We-Need-to-Better-Standards
Lum C, Koper CS and Telep CW (2011) The evidence-based policing matrix. Journal of Experimental Criminology 7(1): 3–26.
Lumsden K (2016) Police officer and civilian staff receptivity to research and evidence-based policing in England: providing a contextual understanding through qualitative interviews. Policing: A Journal of Policy and Practice 11(2): 157–167.
Lumsden K and Goode JE (2016) Policing research and the rise of the evidence-base: police officer and staff understandings of research, its implementation and “what works”. Sociology 52(4): 813–829.
Lumsden K and Goode JE (2018) Public criminology, reflexivity and the enterprise university: experiences of research, knowledge transfer work and co-option with police forces. Theoretical Criminology 22(2): 243–257.
Murphy E, Dingwall R, Greatbatch D, Parker S and Watson P (1998) Qualitative research methods in health technology assessment: a review of the literature. Health Technology Assessment 2(16).
Oakley A (2002) Social science and evidence-based everything: the case of education. Educational Review 54(3): 277–286.
Styhre A (2011) Knowledge Sharing in Professions. Surrey: Gower.
Thacher D (2001) Policing is not a treatment. Journal of Research in Crime and Delinquency 38(4): 387–415.


Dr Karen Lumsden is based at the University of Nottingham in the UK. She is a sociologist and criminologist with expertise in qualitative research methods and applied research with a range of audiences including police constabularies and victim organisations. She has held posts at the University of Leicester, Loughborough University, the University of Abertay Dundee, and the University of Aberdeen. Karen has a PhD in Sociology, Masters in Social Research, MA in Sociology, and PGCE in Higher Education Learning & Teaching, all from the University of Aberdeen. Karen has experience of teaching qualitative research methods at postgraduate level and to academics and practitioners via the Social Research Association, her own consultancy (The Qualitative Researcher), and at international summer schools and ESRC doctoral training centres. She is the author of over 40 publications including four books, is on the Editorial Board of the journal Sociology, and is currently the Chair of the Editorial Board of Sociological Research Online. She tweets at @karenlumsden2

Qualitative research, Thematic analysis

Why themes don’t ’emerge’ from the data

Frequently, I read methods sections of articles or dissertations, or overhear students and academics commenting on how, in their thematic analysis, themes ‘emerged’ from the data. (I have probably been guilty of this myself in the past!)


The assumption that the researcher’s only job in thematic analysis is to grab these (pre-existing) themes out of an interview transcript or field notes ignores the labour that goes into qualitative data analysis, including the organisation of data, levels of coding, and the subsequent generation of themes.

Themes are constructed by the researcher/s, and are shaped and reshaped in the often cyclical process of analysis, interpretation, analysis, interpretation and so on…

Qualitative researchers are also storytellers, organising and structuring data via the stages of coding (whether deductive and/or inductive coding), organising these codes, and then constructing themes from these codes.

In this process the messy qualitative data is reorganised and a story is woven and constructed from the themes.

So, how should we refer to this process? Perhaps, instead of ‘emerging’ from the data, themes are generated, identified, and/or constructed by the researcher from the qualitative data.


Reflexivity in research: messy but necessary

“… you must learn to use your life experience in your intellectual work: continually to examine and interpret it. In this sense craftsmanship is the center of yourself and you are personally involved in every intellectual product upon which you work. To say that you can ‘have experience’, means, for one thing, that your past plays into and affects your present, and that it defines your capacity for future experience. As a social scientist, you have to control this rather elaborate interplay, to capture what you experience and sort it out; only in this way can you hope to use it to guide and test your reflection, and in the process shape yourself as an intellectual craftsman [sic].”

(C. Wright Mills, 2000[1959], The Sociological Imagination, p.196)

As this quote from C. Wright Mills demonstrates, reflexivity is wrapped up in intellectual craft and we are personally involved in every research project we work on. The value of reflexivity is now largely accepted by qualitative researchers and it has helped to address the sanitised nature of research accounts which traditionally featured in methods textbooks.

Reflexivity is valuable in that it draws attention to the researcher as part of the world being studied while reminding us that those individuals involved in our research are subjects, not objects. By being reflexive we acknowledge that social researchers cannot be separated from their autobiographies and will bring their values to the research and how they interpret the data (Devine and Heath, 1999).

Reflexivity highlights the ‘messy’ nature of the social world and social research, including the complex and myriad power contests and relations which must be negotiated, and the implications that must be attended to in the course of our research – from design through to data collection, analysis, dissemination and application. It also extends to the contexts and cultures of knowledge production – including research users, participants, funders, universities, publics, and the disciplinary fields we operate within/between/across. It is increasingly likely today that academics will work across disciplines, which has further implications for the identity of the researcher and the field/s they inhabit.

There are numerous definitions and operationalizations of reflexivity. Lynch refers to a ‘confusing array of versions of reflexivity’ (2000, p.27). Although social scientists now tend to agree on the importance of being reflexive, they do not share a coherent conception of what ‘being reflexive’ means or how to practice reflexivity. The etymological root of the term reflexive means ‘to bend backwards upon oneself’, in contrast to reflection, which entails thinking about something after the event (Finlay and Gough, 2003, p.ix). Gough (2003) proposes that we use the plural term ‘reflexivities’ in order to challenge the assumption that reflexivity is something which we can all agree on and which can be captured, and to signify plurality, flexibility and conflict.

I find the feminist psychologist Sue Wilkinson’s (1988) definition of reflexivity particularly useful. She argues that at its simplest reflexivity involves ‘disciplined self-reflection.’ She distinguishes between three forms of reflexivity. First, ‘personal reflexivity’ (akin to what others have termed ‘self-reflexivity’ (Lather, 1993) or ‘recognition of self’ (Pillow, 2003)) which focuses on the researcher’s own identity where research becomes ‘an expression of personal interests and values’ and is thus an essential aspect of the feminist research paradigm. This form of reflexivity recognizes the reciprocal relationship between life experiences and research. Second, ‘functional reflexivity’ involves reflection on the nature of the research enterprise including the choice of method and the construction of knowledge in order to reveal assumptions, values and biases. Third, ‘disciplinary reflexivity’ focuses on the form and development of a discipline or sub-discipline. This includes, for instance, how the traditional paradigm of psychology has operated to exclude women and stall development of a feminist psychology (Wilkinson, 1988, pp.494-495).

In their book Reflexive Methodology, Alvesson and Sköldberg (2009) claim that different uses of reflexivity or reflection make us aware of the complex relationship between processes of knowledge production and the various contexts of such processes, including the involvement of the knowledge producer. They propose that reflective research has the characteristics of interpretation and reflection. Firstly, all references to empirical data are the result of interpretation. Here, ‘the idea that measurements, observations, the statements of interview subjects, and the study of secondary data such as statistics or archival data have an unequivocal or unproblematic relationship to anything outside the empirical material is rejected on principle’ (Alvesson and Sköldberg, 2009, p.9). This calls for awareness of theoretical assumptions, language and prior understanding of social phenomena. Secondly, reflection involves turning attention ‘inwards’ to focus on the researcher, the researched, society more generally, the intellectual and cultural traditions, and the ‘problematic nature of language and narrative’ in the research setting (2009, p.9).

Therefore, reflective research is related to the selection of the research topic, the research context, relationships with/between the researched, the choices made in relation to the management and conduct of data collection (and while in the field), the representation of cultures, individuals and the social world, and also the power dynamics and relations implicated, generated and created via research and reflection. Reflexivity also extends beyond the individual academic to include acknowledgment of the limits of knowledge associated with the social scientist’s membership and position in the intellectual field (Bourdieu and Wacquant, 1992).

Finally, reflexivity is difficult (Alvesson and Sköldberg, 2009). It is not merely a quality of the researcher, but is a practice which must be honed, applied, and kept in mind throughout the research process. As Tim May writes:

“Writings on reflexivity tend to be manuals that provide steps for the practitioner to become more reflexive. What is replicated is an inductivism that separates content, character, and context. There are no easy routes and no self-help books with ten steps to ‘becoming reflexive’.” (with Perry, 2011, p.6)

For me, reflexivity focuses on the unfamiliar, the uncomfortable, the messy, difference/s, and writing up our failures (cf. Pillow, 2003). A reflexive approach enables us to be conscious of the social, ethical and political impact of our research, the central, fluid and changing nature/s of power relations (with participants, gatekeepers, research funders, etc.) and our relationships with the researched, aspects which diffractive methodologies overlook. Reflexivity is (and can be) creative, involving re-readings of, and re-turnings to, our writing and texts (as the chapters on reflexivity in action demonstrate). Crucially, we must consider reflexivity and reflective practices in the context of collaborative research with various research communities, and the politics of these relationships and therefore of reflexivity itself.

Note: the above material is taken (and amended) from my new book Reflexivity: Theory, Method and Practice which is in press with Routledge for January 2019.


Alvesson, M. & Sköldberg, K. (2009). Reflexive Methodology: New Vistas for Qualitative Research, 2nd edn. London: Sage.

Bourdieu, P. & Wacquant, L.J.D. (1992). An Invitation to Reflexive Sociology. Cambridge: Polity Press.

Devine, F. & Heath, S. (1999). Sociological Research Methods in Context. New York: Palgrave.

Finlay, L. & Gough, B. (Eds.), (2003). Reflexivity: A Practical Guide for Researchers in Health and Social Sciences. Oxford: Blackwell.

Gough, B. (2003). Deconstructing Reflexivity. In: L. Finlay & B. Gough (Eds.), Reflexivity: A Practical Guide for Researchers in Health and Social Sciences (pp. 21-35). Oxford: Blackwell.

Lather, P. (1993). Fertile Obsession: Validity after Poststructuralism. Sociological Quarterly, 34(4): 673-693.

Lynch, M. (2000). Against Reflexivity as an Academic Virtue and Source of Privileged Knowledge. Theory, Culture & Society, 17(3): 26-54.

May, T. with Perry, B. (2011). Social Research & Reflexivity: Content, Consequence and Context. London: Sage.

Mills, C. Wright. (2000[1959]). The Sociological Imagination. New York: Oxford University Press.

Pillow, W. (2003). Confession, Catharsis or Cure? Rethinking the Uses of Reflexivity as Methodological Power in Qualitative Research. International Journal of Qualitative Studies in Education, 16: 175-196.

Wilkinson, S. (1988). The Role of Reflexivity in Feminist Psychology. Women’s Studies International Forum, 11(5): 493-502.


Audit Culture and the Enterprise University: Sociologists as Servants of the Powerful?

I’m reposting this blog which was originally published by the Sociological Review blog in 2016.

The academic march to the beat of metrics, audit culture and instrumental research vis-à-vis the impact and income-generation culture continues apace:


In the age of impact and public engagement, demonstrating that we are engaged in activities outside of, or alongside, research has become an increasingly important metric of academic performance and value. Badged under the auspices of enterprise, knowledge transfer and/or impact, engagement with external stakeholders and the creation of outputs of economic, social and political value is now part of the job description of the academic in the neo-liberal entrepreneurial university. However, the problems faced by sociologists in their role as public intellectuals are nothing new; they are part of a continuing series of dilemmas that social scientists have had to face throughout modern history.

The vast changes which have occurred in the higher education landscape in past decades in the United Kingdom and other Western countries have been widely debated and critiqued by social scientists commenting on the transfer of new managerialism and ‘audit culture’ to higher education in the neo-liberal context. This ‘panopticization of the university’ leaves academics subject to the recording, monitoring and measuring of performance and output in relation to research impact. The notion of the ‘entrepreneurial funded researcher’ highlights the pressure to draw in research grants and be willing to engage with publics, ensuring impactful research. Sociologists face particular challenges when attempting to engage in enterprise work. There is not always a clear ‘off the shelf product’ to be transferred or supplied. There are barriers to transferring social and cultural activities with external partners into a financial or economic outcome for the corporate income-focused ‘enterprise university’. There are further barriers to overcome in relation to what users consider to be legitimate knowledge, how critical we can be when engaged in its (co)production, and also in models of transferring knowledge into practice.

I wish to briefly draw on my experiences as a sociologist (of crime and policing) conducting an enterprise and knowledge transfer project to build partnerships with police forces in England in order to illuminate some of the challenges faced. This work was conducted in the wider context of increased emphasis on evidence-based policing, related to the move to professionalise police forces (for instance via the establishment of the College of Policing in 2012 and various ‘What Works’ Centres). The focus and shift to an evidence-based approach and the related privileging of gold-standard methodologies such as randomised control trials (RCTs) and systematic reviews has clear implications for sociologists conducting research with police, and the utilisation (and acceptance or legitimation) of alternative methodologies such as qualitative and ethnographic approaches, the latter of which has been crucial for the development of policing studies from the 1970s.

Social scientists unfamiliar with enterprise have to navigate unfamiliar terrain in terms of how universities typically define and separate ‘research’ as opposed to ‘enterprise’. This separation does not transfer easily to a social science context, or to the types of organizations and settings which social scientists will generally engage with. There is also a lack of clarity about the meanings of enterprise, engagement and impact, in the sense that these are not well articulated or differentiated in practice by universities themselves. For instance, individuals tended to use the terms enterprise and impact interchangeably. Enterprise has been experienced and understood as a means of generating impact from research (and, for universities, with income generation as the end goal). Enterprise becomes a process of impact generation. However, the lack of articulation and clarity on the part of universities means that enterprise, impact and engagement occupy semantically and discursively shifting ground – which may enhance their power to operate as a disciplinary mechanism in the Foucauldian sense, as academics struggle to understand and meet their requirements.

Understandings of how stakeholders defined, constructed and engaged in contestation regarding what was legitimate knowledge in relation to evidence-based policing became central points for observation and analysis. Boundary-work, which can be understood as a ‘stylistic resource for ideologies of a profession or occupation’, was a crucial part of the engagement process. We were attempting to expand our authority and expertise as sociologists into the domain of policing which was gradually becoming claimed by the evidence-based policing movement, in addition to engaging in boundary creation between practitioner-based working theories and social scientific knowledge. There is a danger that knowledge transfer (whether via consultancy or academia) becomes a means of reaffirming organizational decisions. Our activities and engagement with stakeholders could ‘end up reinscribing the very power geometries’ that sociology should ‘set out to problematise’.

The experience of police-academic partnerships also made evident the often-unsettling compromises which sociologists might be required to consider and/or make in order to maintain positive relationships with stakeholders. However, engagement such as this is vital if we are to ensure that critical sociological knowledge and research is accessible to publics such as the police, and that monolithic ideologies and research practices such as those encompassed in the evidence-based policing movement do not become the status quo, silencing and/or overshadowing critical sociological (and other) knowledge and research. The way the idea of evidence-based practice discredits any opposition became evident in this work and required tactful negotiation. Under the auspices of new managerialism and the extensive cuts to public spending experienced by the police and other public sector areas after the post-2008 recession, an environment has been created for a paradox in police-academic partnerships: both police forces and academics are under pressure to engage with external partners as the evidence-base becomes entangled with public management.

Although there are undoubtedly benefits to be had from engagement or enterprise, it is important to be aware of the ways in which this can be co-opted into merely serving the interests of users or consumers of research. The contribution that sociology can make to research in areas such as policing, crime and justice is of value and is crucial for informing police and governmental responses. But, despite universities’ calls for enterprise work by social scientists (and academics more generally), and research funding councils’ calls for impactful research, (some) sociologists who wish to engage publicly in these activities are in a unique position if they also wish to challenge relations of power and organizational structures. Therefore, while public engagement is undoubtedly a vital part of the sociological and academic division of labour, greater critical reflection is needed from sociologists regarding the practice, politics and ethics of this engagement. Caution is required regarding the ways in which it can be co-opted into the call for impactful and user-focused research in the wider context of the entrepreneurial university and new public managerialism.

Amit, V. 2000 ‘The University as a Panopticon: Moral Claims and Attacks on Academic Freedom’. In M. Strathern (ed.) Audit Cultures, London: Routledge, pp.215-35.

Burawoy, M. 2005 ‘For Public Sociology’, American Sociological Review 70(1): 4-28.

Gattone, C.F. 2006 The Social Scientist as Public Intellectual, Oxford: Rowman & Littlefield.

Gieryn, T.F. 1983 ‘Boundary-Work and the Demarcation of Science from Non-Science: Strains and Interests in Professional Ideologies of Scientists’, American Sociological Review 48(6): 781-95.

Marginson, S. and Considine, M. 2000 The Enterprise University, New York: Cambridge University Press.

Power, M. 1997 The Audit Society, Oxford: Oxford University Press.

Strathern, M. (ed.) 2000 Audit Cultures, London: Routledge.

Taylor, Y. and Addison, A. 2011 ‘Placing Research: “City Publics” and the “Public Sociologist”’, Sociological Research Online 16(4). URL (accessed 12 May 2016): http://www.socresonline.org.uk/16/4/6.html