Author
Lyn Richards

Pub Date: 11/2009
Pages: 256

Qualitative Software

It's now widely assumed that qualitative researchers will use specialised computer software. As you'll find in Chapter 1 of Handling Qualitative Data, the book doesn't require that readers use any particular software, but at each stage it does discuss the challenges and opportunities you are likely to meet with software. It expects, as most researchers now do, that readers will use some computer program. I expect that too.

This is not, I assure you, because I spent two decades developing qualitative software. On the contrary, that experience taught me to be much more wary of the risks that software use poses to research projects, and much more reflective about the motivations and abilities of the commercial companies providing research software, than most observers would be. The sections below are designed to help researchers approaching qualitative projects to decide why they would want to use software, and then to choose their tools well. They offer help with these issues:

  • Should you use qualitative software?
  • What's the state of the art?
  • Where to go for unbiased (and some biased) information?
  • Hunting down and talking to the Developer
  • You and software: managing this relationship
  • A quick online guide to Stepping into Software

Should you use qualitative software?
Almost certainly, yes. The only exceptions I would make are researchers who are completely incompetent on computers and terrified of them, and it is now very hard to be in that position and still be researching. If you don't come into that category, it's highly likely that there is software that would help your research greatly, save you an extraordinary amount of time and allow you to do things with your data that couldn't be done manually. So long as you learn it and use it well.

What's the state of the art?
Qualitative computing is, by any count, two decades old. Here's my assessment of those 20 years.

Programs have progressed dramatically and, increasingly, divergently. The processes of developing them were hugely exciting. But in later years, once researchers accepted software, this became a highly commercialized area, and researchers are at risk of being exploited by greedy companies if they do not stand up for their rights. The vast majority of material about software tools is written by their developers, for marketing purposes. Researchers whose professional expertise is in analysing the meanings of words should use those skills in such situations.

Back in the research world, things could be better. Research tools require constant, collaborative and critical interaction between developers and researchers if innovation is to continue. So software users must be alert to corporate motives and vigilant in reporting directions that don't work. Novice researchers need relevant teaching and writing on methods. But debate about the impact of computing on qualitative research has become stuck in the mud of methodological territorialism and conservatism, weighed down by technical incompetence and bogged in the boredom of a development process that is more about advertising claims than research challenges. Have you noticed the dates on those still-cited conference papers and books (many of which are collected papers of long-ago conferences I attended)? The programs they discuss don't even exist now, and the complaints they made about software were answered in the first decade.

Where to go for unbiased (and some biased) information?
In this situation, how are you to find impartial, useful and non-marketing advice about software products?

The CAQDAS project (http://caqdas.soc.surrey.ac.uk/) at the University of Surrey is funded to provide an up-to-date (and regularly updated) overview of the programs currently available, with discussions and links to developer sites. You will find there a link to the overview book written by two of the project's experts, Ann Lewins and Christina Silver, in 2007. A PDF by the same authors can also be downloaded there, with detailed advice on choosing a CAQDAS package.

Other sites offer commentary and some are very detailed. The following may help.

Hunting down and talking to the Developer
Once you have explored the range of software available, always go to the developer's site. This is as essential as meeting the owner before you buy a dog. You need to learn not just about the software but about the people and commercial bodies producing, selling, marketing and upgrading it.

How to find them?

Several sites provide good lists of currently available products, and update them. The following are the most reliable:

  • Text Analysis Info is a very detailed website with definitions and information about text analysis and the software supporting it. A comprehensive and regularly updated list of programs for qualitative analysis is provided at http://www.textanalysis.info/
  • The American Evaluation Association website has a well-hidden but very full and up-to-date list of Qualitative Data Analysis Software, with details about the products and prices and links to their developers' websites (http://www.eval.org/Resources/QDA.htm).
  • The CAQDAS project maintains a status report on just nine of the most commonly used packages, with links to the developers' websites (http://caqdas.soc.surrey.ac.uk/Softwarenews.html). It also provides links for finding out about free and low-cost software.

What are you looking for?

  • Read the material on the website to find out what the software is designed for (are its tools relevant to your work?). Do a quick qualitative analysis of the text on that website: is this pertinent and sensible information, or is it marketing hype? What does the software actually do? (If you can't work this out from the website, worry. Maybe they don't want you to know?)
  • Download the free trial software and use it. (No free trial software? Ask, and if it's not available, go elsewhere. This may be exciting software in the future, but it's not safe to commit your project to it.)
  • Most sites have tutorials available online. This is by far the best way of getting a feel for what it would be like to work in that software. The tutorial should let you actually do something in the software, preferably a lot, following clear instructions and using the free downloaded software. (If they don't offer a detailed free tutorial, ask for one. If they are charging for help that you should get free, or if you can't learn enough from the free online resources, beware!)
  • If the developers have an online blog or discussion, join it and listen in. Is this a buzz of interesting research discussion or a litany of complaints? Do the developers contribute and does it sound as though they know about qualitative research and actually like to talk with researchers? (If not, they are not likely to be producing software that does what researchers are trying to do.) Does it sound as though people get into trouble with this product, and if they do, do they get helped? Still unsure? Ask the contributors to the discussion list.

You and software: managing this relationship
OK, you have some software. Now what?

It's highly unlikely that you will learn everything you need to do with that software package just by clicking around. It has to be learned, and the sooner you learn it the better. But you choose how you learn it. There are many ways to learn software and nobody should tell you what will work best for you. You probably know how you like to learn. Alone or with instruction? At your own pace or paced by a tutor? Playing with it or dead seriously doing something useful?

Most products offer courses (many cost serious money). Think before you enrol in one. How big is this course, how relevant will it be to your work, and how much individual attention will you get if you're stuck? Check out the tutor and their abilities: do they actually do research? (If not, will they be able to relate to your research goals?) For some researchers workshops fast-track confidence; for others they sap it.

So don't assume you have to attend a course - perhaps you'd do better quietly exploring tasks with a manual or a tutorial beside you. Or it may help to sit beside someone skilled in the software you are going to use and watch how they work.

But do learn it, and get to know the package before you put it to use. The best advice is to set out on some software-learning task. You might use pilot or trial data (some developers provide this on their websites) before starting your precious project in the software. Warning: if you let learning happen (or not) as you work in your own data, you are likely to damage the data and end up hating (and blaming) the software. You will also learn only the software functions that do what you were already doing without software (since these are the only tools you'll know to look for). At the end of the project, it's very frustrating to find that you could have asked quite different questions, and made quite different discoveries, had you known the software could support different ways of exploring and discovering.

A quick online guide to Stepping into Software.
Once you've made a decision about products, and once you have learned what tools the software offers and can use them fairly competently - use it. You'll forget what you've learned if you don't try to use it for your own purposes. It is essential to have the software working for you, in your project.

The link above takes you to a 10-step guide down that path, loosely following the ten chapters of Handling Qualitative Data. It aims to get you up and running in any of the software products whilst focusing on your own project.

For each step, the advice takes you through two sets of questions. What to ask about your project? And what needs to be done now?

Learning software is one thing; using it is another. Many researchers stall at this stage because they don't know where to start or what to ask as they set out to entrust their work to a newly learned software product.

Warnings: It's important to ensure that what you do is directed by your method and goals, not by the software. Any sophisticated software will offer tools to do far more than you, in this or any future project, will ever want to do. If you try to do everything it supports, inventing a reason for doing so in your project, the task will become impossible and (much worse!) the project will lose its purpose.

There will also be some processes for which software doesn't offer a tool - for example, thinking. Don't assume that if something can't be done in software it shouldn't be done.

And finally, a more general warning. It's increasingly common for researchers to learn qualitative software without first learning about qualitative methods. To do so is not wrong or immoral, but it does add massive challenges as they start to use the software tools. Like any powerful tool (think of the chainsaw), qualitative software can easily be used to make dramatic and unintended changes. If the operator is unsure why this matters, serious damage can result. More importantly, like any tool, its power is wasted if the user does not understand the purposes to which it can be put. The most common result of researchers learning software without method is that they do very little with the software, reaching only for the most obvious processes, like coding, and continuing in them ritualistically. You can code for ever and the software won't stop you. You have to know the purpose of coding, in your project, to design your codes and coding to that purpose.

Starting a qualitative project can be daunting, even if you are not also starting with a software package. If you have no training in qualitative research, but confront the need to make sense of your data, please start there, with the skills taught in Handling Qualitative Data, not with a software product. Why are you doing this project, and what outcome are you seeking? Now - can software help?