Susan Berkowitz, Karla Eisen, & Izabella Zandberg,
Westat, Rockville, Maryland, USA
“Using NVivo software as part of a multi-team analysis of in-depth interview data in a mixed methods evaluation”
This presentation will examine the impact of introducing computer-assisted analysis tools, specifically NVivo, into the qualitative analysis portion of a large mixed methods study employing three teams of analysts with widely varying degrees of experience in conducting qualitative analysis, as well as different levels of familiarity and comfort with the software.
Westat conducted a large mixed methods study of clinical
research networks in the United States and internationally.
As part of the study, qualitative data were collected
through in-depth, semi-structured interviews with
representatives of clinical research networks to discern
barriers to and facilitators of networks’ efficiency
and effectiveness in six domains of network functioning.
Interviews were tape-recorded, transcribed and entered into
NVivo for analysis. The analysis team was trained in use of
NVivo on two separate occasions. For all analysts this was
the first time they had used the software for a major
analytic project.
Nine analysts were clustered in three analysis teams by
content areas. Although this was not the original intent,
each team used the software somewhat differently. For one team (comprising the authors), a sample of transcripts from the management and governance content area was first selected, and initial common coding categories were developed by agreement among the three analysts. The resulting coding
tree was shared with the larger qualitative analysis team,
who adapted it to their needs as they saw fit. The
management and governance team continued to develop and
revise codes iteratively and by agreement as new themes
were discerned in the remaining transcripts. This team also
took advantage of such NVivo functions as Merge, node
reports and matrix intersections to help guide analysis and
would like to have done more had time permitted. Analysts
in the other teams ended up using NVivo largely for rough
data categorization (and unfortunately, in some cases, as a
substitute for true analysis) or abandoned it altogether
due to time pressures and the accompanying conviction that
the process could be expedited by relying on tried and true
“pencil and paper” methods. It is noteworthy
that one of the most experienced and sophisticated
qualitative analysts on the team opted for this solution.
We will present the successes, challenges, and lessons
learned in incorporating NVivo into the analytic process on
this study, with particular emphasis on the implications of
the diversity of the analytic team, especially of varying
levels of experience conducting qualitative analysis and
how this interplayed with differing inclinations to use and
master the software. The presentation will also consider the impact of working in a time-pressured, deadline-driven, non-academic environment, which made it impossible to fully exploit either the iterative or educative potential of the software. It will conclude with a
discussion of what we would want to do differently next
time, and why.
Duncan Branley, Goldsmiths College,
University of London, UK
Only Connect!
Only Relate!
"Only connect! That was the whole of her sermon. Only
connect the prose and the passion, and both will be
exalted, and human love will be seen at its height. Live in
fragments no longer. Only connect, and the beast and the
monk, robbed of the isolation that is life to either, will
die.
Nor was the message difficult to give. It need not take the
form of a good "talking." By quiet indications the bridge
would be built and span their lives with beauty.
But she failed. For there was one quality in Henry for
which she was never prepared, however much she reminded
herself of it: his obtuseness. He simply did not notice
things, and there was no more to be said."
(Howards End, E. M. Forster, 1910)
Using a freely available digitised version of Forster's novel, Duncan Branley will explore the use of the new feature in NVivo 7, 'Relationships'. What can constitute a
relationship within your data and how can you integrate
their tracking into your NVivo project? Are 'Relationships'
just another marketing creation or can they really add
value to your research?
The text used is deliberately a literary one to demonstrate
that NVivo need not be limited to conventionalised social
science texts. Methods of analysis, though requiring some
attention to disciplinary context, are not the unique
preserve of any one particular academic formation. The
session will not only be exploratory technically, but also
methodologically.
Karen S Buchanan, University College London
“From
Grounded Theory & Discourse Analysis to Political
Ecology and claim-making in socio-environmental
conflicts.”
My study sets out to research the subjective experiences of claim-makers, the meanings of the environmental and developmental discourses each claim-maker uses within the claim-making process of a socio-environmental conflict, and the relational issues of empowerment and disempowerment among claim-makers. The aim is to theorise the use and impact of environmental and developmental discourses to contest or promote large-scale open-cast copper mining in the cloud-forest of north-west Ecuador. This region is
witnessing the development of opposing local movements
which have explicitly constructed political strategies for
the development of their territory, management of
biodiversity, and social and political autonomy. This study
contributes to theoretical work on nature-society relations
which has tended towards dichotomous analysis by
investigating these relations in the context of daily
livelihood struggles, over competing claims to natural
resources, embedded within larger-scale socio-political
events in the form of national and international
neo-liberal political and economic interventions.
The research study is qualitative, situated within the
social constructionist paradigm of political ecology. Of
the qualitative research methodologies available, the most
appropriate framework for carrying out my investigation and
analysis is the grounded theory approach, which lends itself
to discovery-oriented research in this under-theorized area
of political ecology. Using grounded theory methodology to
identify the descriptive categories in my interview
transcriptions, I am using NVivo7 software to assist the
process of analysing my interviews as part of the ongoing
process of generating further categories which influence
the direction of further interviews in which these initial
concepts can be more fully explored.
My second main methodology is discourse analysis, also
located in a social constructionist approach to viewing the
world and its events, which provides a framework for the
deconstruction of meanings in the environmental and
developmental discourses being employed in this
socio-environmental conflict, and how the language,
terminology and forms of knowledge used by each claim-maker
allows or precludes certain themes, beliefs, perceptions,
attitudes and relational issues of empowerment and
disempowerment within the wider international
socio-political and economic context of the copper mining
conflict. Still taking a grounded approach, I am using my
NVivo7 project to perform searches to select parts of the
interview transcripts and secondary materials produced by
the claim-makers which relate clearly to my research
questions; further searches using NVivo7 examine the use of language in the construction of the discourses, as I search for inconsistencies in the meaning of the constructions and the assumptions they reveal, before finally assessing the implications of the selection, construction and use of particular discourses by claim-makers.
Particularly relevant to political ecology research is the
study of the shaping and use of discourses through the
power relationship between claim-makers and its effect on
knowledge systems and social relations. In conclusion my
research strategy aims to show how, using NVivo7 software
to support my analysis, the concepts identified through
grounded theory and the socially-constructed discursive
strategies reveal the operation of power at institutional
and societal levels, which is used in the dynamic process
of empowering and disempowering claim-makers in this
socio-environmental conflict.
Sara Butler & Jessica Vince, Ipsos MORI
“Managing
Large-Scale Qualitative Research – Two Case
Studies”
The main thrust of this session
will be to discuss the experiences and techniques we
employed to manage two large-scale qualitative projects
using QSR Xsight. The first was the 2004 qualitative
evaluation of the New Deal for Communities programme which
involved 78 discussion groups in 39 of the most deprived
communities in England. This project was used to pilot the
use of Xsight software in a commercial research context.
The second was the Commission for Rural Communities’
first Thematic Inquiry into rural housing which comprised a
range of different approaches including discussion groups,
depth interviews and public fora which enabled us to access
the views of a broad range of residents, beyond those who
would normally put their views forward.
The session will consider the challenging nature of the
projects which made the use of Xsight particularly
appropriate. For example, both projects required large
project teams, covered broad geographical areas, involved a
wide range of issues that required different research
approaches, and had tight project timetables which meant
that efficient analysis was vital. We will also examine the
design and management of the projects, using Xsight not
only as an important analysis tool, but also as a means of
organising and bringing together the diverse project
elements. Finally, we will discuss the key benefits to
incorporating Xsight in the projects, such as the
systematic approach to cataloguing large quantities of data
or the relative ease with which data can be revisited for
further analysis. However, in order to give a fully
balanced picture of our experiences, we will also highlight
some of the drawbacks involved with using this kind of
software.
Our session will draw on experience and understanding
gained through practical application of the software in
social research and give rise to discussion about how
Xsight can be utilised to effectively manage large-scale
qualitative research.
Karin Fisher, University of New England,
Armidale, New South Wales Australia
“Challenges
and experiences using grounded theory and NVivo”
This paper will address the theme of exploring the impact
of methodology on the applicability and use of QSR
software. The paper will use a PhD research study to discuss issues relating to the transposition of raw transcript data into NVivo using grounded theory methods, the researcher's tendency to give higher priority to the capacities of NVivo when conducting the analysis, and the requirements associated with learning NVivo. Issues will be presented from a study that explored people's experiences of, and the factors relating to, access to services for Sexually Transmitted Infections in a rural area of Australia.
The use of grounded theory is a creative and complex
process. The use of software such as NVivo as a tool for
analysis of data is also a complex process that presents a
number of challenges. Developing mastery of NVivo is time
consuming. There is a potential for the researcher to be distracted from the interpretive and creative processes involved in grounded theory analysis of research data towards the more routine and mechanistic procedures of NVivo, potentially losing perspective on the meaning of the data.
Nonetheless, NVivo offers a way in which to successfully
manage the data and provide support in the use of manual
methods of analysis. The paper will consider the issues
surrounding the use of NVivo and manual methods and the
choices made in relation to prioritising mastery of method
or software.
Karin
Fisher & Trish
Thornberry,
University of New England, Armidale, New South Wales,
Australia.
"Wanted:
Support mechanisms for NVivo in rural areas"
This paper explores the experiences of two PhD students in
Australia who began as novice users of NVivo and progressed
through the introductory phase to the more advanced phase
of data analysis using this tool. During this process of
frustration, trial and error and the inability to have
questions answered instantly in rural Australia we came to
the conclusion that there must be a better way to 'learn as
you work'. Recognizing gaps in the conditions that make learning and self-directed learning possible, we proceeded to develop ways to help bridge these gaps.
Rural based researchers are frequently faced with the
tyranny of distance, transport issues, information
technology and communication difficulties. The significant
cost involved in non-funded educational sessions detracts
from the lifelong learning environment that is frequently
embraced and encouraged in the world of research. These
conditions hinder education and prevent collegial sharing
of knowledge and techniques in the use of programs such as
NVivo and have the potential to discourage new researchers
from using technology in their research.
These issues raised many questions, the most important being: how can they be resolved? Is an online course the answer? What funding sources are available to support advanced NVivo training for people who live in disadvantaged areas? How can access to expertise for novice
researchers in rural areas for both NVivo and research
methodologies be enhanced?
The paper will address the theme of communicating research
with software. It will discuss aspects related to the
education and learning processes when using NVivo in rural
areas. In this presentation we would like to outline the
gaps that affected our research process and use of NVivo
and the ways in which we strove to address those gaps.
Janice
Fong,
Disability Rights Commission, UK
“Exploring
the use of QSR Software for understanding quality - from a
research funder’s perspective”
The Disability Rights Commission (DRC) commissions a number
of research projects in order to build up a body of
knowledge and evidence to help inform decision making in
relation to policy and practice. The DRC’s reputation
as an authoritative source of information relies on the
acceptance that the information is robust and rigorous. Its
quarterly ‘Disability Briefing’ for instance,
produces a range of statistical information and analyses
that are disseminated to a range of government departments
and disability organisations. From a funder’s
perspective, there can be a general belief that
quantitative data are superior to qualitative data since
the procedures for managing and analysing the former are
perceived to be more straightforward. The process of
quality-checking is also more transparent.
However, to inform policy and practice, different
methodologies and data are required for different purposes.
There is a lack of understanding of methods which can be
used to conduct an assessment of the quality of qualitative
research data. Qualitative Data Analysis Software (QDAS), such as the packages produced by QSR, provides a means for analysing qualitative data systematically.
In this paper, I use my experience as a research manager in
the DRC to attempt to explain what a research funder may
look for in commissioned research, and how we can be
satisfied as to its quality. I attempt to determine whether
the use of QSR software could provide a tool for assessing
the quality and rigour of qualitative material.
The DRC funded a research project studying the experiences of disabled students and their families. This is mixed methods research. The first part generates quantitative data
via a structured survey of parents and carers of disabled
children. The second part is a qualitative study which
involves interviewing about 40 disabled children and young
people in Great Britain about their schooling experiences.
While many tables of figures and statistical analyses are presented in a report, the answers to open questions within the questionnaire were not systematically analysed upon completion of the structured survey. I began to wonder about the interviews behind the statistical information, and also whether the answers to open-ended questions were misrepresented. As the second part of the project commenced, I also started to feel a little lost in knowing how to assess the qualitative materials.
I began to look for possible answers from QDAS. Since I was
a complete novice, I signed up for training in NVivo 7. The
course has been useful in teaching me the skills to use the
software. I have learnt to set up the data, to create nodes and codes, and to organise the data with cases and attributes. While attending the course, I was also
assessing the software’s suitability for my purposes.
In this presentation, I argue that there are myths and
misunderstandings surrounding functions that some non-users
may ‘wish’ the software could perform. It can
be seen as a magic wand which will automatically perform
all the analyses and identify relationships between
variables by clicking a button. However, to a certain
extent, this tool can only provide a sophisticated support
for a qualitative data analysis when there is a sound
methodology and a comprehensible analysis plan. It is
essential for a research funder to have a comprehensive
understanding of the research questions underpinning the
commissioned research in order to assess whether this
software can be used appropriately. I conclude by giving some recommendations on the features research funders may want to look out for, in terms of being able to ask sensible questions of the data and analyses through an understanding of QDAS.
A. B. García-Varela, Universidad de Alcalá, Spain; H. del Castillo, Universidad Autónoma de Madrid, Spain; P. Lacasa, Universidad de Alcalá, Spain
“Analyzing educational dialogue using ‘NVivo7’: Children and families sharing ‘new literacy’ practices”
In this presentation we analyze an innovative educational practice in which children and their families participate, working together and sharing goals as a community of learners. In this educational setting, processes of literacy related to writing on the Internet take place. We use NVivo7 for analysis across the whole research process in this sociocultural research.
Adopting an ethnographic, action research and
socio-cultural approach in our study, we play the role of
participant observers working in an extra-curricular
workshop, sharing with teachers the task of introducing
children and families to multiple forms of literacy,
related to the use of technologies and mass media (Lacasa
(ed.), 2006). We explore how children and their families
approach the production of written texts in the workshop,
taking into account that their texts will be published on
the Internet in a digital newspaper that they are creating
themselves.
Many researchers’ everyday activities are mediated by ICT tools. New technologies afford a wider
number of possibilities for developing research projects,
as well as for analysing data in complex ways and producing
academic reports and publications. Thus, the use of ICT in
the research process is currently one of the most important
challenges for social researchers, especially for those
privileging the development of qualitative research.
In such a context, one crucial question is of particular interest to us: how can new ICT tools help us across the whole research process in sociocultural research? The
aim of this session is to explore how the software for
qualitative data analysis NVivo7
can be used to
facilitate the process of organizing, storing, retrieving
and analyzing data.
All the sessions were audio and/or video-recorded.
Activities were also registered in a logbook which
supported a reconstruction of the general information. We produced fieldnotes and summaries of every session, and also considered the texts produced by the children. The
process of data analysis combines narrative and analytical
perspectives.
The discussion of this paper focuses on the data analysis.
In the process of analysis we used NVivo7,
which allowed us to work directly with the transcription of
audio and video recordings in the discourse analysis. In
this sense, the first step was to organize the data chronologically for each family. Then, we segmented the data into significant ‘sections’, which we described by creating summaries (as memos). We also added codes to these sections to group them into free nodes. This was a first approach to the categories that was relatively descriptive but provided an important infrastructure for later data retrieval and for the subsequent analysis and interpretation (Hammersley & Atkinson, 1995). In a following step, NVivo7 helped us to create an analytical system of nodes for coding and interpreting the data.
References
Hammersley, M. & Atkinson, P. (1995) Ethnography: Principles in Practice. London: Routledge.
Lacasa, P. (Ed.) (2006) Aprendiendo periodismo digital: Historias de pequeñas escritoras. Madrid: Visor.
Linda
Gilbert,
University of Georgia, Athens, Georgia,
USA
“Tools
and trustworthiness: a historical perspective”
In
1999, I completed a study on the transition experience of
qualitative researchers as they moved from “manual
methods” to using qualitative data analysis software.
At the time, I chose the latest and greatest program on
which to focus: QSR N4, still known primarily as NUD*IST at
that time.
Since then, QSR software has gone through multiple changes
and revisions, leaving the former “latest and
greatest” far behind. N4 gave way to N5 and N6, and a
sibling-product named NVivo was introduced (version 1),
then updated to version 2. Recently the two lines of
software have been merged in a new product, NVivo 7, which
constitutes a complete re-write. In each version, the
software has retained core features, added features and new
capabilities, and represented functionality differently.
Changes have been driven both by rapidly-increasing
computer power and by user demands, both explicit and
revealed in use.
At the same time, qualitative research as a field has faced
new challenges from initiatives such as the United
States’ emphasis on
“scientifically-based” research. Responses from
researchers have ranged from a refusal to engage in debate
to some initial efforts at articulating standards in
qualitative research. These are worth examination in
themselves, but they are of particular interest when
considered in light of current QDA software capabilities.
In this presentation, I wish to revisit some of the
findings from my earlier study in the light of evolving
software and new trends in qualitative research. In
particular, I will focus on the conceptualization of
“a tool” or “tools” when applied to
software and on various issues of trustworthiness related
to software use. In the process, I will reflect on some of
the historical contexts which have affected the use of
software, and raise questions for additional consideration
and discussion. If time allows, small group discussion will
be included.
Linda
Gilbert, University of Georgia, Athens,
Georgia, USA; Melissa
Kelly, University of West Florida in
Pensacola, Florida, USA & Dan
Kaczynski, University of West Florida in
Pensacola, Florida, USA
“Exploring
the forces of application-sharing technologies upon NVivo:
Promoting and Supporting adoption”
Application-sharing technology offers a channel for
leveraging the adoption and use of QDAS tools such as
NVivo. By integrating these technological tools, NVivo
trainers can access a broader audience and client base, and
qualitative research practitioners can extend and enhance
their collaborative efforts. Successful adoption and
implementation of these tools requires that organizations
address two broad sets of issues: user considerations and
technological considerations. One of the key user
considerations to address is the role that users play in
adopting technological innovations. The first of three
papers to be presented in this session examines differences
in adoption patterns across five categories of users:
innovators, early adopters, early majority, late majority
and laggards. The second paper to be presented in the
session expands the discussion of user issues into
considerations that need to be addressed when selecting an
application sharing technology for use with NVivo. This
segment also explores key technical decisions that adopters
must contend with when advocating for application sharing
technology. The third paper to be presented addresses the
functions that academics and trainers have in defining and
shaping innovations in the application of NVivo as a
research tool. Together, these three papers provide insight
into expanding the utility of technological tools such as
NVivo and application-sharing technology and advancing the
craft of qualitative research.
Paper 1: Crossing the
Chasm: How do users of technology approach adoption?
Most of the literature about technology adoption is geared toward technology companies, with the most frequently cited work being Geoffrey Moore’s Crossing the Chasm. In this seminal work, Moore
offers his view of the bell curve of technology adoption, characterized by innovators, early adopters, early majority, late majority, and laggards. The innovators are
“techies” who will try a new product early,
even if they never plan to use it. The early adopters are
visionaries; they like to experiment with new technology.
The early majority consists of pragmatists interested in
what the product will do.
In Moore’s view, the early majority is a critical
group that determines the future of every new technology.
“Crossing the chasm” from the early adopters to
the early majority is a critical point in the acceptance of
any new technology. The late majority is conservative,
tends to use older technology, and is reluctant to change.
Lastly, the laggards are skeptical of the technology and
often resist it entirely.
In addition to categorizing adoption patterns, Moore also
describes new technology in terms of whether it is discontinuous or continuous. Discontinuous technology is great, but disruptive to the
existing organizational structure or business practices;
continuous technology is nondisruptive, and hence, often
unnoticed. In-between products, in Moore’s view, are
generally not exciting or marketable. Moore’s work is
well-known in the technology industry. Not surprisingly, it
focuses on the implications of the model from an industry
perspective, with the key question being how to
“cross the chasm” to acceptance within the
market. What are its implications in a nonorganizational
setting? What attracts individual users to new technologies, and what impact does adoption have on their work and work relationships?
In the QDA world, there are two interesting topics related
to these questions. The first is QDA software itself.
Despite its long pedigree, QDA software is still not
entirely “adopted” across the varied
qualitative researcher communities, particularly in
academia. Though there are groups that can be labeled as
“early majority” using the software, the late
majority and laggards seem to hold entrenched positions
that marginalize the use of software as an ancillary rather
than integral part of qualitative research.
The second interesting topic involves supportive
technologies for teaching and learning interactively, such
as application-sharing technologies. While in varying
stages of adoption themselves, these programs offer
interesting opportunities for supporting the expansion of
QDA software. The potential intersection of two adoption
curves raises interesting questions for both QSR and
individuals teaching and learning NVivo.
Paper 2: Utilizing
Application-Sharing Technologies in Qualitative Research:
Considerations and Implications for Integration
Underlying technology adoption patterns are often user-related issues involving implementation and use, as well as technical considerations. Identifying the advantages of a technology and addressing concerns and barriers are part of the
adoption process. This paper will explore the issues of
adoption and implementation with respect to
application-sharing software, particularly with the
perspective of integrating application-sharing technology
into the instruction and practice of qualitative research.
For example, there are a number of technical considerations
that should be addressed proactively in order to realize
many of the benefits of integrating application-sharing
technology into instruction and practice of qualitative
research. A few of the questions that decision makers
should address are: (a) what product model to adopt, (b)
whether to use a hosted solution or host a product
internally, (c) how to define the scope for usage, (d) how
to address user constraints, and (e) how to promote
stakeholder acceptance. Failure to proactively address
these issues can result in implementation problems,
diminished return on investment, and frustrated users.
This exploration of considerations and implications will
help the audience frame issues they are likely to encounter
when positioned as users, adopters, or implementation
planners for technological innovations such as
application-sharing. It will also highlight areas for expanded discussion among the forces shaping these innovations and their adoption.
Paper 3: Distance
sharing technologies: academics and trainers shaping future
mainstream adoption of NVivo
How can these separate and distinct adoption patterns
intersect in teaching and learning NVivo? Will the use of
NVivo in online courses and other Web-based, synchronous
collaborative platforms impact mainstream adoption of the
tool by qualitative researchers? This paper builds upon
work presented at Durham 2005 on the use of NVivo in a
virtual setting and explores the future implications which
these questions raise. The advantages and potential
challenges to using NVivo in conjunction with distance
software will be viewed from two perspectives: the academic
using online course delivery and the trainer or facilitator
using other Web-based synchronous collaborative platforms.
Particular attention will be given to how student
challenges, content challenges, and technological
challenges shape and are shaped by the evolving process of
mainstream adoption. Audience participation and discussion
will be a major component of this section, addressing
questions such as:
Where do you perceive the adoption curve to be for application-sharing technology and NVivo?
What are your thoughts regarding Lyn Richards’ question: twenty years on, why aren’t they using NVivo?
Is geographic isolation a major issue in promoting the use
of application sharing technology? What are other major
factors?
What are the potential advantages of adoption?
What are the potential barriers to adoption?
What are potential consequences to adoption?
What rules or standards should apply to adoption practices?
Input from this group discussion will be incorporated into
a collective session paper that will be submitted for
publication following the conference. Of particular interest in this discussion is insight into emerging rules of professional conduct and criteria-setting standards, and how these emerging standards of practice promote and support adoption.
Silvana di Gregorio, SdG Associates
“Research
Design Issues for Software Users”
Many people approach software use without much thought of
the design of their research project. It is only in later
stages of the analysis that they may feel frustrated at not
being able to get at the analysis they want and realise
that they need to re-think the organisation of their work
in the software. In this paper, I propose that the design
of the research is the first consideration people should
have when setting up a project in a software package. The
information they need should be in their research proposal.
A framework for issues they need to consider is proposed
according to the complexity of the research design. Three
types of designs are considered: simple, complex and
compound complex. Case studies will be used to exemplify
each type of design and the issues you need to consider.
The paper is derived from a chapter from a book in progress
by myself and Judy Davidson – Qualitative
Research Design for Software Users – which draws on projects
from UK, USA and Europe across a range of disciplines,
range of software packages and from both the academic and
commercial sectors. The paper ends with a check-list of
design issues you need to consider when setting up a piece
of research in a software package. The check-list includes
the following questions (which are expanded in greater
detail in the paper) - What is the structure of the
project? What kind of data am I collecting? Am I collecting
more than one kind of data? Is the data structured in any
way? Do I need to import the data in its entirety? Which
package best supports the non-textual material I have? What
format do my data have to be in? Having gone through the
check-list, researchers are in a position to ask the final
question – what kind of project design do I have
– simple, complex or compound complex? – and
the implications each type has for project set-up.
Jo
Haynes,
University of Bristol, UK
“CAQDAS:
A ‘posthuman’ research experience?”
The relationship between human beings and new technologies
is often characterised by anxiety or fear about whether
people will be marginalised or intimidated by technology.
This anxiety characterised some of the initial speculation
about the implications of the use of computer software in
an increasing amount of intellectual and creative activity,
specifically here, qualitative research, wherein software
is perceived as potentially eroding the researcher’s
control over the analytical process, or as likely to be
used uncritically to produce ‘quick and dirty’
results.
In particular, the introduction of software into
qualitative analysis is considered 1) to be a threat to
‘traditional’ analytical methods; 2) as unduly
influencing researchers to focus on breadth rather than
depth within data; 3) to disrupt or distance the researcher
from the data; and 4) to be risky, as it may mean that
methodological, substantive and theoretical matters will
become ‘artefacts’ of technology. None of these
concerns, however, has heralded a decline in the use of
software within qualitative research. Indeed, the
development and use of qualitative software is arguably
becoming more of a norm within academic research and
beginning to have more widespread use. Researchers are not,
however, accepting the uncritical use of software like
NVivo; rather, its wider use and acceptance is indicative
of a social and cultural shift in the way technology is
conceptualised. The anxiety
that once determined populist beliefs about technology as a
‘Frankenstein’s monster’, i.e. what
technology will inevitably
do to us, is transforming into an
acknowledgement that technology is an extension of us, i.e.
an extension of human intelligence and skill. This
is understood within what is referred to as the
‘posthuman condition’, whereby humans are
viewed as embodied within an extended technological world.
By drawing on experience within research teams conducting
qualitative cultural and social research, this paper
explores the posthuman standpoint in relation to the use of
qualitative software such as NUD*IST and NVivo. In doing
so, it not only highlights the creative use that can define
its analytical support, but also that what should be feared
in research relates more to wider and familiar political
and economic processes shaping the research context.
Chih Hoong
Sin,
Disability Rights Commission, UK
“Using
QDAS in the production of policy evidence by
non-researchers: strengths, pitfalls and implications for
consumers of policy evidence”
Policy research, at least within the UK, is being conducted
by an increasingly heterogeneous pool of research
providers. In addition to researchers working within
academia or policy and research think tanks, a range of
private sector organisations (such as consultancies),
charities and voluntary groups are also involved in
providing policy-related research evidence. Many large
pieces of policy research are also contracted out to
consortia comprising a diverse range of partners, not all
of whom necessarily have a research background. In other
words, research conducted by non-researchers potentially
has a huge impact on the policy-making process.
In some quarters, there can be strong arguments for
commissioning non-researchers to provide research evidence.
Despite the preference, in some cases, for non-academic
organisations to conduct major pieces of social research,
the use of such providers of research is not without
significant challenges. Organisations commissioned to
conduct such research do not always have dedicated research
staff who are trained to conduct research. The training and
management of staff who may not have a research background
can be highly challenging. An ‘on-the-job’
training approach is not uncommon and this has major
repercussions on the way research is conducted. Dedicated
research training may also be provided for practitioner
researchers but there are issues relating to quality,
timeliness, and adequacy.
This paper examines the case of one private sector company
that works solely to provide research and consultancy
services for public sector clients. It looks at the
background of staff; the way in which they are trained to
use QDAS; what QDAS has been used for; how it has been
used; and particular issues arising from training and using
QDAS in a team context. The implications of these factors
for how qualitative data for significant pieces of policy
research are managed and analysed are explored. The paper
raises genuine
concerns about the robustness of evidence produced to
inform policy and practice. This is often compounded by a
real lack of understanding among consumers of policy
research in relation to qualitative data and analysis as
well as the attendant pressure of working in the policy
environment. It concludes with some recommendations to
consumers of policy evidence.
Andrew
King,
University of Surrey, UK
“Solving
the Researcher’s Dilemma? Doing it ‘quick and
dirty’ and ‘slow and methodical’”
In this paper I outline how a form of conversation analysis
known as Membership Categorisation Analysis (MCA) can be
applied to qualitative data using NVivo software. I begin
by outlining what MCA is; its background in the work of
Harvey Sacks and how it has been developed since. I note
that MCA takes a specific theoretical and methodological
approach to qualitative research and suggest that this may
potentially create a dilemma for those researchers who are
considering using NVivo software for their analysis. In
response to these issues I will outline and evaluate two
strategies of coding that I have used in my doctoral
research to apply MCA to narratives obtained in twenty-five
interviews with young people who have taken a Gap Year, a
break in their educational careers between leaving school
and beginning university. These strategies can be
characterised as “quick and dirty” and
“slow and methodical”. The first makes use of
text searching tools to generate codes; whilst the second
uses a more methodical and perhaps more orthodox approach
to reading and coding of documents. However, I will explain
that when combined both of these strategies are valuable
for doing MCA and solving the researcher’s dilemma.
Therefore, I conclude that NVivo can be used by researchers
interested in applying this type of methodology to their
data.
Helen
Marshall,
Centre for Applied Social Research, Royal Melbourne
Institute of Technology, Australia
“What
examiners like in a qualitative thesis and how software can
help us deliver it”
We know relatively little about how examiners approach
marking a research thesis and what we do know is rather
general. It is clear, for example, that examiners want to
pass theses, that they hope for technical competence and
that quality of writing is important. It is less clear what
‘technical competence’ and ‘good
writing’ mean in practice. To find out more about
what examiners think good qualitative analyses look like,
in 2002-3, I asked some experienced examiners of
qualitative research theses ‘what are the signs by
which you recognize good qualitative data analysis?’
The examiners’ responses suggested that
‘quality’ is hard to define. Nonetheless they
shared some views about technical competence and good
writing. Technical competence as an analyst of qualitative
data means being reflective, handling data with sensitivity
and ensuring that data are analysed rather than simply
described. In other words, in a good thesis ‘data
don’t just sit there’.
Good qualitative writing is vivid, so that ‘details
stick in your mind’ and has the quality of
authenticity. It means that analyses are imaginative, well
presented and well written.
This paper explores the issues raised by the study, and
makes some suggestions about how NVivo, by enabling
reflexive presentation of the process of analysis, can help
postgraduates to meet the demands of examiners for quality.
Lyn
Richards,
Founder of QSR
Keynote:
Farewell to
the Lone Ranger? On the trend to Big and Team research
(with software, of course), and the future of 'qualitative'
Qualitative
research is still, in most literature, presented as a solo
act – small is written as not only beautiful but
morally or methodologically preferable. The method
traditionally aims at achievement of insight by an
extraordinarily perceptive solo researcher, creating
“in-depth” understanding from small bodies of
amazingly rich data.
It’s often, even usually, not like that in
reality. More obvious, in today’s academic or
commercial marketplace, are the trends to rigorous data
management of even large-scale “qualitative”
databases. The blame (or much more occasionally, praise)
for such changes is normally given to software tools, which
are also expected to solve the problems. Teams are required
to meet standards set for small, “in-depth”
projects, and lone researchers are under significant
pressures to perform to standards set for teams. I see this
as a crisis in the method, and one, interestingly, not
noticed in the long list of crises normally debated. What
is to be done, and can software help?
Stuart Paul
Robertson, Jr., C.A.G.S., Robertson
Educational Resources, Pelham, New Hampshire, USA
“Brain-based
Learning and NVivo Training: A Critical Connection”
When introducing adult learners to a new piece of software
such as NVivo, there are a multitude of issues to be
addressed in order for the students to achieve a meaningful
level of success. Competency-Based Training (CBT)
approaches are the most frequently employed method during
training, yet they may circumvent an important preliminary
experience which is of benefit to most, if not all,
learners. Brain-based research identifies five stages in
the learning process: preparation, acquisition,
elaboration, memory formation, and functional integration.
The
preparation stage is often either absent from or minimized
in CBT approaches.
The preparation stage provides a framework for the new
learning and primes the learner’s brain with possible
connections. Furthermore, the more background a learner has
in the subject, the faster they will absorb and process the
new information. When introducing a robust piece of
software such as QSR’s NVivo, this stage is
particularly important. Misunderstandings in the purpose
and role of the software as it relates to the research
process can manifest themselves in underutilization,
misuse, or avoidance of the software.
For many adult learners, especially those new to NVivo or
relational databases in general, the very first issue which
needs to be addressed is establishing a conceptual
understanding of what the program can do for them. This can
be especially challenging for visual learners. By
incorporating visual representations with rich descriptions
that include similes and metaphors describing the
software’s role in the research process as well as
the key components of the software itself, new and
inexperienced users are provided with a way to link the new
information they are learning with understandings they
already possess.
In this paper I draw from the areas of brain-based
research, computer education, and my experiences working
with novice NVivo users to discuss the theoretical
underpinnings of the ideas I put forth and how they were
translated into actual materials to be used with adult
learners.
Sandra
Spickard Prettyman, University of Akron
& Kristi
Jackson,
University of Colorado
“Ethics,
Technology, and Qualitative Research: Thinking through the
Implications of New Technologies”
As qualitative researchers in the new millennium, we have
access to a wide range of new technologies with which to
collect, analyze, and manage our data, as well as new forms
of data to collect for analysis. The digitization of audio,
video, and photographic data now makes it possible for us
to create, process, and analyze this data in new and
different ways, often allowing us to use these as more than
just tools for data collection, but also as sources of data
in and of themselves. In addition, the growth of the
Internet provides us with not only new ways to collect
qualitative data, but new settings in which to collect it.
Today, computer-assisted qualitative data analysis software
(CAQDAS) such as NVivo provides powerful data analysis and
management tools that can also aid in data collection,
theory building, and data representation and reporting.
However, it is rare that the implications of these new
technologies are scrutinized—especially as they
relate to ethics, technology, and our research. We argue
these tools demand that we reconsider the ethics of the
research process, from beginning to end, from
conceptualizing the questions to reporting the results.
Issues such as confidentiality, validity, and rapport may
surface and must be considered in ways quite different from
how we approached them in the past. As
technologies continue to evolve, often in response to our
demands as researchers, new features and functions appear.
We need to understand not only how to use these, but also
what they mean for us ethically as we engage in our
research and with our research participants.
More often than not, those who use CAQDAS consider how
these tools can: help us efficiently store, manage, and
retrieve large quantities of data; add more rigor and
transparency to data analysis and the research process; or
allow multiple researchers to work effectively on the same
project. Rarely discussed in the literature is how these
new tools may influence the ethical enactment of our
research, and the new ethical considerations that may need
to emerge. For example, it is relatively easy today to
incorporate digital photographic and video data into
CAQDAS, which can then be used for analysis and
representational purposes. What does this mean for
confidentiality of participants and our commitments to
them? While it is true that for many years we have had the
ability to utilize photographs and even videos in our work,
the ease with which this data can now be collected,
manipulated, and analyzed (in new and different ways than
before) means that there are also new questions that arise
about the collection and use of that data.
As we move from manual to more
technologically advanced processes with our research,
increased familiarity with and new attitudes toward these
technologies will emerge. It is clear that these
technologies are now becoming more accepted and trusted in
the research community, although their use continues to be
contested by some. In order for these new technologies to
become more widely accepted and trusted, we must become
familiar with them, and with all of the issues that they
raise, including ethical ones. Their role must shift from
being just a handy storage and retrieval device to a fully
integrated component of the research project, in place from
the design stage.
References
Bourdon, Sylvain (2002, May). The Integration of Qualitative Data Analysis Software in Research Strategies: Resistances and Possibilities. Forum: Qualitative Social Research [On-line Journal], 3(2). Retrieved June 20, 2006 from http://www.qualitative-research.net/fqs-texte/2-02/2-02bourdon-e.htm
Ford, K., Oberski, I., & Higgins, S. (2000). Computer-Aided Qualitative Analysis of Interview Data: Some Recommendations for Collaborative Working. The Qualitative Report, 4(3). Retrieved June 20, 2006 from http://www.nova.edu/ssss/QR/QR4-3/oberski.html
Gibbs, G. R., Friese, S., & Mangabeira, W. C. (2002). The Use of New Technology in Qualitative Research. Forum: Qualitative Social Research [On-line Journal], 3(2). Retrieved June 20, 2006 from http://www.qualitative-research.net/fqs-texte/2-02/2-02hrsg-e.htm
Welsh, Elaine (2002, May). Dealing with Data: Using NVivo in the Qualitative Data Analysis Process. Forum: Qualitative Social Research [On-line Journal], 3(2). Retrieved June 20, 2006 from http://www.qualitative-research.net/fqs-texte/2-02/2-02welsh-e.htm
Clare Tagg, Tagg Oram Partnership, & Sarah Millar et al.
“Early Adopters of NVivo7”
Received wisdom holds that it is unwise to be an early
adopter of new software, particularly with a large,
important project. But this is exactly what The Health
Foundation’s evaluation team did when they decided to
use NVivo7 for the evaluation of the charity’s
Leadership Programme[1]. Managing large quantities of
longitudinal data to identify evidence for theory building
and theory testing is a key challenge for the evaluation
team that NVivo7 is helping them to address.
The evaluation of The Health Foundation’s Leadership
Programme is being conducted by a three-strong internal
evaluation team with experience of using N6 and NVivo 2. A
decision was made to use NVivo7 to support data analysis
and retrieval because NVivo7 offers more advanced modelling
and the potential to use the ‘relationships’
node to support the exploration of logic models or
‘theories of change’ within the qualitative
data collected.
The data collected by the team includes baseline
semi-structured interview data, observational data,
previous evaluation reports, case study biographies and
monitoring reports. From the outset, there has been a
desire to bring these sources together to tell longitudinal
case-level stories of the individuals who participate in
the Leadership Programme and to undertake thematic analysis
to support theory-testing.
The paper will examine the reasons for the decision to use
NVivo7 and will track the team's successes and challenges
to date. In particular we will look at how the use of
NVivo7 has impacted on the research approach taken
including data collection, team working, analysis and
reporting.
[1]
The Leadership Programme comprises six leadership
development schemes
aimed at a wide range of health care professionals. The aim
of the programme is to develop the leadership skills and
capacity of participants and, in doing so, improve the
quality of health care.
Chris
Thorn,
Director, Technical Services, Wisconsin Center for
Education Research
Keynote :
“Are we there yet? Growing maturity in qualitative
tools and methods”
Like the children in the back seat of the family car, we've
been asking the question "Are we there yet?" for quite some
time. I'm seeing lots of signs that we have arrived.
In my area of research, Qual-Quant debates now seem
old-fashioned. The American Educational Research Association,
for example, has new draft "Standards for Reporting on
Research Methods" that discuss design, interpretation, and
ethics across methods. Many large scale evaluations
routinely include members with qualitative and quantitative
skills who work as a team in an iterative, interactive
design. Likewise, our tools have come of age. The CAQDAS
"choosing a package" paper outlines the growing number and
increasing sophistication of tools available. Indeed,
enterprise technologies in knowledge management and web
based collaboration are moving to link up with our tools as
well. Enterprise search and social network tools are
probably the areas in which we will encounter other social
scientists looking for us.
David K. Woods, Wisconsin Center for
Education Research, University of Wisconsin, Madison, USA
“The
Qualitative Analysis of Video and Audio using
Transana”
The qualitative analysis of video and audio can be
rewarding for researchers, giving them access to raw data
that is considerably richer than the textual representation
of the same events. Working with video data requires a
different set of tools than working with text data.
This presentation will
demonstrate the analytic approach embodied by
Transana,
a free software package designed to help researchers
analyze video and audio data using qualitative techniques.
Transana is particularly useful in the analysis of large
video collections and for those researchers wishing to
analyze video in collaboration with others.
Transana facilitates the
annotated transcription of video data, allowing researchers
to create a text record that corresponds to the video. The
researcher can link the video and the text record together,
allowing instant access to the appropriate video for an
analytically interesting portion of the text. This allows
the researcher to remain very close to the video data at
all times.
A researcher can easily identify analytically interesting
portions of video in Transana, creating a virtual clip
which can be categorized, copied, coded, sorted, and
manipulated in a variety of ways. (Transana does not
require time-consuming manual video editing as part of the
analytic process, as many other qualitative packages do.)
Transana's search interface allows the researcher to
perform data mining and hypothesis testing on a
properly-coded data set, and Transana's Keyword Map report
facilitates the examination of coding across time.
The multi-user version of Transana allows
geographically-dispersed researchers to work with the same
data set at the same time through the Internet, facilitating
the collaborative analysis of video data.
Finally, video can be used in very powerful ways in the
dissemination of the results of qualitative analysis. A few
carefully-chosen video clips often have a stronger impact
on an audience than a string of PowerPoint slides. Transana
provides tools for sharing your results with others.
Taken together, the techniques and tools that will be
described in this presentation allow for powerful analysis
of video and audio data.