
Chapter three described the process, rationale and purpose of the mixed methods research design, which combines qualitative and quantitative approaches. The methodology pairs qualitative methods, in-depth expert interviews and participant observation, with a quantitative questionnaire survey used to validate assumptions; the choice of methods is examined in section 3.5. Chapter four analysed the primary data from the quantitative questionnaire survey. The quantitative data were analysed against the hypotheses concerning relationships between the influencing factors of product features, pricing, functionality and product appearance and the different stages of the in-store decision-making process.

In this chapter, the data captured through the qualitative research are presented, analysed and evaluated in a systematic manner as the next step of the research process. The documentation and analysis process aims to present the data in an intelligible and interpretable form in order to address the two objectives:


Determine and analyse the key factors impacting the consumer behaviours of young adults in urban Indian centres

Evaluate the contextual, social and psychological factors that influence young Indian adults in the retail environment

This chapter includes the analyses of primary data from two
methods: in-store observation (section 5.3) and expert interviews (section


4.2 Theoretical framework

Marshall and Rossman (1999) describe data analysis as the
process of bringing order, structure and meaning to the mass of collected data.
It is described as messy, ambiguous and time-consuming, but also as a creative
and fascinating process. Broadly speaking, while it does not proceed in a linear fashion, it is the activity of making sense of, interpreting and theorizing data that signifies a search for general statements among categories of data (Schwandt, 2007). Therefore, one could infer that data analysis requires some form of logic applied to research. In this regard, Best and Khan (2006)
clearly posit that the analysis and interpretation of data represent the
application of deductive and inductive logic to the research. Verma and Mallick
(1999) and Morrison (2012) on the other hand, state that the interpretive
approach, which involves deduction from the data obtained, relies more on what
it feels like to be a participant in the action under study, which is part of
the qualitative research.

Qualitative data analysis can be described as the process of making sense of research participants' views and opinions of situations, and of corresponding patterns, themes, categories and regularities (Cohen et al., 2007). As Gibbs (2007) points out, qualitative data analysis is the process of transforming collected qualitative data, by means of analytic procedures, into a clear, understandable, insightful, trustworthy and even original analysis.

Marshall and Rossman (1999) state that qualitative data analysis is a search for general statements about relationships among categories of data. In contrast with quantitative methods, which examine cause and effect, Muijs (2011) posits that qualitative methods are better suited to examining the meaning of particular events or circumstances. Creswell (2013) refers to meaning as the intention of the original author and further states that data analysis is both inductive and deductive and establishes patterns or themes. Patton (2002) clarifies that qualitative analysis transforms data into
findings. This involves reducing the volume of raw information, sifting
significance from trivia, identifying significant patterns and constructing a
framework for communicating the essence of what the data reveal. Henning et al.
(2004) summarise data analysis as a continuous, developing and repeating
process during which transcribed data of interviews are investigated. Leedy and
Ormrod (2010) further state that qualitative researchers construct interpretive
narrative from their data and try to capture the complexity of the phenomenon
under study. Qualitative researchers thus use a more personal, literary style,
and they often include the participants’ own language. Robson (2011:468)
concurs with the views of Leedy and Ormrod (2010) and further reiterates that
qualitative analysis remains much closer to codified common sense than to the
complexities of statistical analysis of quantitative data. In summing up, one could say that qualitative data analysis is based on assumptions and on the use of interpretive (theoretical/conceptual) frameworks to ensure a final written report or presentation that includes the voices of participants, the reflexivity of the researcher, a complex description and interpretation of the stated problem, and its contribution to the literature or a call for change (Creswell, 2013).

The first step in analysing qualitative data, according to Best and Khan (2006), involves organising the data. It is, however, crucial to bear in mind that the methods of organising the data will differ depending on the research strategy and data collection techniques. Once the data have been
organised, the researcher can proceed to the following stage in data analysis,
namely description. During the second stage of data analysis, the researcher
seeks to describe the various pertinent aspects of the study, which include
inter alia the setting, both temporally and physically; individuals being
studied; the purpose of any activities examined; the viewpoints of participants
and the effects of any activities on the participants. Patton (2002) describes the third and final phase of the analysis process, namely interpretation, as involving an explanation of the findings, answering 'why' questions, attaching significance to particular results, and putting patterns into an analytic
framework. The discipline and rigour of qualitative analysis, Patton states,
depend on presenting solid descriptive data in such a way that others reading
the results can understand and draw their own interpretations.

Scott and Usher (2011) posit that a typical qualitative
analytical approach may include the following aspects:

Coding or classifying field notes, observations or interview transcripts, either by inferring from the words being examined what is significant, or by noting from the repeated use of words (phrases) whether a pattern is developing (i.e. that all activities which have been recorded are being understood in a similar way).

Examining the aforesaid classifications to identify relationships between them, while concurrently beginning the process of understanding those relationships in general terms, so that they have credibility beyond the boundaries of the case being examined. Researchers draw upon previous knowledge about the world that has enabled them to distinguish between objects and between occurrences in their lives.

Making explicit these patterns, commonalities and differences (in brief, making sense of the data) and taking these now more developed theoretical constructs into the field to test or refine them.

Elaborating a set of generalisations, which
suggest that certain relationships hold firm in the setting being examined, and
affirming that these cover all the known eventualities in the data set.

Formalizing these theoretical constructs and
making inferences from them to other cases in place and time.
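The first aspect above, spotting whether the repeated use of words or phrases signals a developing pattern, can be illustrated with a small sketch. This is a toy illustration only: the transcript text and stop-word list below are invented, not drawn from the study, and real qualitative coding is interpretive, with frequency counts at most a supporting cue.

```python
from collections import Counter
import re

def repeated_phrases(transcript: str, min_count: int = 2) -> dict:
    """Count repeated words in an interview transcript as a crude
    signal that a pattern may be developing (toy illustration only)."""
    # Lowercase and split on non-letter characters; real coding is
    # interpretive, not purely frequency-based.
    words = re.findall(r"[a-z']+", transcript.lower())
    # Invented stop-word list; a real analysis would use a fuller one.
    stopwords = {"the", "a", "an", "and", "or", "of", "to", "in",
                 "it", "i", "is", "was"}
    counts = Counter(w for w in words if w not in stopwords)
    return {w: c for w, c in counts.items() if c >= min_count}

notes = "The price felt fair. Price mattered more than looks; price and features."
print(repeated_phrases(notes))  # the word 'price' recurs three times
```

A researcher would treat such a count only as a prompt to look back at the segments in which the repeated word occurs, not as a finding in itself.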

According to these authors, the process of qualitative data
analysis consists of six stages (steps), namely:

Defining and identifying data: From the outset, it is crucial to obtain a clear understanding of the meaning of the data and, more importantly, of the data required in accordance with the research question and aims.

Collecting and storing data: When collecting data, most researchers begin to form opinions and judgements, which lead to theories developing in the mind of the researcher; one therefore has to consider not only how to collect data, but also how to store them so that they are accessible for analysis.

Data reduction and sampling: During the data collection process, reaching a point of saturation implies that all data have been reduced, filtered and sampled through the process of analysis. It is therefore critical for the researcher, when analysing data, to determine what is already known to be important or relevant, in accordance with the intended purpose of the investigation.

Structuring and coding data: Structuring and coding of data underpin the key research outcomes and can be used to shape the data to test, refine or confirm established theory, to apply theory to new circumstances, to generate a new theory or model or, as in the case of this study, to develop a conceptual framework model.

Theory building and testing: An important purpose of research is to generate new knowledge (Watling and James, 2012). In building and testing theory, it is important to consider the reactions of respondents and whether or not they correspond, and also to ensure that a point of data saturation is reached.

Reporting and writing up research: In brief, reporting and writing up research entails putting words on paper, in the form of a report, and constructing an argument based on the findings: what you have done, what you have seen and heard, the participants you interviewed and the information that emerges from the process of data analysis. Ultimately, the conclusions drawn from the information should contribute to the body of knowledge and represent new meaning and insight into the research question.

Creswell (2013), contrary to the view of Watling and James (2012), believes that the process of qualitative data analysis and interpretation is best represented by a spiral image, a data analysis spiral, in which the researcher moves in analytic circles rather than following a fixed linear path. One enters with data made up of text or images (e.g. photographs and videotapes), and exits with an account or a narrative. In between, the researcher touches on several facets of analysis, circling around and upwards towards completion of the process.


4.2.1 Coding framework

Based on the paradigms of Grounded Theory (GT), Charmaz (2014) recommends multiple levels of coding. She points out that coding is the pivotal link between collecting data and developing an emergent theory from the data. Through coding, you define what is happening in the data and begin to grapple with what it means. This study adopts Charmaz's foundations of GT coding and analysis, which she advises '…consists of at least two main phases: 1) an initial phase involving naming each word, line, or segment of data followed by 2) a focused, selective phase that uses the most significant or frequent initial codes to sort, synthesize, integrate, and organize large amounts of data' (Charmaz, 2014).

Charmaz associates her system with the inductive-abductive philosophy and stipulates three coding levels under GT: initial coding, focused coding and theoretical coding. This study employs these coding methods for the analysis of both sets of qualitative data: the interviews and the observations.

