ABSTRACT
We provide a philosophical justification for analyzing qualitative and quantitative data within the same study. First, we present several recent typologies of analyses in social science research that incorporate both monomethod (i.e. purely quantitative research or purely qualitative research) and mixed research studies. Second, we discuss what has been referred to as the fundamental principle of empirical data analysis, wherein both qualitative and quantitative data analysis techniques are shaped by an attempt to analyze data in a way that yields at least one of five types of generalizations. Third, building on the frameworks of Denzin and Lincoln (2005), Heron and Reason (1997) and Johnson and Onwuegbuzie (2004), we compare and contrast three qualitative-based paradigms (i.e. constructivism, critical theory, participatory), one quantitative-based paradigm (i.e. postpositivism) and one mixed research-based paradigm (i.e. pragmatism) with respect to three axiomatic components (i.e. ontological, epistemological and methodological foundations) and seven issues (i.e. nature of knowledge, knowledge accumulation, goodness or quality criteria, values, ethics, inquirer posture and training). Also, we link each paradigm to data analysis strategies. Fourth, we illustrate similarities in goals between some qualitative and quantitative analyses; in so doing, we deconstruct the strong claim that analysis must be either qualitative or quantitative and illustrate that regardless of perspective (e.g. postpositivist or constructivist), both qualitative and quantitative data can be jointly analyzed. Finally, we compare and contrast 11 mixed research paradigms/worldviews, linking them to mixed analysis strategies, thereby situating mixed analyses in the philosophy of social science and promoting mixed research as a distinctive methodology.
Keywords: multiple operationalism, quasi statistics, inferential statistics, descriptive statistics, empirical data analysis, mixed analysis strategies, multiple research approaches, mixed methods, qualitative and quantitative, research paradigm, methodology, epistemology, ontology, post-positivism, critical theory, participatory, constructivism, pragmatism, social science research typology
The process of mixed research - involving 'mix[ing] or combin[ing] quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study' (Johnson & Onwuegbuzie 2004: 17) - has come a long way since Campbell and Fiske (1959) coined the term multiple operationalism, wherein more than one method is used as part of a validation process to help ensure that the variance explained stems from the underlying phenomenon or trait and is not a function of the method. However, most of the major formalized developments in mixed research have occurred within the last 25 years, given impetus in the 1980s by several prominent researchers (e.g. Brewer & Hunter 1989; Bryman 1988; Greene, Caracelli, & Graham 1989; Jick 1983; Kidder & Fine 1987; Louis 1982; Madey 1982; Mark & Shotland 1987; Maxwell, Bashook, & Sandlow 1986; Phelan 1987; Rossman & Wilson 1985), who called for the integration of quantitative and qualitative approaches. Moreover, the last five years have witnessed a significant increase in the number of mixed research studies, marked by a handbook on mixed methods (Tashakkori & Teddlie 2003); several mixed research textbooks (e.g. Bergman 2008; Creswell & Plano Clark 2007; Greene 2007; Johnson & Christensen 2008; Plano Clark & Creswell 2007; Ridenour & Newman 2008; Teddlie & Tashakkori 2009); several mixed research articles contained in methodological handbooks (e.g. Teddlie, Tashakkori, & Johnson 2008); mixed research articles contained in encyclopedias (e.g. Onwuegbuzie 2007); two journals devoted to mixed research (i.e. Journal of Mixed Methods Research, International Journal of Multiple Research Approaches); several journals now routinely publishing mixed research (e.g. Field Methods, Educational Evaluation & Policy Analysis, Quality & Quantity, Evaluation, Evaluation Practice, Research in Nursing & Health, Research in the Schools, The Qualitative Report); websites (e.g. http://www.fiu.edu/~bridges/), conferences (e.g. http://www.mixedmethods.leeds.ac.uk/) and workshops (Creswell & Plano Clark 2008; Mertens 2008; O'Cathain 2008; Onwuegbuzie & Collins 2008; Onwuegbuzie, Slate, Leech, & Collins 2008) devoted to mixed research; and special issues (Gorard & Smith 2006; Johnson 2006; O'Cathain & Collins 2009), with another underway (Leech & Onwuegbuzie, 2010). These and other sources have helped to increase the visibility of mixed research.
Greene (2008: 8) recently asked the following questions:
* 'Is mixed methods social inquiry a distinctive methodology?
* Is the field moving in that direction?
* What is needed for mixed methods to become a distinctive methodology?'
According to Greene (2006, 2008), the development of a methodological or research paradigm (i.e. qualitative, quantitative and mixed research) in the social and behavioral sciences requires a thorough critique of four interrelated but conceptually distinct domains:
(i) philosophical assumptions and stances (i.e. what are the core philosophical or epistemological assumptions of the methodology?);
(ii) inquiry logics (i.e. what traditionally is called methodology and refers to broad inquiry purposes and questions, logic, procedures and designs, quality standards and writing and reporting forms that guide the researcher's gaze);
(iii) guidelines for research practice (i.e. specific strategies and tools that are used to conduct research; the how to component of research methodology);
(iv) sociopolitical commitments (i.e. interests, commitments and power relations surrounding the location in society in which an inquiry is situated; proclamation of values-based rationales and meanings for the practice of social and behavioral science research in society).
Together, Greene's four domains provide a cohesive and interactive framework and an array of practical guidelines for a methodological or research paradigm. Although these domains have been more fully developed with respect to both the quantitative and qualitative research paradigms, this is not the case for the field of mixed research. Indeed, not all of Greene's (2006) domains have been fully articulated and developed in mixed research (Greene 2006, 2008). Consequently, more theoretical, conceptual and practical work is needed in the area of mixed research.
In recent years, much has been written about most of the 13 distinct, interactive, iterative steps of the mixed research process identified by Collins, Onwuegbuzie and Sutton (2006), namely:
(a) determining the mixed goal of the study
(b) formulating the mixed research objective(s)
(c) determining the rationale(s) for mixing quantitative and qualitative approaches
(d) determining the purpose(s) for mixing quantitative and qualitative approaches
(e) determining the mixed research question(s)
(f ) selecting the mixed sampling design
(g) selecting the mixed research design
(h) collecting quantitative and qualitative data
(i) transforming and analyzing the quantitative and qualitative data
(j) legitimating the data sets and mixed research findings
(k) interpreting the mixed research findings
(l) writing the mixed research report
(m) reformulating the mixed research question(s).
However, although beginning researchers consider the mixed analysis step to be the most difficult step of the mixed research process - because it typically requires competence in conducting both quantitative and qualitative data analyses - it is one of the least developed areas in the mixed research literature, with relatively few published articles on mixed analysis (e.g. Bazeley 2003, 2006; Caracelli & Greene 1993; Chi 1997; Greene et al. 1989; Lee & Greene 2007; Li, Marquart, & Zercher 2000; Onwuegbuzie 2003; Onwuegbuzie & Dickinson 2008; Onwuegbuzie & Leech 2004, 2006; Onwuegbuzie, Slate, Leech, & Collins 2007; Onwuegbuzie, Slate, Leech, & Collins 2009; Onwuegbuzie & Teddlie 2003; Sandelowski 2000, 2001; Teddlie et al. 2008). Currently, published mixed research textbooks (e.g. Bergman 2008; Creswell & Plano Clark 2007; Greene 2007; Ridenour & Newman 2008) - although groundbreaking - contain at most one chapter on mixed analysis. Thus, as noted by Greene (2008: 14): 'There has also been some work in the area of integrated mixed methods data analysis, although this work has not yet cohered into a widely accepted framework or set of ideas'.
Moreover, when discussing mixed analysis strategies, none of these authors discuss the philosophical underpinnings. Yet, by linking mixed analysis techniques to philosophical assumptions and stances, an iterative, interactive and dynamic linkage is provided among Greene's (2006, 2008) four domains.
With this in mind, the present article is a first attempt explicitly to provide a philosophical justification for conducting mixed analyses. First, we present several recent typologies of analyses in social science research that incorporate both monomethod (i.e. purely quantitative research or purely qualitative research) and mixed research studies. Second, we discuss what has been referred to as the fundamental principle of data analysis, wherein both qualitative and quantitative data analysis techniques are shaped by an attempt to analyze data in a way that yields at least one of five types of generalizations. Third, building on the frameworks of Denzin and Lincoln (2005), Heron and Reason (1997) and Johnson and Onwuegbuzie (2004), we compare and contrast three qualitative-based paradigms (i.e. constructivism, critical theory, participatory), one quantitative-based paradigm (i.e. postpositivism) and one mixed research-based paradigm (i.e. pragmatism) with respect to three axiomatic components (i.e. ontological, epistemological and methodological foundations) and seven issues (i.e. nature of knowledge, knowledge accumulation, goodness or quality criteria, values, ethics, inquirer posture and training). Also, we link each paradigm to data analysis strategies. Fourth, we illustrate similarities in goals between some qualitative and quantitative analyses; in so doing, we deconstruct the strong claim that analysis must be either qualitative or quantitative and illustrate that regardless of perspective (e.g. postpositivist or constructivist), both qualitative and quantitative data can be jointly analyzed. Finally, we compare and contrast 11 mixed research paradigms/worldviews, linking them to mixed analysis strategies.
TYPOLOGY OF ANALYSES IN SOCIAL SCIENCE RESEARCH
Onwuegbuzie et al. (2007) outlined the concept of monotype data, which represents the use of one data type (e.g. qualitative data) that is available for analysis - in contrast to multitype data wherein both types of data (i.e. qualitative and quantitative data) are collected and thus are available for analysis. Onwuegbuzie et al. (2007) coined the phrase monoanalysis to denote when one class of analysis (e.g. qualitative analysis) is used to analyze one data analysis type (e.g. qualitative data) - as opposed to multianalysis, wherein both classes of analyses (i.e. qualitative analysis and quantitative analysis) are used to analyze one or more data analysis types. For example, a quantitative researcher might use multiple regression (Fox 1997) to examine which variables predict some quantitative outcome of interest. Alternatively, a qualitative researcher might use the method of constant comparison (Glaser & Strauss 1967) to analyze responses to open-ended interview questions.
Three-dimensional framework for qualitative and quantitative analyses
Onwuegbuzie et al. (2009) also conceptualized a typology for classifying qualitative and quantitative analysis techniques. Specifically, these authors presented a three-dimensional representation for classifying and organizing both qualitative and quantitative analyses, which involves reframing qualitative and quantitative analyses as case-oriented, variable-oriented, or process/experience-oriented analyses.
1. Case-oriented analyses are analyses that focus primarily or exclusively on the selected case(s), wherein the goal is to analyze and interpret the meanings, experiences, attitudes, opinions, or the like of one or more persons - with a tendency toward particularizing and analytical generalizations. Although case-oriented analyses are best suited for identifying patterns common to one or a relatively small number of cases - and thus lend themselves better to qualitative research in general and qualitative analyses in particular - this class of analyses can be used for any number of cases and, as such, also is pertinent for quantitative research, leading to the use of quantitative analysis techniques such as single-subject analyses, descriptive analyses and profile analyses (Onwuegbuzie et al. 2009).
2. In contrast, variable-oriented analyses involve identifying relationships - often probabilistic in nature - among entities that are treated as variables, such that this class of analysis tends to be conceptual and theory-centered from the outset and has a proclivity toward external generalizations. Therefore, variable-oriented analyses, whose '"building blocks" are variables and their intercorrelations, rather than cases' (Miles & Huberman 1994: 174), are more apt for quantitative research in general and quantitative analyses in particular. However, although the use of large and representative samples often facilitates identification of relationships among variables, small samples also can serve this purpose, making variable-oriented analyses also relevant for qualitative data along with the use of qualitative analysis techniques (e.g. examining themes that cut across cases).
3. Finally, process/experience-oriented analyses involve evaluating processes or experiences pertaining to one or more cases within a specific context over time, with processes tending to be associated with variables and experiences tending to be associated with people (i.e. cases).
Because each of these three analysis orientations represents a continuum rather than a dichotomy (e.g. variable-oriented analyses are conceptualized by Onwuegbuzie et al. (2009) as falling on a particularistic-universalistic continuum, classifying the extent to which the metainferences stemming from the variable-oriented analysis can be generalized), this three-dimensional framework supports Johnson and Onwuegbuzie's (2004: 20) assertion that 'the possible number of ways that studies can involve mixing is very large because of the many potential classification dimensions'.
Cross-over mixed analysis
Onwuegbuzie and Combs (2009), building on the works of Greene (2008) and Onwuegbuzie and Teddlie (2003), outlined the concept of cross-over mixed analyses, which involves using one or more analysis types associated with one tradition (e.g. quantitative analysis) to analyze data associated with a different tradition (e.g. qualitative data). Such an analysis can be used to reduce, display, transform, correlate, consolidate, compare, integrate, assert, or import data. Table 1 presents the cross-over mixed analysis types and strategies that Onwuegbuzie and Combs (2009) identified. Cross-over mixed analyses are distinct from other types of mixed analyses (i.e. non-cross-over mixed analyses) such as parallel mixed analysis, wherein both quantitative and qualitative data analyses are conducted separately, neither type of analysis builds on or interacts with the other during the data analysis stage, and the findings from each type of analysis are neither compared nor consolidated until both sets of data analyses have been completed. Whereas non-cross-over mixed analyses involve collection of both types of data and the analysis conducted per data set represents the same paradigmatic tradition (i.e. either quantitative or qualitative) - 'within-paradigm analysis' (Onwuegbuzie et al. 2007: 12) - cross-over mixed analyses involve a between-paradigm analysis, which uses 'an analysis technique more associated with one traditional paradigm (e.g. quantitative) to analyze data that originally represented the type of data collected associated with the other traditional paradigm (e.g. qualitative)' (Onwuegbuzie et al. 2007: 12). Thus, cross-over mixed analyses involve more integration of qualitative and quantitative analyses than do other types of mixed analyses because they involve the mixing or combining of qualitative- and quantitative-based paradigmatic assumptions and stances (e.g. using exploratory factor analysis to examine the structure of themes that emerged from a qualitative analysis; cf. Onwuegbuzie 2003). This mixing involves either maintaining an analytical-philosophical stance that the human mind/perception and mathematical algorithms can be used sequentially to examine patterns in qualitative data, or adopting an analytical-philosophical stance that transcends the stances underlying both paradigms (e.g. assuming that data saturation [i.e. the point at which qualitative information repeats itself such that no new or relevant information seems to emerge pertaining to a category, and the category's development is well established and validated; Morse 1995] and reliability [i.e. consistency or repeatability of participants' quantitative responses] represent parallel constructs). This distinction between cross-over mixed analyses and non-cross-over mixed analyses is important because certain epistemological, ontological, axiological and methodological stances might lend themselves more readily to conducting cross-over mixed analyses than do other stances.
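To make the cross-over idea concrete, the 'quantitizing' step can be sketched in a few lines of code. The sketch below is entirely our own illustration: the participants, theme labels and responses are hypothetical, and a simple phi coefficient between theme pairs stands in for the full exploratory factor analysis described in the sources. Emergent qualitative themes are recorded as a binary participant-by-theme matrix, whose structure is then examined quantitatively.

```python
from itertools import combinations

# Hypothetical quantitized data: which of three emergent themes each of six
# participants' interview responses touched on (1 = theme present).
theme_matrix = {
    "access":    [1, 1, 0, 1, 0, 1],
    "workload":  [1, 1, 0, 1, 0, 0],
    "mentoring": [0, 0, 1, 0, 1, 0],
}

def phi(x, y):
    """Phi coefficient between two binary theme-presence vectors."""
    n = len(x)
    n11 = sum(1 for a, b in zip(x, y) if a and b)          # both themes present
    n10 = sum(1 for a, b in zip(x, y) if a and not b)      # first only
    n01 = sum(1 for a, b in zip(x, y) if b and not a)      # second only
    n00 = n - n11 - n10 - n01                              # neither theme
    denom = ((n11 + n10) * (n01 + n00) * (n11 + n01) * (n10 + n00)) ** 0.5
    return (n11 * n00 - n10 * n01) / denom if denom else 0.0

# Pairwise associations among themes across participants
for t1, t2 in combinations(theme_matrix, 2):
    print(f"{t1} ~ {t2}: phi = {phi(theme_matrix[t1], theme_matrix[t2]):+.2f}")
```

In a real cross-over analysis the matrix would be far larger and would be submitted to factor-analytic software; the point here is only that the binary theme matrix is the bridge between the qualitative and quantitative traditions.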
Building on the works of Onwuegbuzie et al. (2007), Onwuegbuzie et al. (2009) and Onwuegbuzie and Combs (2009), we contend that, before conducting an analysis, a researcher explicitly or implicitly makes the following six decisions:
(a) the number of data types that will be analyzed - yielding either monotype data (i.e. use of one data type, namely: qualitative data or quantitative data) or multitype data (i.e. use of both data types, namely: qualitative data and quantitative data);
(b) the number of data analysis types that will be used - yielding monoanalysis (i.e. use of one data analysis type, namely: qualitative data analysis or quantitative data analysis) or multianalysis (i.e. use of both data analysis types, namely: qualitative data analysis and quantitative data analysis);
(c) the analysis emphasis of interest - comprising case-oriented analyses (i.e. analyses that focus primarily or exclusively on the selected case(s)), variable-oriented analyses (i.e. identifying relationships among entities that are conceived as variables) and/or process/experience-oriented analyses (i.e. evaluating processes or experiences pertaining to one or more cases within a specific context over time);
(d) whether or not analysis types associated with one tradition will be used to analyze data associated with a different tradition (i.e. cross-over mixed analysis vs. non-cross-over mixed analysis);
(e) whether the qualitative and quantitative analyses will be carried out concurrently (i.e. results stemming from one data analysis phase do not inform the results stemming from the other phase) or sequentially (i.e. the qualitative analyses are conducted first, which then inform the subsequent quantitative analyses, or vice versa);
(f) whether the qualitative or quantitative analyses will be given priority (i.e. quantitative analyses carry the most weight or the qualitative phase carry the most weight), or whether they will be assigned equal status.
Monomethod studies involve the use of monotype data, monoanalysis and non-cross-over mixed analysis, thereby making the fifth (i.e. time orientation of analyses) and sixth (i.e. priority of analyses) decisions irrelevant. Monomethod studies also include a decision as to the analysis emphasis. In contrast, in mixed research, consideration of these six decisions yields a large variety of possible combinations (excluding the aforementioned combination associated with monomethod research) that is impossible to capture completely in any single typology.
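The six decisions can be captured as a small configuration record. The sketch below is a minimal illustration of ours, not a published instrument; the class name, field names and labels are invented for exposition.

```python
from dataclasses import dataclass

# Our own illustrative record of the six pre-analysis decisions (a)-(f).
@dataclass
class AnalysisPlan:
    data_types: str      # (a) "monotype" or "multitype"
    analysis_types: str  # (b) "monoanalysis" or "multianalysis"
    emphasis: tuple      # (c) any of "case", "variable", "process/experience"
    cross_over: bool     # (d) cross-over vs. non-cross-over mixed analysis
    timing: str          # (e) "concurrent" or "sequential"
    priority: str        # (f) "qualitative", "quantitative", or "equal"

    def is_monomethod(self) -> bool:
        # Monomethod studies: monotype data, monoanalysis, no cross-over;
        # the timing and priority decisions are then irrelevant.
        return (self.data_types == "monotype"
                and self.analysis_types == "monoanalysis"
                and not self.cross_over)

# A mixed, sequential, cross-over design with equal-status analyses
plan = AnalysisPlan("multitype", "multianalysis", ("case", "variable"),
                    cross_over=True, timing="sequential", priority="equal")
print(plan.is_monomethod())  # False: this plan is mixed, not monomethod
```

Representing the decisions explicitly in this way makes the combinatorial point above concrete: each field can vary independently, so the space of possible mixed analysis plans multiplies quickly.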
FUNDAMENTAL PRINCIPLE OF DATA ANALYSIS
Onwuegbuzie et al. (2009) outlined what they refer to as the fundamental principle of data analysis,1 wherein both qualitative and quantitative data analysis techniques are shaped by an attempt to analyze data in a way that yields at least one of the following five types of generalizations: external (statistical)2 generalizations, internal (statistical)3 generalizations, analytical generalizations, case-to-case transfer and/or naturalistic generalizations. External (statistical) generalization involves making generalizations, inferences, or predictions from data obtained from a representative statistical (i.e. optimally random) sample to the population from which the sample was drawn (i.e. universalistic generalizability). Contrastingly, internal (statistical) generalization involves making generalizations, inferences, or predictions from data obtained from one or more representative or elite participants (e.g. key informants, politically important cases, sub-sample members) to the sample from which the participant(s) was selected (i.e. particularistic generalizability). It should be noted that internal (statistical) generalization can stem from qualitative, quantitative, or mixed analyses. Conversely, with analytical generalizations 'the investigator is striving to generalize a particular set of [case study] results to some broader theory' (Yin 2009: 43). In other words, analytical generalizations are 'applied to wider theory on the basis of how selected cases "fit" with general constructs' (Curtis, Gesler, Smith, & Washburn 2000: 1002) (i.e. particularistic generalizability). Case-to-case transfer involves making generalizations or inferences from one case to another (similar) case (Firestone 1993; Kennedy 1979; Miles & Huberman 1994) (i.e. particularistic generalizability). Finally, with naturalistic generalization, the readers of the article (i.e. consumers of the findings) make generalizations entirely, or at least in part, from their personal or vicarious experiences (Stake 2005), such that meanings stem from personal experience and are adapted and reified by repeated encounter (Stake 1980; Stake & Trumbull 1982). Rather than the researcher making this sort of generalization, it is the reader who generalizes (often based on the proximal similarity of the case data to the reader's focal context of interest).
LINKING PARADIGMS TO DATA ANALYSIS STRATEGIES
Over the years, in an attempt to distinguish qualitative-based paradigms from quantitative-based paradigms and to demonstrate that qualitative research represents a distinct tradition, several eminent qualitative researchers have presented frameworks that contrast qualitative-based paradigms (e.g. constructivism, critical theory, participatory) and quantitative-based paradigms (i.e. [logical] positivism and postpositivism). Indubitably, the most popularized framework is that developed by Guba (1990) and extended, more recently, by Denzin and Lincoln (2005). Denzin and Lincoln outlined their views of the axiomatic nature of paradigms and the issues they believed were most fundamental to differentiating the paradigms by contrasting three qualitative-based paradigms (i.e. constructivism, critical theory, participatory) and two quantitative-based paradigms (i.e. [logical] positivism and postpositivism) with respect to three axiomatic components (i.e. ontological, epistemological and methodological foundations) and seven issues (i.e. nature of knowledge, knowledge accumulation, goodness or quality criteria, values, ethics, inquirer posture and training).
In Table 2, we build on Denzin and Lincoln's (2005) table of axioms and issues. Specifically, Table 2 contains Denzin and Lincoln's (2005: 193-194) axioms of the paradigms of postpositivism and constructivism, as well as axioms of the participatory paradigm outlined by Heron and Reason (1997) (which was also included in Denzin and Lincoln's [2005] table). Table 2 also contains our proposed pragmatist paradigm, which incorporates some of the major concepts outlined in Johnson and Onwuegbuzie's (2004) seminal article on mixed research. However, Table 2 does not include the axioms of the paradigm of positivism because this paradigm, which incorporates several movements (e.g. analytical philosophy, logical atomism, logical empiricism and semantics), was discredited shortly after World War II.4 As noted by Johnson and Onwuegbuzie (2004):
Positivism is a poor choice for labeling quantitative researchers today because positivism has long been replaced by newer philosophies of science (Yu 2003). The term is more of a straw man (easily knocked down) for attack than standing for any actual practicing researchers. A term that better represents today's practicing quantitative researchers is postpositivism (Phillips & Burbules 2000: 24).
In Table 2, we have added two issues to Denzin and Lincoln's seven - namely, qualitative analysis and quantitative analysis - in an attempt to take the first step toward linking data analysis techniques to philosophical assumptions and stances. In Figure 1, we reproduce Onwuegbuzie et al.'s (2009) typology for classifying qualitative and quantitative analysis techniques. This figure presents an array of analytic techniques undertaken in quantitative and qualitative research as a function of the following three analysis emphases: case-oriented analyses, variable-oriented analyses and process/experience-oriented analyses. Although this list is by no means exhaustive, it does capture most of the major qualitative and quantitative analytical techniques (Onwuegbuzie et al. 2009).
Postpositivist paradigm
Often under-emphasized in the literature, postpositivism is a rejection or modification of several core tenets of positivism. Although postpositivist researchers believe that there is an independent reality that can be studied, they assert that all observation is inherently theory-laden and fallible and that all theory is modifiable. They also believe that, as a result of their cultural experiences and worldviews, people are always partially biased in their perceptions of reality. Because perceptions and observations are fallible in these ways, the ensuing constructions are imperfect (Phillips & Burbules 2000). Thus, postpositivist researchers assert that we can only approximate the truth of reality but can never explain it perfectly or completely. Nevertheless, postpositivist researchers believe that objectivity can be approximated by triangulating across multiple fallible perspectives (i.e. triangulation of method, data and theory). Karl Popper (1994) identified three worlds that postpositivist researchers can address: (a) an objective physical external world (World 1); (b) an interpretative, subjective inner world (World 2); and (c) the theory world, in which humans mentally and physically represent the first two worlds (World 3). Popper's view of knowledge and reality is just one example of the complexity and nuance commonly expressed by quantitative researchers that is not captured by the simple statements found in the paradigm comparison tables popular in the research literature.
Belief in the fallibility of observations renders statistics in general, and inferential statistics in particular, well suited to postpositivist research because of their emphasis on assigning probabilities (e.g. p-values, levels of confidence or error) to observed findings. Thus, as well as utilizing descriptive statistics, postpositivist researchers use the whole array of inferential statistics for making external (statistical) generalizations (cf. Figure 1). Contrary to popular depictions of quantitative research as deductive (and qualitative research as inductive), inferential statistical generalizations are inductive. Postpositivist researchers also utilize some qualitative analysis techniques, especially those that yield frequency data such as Word Count and Classical Content Analysis (cf. Leech & Onwuegbuzie 2007, 2008), and they employ qualitative data analysis techniques that help them develop quantitative instruments.
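As a minimal illustration of the inferential step underlying external (statistical) generalization, the sketch below computes a 95% confidence interval for a sample mean. The scores are fabricated for exposition, and a normal (z) critical value is used for brevity; a t critical value would be the standard choice for a sample this small.

```python
import statistics
from statistics import NormalDist

# Hypothetical scores from a (notionally random) sample of 25 students
scores = [72, 85, 78, 90, 66, 81, 77, 88, 70, 83,
          75, 79, 92, 68, 80, 74, 86, 71, 84, 76,
          89, 73, 82, 69, 87]

mean = statistics.mean(scores)
se = statistics.stdev(scores) / len(scores) ** 0.5  # standard error of the mean
z = NormalDist().inv_cdf(0.975)                     # ~1.96 for a 95% interval

lo, hi = mean - z * se, mean + z * se
print(f"mean = {mean:.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```

The interval expresses exactly the probabilistic hedge discussed above: the generalization from sample to population is made with a stated level of confidence rather than with certainty.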
Constructivist paradigm
As can be seen in Table 2, constructivist researchers (e.g. radical constructivists, cognitive constructivists, cultural constructivists, social constructivists/constructionists, communal constructivists, critical constructivists, adherents of genetic epistemology) often claim to believe that multiple, contradictory, but equally valid accounts of the same phenomenon (i.e. multiple realities) can exist. Thus, it is somewhat contradictory that those who subscribe to strong relativism or strong constructivism do not view the use of quantitative methods in general, and quantitative analysis in particular, as representing one such valid account of a phenomenon - albeit not their preferred account. However, the fact that constructivists tend to believe that time- and context-free generalizations are neither desirable nor possible likely renders inappropriate the use of inferential statistics such as the sample mean for the purpose of making external (statistical) generalizations across populations (i.e. claiming that a statistical parameter applies broadly to nearly everyone in a population). Generalizing to a population (e.g. claiming that a sample statistic such as the sample mean is a reasonable representation of the corresponding population parameter) should be less controversial. This distinction between generalizing across versus generalizing to a population (Cook & Campbell 1979) seldom is addressed in the paradigm debates. Parameter estimation would probably be of more interest to a social constructivist than to a radical or cognitive constructivist.
Descriptive statistics have long been an important part of ethnography in anthropology, as ethnographers include quantitative descriptors to complement narrative description. More generally, descriptive statistics can be used to enhance the qualitative researcher's quest for detailed description. As Sechrest and Sidani (1995: 79) note, 'qualitative researchers regularly use terms such as "many," "most," "frequently," "several," "never," and so on. These terms are fundamentally quantitative.' Qualitative researchers also can obtain more meaning by obtaining counts of words in addition to their narrative descriptions (Sandelowski 2001). For example, in examining the lived experience in the classroom of Johnny, a child with attention deficit hyperactivity disorder, rather than telling readers that Johnny left his seat on many occasions during a class, it would be more informative for the qualitative researcher to note that Johnny left his seat six times during the course of 30 minutes. This way, readers can (if contextual detail is provided) decide whether six instances of out-of-seat behavior are significant/meaningful and will be in a better position to decide whether to make naturalistic generalizations.
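The Johnny example can be expressed as a simple tally of coded observation events. The sketch below is our own illustration; the event labels and counts are invented for exposition, not drawn from an actual observation record.

```python
from collections import Counter

# Hypothetical coded events from one 30-minute classroom observation
events = ["out_of_seat", "on_task", "out_of_seat", "calls_out", "on_task",
          "out_of_seat", "out_of_seat", "calls_out", "out_of_seat", "out_of_seat"]

tally = Counter(events)
print(tally["out_of_seat"])      # 6 out-of-seat incidents in 30 minutes
print(tally.most_common(1))      # the single most frequent coded behavior

rate = tally["out_of_seat"] / 30  # incidents per observed minute
print(f"{rate:.2f} incidents per minute")
```

Reporting '6 incidents in 30 minutes' rather than 'many occasions' is precisely the kind of quasi-statistical precision advocated in the passage that follows.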
Consistent with our recommendation that qualitative researchers use descriptive statistics more frequently where applicable, more than half a century ago Barton and Lazarsfeld (1955) advocated the use of what they coined quasi-statistics in qualitative research. According to these authors, quasi-statistics pertain to the use of descriptive statistics that can be extracted from qualitative data. Interestingly, the prominent symbolic interactionist Howard Becker (1970: 81-82) contended that 'one of the greatest faults in most observational case studies has been their failure to make explicit the quasi-statistical basis of their conclusions.' Further, Joseph Maxwell (1996: 95) noted that:
Quasi-statistics not only allow you to test and support claims that are inherently quantitative, but also enable you to assess the amount of evidence in your data that bears on a particular conclusion or threat, such as how many discrepant instances exist and from how many different sources they were obtained. [emphasis in original]
Interestingly, Becker, Geer, Hughes and Strauss (1961/1977) provided more than 50 tables and graphs in their qualitative works. These tables and graphs complemented the narrative descriptions of their qualitative data.
Internal (statistical) generalizations also are possible in constructivist research. In particular, it is not unusual for constructivist researchers to utilize key informants to provide them with an insider's understanding and with information that the researchers are unable to experience for themselves. This often includes both qualitative and quantitative information. For example, an informant might state what many members believe or describe characteristics of many (or only a few) people in the group. These numbers would contribute to internal generalizations. In particular, inferential statistics can be used to make internal (statistical) generalizations. Educational ethnographers Margaret LeCompte and Jude Preissle (1993) discuss enumeration in ethnographic data analysis in some depth in their qualitative research textbook.
The fact that many computer-assisted qualitative data analysis software (CAQDAS) programs allow data to be exported to statistical software programs (e.g. Excel, SIMSTAT) supports our assertion that descriptive and inferential statistical analyses are an option for constructivist researchers should they deem it appropriate (e.g. based on their philosophical stance and research question(s)) to make internal (statistical) generalizations. These statistical analysis tools also can be used to bolster analytical generalizations. One of the most popular methods of qualitative research, grounded theory, is premised on the development of theory for the purposes of generalization. Descriptive and inferential statistics can be used to facilitate rich and detailed description, and to assess and enhance trustworthiness, dependability, confirmability, transferability and authenticity. Consistent with this assertion, Denzin and Lincoln (2005) state that the training of constructivists includes both qualitative and quantitative techniques (cf. Table 2). Whatever findings emerge from the descriptive and/or inferential statistical analyses utilized, it should be noted that, for constructivists, these findings represent just one of multiple valid accounts of the phenomenon.
Critical theory paradigm
Critical theory researchers seek to understand the relationship between societal structures (e.g. economic, political) and ideological patterns of thought that impede a person or group from identifying, confronting and addressing unjust social systems. Simply put, critical theory researchers primarily are interested in social change as it emerges in relation to social struggle. Moreover, critical theory researchers operate under the assumption that the knowledge gleaned from their research represents an initial step toward addressing social injustices and promoting social change. As such, critical theory research represents a form of transformative research by providing emancipatory knowledge that identifies the contradictions that are masked or distorted by our everyday thoughts and perceptions (Lather 1986). Whereas many other types of qualitative researchers (e.g. phenomenological researchers, constructivist researchers) are interested in the meanings that people attach to their own actions, critical theory researchers aim to place such actions in a broader context that is framed by social, economic, political and ideological forces that have previously not been identified or acknowledged. That is, critical theory researchers are more concerned with the forces that limit actions rather than the actions themselves. Thus, critical theory researchers tend to embrace a more etic (i.e. outsider's) stance than an emic (i.e. insider's) stance.
Although it might appear that critical theory research is similar to postpositivist research - in that critical theory researchers seek cause-and-effect relationships wherein the actions of individuals or groups are influenced directly by social, economic, political and/or ideological variables - many critical theory researchers would consider this strategy to be inappropriately reductionistic. Rather, many postmodern critical theory researchers believe that it is not possible to predict reliably how these variables determine actions. Further, instead of focusing on foundational criteria to justify their research findings, critical theory researchers contend that, by attending to the role of power in social systems, their analyses occur at the meta-theoretical level. Drawing upon research from other paradigms, critical theory researchers identify power structures and their processes that are typically ignored in both postpositivist and constructivist research.
Like constructivist researchers, critical theory researchers believe that meaning and language are socially constructed and are neither time- nor context-free, although critical theory researchers assert that social injustice has real (i.e. objective, material) consequences. Also, like constructivist researchers, critical theory researchers primarily assess their findings with respect to the community of researchers to which they belong (i.e. authoritative consensus). Critical theory researchers contend that the subjective/objective dualism camouflages the ways in which both standpoints are constrained by power dynamics and social forces. However, critical theory researchers endorse subjectivism inasmuch as the subjective knower and known cannot be separated and knowledge is subjective (culturally and historically embedded) and constructed on the basis of power issues (Lather 2006); yet, they reject the stance that all analyses are relative, believing instead that rational analysis is essential to social justice. Thus, many critical theorists endorse critical realism (Morrow & Brown 1994).
Critical theory researchers believe that constructivists place too much weight on people's perceptions and too little emphasis on the more complex social forces that shape and constrain experiences, events and actions. As such, critical theory researchers deem constructivism to be just as much at risk as postpositivism of uncritically bolstering the status quo of social injustice. According to Habermas, a leading contemporary proponent of critical theory, critical theory involves the following three types of knowledge: (a) the empirical-analytic sciences (i.e. including both the natural sciences and the social sciences, which seek to manipulate knowledge for the purposes of prediction and control over nature and social structure); (b) the historical-hermeneutic sciences (i.e. including the cultural and human sciences, which seek to understand communication among and within social groups); and (c) critical theory (i.e. which seeks freedom from oppression and social justice) (Blaikie 1993). Thus, Habermas advocates the use of the methods of the empirical-analytic sciences, the historical-hermeneutic sciences and critical theory alike.
However, critical theory researchers believe that critical theory utilizes but transcends both the empirical-analytic and historical-hermeneutic sciences inasmuch as it provides knowledge that is subjected to rigorous processes of free and rational discourse, which prevents it from being socially distorted (Blaikie 1993). This suggests that the ontological, epistemological and axiological stances of critical theory researchers do not prevent them from using, as appropriate, all forms of qualitative analysis techniques and quantitative analysis techniques, including both descriptive and inferential statistics (cf. Figure 1). As noted by Morrow and Brown (1994: 200), 'critical theory has no basis for a priori rejection of any particular methods or techniques as such'. However, whatever analysis is used, critical theory researchers make no claims that their analyses are objective in the manner claimed by some postpositivists.
Participatory paradigm
As noted by Heron and Reason (1997), proponents of the participatory paradigm subscribe to an experiential subjective-objective reality that stems from a co-creative interaction between the given cosmos and the way the mind connects with it. Thus, what can be known about the given cosmos is that it represents a subjectively articulated world, whose objectivity reflects how it is intersubjectively formed by the knower. A participative worldview acknowledges the following four interdependent ways of knowing: experiential (i.e. direct encounter), presentational (i.e. emerging from and grounded in experiential knowing), propositional (i.e. conceptually knowing that something or other is the case) and practical (i.e. knowing how to do something) (Heron & Reason 1997). Of these, propositional knowing is most compatible with statistical analyses. In particular, propositional knowing represents knowledge that comes to the fore by describing some person, group, entity, location, situation, process, or the like. Simply put, it represents knowledge of facts. This way of knowing is articulated via declarations and theories that stem from mastery of the concepts that language provides. This suggests that propositional knowledge can be expressed not only via qualitative analysis techniques but also via an array of quantitative techniques that include both descriptive and inferential statistics. For example, inferential statistics could be used to attach levels of certainty (i.e. probability) to knowing that something is the case. Interestingly, as noted by Heron and Reason (1997), intervention research represents a form of participative inquiry that includes statistical analyses (cf. Fryer & Feather 1994).
When studying a group or community, participatory researchers might use inferential techniques to test theory that represents propositional knowledge. Confirmation of a theory then would lead the participatory researcher to make external (statistical) generalizations from the participants to the group or community from which the participants were selected. Alternatively, when studying key informants or sub-sample members, a participatory researcher might use inferential statistics to make internal (statistical) generalizations to assess propositional knowledge. Alternatively still, when studying one or a few people, then only descriptive statistics might be appropriate.
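As a concrete illustration of such an internal (statistical) generalization, the sketch below (in Python) attaches a level of certainty to a claim about how widespread a concern is among the members a researcher actually interviewed, using a simple Wald confidence interval for a proportion. The sub-sample numbers are hypothetical, and the Wald interval is only one of several interval estimators a researcher might choose.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion,
    clipped to the [0, 1] range."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

# Hypothetical sub-sample: 18 of 25 interviewed group members voiced the focal concern.
low, high = proportion_ci(18, 25)
print(round(low, 3), round(high, 3))
```

The resulting interval quantifies how far the observed proportion might plausibly generalize to the rest of the studied group, without any claim of external generalization beyond it.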
The quest for practical knowing also invites statistical analyses. In particular, practical knowing assumes a conceptual understanding of standards and principles of practice, presentational stylishness and experiential grounding in the situation within which the action takes place. In this respect, participatory research coincides with pragmatist research, which endorses a strong and practical empiricism in which both qualitative and quantitative data are collected and analyzed to generate and to test theories against observations of the natural world. As such, participatory research does not appear to invalidate the use of descriptive or inferential statistical analyses and thus all analyses in Figure 1 are available to participatory researchers.
Pragmatist paradigm
The pragmatist paradigm offers an epistemological justification (i.e. via pragmatic epistemic principles and standards) and logic (i.e. combining approaches that help researchers optimally frame, examine and provide tentative answers to their research question[s]) for mixing approaches and methods (Johnson, Onwuegbuzie, & Turner 2007; Onwuegbuzie & Leech 2005). Further, a pragmatist would reject the incompatibility thesis that qualitative and quantitative research are fully incompatible and cannot, in any useful way, be used in combination in social or behavioral research. A serious problem with the incompatibility thesis is that it is an a priori argument; it seems to be based on rationalistic, foundational, deductive logic (which oftentimes is said to have been rejected in qualitative research). The incompatibility thesis does not appear to be based on observation of social and behavioral research (i.e. through an examination of how researchers actually conduct their research). In pragmatist research, research paradigms can remain separate, but they also can be mixed or combined into another research paradigm (i.e. mixed research). Johnson et al. (2007: 129) assert that pragmatist researchers may subscribe to the view that mixed methods research is a research paradigm that:
(a) partners with the philosophy of pragmatism in one of its forms (left, right, middle);
(b) follows the logic of mixed methods research (including the logic of the fundamental principle and any other useful logics imported from qualitative or quantitative research that are helpful for producing defensible and usable research findings);
(c) relies on qualitative and quantitative viewpoints, data collection, analysis and inference techniques combined according to the logic of mixed methods research to address one's research question(s); and
(d) is cognizant, appreciative and inclusive of local and broader sociopolitical realities, resources and needs.
The mixed methods research paradigm offers an important approach for generating important research questions and providing warranted answers to those questions. This type of research should be used when the nexus of contingencies in a situation, in relation to one's research question(s), suggests that mixed methods research is likely to provide superior research findings and outcomes.
As noted in Table 2, pragmatist researchers can use the whole range of qualitative and quantitative analyses (i.e. descriptive and inferential analytical techniques) in an attempt to fulfill one or more of the five mixed research purposes identified by Greene et al. (1989): triangulation (i.e. comparing findings from quantitative data with qualitative results in hopes of convergence); complementarity (i.e. seeking elaboration, illustration, enhancement and clarification of the results from one method with findings from the other method); development (i.e. using the results from one method to help inform the other method); initiation (i.e. discovering paradoxes and contradictions that culminate in a re-framing of the research question); and expansion (i.e. expanding the breadth and range of a study by using multiple methods for different study phases).
Decisions made regarding these five mixed research purposes and the resulting mixed research designs help pragmatist researchers to determine which of three types of mixed analysis should be undertaken: a parallel mixed analysis (i.e. findings obtained from both analysis phases are interpreted separately), concurrent mixed analysis (i.e. results stemming from one data analysis phase do not inform the results stemming from the other phase), or sequential mixed analysis (i.e. the qualitative analysis phase is conducted first, which then informs the subsequent quantitative analysis phase, or vice versa). In particular, if the purpose of mixing is complementarity, then all three families of mixed analyses (i.e. parallel, concurrent and sequential) can be used. If triangulation or initiation represents the purpose, then both parallel and concurrent mixed analyses are viable. If development is the purpose, then concurrent and sequential mixed analyses are appropriate. Finally, if the purpose of mixing analyses is expansion then a sequential mixed analysis is pertinent (Onwuegbuzie et al. 2007).
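The purpose-to-analysis pairings just described can be encoded as a small decision aid. The sketch below (in Python) simply restates the mapping attributed above to Onwuegbuzie et al. (2007); the dictionary and function names are our own invention, not a published API.

```python
# Admissible mixed-analysis families for each mixing purpose
# (restating the pairings from Onwuegbuzie et al. 2007; names are illustrative).
MIXED_ANALYSIS_OPTIONS = {
    "complementarity": {"parallel", "concurrent", "sequential"},
    "triangulation":   {"parallel", "concurrent"},
    "initiation":      {"parallel", "concurrent"},
    "development":     {"concurrent", "sequential"},
    "expansion":       {"sequential"},
}

def admissible_analyses(purpose):
    """Return the mixed-analysis families consistent with a stated mixing purpose."""
    return MIXED_ANALYSIS_OPTIONS[purpose]

print(sorted(admissible_analyses("expansion")))  # ['sequential']
```

Such a lookup makes explicit that the choice among parallel, concurrent and sequential mixed analyses is constrained, though not fully determined, by the stated purpose of mixing.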
The purposes of mixed research and the resulting kinds of analyses also will help pragmatist researchers determine the number of data types that will be analyzed (i.e. monotype data vs. multitype data), the number of data analysis types that will be used (i.e. monoanalysis vs. multianalysis), the analysis emphasis of interest (i.e. case-oriented analyses, variable-oriented analyses and/or process/experience-oriented analyses), whether or not analysis types associated with one tradition will be used to analyze data associated with a different tradition (i.e. cross-over mixed analysis vs. non-cross-over mixed analysis) and whether the qualitative or quantitative analyses will be given priority, or whether they will be assigned equal status.
Pragmatist researchers also can use inferential statistics to make internal (statistical) generalizations. For instance, in fields such as linguistics and political science, it is not unusual for researchers analyzing qualitative data to conduct sequence analyses wherein hypotheses regarding patterns and trends in the qualitative data are tested (i.e. using p-values, confidence intervals and/or effect sizes). As an example, these researchers might test whether themes occurring among the participants tend, to a statistically and/or practically significant degree, to emerge in a particular order. QDA Miner 3.0 is a computer-assisted qualitative data analysis software (CAQDAS) program that allows researchers to conduct qualitative analyses, such as thematic analysis, as well as sequence analyses, and that offers other statistical and visualization tools such as clustering, multidimensional scaling, heatmaps and correspondence analysis. The fact that CAQDAS programs increasingly allow inferential statistical analyses to be conducted supports our assertion that inferential statistical analyses are an option that analysts of qualitative data have should they deem it appropriate to make statistical (i.e. internal and, to a lesser degree, external) generalizations. These inferential statistical analysis tools also can be used to bolster analytical generalizations. Descriptive and inferential statistics can be used not only to facilitate rich and detailed description but also to assess and enhance trustworthiness, dependability, confirmability, transferability and authenticity.
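A minimal sketch of such a sequence analysis appears below (in Python). The themes and per-participant orderings are invented, and the test shown is a generic permutation test of whether one theme tends to be voiced before another, not the specific procedure implemented in QDA Miner or in any cited study.

```python
import random

def precedence_statistic(sequences, theme_a, theme_b):
    """Proportion of sequences (among those containing both themes)
    in which theme_a first appears before theme_b."""
    hits = trials = 0
    for seq in sequences:
        if theme_a in seq and theme_b in seq:
            trials += 1
            hits += seq.index(theme_a) < seq.index(theme_b)
    return hits / trials

def permutation_p_value(sequences, theme_a, theme_b, n_perm=2000, seed=1):
    """One-sided p-value: how often randomly reordered sequences
    match or exceed the observed precedence statistic."""
    rng = random.Random(seed)
    observed = precedence_statistic(sequences, theme_a, theme_b)
    exceed = 0
    for _ in range(n_perm):
        shuffled = []
        for seq in sequences:
            s = list(seq)
            rng.shuffle(s)
            shuffled.append(s)
        if precedence_statistic(shuffled, theme_a, theme_b) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# Hypothetical per-participant theme orderings (illustrative data only).
sequences = [
    ["uncertainty", "coping", "growth"],
    ["uncertainty", "growth", "coping"],
    ["uncertainty", "coping", "growth"],
    ["coping", "uncertainty", "growth"],
    ["uncertainty", "coping", "growth"],
]

print(precedence_statistic(sequences, "uncertainty", "coping"))  # 0.8
print(permutation_p_value(sequences, "uncertainty", "coping"))
```

Here the p-value quantifies the internal (statistical) generalization: whether, within this set of participants, the ordering of the two themes is more regular than chance shuffling would produce.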
For pragmatist research, having a postpositivist orientation does not prevent a researcher from conducting qualitative analyses - especially analyses such as word count and content analysis that involve, in part, the counting of words, codes, categories, or other aspects of the qualitative data. Such analyses can form part of quantitative-dominant mixed analyses (cf. Johnson et al. 2007), in which the analyst adopts a postpositivist stance while, at the same time, believing that the inclusion of qualitative data and approaches is likely to enhance the findings. Similarly, having a constructivist orientation - or any other qualitative-based orientation - does not prevent a researcher from conducting quantitative analyses - especially descriptive statistical analyses that do not involve the analyst making inferences beyond the research participants at hand, which is typically not the goal of qualitative researchers. Such analyses can form part of qualitative-dominant mixed analyses (cf. Johnson et al. 2007), in which the researcher takes a constructivist-poststructuralist-critical stance with respect to the mixed analysis process while, at the same time, deeming the addition of quantitative data and approaches helpful in providing richer data and interpretations.
Commonalities regarding data analysis strategies across paradigms
From Table 2, the table of axioms and issues, it can be seen that the ontological, epistemological and methodological assumptions and stances representing all five paradigms allow the conduct of both quantitative and qualitative analyses - at least to a small degree. The postpositivist and constructivist paradigms have the least potential to use analytical techniques that belong to a different paradigm, the critical theory and participatory paradigms have excellent potential to do so and the pragmatist paradigm, almost by definition, has the greatest potential.
That the ontological, epistemological and methodological assumptions and stances representing all five paradigms legitimate both quantitative and qualitative analyses is especially apparent when one examines the similarities in goals between many quantitative and qualitative analyses rather than emphasizing the differences.5 First and foremost, both quantitative and qualitative researchers analyze empirical observations (i.e. data coming from personal experience, observation, or experiment) to address research questions (Johnson & Onwuegbuzie 2004). Sechrest and Sidani (1995: 78) note that both sets of researchers 'describe their data, construct explanatory arguments from their data and speculate about why the outcomes they observed happened as they did.' At the level of data analysis, for example, numeric data are similar to textual/visual data inasmuch as they represent (descriptive) codes that characterize the person's meanings, beliefs, attitudes and so on; thus, both data types might be available for collection and analysis regardless of the research paradigm involved, depending on the research question(s). Indeed, both numeric data and textual/visual data can be used to facilitate any of the following types of coding: inductive coding, deductive coding, abductive coding, interpretive coding, open coding, axial coding and selective coding (cf. Miles & Huberman 1994). As another example, thematic analysis (a qualitative technique) and exploratory factor analysis (a quantitative technique) have very similar goals, namely, to reduce the dimensionality of the raw data. This similarity allows mixed researchers to use analyses associated with one paradigm on data associated with another paradigm. For instance, it is not unusual for a mixed researcher to factor analyze themes that emerged from textual data (cf.
Onwuegbuzie 2003)6 or to undertake a profile analysis of a set of quantitative measures stemming from one or more cases (Tashakkori & Teddlie 1998). Statistical factor analysis is an exploratory data analysis technique that searches for sets of variables that are similar to one another but different from the other sets of variables found in the data. The researcher must label the factors that emerge from the data; that is, the researcher must give a name to each factor and interpret what the named factors mean. The naming part of factor analysis is an example of the use of qualitative coding within a technique traditionally viewed as quantitative. Factor analysis, we argue, has both quantitative and qualitative elements and, therefore, is an inherently mixed analysis procedure.
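To illustrate, a bare-bones version of factor analyzing emergent themes might proceed as in the sketch below (Python/NumPy). The participant-by-theme data are invented; the phi coefficient (Pearson correlation on 0/1 columns) and a simple eigenvalue rule stand in for the tetrachoric correlations and the fuller extraction and retention criteria discussed in the article, so this is a structural sketch rather than the authors' exact procedure.

```python
import numpy as np

# Hypothetical inter-respondent matrix: rows = participants, columns = emergent themes;
# 1 = theme present for that participant, 0 = theme absent (illustrative data only).
themes = ["isolation", "support", "resilience", "stigma"]
X = np.array([
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
])

# Pearson correlations on the 0/1 columns (the phi coefficient) stand in here
# for tetrachoric correlations; the shape of the procedure is the same.
R = np.corrcoef(X, rowvar=False)

# Principal-axis-style extraction via eigendecomposition of the correlation matrix,
# ordering the candidate factors by explained variance.
eigenvalues, loadings = np.linalg.eigh(R)
order = np.argsort(eigenvalues)[::-1]
eigenvalues, loadings = eigenvalues[order], loadings[:, order]

# Kaiser criterion (eigenvalue > 1) as one of the many retention rules noted above;
# the retained factors must still be named and interpreted by the researcher.
n_factors = int(np.sum(eigenvalues > 1.0))
print("eigenvalues:", np.round(eigenvalues, 3))
print("factors retained:", n_factors)
```

The final, interpretive step - naming each retained factor from the themes that load on it - is exactly the qualitative coding moment inside this quantitative technique.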
An even more compelling explanation for our assertion - that the ontological, epistemological and methodological assumptions and stances representing all five paradigms allow both quantitative and qualitative analyses to be undertaken - stems from the nature of qualitative and quantitative analyses themselves. Many of the core analytical techniques associated with the qualitative and quantitative paradigms are not as pure as is contended by proponents of monomethod research. For instance, with respect to qualitative research, the concepts of data saturation, informational redundancy and/or theoretical saturation (i.e. when no new or relevant information seems to emerge pertaining to a category and the category development is well established and validated; Flick 1998; Lincoln & Guba 1985; Morse 1995; Strauss & Corbin 1990), although inherently interpretivist, have quantitative undertones. These concepts incorporate quantitative assumptions - including the idea of internal replication (i.e. findings that are replicated by other participants in the study). In order to reach conclusions about saturation or informational redundancy, some kind of formal (i.e. conscious) or informal (e.g. subconscious) assessment of degree or amount typically takes place to determine how exhaustive the data are. Also, any conclusion on the part of the qualitative researcher that saturation or informational redundancy has taken place is accompanied by some degree of confidence or even certainty (i.e. high probability) - whether or not this confidence is estimated (a strategy that a postpositivist researcher likely would pursue). As the early modern continental philosopher Immanuel Kant (1781/1998) pointed out, the categories of qualitative and quantitative are necessarily part of human thought and of the conclusions we construct about entities that are important to us.
With respect to quantitative research, techniques such as exploratory factor analysis and cluster analysis, although inherently postpositivist, have constructivist leanings inasmuch as for any given dataset, there are myriad mathematical solutions (i.e. factor/cluster structures) that can be constructed and the analyst has to make sense of the selected mathematical solution (i.e. meaning-making). Indeed, there are numerous strategies (e.g. principal component analysis vs. factor analysis; correlation matrix vs. variance-covariance matrix; maximum likelihood vs. unweighted least squares vs. generalized least squares vs. principal axis factoring vs. alpha factoring vs. image factoring; eigenvalues; trace; scree plot; parallel analysis; number of iterations; orthogonal vs. oblique rotation; factor pattern matrix vs. factor structure matrix; communality estimates; internal replication techniques such as bootstrapping, jackknifing and cross-validation; cf. Henson, Capraro, & Capraro 2004; Henson & Roberts 2006; Hetzel 1996; Onwuegbuzie & Daniel 2003) and criteria that are available for deciding on the most appropriate factor structure. With so many decisions to make, two or more analysts easily can arrive at different final factor solutions, which indicates that exploratory factor analysts serve, to a significant degree, as research instruments themselves - a concept that some postpositivist researchers might be reluctant to acknowledge due to a stance that social science inquiry should be objective and that postpositivist researchers should eliminate all biases (cf. Table 2).
Interestingly, the role of postpositivist researcher-as-instrument is not restricted to analyses that are exploratory in nature (e.g. exploratory factor analysis, cluster analysis, multidimensional scaling) but also extends to analyses that involve null hypothesis significance testing (NHST). In fact, the more sophisticated the NHST-based analysis, the more subjective decisions the postpositivist analyst has to make, making him/her serve as a research instrument to an even larger degree. For example, structural equation modeling (SEM; e.g. Schumacker & Lomax 1996) and hierarchical linear modeling (HLM; e.g. Bryk & Raudenbush 1992) typically involve the analyst making numerous subjective decisions, including selecting how many and which models to test and selecting from the numerous goodness-of-fit indices and criteria available. Typically, the theoretical models are literally created by the researcher (after an inductive analysis of past research, current theory, hunches, etc.). Thus, it is difficult for advocates of any of the research paradigms to claim justifiably a one-to-one correspondence between ontology/epistemology and type of analysis (i.e. qualitative vs. quantitative).
Moreover, we contend that pragmatist researchers who hold a mixture of philosophical positions (i.e. belonging to both quantitative and qualitative traditions) find it natural to combine statistical analyses with an array of qualitative analyses. In order to accomplish such an embedded analysis, the mixed researcher has to make Gestalt switches from a quantitative lens to a qualitative lens and vice versa, going back and forth multiple times (Kuhn 1962). According to Onwuegbuzie and Johnson (2006), this series of switches yields a new or consolidated analysis - such as a cross-over mixed analysis - which, if effective, can yield more fully mixed meta-inferences that incorporate a strong use of both quantitative and qualitative assumptions and stances and that represent the strongest paradigmatic mixing. More generally in the mixed research literature, this purposeful switching is called the dialectical approach (Greene 2007; Johnson 2008). This approach can rely on a single individual trained in both qualitative and quantitative research or, if no qualified individual is available, on a research team composed of at least one qualitative and one quantitative researcher. A recent example of a qualitative-quantitative research team is found in Corden and Hirst (2008).
Mixed research philosophical paradigms
Pragmatism is only one of many stances that underlie mixed research (Greene 2007, 2008; Johnson et al. 2007). Current stances include the pragmatism-of-the-middle philosophy (Johnson & Onwuegbuzie 2004), the pragmatism-of-the-right philosophy (Putnam 2002; Rescher 2000), the pragmatism-of-the-left philosophy (Maxcy 2003; Rorty 1991), the anti-conflationist philosophy (Bryman 1992; Hammersley 1992; Layder 1993; Roberts 2002), the critical realist orientation (Houston 2001; Maxwell 2004; McEvoy & Richards 2003, 2006), the dialectical stance (Greene 2008; Greene & Caracelli 1997; Maxwell & Loomis 2003), the complementary strengths stance (Brewer & Hunter 1989; Morse 2003), the transformative-emancipatory stance (Mertens 2003), the a-paradigmatic stance (Patton 2002; Reichardt & Cook 1979), the substantive theory stance (Chen 2006) and, most recently, the communities of practice stance (Denscombe 2008).
Table 3 provides one way of representing the major mixed research stances. In this table, the leading mixed research philosophical paradigms are presented (left column), alongside a summary of their major stances (middle column) and the core analyses associated with those stances (right column). This table represents a first attempt to link mixed research paradigms and stances to mixed analysis strategies. Clearly, however, more work is needed here to enhance our understanding of the mixed analysis choices made by mixed researchers and thus to address Greene's (2008: 13) important question: 'how do the assumptions and stances influence inquiry decisions?'
CONCLUSIONS
The present article represents a first attempt, albeit a tentative one, explicitly to provide a philosophical justification for conducting mixed analyses. As noted by Greene (2008: 12): 'Our thinking about the nature and role of philosophical assumptions in our practice needs to make more practical sense, as well as offer possibilities to practitioners not yet envisioned, which is one key role of mixed methods theory'. Although we recognize that much more work is needed in this area, we hope that our article will motivate constructive and positive dialogues among mixed researchers and researchers representing qualitative- and quantitative-based paradigms (Onwuegbuzie 2002), so that, regardless of our differences in philosophical assumptions and stances, we all come to view all researchers as belonging to communities of interest (cf. Fischer 2001). Further, we hope that our article will motivate others either to refute or to build on the ideas and concepts we have presented. To the extent that our ideas and concepts make practical sense, we hope that our article will help better situate mixed analyses in the philosophy of science, thereby promoting mixed research as a distinctive methodology - consistent with the call made by Greene (2006).
1 Onwuegbuzie et al. (2009) used the word fundamental because this principle is pertinent (i.e. fundamental) to qualitative, quantitative and mixed analyses.
2 Unlike Onwuegbuzie et al. (2009), we place the word 'statistical' in parentheses to denote the fact that external generalization can arise predominantly or exclusively from either quantitative or qualitative analyses. The term 'statistical' is used to denote the possible assumption that the sample was representative in some way (e.g. probabilistically, vicariously) of the larger group from which the sample was drawn.
3 Unlike Onwuegbuzie et al. (2009), as is the case for external generalizations, we place the word 'statistical' in parentheses to denote the fact that internal generalization can arise predominantly or exclusively from either quantitative or qualitative analyses. Again, the term 'statistical' is used to denote the possible assumption that the sample was representative in some way (e.g. probabilistically, vicariously) of the larger group from which the sample was drawn.
4 Disturbingly, many authors appear to be unaware of the six major tenets of positivism, namely: (a) an emphasis on verification, (b) pro-observation, (c) anti-cause, (d) downplaying explanation, (e) anti-theoretical entities and (f) anti-metaphysics (Hacking 1983).
5 It is legitimate and useful to examine and emphasize the differences between qualitative and quantitative research. We should remain aware, however, that this emphasis glosses over the many, sometimes great, differences within qualitative and quantitative research. Furthermore, our point here is that it also is legitimate to examine similarities that sometimes are present in applications of qualitative and quantitative research.
6 The exploratory factor analysis is conducted after coding each theme as '1' if it is present for a research participant and '0' if it is not. This coding yields what Onwuegbuzie (2003) refers to as an inter-respondent matrix (i.e. a participant x theme matrix consisting only of 0s and 1s), which is subsequently converted to a matrix of bivariate associations among the responses pertaining to each of the emergent themes. These bivariate associations are computed as tetrachoric correlation coefficients because the themes have been quantitized to dichotomous data (i.e. '0' vs. '1'), and tetrachoric correlation coefficients are appropriate when one is determining the relationship between two (artificially) dichotomous variables. The matrix of tetrachoric correlation coefficients then becomes the basis of the exploratory factor analysis (see, for example, Onwuegbuzie, Witcher, Collins, Filer, Wiedmaier and Moore 2007).
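The quantitizing step described in this note can be sketched in a few lines of code. The sketch below uses an entirely hypothetical inter-respondent matrix and approximates the tetrachoric correlation with Pearson's cosine-pi formula, a standard closed-form approximation; it is not the maximum-likelihood estimator typically used by statistics packages, nor the authors' own implementation. The resulting correlation matrix would then be submitted to an exploratory factor analysis in a statistical package.

```python
import math

def tetrachoric(x, y):
    """Cosine-pi approximation to the tetrachoric correlation of two
    binary (0/1) sequences: r ~ cos(pi / (1 + sqrt(ad / bc)))."""
    a = sum(1 for xi, yi in zip(x, y) if (xi, yi) == (1, 1))  # both themes present
    b = sum(1 for xi, yi in zip(x, y) if (xi, yi) == (1, 0))  # only first present
    c = sum(1 for xi, yi in zip(x, y) if (xi, yi) == (0, 1))  # only second present
    d = sum(1 for xi, yi in zip(x, y) if (xi, yi) == (0, 0))  # both themes absent
    # Limiting cases of the formula (perfect positive/negative association).
    if b * c == 0:
        return 1.0
    if a * d == 0:
        return -1.0
    return math.cos(math.pi / (1 + math.sqrt((a * d) / (b * c))))

# Hypothetical inter-respondent matrix: one row per participant,
# one column per emergent theme (1 = theme present, 0 = absent).
matrix = [
    [1, 1],
    [1, 1],
    [1, 0],
    [0, 0],
    [0, 0],
    [0, 1],
]
theme_a, theme_b = zip(*matrix)  # transpose to one 0/1 column per theme
print(round(tetrachoric(theme_a, theme_b), 2))  # -> 0.5
```

For a full analysis one would compute this coefficient for every pair of themes and factor-analyze the resulting matrix.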
References
Barton, A. and Lazarsfeld, F. (1955) Some functions of qualitative data analysis in sociological research. Sociologica, 1, 321-361.
Bazeley, P. (2003) Computerized data analysis for mixed methods research. In A. Tashakkori and C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 385-422) Thousand Oaks CA: Sage.
Bazeley, P. (2006) The contribution of computer software to integrating qualitative and quantitative data and analyses. Research in the Schools, 13(1), 64-74.
Becker, H. S. (1970) Sociological work: Method and substance. New Brunswick NJ: Transaction Books.
Becker, H. S., Geer, B., Hughes, E. C. and Strauss, A. L. (1977) Boys in white: Student culture in medical school. New Brunswick NJ: Transaction Books. (Original work published by University of Chicago Press 1961)
Bergman, M. (Ed.) (2008) Advances in mixed methods research. Theories and applications. Thousand Oaks CA: Sage.
Blaikie, N. (1993) Approaches to social enquiry. Cambridge: Polity.
Brewer, J. and Hunter, A. (1989) Multimethod research. Thousand Oaks CA: Sage.
Bryman, A. (1988) Quantity and quality in social research. London: Unwin Hyman.
Bryman, A. (1992) Quantitative and qualitative research: Further reflections on their integration. In J. Brannen (Ed.), Mixing methods: Qualitative and quantitative research (pp. 89-111) Aldershot: Avebury Press.
Bryk, A. S. and Raudenbush, S. W. (1992) Hierarchical linear models: Applications and data analysis methods. Newbury Park CA: Sage.
Campbell, D. T. and Fiske, D. W. (1959) Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105.
Caracelli, V. W. and Greene, J. C. (1993) Data analysis strategies for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15, 195-207.
Chen, H. T. (2006) A theory-driven evaluation perspective on mixed methods research. Research in the Schools, 13(1), 75-83.
Chi, M. T. H. (1997) Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6, 271-315.
Collins, K. M. T., Onwuegbuzie, A. J. and Sutton, I. L. (2006) A model incorporating the rationale and purpose for conducting mixed methods research in special education and beyond. Learning Disabilities: A Contemporary Journal, 4, 67-100.
Cook, T. D. and Campbell, D. T. (1979) Quasiexperimentation: Design and analysis issues for field settings. Chicago: Rand McNally.
Corden, A. and Hirst, M. (2008) Implementing a mixed methods approach to explore the financial implications of death of a life partner. Journal of Mixed Methods Research, 2, 208-220.
Creswell, J. W. and Plano Clark, V. L. (2007) Designing and conducting mixed methods research. Thousand Oaks CA: Sage.
Creswell, J. W. and Plano Clark, V. L. (2008) Conducting mixed methods research and the practice of research. Workshop presented at the annual mixed methods conference. Cambridge UK.
Curtis, S., Gesler, W., Smith, G. and Washburn, S. (2000) Approaches to sampling and case selection in qualitative research: Examples in the geography of health. Social Science and Medicine, 50, 1001-1014.
Denscombe, M. (2008) Communities of practice: A research paradigm for the mixed methods approach. Journal of Mixed Methods Research, 2, 270-283.
Denzin, N. K. and Lincoln, Y. S. (2005) Paradigmatic controversies, contradictions and emerging confluences. In N. K. Denzin and Y. S. Lincoln (Eds.), Sage handbook of qualitative research (2nd edn, pp. 191-216) Thousand Oaks CA: Sage.
Firestone, W. A. (1993) Alternative arguments for generalizing from data, as applied to qualitative research. Educational Researcher, 22(4), 16-23.
Fischer, G. (2001) Communities of interest: Learning through the interaction of multiple knowledge systems. In S. Bjornestad, R. Moe, A. Morch and A. Opdahl (Eds.), Proceedings of the 24th IRIS Conference, Ulvik, Norway.
Flick, U. (1998) An introduction to qualitative research: Theory, method and applications. London: Sage.
Fox, J. (1997) Applied regression analysis, linear models and related methods. Thousand Oaks CA: Sage.
Fryer, D. and Feather, N. T. (1994) Intervention techniques. In C. Cassell and G. Symon (Eds.), Qualitative methods in organizational research (pp. 230-247) London: Sage.
Glaser, B. G. and Strauss, A. L. (1967) The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.
Gorard, S. and Smith, E. (Eds.) (2006) Special Issue on 'Combining numbers with narratives.' Evaluation and Research in Education 19(2) Retrieved March 28 2009, from http://www.multilingual-matters.net/erie/019/2/
Greene, J. C. (2006) Toward a methodology of mixed methods social inquiry. Research in the Schools, 13(1), 93-98.
Greene, J. C. (2007) Mixed methods in social inquiry. San Francisco: Jossey-Bass.
Greene, J. C. (2008) Is mixed methods social inquiry a distinctive methodology? Journal of Mixed Methods Research, 2, 7-22.
Greene, J. C. and Caracelli, V. J. (1997) Defining and describing the paradigm issue in mixed-method evaluation. In J. C. Greene and V. J. Caracelli (Eds.), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation, No. 74, pp. 5-17) San Francisco: Jossey-Bass.
Greene, J. C., Caracelli, V. J. and Graham, W. F. (1989) Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255-274.
Guba, E. G. (1990) The alternative paradigm dialog. In E. G. Guba (Ed.), The paradigm dialog (pp. 17-27) Newbury Park CA: Sage.
Hacking, I. (1983) Representing and intervening: Introductory topics in the philosophy of natural science. New York: Cambridge University Press.
Hammersley, M. (1992) Deconstructing the qualitative-quantitative divide. In J. Brannen (Ed.) Mixing methods: Qualitative and quantitative research (pp. 39-55) Aldershot: Avebury Press.
Henson, R. K., Capraro, R. M. and Capraro, M. M. (2004) Reporting practice and use of exploratory factor analysis in educational research journals: Errors and explanation. Research in the Schools, 11(2), 61-72.
Henson, R. K. and Roberts, J. K. (2006) Use of exploratory factor analysis in published research. Educational and Psychological Measurement, 66, 393-416.
Heron, J. and Reason, P. (1997) A participatory inquiry paradigm. Qualitative Inquiry, 3, 274-294.
Hetzel, R. D. (1996) A primer on factor analysis with comments on patterns of practice and reporting. In B. Thompson (Ed.), Advances in social science methodology (Vol. 4, pp. 175-206) Greenwich CT: JAI Press.
Houston, S. (2001) Beyond social constructionism: Critical realism and social work. British Journal of Social Work, 31, 845-861.
Jick, T. D. (1983) Mixing qualitative and quantitative methods: Triangulation in action. In J. Van Maanen (Ed.), Qualitative methodology (pp. 135-148) Thousand Oaks CA: Sage.
Johnson, R. B. (2006) Special issue on 'New directions in mixed methods research.' Research in the Schools, 13(1) Retrieved July 8 2008, from http://www.msera.org/rits_131.htm
Johnson, R. B. (2008) Editorial: Living with tensions. Journal of Mixed Methods Research, 2, 202-207.
Johnson, R. B. and Christensen, L. B. (2008) Educational research: Quantitative, qualitative and mixed approaches (3rd edn) Thousand Oaks CA: Sage.
Johnson, R. B. and Onwuegbuzie, A. J. (2004) Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.
Johnson, R. B., Onwuegbuzie, A. J. and Turner, L. A. (2007) Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1, 112-133.
Kant, I. (1781/1998) The Cambridge edition of the works of Immanuel Kant: Critique of pure reason. Cambridge UK: Cambridge University Press.
Kennedy, M. (1979) Generalizing from single case studies. Evaluation Quarterly, 3, 661-678.
Kidder, L. H. and Fine, M. (1987) Qualitative and quantitative methods: When stories converge. In M. M. Mark and R. L. Shotland (Eds.), Multiple methods in program evaluation (New Directions of Program Evaluation, No. 35, pp. 57-75) San Francisco: Jossey-Bass.
Kuhn, T. S. (1962) The structure of scientific revolutions. Chicago: University of Chicago Press.
Lather, P. (1986) Research as praxis. Harvard Educational Review, 56, 257-277.
Lather, P. (2006) Paradigm proliferation as a good thing to think with: Teaching research in education as a wild profusion. International Journal of Qualitative Studies in Education, 19(1), 35-57.
Layder, D. (1993) New strategies in social research: An introduction and guide. Cambridge UK: Polity Press.
LeCompte, M. D. and Preissle, J. (1993) Ethnography and qualitative design in educational research (2nd edn) San Diego CA: Academic Press.
Lee, Y-j. and Greene, J. C. (2007) The predictive validity of an ESL placement test: A mixed methods approach. Journal of Mixed Methods Research, 1, 366-389.
Leech, N. L. and Onwuegbuzie, A. J. (2007) An array of qualitative data analysis tools: A call for qualitative data analysis triangulation. School Psychology Quarterly, 22, 557-584.
Leech, N. L. and Onwuegbuzie, A. J. (2008) Qualitative data analysis: A compendium of techniques and a framework for selection for school psychology research and beyond. School Psychology Quarterly, 23, 587-604.
Leech, N. L. and Onwuegbuzie, A. J. (Eds) (2010) Special issue on 'Teaching mixed methods.' International Journal of Multiple Research Approaches, 4(1), in press.
Li, S., Marquart, J. M. and Zercher, C. (2000) Conceptual issues and analytical strategies in mixed-method studies of preschool inclusion. Journal of Early Intervention, 23, 116-132.
Lincoln, Y. S. and Guba, E. G. (1985) Naturalistic inquiry. Beverly Hills CA: Sage.
Louis, K. S. (1982) Sociologist as sleuth: Integrating methods in the RDU study. American Behavioral Scientist, 26(1), 101-120.
Madey, D. L. (1982) Some benefits of integrating qualitative and quantitative methods in program evaluation, with some illustrations. Educational Evaluation and Policy Analysis, 4, 223-236.
Mark, M. M. and Shotland, R. L. (1987) Alternative models for the use of multiple methods. In M. M. Mark and R. L. Shotland (Eds.), Multiple methods in program evaluation (New Directions of Program Evaluation, No. 35, pp. 95-100) San Francisco: Jossey-Bass.
Maxcy, S. J. (2003) Pragmatic threads in mixed methods research in the social sciences: The search for multiple modes of inquiry and the end of the philosophy of formalism. In A. Tashakkori and C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 51-89) Thousand Oaks CA: Sage.
Maxwell, J. A. (1996) Qualitative research design. Newbury Park CA: Sage.
Maxwell, J. A. (2004, April) Realism as a stance for mixed methods research. Paper presented at the annual meeting of the American Educational Research Association, San Diego CA.
Maxwell, J. A., Bashook, G. and Sandlow, C. J. (1986) Combining ethnographic and experimental methods in educational evaluation. In D. M. Fetterman and M. A. Pittman (Eds.), Educational evaluation: Ethnography in theory, practice and politics (pp. 121-143) Thousand Oaks CA: Sage.
Maxwell, J. A. and Loomis, D. M. (2003) Mixed methods design: An alternative approach. In A. Tashakkori and C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 241-272) Thousand Oaks CA: Sage.
McConney, A., Rudd, A. and Ayres, R. (2002) Getting to the bottom line: A method for synthesizing findings within mixed-method program evaluations. American Journal of Evaluation, 23, 121-140.
McEvoy, P. and Richards, D. (2003) Critical realism: A way forward for evaluation research in nursing? Journal of Advanced Nursing, 43, 411-420.
McEvoy, P. and Richards, D. (2006) A critical realist rationale for using a combination of quantitative and qualitative methods. Journal of Research in Nursing, 11, 66-78.
Mertens, D. (2003) Mixed methods and the politics of human research: The transformative-emancipatory perspective. In A. Tashakkori and C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 135-164) Thousand Oaks CA: Sage.
Mertens, D. (2008, July) Transformative mixed methods research. Workshop presented at the annual mixed methods conference. Cambridge UK.
Miles, M. B. and Huberman, A. M. (1994) Qualitative data analysis: An expanded sourcebook (2nd edn) Thousand Oaks CA: Sage.
Morrow, R. A. and Brown, D. D. (1994) Critical theory and methodology (Contemporary Social Theory) Thousand Oaks CA: Sage.
Morse, J. M. (1995) The significance of saturation. Qualitative Health Research, 5, 147-149.
Morse, J. M. (2003) Principles of mixed methods and multimethod research design. In A. Tashakkori and C. Teddlie (Eds), Handbook of mixed methods in social and behavioral research (pp. 189-208) Thousand Oaks CA: Sage.
O'Cathain, A. (2008, July) 'Any other comments?' The use of open questions in questionnaires. Workshop presented at the annual mixed methods conference. Cambridge UK.
O'Cathain, A. and Collins, K. M. T. (Eds) (2009) Special issue 'Mixed methods for novice researchers.' International Journal of Multiple Research Approaches, 3(1), 1-112.
Onwuegbuzie, A. J. (2002) Why can't we all get along? Towards a framework for unifying research paradigms. Education, 122, 518-530.
Onwuegbuzie, A. J. (2003) Effect sizes in qualitative research: A prolegomenon. Quality and Quantity: International Journal of Methodology, 37, 393-409.
Onwuegbuzie, A. J. (2007) Mixed methods research in sociology and beyond. In G. Ritzer (Ed.), Encyclopedia of Sociology (Vol. VI, pp. 2978- 2981) Oxford: Blackwell.
Onwuegbuzie, A. J. and Collins, K. M. T. (2008, July) A step-by-step guide to publishing mixed research articles. Workshop presented at the annual mixed methods conference. Cambridge UK.
Onwuegbuzie, A. J. and Combs, J. P. (2009) Toward a conceptual framework for mixing data analysis strategies. Manuscript submitted for publication.
Onwuegbuzie, A. J. and Daniel, L. G. (2003, February 12) Typology of analytical and interpretational errors in quantitative and qualitative educational research. Current Issues in Education [On-line], 6(2) Retrieved March 28 2009, from http://cie.ed.asu.edu/volume6/number2/
Onwuegbuzie, A. J. and Dickinson, W. B. (2008) Mixed methods analysis and information visualization: Graphical display for effective communication of research results. The Qualitative Report, 13, 204-225. Retrieved March 28 2009, from http://www.nova.edu/ssss/QR/QR13-2/onwuegbuzie.pdf
Onwuegbuzie, A. J. and Johnson, R. B. (2006) The validity issue in mixed research. Research in the Schools, 13(1), 48-63.
Onwuegbuzie, A. J. and Leech, N. L. (2004) Enhancing the interpretation of 'significant' findings: The role of mixed methods research. The Qualitative Report, 9, 770-792. Retrieved March 28 2009, from http://www.nova.edu/ssss/QR/QR9-4/onwuegbuzie.pdf
Onwuegbuzie, A. J. and Leech, N. L. (2005) On becoming a pragmatist researcher: The importance of combining quantitative and qualitative research methodologies. International Journal of Social Research Methodology: Theory and Practice, 8, 375-387.
Onwuegbuzie, A. J. and Leech, N. L. (2006) Linking research questions to mixed methods data analysis procedures. The Qualitative Report, 11, 474-498. Retrieved July 8 2008, from http://www.nova.edu/ssss/QR/QR11-3/onwuegbuzie.pdf
Onwuegbuzie, A. J., Slate, J. R., Leech, N. L. and Collins, K. M. T. (2007) Conducting mixed analyses: A general typology. International Journal of Multiple Research Approaches, 1, 4-17.
Onwuegbuzie, A. J., Slate, J. R., Leech, N. L. and Collins, K. M. T. (2008, March) Mixed data analysis techniques: A comprehensive step-by-step approach. Professional Development Training Course presented at the annual meeting of the American Educational Research Association, New York.
Onwuegbuzie, A. J., Slate, J. R., Leech, N. L. and Collins, K. M. T. (2009) Mixed data analysis: Innovative integration techniques. International Journal of Multiple Research Approaches, 3(1), 13-33.
Onwuegbuzie, A. J. and Teddlie, C. (2003) A framework for analyzing data in mixed methods research. In A. Tashakkori and C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351-383) Thousand Oaks CA: Sage.
Onwuegbuzie, A. J., Witcher, A. E., Collins, K. M. T., Filer, J. D., Wiedmaier, C. D. and Moore, C. W. (2007) Students' perceptions of characteristics of effective college teachers: A validity study of a teaching evaluation form using a mixed methods analysis. American Educational Research Journal, 44, 113-160.
Patton, M. Q. (2002) Qualitative research and evaluation methods. Thousand Oaks CA: Sage.
Phelan, P. (1987) Compatibility of qualitative and quantitative methods: Studying child sexual abuse in America. Education and Urban Society, 20(1), 35-41.
Phillips, D. C. and Burbules, N. C. (2000) Postpositivism and educational research. New York: Rowman & Littlefield.
Plano Clark, V. L. and Creswell, J. W. (2007) The mixed methods reader. Thousand Oaks CA: Sage.
Popper, K. (1994) In search of a better world: Lectures and essays from thirty years. London: Routledge.
Putnam, H. (2002) The collapse of the fact/value dichotomy and other essays. Cambridge MA: Harvard University Press.
Reichardt, C. S. and Cook, T. D. (1979) Beyond qualitative versus quantitative methods. In T. D. Cook and C. S. Reichardt (Eds.), Qualitative and quantitative methods in evaluation research (pp. 7- 32) Thousand Oaks CA: Sage.
Rescher, N. (2000) Realistic pragmatism: An introduction to pragmatic philosophy. Albany: State University of New York Press.
Ridenour, C. S. and Newman, I. (2008) Mixed methods research: Exploring the interactive continuum. Carbondale IL: Southern Illinois University Press.
Roberts, A. (2002) A principled complementarity of method: In defence of methodological eclecticism and the qualitative-quantitative debate. The Qualitative Report 7(3) Retrieved March 28 2009, from www.nova.edu/ssss/QR/QR7-3/roberts.html.
Rorty, R. (1991) Objectivity, relativism and truth: Philosophical papers (Vol. 1). Cambridge UK: Cambridge University Press.
Rossman, G. B. and Wilson, B. L. (1985) Numbers and words: Combining quantitative and qualitative methods in a single large-scale evaluation study. Evaluation Review, 9, 627-643.
Sandelowski, M. (2000) Combining qualitative and quantitative sampling, data collection and analysis techniques in mixed-method studies. Research in Nursing Health, 23, 246-255.
Sandelowski, M. (2001) Real qualitative researchers don't count: The use of numbers in qualitative research. Research in Nursing and Health, 24, 230-240.
Sechrest, L. and Sidani, S. (1995) Quantitative and qualitative methods: Is there an alternative? Evaluation and Program Planning, 18, 77-87.
Schumacker, R. E. and Lomax, R. G. (1996) A beginner's guide to structural equation modeling. Mahwah NJ: Lawrence Erlbaum Associates.
Smith, M. L. (1997) Mixing and matching: Methods and models. In J. C. Greene and V. J Caracelli (Eds), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation No. 74, pp. 73-85) San Francisco: Jossey-Bass.
Stake, R. E. (1980) The case study method in social enquiry. In H. Simons (Ed.), Towards a science of the singular (pp. 62-75) CARE Occasional Publications No. 10. Norwich, UK: Center for Applied Research in Education, University of East Anglia.
Stake, R. E. (2005) Qualitative case studies. In N. K. Denzin and Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd edn, pp. 443-466) Thousand Oaks CA: Sage.
Stake, R. E. and Trumbull, D. J. (1982) Naturalistic generalizations. Review Journal of Philosophy and Social Science, 7, 3-12.
Strauss, A. and Corbin, J. (1990) Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park CA: Sage.
Tashakkori, A. and Teddlie, C. (1998) Mixed methodology: Combining qualitative and quantitative approaches. Applied Social Research Methods Series (Vol. 46) Thousand Oaks CA: Sage.
Tashakkori, A. and Teddlie, C. (Eds) (2003) Handbook of mixed methods in social and behavioral research. Thousand Oaks CA: Sage.
Teddlie, C. and Johnson, R. B. (2009) Methodological thought since the 20th century. In C. Teddlie and A. Tashakkori (Eds), Foundations of mixed methods research: Integrating quantitative and qualitative techniques in the social and behavioral sciences (pp. 62-82) Thousand Oaks CA: Sage.
Teddlie, C. and Tashakkori, A. (2009) Foundations of mixed methods research: Integrating quantitative and qualitative techniques in the social and behavioral sciences. Thousand Oaks CA: Sage.
Teddlie, C., Tashakkori, A. and Johnson, R. B. (2008) Emergent techniques in the gathering and analysis of mixed methods data. In S. N. Hesse-Biber and P. Leavy (Eds), Handbook of emergent methods (pp. 389-414) New York: The Guilford Press.
Yin, R. K. (2009) Case study research: Design and methods (4th edn) Thousand Oaks CA: Sage.
Yu, C. H. (2003) Misconceived relationships between logical positivism and quantitative research. Research Methods Forum [On-line]. Retrieved September 2 2004, from http://www.aom.pace.edu/rmd/2002forum.html.
Received 3 September 2008 Accepted 7 May 2009
ANTHONY J ONWUEGBUZIE
Department of Educational Leadership & Counseling, Sam Houston State University, Huntsville
TX, USA
R BURKE JOHNSON
College of Education, University of South Alabama, Mobile AL, USA
KATHLEEN MT COLLINS
Department of Curriculum & Instruction, University of Arkansas at Fayetteville, Fayetteville AR, USA
Copyright eContent Management Pty Ltd Aug 2009