Exploratory and Confirmatory Factor Analysis

Factor analysis is a broad term for multivariate statistical methods used to identify common underlying variables, called factors, within a larger set of measures. Put simply, factor analysis determines which variables group or go together.

A factor is a group of related variables representing an underlying domain or theme. Factors are indicated by shared variance (covariance) among two or more variables. As inter-item correlational analysis reduces the variables to factors, the relationships among the variables emerge through the factors that represent them. The observed variables are then modeled as linear combinations of the factors.

Factor analysis empirically explores the interrelationships and dimensions among variables in order to cluster inter-correlated variables into smaller sets of basic factors. It both reduces the number of variables and classifies variables by exploring the underlying theoretical structure(s). The process simplifies the data and eliminates redundant, unclear, and irrelevant variables.

Factor analysis is commonly used in psychometric instrument development. For example, psychological questionnaires often aim to operationalize abstract psychological constructs, with multiple empirical indicators on the questionnaire measuring each construct (in order to enhance reliability and validity). Factor analytic techniques have been used in the development and validation of tens of thousands of psychological screening and measurement tests.

Definitions:

Observed variable: measured directly; a measured variable, an indicator, or a manifest variable.

Latent construct: measured indirectly by determining its influence on responses to observed variables; a factor, underlying construct, or unobserved variable.

Factor scores: estimates of underlying latent constructs.

Eigenvalues: the amount of variance explained by each factor.

Orthogonal: at a 90-degree angle; perpendicular.

Oblique: at an angle other than 90 degrees.

The main steps in factor analysis are assessment of the suitability of the data for factor analysis (meeting assumptions and assessing the correlation matrix), factor extraction, factor retention, and factor rotation and interpretation.

Assumptions of factor analysis:

All variables at least on an ordinal scale; nominal data are not appropriate for factor analysis.

Overall sample size of 150+ or a 20:1 ratio of cases to variables.

Random sampling; normal distribution.

Linear relations among the variables.

Few to no outliers.

Factors independent of one another (no correlation).

There should be at least 5 salient variables for each factor.

No measurement error.

Assessing the correlation matrix:

Correlations of r = .3 to .8. Initial communalities > .6. Bartlett's test of sphericity statistically significant at p < .05. Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy > .7. Together, these tests suggest that the matrix is factorable.
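
For readers working outside SPSS, a minimal sketch of these suitability checks in Python follows, assuming the factor_analyzer package; the file name items.csv and the DataFrame df are hypothetical stand-ins for a set of item responses.

```python
# Suitability checks for factor analysis: a sketch assuming the Python
# factor_analyzer package; "items.csv" is a hypothetical data file.
import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)

df = pd.read_csv("items.csv")  # one row per case, one column per item

# Bartlett's test of sphericity: a significant result (p < .05) indicates
# the correlation matrix is not an identity matrix, i.e., is factorable.
chi_square, p_value = calculate_bartlett_sphericity(df)

# Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy: values above
# roughly .6-.7 support factorability.
kmo_per_item, kmo_overall = calculate_kmo(df)

print(f"Bartlett chi-square = {chi_square:.2f}, p = {p_value:.4f}")
print(f"Overall KMO = {kmo_overall:.2f}")
```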

Factor extraction: this phase condenses items in the data matrix into a smaller number of factors. It is used to define the number of underlying dimensions. It results in an unrotated factor matrix, which contains coefficients or weights for all original items on each extracted factor. Two widely used factor extraction techniques are principal components analysis (PCA) and common factor analysis.

According to Costello and Osborne (2005), common factor analysis is preferable to principal components analysis (PCA), which is the "default" data reduction method in SPSS. Principal components analysis transforms the variables into smaller sets of linear combinations, utilizing all of the variance in the variables; this technique can produce inflated estimates of the variance in the components. Common factor analysis utilizes only the shared variance among the items. Each indicator is typically linked to only one factor, since cross-loadings suggest poorer construct validity.

Besides the "default" PCA, spss has six-factor extraction methods from which to choose: unweighted least squares, generalized least squares, maximum likelihood, principal axis factoring, alpha factoring, and image factoring. Costello and Osborne recommend the common factor analysis methods of maximum likelihood extraction (for normally distributed data) or principal axis factoring (for non-normal data), over principal components analysis.

Factor retention: a number of techniques are used to determine which factors to retain: Kaiser's criterion, the scree test, and parallel analysis. Kaiser's criterion retains all factors with eigenvalues greater than 1.0; this is considered one of the least accurate methods for deciding which factors to retain. The scree plot is a two-dimensional graph with factors on the x-axis and eigenvalues on the y-axis. The number of data points above the natural bend or "elbow" is retained. Generally, this graph yields good results if there is a clear "break" in the plot of eigenvalues at the "elbow." Costello and Osborne (2005) contend that the scree test is the best choice for researchers. Pallant (2007), on the other hand, states that parallel analysis is the best approach to deciding the number of factors. Parallel analysis compares the size of the eigenvalues with those obtained from a randomly generated data set of the same size. Only those factors whose eigenvalues exceed the corresponding random-data values are retained.
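
Parallel analysis is simple enough to sketch with NumPy alone, as below; X is a hypothetical cases-by-items array, and the mean random eigenvalue is used as the retention criterion (the 95th percentile is another common choice).

```python
# Parallel analysis sketch: retain factors whose observed eigenvalues
# exceed those of random data with the same number of cases and items.
import numpy as np

def parallel_analysis(X, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Eigenvalues of the observed correlation matrix, largest first.
    observed = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    random_eigs = np.empty((n_iter, p))
    for i in range(n_iter):
        R = rng.standard_normal((n, p))
        random_eigs[i] = np.linalg.eigvalsh(np.corrcoef(R, rowvar=False))[::-1]
    criterion = random_eigs.mean(axis=0)  # mean random eigenvalues
    return int(np.sum(observed > criterion)), observed, criterion

n_retain, observed, criterion = parallel_analysis(X)
print(f"Parallel analysis suggests retaining {n_retain} factor(s)")
```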

Factor rotation and interpretation: rotation refers to the shifting of the factors in the "factor space" to make the factor loadings as interpretable as possible. The goal of rotation is to simplify, clarify, and interpret the data by either orthogonal or oblique methods of rotation. Researchers typically evaluate rotated rather than unrotated factor loadings for the obvious reason that they are easier to interpret.

Depending on the rotation chosen, SPSS reports either a rotated factor matrix after orthogonal rotation (varimax) or a pattern matrix after oblique rotation (oblimin) (Costello & Osborne, 2005).

If the factors are truly independent and uncorrelated, orthogonal and oblique rotation methods produce essentially identical results and interpretations, and the orthogonal (varimax) method is preferred. If the factors are related, with some correlations, the oblique (oblimin) method is preferred. Pallant (2007) suggests that the researcher start with oblimin rotation, as it provides information about the degree of correlation between the factors.
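
The sketch below contrasts the two rotation families in the assumed factor_analyzer package; correlating the factor scores from the oblique solution provides the degree-of-correlation information Pallant recommends checking first.

```python
# Orthogonal vs. oblique rotation: a sketch with the assumed
# factor_analyzer package; df is the hypothetical item DataFrame.
import numpy as np
from factor_analyzer import FactorAnalyzer

fa_orth = FactorAnalyzer(n_factors=2, rotation="varimax")  # orthogonal
fa_orth.fit(df)
print(fa_orth.loadings_)  # rotated factor matrix

fa_obl = FactorAnalyzer(n_factors=2, rotation="oblimin")   # oblique
fa_obl.fit(df)
print(fa_obl.loadings_)   # pattern matrix

# Correlations among the estimated factor scores; values near zero
# suggest the simpler varimax solution would suffice.
scores = fa_obl.transform(df)
print(np.corrcoef(scores, rowvar=False))
```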

Exploratory factor analysis versus confirmatory factor analysis

Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are two statistical approaches used to examine the internal reliability of a measure.

Exploratory factor analysis

Exploratory factor analysis (EFA) explores and summarizes the underlying correlational structure of a data set. It is used in the early stages of research to explore interrelationships among variable sets, that is, to find the model. EFA is complex, has few absolute guidelines, and offers many application choices. This factor analysis approach is used when the underlying structure of a data set is unknown, to determine which domains comprise a construct of interest. In EFA, the researcher explores how many factors there are, whether the factors are correlated, and which observed variables appear to best measure each factor. EFA can reduce a large set of variables to a few underlying factors.

Exploratory factor analysis is used to discover the factor structure of a measure and to examine its internal reliability. EFA is recommended when researchers have no preconceived hypotheses or prior theory about the nature of the underlying factor structure of their measure. As such, it is an inductive approach that uses the factor loadings to uncover the factor structure of the data. Since EFA is exploratory in scope, there are no inferential statistical processes. EFA is not appropriate for testing hypotheses or theories, but only for clarifying and describing relationships. EFA is subject to error and wide variation in subjective interpretation, even with optimal data and large samples. The method is as much an "art" as it is a "science."

Exploratory factor analysis has three basic decision points: (1) deciding the number of factors, (2) choosing an extraction method, and (3) choosing a rotation method. EFA then hypothesizes the underlying construct, the latent structure (dimensions) of a set of variables.

Confirmatory factor analysis

Confirmatory factor analysis (CFA) is a set of more complex and sophisticated statistical techniques used later in the research process to confirm the hypotheses or theories concerning the underlying structure generated by EFA. It is a hypothesis-testing approach, used to test the model. Confirmatory factor analysis tests the correlational structure of a data set against the hypothesized structure and rates the "goodness of fit." CFA tests hypotheses that state the number of factors representing the data and the items comprising each factor. In CFA, the researcher specifies a certain number of factors, which factors are correlated, and which observed variables measure each factor.

CFA seeks to determine whether the number of factors conforms to what is expected on the basis of pre-established theory. Indicator variables are selected on the basis of prior theory, and factor analysis is used to see if they load as predicted on the expected number of factors. A minimum requirement of confirmatory factor analysis is that the researcher hypothesize beforehand the number of factors in the model. He or she should also posit expectations about which variables will load on which factors. The researcher seeks to determine, for instance, whether measures created to represent a latent construct really belong together.
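
A minimal CFA sketch follows, again assuming the Python factor_analyzer package rather than the SEM software a study would typically use; the factor names and item columns are hypothetical.

```python
# CFA sketch: the researcher specifies beforehand which observed items
# load on which factor, then tests that structure against the data.
from factor_analyzer import (ConfirmatoryFactorAnalyzer,
                             ModelSpecificationParser)

model_dict = {  # hypothetical factors and item columns
    "factor1": ["item1", "item2", "item3"],
    "factor2": ["item4", "item5", "item6"],
}
model_spec = ModelSpecificationParser.parse_model_specification_from_dict(
    df, model_dict)

cfa = ConfirmatoryFactorAnalyzer(model_spec, disp=False)
cfa.fit(df.values)

# Loadings estimated under the hypothesized structure; each item is
# constrained to load only on its pre-assigned factor.
print(cfa.loadings_)
```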

Statistics:

Exploratory factor analysis uses a correlation matrix to see whether any variables are components of factors. Shared variance indicates an underlying factor. The eigenvalue shows the amount of variance explained (a type of effect size): factors with larger eigenvalues account for more variance than factors with smaller eigenvalues. As discussed earlier, inferential statistics should not be used in EFA. The statistical problem is not one of testing a given hypothesis, but rather one of fitting the model to the data to decide where the common variances are. The dimensionality of a set of items emerges empirically.
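
As a small numerical illustration of the eigenvalue-as-variance idea, the eigenvalues of an item correlation matrix sum to the number of items, so each can be read as a share of total variance (X is the hypothetical cases-by-items array from the parallel-analysis sketch).

```python
# Eigenvalues as variance explained: a NumPy-only sketch.
import numpy as np

corr = np.corrcoef(X, rowvar=False)           # item correlation matrix
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # descending order

print(eigenvalues)                     # these sum to the number of items
print(eigenvalues / len(eigenvalues))  # proportion of variance per factor
```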

Confirmatory factor analysis uses a variance-covariance matrix to test hypotheses with inferential statistical techniques, using an advanced class of methods referred to as structural equation modeling (SEM). In SEM, the most frequently used estimation procedure is maximum likelihood estimation. CFA tests a measurement model by testing correlations (between observed and latent variables, between pairs of latent variables, and among the errors). The researcher can compare the estimated matrices representing the relationships between variables in the model to the actual matrices. The researcher specifies a hypothesis by designating certain parameters in the factor matrices; the hypothesis is confirmed to the extent that the model still fits.

Researcher information to report

When a researcher reports the results from a factor analysis, he or she needs to include the following information: verification of assumptions; details of the method of factor extraction used; criteria used to determine the number of factors retained; type of rotation technique used; total variance explained; initial eigenvalues; eigenvalues after rotation; and a table of loadings showing all values.

An example follows.

Presenting the results from the factor analysis (Pallant, 2007, pp. 197-198)

The 20 items of the Positive and Negative Affect Scale (PANAS) were subjected to principal components analysis (PCA) using SPSS version 15. Prior to performing PCA, the suitability of the data for factor analysis was assessed. Inspection of the correlation matrix revealed the presence of many coefficients of .3 and above. The Kaiser-Meyer-Olkin value was .87, exceeding the recommended value of .6 (Kaiser 1970, 1974), and Bartlett's test of sphericity (Bartlett 1954) reached statistical significance, supporting the factorability of the correlation matrix.

Principal components analysis revealed the presence of four components with eigenvalues exceeding 1, explaining 31.2%, 17%, 6.1%, and 5.8% of the variance respectively. An inspection of the scree plot revealed a clear break after the second component. Using Cattell's (1966) scree test, it was decided to retain two components for further investigation. This was further supported by the results of a parallel analysis, which showed only two components with eigenvalues exceeding the corresponding criterion values for a randomly generated data matrix of the same size (20 variables x 435 respondents).

The two-component solution explained a total of 48.2% of the variance, with component 1 contributing 31.25% and component 2 contributing 17.0%. To aid in the interpretation of these two components, oblimin rotation was performed. The rotated solution revealed the presence of simple structure (Thurstone 1947), with both components showing a number of strong loadings and all variables loading substantially on only one component. The interpretation of the two components was consistent with previous research on the PANAS, with positive affect items loading strongly on component 1 and negative affect items loading strongly on component 2. There was a weak negative correlation between the two factors (r = -.28). The results of this analysis support the use of the positive affect items and the negative affect items as separate scales, as suggested by the scale authors (Watson, Clark & Tellegen 1988).
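
The quantities reported above (variance explained per component, the loadings table, and the factor correlation) could be obtained along the following lines with the assumed factor_analyzer package; df here stands in for the PANAS item responses, which are not reproduced.

```python
# Reporting sketch: variance explained and rotated loadings after an
# oblimin rotation, assuming the factor_analyzer package.
from factor_analyzer import FactorAnalyzer

fa = FactorAnalyzer(n_factors=2, rotation="oblimin")
fa.fit(df)

variance, proportion, cumulative = fa.get_factor_variance()
print(proportion)    # proportion of variance per component
print(cumulative)    # cumulative variance explained
print(fa.loadings_)  # full loadings table for reporting
```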

References

Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation, 10(7), 1-9.

O'Brien, K. (2007). Factor analysis: An overview in the field of measurement. Physiotherapy Canada, 59, 142-155.

Pallant, J. (2007). SPSS survival manual. New York: McGraw-Hill Education.

Polit, D. F., & Beck, C. T. (2008). Nursing research: Generating and assessing evidence for nursing practice (8th ed.). Philadelphia: Wolters Kluwer Health.
