
Tuesday, October 14, 2014

Differentiating Cognitive Abilities: A Factor Analysis of JCCES and GAMA Subtests

Abstract

This study aimed to investigate the differentiation between cognitive abilities assessed by the Jouve Cerebrals Crystallized Educational Scale (JCCES) and General Ability Measure for Adults (GAMA). A sample of 63 participants completed both JCCES and GAMA subtests. Pearson correlation and factor analysis were used to analyze the data. The results revealed significant positive correlations between most of the JCCES subtests, while correlations between GAMA and JCCES subtests were generally lower. Factor analysis extracted two distinct factors, with JCCES subtests loading on one factor and GAMA subtests loading on the other. The findings supported the hypothesis that JCCES and GAMA measure distinct cognitive abilities, with JCCES assessing crystallized abilities and GAMA evaluating nonverbal and figurative aspects of general cognitive abilities. This differentiation has important implications for the interpretation of JCCES and GAMA scores and their application in educational, clinical, and research settings.

Keywords: cognitive abilities, JCCES, GAMA, factor analysis, crystallized intelligence, nonverbal cognitive abilities

Introduction

The field of psychometrics has advanced significantly over the years, with numerous theories and instruments developed to assess various aspects of human cognitive abilities (Embretson & Reise, 2000). Among these, crystallized and fluid intelligence have been widely acknowledged as two essential dimensions of cognitive functioning (Cattell, 1987; Horn & Cattell, 1966). Crystallized intelligence refers to the acquired knowledge and skills gained through education and experience, while fluid intelligence involves the capacity for abstract reasoning, problem-solving, and adapting to novel situations (Cattell, 1987).

Instruments designed to measure these cognitive abilities often target specific domains, such as the Jouve Cerebrals Crystallized Educational Scale (JCCES) for crystallized intelligence (Jouve, 2010a) and the General Ability Measure for Adults (GAMA) for nonverbal, figurative aspects of general cognitive abilities (Naglieri & Bardos, 1997). However, the relationship between these instruments and the cognitive domains they assess remains an area of ongoing research.

The present study aims to investigate the relationship between the JCCES and GAMA subtest scores to determine whether these instruments measure distinct cognitive abilities. In particular, the research hypothesis posits that the JCCES and GAMA subtests will load on separate factors in factor analysis, indicating that they assess different aspects of cognitive functioning. This hypothesis is grounded in previous literature on the differentiation of crystallized and fluid intelligence (Cattell, 1987; Horn & Cattell, 1966) and the design of the JCCES and GAMA instruments (Jouve, 2010a, 2010b, 2010c; Naglieri & Bardos, 1997).

To test the research hypothesis, the study employs Pearson correlation and principal factor analysis with Varimax rotation. These methods are widely used in psychometrics to explore the underlying structure of datasets and identify latent factors that explain shared variance among variables (Fabrigar et al., 1999; Stevens, 2009). Additionally, the Kaiser-Meyer-Olkin (KMO) measure and Cronbach's alpha are computed to assess the sampling adequacy and internal consistency of the factors, respectively (Field, 2009).

The investigation of the relationship between the JCCES and GAMA subtest scores has practical implications for the assessment of cognitive abilities in various settings, including educational, clinical, and research contexts. By understanding the distinct cognitive domains assessed by these instruments, practitioners can make better-informed decisions about their use and interpretation, leading to more accurate and comprehensive evaluations of an individual's cognitive profile.

Method

Research Design

The study employed a correlational research design to investigate the relationship between cognitive abilities as assessed by the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the General Ability Measure for Adults (GAMA). The correlational design was chosen to identify patterns of association between the two sets of cognitive measures without manipulating any variables (Creswell, 2014).

Participants

A total of 63 participants were recruited for the study. Demographic information regarding age, gender, and ethnicity was collected but not used in this study. The participants were selected based on their willingness to participate and their ability to complete the JCCES and GAMA assessments. No exclusion criteria were set.

Materials

The JCCES is a measure of crystallized cognitive abilities, which reflect an individual's acquired knowledge and skills (Cattell, 1971). It consists of three subtests: Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK).

The GAMA is a standardized measure of nonverbal and figurative general cognitive abilities (Naglieri & Bardos, 1997). It consists of four subtests: Matching (MAT), Analogies (ANA), Sequences (SEQ), and Construction (CON).

Procedures

Data collection was conducted in a quiet and well-lit testing environment. Participants first completed the JCCES, followed by the GAMA. Standardized instructions were provided to ensure that participants understood the requirements of each subtest. The JCCES and GAMA were administered according to their respective guidelines. 

Statistical Analyses

Data were analyzed using Excel. Descriptive statistics were computed for the JCCES and GAMA subtest scores. Pearson correlations were calculated to examine the relationships between the JCCES and GAMA subtests. Principal factor analysis with Varimax rotation was conducted to explore the underlying structure of the dataset and identify latent factors that could explain the shared variance among the subtests. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Cronbach's alpha were calculated to assess the quality of the factor analysis results (Cronbach, 1951).
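
Although the original analyses were run in Excel, the same pipeline is straightforward to reproduce elsewhere. The Python sketch below is a minimal illustration, assuming the open-source factor_analyzer package and a hypothetical file jcces_gama_scores.csv with one column per subtest; it mirrors the steps described above rather than the original workflow.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

# Hypothetical input: one row per participant, one column per subtest
# (VA, MP, GK, MAT, ANA, SEQ, CON).
df = pd.read_csv("jcces_gama_scores.csv")

corr = df.corr(method="pearson")             # Pearson correlation matrix
kmo_per_item, kmo_total = calculate_kmo(df)  # sampling adequacy

# Principal factor extraction with Varimax rotation, two factors retained.
fa = FactorAnalyzer(n_factors=2, rotation="varimax", method="principal")
fa.fit(df)

loadings = pd.DataFrame(fa.loadings_, index=df.columns, columns=["D1", "D2"])
print(loadings.round(3))
print(fa.get_factor_variance())  # variance, proportion, cumulative per factor
```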

Results

The research hypotheses were tested using Pearson correlation and principal factor analysis with Varimax rotation. The initial communalities were computed using squared multiple correlations, and the analysis was stopped based on convergence criteria (0.0001) and a maximum of 50 iterations. The Kaiser-Meyer-Olkin (KMO) measure was used to assess the sampling adequacy, and Cronbach's alpha was computed to determine the internal consistency of the factors.
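
As a brief illustration of that starting step, the squared multiple correlation (SMC) of each subtest with the remaining ones can be read directly off the inverse of the correlation matrix. The numpy sketch below assumes a full-rank correlation matrix R:

```python
import numpy as np

def initial_communalities(R: np.ndarray) -> np.ndarray:
    """Squared multiple correlations used as starting communalities:
    SMC_i = 1 - 1 / (R^-1)_ii, where R is the correlation matrix."""
    R_inv = np.linalg.inv(R)
    return 1.0 - 1.0 / np.diag(R_inv)
```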

Descriptive Statistics and Correlations

The sample consisted of 63 participants, with no missing data for the Jouve Cerebrals Crystallized Educational Scale (JCCES) and General Ability Measure for Adults (GAMA) subtest scores. The Pearson correlation matrix revealed significant positive correlations between most of the subtests.

In this study, the strongest correlations were observed between the JCCES subtests: Verbal Analogies (VA) and General Knowledge (GK) had a correlation of 0.712, indicating a strong positive relationship between these measures of crystallized abilities. Similarly, the VA and Mathematical Problems (MP) subtests were positively correlated (r = 0.542), suggesting a moderate relationship between these variables (Stevens, 2009). The MP and GK subtests also had a moderate positive correlation of 0.590.

Correlations between GAMA subtests and JCCES subtests were generally lower, with the highest correlation observed between MP and the GAMA Matching (MAT) subtest (r = 0.427). This suggests a moderate positive relationship between the nonverbal cognitive abilities assessed by GAMA and the crystallized mathematical abilities assessed by the JCCES MP subtest. The correlations between GAMA Analogies (ANA) and JCCES subtests were weak to moderate, ranging from 0.141 (ANA-GK) to 0.298 (ANA-VA). The GAMA Sequences (SEQ) subtest had weak correlations with JCCES subtests, ranging from 0.076 (SEQ-VA) to 0.391 (SEQ-MP). Lastly, the GAMA Construction (CON) subtest had weak to moderate correlations with JCCES subtests, ranging from 0.169 (CON-GK) to 0.452 (CON-MP).

Factor Analysis

The factor analysis aimed to explore the underlying structure of the dataset and to identify the latent factors that could explain the shared variance among the JCCES and GAMA subtests. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was computed to ensure that the sample size was suitable for conducting factor analysis. A KMO value of 0.695 was obtained, which is considered adequate for factor analysis, as it is above the commonly accepted threshold of 0.6.

Two factors were extracted from the data based on their eigenvalues, which represent the total variance explained by each factor. Factor 1 (F1) had an eigenvalue of 2.904 and accounted for 41.482% of the variance, while Factor 2 (F2) had an eigenvalue of 1.331 and accounted for 19.016% of the variance. The cumulative explained variance by both factors was 60.498%, indicating that a substantial proportion of the total variance in the dataset was explained by these two factors.
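
Since each of the seven standardized subtests contributes one unit of variance to the analysis, a factor's share of variance is its eigenvalue divided by seven; the reported percentages check out up to rounding of the eigenvalues:

λ₁ / 7 = 2.904 / 7 ≈ 0.415 (41.5%; reported 41.482%)
λ₂ / 7 = 1.331 / 7 ≈ 0.190 (19.0%; reported 19.016%)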

To better interpret the factors, Varimax rotation was applied to achieve a simpler factor structure by maximizing the variance of factor loadings within each factor. The rotation resulted in two factors, denoted as D1 and D2, which accounted for 32.256% and 28.242% of the variance, respectively.

The rotated factor pattern demonstrated the relationships between the original subtests and the rotated factors. The GAMA subtests, including Analogies (ANA), Sequences (SEQ), and Construction (CON), had high factor loadings on D1 (0.685, 0.911, and 0.841, respectively). This indicates that these subtests share a common underlying factor, which is represented by D1.

In contrast, the JCCES subtests, including Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK), had high factor loadings on D2 (0.796, 0.687, and 0.845, respectively). This suggests that these subtests also share a common underlying factor, which is represented by D2.

Internal Consistency

The internal consistency of the factors was assessed using Cronbach's alpha. The results showed that both factors had good internal consistency, with D1 having a Cronbach's alpha of 0.862 and D2 having a Cronbach's alpha of 0.762.
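
Coefficient alpha (Cronbach, 1951) can be computed directly from the raw scores of the subtests grouped under each factor. A minimal sketch, assuming a participants-by-subtests score matrix for one factor:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = participants, columns = the subtests loading on one
    factor. alpha = k/(k-1) * (1 - sum of item variances / variance of sums)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # per-subtest variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)
```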

Interpretation and Significance

The factor analysis results provided strong evidence for the research hypothesis that the JCCES and GAMA measure distinct cognitive abilities. The separate cognitive domains represented by the two factors were clearly differentiated by the respective loadings of the JCCES and GAMA subtests.

Factor D1 was primarily associated with the GAMA subtests, which include Matching (MAT), Analogies (ANA), Sequences (SEQ), and Construction (CON). These subtests focus on nonverbal and figurative aspects of general cognitive abilities, capturing skills such as pattern recognition, abstract reasoning, and visual-spatial problem-solving. The high loadings of the GAMA subtests on factor D1 (ANA = 0.685, SEQ = 0.911, CON = 0.841) indicate that this factor reflects the underlying construct of nonverbal and figurative general cognitive abilities, as assessed by the GAMA.

Factor D2, on the other hand, was predominantly associated with the JCCES subtests, which include Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK). These subtests are designed to measure crystallized abilities, reflecting the accumulated knowledge and skills acquired through education and experience. The high loadings of the JCCES subtests on factor D2 (VA = 0.796, MP = 0.687, GK = 0.845) suggest that this factor represents the underlying construct of crystallized cognitive abilities, as measured by the JCCES.

The distinct loadings of the JCCES and GAMA subtests on separate factors highlight the differences in the cognitive abilities assessed by these instruments. The JCCES primarily focuses on crystallized abilities, capturing an individual's acquired knowledge and skills, whereas the GAMA assesses nonverbal and figurative aspects of general cognitive abilities, tapping into more abstract and fluid cognitive processes. This differentiation between the two instruments supports the research hypothesis and emphasizes the unique contributions of each instrument in evaluating cognitive functioning.

The significant differences between the cognitive domains represented by the two factors have important implications for the interpretation of the JCCES and GAMA scores. These findings suggest that the JCCES and GAMA should be considered complementary tools in assessing an individual's cognitive abilities, as they provide unique insights into different aspects of cognitive functioning. Using both instruments together can offer a more comprehensive understanding of an individual's cognitive profile, facilitating better-informed decisions in educational, clinical, and research settings.

Limitations

There are some limitations to the study that should be considered. First, the sample size was relatively small (N = 63), which may limit the generalizability of the findings. Second, although demographic information was collected, it was not analyzed in this study, making it difficult to assess whether the sample was representative of the larger population.

Discussion

Interpretation of the Results and Comparison with Previous Research

The results of this study provide strong support for the research hypothesis that the Jouve Cerebrals Crystallized Educational Scale (JCCES) and General Ability Measure for Adults (GAMA) assess distinct cognitive abilities. The factor analysis revealed two separate factors, with JCCES subtests loading on one factor (D2) and GAMA subtests loading on another factor (D1). This finding is consistent with the theoretical distinction between crystallized and fluid cognitive abilities, as proposed by Cattell (1971) and supported by subsequent research (e.g., Carroll, 1993; Horn & Cattell, 1966).

The observed differentiation between the JCCES and GAMA is consistent with previous research demonstrating that crystallized abilities are more closely related to acquired knowledge and skills, while fluid abilities are more associated with abstract reasoning, pattern recognition, and visual-spatial problem-solving (Cattell, 1971; Horn & Cattell, 1966). This distinction is important, as it highlights the unique contributions of each instrument in evaluating cognitive functioning.

Implications for Theory, Practice, and Future Research

The findings of this study have important implications for both theory and practice. The clear differentiation between the JCCES and GAMA supports the notion that crystallized and fluid cognitive abilities are distinct constructs, which can be measured separately using appropriate assessment tools. This distinction has practical implications for educational, clinical, and research settings, where a comprehensive understanding of an individual's cognitive profile is essential for informed decision-making.

For example, in educational settings, the use of both the JCCES and GAMA can provide valuable information about a student's cognitive strengths and weaknesses, facilitating targeted interventions to support learning and development. In clinical settings, the combined use of these instruments can help clinicians identify cognitive impairments associated with various neurological and psychiatric conditions and inform treatment planning.

Future research could extend the current study by examining the relationship between the JCCES and GAMA and other cognitive measures, further exploring the distinctiveness and convergent validity of these instruments. Additionally, research could investigate the potential impact of demographic factors, such as age, education, and cultural background, on performance on the JCCES and GAMA subtests, enhancing our understanding of the factors that may influence the assessment of cognitive abilities.

Limitations

Despite the significant findings of this study, several limitations should be acknowledged. First, the relatively small sample size (N = 63) may limit the generalizability of the findings. Future research with larger, more diverse samples is needed to confirm the observed differentiation between the JCCES and GAMA.

Second, because demographic data were not analyzed, potential demographic influences on the observed relationships between the JCCES and GAMA subtests could not be assessed. Future research should incorporate demographic information into the analyses to explore potential differences in cognitive abilities based on factors such as age, education, and cultural background.

Directions for Future Research

Future research could build on the findings of this study by exploring the relationships between the JCCES, GAMA, and other cognitive measures to further investigate the distinctiveness and convergent validity of these instruments. Moreover, researchers could examine the potential impact of demographic factors, such as age, education level, and cultural background, on performance in the JCCES and GAMA subtests. This would provide valuable insights into the factors that may influence the assessment of cognitive abilities and contribute to a more comprehensive understanding of the constructs measured by these instruments.

Additionally, future research could investigate the predictive validity of the JCCES and GAMA in various applied settings, such as academic performance, vocational success, or clinical outcomes. This would help determine the practical utility of these instruments in making informed decisions across a range of contexts.

It would also be beneficial to examine the potential moderating role of factors such as motivation, test-taking strategies, or test anxiety on the relationship between the JCCES and GAMA subtests. This could provide valuable information regarding the potential influence of non-cognitive factors on cognitive assessment outcomes.

Longitudinal studies could be conducted to explore the developmental trajectories of crystallized and fluid cognitive abilities as assessed by the JCCES and GAMA, as well as the potential factors that may influence these trajectories, such as educational experiences or cognitive interventions. Such studies would contribute to a deeper understanding of the development and change of cognitive abilities over time.

Finally, future research could explore the potential benefits of integrating the JCCES and GAMA into comprehensive cognitive assessment batteries, alongside other cognitive measures assessing additional domains (e.g., working memory, processing speed, or executive functioning). This would help determine the optimal combination of measures for assessing an individual's cognitive profile in a comprehensive and efficient manner.

Conclusion

The present study demonstrated that the JCCES and GAMA assess distinct cognitive abilities, with the JCCES primarily measuring crystallized abilities and the GAMA focusing on nonverbal and figurative aspects of general cognitive abilities. The strong positive correlations among the JCCES subtests, contrasted with the generally weaker correlations between the GAMA and JCCES subtests, support this differentiation. Factor analysis further substantiated the distinction between the two instruments, revealing two separate factors, each associated with either the JCCES or GAMA subtests.

These findings have important implications for the broader field of cognitive assessment, suggesting that the JCCES and GAMA should be employed as complementary tools to obtain a comprehensive understanding of an individual's cognitive profile. This comprehensive approach can better inform decisions in educational, clinical, and research settings. However, the study's small sample size and lack of demographic information limit the generalizability of the results.

Future research should focus on replicating these findings in larger, more diverse samples and exploring the potential utility of combining the JCCES and GAMA to predict various cognitive and academic outcomes. Additionally, the research could investigate the relationship between these cognitive abilities and other relevant factors, such as socioeconomic background or educational attainment. Overall, this study highlights the importance of considering both crystallized and nonverbal cognitive abilities in cognitive assessment and emphasizes the unique contributions of the JCCES and GAMA in evaluating cognitive functioning.

References

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York, NY: Cambridge University Press. https://doi.org/10.1017/CBO9780511571312

Cattell, R. B. (1971). Abilities: Their structure, growth, and action. Boston, MA: Houghton Mifflin.

Cattell, R. B. (1987). Intelligence: Its structure, growth and action. New York: North-Holland.

Creswell, J. W. (2014). Research design: Qualitative, quantitative and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage.

Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297-334. https://doi.org/10.1007/BF02310555

Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates. https://doi.org/10.4324/9781410605269

Fabrigar, L. R., Wegener, D. T., MacCallum, R. C., & Strahan, E. J. (1999). Evaluating the use of exploratory factor analysis in psychological research. Psychological Methods, 4(3), 272-299. https://doi.org/10.1037/1082-989X.4.3.272

Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London, UK: Sage Publications.

Horn, J. L., & Cattell, R. B. (1966). Refinement and test of the theory of fluid and crystallized general intelligences. Journal of Educational Psychology, 57(5), 253-270. https://doi.org/10.1037/h0023816

Jouve, X. (2010a). Jouve Cerebrals Crystallized Educational Scale. Retrieved from http://www.cogn-iq.org/tests/jouve-cerebrals-crystallized-educational-scale-jcces

Jouve, X. (2010b). Investigating the Relationship Between JCCES and RIAS Verbal Scale: A Principal Component Analysis Approach. Retrieved from https://cogniqblog.blogspot.com/2010/02/on-relationship-between-jcces-and.html

Jouve, X. (2010c). Relationship between Jouve Cerebrals Crystallized Educational Scale (JCCES) Crystallized Educational Index (CEI) and Cognitive and Academic Measures. Retrieved from https://cogniqblog.blogspot.com/2010/02/correlations-between-jcces-and-other.html

Naglieri, J. A., & Bardos, A. N. (1997). General Ability Measure for Adults (GAMA). Minneapolis, MN: National Computer Systems.

Stevens, J. P. (2009). Applied multivariate statistics for the social sciences (5th ed.). New York, NY: Routledge.

Thursday, February 4, 2010

Investigating the Relationship Between JCCES and RIAS Verbal Scale: A Principal Component Analysis Approach

Abstract


This study examined the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment Scales (RIAS) Verbal Scale using Principal Component Analysis (PCA). The PCA revealed a strong relationship between the JCCES and the RIAS Verbal Scale, supporting the hypothesis of a common underlying construct representing general verbal and crystallized intelligence. Additionally, mathematical problem-solving was found to be a distinct construct from general verbal and crystallized intelligence. Despite some limitations, this study provides empirical support for the relationship between crystallized intelligence and verbal abilities, as well as the distinction between mathematical and verbal abilities, which can inform educational interventions and assessments.


Keywords: Jouve Cerebrals Crystallized Educational Scale, Reynolds Intellectual Assessment Scales, Verbal Scale, Principal Component Analysis, crystallized intelligence, mathematical problem-solving


Introduction


Psychometrics, the science of measuring psychological attributes, has a long history of developing and refining theories and instruments to assess cognitive abilities (Cattell, 1963; Carroll, 1993). The present study focuses on the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment Scales (RIAS) Verbal Scale (Reynolds & Kamphaus, 2003), two psychometric instruments designed to assess crystallized intelligence and verbal abilities, respectively. Crystallized intelligence, first proposed by Cattell (1963), refers to the ability to access and utilize accumulated knowledge and experience, which is closely related to verbal abilities (Ackerman, 1996; Kaufman & Lichtenberger, 2006). Theories of cognitive abilities, such as those proposed by Cattell (1971), Horn and Cattell (1966), and Carroll (1993), have suggested that crystallized intelligence and verbal abilities share a common underlying construct.


Previous research has supported the relationship between crystallized intelligence and verbal abilities (Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993), as well as the distinction between mathematical and verbal abilities (Deary et al., 2007). However, few studies have specifically examined the relationship between the JCCES and RIAS Verbal Scale. The present study aims to address this gap by investigating the relationship between these two instruments using principal component analysis (PCA), a statistical technique commonly employed in psychometrics to reduce data complexity and identify underlying constructs (Jolliffe, 1986).


The research question guiding this study is: What is the relationship between the JCCES and RIAS Verbal Scale, as assessed by PCA? To answer this question, the study will test the hypothesis that there is a strong relationship between the JCCES and RIAS Verbal Scale, as indicated by high factor loadings on a common underlying construct, which may represent general verbal and crystallized intelligence. Additionally, the study will explore the relationship between mathematical problem-solving and the other variables, given the distinction between mathematical and verbal abilities noted in previous research (Deary et al., 2007).


This study builds on the existing literature by providing a more detailed examination of the relationship between the JCCES and RIAS Verbal Scale, which has implications for both theory and practice. Understanding the relationship between these two instruments can inform the development of educational interventions and assessments tailored to the specific needs of learners with different cognitive profiles (Kaufman & Lichtenberger, 2006; McGrew & Flanagan, 1998). Furthermore, the findings contribute to the understanding of the structure of cognitive abilities, particularly the relationship between crystallized intelligence and verbal abilities (Ackerman, 1996). This may be useful for refining theoretical models of cognitive abilities and guiding future research in this area (Carroll, 2003; Deary et al., 2007).


Method


Research Design


The current study employed a correlational research design to investigate the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment Scales (RIAS) Verbal Scale. This design allowed for the examination of associations between the variables without manipulating or controlling any of the measures (Creswell, 2009). A correlational design was chosen because it is well-suited for studying the relationships among naturally occurring variables, such as crystallized intelligence and verbal abilities (Campbell & Stanley, 1963; Kerlinger, 2000).


Participants


A total of 125 participants were recruited for this study: 81 males (64.8%) and 44 females (35.2%). The participants' mean age was 33.82 years (SD = 12.56). In terms of education, 79.83% of the participants held at least a college degree. Participants were recruited using convenience sampling methods, such as posting advertisements on social media and online forums.


Materials


The JCCES is a measure of crystallized intelligence, consisting of three subtests: Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK). The RIAS Verbal Scale (Reynolds & Kamphaus, 2003) is a measure of verbal intelligence, consisting of two subtests: Guess What? (GWH) and Verbal Reasoning (VRZ). Both the JCCES and RIAS have been validated in previous research and have demonstrated strong psychometric properties.


Procedure


Participants were provided with informed consent forms. The tasks were presented in a fixed order, starting with the JCCES (VA, MP, and GK) followed by the RIAS Verbal Scale (GWH and VRZ). Instructions for each task were provided before the commencement of each subtest. Participants were given unlimited time to complete the tasks. Upon completion of the tasks, participants were debriefed and thanked for their participation.


Statistical Analysis


Data were analyzed using Excel. Descriptive statistics were calculated for the demographic variables, and a Principal Component Analysis (PCA) was conducted to examine the relationships among the JCCES and RIAS Verbal Scale subtests (Jolliffe, 1986). The PCA included Bartlett's sphericity test to assess the suitability of the data for PCA (Bartlett, 1954) and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy to evaluate the adequacy of the sample size for each variable (Kaiser, 1974; Hutcheson & Sofroniou, 1999).
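
As with the first study, the analysis is easy to reproduce outside Excel. The Python sketch below is illustrative only, assuming the factor_analyzer and scikit-learn packages and a hypothetical file jcces_rias_scores.csv containing the five subtest columns:

```python
import numpy as np
import pandas as pd
from factor_analyzer.factor_analyzer import (calculate_bartlett_sphericity,
                                             calculate_kmo)
from factor_analyzer.rotator import Rotator
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("jcces_rias_scores.csv")  # columns: VA, MP, GK, GWH, VRZ

chi2, p_value = calculate_bartlett_sphericity(df)  # Bartlett's sphericity test
kmo_per_item, kmo_total = calculate_kmo(df)        # sampling adequacy

# PCA on standardized scores, i.e., on the Pearson correlation matrix.
z = StandardScaler().fit_transform(df)
pca = PCA(n_components=2).fit(z)

# Unrotated loadings: eigenvectors scaled by the square roots of eigenvalues.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
rotated = Rotator(method="varimax").fit_transform(loadings)
print(pd.DataFrame(rotated, index=df.columns, columns=["D1", "D2"]).round(3))
```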


Results


The present study investigated the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment Scales (RIAS) Verbal Scale. A Principal Component Analysis (PCA) was conducted to test the research hypotheses. This analysis was performed on five variables: Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK) from the JCCES, and Guess What? (GWH) and Verbal Reasoning (VRZ) from the RIAS. The PCA was computed from the Pearson correlation matrix, with no missing data for any of the variables. The analysis included Bartlett's sphericity test to assess the suitability of the data for PCA, and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy to evaluate the adequacy of the sample size for each variable.


Results of the Statistical Analyses


The correlation matrix revealed significant positive correlations between all the variables, with coefficients ranging from 0.471 to 0.761. Bartlett's sphericity test confirmed the appropriateness of the data for PCA (χ² = 385.145, df = 10, p < 0.0001, α = 0.05). The KMO measure of sampling adequacy was satisfactory for all variables (0.844 to 0.891) and the overall KMO value was 0.868, indicating an adequate sample size.
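
For reference, Bartlett's statistic is computed from the determinant of the correlation matrix R as χ² = −[(n − 1) − (2p + 5) / 6] · ln|R|, with df = p(p − 1)/2. With p = 5 variables, df = 5 × 4 / 2 = 10, matching the reported degrees of freedom.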


The PCA extracted five factors with eigenvalues ranging from 0.224 to 3.574. The first factor (F1) accounted for 71.472% of the total variance, the second factor (F2) for 12.329%, and the remaining factors (F3 to F5) for 16.199%. A Varimax rotation was applied to facilitate the interpretation of the factor loadings. After rotation, the percentage of variance accounted for by the first two factors (D1 and D2) was 57.213% and 26.588%, respectively, totaling 83.801% of the cumulative variance.


The rotated factor loadings revealed that VA, GK, GWH, and VRZ loaded highly on the first factor (D1), with loadings ranging from 0.774 to 0.894. MP loaded highly on the second factor (D2), with a loading of 0.952.


Interpretation of the Results


The results of the PCA support the hypothesis that there is a strong relationship between the JCCES and the RIAS Verbal Scale, as indicated by the high loadings of the VA, GK, GWH, and VRZ variables on the first factor (D1). This factor can be interpreted as a common underlying construct, which may represent general verbal and crystallized intelligence. The high loading of the MP variable on the second factor (D2) suggests that mathematical problem-solving is a distinct construct from the general verbal and crystallized intelligence measured by the other variables.


Limitations


There are some limitations to this study that may have affected the results. First, the sample size of 125 participants is relatively small, which may limit the generalizability of the findings. However, the KMO measure indicated that the sample size was adequate for the PCA. Second, the sample was not equally distributed in terms of gender, with a majority of males (64.8%) and a high percentage of participants holding at least a college degree (79.83%). This may have introduced selection bias, potentially limiting the applicability of the findings to more diverse populations. Lastly, the study relied solely on PCA to analyze the relationships between the variables, and future research may benefit from using additional statistical techniques, such as confirmatory factor analysis, structural equation modeling, or multiple regression, to further validate the findings and provide a more comprehensive understanding of the relationships among the variables.


Discussion

Interpretation of the Results in the Context of the Research Hypotheses and Previous Research

The present study aimed to investigate the relationship between the JCCES and the RIAS Verbal Scale, with the results supporting a strong relationship between these two measures. This finding is consistent with previous research on the relationship between crystallized intelligence and verbal abilities (e.g., Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993). The high loadings of VA, GK, GWH, and VRZ on the first factor (D1) suggest a common underlying construct, which may represent general verbal and crystallized intelligence. This is supported by the notion that crystallized intelligence involves the acquisition and application of verbal and cultural knowledge, as well as the ability to reason using previously learned information (Cattell, 1963). Studies have consistently demonstrated that crystallized intelligence is closely related to verbal abilities, reflecting an individual's ability to access and utilize their accumulated knowledge base (Ackerman, 1996; Kaufman & Lichtenberger, 2006).

The high loading of MP on the second factor (D2) indicates that mathematical problem-solving is a distinct construct from general verbal and crystallized intelligence. This finding adds to the existing literature on the differentiation of mathematical abilities from verbal abilities (e.g., Deary et al., 2007). Moreover, the significant positive correlations between all the variables suggest that there may be some shared cognitive processes underlying performance on these tasks, consistent with the concept of a general factor of intelligence (Spearman, 1904).

Implications for Theory, Practice, and Future Research

The results of this study have several important implications. First, they provide empirical support for the relationship between crystallized intelligence and verbal abilities, as well as the distinction between mathematical and verbal abilities (Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993). This can inform the development of educational interventions and assessments that are tailored to the specific needs of learners with different cognitive profiles (Kaufman & Lichtenberger, 2006; McGrew & Flanagan, 1998). For example, educators can use the JCCES and RIAS Verbal Scale to identify students who may benefit from additional support in developing their verbal or mathematical skills (Fletcher et al., 2007).

Second, the findings contribute to the understanding of the structure of cognitive abilities, particularly the relationship between crystallized intelligence and verbal abilities (Ackerman, 1996). This may be useful for refining theoretical models of cognitive abilities and guiding future research in this area (Carroll, 2003; Deary et al., 2007). For instance, researchers could investigate the underlying cognitive processes that contribute to the shared variance between the JCCES and RIAS Verbal Scale, as well as the processes that differentiate mathematical problem-solving from verbal and crystallized intelligence (Geary, 1994; Hattie, 2009). This could involve examining the role of working memory, attention, and executive functions in the development of these cognitive abilities (Baddeley, 2003; Conway et al., 2002).

Limitations and Alternative Explanations

Although the present study has several strengths, there are also some limitations that warrant consideration. As noted earlier, the sample size was relatively small, and the sample was not equally distributed in terms of gender and educational attainment. This may limit the generalizability of the findings and introduce potential selection bias (Maxwell, 2004; Pedhazur & Schmelkin, 1991). Future research should aim to replicate these findings in larger and more diverse samples to increase the robustness and external validity of the results (Cook & Campbell, 1979; Shadish et al., 2002).

Additionally, the study relied solely on PCA to analyze the relationships between the variables. Future research could employ other statistical techniques, such as confirmatory factor analysis (Jöreskog, 1969; Bollen, 1989), structural equation modeling (Kline, 2005; Schumacker & Lomax, 2004), or multiple regression (Cohen et al., 2003; Tabachnick & Fidell, 2007), to further validate the findings and provide a more comprehensive understanding of the relationships among the variables. These alternative statistical approaches could help to address potential methodological issues, such as measurement error and the influence of confounding variables, and strengthen the evidence base for the observed relationships between crystallized intelligence and verbal abilities (Bryant & Yarnold, 1995; Little et al., 1999).

Directions for Future Research

Based on the findings and limitations of the present study, several directions for future research can be identified. First, researchers could investigate the underlying cognitive processes that contribute to the shared variance between the JCCES and RIAS Verbal Scale, as well as the processes that differentiate mathematical problem-solving from verbal and crystallized intelligence (Neisser et al., 1996; Stanovich & West, 2000). This could involve examining the neural substrates of these cognitive abilities (Jung & Haier, 2007), as well as the role of environmental and genetic factors in their development (Plomin & Spinath, 2004).

Second, future research could examine the predictive validity of the JCCES and RIAS Verbal Scale for various educational and occupational outcomes, such as academic achievement, job performance, and job satisfaction (Deary, 2001; Kuncel et al., 2004). This would help to establish the practical utility of these measures in real-world settings and inform the development of evidence-based interventions and policies aimed at fostering individual success and well-being (Gottfredson, 1997).

Third, researchers could explore the potential moderating role of individual differences, such as age, gender, and socioeconomic status, on the relationship between the JCCES and RIAS Verbal Scale (Deary et al., 2005; Lubinski & Benbow, 2006). This would help to identify specific subgroups of the population for whom these measures may be particularly informative or relevant and inform the development of targeted interventions and supports.

Finally, future research could investigate the longitudinal stability of the relationships between the JCCES and RIAS Verbal Scale, as well as the potential causal mechanisms underlying these relationships (McArdle et al., 2002). Longitudinal designs would allow researchers to examine the development and change of cognitive abilities over time (Baltes et al., 1980) and provide insights into the factors that contribute to the observed patterns of covariation among the variables (Salthouse, 2004).

Conclusion

In conclusion, the present study examined the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment Scales (RIAS) Verbal Scale using a Principal Component Analysis (PCA). The results of the PCA supported the hypothesis of a strong relationship between the JCCES and the RIAS Verbal Scale, with the first factor representing a common underlying construct of general verbal and crystallized intelligence and the second factor representing mathematical problem-solving as a distinct construct. The findings contribute to the understanding of the structure of cognitive abilities and have implications for theory, practice, and future research. The study provides empirical support for the relationship between crystallized intelligence and verbal abilities and the differentiation of mathematical abilities from verbal abilities. The results can inform the development of educational interventions and assessments that are tailored to the specific needs of learners with different cognitive profiles. The study's limitations include a relatively small sample size and an unequal distribution of gender and educational attainment, highlighting the need for future research to replicate the findings in larger and more diverse samples. Future research could also investigate the underlying cognitive processes, predictive validity, individual differences, and longitudinal stability of the relationships among these variables.

References

Ackerman, P. L. (1996). A theory of adult intellectual development: Process, personality, interests, and knowledge. Intelligence, 22(2), 227–257. https://doi.org/10.1016/S0160-2896(96)90016-1

Baddeley, A. (2003). Working memory: Looking back and looking forward. Nature Reviews Neuroscience, 4(10), 829–839. https://doi.org/10.1038/nrn1201

Baltes, P. B., Reese, H. W., & Lipsitt, L. P. (Eds.). (1980). Life-span developmental psychology: Introduction to research methods. New York: Academic Press. https://doi.org/10.4324/9781315799704

Bartlett, M. S. (1954). A note on the multiplying factors for various chi-square approximations. Journal of the Royal Statistical Society: Series B, 16(2), 296-298.

Bollen, K. A. (1989). Structural equations with latent variables. New York, NY: John Wiley & Sons.

Bryant, F. B., & Yarnold, P. R. (1995). Principal-components analysis and exploratory and confirmatory factor analysis. In L. G. Grimm & P. R. Yarnold (Eds.), Reading and understanding multivariate statistics (pp. 99-136). Washington, DC: American Psychological Association.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York, NY: Cambridge University Press. https://doi.org/10.1017/CBO9780511571312

Cattell, R. B. (1963). Theory of fluid and crystallized intelligence: A critical experiment. Journal of Educational Psychology, 54(1), 1–22. https://doi.org/10.1037/h0046743

Cattell, R. B. (1971). Abilities: Their structure, growth, and action. Boston, MA: Houghton Mifflin.

Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple regression/correlation analysis for the behavioral sciences. Mahwah, NJ: Lawrence Erlbaum Associates. https://doi.org/10.4324/9780203774441

Conway, A. R. A., Cowan, N., Bunting, M. F., Therriault, D. J., & Minkoff, S. R. B. (2002). A latent variable analysis of working memory capacity, short-term memory capacity, processing speed, and general fluid intelligence. Intelligence, 30(2), 163-183. https://doi.org/10.1016/S0160-2896(01)00096-4

Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Boston, MA: Houghton Mifflin.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). New York: Sage Publications, Inc.

Deary, I. J. (2001). Intelligence: A very short introduction. Oxford, UK: Oxford University Press.

Deary, I. J., Strand, S., Smith, P., & Fernandes, C. (2007). Intelligence and educational achievement. Intelligence, 35(1), 13-21. https://doi.org/10.1016/j.intell.2006.02.001

Deary, I. J., Taylor, M. D., Hart, C. L., Wilson, V., Smith, G. D., Blane, D., & Starr, J. M. (2005). Intergenerational social mobility and mid-life status attainment: Influences of childhood intelligence, childhood social factors, and education. Intelligence, 33(5), 455–472. https://doi.org/10.1016/j.intell.2005.06.003

Fletcher, J. M., Lyon, G. R., Fuchs, L. S., & Barnes, M. A. (2007). Learning disabilities: From identification to intervention. New York, NY: Guilford Press.

Geary, D. C. (1994). Children's mathematical development: Research and practical applications. Washington, DC: American Psychological Association. https://doi.org/10.1037/10163-000

Gottfredson, L. S. (1997). Why g matters: The complexity of everyday life. Intelligence, 24(1), 79–132. https://doi.org/10.1016/S0160-2896(97)90014-3

Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. London, UK: Routledge.

Horn, J. L., & Cattell, R. B. (1966). Refinement and test of the theory of fluid and crystallized general intelligences. Journal of Educational Psychology, 57(5), 253-270. https://doi.org/10.1037/h0023816

Hutcheson, G. D., & Sofroniou, N. (1999). The multivariate social scientist: Introductory statistics using generalized linear models. Thousand Oaks, CA: Sage Publications. https://doi.org/10.4135/9780857028075

Jolliffe, I. T. (1986). Principal component analysis and factor analysis. In I. T. Jolliffe (Ed.), Principal component analysis (pp. 115-128). New York, NY: Springer-Verlag.

Jöreskog, K. G. (1969). A general approach to confirmatory maximum likelihood factor analysis. Psychometrika, 34(2, Pt. 1), 183–202. https://doi.org/10.1007/BF02289343

Jung, R. E., & Haier, R. J. (2007). The Parieto-Frontal Integration Theory (P-FIT) of intelligence: Converging neuroimaging evidence. Behavioral and Brain Sciences, 30(2), 135–187. https://doi.org/10.1017/S0140525X07001185

Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31–36. https://doi.org/10.1007/BF02291575

Kaufman, A. S., & Lichtenberger, E. O. (2006). Assessing adolescent and adult intelligence (3rd ed.). Hoboken, NJ: John Wiley & Sons Inc.

Kerlinger, F. N. (2000). Foundations of behavioral research. San Diego, CA: Harcourt College Publishers.

Kline, R. B. (2005). Principles and practice of structural equation modeling. New York, NY: Guilford Press.

Kuncel, N. R., Hezlett, S. A., & Ones, D. S. (2004). Academic performance, career potential, creativity, and job performance: Can one construct predict them all? Journal of Personality and Social Psychology, 86(1), 148–161. https://doi.org/10.1037/0022-3514.86.1.148

Little, T. D., Lindenberger, U., & Nesselroade, J. R. (1999). On selecting indicators for multivariate measurement and modeling with latent variables: When "good" indicators are bad and "bad" indicators are good. Psychological Methods, 4(2), 192–211. https://doi.org/10.1037/1082-989X.4.2.192

Lubinski, D., & Benbow, C. P. (2006). Study of mathematically precocious youth after 35 years: Uncovering antecedents for the development of math-science expertise. Perspectives on Psychological Science, 1(4), 316-345. https://doi.org/10.1111/j.1745-6916.2006.00019.x

Maxwell, S. E. (2004). The persistence of underpowered studies in psychological research: Causes, consequences, and remedies. Psychological Methods, 9(2), 147–163. https://doi.org/10.1037/1082-989X.9.2.147

McArdle, J. J., Ferrer-Caja, E., Hamagami, F., & Woodcock, R. W. (2002). Comparative longitudinal structural analyses of the growth and decline of multiple intellectual abilities over the life span. Developmental Psychology, 38(1), 115–142.

McGrew, K. S., & Flanagan, D. P. (1998). The intelligence test desk reference (ITDR): Gf-Gc cross-battery assessment. Boston, MA: Allyn & Bacon.

Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S. J., Halpern, D. F., Loehlin, J. C., Perloff, R., Sternberg, R. J., & Urbina, S. (1996). Intelligence: Knowns and unknowns. American Psychologist, 51(2), 77–101. https://doi.org/10.1037/0003-066X.51.2.77

Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach. Hillsdale, NJ: Lawrence Erlbaum Associates.

Plomin, R., & Spinath, F. M. (2004). Intelligence: Genetics, genes, and genomics. Journal of Personality and Social Psychology, 86(1), 112–129. https://doi.org/10.1037/0022-3514.86.1.112

Reynolds, C. R., & Kamphaus, R. W. (2003). Reynolds Intellectual Assessment Scales (RIAS) and the Reynolds Intellectual Screening Test (RIST), Professional Manual. Lutz, FL: Psychological Assessment Resources.

Salthouse, T. A. (2004). What and when of cognitive aging. Current Directions in Psychological Science, 13(4), 140–144. https://doi.org/10.1111/j.0963-7214.2004.00293.x

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.

Schumacker, R. E., & Lomax, R. G. (2004). A beginner's guide to structural equation modeling. Mahwah, NJ: Lawrence Erlbaum Associates. https://doi.org/10.4324/9781410610904

Spearman, C. (1904). "General intelligence," objectively determined and measured. The American Journal of Psychology, 15(2), 201-292. https://doi.org/10.2307/1412107

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645–665. https://doi.org/10.1017/S0140525X00003435

Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Boston, MA: Pearson Education.