Sunday, February 14, 2010

Relationship between Jouve Cerebrals Crystallized Educational Scale (JCCES) Crystallized Educational Index (CEI) and Cognitive and Academic Measures

Abstract

This study aimed to examine the relationships between the Jouve Cerebrals Crystallized Educational Scale (JCCES) Crystallized Educational Index (CEI) and various measures of cognitive abilities and academic achievement. Pearson correlation analyses were used to test the research hypotheses. The results showed strong correlations between the JCCES CEI and measures of cognitive abilities, including the Reynolds Intellectual Assessment Scale (RIAS), Wechsler Adult Intelligence Scale - Third Edition (WAIS-III), Wechsler Intelligence Scale for Children - Third Edition (WISC-III), General Ability Measure for Adults (GAMA), and Stanford Binet Intelligence Scale (SBIS). Additionally, strong correlations were observed between the JCCES CEI and measures of academic achievement, including the Scholastic Assessment Test (SAT), American College Test (ACT), and Graduate Record Examination (GRE). The results suggest that the JCCES CEI is an effective measure of general cognitive ability and academic achievement across different age groups.

Keywords: Jouve Cerebrals Crystallized Educational Scale, Crystallized Educational Index, cognitive abilities, academic achievement, Pearson correlation analyses, Scholastic Assessment Test, American College Test, Graduate Record Examination.

Introduction

Psychometrics, the scientific study of psychological measurement, has been a critical aspect of psychology since the early 20th century, with the development of the first intelligence tests by pioneers such as Binet and Simon (1905) and Wechsler (1939). These seminal works laid the foundation for the development of various instruments to assess cognitive abilities, personality traits, and educational outcomes (Anastasi & Urbina, 1997). Over the years, psychometric theories have evolved, with advancements in factor analysis, item response theory, and other methodologies contributing to the refinement of existing instruments and the development of new ones (Embretson & Reise, 2000).

One such instrument is the Jouve Cerebrals Crystallized Educational Scale (JCCES), which assesses crystallized intelligence, a key component of general cognitive ability (Cattell, 1971; Horn & Cattell, 1966). Crystallized intelligence, often considered the product of accumulated knowledge and experiences, has been shown to be a reliable predictor of academic achievement and occupational success (Deary et al., 2007; Neisser et al., 1996).

The present study aims to examine the relationships between the JCCES Crystallized Educational Index (CEI) and various other measures of cognitive abilities and academic achievement, such as the Reynolds Intellectual Assessment Scale (RIAS), the Wechsler Adult Intelligence Scale - Third Edition (WAIS-III), the Scholastic Assessment Test (SAT), the American College Test (ACT), the Graduate Record Examination (GRE), the Armed Forces Qualification Test (AFQT), the Wechsler Intelligence Scale for Children - Third Edition (WISC-III), the General Ability Measure for Adults (GAMA), and the Stanford Binet Intelligence Scale (SBIS). Pearson correlation analyses were employed to investigate these relationships.

A comprehensive understanding of the relationships between the JCCES CEI and these well-established measures can provide valuable insight into the validity and utility of the JCCES in various contexts. Previous research has demonstrated that crystallized intelligence is a significant predictor of academic achievement (Deary et al., 2007) and is often correlated with other measures of cognitive abilities (Carroll, 1993). Therefore, the present study seeks to extend the existing literature by further examining these relationships, while also assessing the JCCES CEI's potential as an effective tool for predicting academic and cognitive outcomes.

The results of this study may have important implications for the use of the JCCES in educational and occupational settings and may contribute to the ongoing refinement of psychometric theories and methodologies. By exploring the relationships between the JCCES CEI and a range of well-established cognitive and achievement measures, this study aims to provide a comprehensive understanding of the JCCES's validity and utility within the broader context of psychometrics research.

Results

Statistical Analyses

The research hypotheses were tested using Pearson correlations to examine the relationships between the Jouve Cerebrals Crystallized Educational Scale (JCCES) Crystallized Educational Index (CEI) and various other measures. Assumptions made for the Pearson correlation analyses included linearity, homoscedasticity, and normality of the data.
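
To make the procedure concrete, the following sketch (in Python, using SciPy) shows how one such pairwise correlation and a basic normality check could be computed. The file name and column labels (JCCES_CEI, WAIS_FSIQ) are hypothetical placeholders; the study's own data files and analysis software are not documented here.

    # Minimal sketch of a single JCCES CEI vs. WAIS-III FSIQ correlation,
    # assuming a hypothetical CSV of paired scores; pairwise deletion keeps
    # only participants with both scores.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("scores.csv")          # hypothetical data file
    pair = df[["JCCES_CEI", "WAIS_FSIQ"]].dropna()

    # Pearson correlation coefficient and two-tailed p-value
    r, p = stats.pearsonr(pair["JCCES_CEI"], pair["WAIS_FSIQ"])
    print(f"r = {r:.3f}, p = {p:.3g}, N = {len(pair)}")

    # Rough check of the normality assumption for each variable
    # (linearity and homoscedasticity would be inspected with a scatterplot).
    print(stats.shapiro(pair["JCCES_CEI"]))
    print(stats.shapiro(pair["WAIS_FSIQ"]))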

Presentation of Results

The results of the Pearson correlation analyses between the JCCES CEI and various measures of cognitive abilities and academic achievement are presented in detail below. The majority of correlations were statistically significant at the p < .001 level, indicating strong relationships between the JCCES CEI and the respective measures.

Reynolds Intellectual Assessment Scale (RIAS, N = 138): The JCCES CEI demonstrated strong correlations with the Verbal Intelligence Index (VII) (r = .859, p < .001), Guess What? (GWH, an information-type subtest) (r = .814, p < .001), and Verbal Reasoning (VRZ) (r = .859, p < .001).

Wechsler Adult Intelligence Scale - Third Edition (WAIS-III, N = 76): The JCCES CEI showed strong correlations with Full Scale IQ (FSIQ) (r = .821, p < .001), Verbal IQ (VIQ) (r = .837, p < .001), Performance IQ (PIQ) (r = .660, p < .001), Verbal Comprehension Index (VCI) (r = .816, p < .001), Vocabulary (VOC) (r = .775, p < .001), Similarities (SIM) (r = .579, p < .001), and Information (INF) (r = .769, p < .001).

Scholastic Assessment Test (SAT) (three different versions): The JCCES CEI exhibited strong correlations with SAT Composite scores for all three versions: <1995 (r = .814, p < .001, N = 87), 1995-2005 (r = .826, p < .001, N = 118), and >2005 (r = .858, p < .001, N = 125). Similarly, significant correlations were observed with Verbal and Mathematical scores across the three versions.

American College Test (ACT, N = 133): The JCCES CEI was significantly correlated with the ACT Composite score (r = .691, p < .001) and all subscales, including English (r = .636, p < .001), Mathematics (r = .600, p < .001), Reading (r = .676, p < .001), and Science (r = .685, p < .001).

Graduate Record Examination (GRE, N = 66): The JCCES CEI demonstrated strong correlations with the GRE Composite (r = .844, p < .001), Verbal (r = .768, p < .001), and Quantitative (r = .819, p < .001) scores. However, the correlation with the GRE Analytical subscale was weaker (r = .430, p = .020, N = 29).

Armed Forces Qualification Test (AFQT, N = 62): The JCCES CEI showed a strong correlation with the AFQT percentile converted to a deviation IQ (r = .825, p < .001).
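
As a point of reference, converting a percentile rank to a deviation IQ ordinarily runs through the inverse normal distribution with a mean of 100 and a standard deviation of 15. The sketch below illustrates that conventional mapping; the exact conversion used for the AFQT scores in this study is not specified, so this should be read as the general idea rather than the study's procedure.

    # Conventional percentile-to-deviation-IQ mapping (mean 100, SD 15).
    from scipy.stats import norm

    def percentile_to_deviation_iq(percentile, mean=100.0, sd=15.0):
        """Map a percentile rank (strictly between 0 and 100) to a deviation IQ."""
        z = norm.ppf(percentile / 100.0)   # inverse normal CDF (z-score)
        return mean + sd * z

    # Example: the 90th percentile corresponds to a deviation IQ of roughly 119.
    print(round(percentile_to_deviation_iq(90), 1))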

Wechsler Intelligence Scale for Children - Third Edition (WISC-III, N = 29): The JCCES CEI had strong correlations with Full Scale IQ (FSIQ) (r = .851, p < .001), Verbal IQ (VIQ) (r = .665, p = .003, N = 18), and Performance IQ (PIQ) (r = .703, p = .001, N = 18).

General Ability Measure for Adults (GAMA, N = 64): The JCCES CEI was significantly correlated with the GAMA IQ score (r = .617, p < .001) and all subscales, including Matching (r = .467, p < .001), Analogies (r = .612, p < .001), Sequences (r = .455, p < .001), and Construction (r = .482, p < .001).

Stanford Binet Intelligence Scale (SBIS, N = 10): The JCCES CEI exhibited the strongest correlation with the SBIS Full Scale IQ (FSIQ) (r = .883, p = .001).

Interpretation of Results

Upon examining the Pearson correlation analysis results in greater detail, we can further interpret the relationships between the JCCES CEI and various cognitive and academic measures. The majority of the correlations were strong, supporting the research hypothesis that the JCCES CEI is positively related to these measures.

The strong relationships between the JCCES CEI and various intelligence scales provide further evidence that the JCCES CEI is an effective measure of general cognitive ability across different age groups. Both the Wechsler Adult Intelligence Scale - Third Edition (WAIS-III) and the Wechsler Intelligence Scale for Children - Third Edition (WISC-III) are widely recognized and well-established measures of cognitive ability, assessing various domains such as verbal comprehension, perceptual organization, working memory, and processing speed.

Intelligence Tests

  1. Wechsler Adult Intelligence Scale - Third Edition (WAIS-III): The WAIS-III is designed for individuals aged 16 to 89 years, assessing cognitive abilities across multiple domains. The strong correlation between the JCCES CEI and the WAIS-III Full Scale IQ (FSIQ) (r = .821, p < .001, N = 76) indicates that the JCCES CEI effectively captures general cognitive ability in adults. This positive relationship suggests that the JCCES CEI could be a useful tool for assessing cognitive abilities in various settings, such as educational, clinical, and occupational contexts.
  2. Wechsler Intelligence Scale for Children - Third Edition (WISC-III): The WISC-III is designed for children aged 6 to 16 years, assessing cognitive abilities across a similar range of domains as the WAIS-III. The strong correlation between the JCCES CEI and the WISC-III Full Scale IQ (FSIQ) (r = .851, p < .001, N = 29) suggests that the JCCES CEI is also effective in measuring general cognitive ability in children. This positive relationship implies that the JCCES CEI could be a valuable instrument for evaluating cognitive abilities in educational settings, as well as for identifying potential learning difficulties or giftedness in children.

Academic Tests

The Scholastic Assessment Test (SAT) is a widely used standardized test for college admissions in the United States, designed to measure students' critical thinking, problem-solving, and overall academic aptitude. The strong relationships between the JCCES CEI and SAT Composite scores across all three versions suggest that the JCCES CEI is a reliable indicator of academic achievement as measured by the SAT.

The SAT has undergone several changes over the years, resulting in three distinct versions. The following details illustrate the strong relationships between the JCCES CEI and each version of the SAT:

  1. SAT <1995: This version of the SAT consisted of two main sections: Verbal and Mathematical. The JCCES CEI showed a strong correlation with the SAT Composite score for this version (r = .814, p < .001, N = 87), indicating that the JCCES CEI is positively related to both verbal and mathematical abilities as measured by the SAT <1995.
  2. SAT 1995-2005: This version of the SAT maintained the Verbal and Mathematical sections, but introduced a new format and scoring system. The JCCES CEI displayed a strong correlation with the SAT Composite score for this version (r = .826, p < .001, N = 118), suggesting that the JCCES CEI remains a reliable indicator of academic achievement despite changes to the SAT format.
  3. SAT >2005: This version of the SAT introduced a third section, Writing, in addition to the existing Verbal (renamed as Reading) and Mathematical sections. The JCCES CEI demonstrated a strong correlation with the SAT Composite score for this version (r = .858, p < .001, N = 125), implying that the JCCES CEI is positively related to all three aspects of the SAT: Reading, Mathematical, and Writing.

The American College Test (ACT) correlations with the JCCES CEI provide further evidence that the JCCES CEI captures various aspects of academic achievement across multiple subject areas. The ACT is a standardized test that assesses high school students' general educational development and their ability to complete college-level work, covering four main subject areas: English, Mathematics, Reading, and Science.

The results of the Pearson correlation analyses for the ACT subscales are as follows:

  1. English: The JCCES CEI exhibited a strong correlation with the ACT English subscale (r = .636, p < .001, N = 133). This suggests that the JCCES CEI is positively related to English language skills, including grammar, punctuation, sentence structure, and rhetorical skills.
  2. Mathematics: The JCCES CEI displayed a strong correlation with the ACT Mathematics subscale (r = .600, p < .001, N = 133). This indicates a positive relationship between the JCCES CEI and mathematical problem-solving abilities, including knowledge of algebra, geometry, and trigonometry.
  3. Reading: The JCCES CEI showed a strong correlation with the ACT Reading subscale (r = .676, p < .001, N = 133). This implies that the JCCES CEI is positively associated with reading comprehension skills, including the ability to understand and analyze complex literary and informational texts.
  4. Science: The JCCES CEI demonstrated a strong correlation with the ACT Science subscale (r = .685, p < .001, N = 133). This suggests that the JCCES CEI is positively related to scientific reasoning and problem-solving skills, including the ability to interpret and analyze data from various scientific disciplines.

The moderate correlation between the JCCES CEI and the Graduate Record Examination (GRE) Analytical subscale (r = .430, p = .020, N = 29) is indeed notable, as it suggests a weaker relationship between the JCCES CEI and analytical abilities compared to the strong correlations observed with other cognitive and academic measures. Several factors might contribute to this finding, including:
  1. Differences in assessed skills: The JCCES CEI, which consists of Verbal Analogies, Mathematical Problems, and General Knowledge subtests, primarily measures crystallized intelligence. Crystallized intelligence refers to the knowledge and skills acquired through experience and education, such as vocabulary and factual information. In contrast, the GRE Analytical subscale assesses analytical writing skills, including the ability to articulate complex ideas, support arguments with relevant reasons and examples, and demonstrate critical thinking. The moderate correlation between the JCCES CEI and the GRE Analytical subscale may reflect the differences in the skills assessed by these two measures.
  2. Variability in the sample: The sample used in this study might have influenced the observed correlation between the JCCES CEI and the GRE Analytical subscale. The study participants might have had varying levels of exposure to analytical writing tasks, which could affect their performance on the GRE Analytical subscale. Additionally, the sample size for the GRE Analytical subscale (N = 29) was smaller than that of other measures, which might limit the generalizability of the findings.

Discussion

The present study aimed to examine the relationships between the Jouve Cerebrals Crystallized Educational Scale (JCCES) Crystallized Educational Index (CEI) and various measures of cognitive abilities and academic achievement. The results of the study support the research hypothesis that the JCCES CEI is positively related to these measures. Specifically, the JCCES CEI demonstrated strong correlations with measures of verbal intelligence, information, verbal reasoning, full-scale IQ, verbal IQ, performance IQ, verbal comprehension, vocabulary, and similarities; with SAT Composite scores across three different versions; with the ACT Composite score and subscales; with GRE Composite, Verbal, and Quantitative scores; and with the AFQT-derived IQ score. The JCCES CEI also exhibited strong correlations with the GAMA IQ score and all subscales, as well as the SBIS Full Scale IQ.

The strong correlations between the JCCES CEI and various intelligence scales provide further evidence that the JCCES CEI is an effective measure of general cognitive ability across different age groups. The positive relationships between the JCCES CEI and various cognitive and academic measures suggest that the JCCES CEI could be a useful tool for assessing cognitive abilities and academic achievement in various settings, such as educational, clinical, and occupational contexts (Deary et al., 2007).

The strong correlations between the JCCES CEI and the SAT Composite scores across all three versions suggest that the JCCES CEI is a reliable indicator of academic achievement as measured by the SAT. The strong correlations observed between the JCCES CEI and the ACT composite score and subscales suggest that the JCCES CEI captures various aspects of academic achievement across multiple subject areas.

The moderate correlation between the JCCES CEI and the GRE Analytical subscale suggests a weaker relationship between the JCCES CEI and analytical abilities compared to the strong correlations observed with other cognitive and academic measures. This finding may reflect differences in the skills assessed by these two measures, as well as the variability in the sample used in this study.

Implications for Theory, Practice, and Future Research

The findings of the present study have several implications for theory and practice. The strong correlations observed between the JCCES CEI and measures of cognitive abilities and academic achievement support the validity and reliability of the JCCES as a measure of general cognitive ability and academic achievement. The JCCES CEI could be a valuable tool for assessing cognitive abilities and academic achievement in educational, clinical, and occupational settings.

The results of this study also have implications for future research. The present study used a cross-sectional design, and future research could use a longitudinal design to examine the stability and predictive validity of the JCCES CEI over time. Additionally, future research could explore the relationship between the JCCES CEI and other measures of academic achievement, such as high school and college GPA. Furthermore, future research could examine the factor structure of the JCCES and its relationships with other measures of cognitive abilities.

Limitations

There are several limitations to this study that may have affected the results. First, the sample size varied across the different measures, with smaller sample sizes for some of the tests. Smaller sample sizes may have limited the statistical power to detect significant correlations.
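
To illustrate the point, the sketch below approximates the power of a two-tailed test of a correlation using the Fisher z transformation. The assumed population correlation (.40) and alpha level (.05) are arbitrary values chosen for the example, not quantities reported in the study; the comparison across N = 29, 66, and 133 simply mirrors the range of subsample sizes used here.

    # Approximate power of a two-tailed test of H0: rho = 0, via Fisher's z.
    import math
    from scipy.stats import norm

    def correlation_power(rho, n, alpha=0.05):
        z_rho = math.atanh(rho)            # Fisher z of the assumed effect
        se = 1.0 / math.sqrt(n - 3)        # standard error of Fisher z
        z_crit = norm.ppf(1 - alpha / 2)
        return (1 - norm.cdf(z_crit - z_rho / se)) + norm.cdf(-z_crit - z_rho / se)

    for n in (29, 66, 133):
        print(n, round(correlation_power(0.40, n), 2))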

Second, selection bias may have influenced the results, as participants may have been more likely to respond to the survey if they had higher cognitive abilities or academic achievement. This could have resulted in an overestimation of the correlations between the JCCES CEI and other measures.

Finally, many samples relied on self-reported data, which may be subject to reporting biases and inaccuracies. Although the JCCES is an untimed, self-administered, open-ended test, it is possible that participants' responses were influenced by factors such as social desirability or recall biases, which may have affected the validity of the study results.

Future Research

Future research could address some of the limitations of this study, including increasing sample sizes for certain measures and using more diverse samples to improve generalizability. Additionally, future research could examine the JCCES CEI's relationship with other cognitive and academic measures not included in this study, such as measures of creativity or problem-solving ability.

Further exploration of the weaker relationship between the JCCES CEI and the GRE Analytical subscale could also be valuable. Additional research could investigate whether the moderate correlation is due to differences in the skills assessed or limitations of the sample used in this study. Future studies could also examine the JCCES CEI's relationship with other measures of analytical abilities, such as performance on analytical writing tasks or measures of critical thinking.

Implications

The results of this study have important implications for both theory and practice. The strong relationships between the JCCES CEI and various measures of cognitive abilities and academic achievement provide further evidence for the construct validity of the JCCES as a measure of general cognitive ability. The JCCES CEI may be particularly useful in educational and occupational settings for assessing individuals' cognitive abilities, identifying potential learning difficulties or giftedness, and predicting academic and occupational success.

Additionally, the strong correlations between the JCCES CEI and the SAT and ACT suggest that the JCCES CEI is an effective tool for predicting academic achievement. As such, the JCCES CEI may be useful for guiding educational interventions and for identifying individuals who may benefit from academic support.

However, it is important to note that the JCCES CEI should not be used as the sole measure for assessing cognitive abilities or academic achievement. Rather, the JCCES CEI should be used in conjunction with other measures to provide a more comprehensive evaluation of an individual's strengths and weaknesses.

Conclusion

In conclusion, the results of this study provide strong evidence for the construct validity of the JCCES as a measure of general cognitive ability. The JCCES CEI demonstrated strong correlations with various measures of cognitive abilities and academic achievement, including well-established measures such as the WAIS-III and the SAT. The study results suggest that the JCCES CEI may be a useful tool for assessing cognitive abilities and predicting academic and occupational success. However, the limitations of the study should be taken into consideration when interpreting the results. Future research could address some of the limitations and further explore the JCCES CEI's relationship with other measures of cognitive abilities and academic achievement.

References

Anastasi, A., & Urbina, S. (1997). Psychological testing (7th ed.). Upper Saddle River, NJ: Prentice-Hall.

Binet, A., & Simon, T. (1905). New methods for the diagnosis of the intellectual level of subnormals. L'Année Psychologique, 11, 191-244.

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. Cambridge, UK: Cambridge University Press.

Cattell, R. B. (1971). Abilities: Their structure, growth, and action. Boston, MA: Houghton Mifflin.

Deary, I. J., Strand, S., Smith, P., & Fernandes, C. (2007). Intelligence and educational achievement. Intelligence, 35(1), 13-21. https://doi.org/10.1016/j.intell.2006.02.001

Embretson, S. E., & Reise, S. P. (2000). Item response theory for psychologists. Mahwah, NJ: Lawrence Erlbaum Associates.

Horn, J. L., & Cattell, R. B. (1966). Refinement and test of the theory of fluid and crystallized general intelligences. Journal of Educational Psychology, 57(5), 253-270. https://doi.org/10.1037/h0023816

Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S. J., Halpern, D. F., Loehlin, J. C., Perloff, R., Sternberg, R. J., & Urbina, S. (1996). Intelligence: Knowns and unknowns. American Psychologist, 51(2), 77-101. https://doi.org/10.1037/0003-066X.51.2.77

Wechsler, D. (1939). The measurement of adult intelligence. Baltimore, MD: Williams & Wilkins.

Thursday, February 4, 2010

Investigating the Relationship Between JCCES and RIAS Verbal Scale: A Principal Component Analysis Approach

Abstract


This study examined the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment (RIAS) Verbal Scale using Principal Component Analysis (PCA). The PCA revealed a strong relationship between the JCCES and the RIAS Verbal Scale, supporting the hypothesis that there is a common underlying construct representing general verbal and crystallized intelligence. Additionally, mathematical problem-solving was found to be a distinct construct from general verbal and crystallized intelligence. Despite some limitations, this study provides empirical support for the relationship between crystallized intelligence and verbal abilities, as well as the distinction between mathematical and verbal abilities, which can inform educational interventions and assessments.


Keywords: Jouve Cerebrals Crystallized Educational Scale, Reynolds Intellectual Assessment, Verbal Scale, Principal Component Analysis, crystallized intelligence, mathematical problem-solving


Introduction


Psychometrics, the science of measuring psychological attributes, has a long history of developing and refining theories and instruments to assess cognitive abilities (Cattell, 1963; Carroll, 1993). The present study focuses on the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment (RIAS) Verbal Scale (Reynolds & Kamphaus, 2003), two psychometric instruments designed to assess crystallized intelligence and verbal abilities, respectively. Crystallized intelligence, first proposed by Cattell (1963), refers to the ability to access and utilize accumulated knowledge and experience, which is closely related to verbal abilities (Ackerman, 1996; Kaufman & Lichtenberger, 2006). Theories of cognitive abilities, such as those proposed by Cattell (1971), Horn and Cattell (1966), and Carroll (1993), have suggested that crystallized intelligence and verbal abilities share a common underlying construct.


Previous research has supported the relationship between crystallized intelligence and verbal abilities (Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993), as well as the distinction between mathematical and verbal abilities (Deary et al., 2007). However, few studies have specifically examined the relationship between the JCCES and RIAS Verbal Scale. The present study aims to address this gap by investigating the relationship between these two instruments using principal component analysis (PCA), a statistical technique commonly employed in psychometrics to reduce data complexity and identify underlying constructs (Jolliffe, 1986).


The research question guiding this study is: What is the relationship between the JCCES and RIAS Verbal Scale, as assessed by PCA? To answer this question, the study will test the hypothesis that there is a strong relationship between the JCCES and RIAS Verbal Scale, as indicated by high factor loadings on a common underlying construct, which may represent general verbal and crystallized intelligence. Additionally, the study will explore the relationship between mathematical problem-solving and the other variables, given the distinction between mathematical and verbal abilities noted in previous research (Deary et al., 2007).


This study builds on the existing literature by providing a more detailed examination of the relationship between the JCCES and RIAS Verbal Scale, which has implications for both theory and practice. Understanding the relationship between these two instruments can inform the development of educational interventions and assessments tailored to the specific needs of learners with different cognitive profiles (Kaufman & Lichtenberger, 2006; McGrew & Flanagan, 1998). Furthermore, the findings contribute to the understanding of the structure of cognitive abilities, particularly the relationship between crystallized intelligence and verbal abilities (Ackerman, 1996). This may be useful for refining theoretical models of cognitive abilities and guiding future research in this area (Carroll, 2003; Deary et al., 2007).


Method


Research Design


The current study employed a correlational research design to investigate the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment (RIAS) Verbal Scale. This design allowed for the examination of associations between the variables without manipulating or controlling any of the measures (Creswell, 2009). A correlational design was chosen because it is well-suited for studying the relationships among naturally occurring variables, such as crystallized intelligence and verbal abilities (Campbell & Stanley, 1963; Kerlinger, 2000).


Participants


A total of 125 participants were recruited for this study: 81 males (64.71%) and 44 females (35.29%). The participants' mean age was 33.82 years (SD = 12.56). In terms of education, 79.83% of the participants held at least a college degree. Participants were recruited using convenience sampling methods, such as posting advertisements on social media and online forums.


Materials


The JCCES is a measure of crystallized intelligence, consisting of three subtests: Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK). The RIAS Verbal Scale (Reynolds & Kamphaus, 2003) is a measure of verbal intelligence, consisting of two subtests: Guess What? (GWH) and Verbal Reasoning (VRZ). Both the JCCES and RIAS have been validated in previous research and have demonstrated strong psychometric properties.


Procedure


Participants were provided with informed consent forms. The tasks were presented in a fixed order, starting with the JCCES (VA, MP, and GK) followed by the RIAS Verbal Scale (GWH and VRZ). Instructions for each task were provided before the commencement of each subtest. Participants were given unlimited time to complete the tasks. Upon completion of the tasks, participants were debriefed and thanked for their participation.


Statistical Analysis


Data were analyzed using Excel. Descriptive statistics were calculated for the demographic variables, and a Principal Component Analysis (PCA) was conducted to examine the relationships among the JCCES and RIAS Verbal Scale subtests (Jolliffe, 1986). The PCA included Bartlett's sphericity test to assess the suitability of the data for PCA (Bartlett, 1954) and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy to evaluate the adequacy of the sample size for each variable (Kaiser, 1974; Hutcheson & Sofroniou, 1999).
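
For readers who want to reproduce these diagnostics outside of a spreadsheet, the sketch below computes Bartlett's test of sphericity and the KMO index directly from a correlation matrix with NumPy and SciPy. The data generated in the example are random placeholders standing in for the five subtest scores; they are not the study's data.

    # Bartlett's sphericity test: chi2 = -(n - 1 - (2p + 5)/6) * ln|R|, df = p(p - 1)/2.
    # KMO: sum of squared correlations over that sum plus squared partial correlations.
    import numpy as np
    from scipy.stats import chi2

    def bartlett_sphericity(R, n):
        p = R.shape[0]
        stat = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
        df = p * (p - 1) / 2
        return stat, df, chi2.sf(stat, df)

    def kmo(R):
        inv = np.linalg.inv(R)
        # Partial correlations from the inverse (anti-image) correlation matrix
        partial = -inv / np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
        np.fill_diagonal(partial, 0.0)
        r_off = R - np.eye(R.shape[0])     # off-diagonal correlations only
        return (r_off ** 2).sum() / ((r_off ** 2).sum() + (partial ** 2).sum())

    # Placeholder data: 125 cases x 5 variables (VA, MP, GK, GWH, VRZ stand-ins)
    X = np.random.default_rng(0).normal(size=(125, 5))
    R = np.corrcoef(X, rowvar=False)
    print(bartlett_sphericity(R, n=X.shape[0]))
    print(kmo(R))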


Results


The present study investigated the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment (RIAS) Verbal Scale. A Principal Component Analysis (PCA) was conducted to test the research hypotheses. This analysis was performed on five variables: Verbal Analogies (VA), Mathematical Problems (MP), and General Knowledge (GK) from JCCES, and Guess What? (GWH) and Verbal Reasoning (VRZ) from RIAS. The PCA was a Pearson (n) type, with no missing data for any of the variables. The analysis included Bartlett's sphericity test to assess the suitability of the data for PCA, and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy to evaluate the adequacy of the sample size for each variable.


Results of the Statistical Analyses


The correlation matrix revealed significant positive correlations between all the variables, with coefficients ranging from 0.471 to 0.761. Bartlett's sphericity test confirmed the appropriateness of the data for PCA (χ² = 385.145, df = 10, p < 0.0001, α = 0.05). The KMO measure of sampling adequacy was satisfactory for all variables (0.844 to 0.891) and the overall KMO value was 0.868, indicating an adequate sample size.


The PCA extracted five factors with eigenvalues ranging from 0.224 to 3.574. The first factor (F1) accounted for 71.472% of the total variance, the second factor (F2) for 12.329%, and the remaining factors (F3 to F5) for 16.199%. A Varimax rotation was applied to facilitate the interpretation of the factor loadings. After rotation, the percentage of variance accounted for by the first two factors (D1 and D2) was 57.213% and 26.588%, respectively, totaling 83.801% of the cumulative variance.


The rotated factor loadings revealed that VA, GK, GWH, and VRZ loaded highly on the first factor (D1), with loadings ranging from 0.774 to 0.894. MP loaded highly on the second factor (D2), with a loading of 0.952.
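
In outline, the extraction and rotation steps reported above can be reproduced as follows: take the principal components of the 5 x 5 correlation matrix, form loadings for the first two components, and apply a Varimax rotation. The correlation matrix in the sketch is a random placeholder; the study's observed matrix would be substituted in practice.

    # Principal components of a correlation matrix, followed by Varimax rotation
    # of the loadings on the first two components (a standard SVD-based routine).
    import numpy as np

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        p, k = loadings.shape
        rotation = np.eye(k)
        total = 0.0
        for _ in range(max_iter):
            L = loadings @ rotation
            u, s, vt = np.linalg.svd(
                loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
            )
            rotation = u @ vt
            if s.sum() < total * (1 + tol):
                break
            total = s.sum()
        return loadings @ rotation

    # Placeholder correlation matrix for VA, MP, GK, GWH, VRZ
    R = np.corrcoef(np.random.default_rng(0).normal(size=(125, 5)), rowvar=False)

    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]                  # components by decreasing variance
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    explained = eigvals / eigvals.sum()                # proportion of variance per component

    loadings = eigvecs[:, :2] * np.sqrt(eigvals[:2])   # unrotated loadings, first two PCs
    rotated = varimax(loadings)                        # rotated loadings (D1 and D2)
    print(explained.round(3), rotated.round(3), sep="\n")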


Interpretation of the Results


The results of the PCA support the hypothesis that there is a strong relationship between the JCCES and the RIAS Verbal Scale, as indicated by the high loadings of the VA, GK, GWH, and VRZ variables on the first factor (D1). This factor can be interpreted as a common underlying construct, which may represent general verbal and crystallized intelligence. The high loading of the MP variable on the second factor (D2) suggests that mathematical problem-solving is a distinct construct from the general verbal and crystallized intelligence measured by the other variables.


Limitations


There are some limitations to this study that may have affected the results. First, the sample size of 125 participants is relatively small, which may limit the generalizability of the findings. However, the KMO measure indicated that the sample size was adequate for the PCA. Second, the sample was not equally distributed in terms of gender, with a majority of males (64.71%) and a high percentage of participants with at least one college degree (79.83%). This may have introduced selection bias, potentially limiting the applicability of the findings to more diverse populations. Lastly, the study relied solely on PCA to analyze the relationships between the variables, and future research may benefit from using additional statistical techniques, such as confirmatory factor analysis, structural equation modeling, or multiple regression, to further validate the findings and provide a more comprehensive understanding of the relationships among the variables.


Discussion

Interpretation of the Results in the Context of the Research Hypotheses and Previous Research

The present study aimed to investigate the relationship between the JCCES and the RIAS Verbal Scale, with the results supporting a strong relationship between these two measures. This finding is consistent with previous research on the relationship between crystallized intelligence and verbal abilities (e.g., Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993). The high loadings of VA, GK, GWH, and VRZ on the first factor (D1) suggest a common underlying construct, which may represent general verbal and crystallized intelligence. This is supported by the notion that crystallized intelligence involves the acquisition and application of verbal and cultural knowledge, as well as the ability to reason using previously learned information (Cattell, 1963). Studies have consistently demonstrated that crystallized intelligence is closely related to verbal abilities, reflecting an individual's ability to access and utilize their accumulated knowledge base (Ackerman, 1996; Kaufman & Lichtenberger, 2006).

The high loading of MP on the second factor (D2) indicates that mathematical problem-solving is a distinct construct from general verbal and crystallized intelligence. This finding adds to the existing literature on the differentiation of mathematical abilities from verbal abilities (e.g., Deary, et al., 2007). Moreover, the significant positive correlations between all the variables suggest that there may be some shared cognitive processes underlying performance on these tasks, consistent with the concept of a general factor of intelligence (Spearman, 1904).

Implications for Theory, Practice, and Future Research

The results of this study have several important implications. First, they provide empirical support for the relationship between crystallized intelligence and verbal abilities, as well as the distinction between mathematical and verbal abilities (Cattell, 1971; Horn & Cattell, 1966; Carroll, 1993). This can inform the development of educational interventions and assessments that are tailored to the specific needs of learners with different cognitive profiles (Kaufman & Lichtenberger, 2006; McGrew & Flanagan, 1998). For example, educators can use the JCCES and RIAS Verbal Scale to identify students who may benefit from additional support in developing their verbal or mathematical skills (Fletcher et al., 2007).

Second, the findings contribute to the understanding of the structure of cognitive abilities, particularly the relationship between crystallized intelligence and verbal abilities (Ackerman, 1996). This may be useful for refining theoretical models of cognitive abilities and guiding future research in this area (Carroll, 2003; Deary et al., 2007). For instance, researchers could investigate the underlying cognitive processes that contribute to the shared variance between the JCCES and RIAS Verbal Scale, as well as the processes that differentiate mathematical problem-solving from verbal and crystallized intelligence (Geary, 1994; Hattie, 2009). This could involve examining the role of working memory, attention, and executive functions in the development of these cognitive abilities (Baddeley, 2003; Conway et al., 2002).

Limitations and Alternative Explanations

Although the present study has several strengths, there are also some limitations that warrant consideration. As noted earlier, the sample size was relatively small, and the sample was not equally distributed in terms of gender and educational attainment. This may limit the generalizability of the findings and introduce potential selection bias (Maxwell, 2004; Pedhazur & Schmelkin, 1991). Future research should aim to replicate these findings in larger and more diverse samples to increase the robustness and external validity of the results (Cook & Campbell, 1979; Shadish, et al., 2002).

Additionally, the study relied solely on PCA to analyze the relationships between the variables. Future research could employ other statistical techniques, such as confirmatory factor analysis (Jöreskog, 1969; Bollen, 1989), structural equation modeling (Kline, 2005; Schumacker & Lomax, 2004), or multiple regression (Cohen, et al., 2003; Tabachnick & Fidell, 2007), to further validate the findings and provide a more comprehensive understanding of the relationships among the variables. These alternative statistical approaches could help to address potential methodological issues, such as measurement error and the influence of confounding variables, and strengthen the evidence base for the observed relationships between crystallized intelligence and verbal abilities (Bryant & Yarnold, 1995; Little, et al., 1999).

Directions for Future Research

Based on the findings and limitations of the present study, several directions for future research can be identified. First, researchers could investigate the underlying cognitive processes that contribute to the shared variance between the JCCES and RIAS Verbal Scale, as well as the processes that differentiate mathematical problem-solving from verbal and crystallized intelligence (Neisser et al., 1996; Stanovich & West, 2000). This could involve examining the neural substrates of these cognitive abilities (Jung & Haier, 2007), as well as the role of environmental and genetic factors in their development (Plomin & Spinath, 2004).

Second, future research could examine the predictive validity of the JCCES and RIAS Verbal Scale for various educational and occupational outcomes, such as academic achievement, job performance, and job satisfaction (Deary, 2001; Kuncel, et al., 2004). This would help to establish the practical utility of these measures in real-world settings and inform the development of evidence-based interventions and policies aimed at fostering individual success and well-being (Gottfredson, 1997).

Third, researchers could explore the potential moderating role of individual differences, such as age, gender, and socioeconomic status, on the relationship between the JCCES and RIAS Verbal Scale (Deary, et al., 2005; Lubinski & Benbow, 2006). This would help to identify specific subgroups of the population for whom these measures may be particularly informative or relevant and inform the development of targeted interventions and supports.

Finally, future research could investigate the longitudinal stability of the relationships between the JCCES and RIAS Verbal Scale, as well as the potential causal mechanisms underlying these relationships (McArdle, et al., 2002). Longitudinal designs would allow researchers to examine the development and change of cognitive abilities over time (Baltes, et al., 1980) and provide insights into the factors that contribute to the observed patterns of covariation among the variables (Salthouse, 2004).

Conclusion

In conclusion, the present study examined the relationship between the Jouve Cerebrals Crystallized Educational Scale (JCCES) and the Reynolds Intellectual Assessment (RIAS) Verbal Scale using a Principal Component Analysis (PCA). The results of the PCA supported the hypothesis that there is a strong relationship between the JCCES and the RIAS Verbal Scale, with the first factor representing a common underlying construct of general verbal and crystallized intelligence and the second factor representing mathematical problem-solving as a distinct construct. The findings contribute to the understanding of the structure of cognitive abilities and have implications for theory, practice, and future research. The study provides empirical support for the relationship between crystallized intelligence and verbal abilities and the differentiation of mathematical abilities from verbal abilities. The results can inform the development of educational interventions and assessments that are tailored to the specific needs of learners with different cognitive profiles. The study's limitations include a relatively small sample size and an unequal distribution of gender and educational attainment, highlighting the need for future research to replicate the findings in larger and more diverse samples. Future research could also investigate the underlying cognitive processes, predictive validity, individual differences, and longitudinal stability of the relationships among these variables.

References

Ackerman, P. L. (1996). A theory of adult intellectual development: Process, personality, interests, and knowledge. Intelligence, 22(2), 227–257. https://doi.org/10.1016/S0160-2896(96)90016-1

Baltes, P. B., Reese, H. W., & Lipsitt, L. P. (Eds.). (1980). Life-span developmental psychology: Introduction to research methods. New York: Academic Press. https://doi.org/10.4324/9781315799704

Bollen, K. A. (1989). Structural equations with latent variables. New York, NY: John Wiley & Sons.

Baddeley, A. (2003). Working Memory: Looking Back and Looking Forward. Nature Reviews Neuroscience, 4(10), 829–839. https://doi.org/10.1038/nrn1201

Bartlett, M. S. (1954). A note on the multiplying factors for various chi square approximations. Journal of the Royal Statistical Society: Series B, 16(2), 296-298. 

Bryant, F. B., & Yarnold, P. R. (1995). Principal-components analysis and exploratory and confirmatory factor analysis. In L. G. Grimm & P. R. Yarnold (Eds.), Reading and understanding multivariate statistics (pp. 99-136). Washington, DC: American Psychological Association.

Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.

Cattell, R. B. (1963). Theory of fluid and crystallized intelligence: A critical experiment. Journal of Educational Psychology, 54(1), 1–22. https://doi.org/10.1037/h0046743

Cattell, R. B. (1971). Abilities: their structure, growth, and action. Boston, MA: Houghton Mifflin.

Carroll, J. B. (1993). Human cognitive abilities: A survey of factor-analytic studies. New York: Cambridge University Press. https://doi.org/10.1017/CBO9780511571312

Cohen, J., Cohen, P., West, S. G., & Aiken, L. S. (2003). Applied multiple regression/correlation analysis for the behavioral sciences. Mahwah, NJ: Lawrence Erlbaum Associates. https://doi.org/10.4324/9780203774441

Conway, A. R. A., Cowan, N., Bunting, M. F., Therriault, D. J., & Minkoff, S. R. B. (2002). A latent variable analysis of working memory capacity, short-term memory capacity, processing speed, and general fluid intelligence. Intelligence, 30(2), 163-183. https://doi.org/10.1016/S0160-2896(01)00096-4

Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Boston, MA: Houghton Mifflin.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). New York: Sage Publications, Inc.

Deary, I. J. (2001). Intelligence: A very short introduction. Oxford, UK: Oxford University Press.

Deary, I. J., Strand, S., Smith, P., & Fernandes, C. (2007). Intelligence and educational achievement. Intelligence, 35(1), 13-21. https://doi.org/10.1016/j.intell.2006.02.001

Deary, I. J., Taylor, M. D., Hart, C. L., Wilson, V., Smith, G. D., Blane, D., & Starr, J. M. (2005). Intergenerational social mobility and mid-life status attainment: Influences of childhood intelligence, childhood social factors, and education. Intelligence, 33(5), 455–472. https://doi.org/10.1016/j.intell.2005.06.003

Fletcher, J.M., Lyon, G.R., Fuchs, L.S., and Barnes, M.A. (2007). Learning disabilities: From identification to intervention. New York: Guilford Press.

Geary, D. C. (1994). Children's mathematical development: Research and practical applications. Washington, DC: American Psychological Association. https://doi.org/10.1037/10163-000

Gottfredson, L. S. (1997). Why g matters: The complexity of everyday life. Intelligence, 24(1), 79–132. https://doi.org/10.1016/S0160-2896(97)90014-3

Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses related to achievement. London, UK: Routledge.

Horn, J. L., & Cattell, R. B. (1966). Refinement and test of the theory of fluid and crystallized general intelligences. Journal of Educational Psychology, 57(5), 253-270. https://doi.org/10.1037/h0023816

Hutcheson, G. D., & Sofroniou, N. (1999). The Multivariate Social Scientist: Introductory Statistics Using Generalized Linear Models. Thousand Oaks, CA: Sage Publications. https://doi.org/10.4135/9780857028075

Jolliffe, I. T. (1986). Principal component analysis and factor analysis. In I. T. Jolliffe (Ed.), Principal component analysis (pp. 115-128). New York, NY: Springer-Verlag.

Joreskog, K. G. (1969). A general approach to confirmatory maximum likelihood factor analysis. Psychometrika, 34(2, Pt.1), 183–202. https://doi.org/10.1007/BF02289343

Jung, R. E., & Haier, R. J. (2007). The Parieto-Frontal Integration Theory (P-FIT) of intelligence: converging neuroimaging evidence. The Behavioral and brain sciences, 30(2), 135–187. https://doi.org/10.1017/S0140525X07001185

Kaiser, H. F. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31–36. https://doi.org/10.1007/BF02291575

Kaufman, A. S., & Lichtenberger, E. O. (2006). Assessing adolescent and adult intelligence (3rd ed.). Hoboken, NJ: John Wiley & Sons Inc.

Kline, R. B. (2005). Principles and practice of structural equation modeling. New York, NY: Guilford Press.

Little, T. D., Lindenberger, U., & Nesselroade, J. R. (1999). On selecting indicators for multivariate measurement and modeling with latent variables: When "good" indicators are bad and "bad" indicators are good. Psychological Methods, 4(2), 192–211. https://doi.org/10.1037/1082-989X.4.2.192

Lubinski, D., & Benbow, C. P. (2006). Study of mathematically precocious youth after 35 years: Uncovering antecedents for the development of math-science expertise. Perspectives on Psychological Science, 1(4), 316-345. https://doi.org/10.1111/j.1745-6916.2006.00019.x

Kerlinger, F. N. (2000). Foundations of Behavioral Research. San Diego, CA: Harcourt College Publishers.

Kuncel, N. R., Hezlett, S. A., & Ones, D. S. (2004). Academic performance, career potential, creativity, and job performance: can one construct predict them all?. Journal of personality and social psychology, 86(1), 148–161. https://doi.org/10.1037/0022-3514.86.1.148

Maxwell, S. E. (2004). The Persistence of Underpowered Studies in Psychological Research: Causes, Consequences, and Remedies. Psychological Methods, 9(2), 147–163. https://doi.org/10.1037/1082-989X.9.2.147

McArdle, J. J., Ferrer-Caja, E., Hamagami, F., & Woodcock, R. W. (2002). Comparative longitudinal structural analyses of the growth and decline of multiple intellectual abilities over the life span. Developmental psychology, 38(1), 115–142.

McGrew, K. S., & Flanagan, D. P. (1998). The intelligence test desk reference (ITDR): Gf-Gc cross-battery assessment. Boston, MA: Allyn & Bacon.

Neisser, U., Boodoo, G., Bouchard, T. J., Jr., Boykin, A. W., Brody, N., Ceci, S. J., Halpern, D. F., Loehlin, J. C., Perloff, R., Sternberg, R. J., & Urbina, S. (1996). Intelligence: Knowns and unknowns. American Psychologist, 51(2), 77–101. https://doi.org/10.1037/0003-066X.51.2.77

Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach. Hillsdale, NJ: Lawrence Erlbaum Associates.

Plomin, R., & Spinath, F. M. (2004). Intelligence: genetics, genes, and genomics. Journal of personality and social psychology, 86(1), 112–129. https://doi.org/10.1037/0022-3514.86.1.112

Reynolds, C. R., & Kamphaus, R. W. (2003). Reynolds Intellectual Assessment Scales (RIAS) and the Reynolds Intellectual Screening Test (RIST), Professional Manual. Lutz, FL: Psychological Assessment Resources.

Salthouse, T. A. (2004). What and When of Cognitive Aging. Current Directions in Psychological Science, 13(4), 140–144. https://doi.org/10.1111/j.0963-7214.2004.00293.x

Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.

Schumacker, R. E., & Lomax, R. G. (2004). A beginner's guide to structural equation modeling. Mahwah, NJ: Lawrence Erlbaum Associates. https://doi.org/10.4324/9781410610904

Spearman, C. (1904). "General intelligence," is objectively determined and measured. The American Journal of Psychology, 15(2), 201-292. https://doi.org/10.2307/1412107

Stanovich, K. E., & West, R. F. (2000). Individual differences in reasoning: Implications for the rationality debate? Behavioral and Brain Sciences, 23(5), 645–665. https://doi.org/10.1017/S0140525X00003435

Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Boston, MA: Pearson Education.