Article

Person-Centered Study of Cognitive Ability Dimensions Using Latent Profile Analysis

by Jeffrey M. Conte 1,* and Rebecca K. Harmata 2
1 Department of Psychology, San Diego State University, San Diego, CA 92182, USA
2 Department of Psychology, University of Georgia, Athens, GA 30602, USA
* Author to whom correspondence should be addressed.
J. Intell. 2023, 11(5), 80; https://doi.org/10.3390/jintelligence11050080
Submission received: 15 November 2022 / Revised: 18 April 2023 / Accepted: 20 April 2023 / Published: 26 April 2023

Abstract: A number of researchers have called for additional investigations into cognitive ability and intelligence in recent years. This paper utilized a person-centered approach, multiple cognitive ability dimensions, and latent profile analysis to investigate multivariate relationships among cognitive ability dimensions in a sample of 1681 Army recruits. Six cognitive ability dimensions were assessed via the Armed Services Vocational Aptitude Battery. Performance measures were obtained from supervisor ratings of Effort, Discipline, and Peer Leadership. Using latent profile analysis, the results identified five distinct cognitive profiles or classes, which differed significantly across the three types of supervisor ratings.

1. Introduction

A number of researchers have called for additional investigations of cognitive ability and intelligence in recent years. In a focal article in Industrial and Organizational Psychology: Perspectives on Science and Practice, Scherbaum et al. (2012) noted that although the field of industrial and organizational (I–O) psychology has not studied the intelligence construct in depth in a number of years, other fields (e.g., clinical, educational, and developmental) have continued to study this construct and have made substantial progress in understanding it in more detail. Scherbaum and colleagues challenged the field of I–O psychology to pursue new research initiatives on this critical construct.
As noted by Conte and Landy (2019), people use cognitive abilities to acquire knowledge and solve problems. Studies across many decades of research indicate that cognitive ability is one of the best predictors of job performance (e.g., Sackett et al. 2022; Schmidt and Hunter 1998). Further, meta-analyses investigating the relationship between general intelligence and job performance have demonstrated that as the complexity and cognitive demands of jobs increase, the predictive validity of tests of general intelligence also increases (Schmidt and Hunter 2004).
Schneider and Newman (2015) reviewed a wide array of research to support their claim that intelligence is a multidimensional construct. They recommended a renewed interest in narrower cognitive abilities, as compared to the unidimensional view captured by “general mental ability”—or, simply, “g”. Reeve et al. (2015) similarly noted that although the focus on “g” continues to provide insight into successful behavior at work, there is a need to give additional consideration to specific aspects of intelligence. They argued that giving attention to constellations of specific cognitive aptitudes can provide additional insight into the various abilities that are required for success in today’s workplace. Thus, their article encouraged I–O psychologists to use the science of mental abilities and measurement theory to better understand how basic constructs within the intelligence literature affect job performance.
In the developmental psychology area, Fuchs et al. (2010) examined whether different types of school mathematics development depend on different constellations of numerical versus general cognitive abilities. First graders (n = 280) were assessed on two types of basic numerical cognition, eight domain-general abilities, procedural calculations (PCs), and word problems (WPs) in the Fall, and then reassessed on PCs and WPs in the Spring. The results suggested that the development of different types of formal school mathematics depends on various constellations of numerical versus general cognitive abilities.
This study used multiple cognitive ability dimensions and the person-centered approach to identify cognitive profiles and to predict supervisory ratings of job performance. It used a multidisciplinary approach by considering research and theories on cognitive ability from educational, developmental, cognitive, and I–O psychology to investigate multivariate relationships between cognitive ability dimensions and performance outcomes.

1.1. Concerns about Positive Manifold of Cognitive Ability

Any study of multiple cognitive ability dimensions naturally should address the concern that the positive manifold (i.e., the positive intercorrelations) found among specific measures of cognitive ability might limit the utility of class-based approaches, such as Latent Profile Analysis (LPA). There are several reasons why exploring multiple dimensions of cognitive ability using class-based approaches such as LPA can be beneficial and informative. First, dominant general factors and positive manifold have been found in a variety of constructs beyond cognitive ability, including constructs (e.g., personality, emotional intelligence) for which class-based analyses have been used successfully.
Ree et al. (2015) noted that positive manifold is common among many constructs (not just cognitive ability). For example, they presented evidence that the following constructs show positive manifold and a dominant general factor (DGF): emotional intelligence, personality, psychomotor ability, job performance, job satisfaction, core self-evaluation, leadership, and physical ability. Ree and colleagues indicated that the percentage of variance accounted for by the DGF for cognitive ability measures ranged from 41% to 64%, whereas the percentage of variance accounted for by the DGF for personality measures ranged from 40% to 50%. Another construct that has multiple dimensions and has been investigated with class-based approaches is emotional intelligence (Keefer et al. 2012). Ree and colleagues noted that the percentage of variance accounted for by the DGF for emotional intelligence measures ranges from 52% to 70%. Thus, although personality and emotional intelligence, for example, have dominant general factors and positive manifold, researchers have nevertheless been able to produce meaningfully different sets of classes or profiles using the separate dimensions of personality and emotional intelligence.
Studies examining specific cognitive ability dimensions should therefore be conducted despite positive manifold. Concerns about positive manifold have previously contributed to a lack of research in I–O psychology on specific dimensions of cognitive ability, even as other areas of psychology (such as clinical, developmental, and educational psychology) have made important progress in this area.

1.2. Person-Centered Approach

An issue of the journal Industrial and Organizational Psychology: Perspectives on Science and Practice included a focal article and several commentaries on person-centric psychology (Weiss and Rupp 2011). In contrast to the prevailing paradigm in I–O psychology that focuses on variables and relationships between variables, person-centric psychology focuses on the person, and takes a holistic and dynamic view of people (Foti et al. 2011). Person-centric approaches utilize multiple individual difference characteristics or dimensions to identify profiles or patterns of individuals who are similar (Wang et al. 2013).
The person-centered approach has been used with personality variables (Asendorpf 2015), but very little (if any) research has adopted this approach in the cognitive ability or intelligence domain. This study uses a person-centered approach to examine multiple dimensions of cognitive ability and how they relate to several performance outcomes. This study uses Latent Profile Analysis (LPA), which is a relatively new statistical approach that identifies homogeneous groups based on a number of predictor variables, such as cognitive ability dimensions. In particular, LPA (described further below) can be used to determine whether interactions between different cognitive dimensions can help organize respondents into groups with similar profiles. The identification of profile groups using Latent Profile Analysis can reveal new multivariate relationships among predictors and criteria that can contribute to improving the selection and job classification of employees in organizations.

1.3. Latent Profile Analysis

Latent profile analysis (LPA) is a multivariate approach that defines classes of people based on common characteristics (Merz and Roesch 2011; Spurk et al. 2020). It is an empirically driven method that allows researchers to examine multiple observed dimensions simultaneously to define these classes via maximum likelihood estimation. For example, LPA can use scores on all of the cognitive ability scales simultaneously to define classes. The primary goal is to maximize the homogeneity within groups (i.e., individuals within a class/profile should look similar) and maximize the heterogeneity between groups (i.e., individuals in different classes/profile groups should look different). Models are estimated with classes added iteratively to determine which model provides the best fit to the data. Latent profile analysis also has the advantage of allowing the testing of variables that are antecedents or outcomes of the different classes and profile groups.
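For readers unfamiliar with the model, the standard LPA specification can be summarized as follows. This is a generic textbook formulation under the usual local-independence assumption, not notation taken from the sources cited above: each person's vector of observed scores is treated as arising from a finite mixture of normal distributions, one per latent profile.

```latex
% Generic latent profile model for person i's scores on J observed dimensions,
% with K latent profiles, mixing proportions \pi_k, and normal densities \phi
% having profile-specific means \mu_{jk} and variances \sigma^2_{jk}.
\begin{equation*}
f(\mathbf{y}_i) \;=\; \sum_{k=1}^{K} \pi_k \prod_{j=1}^{J}
  \phi\!\bigl(y_{ij} \mid \mu_{jk}, \sigma^{2}_{jk}\bigr),
\qquad \sum_{k=1}^{K} \pi_k = 1 .
\end{equation*}
```

Maximum likelihood estimation chooses the mixing proportions, means, and variances that make the observed data most likely, and Bayes' theorem then yields each person's posterior probability of membership in each profile.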
Many cognitive ability dimensions have been found to covary (Carroll 1993). When considering multiple cognitive ability dimensions as predictors, the number of higher-order interactions is quite high, which can result in statistical problems such as reduced statistical power (Cohen et al. 2003). Because it is often impractical to model all higher-order interactions of interest, person-centered statistical approaches such as LPA can be used to mimic higher-order interaction terms (Merz and Roesch 2011). Further, LPA can organize interactive effects as subtypes in a way that offers a brief and simple summary of complicated relationships (Herzberg and Roth 2006). For this study, LPA was used to derive categorical latent variables that represent classes of individuals who share similar cognitive ability profiles (Lanza et al. 2003).
Psychologists have spent considerable time classifying attributes of individuals using, for example, the Big Five personality dimensions, but much less time classifying people themselves (Robins and Tracy 2003). Latent profile analysis provides a way to classify people and to focus more holistically on types of people, and thus works well with the person-centered approach. An increasing number of studies have used personality dimensions to classify people, but few (if any) studies have used cognitive ability dimensions. Thus, this study is among the first to do so.
Although evaluating the first-order effects of the cognitive dimensions might be considered a reasonable way to examine how cognitive ability is related to work outcomes, this method overlooks empirical evidence that cognitive dimensions do not exist in isolation. That is, although simultaneously entering the multiple dimensions as predictors in a statistical model controls for common variance, these types of models are only informative with regard to the additive effects (i.e., first order effects) of these dimensions (Merz and Roesch 2011) and preclude the possibility that the multiple dimensions can be modeled as multiplicative effects (i.e., interactions). As noted earlier, when considering multiple dimensions as predictors, the number of higher-order interactions is quite high, which can result in reduced statistical power (Cohen et al. 2003). Because it is often impractical to model all higher-order interactions, person-centered statistical approaches such as LPA can be used to mimic higher-order interaction terms (Lanza et al. 2010; Spurk et al. 2020).

2. Present Investigation

Research using LPA provides an innovative examination of cognitive ability profiles and their potential links to performance outcomes. In particular, examining cognitive profiles can provide an understanding of different configurations of cognitive abilities. Although general intelligence is important for the prediction of job performance, research indicates that specific cognitive abilities can play an important role in predicting job and career success (Park et al. 2007; Webb et al. 2007). Specifically, research has shown the value of specific abilities in predicting performance using primarily correlation and regression methods (Lang and Kell 2020; Lang et al. 2010; Nye et al. 2022; Wee et al. 2014). These regression methods use a variable-centered approach that is common in I–O psychology and the psychology field in general. This paper has extended this work by investigating multiple cognitive ability measures simultaneously with the adoption of a person-centered approach (Foti et al. 2011). A number of studies have used the person-centered approach (e.g., Gabriel et al. 2015; Grunschel et al. 2013; Meyer et al. 2013); however, none to our knowledge has used the person-centered approach in the cognitive ability domain.
Empirical research indicates that specific cognitive ability dimensions are relatively independent from each other, and thus, they are likely to contribute to differing cognitive profiles that have meaningfully different cognitive strengths (Carroll 1993; Fleishman and Reilly 1992). In particular, research indicates that the cognitive ability measure (Armed Services Vocational Aptitude Battery, or ASVAB) used in this study has relatively independent dimensions (Segall 2004; Welsh et al. 1990).
Once profile groups based on cognitive ability dimensions are established, they can be examined in relation to the performance outcomes to assess for meaningful prediction based on these latent classes. Investigating such profiles using LPA is relatively new to I–O psychology, and there have been increased calls for such research (e.g., Foti et al. 2011). By adopting a person-centered view and utilizing a relatively new multivariate approach to investigate relationships between multiple cognitive dimensions, the LPA results can help enhance the understanding of ways individuals differ in cognitive ability and enhance performance prediction as well. In particular, LPA can help to identify and understand previously unobserved subpopulations (Wang and Hanges 2011), which may then help with making more efficient and accurate selection and classification decisions. Based on reviews of the literature, the following research questions will be examined:
RQ1: Does Latent Profile Analysis provide evidence of multiple classes/profiles based on cognitive ability dimensions?
RQ2: Do performance outcomes differ across the cognitive ability profiles?

3. Materials and Method

3.1. Sample and Measures

The sample for this study comes from data from the Army Research Institute’s Personnel Assessment Research Unit (Knapp and Heffner 2010). The sample comprises 1681 Army recruits. In terms of the gender breakdown, 151 (9%) were female, and 1530 (91%) were male. The LPA analyses included 6 of the 10 subdimensions from the Armed Services Vocational Aptitude Battery (ASVAB). This subset focuses on the non-overlapping dimensions that still generally cover the overall domain of cognitive ability. The ASVAB is split into 5 Knowledge subtests and 5 Ability subtests. The current study used 2 dimensions from Knowledge subtests (General Science, Mechanical Comprehension) and 4 dimensions from the Ability subtests (Verbal Expression, Arithmetic Reasoning, Paragraph Comprehension, and Assembling Objects). From the original 10 ASVAB dimensions, this approach resulted in dropping Word Knowledge (which was correlated 0.93 with Verbal Expression) and Math Knowledge (which was correlated 0.51 with Arithmetic Reasoning). Other dimensions that were dropped were Automotive Shop and Electronics Information, both of which had correlations near 0.45 with Mechanical Comprehension. Overall, the dimensions that were not retained had correlations ranging from 0.45 to 0.93 with the dimensions that were retained.
Outcome measures included behaviorally anchored performance rating scales, which were completed by supervisors. Performance was rated on 3 dimensions. Effort was a 3-item measure (internal consistency reliability of 0.89) assessing Soldiers’ persistence and initiative demonstrated when completing study, practice, preparation, and participation activities during training. Discipline was a 5-item measure (internal consistency reliability of 0.90) assessing Soldiers’ willingness to follow directions and regulations, and to behave in a manner consistent with the Army’s Core Values (e.g., showing up on time and showing proper respect for superiors). Peer Leadership was a 3-item measure (internal consistency reliability of 0.87) assessing Soldiers’ proficiency in leading their peers (e.g., gaining the cooperation of peers, taking on leader roles as assigned, and giving clear directions to peers) when assigned to a leadership position.

3.2. Analytic Approach

Data analyses were conducted using the Statistical Package for the Social Sciences (SPSS) and the Mplus software program (Muthén and Muthén 2017). First, descriptive statistics were used to examine the means, standard deviations, and correlations among the relevant variables. Next, Latent Profile Analysis (LPA) was used to derive categorical latent variables, which represent classes of individuals who share similar profiles (Lanza et al. 2003). After model assumptions were assessed (Bauer 2007), the optimal number of classes/profiles was determined. This process required specifying and testing multiple class solutions (1-class, 2-class, 3-class, etc.). Each class solution was examined using two types of estimation procedures, as implemented in Mplus: (1) a full-information maximum likelihood approach and (2) a Bayesian approach. From these models, the “best-fitting” model was identified. The Lo–Mendell–Rubin Adjusted Likelihood Ratio Test (LMRT; Lo et al. 2001) is an inferential statistical test of relative model fit. The LMRT indicates whether a model with k latent classes/profiles provides a statistically significant improvement in fit over a model with k-1 latent classes/profiles; it does so by approximating the distribution of the difference between the two log likelihood values (rather than relying on the χ2 distribution). Thus, a significant LMRT indicates that a more complex model (e.g., 3-class) fits better than a less complex model (e.g., 2-class). A second inferential test used for class enumeration was the Bootstrapped Likelihood Ratio Test (BLRT; Arminger et al. 1999). Rather than approximating the distribution of the log likelihood difference analytically as the LMRT does, the BLRT estimates that distribution empirically through repeated (bootstrap) resampling, allowing competing models to be compared.
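The analyses described above were conducted in Mplus, and no analysis code appears in the original article. As an illustrative sketch only (an assumption for exposition, not the authors' workflow), the following Python snippet shows the same class-enumeration logic using scikit-learn's GaussianMixture, which fits latent profile (Gaussian mixture) models by maximum likelihood. The function name, variable names, diagonal covariance structure, and relative-entropy calculation are illustrative choices; the LMRT and BLRT are provided by dedicated mixture-modeling software such as Mplus and are not included here.

```python
# Illustrative sketch (assumption): class enumeration for a latent profile analysis
# using scikit-learn's GaussianMixture; the authors conducted these analyses in Mplus.
import numpy as np
from sklearn.mixture import GaussianMixture

def enumerate_profiles(X, max_classes=8, seed=0):
    """Fit 1- to max_classes-profile solutions and collect basic fit statistics."""
    results = []
    n = X.shape[0]
    for k in range(1, max_classes + 1):
        model = GaussianMixture(
            n_components=k,
            covariance_type="diag",  # profile-specific means/variances, local independence
            n_init=20,               # multiple random starts to avoid local maxima
            random_state=seed,
        ).fit(X)
        post = model.predict_proba(X)  # posterior class probabilities for each person
        # Relative entropy: values near 1 indicate well-separated classes.
        entropy = 1.0 if k == 1 else 1.0 - (
            -np.sum(post * np.log(post + 1e-12)) / (n * np.log(k))
        )
        results.append({
            "classes": k,
            "aic": model.aic(X),
            "bic": model.bic(X),
            "entropy": round(float(entropy), 2),
            "class_%": np.round(post.mean(axis=0) * 100, 1),
        })
    return results

# X would be an (n x 6) array holding the six ASVAB dimension scores; the candidate
# solutions are then compared on AIC/BIC, entropy, and substantive interpretability.
```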
Several fit indicators based on information criteria were also employed: the Akaike Information Criterion (AIC; Akaike 1974), the Bayesian Information Criterion (BIC; Schwarz 1978), and the sample size-adjusted BIC (sBIC; Sclove 1987). Each of these information criteria is computed from the log likelihood of an individual model (rather than comparing two log likelihood values, as the LMRT and BLRT do), with lower values indicating better fit. In addition, another statistical indicator used in class enumeration is entropy (Ramaswamy et al. 1993). Entropy is a measure of how well classes or profiles can be distinguished, reflecting how accurately individuals in the sample can be classified under a specific class model. In contrast to other statistical grouping approaches, such as cluster analysis, individuals in LPA are assigned a posterior probability for each class/profile rather than being assigned outright to a single class/profile. Entropy summarizes these posterior probabilities into a single index, with values greater than 0.80 generally considered to indicate good class separation.
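For reference, these indices take their standard forms, written here in generic notation. The sBIC adjustment and the relative-entropy formula shown below are the commonly used versions (e.g., as implemented in Mplus) and are stated as background rather than taken from the article.

```latex
% \log L = maximized log likelihood, p = number of free parameters,
% n = sample size, K = number of classes,
% \hat{p}_{ik} = posterior probability that person i belongs to class k.
\begin{align*}
\mathrm{AIC}     &= -2\log L + 2p,\\
\mathrm{BIC}     &= -2\log L + p\,\ln n,\\
\mathrm{sBIC}    &= -2\log L + p\,\ln\!\left(\frac{n+2}{24}\right),\\
\mathrm{Entropy} &= 1 - \frac{\sum_{i=1}^{n}\sum_{k=1}^{K}\bigl(-\hat{p}_{ik}\,\ln \hat{p}_{ik}\bigr)}{n\,\ln K}.
\end{align*}
```

Lower values of the information criteria indicate better fit, and entropy values closer to 1 indicate cleaner classification.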
The interpretability of each class/profile was also used to determine whether a specific class solution was consistent with past theory and empirical research. Two model parameters are particularly useful in this regard in LPA: (1) latent class probabilities (LCPs) and (2) conditional response means (CRMs). CRMs are analogous to factor loadings (Lanza et al. 2003) in that they refer to the mean of each observed variable within a latent profile. Classes/profiles are substantively characterized by interpreting these means both within and between classes; the CRMs indicate which observed variables best distinguish the separate classes or profiles. Once the classes/profiles are substantively interpreted, the latent class probability (i.e., the proportion of cases within each class/profile) indicates the prevalence of class/profile membership.
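Continuing the illustrative scikit-learn sketch above (again an assumption for exposition, not the authors' Mplus code), both quantities can be read directly from a fitted model:

```python
# Illustrative sketch (assumed scikit-learn analogue, not the authors' Mplus code):
# latent class probabilities (LCPs) and conditional response means (CRMs).
import numpy as np
from sklearn.mixture import GaussianMixture

# X is assumed to be an (n x 6) array of the six ASVAB dimension scores.
best = GaussianMixture(n_components=5, covariance_type="diag",
                       n_init=20, random_state=0).fit(X)

lcp = best.weights_   # latent class probabilities (expected class proportions)
crm = best.means_     # 5 x 6 matrix of conditional response means

# Classification table analogous to Table 3: average posterior probabilities
# within each modally assigned class (diagonal entries reflect classification accuracy).
post = best.predict_proba(X)
modal = post.argmax(axis=1)
classification_table = np.vstack([post[modal == k].mean(axis=0) for k in range(5)])
```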

4. Results

Descriptive statistics and correlations were calculated among the study variables (see Table 1). Latent profile analyses were conducted to identify the best-fitting profile solution. Table 2 shows the results for the profile solutions that ranged from two classes to eight classes. The results indicated that the Akaike Information Criterion (AIC), Bayesian Information Criterion (BIC), and sample size-adjusted BIC (sBIC) generally decreased as the number of profiles increased from two to eight. The sBIC decreased notably from the 4-profile solution (sBIC = 62,038.17) to the 5-profile solution (sBIC = 61,859.75) and then changed relatively little from the 5-profile solution to the 8-profile solution (sBIC = 61,699.32). In addition, although the 3-profile solution presented the largest entropy value (0.78), the Lo–Mendell–Rubin Adjusted Likelihood Ratio Test (LMRT) and the Bootstrapped Likelihood Ratio Test (BLRT) remained significant for more complex solutions, indicating that the 3-profile solution was not the best-fitting model for the data. As such, additional profile solutions were examined. The 6-profile solution had a non-significant LMRT, suggesting that the 5-profile solution was the best-fitting model for the data.
The 5-profile solution was further examined for theoretical and practical significance using conditional response means (CRMs). Based on fit statistics, meaningfulness of the CRMs, the minimum number of individuals per class, and distinct relationships between the profiles and performance outcomes, the 5-profile solution was determined to best represent the underlying data. Notably, profiles within the 5-profile solution had overlapping mean structures, suggesting the response patterns within each profile had unique interactional effects (see Figure 1). Additionally, the 5-profile solution showed evidence of acceptable classification probabilities (see Table 3), suggesting the profiles had reliable classification rates. In Table 3, the accuracy classification probabilities for each of the classes are in the diagonal (higher and closer to 1 is better). The average for the diagonal entries was 0.81.
Each profile highlighted relative cognitive ability strengths based on six dimensions of the ASVAB: General Science, Mechanical Comprehension, Verbal Expression, Arithmetic Reasoning, Paragraph Comprehension, and Assembling Objects. Table 4 shows the overall sample means and conditional response means for the 5-class solution. Profile 1 (13% of the sample) was labeled Arithmetic Reasoning/Assembling Objects. Profile 2 (15% of the sample) was labeled Verbal Expression and Paragraph Comprehension. Profile 3 (35% of the sample) was labeled Math Knowledge/Assembling Objects. Profile 4 (20% of the sample) was labeled Verbal Expression, Paragraph Comprehension, Assembling Objects, and Profile 5 (17% of the sample) was labeled Uniformly High, including General Science, Math, and Verbal.
Next, a 3-step mixture modeling approach (Asparouhov and Muthén 2015) was conducted to examine whether there were significant differences in performance ratings across the cognitive profiles. As shown in Table 5, significant differences were found across classes for the Effort (M = 3.51; SD = 0.74), Discipline (M = 3.77; SD = 0.73), and Peer Leadership (M = 3.39; SD = 0.79) ratings. Those classified as Profile 1 received significantly lower Effort ratings (M = 3.40; SD = 0.73) than Profile 3 (p < 0.01), Profile 4 (p < 0.05), and Profile 5 (p < 0.01). No significant differences in Effort ratings were found between those classified as Profile 3 (M = 3.55; SD = 0.71), Profile 4 (M = 3.56; SD = 0.69), and Profile 5 (M = 3.68; SD = 0.72). Regarding Discipline, those classified as Profile 2 (M = 3.53; SD = 0.80) received significantly lower Discipline ratings than Profile 1 (M = 3.66; SD = 0.71, p < 0.01), Profile 3 (M = 3.79; SD = 0.72, p < 0.05), Profile 4 (M = 3.83; SD = 0.69, p < 0.01), and Profile 5 (M = 3.94; SD = 0.69, p < 0.01). Those classified as Profile 4 (M = 3.83; SD = 0.69) and Profile 5 (M = 3.94; SD = 0.69) also had significantly (p < 0.01) higher Discipline ratings than those classified as Profile 1 (M = 3.66; SD = 0.71). Finally, for Peer Leadership, those classified as Profile 5 received significantly higher ratings (M = 3.56; SD = 0.78) than those classified as Profile 1 (M = 3.31; SD = 0.73, p < 0.01), Profile 2 (M = 3.22; SD = 0.85, p < 0.01), and Profile 4 (M = 3.38; SD = 0.77, p < 0.05), but not than Profile 3 (M = 3.42; SD = 0.78).

5. Discussion

This paper utilized a person-centered approach, multiple cognitive ability dimensions, and latent profile analysis to investigate multivariate relationships between cognitive ability dimensions. Specifically, this study proposed that cognitive ability can be considered more holistically across multiple dimensions, instead of relying on an overall “g” factor. This approach was investigated using a person-centered analysis, in which profiles were generated using latent profile analysis. The results identified five distinct profiles or classes, which differed significantly across the three types of supervisor ratings. Thus, in terms of the research questions proposed earlier in this paper, this study indicated that the Latent Profile Analyses provided evidence of multiple classes/profiles based on cognitive ability dimensions. Further, this study indicated that the performance outcomes differed across the cognitive ability profiles.
As noted earlier in this paper, I–O psychology research on cognitive ability has made little progress in many years, and new perspectives on cognitive ability are needed. This paper provides one new way of examining cognitive ability: a person-centered approach that considers multiple dimensions of cognitive ability simultaneously. That is, applicants and workers are not described and judged only by their overall cognitive ability score, or even by separate specific cognitive ability dimensions. Instead, individuals can be viewed as having patterns of cognitive abilities that interact simultaneously (rather than using a variable-centered approach that considers independent cognitive dimensions). Kaufman (2019) noted that if one really wants to understand the complexities of a person’s intelligence, one can do much better than simply looking at a person’s overall IQ score. Kaufman argued that the overall IQ or intelligence score does not offer nearly as much information as the pattern of meaningful cognitive dimensions. The current study is one of the first to empirically support this argument.
This research has extended the work on person-centered approaches to the cognitive domain. In this study, the LPA results described five profiles that have differing patterns across the cognitive dimensions. More specifically, each of the five profiles had different sets of cognitive strengths. Profile 5 comprised individuals who scored highly across all cognitive dimensions; however, not all applicants or employees can be above average. Thus, it is important to examine the other cognitive profiles or classes. Individuals in Profile 4 were relatively high on Verbal Expression, Paragraph Comprehension, and Assembling Objects. Although they did not have the uniformly high cognitive ability across all dimensions that those in Profile 5 had, those in Profile 4 nevertheless had similar average scores on both the Effort and Discipline performance ratings. As another example, those in Profile 3 (who scored highly in Math Knowledge and Assembling Objects, but not as high in Verbal Expression and Paragraph Comprehension) received high Peer Leadership ratings (which were not significantly different from the Uniformly High members of Profile 5).
It is notable that the profiles did not show a simple linear pattern when examined across the different performance ratings. Specifically, the highest-ability profile was not necessarily statistically significantly different from some of the other profiles on each rating. This finding stands in contrast to how general cognitive ability is typically discussed in the literature, which very often assumes that more cognitive ability is always better (e.g., Schmidt 2002). Thus, the results of this study provide another indication that examining cognitive profiles using the person-centered approach can offer an alternative perspective to the prevailing view.

Strengths, Limitations, and Future Research

A strength of the current study is that it is the first to examine relationships among multiple cognitive dimensions using a person-centered approach and LPA. One limitation with the current study is the heavily male sample. A second limitation is that this study only investigated cognitive ability dimensions as part of the latent profile analysis. Future research should include both cognitive ability and personality dimensions in a latent profile analysis to investigate patterns and profiles across these two important ways in which individuals differ. Following the research on personality profiles identified using the person-centered approach (Asendorpf 2015; Donnellan and Robins 2010), future research should examine whether the cognitive profiles identified in the current study can be replicated in other samples. To our knowledge, no other studies have been conducted using both cognitive ability dimensions and a person-centered approach; thus, the profiles identified will need to be examined with other samples and other cognitive measures. This study used six dimensions from the ASVAB (General Science, Mechanical Comprehension, Verbal Expression, Arithmetic Reasoning, Paragraph Comprehension, and Assembling Objects). Although these six dimensions include both ability and knowledge subtests, other cognitive ability measures differ somewhat on which dimensions they measure. For example, the Wechsler Adult Intelligence Scale—Fourth Edition (WAIS-IV) includes indexes on four factors: verbal comprehension, perceptual reasoning (spatial reasoning), working memory (including arithmetic and quantitative reasoning), and processing speed (Holdnack 2019). As another example, the Stanford–Binet intelligence test assesses five factors: knowledge, quantitative reasoning, visual–spatial processing, working memory, and fluid reasoning (Bain and Allin 2005). Roberts et al. (2000) examined the ASVAB’s factorial structure in relation to the theory of fluid and crystallized intelligence and Carroll’s (1993) three-stratum model that includes narrow abilities (Stratum 1), broad abilities (Stratum 2), and general cognitive abilities (Stratum 3). They found that the ASVAB measured primarily crystallized intelligence, and thus did not measure fluid intelligence or memory as much as other well-known cognitive ability tests. Adopting a person-centered approach and considering important theoretical models of cognitive ability (Carroll 1993; McGrew 2005) while examining additional samples and other cognitive measures will help researchers to develop further understanding of patterns across cognitive dimensions.
Wai et al. (2009) found that spatial ability plays a critical role in developing expertise in STEM (science, technology, engineering, and mathematics) jobs. They recommended that including spatial ability in modern talent searches would identify adolescents with potential for STEM jobs. The current study found that combinations of cognitive abilities (including spatial ability, which is assessed by the Assembling Objects dimension of the ASVAB) are related to work outcomes. Future research should further investigate how spatial ability works in combination with other cognitive dimensions to predict performance outcomes.
Oswald and Hough (2012) noted that research has documented that, even within range-restricted samples of gifted young students, higher SAT scores predict higher levels of career outcomes in middle age (Wai et al. 2005). They noted that researchers could investigate similar questions within occupational samples. For example, how does talent distinguish itself (are there different patterns of cognitive abilities?) within high-ability professionals, even within the same occupation? Oswald and Hough further suggested that knowing more about how highly talented individuals develop could have important implications for understanding and managing talent and varying cognitive abilities in the workforce. Future research should investigate these issues with profiles of multiple cognitive abilities along with incorporating antecedents of these cognitive profiles.
Overall, this study used multiple cognitive ability dimensions and LPA to identify five latent classes. The classes were found to have significant differences across a number of supervisor performance ratings. Future research should continue to use a person-centered approach to investigate cognitive dimensions as part of latent classes that may be predictive of performance outcomes.

Author Contributions

Conceptualization, J.M.C. and R.K.H.; methodology, J.M.C.; statistical analysis, R.K.H.; writing—original draft preparation, J.M.C.; writing—review and editing, J.M.C. and R.K.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and the project and archival data were determined to be exempt and approved by the Institutional Review Board of San Diego State University (protocol number 1112087 on 8 October 2012).

Informed Consent Statement

Informed consent was obtained from all participants involved in the study.

Data Availability Statement

Data used in this study are unavailable due to privacy restrictions.

Acknowledgments

The authors would like to gratefully acknowledge the U.S. Army Research Institute for the Behavioral and Social Sciences for providing the archival data set used in this study. The views expressed in this article, however, are solely those of the authors and should not be construed as the official policy or position of the U.S. Department of Defense, the U.S. Army, or the U.S. Army Research Institute.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Akaike, Hirotugu. 1974. A new look at the statistical model identification. IEEE Transactions on Automatic Control 19: 716–23. [Google Scholar] [CrossRef]
  2. Arminger, Gerhard, Petra Stein, and Jörg Wittenberg. 1999. Mixtures of conditional mean-and covariance-structure models. Psychometrika 64: 475–94. [Google Scholar] [CrossRef]
  3. Asendorpf, Jens B. 2015. Person-centered approaches to personality. In APA Handbook of Personality and Social Psychology, Vol. 4. Personality Processes and Individual Differences. Edited by M. Mikulincer, P. R. Shaver, M. L. Cooper and R. J. Larsen. Washington, DC: American Psychological Association, pp. 403–24. [Google Scholar]
  4. Asparouhov, Tihomir, and Bengt Muthén. 2015. Auxiliary variables in mixture modeling: Three-step approaches using Mplus. Structural Equation Modeling: A Multidisciplinary Journal 21: 329–41. [Google Scholar] [CrossRef]
  5. Bain, Sherry K., and Jessica D. Allin. 2005. Book review: Stanford–Binet intelligence scales (Fifth Edition). Journal of Psychoeducational Assessment 23: 87–95. [Google Scholar]
  6. Bauer, Daniel J. 2007. Observations on the use of growth mixture models in psychological research. Multivariate Behavioral Research 42: 757–86. [Google Scholar] [CrossRef]
  7. Carroll, John B. 1993. Human Cognitive Abilities: A Survey of Factor-Analytic Studies. Cambridge: Cambridge University Press. [Google Scholar]
  8. Cohen, Jacob, Patricia Cohen, Stephen G. West, and Leona S. Aiken. 2003. Applied Multiple Regression/Correlation Analysis for the Behavioral Sciences, 3rd ed. Hillsdale: Erlbaum. [Google Scholar]
  9. Conte, Jeffrey M., and Frank J. Landy. 2019. Work in the 21st Century: An Introduction to Industrial and Organizational Psychology, 6th ed. Hoboken: Wiley. [Google Scholar]
  10. Donnellan, M. Brent, and Richard W. Robins. 2010. Resilient, overcontrolled, and undercontrolled personality types: Issues and controversies. Personality and Social Psychology Compass 4: 1070–83. [Google Scholar] [CrossRef]
  11. Fleishman, Edwin A., and Maureen E. Reilly. 1992. Handbook of Human Abilities: Definitions, Measurements, and Job Task Requirements. Palo Alto: Consulting Psychologists Press. [Google Scholar]
  12. Foti, Roseanne J., Nicole J. Thompson, and Sarah F. Allgood. 2011. The pattern-oriented approach: A framework for the experience of work. Industrial and Organizational Psychology: Perspectives on Science and Practice 4: 122–25. [Google Scholar] [CrossRef]
  13. Fuchs, Lynn S., David C. Geary, Donald L. Compton, Douglas Fuchs, Carol L. Hamlett, Pamela M. Seethaler, Joan D. Bryant, and Christopher Schatschneider. 2010. Do different types of school mathematics development depend on different constellations of numerical versus general cognitive abilities? Developmental Psychology 46: 1731–46. [Google Scholar] [CrossRef] [PubMed]
  14. Gabriel, Allison S., Michael A. Daniels, James M. Diefendorff, and Gary J. Greguras. 2015. Emotional labor actors: A latent profile analysis of emotional labor strategies. Journal of Applied Psychology 100: 863–79. [Google Scholar] [CrossRef]
  15. Grunschel, Carola, Justine Patrzek, and Stefan Fries. 2013. Exploring different types of academic delayers: A latent profile analysis. Learning and Individual Differences 23: 225–33. [Google Scholar] [CrossRef]
  16. Herzberg, Philipp Yorck, and Marcus Roth. 2006. Beyond resilients, undercontrollers, and overcontrollers? An extension of personality prototype research. European Journal of Personality 20: 5–28. [Google Scholar] [CrossRef]
  17. Holdnack, James A. 2019. The development, expansion, and future of the WAIS-IV as a cornerstone in comprehensive cognitive assessments. In Handbook of Psychological Assessment. Cambridge: Academic Press, pp. 103–39. [Google Scholar]
  18. Kaufman, Scott B. 2019. Toward a New Frontier in Human Intelligence: The Person-Centered Approach. Scientific American. Available online: https://blogs.scientificamerican.com/beautiful-minds/toward-a-new-frontier-in-human-intelligence-the-person-centered-approach/ (accessed on 1 December 2022).
  19. Keefer, Kateryna V., James D. A. Parker, and Laura M. Wood. 2012. Trait emotional intelligence and university graduation outcomes using Latent Profile Analysis to identify students at risk for degree noncompletion. Journal of Psychoeducational Assessment 30: 402–13. [Google Scholar] [CrossRef]
  20. Knapp, Deirdre J., and Tonia S. Heffner. 2010. Expanded Enlistment Eligibility Metrics (EEEM): Recommendations on a Non-Cognitive Screen for New Soldier Selection (Technical Report 1267). Alexandria: U.S. Army Research Institute for the Behavioral and Social Sciences. [Google Scholar]
  21. Lang, Jonas W. B., and Harrison J. Kell. 2020. General mental ability and specific abilities: Their relative importance for extrinsic career success. Journal of Applied Psychology 105: 1047–61. [Google Scholar] [CrossRef] [PubMed]
  22. Lang, Jonas W. B., Martin Kersting, Ute R. Hülsheger, and Jessica Lang. 2010. General mental ability, narrower cognitive abilities, and job performance: The perspective of the nested-factors model of cognitive abilities. Personnel Psychology 63: 595–640. [Google Scholar] [CrossRef]
  23. Lanza, Stephanie T., Brian P. Flaherty, and Linda M. Collins. 2003. Latent class and latent transition analysis. In Handbook of Psychology: Research Methods in Psychology. Edited by J. A. Schinka and W. A. Velicer. New York: Wiley, pp. 663–85. [Google Scholar]
  24. Lanza, Stephanie T., Brittany L. Rhoades, Robert L. Nix, and Mark T. Greenberg. 2010. Modeling the interplay of multilevel risk factors for future academic and behavior problems: A person-centered approach. Development and Psychopathology 22: 313–35. [Google Scholar]
  25. Lo, Yungtai, Nancy R. Mendell, and Donald B. Rubin. 2001. Testing the number of components in a normal mixture. Biometrika 88: 767–78. [Google Scholar] [CrossRef]
  26. McGrew, Kevin S. 2005. The Cattell-Horn-Carroll theory of cognitive abilities: Past, present, and future. In Contemporary Intellectual Assessment: Theories, Tests, and Issues. Edited by D. P. Flanagan and P. L. Harrison. New York: Guilford Press, pp. 136–81. [Google Scholar]
  27. Merz, Erin L., and Scott C. Roesch. 2011. A latent profile analysis of the Five Factor Model of personality: Modeling trait interactions. Personality and Individual Differences 51: 915–19. [Google Scholar] [CrossRef]
  28. Meyer, John P., Chester Kam, Irina Goldenberg, and Nicholas L. Bremner. 2013. Organizational commitment in the military: Application of a profile approach. Military Psychology 25: 381–401. [Google Scholar] [CrossRef]
  29. Muthén, Linda K., and Bengt O. Muthén. 2017. Mplus User’s Guide, 8th ed. Los Angeles: Muthén and Muthén. [Google Scholar]
  30. Nye, Christopher D., Jingjing Ma, and Serena Wee. 2022. Cognitive ability and job performance: Meta-analytic evidence for the validity of narrow cognitive abilities. Journal of Business and Psychology 37: 1119–39. [Google Scholar] [CrossRef]
  31. Oswald, Fred L., and Leatta Hough. 2012. I–O 2.0 from intelligence 1.5: Staying (just) behind the cutting edge of intelligence theories. Industrial and Organizational Psychology 5: 172–75. [Google Scholar] [CrossRef]
  32. Park, Gregory, David Lubinski, and Camilla P. Benbow. 2007. Contrasting intellectual patterns predict creativity in the arts and sciences: Tracking intellectually precocious youth over 25 years. Psychological Science 18: 948–52. [Google Scholar] [CrossRef]
  33. Ramaswamy, Venkatram, Wayne S. DeSarbo, David J. Reibstein, and William T. Robinson. 1993. An empirical pooling approach for estimating marketing mix elasticities with PIMS data. Marketing Science 12: 103–24. [Google Scholar] [CrossRef]
  34. Ree, Malcolm James, Thomas R. Carretta, and Mark S. Teachout. 2015. Pervasiveness of dominant general factors in organizational measurement. Industrial and Organizational Psychology: Perspectives on Science and Practice 8: 409–27. [Google Scholar] [CrossRef]
  35. Reeve, Charlie L., Charles Scherbaum, and Harold Goldstein. 2015. Manifestations of intelligence: Expanding the measurement space to reconsider specific cognitive abilities. Human Resource Management Review 25: 28–37. [Google Scholar] [CrossRef]
  36. Roberts, Richard D., Ginger Nelson Goff, Fadi Anjoul, Patrick C. Kyllonen, Gerry Pallier, and Lazar Stankov. 2000. The armed services vocational aptitude battery (ASVAB): Little more than acculturated learning (Gc)!? Learning and Individual Differences 12: 81–103. [Google Scholar] [CrossRef]
  37. Robins, Richard W., and Jessica L. Tracy. 2003. Setting an agenda for a person-centered approach to personality development: Commentary. Monographs of the Society for Research in Child Development 68: 110–22. [Google Scholar]
  38. Sackett, Paul R., Charlene Zhang, Christopher M. Berry, and Filip Lievens. 2022. Revisiting meta-analytic estimates of validity in personnel selection: Addressing systematic overcorrection for restriction of range. Journal of Applied Psychology 107: 2040–68. [Google Scholar] [CrossRef]
  39. Scherbaum, Charles A., Harold W. Goldstein, Kenneth P. Yusko, Rachel Ryan, and Paul J. Hanges. 2012. Intelligence 2.0: Reestablishing a research program on g in I–O psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice 5: 128–48. [Google Scholar] [CrossRef]
  40. Schmidt, Frank L. 2002. The role of general cognitive ability and job performance: Why there cannot be a debate. Human Performance 15: 187–210. [Google Scholar]
  41. Schmidt, Frank L., and John E. Hunter. 1998. The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin 124: 262–74. [Google Scholar] [CrossRef]
  42. Schmidt, Frank L., and John E. Hunter. 2004. General mental ability in the world of work: Occupational attainment and job performance. Journal of Personality and Social Psychology 86: 162–73. [Google Scholar] [CrossRef] [PubMed]
  43. Schneider, W. Joel, and Daniel A. Newman. 2015. Intelligence is multidimensional: Theoretical review and implications of specific cognitive abilities. Human Resource Management Review 25: 12–27. [Google Scholar] [CrossRef]
  44. Schwarz, Gideon. 1978. Estimating the dimension of a model. Annals of Statistics 6: 461–64. [Google Scholar] [CrossRef]
  45. Sclove, Stanley L. 1987. Application of model-selection criteria to some problems in multivariate analysis. Psychometrika 52: 333–43. [Google Scholar] [CrossRef]
  46. Segall, Daniel O. 2004. Development and Evaluation of the 1997 ASVAB Score Scale (Technical Report No. 2004-002). Seaside: Defense Manpower Data Center. [Google Scholar]
  47. Spurk, Daniel, Andreas Hirschi, Mo Wang, Domingo Valero, and Simone Kauffeld. 2020. Latent profile analysis: A review and “how to” guide of its application within vocational behavior research. Journal of Vocational Behavior 120: 103445. [Google Scholar] [CrossRef]
  48. Wai, Jonathan, David Lubinski, and Camilla P. Benbow. 2005. Creativity and occupational accomplishments among intellectually precocious youth: An age 13 to age 33 longitudinal study. Journal of Educational Psychology 97: 484–92. [Google Scholar] [CrossRef]
  49. Wai, Jonathan, David Lubinski, and Camilla P. Benbow. 2009. Spatial ability for STEM domains: Aligning over 50 years of cumulative psychological knowledge solidifies its importance. Journal of Educational Psychology 101: 817–35. [Google Scholar] [CrossRef]
  50. Wang, Mo, and Paul Hanges. 2011. Latent class procedures: Applications to organizational research. Organizational Research Methods 14: 24–31. [Google Scholar] [CrossRef]
  51. Wang, Mo, Robert R. Sinclair, Le Zhou, and Lindsay E. Sears. 2013. Person-centered analysis: Methods, applications, and implications for occupational health psychology. In Research Methods in Occupational Health Psychology: Measurement, Design, and Data Analysis. Edited by R. R. Sinclair, M. Wang and L. E. Tetrick. New York: Routledge/Taylor & Francis Group, pp. 349–73. [Google Scholar]
  52. Webb, Rose Mary, David Lubinski, and Camilla Persson Benbow. 2007. Spatial ability: A neglected dimension in talent searches for intellectually precocious youth. Journal of Educational Psychology 99: 397–420. [Google Scholar] [CrossRef]
  53. Wee, Serena, Daniel A. Newman, and Dana L. Joseph. 2014. More than g: Selection quality and adverse impact implications of considering second-stratum cognitive abilities. Journal of Applied Psychology 99: 547–63. [Google Scholar] [CrossRef] [PubMed]
  54. Weiss, Howard M., and Deborah E. Rupp. 2011. Experiencing work: An essay on a person-centric work psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice 4: 83–97. [Google Scholar]
  55. Welsh, John R., Susan K. Kucinkas, and Linda T. Curran. 1990. Armed Services Vocational Battery (ASVAB): Integrative Review of Validity Studies (Technical Report No. 90-22). San Antonio: Brooks Air Force Base, Air Force Systems Command. [Google Scholar]
Figure 1. Cognitive Ability Classes by Average Armed Service Vocational Aptitude Battery Dimension. Note. Profile 1 = Arithmetic Reasoning/Assembling Objects; Profile 2 = Verbal Expression and Paragraph Comprehension; Profile 3 = Math Knowledge/Assembling Objects; Profile 4 = Verbal Expression, Paragraph Comprehension, and Assembling Objects; Profile 5 = Uniformly High including General Science, Math, and Verbal.
Table 1. Correlations among study variables (n = 1681).
Variable                        M       SD      1          2          3          4          5          6
1. General Science              50.54   6.63
2. Mechanical Comprehension     53.47   6.57    0.40 **
3. Verbal Expression            50.35   4.81    0.59 **    0.34 **
4. Arithmetic Reasoning         50.98   5.73    0.27 **    0.36 **    0.24 **
5. Paragraph Comprehension      51.47   4.78    0.37 **    0.29 **    0.71 **    0.27 **
6. Assembling Objects           54.65   7.29    0.21 **    0.42 **    0.16 **    0.34 **    0.18 **
7. Gender                       0.09    0.29    −0.04      −0.20 **   0.06 *     −0.05      0.07 **    −0.03
Note. * indicates p < 0.05. ** indicates p < 0.01. Gender is coded 0 = male, 1 = female.
Table 2. Model Fit Indices for the Latent Profile Analysis Solutions (n = 1681).
Solution   LMRT (p)           BLRT (p)   AIC         BIC         sBIC        Entropy   %'s for Classes               No. Parameters
2 class    169.52 (<0.001)    <0.001     62,706.91   62,810.03   62,749.67   0.75      57, 43                        19
3 class    540.07 (<0.001)    <0.001     62,170.45   62,311.56   62,228.96   0.78      16, 52, 32                    26
4 class    216.38 (0.024)     <0.001     61,963.91   62,143.01   62,038.17   0.71      15, 20, 33, 33                33
5 class    204.25 (<0.001)    <0.001     61,769.74   61,986.82   61,859.75   0.71      13, 15, 35, 20, 17            40
6 class    86.57 (0.296)      <0.001     61,695.51   61,950.58   61,801.27   0.70      6, 16, 31, 17, 14, 16         47
7 class    73.13 (0.206)      <0.001     61,634.97   61,928.04   61,756.49   0.72      3, 3, 18, 18, 14, 28, 16      54
8 class    88.01 (0.316)      <0.001     61,562.05   61,893.10   61,699.32   0.72      2, 11, 12, 7, 14, 30, 8, 14   61
Note. LMRT = Lo–Mendell–Rubin Test; BLRT = Bootstrap Likelihood Ratio Test; AIC = Akaike Information Criterion; BIC = Bayesian Information Criterion; sBIC = Sample Size-Adjusted Bayesian Information Criterion.
Table 3. Classification Probabilities for the Most Likely Latent Class Membership (Column) by Latent Class (Row).
Latent Class   1      2      3      4      5
1              0.86   0.04   0.10   0.00   0.00
2              0.04   0.75   0.17   0.04   0.00
3              0.03   0.06   0.84   0.05   0.02
4              0.00   0.04   0.11   0.76   0.09
5              0.00   0.00   0.03   0.12   0.85
Note. 1 = Arithmetic Reasoning/Assembling Objects; 2 = Verbal Expression and Paragraph Comprehension; 3 = Math Knowledge/Assembling Objects; 4 = Verbal Expression, Paragraph Comprehension, Assembling Objects; 5 = Uniformly High including General Science, Math, and Verbal. The accuracy classification probabilities for each of the classes are in the diagonal.
Table 4. Overall Sample Means and Conditional Response Means (CRMs) for the 5-Class Solution (n = 1681).
                                 Overall Mean   S.D.   Profile 1   Profile 2   Profile 3   Profile 4   Profile 5
Sample Size                                            n = 218     n = 250     n = 591     n = 330     n = 292
% of Sample                                            13%         15%         35%         20%         17%
Conditional Response Means (CRM):
General Science (GS)             50.54          6.63   43.06       46.74       50.09       53.57       56.86
Mechanical Comprehension (MC)    53.47          6.57   48.93       47.04       54.31       52.77       61.48
Verbal Expression (VE)           50.35          4.81   42.77       49.44       48.55       54.75       55.46
Arithmetic Reasoning (AR)        50.98          5.73   48.80       46.71       50.94       50.12       57.30
Paragraph Comprehension (PC)     51.47          4.78   44.50       51.04       49.95       55.30       55.76
Assembling Objects (AO)          54.65          7.29   51.97       45.54       57.27       53.69       60.23
Note. Profile 1 = Arithmetic Reasoning/Assembling Objects; Profile 2 = Verbal Expression/Paragraph Comprehension; Profile 3 = Math Knowledge/Assembling Objects; Profile 4 = Verbal Expression, Paragraph Comprehension, Assembling Objects; Profile 5 = Uniformly High including General Science, Math, and Verbal.
Table 5. Comparison of Classes on Performance Ratings (n = 1681).
Outcome             χ2         Profile 1        Profile 2         Profile 3          Profile 4           Profile 5
                               M (SD)           M (SD)            M (SD)             M (SD)              M (SD)
Effort Rating       56.56 **   3.40 (0.73) a    3.26 (0.84) b     3.55 (0.71) c      3.56 (0.69) c       3.68 (0.72) c
Discipline Rating   65.92 **   3.66 (0.71) b    3.53 (0.80) a     3.79 (0.72) b,c    3.83 (0.69) c       3.94 (0.69) c
Peer Leadership     26.45 **   3.31 (0.73) a    3.22 (0.85) a,b   3.42 (0.78) a,c    3.38 (0.77) a,b,c   3.56 (0.78) c
Note. Means in the same row that do not share a superscript are significantly (p < 0.05) different from each other. Profile 1 = Arithmetic Reasoning/Assembling Objects; Profile 2 = Verbal Expression and Paragraph Comprehension; Profile 3 = Math Knowledge/Assembling Objects; Profile 4 = Verbal Expression, Paragraph Comprehension, Assembling Objects; Profile 5 = Uniformly High including General Science, Math, and Verbal. ** p < 0.01.