Browsing by Author "Arias, Victor B."
- Bifactor Models of Attention-Deficit/Hyperactivity Disorder (ADHD): An Evaluation of Three Necessary but Underused Psychometric Indexes (2018). Arias, Victor B.; Ponce, Fernando P.; Núñez, Daniel E.
- Hierarchy and Psychometric Properties of ADHD Symptoms in Spanish Children: An Application of the Graded Response Model (2016). Arias, Victor B.; Núñez, Daniel E.; Martínez-Molina, Agustín; Ponce, Fernando P.; Arias, Benito
- How a Few Inconsistent Respondents Can Confound the Structure of Personality Survey Data: An Example With the Core Self-Evaluations Scale (2023). Arias, Victor B.; Ponce, Fernando P.; Martínez-Molina, Agustín
  In survey data, inconsistent responses due to careless/insufficient effort (C/IE) can lead to problems of replicability and validity. However, screening data before the main analyses is not yet standard practice. We investigated the effect of C/IE responses on the structure of personality survey data by analyzing the Core Self-Evaluations Scale (CSE-S), with the detection of aberrant responses built into the study design. While the original theoretical model of the CSE-S assumes that the construct is unidimensional (Judge et al., 2003), recent studies have argued for a multidimensional solution (positive CSE and negative CSE). We hypothesized that this multidimensionality is not substantive but results from the tendency of C/IE data to generate spurious dimensions. We estimated the confirmatory models before and after removing highly inconsistent response vectors in two independent samples (6% and 4.7% of cases, respectively). Analysis of the raw samples clearly favored retaining the two-dimensional model; analysis of the cleaned datasets suggested retaining a single factor. A C/IE rate of just 6% was enough to confound the results of the factor analysis. This suggests that the positive/negative CSE factor structure is spurious, resulting from uncontrolled wording variance produced by a small proportion of highly inconsistent response vectors. We encourage researchers to include screening for inconsistent responses in their research designs.
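The abstract's central claim, that a small share of C/IE respondents can manufacture a spurious wording factor, is easy to illustrate in simulation. The Python sketch below is a minimal illustration, not the authors' pipeline: the 12-item scale, the straight-lining contamination model, the keying-consistency screen and its cut-off, and the Kaiser eigenvalue rule are all assumptions made for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 12-item, 5-point scale: 6 positively and 6 negatively worded items
# measuring one trait, plus ~6% straight-lining (C/IE) respondents.
n, n_ci = 1000, 60
trait = rng.normal(size=(n, 1))
loadings = rng.uniform(0.5, 0.8, size=(1, 12))
keys = np.array([1] * 6 + [-1] * 6)
latent = trait @ loadings * keys + rng.normal(scale=0.6, size=(n, 12))
X = np.clip(np.round(3 + 1.2 * latent), 1, 5)
base = rng.integers(1, 6, size=(n_ci, 1)).astype(float)           # straight-liners
X[:n_ci] = np.clip(base + rng.integers(-1, 2, size=(n_ci, 12)), 1, 5)

# Illustrative keying-consistency screen (not the authors' index): attentive
# respondents should score similarly on positive and reverse-coded negative items.
pos_mean = X[:, :6].mean(axis=1)
neg_mean = (6 - X[:, 6:]).mean(axis=1)
clean = X[np.abs(pos_mean - neg_mean) <= 1.0]                      # assumed cut-off

def n_factors_kaiser(data):
    """Crude dimensionality check: count eigenvalues > 1 of the correlation matrix."""
    return int((np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)) > 1.0).sum())

print("suggested factors, raw:  ", n_factors_kaiser(X))      # tends to be 2 (spurious)
print("suggested factors, clean:", n_factors_kaiser(clean))  # tends to be 1
```

With data of this shape, the raw sample typically yields a second eigenvalue above 1 (a spurious wording dimension) while the screened sample yields one, mirroring the two-factor to one-factor shift the abstract reports.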
- Wording effects in assessment: missing the trees for the forest (2022). Ponce Cisternas, Fernando Patricio; Torres Irribarra, David; Vergés, Álvaro; Arias, Victor B.
  This article examines wording effects that arise when positively and negatively worded items are included in psychological assessment. Wording effects have been analyzed in the literature using statistical approaches based on population homogeneity assumptions (i.e., CFA, SEM), commonly adopting the bifactor model to separate trait variance from wording effects. This article presents an alternative approach that explicitly models population heterogeneity through a latent profile model, based on the idea that a subset of individuals exhibits wording effects. This kind of mixture model makes it possible to simultaneously classify respondents, substantively characterize the differences in their response profiles, and report respondents' results in a comparable manner. Using Rosenberg Self-Esteem Scale (RSES) data from the LISS Panel (N = 6,762) in three studies, we identify a subgroup of participants who respond differentially according to item wording and examine the impact of their responses on the estimation of the RSES measurement model, in terms of global and individual fit, under one-factor and bifactor models. The results support interpreting wording effects as a theoretically proposed differential pattern of response to positively and negatively worded items, and introduce a valuable tool for examining the artifactual or substantive interpretation of such wording effects.
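The alternative the abstract proposes, treating wording effects as population heterogeneity rather than as an extra factor, can be sketched as a latent profile analysis, i.e., a Gaussian mixture with class-specific means. The Python sketch below is illustrative only and does not use the LISS data or the authors' model; the 10-item scale, the ~15% wording-effect subgroup, the keying-ignoring response mechanism, and the two-class solution are all assumptions made for the demo.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n, n_pos, n_neg = 2000, 5, 5                      # 10-item scale, half reverse-worded

# Most respondents honor item keying; an assumed ~15% subgroup answers every
# item in the trait's direction, ignoring the negative wording.
trait = rng.normal(loc=0.8, size=(n, 1))          # trait mean above the scale midpoint
consistent = rng.random(n) > 0.15
keys = np.array([1.0] * n_pos + [-1.0] * n_neg)
sign = np.where(consistent[:, None], keys[None, :], 1.0)
X = np.clip(np.round(3 + 1.1 * trait * sign +
                     rng.normal(scale=0.7, size=(n, n_pos + n_neg))), 1, 5)

# Latent profile analysis as a diagonal-covariance Gaussian mixture:
# two classes, per the theoretical expectation of one wording-effect subgroup.
lpa = GaussianMixture(n_components=2, covariance_type="diag", random_state=0)
labels = lpa.fit_predict(X)

for k in range(2):
    share = (labels == k).mean()
    print(f"class {k} ({share:.0%}): item means {lpa.means_[k].round(2)}")
# One class should show depressed means on the negatively worded items
# (columns n_pos onward); the other, uniformly high means across all items,
# the wording-driven response profile the mixture is meant to isolate.
```

In the published analyses this classification step is followed by examining the flagged subgroup's impact on the one-factor and bifactor measurement models; the sketch stops at recovering the response profiles themselves.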