You searched for +publisher:"University of Georgia" +contributor:("Allan S. Cohen"). Showing records 1 – 2 of 2 total matches.

No search limiters apply to these results.

University of Georgia

1. Cho, Sun-Joo. A multilevel mixture IRT model for DIF analysis.

Degree: PhD, Educational Psychology, 2007, University of Georgia

The usual methodology for detection of differential item functioning (DIF) is to examine differences among manifest groups formed by characteristics such as gender, ethnicity, and age. Unfortunately, membership in a manifest group is often only modestly related to the actual cause(s) of DIF. Mixture item response theory (IRT) models have been suggested as an alternative methodology for identifying groups formed along the nuisance dimension(s) assumed to be the actual cause(s) of DIF. A multilevel mixture IRT model (MMixIRTM) is described that enables simultaneous detection of DIF at both the examinee and school levels. The MMixIRTM can be viewed as a combination of an IRT model, an unrestricted latent class model, and a multilevel model. Three perspectives on this model were presented: first, the MMixIRTM can be formed by incorporating mixtures into a multilevel IRT model; second, by incorporating a multilevel structure into a mixture IRT model; and third, by including an IRT model in a multilevel unrestricted latent class model. A fully Bayesian estimation of the MMixIRTM was described, including treatment of label switching, choice of priors, and model selection strategies, along with a discussion of scale linkage. A simulation study and a real data example were presented. Advisors/Committee Members: Allan S. Cohen.
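
For context on the mixture IRT framework this abstract builds on, the following is a minimal sketch of a G-class mixture Rasch-type model in the spirit of Rost (1990), not the full MMixIRTM; the symbols theta, beta, and pi are generic notation introduced here for illustration and are not taken from the dissertation:

    P(Y_{ij} = 1 \mid g, \theta_{jg}) = \frac{\exp(\theta_{jg} - \beta_{ig})}{1 + \exp(\theta_{jg} - \beta_{ig})},
    \qquad
    P(\mathbf{y}_{j}) = \sum_{g=1}^{G} \pi_{g} \prod_{i=1}^{I} P(Y_{ij} = y_{ij} \mid g, \theta_{jg})

Because the item difficulties \beta_{ig} are class-specific, DIF shows up as between-class differences in item parameters rather than as differences between manifest groups; the MMixIRTM described above additionally lets latent class membership be structured at both the examinee and school levels.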

Subjects/Keywords: Bayesian Estimation



APA (6th Edition):

Cho, S. (2007). A multilevel mixture IRT model for DIF analysis. (Doctoral Dissertation). University of Georgia. Retrieved from http://purl.galileo.usg.edu/uga_etd/cho_sun-joo_200712_phd

Chicago Manual of Style (16th Edition):

Cho, Sun-Joo. “A multilevel mixture IRT model for DIF analysis.” 2007. Doctoral Dissertation, University of Georgia. Accessed September 23, 2019. http://purl.galileo.usg.edu/uga_etd/cho_sun-joo_200712_phd.

MLA Handbook (7th Edition):

Cho, Sun-Joo. “A multilevel mixture IRT model for DIF analysis.” 2007. Web. 23 Sep 2019.

Vancouver:

Cho S. A multilevel mixture IRT model for DIF analysis. [Internet] [Doctoral dissertation]. University of Georgia; 2007. [cited 2019 Sep 23]. Available from: http://purl.galileo.usg.edu/uga_etd/cho_sun-joo_200712_phd.

Council of Science Editors:

Cho S. A multilevel mixture IRT model for DIF analysis. [Doctoral Dissertation]. University of Georgia; 2007. Available from: http://purl.galileo.usg.edu/uga_etd/cho_sun-joo_200712_phd


University of Georgia

2. Kim, Meereem. Detection of speededness in constructed response items using mixture IRT models.

Degree: PhD, Educational Psychology, 2017, University of Georgia

Speededness effects tend to occur when tests have time limits (Lu & Sireci, 2007). Speededness is normally treated in psychometric models as a "nuisance" factor because it is not the intended focus of the test. When speededness occurs, therefore, its effects intrude on the construct being measured and can seriously degrade the validity of the test results. A number of different approaches have been used to try to detect which examinees exhibit speededness effects. Speededness in constructed response (CR) items, however, has only recently been studied (Kim et al., 2016), although CR items are becoming increasingly prominent in standardized assessments as a means of getting students to produce a response rather than select a choice (Scalise, 2014). In this dissertation, we investigate test speededness in the context of CR items. The first study examined a statistical model for detection of speededness effects in CR items using a two-class mixture graded response model (GRM; Samejima, 1969) for testlets. Traditional IRT models, unfortunately, cannot detect speededness, because its effects violate the assumptions of those models. In this first study, therefore, we considered an alternative model for estimating person and item parameters when speededness effects are present. This approach uses a mixture IRT model (Rost, 1990) that operates, in part, by classifying examinees into one of two latent groups: a speeded group and a nonspeeded group. In the second study, the model from the first study was extended to treat both person and item parameters as random effects. In particular, we investigated the performance of a random item mixture GRM for testlets with item covariates. The random item model considers both persons and items to be randomly sampled from a population (De Boeck, 2008). Treating items as random enables inclusion of item covariates directly in the model, which allows simultaneous detection of speededness effects and examination of the relationship between speededness effects in CR items and the item covariates. In the third study, we described another possible way to characterize latent group membership from a mixture IRT model. In general, a mixture IRT model does not readily provide a qualitative explanation of the latent dimension(s). In this dissertation, we investigated a statistical method for detecting latent themes or topics in the actual text that examinees wrote in answering CR items. This method is latent Dirichlet allocation (LDA; Blei, Ng, & Jordan, 2003), which is used to detect latent topics in text corpora. We investigated the usefulness of LDA for providing information about qualitative differences between the textual responses of speeded and nonspeeded examinees. Advisors/Committee Members: Allan S. Cohen.
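
As a concrete illustration of the LDA step described in the third study, below is a minimal, self-contained Python sketch that applies scikit-learn's LatentDirichletAllocation to a few made-up constructed-response answers; the toy responses, the two-topic choice, and all parameter values are illustrative assumptions and do not come from the dissertation.

    # Toy illustration of latent Dirichlet allocation (LDA) on constructed-response text.
    # Everything below (responses, number of topics, parameters) is assumed for illustration.
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    # Hypothetical constructed-response answers (stand-ins for real examinee text).
    responses = [
        "the slope of the line shows the rate of change",
        "rate of change equals rise over run for this line",
        "ran out of time could not finish the explanation",
        "no time left so I guessed the rest of the answer",
    ]

    # Convert the text to a document-term count matrix.
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(responses)

    # Fit a two-topic LDA model; two topics are chosen only to mirror the
    # speeded / nonspeeded contrast discussed in the abstract.
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    doc_topics = lda.fit_transform(X)  # rows: responses, columns: topic proportions

    # Print the top words for each topic and each response's topic mixture.
    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:5]]
        print(f"Topic {k}: {', '.join(top)}")
    print(doc_topics.round(2))

In practice, the per-response topic proportions could then be compared across the latent classes estimated by the mixture GRM to look for qualitative differences between speeded and nonspeeded responses.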

Subjects/Keywords: Speededness; mixture item response theory; graded response model; testlet effect; latent Dirichlet allocation; constructed response items



APA (6th Edition):

Kim, M. (2017). Detection of speededness in constructed response items using mixture IRT models. (Doctoral Dissertation). University of Georgia. Retrieved from http://hdl.handle.net/10724/37806

Chicago Manual of Style (16th Edition):

Kim, Meereem. “Detection of speededness in constructed response items using mixture IRT models.” 2017. Doctoral Dissertation, University of Georgia. Accessed September 23, 2019. http://hdl.handle.net/10724/37806.

MLA Handbook (7th Edition):

Kim, Meereem. “Detection of speededness in constructed response items using mixture IRT models.” 2017. Web. 23 Sep 2019.

Vancouver:

Kim M. Detection of speededness in constructed response items using mixture IRT models. [Internet] [Doctoral dissertation]. University of Georgia; 2017. [cited 2019 Sep 23]. Available from: http://hdl.handle.net/10724/37806.

Council of Science Editors:

Kim M. Detection of speededness in constructed response items using mixture IRT models. [Doctoral Dissertation]. University of Georgia; 2017. Available from: http://hdl.handle.net/10724/37806
