
You searched for +publisher:"University of Notre Dame" +contributor:("Ying Cheng, Committee Chair"). Showing records 1 – 3 of 3 total matches.

University of Notre Dame

1. Jeffrey M. Patton. Some Consequences of Response Time Model Misspecification in Educational Measurement.

Degree: Psychology, 2014, University of Notre Dame

Response times (RTs) on test items are a valuable source of information concerning examinees and the items themselves. As such, they have the potential to improve a wide variety of measurement activities. However, researchers have found that empirical RT distributions can exhibit a variety of shapes among the items within a single test. Though a number of semiparametric and “flexible” parametric models are available, no single model can accommodate all plausible shapes of empirical RT distributions. Thus the goal of this research was to study a few of the potential consequences of RT model misspecification in educational measurement. In particular, two promising applications of RT models were of interest: examinee ability estimation and item selection in computerized adaptive testing (CAT). First, by jointly modeling RTs and item responses, RTs can be used as collateral information in the estimation of examinee ability. This can be accomplished by embedding separate models for RTs and item responses in Level 1 of a hierarchical model and allowing their parameters to correlate in Level 2. If the RT model is misspecified, a potential drawback of this hierarchical structure is that any negative impact on estimates of the RT model parameters may, in turn, negatively impact ability estimates. However, a simulation study found that estimates of the RT model parameters were robust to misspecification of the RT model. In turn, ability estimates were also robust. Second, by considering the time intensity of items during item selection in CAT, test completion times can be reduced without sacrificing the precision of ability estimates. This can be done by choosing items that maximize the ratio of item information to the examinee’s predicted RT. However, an RT model is needed to make the prediction; if the RT model is misspecified, this method may not perform as intended. 
A simulation study found that whether or not the correct RT model was used to make the prediction had no bearing on test completion times. Additionally, using a simple, average RT as the prediction was just as effective as model-based prediction in reducing test completion times. Advisors/Committee Members: Ying Cheng, Committee Chair, Ke-Hai Yuan, Committee Member, Zhiyong Zhang, Committee Member, Scott E. Maxwell, Committee Member.
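The information-per-time selection rule described above (choose the item maximizing the ratio of item information to the examinee's predicted RT) can be made concrete with a short sketch. This is an illustrative Python fragment assuming a 2PL item response model and externally supplied RT predictions; it is not code from the dissertation, and all names and values are assumptions.

```python
import numpy as np

def item_information(theta, a, b):
    """Fisher information of 2PL items (discrimination a, difficulty b)
    at ability theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

def select_item(theta, a, b, predicted_rt, administered):
    """Return the index of the unadministered item that maximizes
    information per predicted second of response time."""
    ratio = item_information(theta, a, b) / predicted_rt
    ratio[list(administered)] = -np.inf  # exclude items already given
    return int(np.argmax(ratio))
```

With two equally informative items, the rule prefers the faster one, which is how it shortens test completion times without sacrificing precision.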

Subjects/Keywords: adaptive testing; model misfit; item response theory


APA (6th Edition):

Patton, J. M. (2014). Some Consequences of Response Time Model Misspecification in Educational Measurement. (Thesis). University of Notre Dame. Retrieved from https://curate.nd.edu/show/n296ww74p1r

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Patton, Jeffrey M. “Some Consequences of Response Time Model Misspecification in Educational Measurement.” 2014. Thesis, University of Notre Dame. Accessed August 08, 2020. https://curate.nd.edu/show/n296ww74p1r.


MLA Handbook (7th Edition):

Patton, Jeffrey M. “Some Consequences of Response Time Model Misspecification in Educational Measurement.” 2014. Web. 08 Aug 2020.

Vancouver:

Patton JM. Some Consequences of Response Time Model Misspecification in Educational Measurement. [Internet] [Thesis]. University of Notre Dame; 2014. [cited 2020 Aug 08]. Available from: https://curate.nd.edu/show/n296ww74p1r.


Council of Science Editors:

Patton JM. Some Consequences of Response Time Model Misspecification in Educational Measurement. [Thesis]. University of Notre Dame; 2014. Available from: https://curate.nd.edu/show/n296ww74p1r



University of Notre Dame

2. Quinn Nathaniel Lathrop. The Impact of Within-Template Systematic Variation on Response Models.

Degree: Psychology, 2013, University of Notre Dame

With item response data, systematic variation within nested groups of generated items may negatively impact the estimation of item and person parameters. This paper studies a model that can capture the multilevel structure of the data and explain within-template systematic variability. The goal of this model is twofold. First, explaining and removing non-random error may improve ability and item parameter estimates. And second, finding systematic variation can bring insights into the educational process. Simulation results are discussed at length. Advisors/Committee Members: Scott Maxwell, Committee Member, Lijuan Wang, Committee Member, Ying Cheng, Committee Chair.
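The nested structure at issue (generated items grouped under templates, with systematic variation at both levels) can be illustrated with a small simulation. This is a hypothetical sketch: the group sizes and variance components below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_templates, items_per_template = 5, 4

# Level 2: each template has its own mean difficulty.
template_b = rng.normal(loc=0.0, scale=1.0, size=n_templates)

# Level 1: generated items vary systematically around their template's mean.
item_b = template_b[:, None] + rng.normal(0.0, 0.3, (n_templates, items_per_template))

# A single-level model treats all 20 difficulties as exchangeable; a
# multilevel model instead estimates both variance components, separating
# between-template from within-template variation.
```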

Subjects/Keywords: computerized assessment; generated items; explanatory IRT; multilevel models; item response theory


APA (6th Edition):

Lathrop, Q. N. (2013). The Impact of Within-Template Systematic Variation on Response Models. (Thesis). University of Notre Dame. Retrieved from https://curate.nd.edu/show/1z40ks67c16


Chicago Manual of Style (16th Edition):

Lathrop, Quinn Nathaniel. “The Impact of Within-Template Systematic Variation on Response Models.” 2013. Thesis, University of Notre Dame. Accessed August 08, 2020. https://curate.nd.edu/show/1z40ks67c16.


MLA Handbook (7th Edition):

Lathrop, Quinn Nathaniel. “The Impact of Within-Template Systematic Variation on Response Models.” 2013. Web. 08 Aug 2020.

Vancouver:

Lathrop QN. The Impact of Within-Template Systematic Variation on Response Models. [Internet] [Thesis]. University of Notre Dame; 2013. [cited 2020 Aug 08]. Available from: https://curate.nd.edu/show/1z40ks67c16.


Council of Science Editors:

Lathrop QN. The Impact of Within-Template Systematic Variation on Response Models. [Thesis]. University of Notre Dame; 2013. Available from: https://curate.nd.edu/show/1z40ks67c16



University of Notre Dame

3. Quinn Nathaniel Lathrop. IRT and SVD: Implementing Psychometric Methods in New and Complex Situations.

Degree: Psychology, 2015, University of Notre Dame

Most psychometric techniques used to analyze assessment data are designed to work with complete data. The rapid increase in the availability and power of technology has contributed to the growing use of computerized tests and related methods. The data arising from these new and complex situations challenge traditional psychometric techniques because of their size (as there is much more data) and their vast missingness (as students respond to only a small subset of possible items). This dissertation focuses on the effect of missing data on psychometric techniques. When individuals respond to different items of varying difficulty, psychometric techniques that rely on complete data can perform poorly. This dissertation proposes using Singular Value Decomposition (SVD), a matrix decomposition technique often seen in data mining, as a psychometric tool. The major result is that SVD is a viable psychometric tool that appears largely robust to missing data and to the missingness mechanism. This document provides analytical and empirical justification for SVD’s use with psychometric data in the presence of missing data. Chapter 1 introduces relevant IRT techniques such as nonparametric IRT and a nonparametric item fit statistic. Then, in Chapter 2, SVD is introduced, along with an Alternating-Least-Squares (ALS) algorithm that extends the decomposition to missing data. Chapter 3 investigates the large sample properties of using SVD with psychometric data. SVD is shown to be a consistent ordinal estimator of student ability and a consistent ordinal estimator of item easiness. Chapter 4 presents simulation results showing that when students respond to different items of varying difficulty, whether or not the missingness is related to their ability, SVD can rank the students better than proportion correct and can better estimate the true relationship between student ability and the probability of a correct response.
When missingness is related to student ability, SVD can rank the students, in most conditions, better than a parametric IRT model, even when the parametric model is correctly specified. Advisors/Committee Members: Guangjian Zhang, Committee Member, Scott Maxwell, Committee Member, Ying Cheng, Committee Chair, Zhiyong Zhang, Committee Member.
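The ALS extension of SVD to incomplete matrices that the abstract mentions can be sketched for the rank-1 case: alternately solve least-squares problems for the row scores (students) and column scores (items), using only observed entries. This is an illustrative sketch under those assumptions, not the dissertation's implementation.

```python
import numpy as np

def als_rank1(X, n_iter=50):
    """Rank-1 SVD-style fit X[i, j] ~ u[i] * v[j], ignoring NaN (missing)
    entries, via alternating least squares."""
    mask = ~np.isnan(X)
    u = np.ones(X.shape[0])  # student scores
    v = np.ones(X.shape[1])  # item scores
    for _ in range(n_iter):
        for i in range(X.shape[0]):          # update each student score
            m = mask[i]
            u[i] = X[i, m] @ v[m] / (v[m] @ v[m])
        for j in range(X.shape[1]):          # update each item score
            m = mask[:, j]
            v[j] = X[m, j] @ u[m] / (u[m] @ u[m])
    return u, v
```

Because each update uses only the observed cells of a row or column, the fit never requires imputing the missing entries, which is what makes the decomposition usable when each student sees only a small subset of items.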

Subjects/Keywords: missing data; singular value decomposition; data mining; computational statistics; educational data; item response theory; psychometrics


APA (6th Edition):

Lathrop, Q. N. (2015). IRT and SVD: Implementing Psychometric Methods in New and Complex Situations. (Thesis). University of Notre Dame. Retrieved from https://curate.nd.edu/show/pz50gt56z02


Chicago Manual of Style (16th Edition):

Lathrop, Quinn Nathaniel. “IRT and SVD: Implementing Psychometric Methods in New and Complex Situations.” 2015. Thesis, University of Notre Dame. Accessed August 08, 2020. https://curate.nd.edu/show/pz50gt56z02.


MLA Handbook (7th Edition):

Lathrop, Quinn Nathaniel. “IRT and SVD: Implementing Psychometric Methods in New and Complex Situations.” 2015. Web. 08 Aug 2020.

Vancouver:

Lathrop QN. IRT and SVD: Implementing Psychometric Methods in New and Complex Situations. [Internet] [Thesis]. University of Notre Dame; 2015. [cited 2020 Aug 08]. Available from: https://curate.nd.edu/show/pz50gt56z02.


Council of Science Editors:

Lathrop QN. IRT and SVD: Implementing Psychometric Methods in New and Complex Situations. [Thesis]. University of Notre Dame; 2015. Available from: https://curate.nd.edu/show/pz50gt56z02

