You searched for subject:(Dimensional analysis)
Showing records 1 – 30 of 592 total matches.

Columbia University
1.
Gao, Yuanjun.
Statistical Machine Learning Methods for High-dimensional Neural Population Data Analysis.
Degree: 2017, Columbia University
URL: https://doi.org/10.7916/D8D2240V
Advances in techniques have been producing increasingly complex neural recordings, posing significant challenges for data analysis. This thesis discusses novel statistical methods for analyzing high-dimensional neural data. Part one discusses two extensions of state space models tailored to neural data analysis. First, we propose using a flexible count data distribution family in the observation model to faithfully capture over-dispersion and under-dispersion of the neural observations. Second, we incorporate nonlinear observation models into state space models to improve the flexibility of the model and get a more concise representation of the data. For both extensions, novel variational inference techniques are developed for model fitting, and simulated and real experiments show the advantages of our extensions. Part two discusses a fast region of interest (ROI) detection method for large-scale calcium imaging data based on structured matrix factorization. Part three discusses a method for sampling from a maximum entropy distribution with complicated constraints, which is useful for hypothesis testing for neural data analysis and many other applications related to maximum entropy formulation. We conclude the thesis with discussions and future work.
Subjects/Keywords: Statistics; Neurosciences; Dimensional analysis
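As a purely illustrative aside (not the thesis's observation model), the short Python sketch below shows the over-dispersion this abstract refers to: negative-binomial spike counts can match a Poisson model's mean while having a much larger variance, which is what a flexible count-distribution family is meant to capture.

```python
# Illustrative only: over-dispersed vs. Poisson spike counts (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
rate, n_bins = 5.0, 10_000          # target mean count per time bin

poisson_counts = rng.poisson(rate, n_bins)
# Negative binomial with the same mean (r*(1-p)/p = 5) but extra dispersion.
r, p = 2.0, 2.0 / (2.0 + rate)
nb_counts = rng.negative_binomial(r, p, n_bins)

for name, x in [("Poisson", poisson_counts), ("Negative binomial", nb_counts)]:
    print(f"{name:18s} mean={x.mean():.2f}  var={x.var():.2f}  "
          f"Fano factor={x.var() / x.mean():.2f}")   # Fano > 1 => over-dispersed
```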

Cornell University
2.
Gaynanova, Irina.
Estimation Of Sparse Low-Dimensional Linear Projections.
Degree: PhD, Statistics, 2015, Cornell University
URL: http://hdl.handle.net/1813/40643
Many multivariate analysis problems are unified under the framework of linear projections. These projections can be tailored towards the analysis of variance (principal components), classification (discriminant analysis) or network recovery (canonical correlation analysis). Traditional techniques form these projections by using all of the original variables; however, in recent years there has been a lot of interest in performing variable selection. The main goal of this dissertation is to elucidate some of the fundamental issues that arise in high-dimensional multivariate analysis and provide computationally efficient and theoretically sound alternatives to existing heuristic techniques.
Advisors/Committee Members: Booth, James (chair), Wells, Martin Timothy (co-chair), Mezey, Jason G. (committee member), Wegkamp, Marten H. (committee member).
Subjects/Keywords: multivariate analysis; high-dimensional statistics; classification
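As a generic illustration of a sparse linear projection (scikit-learn's SparsePCA, which is not the estimator developed in this dissertation), the sketch below forms components whose loadings use only a subset of the original variables:

```python
# Illustrative sketch: sparse loadings via SparsePCA on synthetic data.
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 200))              # n = 50 observations, p = 200 variables

spca = SparsePCA(n_components=2, alpha=1.0, random_state=1)
scores = spca.fit_transform(X)              # low-dimensional projection of X
print("non-zero loadings per component:",
      (spca.components_ != 0).sum(axis=1))  # far fewer than 200 per component
```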

University of Guelph
3.
Connolly, Jessica.
Assessing factors that influence position accuracy in a hydroacoustic telemetry system.
Degree: MS, Department of Mathematics and Statistics, 2012, University of Guelph
URL: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/3961
Simulation modelling was used to quantify the accuracy of positions estimated in a three-dimensional underwater environment. Time of arrival differences combined with multilateration methods were used to make positional estimates of a signal source (acoustic tag). The network studied was used to examine position estimates of aquatic organisms within a sensor (hydrophone) array. Hydrophone position uncertainty (distribution and variance), background noise converted to a measurement of signal strength in the form of a signal-to-noise ratio, a signal-to-noise ratio threshold, and geometry of the hydrophone array were considered. Each of these factors was studied at two levels by way of a 2^5 factorial design and analyzed with ANOVA to determine their influence on three-dimensional positioning error. The level of background noise and hydrophone geometry were the two most influential factors in position accuracy. When a high level of background noise was present, it was essential that hydrophone geometry was as close to ideal as possible to ensure accurate position estimates.
Advisors/Committee Members: Umphrey, Gary (advisor).
Subjects/Keywords: acoustic telemetry; factorial analysis; 3-dimensional errors
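For readers unfamiliar with the setup, here is a hedged sketch of a 2^5 full factorial design analysed with ANOVA; the factor names echo those in the abstract, but the response model and effect sizes are invented for illustration and are not the thesis's data.

```python
# Illustrative 2^5 factorial design with a synthetic response, analysed by ANOVA.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

factors = ["noise", "geometry", "snr_threshold", "pos_distribution", "pos_variance"]
design = pd.DataFrame(list(itertools.product([-1, 1], repeat=len(factors))),
                      columns=factors)              # 32 runs, each factor at two levels

rng = np.random.default_rng(2)
# Made-up response: background noise and array geometry dominate the error.
design["error"] = (2.0 * design["noise"] - 1.5 * design["geometry"]
                   + 0.8 * design["noise"] * design["geometry"]
                   + rng.normal(0.0, 0.5, len(design)))

model = smf.ols("error ~ noise * geometry + snr_threshold"
                " + pos_distribution + pos_variance", data=design).fit()
print(anova_lm(model, typ=2))                       # per-factor F tests
```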

Hong Kong University of Science and Technology
4.
Mo, Shengyong.
From one-time dimensional control to two-time dimensional hybrid control in batch processes.
Degree: 2013, Hong Kong University of Science and Technology
URL: http://repository.ust.hk/ir/Record/1783.1-67007 ; https://doi.org/10.14711/thesis-b1213709 ; http://repository.ust.hk/ir/bitstream/1783.1-67007/1/th_redirect.html
Batch processes are the preferred choice for manufacturing high-value-added products. The control performances of key process variables are critical to the product quality and quality consistency of batch processes. Most of the current control algorithms were originally developed for continuous processes. In comparison to continuous processes, batch processes have their own natures: repetitive operation, two-time dimensional dynamics (within-batch and batch-to-batch dynamics) and multi-phase operation. To ensure good control performance, control system design and analysis must be conducted in harmony with the natures of the processes. With such motivations, batch process control was studied systematically by fully exploring the features stated above. First, an iterative learning estimator (ILE) with iterative learning control (ILC) was proposed for position control by utilizing the repetitive nature of batch processes. Second, the ILC has been integrated into the prediction model of dynamic matrix control, which leads to two-dimensional dynamic matrix control (2D-DMC). It is an integration of optimal feed-forward control and feedback control, exploiting the repetitive and two-time dimensional (2D) dynamics of batch processes. Moreover, for systems with measurable states, an optimal guaranteed cost control scheme was developed via a robust H∞ 2D controller for batch processes in an LMI framework. Last but not least, based on the proposed 2D-DMC, 2D hybrid dynamic models comprising 2D models and general hybrid models, concerning the 2D and multi-phase natures of batch processes, were designed. With the 2D hybrid models, different design philosophies can be adopted for control algorithm design and tuning. The modeling and control algorithms developed were tested in MATLAB Simulink as well as on an industrial-sized injection molding machine (typical batch process equipment). The control performance was improved significantly in both the simulation and experimental tests. The successful completion of this study not only addresses these important academic issues, but also provides a control method harmonious with the characteristics of batch processes.
Subjects/Keywords: Process control; Dimensional analysis; Data processing
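To make the batch-to-batch learning idea concrete, here is a toy P-type iterative learning control (ILC) sketch on a made-up first-order plant; the plant, gains and update law are illustrative assumptions, not the 2D-DMC or robust H∞ designs proposed in the thesis.

```python
# Toy P-type ILC: refine the input profile from one batch to the next.
import numpy as np

a, b = 0.9, 0.5                    # assumed plant: y[t+1] = a*y[t] + b*u[t]
T, n_batches, L = 50, 30, 0.8      # horizon, number of batches, learning gain
r = np.ones(T)                     # reference trajectory to track
u = np.zeros(T)                    # input profile carried across batches

for _ in range(n_batches):
    y = np.zeros(T)
    for t in range(T - 1):
        y[t + 1] = a * y[t] + b * u[t]
    e = r - y
    u[:-1] += L * e[1:]            # correct u[t] using next-step error from this batch

print("max tracking error in final batch (t >= 1):", np.abs(e[1:]).max())
```

With these values the iteration contracts (|1 - L*b| = 0.6 < 1), so the tracking error shrinks batch by batch, which is the repetitive-operation property the abstract exploits.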

University of Illinois – Urbana-Champaign
5.
Zhang, Chao.
Multi-dimensional mining of unstructured data with limited supervision.
Degree: PhD, Computer Science, 2018, University of Illinois – Urbana-Champaign
URL: http://hdl.handle.net/2142/102465
As one of the most important data forms, unstructured text data plays a crucial role in data-driven decision making in domains ranging from social networking and information retrieval to healthcare and scientific research. In many emerging applications, people's information needs from text data are becoming multi-dimensional – they demand useful insights for multiple aspects from the given text corpus. However, turning massive text data into multi-dimensional knowledge remains a challenge that cannot be readily addressed by existing data mining techniques.
In this thesis, we propose algorithms that turn unstructured text data into multi-dimensional knowledge with limited supervision. We investigate two core questions:
1. How to identify task-relevant data with declarative queries in multiple dimensions?
2. How to distill knowledge from data in a multi-dimensional space?
To address the above questions, we propose an integrated cube construction and exploitation framework. First, we develop a cube construction module that organizes unstructured data into a cube structure, by discovering latent multi-dimensional and multi-granular structure from the unstructured text corpus and allocating documents into the structure. Second, we develop a cube exploitation module that models multiple dimensions in the cube space, thereby distilling multi-dimensional knowledge from data to provide insights along multiple dimensions. Together, these two modules constitute an integrated pipeline: leveraging the cube structure, users can perform multi-dimensional, multi-granular data selection with declarative queries; and with cube exploitation algorithms, users can make accurate cross-dimension predictions or extract multi-dimensional patterns for decision making.
The proposed framework has two distinctive advantages when turning text data into multi-dimensional knowledge: flexibility and label-efficiency. First, it enables acquiring multi-dimensional knowledge flexibly, as the cube structure allows users to easily identify task-relevant data along multiple dimensions at varied granularities and further distill multi-dimensional knowledge. Second, the algorithms for cube construction and exploitation require little supervision; this makes the framework appealing for many applications where labeled data are expensive to obtain.
Advisors/Committee Members: Han, Jiawei (advisor), Han, Jiawei (Committee Chair), Zhai, ChengXiang (committee member), Abdelzaher, Tarek (committee member), Mei, Qiaozhu (committee member).
Subjects/Keywords: data mining; multi-dimensional analysis; less supervision

Massey University
6.
Ullah, Insha.
Contributions to high-dimensional data analysis : some applications of the regularized covariance matrices : a thesis submitted in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Statistics at Massey University, Albany, New Zealand.
Degree: 2015, Massey University
URL: http://hdl.handle.net/10179/6608
High-dimensional data sets, particularly those where the number of variables exceeds the number of observations, are now common in many subject areas including genetics, ecology, and statistical pattern recognition, to name but a few. The sample covariance matrix becomes rank deficient and is not invertible when the number of variables is greater than the number of observations. This poses a serious problem for many classical multivariate techniques that rely on an inverse of a covariance matrix. Recently, regularized alternatives to the sample covariance have been proposed, which are not only guaranteed to be positive definite but also provide reliable estimates. In this thesis, we bring together some of the important recent regularized estimators of the covariance matrix and explore their performance in high-dimensional scenarios via numerical simulations. We make use of these regularized estimators and attempt to improve the performance of three classical multivariate techniques in high-dimensional settings.
In multivariate random effects models, estimating the between-group covariance is a well-known problem. Its classical estimator involves the difference of two mean square matrices and often results in negative elements on the main diagonal. We use a lasso-regularized estimate of the between-group mean square and propose a new approach to estimate the between-group covariance based on the EM algorithm. Using simulation, the procedure is shown to be quite effective and the estimate obtained is always positive definite.
Multivariate analysis of variance (MANOVA) faces serious challenges due to the undesirable properties of the sample covariance in high-dimensional problems. First, it suffers from low power and does not maintain an accurate type-I error rate when the dimension is large compared to the sample size. Second, MANOVA relies on the inverse of a covariance matrix and fails to work when the number of variables exceeds the number of observations. We use an approach based on lasso regularization and present a comparative study of the existing approaches, including our proposal. The lasso approach is shown to be an improvement in some cases, in terms of power of the test, over the existing high-dimensional methods.
Another problem addressed in the thesis is how to detect unusual future observations when the dimension is large. The Hotelling T2 control chart has traditionally been used for this purpose. The charting statistic in the control chart relies on the inverse of a covariance matrix and is not reliable in high-dimensional problems. To get a reliable estimate of the covariance matrix we use a distribution-free shrinkage estimator. We make use of the available baseline set of data and propose a procedure to estimate the control limits for monitoring individual future observations. The procedure does not assume multivariate normality and seems robust to violations of multivariate normality. The simulation study shows that the new method performs better than…
Subjects/Keywords: Multivariate analysis; High-dimensional data; Covariance
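A minimal sketch of the p > n problem the abstract describes, using scikit-learn's Ledoit-Wolf shrinkage estimator as a stand-in for the regularized covariance estimators studied in the thesis (an assumed library choice, not the author's):

```python
# When p > n the sample covariance is singular; a shrinkage estimate is not.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(3)
n, p = 40, 100                          # fewer observations than variables
X = rng.normal(size=(n, p))

sample_cov = np.cov(X, rowvar=False)
lw_cov = LedoitWolf().fit(X).covariance_

print("rank of sample covariance:", np.linalg.matrix_rank(sample_cov))   # < p
print("smallest eigenvalue of shrinkage estimate:",
      np.linalg.eigvalsh(lw_cov).min())                                  # > 0
```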

Loughborough University
7.
Sleath, Leslie C.
The dimensional variation analysis of complex mechanical systems.
Degree: Thesis (Eng.D.), 2014, Loughborough University
URL: http://hdl.handle.net/2134/13996
Dimensional variation analysis (DVA) is a computer-based simulation process used to identify potential assembly process issues due to the effects of component part and assembly variation during manufacture. The sponsoring company has over a number of years developed a DVA process to simulate the variation behaviour of a wide range of static mechanical systems. This project considers whether the current DVA process used by the sponsoring company is suitable for the simulation of complex kinematic systems. The project, which consists of three case studies, identifies several issues that became apparent with the current DVA process when applied to three types of complex kinematic systems. The project goes on to develop solutions to the issues raised in the case studies in the form of new or enhanced methods of information acquisition, simulation modelling, and the interpretation and presentation of the simulation output. Development of these methods has enabled the sponsoring company to expand the range of system types that can be successfully simulated and significantly enhances the information flow between the DVA process and the wider product development process.
Subjects/Keywords: 658.5; Concurrent engineering; Dimensional management; Dimensional variation analysis; Kinematic constraint map; New product development; Three dimensional visualisation of variation distributions

University of California – San Francisco
8.
Ulrich, Beau.
A Novel Analysis of Skeletal Asymmetry Utilizing 3D CBCT Technology: The Ulrich Analysis.
Degree: Oral and Craniofacial Sciences, 2012, University of California – San Francisco
URL: http://www.escholarship.org/uc/item/14q55139
Introduction: Proper diagnosis and treatment planning are essential to the outcome of orthodontic therapy, with accurate diagnostic records being the pinnacle of that process. Previously, two-dimensional imagery has been the standard in analyzing or visualizing a patient for skeletal asymmetries but has numerous limiting factors. The limitations that two-dimensional analyses face can be solved by the use of three-dimensional cone beam computed tomography (CBCT) when combined with an efficient and relevant analysis. The purpose of this study was to design a novel analysis of asymmetry utilizing CBCT that could be used in a standard orthodontic diagnostic analysis. Methods: CBCT images of 35 patients from the UCSF Orthodontic Clinic were used for development of the analysis. A pilot study with 5 patients having marked asymmetries was traced at 3 different time points to aid in landmark verification and assess reliability. A series of landmarks sharing commonalities with those used in two-dimensional cephalometric analysis were applied. A Pearson correlation coefficient with Bonferroni correction as well as a Bland-Altman test for reproducibility was applied for the three timepoints on five different patients to test intraobserver reliability. 10 patients with a molar Class I malocclusion, 10 patients with CII malocclusions, and 10 with CIII malocclusions were used to create a sample of patients with the applied orientation method and asymmetry analysis. Results: Landmark identification was found to be reproducible with only a weak statistical difference in landmark identification. No statistically significant differences were found between any landmarks and their different timepoint selections, particularly those points essential to the establishment of the analysis axis (p<.05). The index provided a quantitative assessment in three planes for both numerical and visual evidence of asymmetry. Conclusions: The Ulrich Orthodontic Asymmetry analysis, combined with reliable and reproducible landmark selection, allows for successful quantitative assessment of asymmetry identification in both 2-dimensional and 3-dimensional visualizations and may provide a standard diagnostic tool to be used with patients seeking orthodontic treatment.
Subjects/Keywords: Dentistry; asymmetry; CBCT; cone-beam CT; craniofacial skeleton; three-dimensional analysis; three-dimensional imaging
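As a generic illustration of the reliability checks named in the abstract (Pearson correlation and Bland-Altman agreement between repeated landmark measurements), the sketch below uses made-up coordinates rather than the study's CBCT data:

```python
# Repeatability of a landmark coordinate measured at two timepoints (synthetic).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
first = rng.normal(50.0, 5.0, size=30)             # timepoint-1 measurements (mm)
second = first + rng.normal(0.0, 0.6, size=30)     # timepoint-2 remeasurements

r, p_value = pearsonr(first, second)
diff = second - first
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"Pearson r = {r:.3f} (p = {p_value:.1e})")
print(f"Bland-Altman bias = {bias:.2f} mm, "
      f"limits of agreement = [{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}] mm")
```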

University of South Carolina
9.
Worden, Austin N.
Analysis of Cellular Interactions Within a Collagen Hydrogel.
Degree: MS in Biomedical Science, Biomedical Science, 2019, University of South Carolina
URL: https://scholarcommons.sc.edu/etd/5165
Evidence has arisen over the past several years that use of a three-dimensional (3D) culture system provides a distinct advantage over two-dimensional (2D) systems when cellular interactions are examined in a more natural environment. Changes in morphology, speed, and directionality of cells tested in both planar and 3D matrices have all demonstrated that using a 3D system is advantageous. The changes to the cellular migration patterns were shown to be dependent on several variables within the surrounding substrate including cellular content, physical environment, and the matrix chemical milieu. We have taken advantage of using collagen hydrogels as a 3D scaffold for culturing cells for an extended period of time, which has led to intriguing discoveries. One such discovery is that, independent of cell type, cells which were placed on top of the hydrogel formed a ring structure we termed a toroid. These toroids take the shape of the well in which they are cultured. These toroidal cells appear long, thin, and are reminiscent of spokes on a wheel. However, when cells were mixed into the collagen hydrogel, a gel contraction was observed, but the cells remained homogenous throughout and no toroid was formed. In our studies, stem cells, lens epithelial cells, cardiac fibroblasts, microvascular endothelial cells, and cancer cells were used individually or in combination. Cells were placed on the top of collagen hydrogels to observe their behavior in this new multicellular environment. We observed that when the different cell types were mixed together they formed a tighter toroid than normal. We also investigated the movement of cells during the toroid formation. To that end, β1 integrin, a member of the integrin family of membrane receptors important for cellular adhesion and recognition, was overexpressed in cells using a plasmid tagged with Green Fluorescent Protein (GFP). We were successful at expressing GFP-tagged β1 integrin in cells and observing them in the collagen matrix. Our observations will contribute to the understanding of toroid formation and form the foundation of future computational modeling experiments examining cellular behaviors in response to different microenvironments.
Advisors/Committee Members: Jay D. Potts.
Subjects/Keywords: Biomedical Engineering and Bioengineering; analysis; cellular; collagen; hydrogel; natural environment; three-dimensional; two-dimensional

Tulane University
10.
Qu, Zhe.
High-dimensional statistical data integration.
Degree: 2019, Tulane University
URL: https://digitallibrary.tulane.edu/islandora/object/tulane:106916
Modern biomedical studies often collect multiple types of high-dimensional data on a common set of objects. A representative model for the integrative analysis of multiple data types is to decompose each data matrix into a low-rank common-source matrix generated by latent factors shared across all data types, a low-rank distinctive-source matrix corresponding to each data type, and an additive noise matrix. We propose a novel decomposition method, called the decomposition-based generalized canonical correlation analysis, which appropriately defines those matrices by imposing a desirable orthogonality constraint on distinctive latent factors that aims to sufficiently capture the common latent factors. To further delineate the common and distinctive patterns between two data types, we propose another new decomposition method, called the common and distinctive pattern analysis. This method takes into account the common and distinctive information between the coefficient matrices of the common latent factors. We develop consistent estimation approaches for both proposed decompositions under high-dimensional settings, and demonstrate their finite-sample performance via extensive simulations. We illustrate the superiority of the proposed methods over the state of the art using real-world data examples obtained from The Cancer Genome Atlas and the Human Connectome Project.
Advisors/Committee Members: Hyman, James (Thesis advisor), School of Science & Engineering Mathematics (Degree granting institution).
Subjects/Keywords: High-dimensional data analysis; Data integration; Canonical correlation analysis

University of Tasmania
11.
Harvey, PM.
Pneumatic-modulation comprehensive two-dimensional gas chromatography for environmental analysis.
Degree: 2013, University of Tasmania
URL: https://eprints.utas.edu.au/16780/1/Harvey_whole_thesis_ex_pub_mat.pdf ; https://eprints.utas.edu.au/16780/2/Harvey_whole_thesis.pdf
Environmental petroleum hydrocarbon (PHC) monitoring is a major challenge. Analytical methods must be robust; operate with minimal user intervention; be suitable for remote field operation; and furnish analytical data that allow the different mechanisms of PHC environmental fate to be investigated. PHC are amenable to analysis by comprehensive two-dimensional gas chromatography (GCxGC). However, conventional GCxGC instrumentation relies on bulky thermal modulation systems. Thus, alternative approaches based on fluidic modulation were investigated to determine their suitability for environmental PHC monitoring.
First, a dynamic flow model is described, which maps carrier gas pressure and flow rate through the first-dimension separation column, the modulator sample loop, and the second-dimension column(s) in a fluidic modulation GCxGC system. The dynamic flow model assists design of a pneumatic modulation ensemble and leads to rapid determination of pneumatic conditions, timing parameters, and the dimensions of the separation columns and connecting tubing used to construct the GCxGC system. Three significant innovations, all uncovered by using the dynamic flow model, are introduced, viz.
i) a "symmetric flow path" modulator improved baseline stability,
ii) appropriate selection of flow restrictors in the first-dimension column assembly provides a generally more stable and robust system, and
iii) these restrictors increase the modulation period flexibility of the GCxGC system.
Next, a model was developed that permitted a systematic investigation of peak shape in fluidic modulation. In the case of a non-focusing modulator for comprehensive two-dimensional gas chromatography, the systematic distortions induced when the modulator loads the second-dimension column give rise to a characteristic peak shape. Depending on the operating conditions, this systematic distortion can be the dominant component of the second-dimension elution profiles. Understanding the factors that cause different peak shape observations provides a rugged approach to method development. It is shown that a low flow ratio can lead to significant peak skewing and that increasing the flow ratio reduces the magnitude of peak skewing. Validation of the peak shape model is made by comparison with experimental data.
Finally, GCxGC methodology was developed and applied to the analysis of PHC-contaminated soil. GCxGC results met or exceeded the standards set by regulators and environmental scientists. Fluidic modulation approaches provided excellent sensitivity and permitted detailed monitoring of key PHC transport and degradation pathways, including evaporation, dissolution, and biodegradation.
Subjects/Keywords: chromatography; GCxGC; two-dimensional chromatography; environmental analysis; fuel spill analysis

McMaster University
12.
Pichika, Sathish chandra.
Sparse Canonical Correlation Analysis (SCCA): A Comparative Study.
Degree: MSc, 2011, McMaster University
URL: http://hdl.handle.net/11375/11779
Canonical Correlation Analysis (CCA) is one of the multivariate statistical methods that can be used to find the relationship between two sets of variables. I highlighted challenges in analyzing high-dimensional data with CCA. Recently, Sparse CCA (SCCA) methods have been proposed to identify sparse linear combinations of two sets of variables with maximal correlation in the context of high-dimensional data. In my thesis, I compared three different SCCA approaches. I evaluated the three approaches as well as classical CCA on simulated datasets and illustrated the methods with publicly available genomic and proteomic datasets.
Master of Science (MSc)
Advisors/Committee Members: Beyene, Joseph, Narayanaswamy Balakrishnan and Aaron Childs, Mathematics and Statistics.
Subjects/Keywords: CCA; SCCA; High-Dimensional; Multivariate Analysis
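For orientation, here is a minimal classical CCA example using scikit-learn on synthetic data; the sparse CCA variants compared in the thesis add penalties to the canonical weights and are not shown here.

```python
# Classical CCA on two synthetic views sharing one latent factor.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(4)
n = 200
latent = rng.normal(size=(n, 1))                       # shared signal
X = latent @ rng.normal(size=(1, 10)) + 0.5 * rng.normal(size=(n, 10))
Y = latent @ rng.normal(size=(1, 8)) + 0.5 * rng.normal(size=(n, 8))

Xc, Yc = CCA(n_components=1).fit_transform(X, Y)       # canonical variates
print("first canonical correlation:",
      np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1])
```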

University of Adelaide
13.
Conway, Annie.
Clustering of proteomics imaging mass spectrometry data.
Degree: 2016, University of Adelaide
URL: http://hdl.handle.net/2440/112036
This thesis presents a toolbox for the exploratory analysis of multivariate data, in particular proteomics imaging mass spectrometry data. Typically such data consist of 15000 - 20000 spectra with a spatial component, and for each spectrum ion intensities are recorded at specific masses. Clustering is a focus of this thesis, with discussion of k-means clustering and clustering with principal component analysis (PCA). Theoretical results relating PCA and clustering are given based on Ding and He (2004), and detailed and corrected proofs of the authors' results are presented. The benefits of transformations prior to clustering of the data are explored. Transformations include normalisation, peak intensity correction (PIC), binary and log transformations. A number of techniques for comparing different clustering results are also discussed and these include set based comparisons with the Jaccard distance, an information based criterion (variation of information), point-pair comparisons (Rand index) and a modified version of the prediction strength of Tibshirani and Walther (2005). These exploratory analyses are applied to imaging mass spectrometry data taken from patients with ovarian cancer. The data are taken from slices of cancerous tissue. The analyses in this thesis are primarily focused on data from one patient, with some techniques demonstrated on other patients for comparison.
Advisors/Committee Members: Koch, Inge (advisor), School of Mathematical Sciences (school).
Subjects/Keywords: clustering; proteomics; multivariate data analysis; high-dimensional data analysis; machine learning
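A hedged sketch of the generic PCA-then-k-means workflow the abstract discusses, run on synthetic "spectra" rather than the thesis's imaging mass spectrometry data:

```python
# Reduce dimension with PCA, then cluster the scores with k-means.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
# Pretend data: 2,000 spectra x 500 mass channels drawn from two groups.
X = np.vstack([rng.normal(0.0, 1.0, size=(1000, 500)),
               rng.normal(0.5, 1.0, size=(1000, 500))])

scores = PCA(n_components=10).fit_transform(X)
labels = KMeans(n_clusters=2, n_init=10, random_state=5).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))
```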

Hong Kong University of Science and Technology
14.
Zhang, Yuan.
Cluster analysis for categorical data and its application for dimensionality analysis.
Degree: 2010, Hong Kong University of Science and Technology
URL: http://repository.ust.hk/ir/Record/1783.1-7212 ; https://doi.org/10.14711/thesis-b1116255 ; http://repository.ust.hk/ir/bitstream/1783.1-7212/1/th_redirect.html
Cluster analysis can not only cluster observations/cases into several groups but also cluster variables into several categories. A huge amount of categorical data is coming from different areas of research, in both the social and natural sciences. Therefore, there is a need to choose an appropriate method to analyze categorical data. This thesis starts with different clustering methods to cluster observations into several groups using categorical data. Three commonly used methods are compared and latent class analysis is recommended for categorical data analysis. Dimensionality is an application of clustering variables/items into several categories/subscales in education. In this thesis, we develop an alternative approach to study the dimensionality. We are aware of no DETECT or PolyDETECT program available as part of the commonly used statistical packages such as SAS or Matlab. We thus develop a Matlab program to analyze the dimensional structure of educational tests based on cluster analysis.
Subjects/Keywords: Cluster analysis; Dimensional analysis

Michigan State University
15.
Ye, Mingquan.
Faster algorithms for machine learning problems in high dimension.
Degree: 2019, Michigan State University
URL: http://etd.lib.msu.edu/islandora/object/etd:47868
▼ Thesis M.S. Michigan State University. Computer Science 2019
"When dealing with datasets with high dimension, the existing machine learning algorithms often do not work in practice. Actually, most of the real-world data has the nature of low intrinsic dimension. For example, data often lies on a low-dimensional manifold or has a low doubling dimension. Inspired by this phenomenon, this thesis tries to improve the time complexities of two fundamental problems in machine learning using some techniques in computational geometry. In Chapter two, we propose a bi-criteria approximation algorithm for minimum enclosing ball with outliers and extend it to the outlier recognition problem. By virtue of the "core-set" idea and the Random Gradient Descent Tree, we propose an efficient algorithm which is linear in the number of points n and the dimensionality d, and provides a probability bound. In experiments, compared with some existing outlier recognition algorithms, our method is proven to be efficient and robust to the outlier ratios. In Chapter three, we adopt the "doubling dimension" to characterize the intrinsic dimension of a point set. By the property of doubling dimension, we can approximate the geometric alignment between two point sets by executing the existing alignment algorithms on their subsets, which achieves a much smaller time complexity. More importantly, the proposed approximate method has a theoretical upper bound and can serve as the preprocessing step of any alignment algorithm." – Page ii.
Description based on online resource;
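For illustration, a minimal Python sketch of the core-set style minimum-enclosing-ball approximation that this abstract builds on (a Badoiu-Clarkson-type iteration); the data and the simple distance-based outlier flagging below are hypothetical stand-ins, not the thesis's bi-criteria algorithm or its Random Gradient Descent Tree.
import numpy as np

def approx_meb(points, iterations=200):
    # Core-set style iteration: repeatedly move the center a shrinking
    # step toward the current farthest point.
    center = points[0].astype(float).copy()
    for i in range(1, iterations + 1):
        farthest = points[np.argmax(np.linalg.norm(points - center, axis=1))]
        center += (farthest - center) / (i + 1)
    radius = np.linalg.norm(points - center, axis=1).max()
    return center, radius

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 50))            # hypothetical high-dimensional points
center, radius = approx_meb(data)
# Crude outlier flagging: mark the 5% of points farthest from the center.
dist = np.linalg.norm(data - center, axis=1)
outliers = np.argsort(dist)[-int(0.05 * len(data)):]
Each pass costs O(nd), so the whole sketch stays linear in both the number of points and the dimension, which is the scaling behaviour the abstract emphasizes.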
Advisors/Committee Members: Ding, Hu, Torng, Eric, Tong, Yiying.
Subjects/Keywords: Machine learning – Statistical methods; Dimensional analysis; Multivariate analysis; Computer science
APA (6th Edition):
Ye, M. (2019). Faster algorithms for machine learning problems in high dimension. (Thesis). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:47868
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Ye, Mingquan. “Faster algorithms for machine learning problems in high dimension.” 2019. Thesis, Michigan State University. Accessed April 10, 2021.
http://etd.lib.msu.edu/islandora/object/etd:47868.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Ye, Mingquan. “Faster algorithms for machine learning problems in high dimension.” 2019. Web. 10 Apr 2021.
Vancouver:
Ye M. Faster algorithms for machine learning problems in high dimension. [Internet] [Thesis]. Michigan State University; 2019. [cited 2021 Apr 10].
Available from: http://etd.lib.msu.edu/islandora/object/etd:47868.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Ye M. Faster algorithms for machine learning problems in high dimension. [Thesis]. Michigan State University; 2019. Available from: http://etd.lib.msu.edu/islandora/object/etd:47868
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Illinois – Urbana-Champaign
16.
Wang, Qi.
TextDive: construction, summarization and exploration of multi-dimensional text corpora.
Degree: MS, Computer Science, 2016, University of Illinois – Urbana-Champaign
URL: http://hdl.handle.net/2142/90938
▼ With massive datasets accumulating in text repositories (e.g., news articles, customer reviews, etc.), it is highly desirable to systematically utilize and explore them by data mining, NLP and database techniques. In our view, documents in text corpora contain informative explicit meta-attributes (e.g., category, date, author, etc.) and implicit attributes (e.g., sentiment), forming one or a set of highly-structured multi-dimensional spaces. Much knowledge can be derived if we develop effective and efficient multi-dimensional summarization, exploration and analysis technologies.
In this demo, we propose an end-to-end, real-time analytical platform TextDive for processing massive text data, and provide valuable insights to general data consumers. First, we develop a set of information extraction, entity typing and text mining methods to extract consolidated dimensions and automatically construct multi-dimensional textual spaces (i.e., text cubes). Furthermore, we develop a set of OLAP-like text summarization, data exploration and text analysis mechanisms that understand semantics of text corpora in multi-dimensional spaces. We also develop an efficient computational solution that involves materializing selective statistics to guarantee the interactive and real-time nature of TextDive.
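As a toy illustration of the text-cube idea described above (hypothetical documents and attributes; this is not the TextDive system), one cube view can be materialized as a simple multi-dimensional aggregation in pandas.
import pandas as pd

docs = pd.DataFrame({
    "category":  ["sports", "sports", "politics", "politics"],
    "year":      [2015, 2016, 2016, 2016],
    "sentiment": ["pos", "neg", "pos", "neg"],
    "text":      ["great game", "bad loss", "policy passed", "budget row"],
})
# One cube view: document counts per (category, year) cell; other measures
# (mean sentiment score, top terms, etc.) would be materialized the same way.
cube = docs.pivot_table(index="category", columns="year",
                        values="text", aggfunc="count", fill_value=0)
print(cube)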
Advisors/Committee Members: Han, Jiawei (advisor).
Subjects/Keywords: multi-dimensional text corpora analysis; text cube analysis; text summarization
APA (6th Edition):
Wang, Q. (2016). TextDive: construction, summarization and exploration of multi-dimensional text corpora. (Thesis). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/90938
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Wang, Qi. “TextDive: construction, summarization and exploration of multi-dimensional text corpora.” 2016. Thesis, University of Illinois – Urbana-Champaign. Accessed April 10, 2021.
http://hdl.handle.net/2142/90938.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Wang, Qi. “TextDive: construction, summarization and exploration of multi-dimensional text corpora.” 2016. Web. 10 Apr 2021.
Vancouver:
Wang Q. TextDive: construction, summarization and exploration of multi-dimensional text corpora. [Internet] [Thesis]. University of Illinois – Urbana-Champaign; 2016. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/2142/90938.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Wang Q. TextDive: construction, summarization and exploration of multi-dimensional text corpora. [Thesis]. University of Illinois – Urbana-Champaign; 2016. Available from: http://hdl.handle.net/2142/90938
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Michigan State University
17.
Banik, Asish Kumar.
Bayesian variable selection and functional data analysis : application to brain imaging.
Degree: 2019, Michigan State University
URL: http://etd.lib.msu.edu/islandora/object/etd:48086
▼ Thesis Ph. D. Michigan State University. Statistics 2019.
High-dimensional statistics is one of the most studied topics in the field of statistics. The most interesting problem to arise in the last 15 years is variable selection or subset selection. Variable selection is a strong statistical tool that can be explored in functional data analysis. In the first part of this thesis, we implement a Bayesian variable selection method for automatic knot selection. We propose a spike-and-slab prior on knots and formulate a conjugate stochastic search variable selection for significant knots. The computation is substantially faster than existing knot selection methods, as we use Metropolis-Hastings algorithms and a Gibbs sampler for estimation. This work focuses on a single nonlinear covariate, modeled as regression splines. In the next stage, we study Bayesian variable selection in additive models with high-dimensional predictors. The selection of nonlinear functions in models is highly important in recent research, and the Bayesian method of selection has more advantages than contemporary frequentist methods. Chapter 2 examines Bayesian sparse group lasso theory based on spike-and-slab priors to determine its applicability for variable selection and function estimation in nonparametric additive models. The primary objective of Chapter 3 is to build a classification method using longitudinal volumetric magnetic resonance imaging (MRI) data from five regions of interest (ROIs). A functional data analysis method is used to handle the longitudinal measurement of ROIs, and the functional coefficients are later used in the classification models. We propose a Pólya-gamma augmentation method to classify normal controls and diseased patients based on functional MRI measurements. We obtain fast posterior sampling by avoiding the slow and complicated Metropolis-Hastings algorithm. Our main motivation is to determine the important ROIs that have the highest separating power to classify our dichotomous response. We compare the sensitivity, specificity, and accuracy of the classification based on single ROIs and with various combinations of them. We obtain a sensitivity of over 85% and a specificity of around 90% for most of the combinations. Next, we work with Bayesian classification and selection methodology. The main goal of Chapter 4 is to employ longitudinal trajectories in a significant number of sub-regional brain volumetric MRI data as statistical predictors for Alzheimer's disease (AD) classification. We use logistic regression in a Bayesian framework that includes many functional predictors. The direct sampling of regression coefficients from the Bayesian logistic model is difficult due to its complicated likelihood function. In high-dimensional scenarios, the selection of predictors is paramount with the introduction of either spike-and-slab priors, non-local priors, or Horseshoe priors. We seek to avoid the complicated Metropolis-Hastings approach and to develop an easily implementable Gibbs sampler. In addition, the…
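A minimal sketch of the regression-spline ingredient the first chapter works with: a truncated-power spline basis built from a candidate knot set and fit by ordinary least squares. The data and knot positions are hypothetical, and the thesis itself selects knots with a spike-and-slab prior rather than fixing them as done here.
import numpy as np

def spline_design(x, knots, degree=3):
    # Truncated power basis: [1, x, x^2, x^3, (x-k1)_+^3, ..., (x-kK)_+^3].
    cols = [x**d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0)**degree for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
X = spline_design(x, knots=[0.25, 0.5, 0.75])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit given the chosen knots
A knot-selection prior effectively decides which of the truncated-power columns keep nonzero coefficients.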
Advisors/Committee Members: Maiti, Tapabrata, Sikorskii, Alla, Ramamoorthi, Ramanathapuram V, Bender, Andrew, Baek, Seungik.
Subjects/Keywords: Bayesian statistical decision theory; Dimensional analysis; Multivariate analysis; Statistics
APA (6th Edition):
Banik, A. K. (2019). Bayesian variable selection and functional data analysis : application to brain imaging. (Thesis). Michigan State University. Retrieved from http://etd.lib.msu.edu/islandora/object/etd:48086
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Banik, Asish Kumar. “Bayesian variable selection and functional data analysis : application to brain imaging.” 2019. Thesis, Michigan State University. Accessed April 10, 2021.
http://etd.lib.msu.edu/islandora/object/etd:48086.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Banik, Asish Kumar. “Bayesian variable selection and functional data analysis : application to brain imaging.” 2019. Web. 10 Apr 2021.
Vancouver:
Banik AK. Bayesian variable selection and functional data analysis : application to brain imaging. [Internet] [Thesis]. Michigan State University; 2019. [cited 2021 Apr 10].
Available from: http://etd.lib.msu.edu/islandora/object/etd:48086.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Banik AK. Bayesian variable selection and functional data analysis : application to brain imaging. [Thesis]. Michigan State University; 2019. Available from: http://etd.lib.msu.edu/islandora/object/etd:48086
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
18.
Luo, Bin.
Robust penalized regression for complex high-dimensional data.
Degree: 2020, NC Docks
URL: http://libres.uncg.edu/ir/uncg/f/Luo_uncg_0154D_13007.pdf
▼ Robust high-dimensional data analysis has become an important and challenging task in complex Big Data analysis due to the high-dimensionality and data contamination. One of the most popular procedures is the robust penalized regression. In this dissertation, we address three typical robust ultra-high dimensional regression problems via penalized regression approaches. The first problem is related to the linear model with the existence of outliers, dealing with the outlier detection, variable selection and parameter estimation simultaneously. The second problem is related to robust high-dimensional mean regression with irregular settings such as the data contamination, data asymmetry and heteroscedasticity. The third problem is related to robust bi-level variable selection for the linear regression model with grouping structures in covariates. In Chapter 1, we introduce the background and challenges by overviews of penalized least squares methods and robust regression techniques. In Chapter 2, we propose a novel approach in a penalized weighted least squares framework to perform simultaneous variable selection and outlier detection. We provide a unified link between the proposed framework and a robust M-estimation in general settings. We also establish the non-asymptotic oracle inequalities for the joint estimation of both the regression coefficients and weight vectors. In Chapter 3, we establish a framework of robust estimators in high-dimensional regression models using Penalized Robust Approximated quadratic M estimation (PRAM). This framework allows general settings such as random errors lack of symmetry and homogeneity, or covariates are not sub-Gaussian. Theoretically, we show that, in the ultra-high dimension setting, the PRAM estimator has local estimation consistency at the minimax rate enjoyed by the LS-Lasso and owns the local oracle property, under certain mild conditions. In Chapter 4, we extend the study in Chapter 3 to robust high-dimensional data analysis with structured sparsity. In particular, we propose a framework of high-dimensional M-estimators for bi-level variable selection. This framework encourages bi-level sparsity through a computationally efficient two-stage procedure. It produces strong robust parameter estimators if some nonconvex redescending loss functions are applied. In theory, we provide sufficient conditions under which our proposed two-stage penalized M-estimator possesses simultaneous local estimation consistency and the bi-level variable selection consistency, if a certain nonconvex penalty function is used at the group level. The performances of the proposed estimators are demonstrated in both simulation studies and real examples. In Chapter 5, we provide some discussions and future work.
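A minimal sketch of one standard way to perform simultaneous variable selection and outlier detection with a penalized least-squares fit: the mean-shift outlier model with an L1 penalty on both the coefficients and the per-observation shift terms. The data are simulated and sklearn's single alpha is applied to both blocks, so this is only a simplified stand-in for the weighted framework and theory developed in the dissertation.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.normal(scale=0.5, size=n)
y[:5] += 8.0                                # contaminate the first 5 responses

# Mean-shift model: y = X beta + gamma + eps, with L1 penalties on beta
# (variable selection) and gamma (one shift per observation, i.e. outliers).
Z = np.hstack([X, np.eye(n)])
fit = Lasso(alpha=0.1, max_iter=10000).fit(Z, y)
selected = np.flatnonzero(np.abs(fit.coef_[:p]) > 1e-8)
outliers = np.flatnonzero(np.abs(fit.coef_[p:]) > 1e-8)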
Subjects/Keywords: Estimation theory; Dimensional analysis; Regression analysis; Least squares
APA (6th Edition):
Luo, B. (2020). Robust penalized regression for complex high-dimensional data. (Thesis). NC Docks. Retrieved from http://libres.uncg.edu/ir/uncg/f/Luo_uncg_0154D_13007.pdf
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Luo, Bin. “Robust penalized regression for complex high-dimensional data.” 2020. Thesis, NC Docks. Accessed April 10, 2021.
http://libres.uncg.edu/ir/uncg/f/Luo_uncg_0154D_13007.pdf.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Luo, Bin. “Robust penalized regression for complex high-dimensional data.” 2020. Web. 10 Apr 2021.
Vancouver:
Luo B. Robust penalized regression for complex high-dimensional data. [Internet] [Thesis]. NC Docks; 2020. [cited 2021 Apr 10].
Available from: http://libres.uncg.edu/ir/uncg/f/Luo_uncg_0154D_13007.pdf.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Luo B. Robust penalized regression for complex high-dimensional data. [Thesis]. NC Docks; 2020. Available from: http://libres.uncg.edu/ir/uncg/f/Luo_uncg_0154D_13007.pdf
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Alberta
19.
Xia, Jianguo.
Developing bioinformatics tools for metabolomics.
Degree: PhD, Department of Biological Sciences, 2011, University of Alberta
URL: https://era.library.ualberta.ca/files/5x21tg11x
▼ Metabolomics aims to study all small-molecule compounds (i.e. metabolites) in cells, tissues, or biofluids. These compounds provide a functional readout of the physiological, developmental, and pathological state of a biological system. The field of metabolomics has expanded rapidly over the last few years with increasing applications to disease diagnosis, drug toxicity screening, nutritional studies and many other life sciences. However, significant challenges remain in both collecting and understanding metabolomic data. The central objective of my thesis project is to develop novel bioinformatic tools to address some of the key computational challenges in metabolomic studies. In particular, my research is focused on three areas: (i) compound identification from complex biofluids, (ii) processing and statistical analysis of metabolomic data, and (iii) functional interpretation of metabolomic data. In addressing these issues I have developed a number of efficient and user-friendly software tools, including MetaboMiner, MetaboAnalyst, MSEA and MetPA. Each of these software packages has required the development of novel algorithms, novel interfaces or the implementation of novel analytical concepts. MetaboMiner (http://wishart.biology.ualberta.ca/metabominer) is a standalone Java application for compound identification from 2D NMR spectra of complex biofluids. Based on a novel adaptive search algorithm and specially constructed spectral libraries, MetaboMiner is able to automatically identify ~80% of metabolites from good quality NMR spectra. MetaboAnalyst (http://www.metaboanalyst.ca) is a web-based pipeline for metabolomic data processing, normalization, and statistical analysis. This application is based on a novel framework that combines the statistical and visualization power offered by R (http://www.r-project.org) with an enhanced graphical user interface enabled by Java Server Faces technology. It is currently the most comprehensive and popular data analysis web service in metabolomics. MSEA or metabolite set enrichment analysis (http://www.msea.ca) represents a novel application of the gene set enrichment analysis technique to metabolomics. In particular, MSEA is a web application for the identification of biologically meaningful patterns through enrichment analysis of quantitative metabolomic data. To create MSEA, I assembled a unique database of ~6300 groups of biologically related metabolites with association data on diseases, pathways, genetic traits, and cellular or organ localization. MetPA (http://metpa.metabolomics.ca) is a web-based tool for metabolic pathway analysis. It integrates functional enrichment analysis and pathway topology analysis through a novel Google-map style network visualization system. MetPA currently supports the analysis of…
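For illustration, the kind of over-representation test that underlies set enrichment analysis of this type can be written in a few lines. The counts below are hypothetical, and MSEA-style quantitative enrichment goes beyond this simple hypergeometric test.
from scipy.stats import hypergeom

# Over-representation test for one metabolite set (hypothetical numbers):
N = 3000   # metabolites in the reference library
K = 40     # metabolites belonging to the set/pathway of interest
n = 120    # metabolites flagged as significant in the experiment
k = 9      # of those, how many fall in the set
# P(X >= k) when drawing n metabolites without replacement from the library
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3g}")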
Subjects/Keywords: metabolomics; metabolite set enrichment analysis; statistical analysis; metabolic pathway analysis; two dimensional NMR; bioinformatics
APA (6th Edition):
Xia, J. (2011). Developing bioinformatics tools for metabolomics. (Doctoral Dissertation). University of Alberta. Retrieved from https://era.library.ualberta.ca/files/5x21tg11x
Chicago Manual of Style (16th Edition):
Xia, Jianguo. “Developing bioinformatics tools for metabolomics.” 2011. Doctoral Dissertation, University of Alberta. Accessed April 10, 2021.
https://era.library.ualberta.ca/files/5x21tg11x.
MLA Handbook (7th Edition):
Xia, Jianguo. “Developing bioinformatics tools for metabolomics.” 2011. Web. 10 Apr 2021.
Vancouver:
Xia J. Developing bioinformatics tools for metabolomics. [Internet] [Doctoral dissertation]. University of Alberta; 2011. [cited 2021 Apr 10].
Available from: https://era.library.ualberta.ca/files/5x21tg11x.
Council of Science Editors:
Xia J. Developing bioinformatics tools for metabolomics. [Doctoral Dissertation]. University of Alberta; 2011. Available from: https://era.library.ualberta.ca/files/5x21tg11x

Vanderbilt University
20.
Diggins, Kirsten Elizabeth.
Quantifying Cellular Heterogeneity in Cancer and the Microenvironment.
Degree: PhD, Cancer Biology, 2016, Vanderbilt University
URL: http://hdl.handle.net/1803/14853
▼ In spite of recent advances in therapy, cancer remains a leading cause of death worldwide. Therapy response is often unpredictable and relapse frequently occurs. In many cases, this therapy resistance is attributed to subsets of therapy resistant cancer cells and surrounding stromal cells that support a resistant phenotype. A better understanding of cellular heterogeneity in cancer is therefore crucial in order to develop novel therapeutic strategies and improve patient outcomes. Experimental technologies like mass cytometry (CyTOF) allow for high-content, multi-parametric single-cell analysis of human tumor samples. However, analytical tools and workflows are still needed to standardize and automate the process of identifying and quantitatively describing cell populations in the resulting data. This dissertation presents a novel workflow for automated discovery and characterization of novel and rare cell subsets, quantification of cellular heterogeneity, and characterization of cells based on population-specific feature enrichment. First, a modular workflow is described that combines biaxial gating, dimensionality reduction, clustering, and hierarchically clustered heatmaps to maximize rare population discovery and to create an interpretable visualization of cell population characteristics. Next, a novel method is introduced for quantifying cellular heterogeneity based on two-dimensional mapping of cells in phenotypic space using tSNE analysis. Finally, an algorithmic method termed Marker Enrichment Modeling (MEM) is introduced that automatically quantifies population-specific feature enrichment and generates descriptive labels for cell populations based on their feature enrichment scores. MEM analysis is shown to identify features important to cell identity across multiple datasets, and MEM labels are effectively used to compare populations of cells across tissue types, experiments, institutions, and platforms. Going forward, the tools presented here lay the groundwork for novel computational methods for machine learning of cell identity and registering cell populations across studies or clinical endpoints. Automated methods for identifying and describing cell populations will enable rapid discovery of biologically and clinically relevant cells and contribute to the development of novel diagnostic, prognostic, and therapeutic approaches to cancer and other diseases.
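A minimal sketch of the t-SNE step used for the two-dimensional phenotypic mapping mentioned above, on random stand-in data; a real analysis would start from arcsinh-transformed marker intensities and feed the embedding into clustering or MEM-style labeling.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(2)
# Hypothetical cytometry-like matrix: 1000 cells x 30 markers.
cells = rng.normal(size=(1000, 30))
embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(cells)
# `embedding` places each cell in a 2-D phenotypic map that downstream
# clustering or density summaries can then describe.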
Advisors/Committee Members: Todd D. Giorgio, Ph.D. (committee member), Jonathan M. Irish, Ph.D. (committee member), Melissa Skala, Ph.D. (committee member), Vito Quaranta, M.D. (Committee Chair).
Subjects/Keywords: mass cytometry; computational analysis; cancer; immunology; flow cytometry; single-cell analysis; high-dimensional analysis
APA (6th Edition):
Diggins, K. E. (2016). Quantifying Cellular Heterogeneity in Cancer and the Microenvironment. (Doctoral Dissertation). Vanderbilt University. Retrieved from http://hdl.handle.net/1803/14853
Chicago Manual of Style (16th Edition):
Diggins, Kirsten Elizabeth. “Quantifying Cellular Heterogeneity in Cancer and the Microenvironment.” 2016. Doctoral Dissertation, Vanderbilt University. Accessed April 10, 2021.
http://hdl.handle.net/1803/14853.
MLA Handbook (7th Edition):
Diggins, Kirsten Elizabeth. “Quantifying Cellular Heterogeneity in Cancer and the Microenvironment.” 2016. Web. 10 Apr 2021.
Vancouver:
Diggins KE. Quantifying Cellular Heterogeneity in Cancer and the Microenvironment. [Internet] [Doctoral dissertation]. Vanderbilt University; 2016. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/1803/14853.
Council of Science Editors:
Diggins KE. Quantifying Cellular Heterogeneity in Cancer and the Microenvironment. [Doctoral Dissertation]. Vanderbilt University; 2016. Available from: http://hdl.handle.net/1803/14853

Deakin University
21.
Holland, Brendan John.
Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis.
Degree: 2015, Deakin University
URL: http://hdl.handle.net/10536/DRO/DU:30079385
Human, animal and plant samples contain a considerable number of chemical substances critical for supporting their existence. This thesis presents improved techniques to find and analyse these components to help us understand more about their roles in the complex jigsaw of life.
Advisors/Committee Members: Conlan Xavier.
Subjects/Keywords: two-dimensional chromatography; chemiluminescence selectivity; human sample analysis; animal sample analysis; plant sample analysis
APA (6th Edition):
Holland, B. J. (2015). Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis. (Thesis). Deakin University. Retrieved from http://hdl.handle.net/10536/DRO/DU:30079385
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Holland, Brendan John. “Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis.” 2015. Thesis, Deakin University. Accessed April 10, 2021.
http://hdl.handle.net/10536/DRO/DU:30079385.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Holland, Brendan John. “Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis.” 2015. Web. 10 Apr 2021.
Vancouver:
Holland BJ. Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis. [Internet] [Thesis]. Deakin University; 2015. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/10536/DRO/DU:30079385.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Holland BJ. Exploring two-dimensional chromatography and chemiluminescence selectivity for complex sample analysis. [Thesis]. Deakin University; 2015. Available from: http://hdl.handle.net/10536/DRO/DU:30079385
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

NSYSU
22.
Tai, Chiech-an.
An Automatic Data Clustering Algorithm based on Differential Evolution.
Degree: Master, Computer Science and Engineering, 2013, NSYSU
URL: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0730113-152814
▼ As one of the traditional optimization problems, clustering still plays a vital role in research both theoretically and practically nowadays. Although many successful clustering algorithms have been presented, most (if not all) need to be given the number of clusters before the clustering procedure is invoked. A novel differential evolution based clustering algorithm is presented in this paper to solve the problem of automatically determining the number of clusters. The proposed algorithm, called enhanced differential evolution for automatic clustering (EDEAC), leverages the strengths of two technologies: a novel histogram-based analysis technique for finding the approximate number of clusters and a heuristic search algorithm for fine-tuning the automatic clustering results. The experimental results show that the proposed algorithm can not only determine the approximate number of clusters automatically, but it can also provide an accurate number of clusters rapidly even for high dimensional datasets compared to other existing automatic clustering algorithms.
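A minimal sketch of the differential-evolution half of such an approach: scipy's differential_evolution searching for cluster centers that minimize total nearest-center distance on toy data. Here the number of clusters is fixed by hand, whereas the point of EDEAC is to estimate it automatically from a histogram-based analysis.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(4)
data = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2))
                  for c in ([0, 0], [3, 3], [0, 3])])
k = 3                                        # number of clusters assumed known here

def objective(flat_centers):
    centers = flat_centers.reshape(k, 2)
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    return d.min(axis=1).sum()               # assign each point to its nearest center

bounds = [(data.min(), data.max())] * (k * 2)
result = differential_evolution(objective, bounds, seed=0, maxiter=200)
centers = result.x.reshape(k, 2)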
Advisors/Committee Members: Chun-Wei Tsai (chair), Ming-Chao Chiang (committee member), Chu-Sing Yang (chair), Tzung-Pei Hong (chair).
Subjects/Keywords: automatic clustering; data clustering; high-dimensional dataset; histogram analysis; differential evolution
APA (6th Edition):
Tai, C. (2013). An Automatic Data Clustering Algorithm based on Differential Evolution. (Thesis). NSYSU. Retrieved from http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0730113-152814
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Tai, Chiech-an. “An Automatic Data Clustering Algorithm based on Differential Evolution.” 2013. Thesis, NSYSU. Accessed April 10, 2021.
http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0730113-152814.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Tai, Chiech-an. “An Automatic Data Clustering Algorithm based on Differential Evolution.” 2013. Web. 10 Apr 2021.
Vancouver:
Tai C. An Automatic Data Clustering Algorithm based on Differential Evolution. [Internet] [Thesis]. NSYSU; 2013. [cited 2021 Apr 10].
Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0730113-152814.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Tai C. An Automatic Data Clustering Algorithm based on Differential Evolution. [Thesis]. NSYSU; 2013. Available from: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0730113-152814
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
23.
Coatanéa, Eric.
Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers.
Degree: 2005, Helsinki University of Technology
URL: http://lib.tkk.fi/Diss/2005/isbn9512278537/
▼ This thesis develops a paradigm for conceptual design based on the idea that dimensional analysis can improve the evaluation and comparison of concepts of solution during the conceptual design process. The conceptual design approach developed in this research is a combination of tasks which starts with the identification of the customer needs in a formalized manner, is followed by the generation of design concepts taking into account the different phases of the physical life cycle, and ends with the evaluation and adequacy analysis of the concepts of solution against the formalized needs. The General Design Theory (GDT) is used as the methodological basis of this work. Using the results of GDT, the research introduces a definition of the concept of function which is generic and not dedicated to a solution-based approach. Consequently the concept of function fulfils its intended objective of modelling the design problems at a general level. In addition to the concept of function, this thesis introduces a series of classifications based on generic concepts and rules aimed at generating concepts of solutions progressively. All these concepts are integrated into the developed metamodel framework. The metamodel provides a group of generic concepts associated with laws and mapped with a normalized functional vocabulary. The metamodel framework is an intermediate structure developed in order to provide guidance during the synthesis process and to meet the initial condition in order to transform the classification structure into a metric space. A metric space is a topological space with a unique metric. The transformation of the initial topological space into a metric space can be obtained when a series of conditions are verified. The first condition consists of clustering the concepts of solutions in order to underline the comparable aspects in each of them. This is done by using a set of dedicated rules. In addition, three other fundamental conditions should be obtained. The metamodel framework ensures the first condition; an enhanced fundamental system of units provides the second condition and a paradigm of separation of concepts the third one. When all three conditions are verified, it becomes possible to transform the design problems modelled by four types of generic variables into a series of dimensionless groups. This transformation process is achieved by using the Vaschy-Buckingham theorem and Butterfield's paradigm. Butterfield's paradigm is used in order to select the minimum set of repeated variables which ensures the non-singularity of the metrization procedure. This transformation process ends with the creation of machinery dedicated to the qualitative simulation of the concepts of solutions. The thesis ends with the study of practical cases.
TKK dissertations, ISSN 1795-4584; 11
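The dimensional-analysis core of this transformation (the Vaschy-Buckingham step) can be sketched numerically: dimensionless groups are null-space vectors of the dimension matrix. A small hypothetical example with sympy, using a pendulum-like variable set rather than anything taken from the thesis:
from sympy import Matrix

# Columns: period T, length L, gravity g, mass m
# Rows: exponents of the base dimensions mass, length, time
dim = Matrix([[0, 0,  0, 1],
              [0, 1,  1, 0],
              [1, 0, -2, 0]])
for v in dim.nullspace():
    print(v.T)
# One basis vector: (2, -1, 1, 0), i.e. T**2 * g / L is dimensionless;
# the number of independent groups is 4 variables minus rank 3 = 1.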
Advisors/Committee Members: Helsinki University of Technology, Department of Mechanical Engineering, Laboratory of Machine Design, Université de Bretagne Occidentale, Ecole Doctorale des Sciences de la Matière, de l'Information et de la Santé.
Subjects/Keywords: conceptual design; life cycle design; dimensional analysis; topology; General Design Theory
APA (6th Edition):
Coatanéa, E. (2005). Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers. (Thesis). Helsinki University of Technology. Retrieved from http://lib.tkk.fi/Diss/2005/isbn9512278537/
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Coatanéa, Eric. “Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers.” 2005. Thesis, Helsinki University of Technology. Accessed April 10, 2021.
http://lib.tkk.fi/Diss/2005/isbn9512278537/.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Coatanéa, Eric. “Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers.” 2005. Web. 10 Apr 2021.
Vancouver:
Coatanéa E. Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers. [Internet] [Thesis]. Helsinki University of Technology; 2005. [cited 2021 Apr 10].
Available from: http://lib.tkk.fi/Diss/2005/isbn9512278537/.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Coatanéa E. Conceptual Modelling of Life Cycle Design: A Modelling and Evaluation Method Based on Analogies and Dimensionless Numbers. [Thesis]. Helsinki University of Technology; 2005. Available from: http://lib.tkk.fi/Diss/2005/isbn9512278537/
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Rochester
24.
Bryant, Linda M.
Remote tutoring : a choice for deaf college students : an action research study.
Degree: EdD, 2011, University of Rochester
URL: http://hdl.handle.net/1802/16913
▼ Today's deaf college students are expected to succeed academically despite language and learning challenges (Paul, 2009). As a support service, the benefits of tutoring have been well documented; however, research using remote tutoring with deaf college students is lacking. This Action Research study examined the activities (actions and interactions) that occurred during twenty-two remote-tutoring sessions with nine deaf students in my English class. Using Dimensional Analysis (Schatzman, 1991), the dimensions used to narrate the remote tutoring process with deaf college students served to inform relevant theory and answered the research question: How does using remote tutoring with deaf college students affect my tutoring practices? Findings pointed to choices as the central perspective, revealing that students desire options for supplemental learning. These included choice of time for tutoring; choice of type of tutoring (traditional or remote); choice of remote tutoring (asynchronous versus synchronous); choice of remote technologies (webcam, chat, email or videophone); choice of communication (ASL, SimCom or Speech); choice of tutor; and choice of course (e.g., math, English, science). Relevant dimensions included transitioning, benefits and sharing experiences. Analysis also revealed theory suggesting that remote tutoring is comparable to traditional tutoring using technologies as the mediating tool. Whether it's provided in one's office or through a webconferencing site, both are similar in delivery of instruction and perceived benefits. An action plan for delivering remote tutoring to deaf college students in other English classes is outlined. Implications for tutors, deaf educators and distance educators are discussed and future research considerations are proposed.
Subjects/Keywords: Deaf; Tutoring; Action research; Dimensional analysis; Distance learning; Remote tutoring
APA (6th Edition):
Bryant, L. M. (2011). Remote tutoring : a choice for deaf college students : an action research study. (Doctoral Dissertation). University of Rochester. Retrieved from http://hdl.handle.net/1802/16913
Chicago Manual of Style (16th Edition):
Bryant, Linda M. “Remote tutoring : a choice for deaf college students : an action research study.” 2011. Doctoral Dissertation, University of Rochester. Accessed April 10, 2021.
http://hdl.handle.net/1802/16913.
MLA Handbook (7th Edition):
Bryant, Linda M. “Remote tutoring : a choice for deaf college students : an action research study.” 2011. Web. 10 Apr 2021.
Vancouver:
Bryant LM. Remote tutoring : a choice for deaf college students : an action research study. [Internet] [Doctoral dissertation]. University of Rochester; 2011. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/1802/16913.
Council of Science Editors:
Bryant LM. Remote tutoring : a choice for deaf college students : an action research study. [Doctoral Dissertation]. University of Rochester; 2011. Available from: http://hdl.handle.net/1802/16913

Tulane University
25.
Xu, Chao.
Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.
Degree: 2018, Tulane University
URL: https://digitallibrary.tulane.edu/islandora/object/tulane:78817
▼ Extreme phenotype sampling (EPS) is a broadly-used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in the extreme phenotypic samples within the top and bottom percentiles, EPS can boost the study power compared with the random sampling with the same sample size. The existing statistical methods for EPS data test the variants/regions individually. However, many disorders are caused by multiple genetic factors. Therefore, it is critical to simultaneously model the effects of genetic factors, which may increase the power of current genetic studies and identify novel disease-associated genetic factors in EPS. The challenge of the simultaneous analysis of genetic data is that the number (p ~10,000) of genetic factors is typically greater than the sample size (n ~1,000) in a single study. The standard linear model would be inappropriate for this p>n problem due to the rank deficiency of the design matrix. An alternative solution is to apply a penalized regression method – the least absolute shrinkage and selection operator (LASSO).
LASSO can deal with this high-dimensional (p>n) problem by forcing certain regression coefficients to be zero. Although the application of LASSO in genetic studies under random sampling has been widely studied, its statistical inference and testing under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with hypothesis test for high-dimensional regression under EPS based on a decorrelated score function to investigate the genetic associations, including the gene expression and rare variant analyses. The comprehensive simulation shows EPS-LASSO outperforms existing methods with superior power when the effects are large and stable type I error and FDR control. Together with the real data analysis of genetic study for obesity, our results indicate that EPS-LASSO is an effective method for EPS data analysis, which can account for correlated predictors.
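A minimal sketch of the two ingredients named above, extreme-phenotype subsetting plus an L1-penalized fit, on simulated data; this omits the decorrelated-score inference that EPS-LASSO adds for valid hypothesis testing, and all sizes and cutoffs are hypothetical.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(3)
n, p = 400, 2000                         # p > n setting
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:10] = 0.5
y = X @ beta + rng.normal(size=n)

# Extreme phenotype sampling: keep only the top and bottom quartiles of the trait.
lo, hi = np.quantile(y, [0.25, 0.75])
keep = (y <= lo) | (y >= hi)

fit = LassoCV(cv=5).fit(X[keep], y[keep])
selected = np.flatnonzero(fit.coef_)     # candidate genetic factors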
Advisors/Committee Members: Deng, Hong-Wen (Thesis advisor), School of Public Health & Tropical Medicine Biostatistics and Bioinformatics (Degree granting institution).
Subjects/Keywords: extreme sampling; high-dimensional regression; genetic data analysis
APA (6th Edition):
Xu, C. (2018). Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits. (Thesis). Tulane University. Retrieved from https://digitallibrary.tulane.edu/islandora/object/tulane:78817
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Xu, Chao. “Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.” 2018. Thesis, Tulane University. Accessed April 10, 2021.
https://digitallibrary.tulane.edu/islandora/object/tulane:78817.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Xu, Chao. “Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits.” 2018. Web. 10 Apr 2021.
Vancouver:
Xu C. Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits. [Internet] [Thesis]. Tulane University; 2018. [cited 2021 Apr 10].
Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:78817.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Xu C. Hypothesis Testing for High-Dimensional Regression Under Extreme Phenotype Sampling of Continuous Traits. [Thesis]. Tulane University; 2018. Available from: https://digitallibrary.tulane.edu/islandora/object/tulane:78817
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
26.
Maiorana, Sara A.
SPEEDY 3-D.
Degree: 2014, SUNY College at Fredonia
URL: http://hdl.handle.net/1951/65144
▼ This research investigates how college students’ spatial skills vary by age, gender, college major and additional factors. Specifically, it explores students’ abilities to visualize two-dimensional air nets corresponding to two-dimensional illustrations of three-dimensional cubes. Also, this study examines how the use of a tangible air net manipulative affects performance. During this study, students answered a five-problem quiz involving matching and creating two-dimensional air nets for a given cube and vice versa. The results of the assessment were compared to those from a survey on the students’ age, gender, college major, ethnicity, and students’ perceptions of which problems were the most difficult and least difficult. It was hypothesized that male mathematics majors with access to a manipulative would perform best on the given spatial skills problems. The results of this study indicated that gender and college major had no statistical significance in spatial ability test score. Additional results revealed that there was a significant difference in test score by class, particularly with the use of a manipulative, and that the most difficult problem and least difficult problem on the assessment were both of the unfolding-type spatial ability task. These findings have noteworthy implications for in-service and pre-service mathematics teachers, particularly at the secondary level, regarding lesson planning and implementation when teaching spatial reasoning.
Subjects/Keywords: Spatial ability; Mathematics teachers – Training of; Dimensional analysis
APA (6th Edition):
Maiorana, S. A. (2014). SPEEDY 3-D. (Masters Thesis). SUNY College at Fredonia. Retrieved from http://hdl.handle.net/1951/65144
Chicago Manual of Style (16th Edition):
Maiorana, Sara A. “SPEEDY 3-D.” 2014. Masters Thesis, SUNY College at Fredonia. Accessed April 10, 2021.
http://hdl.handle.net/1951/65144.
MLA Handbook (7th Edition):
Maiorana, Sara A. “SPEEDY 3-D.” 2014. Web. 10 Apr 2021.
Vancouver:
Maiorana SA. SPEEDY 3-D. [Internet] [Masters thesis]. SUNY College at Fredonia; 2014. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/1951/65144.
Council of Science Editors:
Maiorana SA. SPEEDY 3-D. [Masters Thesis]. SUNY College at Fredonia; 2014. Available from: http://hdl.handle.net/1951/65144

University of Illinois – Urbana-Champaign
27.
Ouyang, Yunbo.
Scalable sparsity structure learning using Bayesian methods.
Degree: PhD, Statistics, 2018, University of Illinois – Urbana-Champaign
URL: http://hdl.handle.net/2142/101264
▼ Learning sparsity pattern in high dimension is a great challenge in both implementation and theory. In this thesis we develop scalable Bayesian algorithms based on the EM algorithm and variational inference to learn sparsity structure in various models. Estimation consistency and selection consistency of our methods are established. First, a nonparametric Bayes estimator is proposed for the problem of estimating a sparse sequence based on Gaussian random variables. We adopt the popular two-group prior with one component being a point mass at zero, and the other component being a mixture of Gaussian distributions. Although the Gaussian family has been shown to be suboptimal for this problem, we find that Gaussian mixtures, with a proper choice of the means and mixing weights, have the desired asymptotic behavior, e.g., the corresponding posterior concentrates on balls with the desired minimax rate. Second, the above estimator could be directly applied to high dimensional linear classification. In theory, we not only build a bridge to connect the estimation error of the mean difference and the classification error in different scenarios, but also provide sufficient conditions for sub-optimal classifiers and optimal classifiers. Third, we study adaptive ridge regression for linear models. Adaptive ridge regression is closely related to the Bayesian variable selection problem with a Gaussian mixture spike-and-slab prior because it resembles the EM algorithm developed in Wang et al. (2016) for the above problem. The output of adaptive ridge regression can be used to construct a distribution estimator to approximate the posterior. We show the approximate posterior has the desired concentration property and the adaptive ridge regression estimator has the desired predictive error. Last, we propose a Bayesian approach to sparse principal components analysis (PCA). We show that our algorithm, which is based on variational approximation, achieves Bayesian selection consistency. Empirical studies have demonstrated the competitive performance of the proposed algorithm.
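A minimal numpy sketch of the adaptive-ridge idea referred to above: ridge fits that are iteratively reweighted so small coefficients are pushed toward zero. The data and tuning constants are hypothetical, and this generic version is not the EM/variational algorithm analyzed in the thesis.
import numpy as np

def adaptive_ridge(X, y, lam=1.0, delta=1e-3, iters=50):
    # Each pass solves a weighted ridge problem with weights
    # w_j = 1 / (beta_j**2 + delta**2), a smooth surrogate for a sparsity penalty.
    p = X.shape[1]
    beta = np.zeros(p)
    w = np.ones(p)
    for _ in range(iters):
        beta = np.linalg.solve(X.T @ X + lam * np.diag(w), X.T @ y)
        w = 1.0 / (beta**2 + delta**2)
    return beta

rng = np.random.default_rng(5)
X = rng.normal(size=(200, 50))
truth = np.zeros(50); truth[:5] = 1.0
y = X @ truth + rng.normal(scale=0.5, size=200)
beta_hat = adaptive_ridge(X, y)          # near-zero entries indicate dropped variables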
Advisors/Committee Members: Liang, Feng (advisor), Liang, Feng (Committee Chair), Qu, Annie (committee member), Narisetty, Naveen N (committee member), Zhu, Ruoqing (committee member).
Subjects/Keywords: Bayesian statistics; high-dimensional data analysis; variable selection
APA (6th Edition):
Ouyang, Y. (2018). Scalable sparsity structure learning using Bayesian methods. (Doctoral Dissertation). University of Illinois – Urbana-Champaign. Retrieved from http://hdl.handle.net/2142/101264
Chicago Manual of Style (16th Edition):
Ouyang, Yunbo. “Scalable sparsity structure learning using Bayesian methods.” 2018. Doctoral Dissertation, University of Illinois – Urbana-Champaign. Accessed April 10, 2021.
http://hdl.handle.net/2142/101264.
MLA Handbook (7th Edition):
Ouyang, Yunbo. “Scalable sparsity structure learning using Bayesian methods.” 2018. Web. 10 Apr 2021.
Vancouver:
Ouyang Y. Scalable sparsity structure learning using Bayesian methods. [Internet] [Doctoral dissertation]. University of Illinois – Urbana-Champaign; 2018. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/2142/101264.
Council of Science Editors:
Ouyang Y. Scalable sparsity structure learning using Bayesian methods. [Doctoral Dissertation]. University of Illinois – Urbana-Champaign; 2018. Available from: http://hdl.handle.net/2142/101264

North Carolina State University
28.
Palmer, Jeremy Andrew.
Development of Millimeter Scale Motors for Miniature Direct Drive Robots.
Degree: PhD, Mechanical Engineering, 2002, North Carolina State University
URL: http://www.lib.ncsu.edu/resolver/1840.16/5710
▼ The twentieth century marked a period of rapid expansion of technology associated with miniaturization of engineering systems. A recent theme in this trend is the development of miniature, distributed robots that mimic insect behavior and locomotion. This research addresses the need for millimeter-scale, direct drive, high force/torque motors to support these platforms. Among the technologies currently available, scalable motors based on piezoelectric transducers are the focus. The specific contributions of this work are as follows. (1.) The design, analysis, and characterization of a macro-scale linear piezomotor constructed with a parallel arrangement of stressed unimorph piezoelectric transducers are presented. The prototype demonstrates a novel application of passive mechanical latches to produce inchworm motion while eliminating the need for multiple control signals. (2.) A dimensional analysis is conducted to reveal scale factors that govern the relationship between stressed unimorph performance parameters and size. The results support a millimeter-scale version of the linear piezomotor that incorporates transducers with alternative annular geometry for improved stiffness. (3.) The development of a miniature mode conversion rotary ultrasonic motor based on a piezoelectric stack transducer is reported. Results of a dynamic analysis lead to scale factors for static torque and rotor velocity. Lastly, the linear and rotary piezomotor systems are compared in the context of scalability to determine the most effective system for miniature direct drive robotics.
Blocked force performance of the miniature linear piezomotor was limited to 0.25 N by back slip in the passive latches, and transducer displacement losses leading to greater compliance in the assembly. Since displacement of the annular stressed unimorph transducer decreases with the square of the outside radius, precision engineering is required to avoid these losses. The rotary ultrasonic motor proved to be a more effective choice for driving the robotic locomotion system. Dimensional analysis results indicate that static torque scales with the square of the rotor contact radius. Using alternative designs, a static torque density of 0.37 Nm/kg was measured in the prototype.
Advisors/Committee Members: Dr. Edward Grant, Committee Co-Chair (advisor), Dr. Jeffrey Eischen, Committee Co-Chair (advisor).
Subjects/Keywords: dimensional analysis; ultrasonic motors; piezoelectric
APA (6th Edition):
Palmer, J. A. (2002). Development of Millimeter Scale Motors for Miniature Direct Drive Robots. (Doctoral Dissertation). North Carolina State University. Retrieved from http://www.lib.ncsu.edu/resolver/1840.16/5710
Chicago Manual of Style (16th Edition):
Palmer, Jeremy Andrew. “Development of Millimeter Scale Motors for Miniature Direct Drive Robots.” 2002. Doctoral Dissertation, North Carolina State University. Accessed April 10, 2021.
http://www.lib.ncsu.edu/resolver/1840.16/5710.
MLA Handbook (7th Edition):
Palmer, Jeremy Andrew. “Development of Millimeter Scale Motors for Miniature Direct Drive Robots.” 2002. Web. 10 Apr 2021.
Vancouver:
Palmer JA. Development of Millimeter Scale Motors for Miniature Direct Drive Robots. [Internet] [Doctoral dissertation]. North Carolina State University; 2002. [cited 2021 Apr 10].
Available from: http://www.lib.ncsu.edu/resolver/1840.16/5710.
Council of Science Editors:
Palmer JA. Development of Millimeter Scale Motors for Miniature Direct Drive Robots. [Doctoral Dissertation]. North Carolina State University; 2002. Available from: http://www.lib.ncsu.edu/resolver/1840.16/5710

Wayne State University
29.
Li, Yan.
Novel Regression Models For High-Dimensional Survival Analysis.
Degree: PhD, Computer Science, 2016, Wayne State University
URL: https://digitalcommons.wayne.edu/oa_dissertations/1555
▼ Survival analysis aims to predict the occurrence of specific events of interest at future time points. The presence of incomplete observations due to censoring brings unique challenges to this domain and differentiates survival analysis techniques from other standard regression methods. In this thesis, we propose four models for high-dimensional survival analysis. First, we propose a regularized linear regression model with weighted least squares to handle survival prediction in the presence of censored instances. We employ the elastic net penalty term to induce sparsity into the linear model and thereby handle high-dimensional data effectively. As opposed to existing censored linear models, the parameter estimation of our model does not require any prior estimation of the survival times of censored instances. The second model is a unified regularized parametric survival regression for an arbitrary survival distribution. We employ a generalized linear model to approximate the negative log-likelihood and use the elastic net as a sparsity-inducing penalty to deal effectively with high-dimensional data. The proposed model is formulated as a penalized iteratively reweighted least squares problem and solved using a cyclical coordinate descent-based method. Considering that popular survival analysis methods such as the Cox proportional hazards model and parametric survival regression rest on strict assumptions that are unrealistic in many real-world applications, our third model reformulates survival analysis as a multi-task learning problem that predicts the survival time by estimating the survival status at each time interval during the study. We introduce an indicator matrix that enables the multi-task learning algorithm to handle censored instances, incorporate important characteristics of survival problems such as the non-negative, non-increasing list structure through max-heap projection, and solve the resulting formulation with an Alternating Direction Method of Multipliers (ADMM)-based algorithm. Beyond these three methods, which address the standard survival prediction problem, we also propose a transfer learning model for survival analysis. During our study, we noticed that obtaining sufficient labeled training instances for learning a robust prediction model is very time-consuming and can be extremely difficult in practice. We therefore propose a Cox-based model that uses the L2,1-norm penalty to encourage source and target predictors to share similar sparsity patterns, thereby learning a shared representation across source and target domains to improve performance on the target task. We demonstrate the performance of the proposed models on several real-world high-dimensional biomedical benchmark datasets, and our experimental results indicate that our models outperform other state-of-the-art competing methods and attain very competitive performance on…
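To make the first model concrete, here is a minimal Python sketch of a weighted-least-squares elastic net fitted by cyclical coordinate descent, the kind of solver the abstract describes. The objective, the weighting scheme, and all parameter values are illustrative assumptions rather than the thesis's exact formulation; in particular, the sample weights would come from whatever censoring-handling scheme the model prescribes, not the placeholder used here.

import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding operator used by the L1 part of the penalty.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def weighted_elastic_net_cd(X, y, w, lam=0.1, alpha=0.5, n_iters=200, tol=1e-6):
    # Minimize (1/2n) * sum_i w_i * (y_i - x_i . beta)^2
    #          + lam * (alpha * ||beta||_1 + 0.5 * (1 - alpha) * ||beta||_2^2)
    # by cycling through the coordinates of beta.
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta                                   # full residual, maintained incrementally
    for _ in range(n_iters):
        beta_old = beta.copy()
        for j in range(p):
            xj = X[:, j]
            r_j = resid + xj * beta[j]                     # partial residual excluding feature j
            zj = (w * xj * r_j).sum() / n
            denom = (w * xj * xj).sum() / n + lam * (1.0 - alpha)
            new_bj = soft_threshold(zj, lam * alpha) / denom
            resid += xj * (beta[j] - new_bj)               # keep the residual consistent
            beta[j] = new_bj
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta

# Toy usage with placeholder weights: events get full weight, censored rows a reduced one.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
y = X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(50)   # synthetic observed (log-)times
censored = rng.random(50) < 0.3
w = np.where(censored, 0.5, 1.0)                              # placeholder weighting scheme
print(np.round(weighted_elastic_net_cd(X, y, w, lam=0.05, alpha=0.9)[:5], 3))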
Advisors/Committee Members: Chandan K. Reddy.
Subjects/Keywords: High-dimensional data; Regularization; Sparsity; Survival analysis; Computer Sciences
APA (6th Edition):
Li, Y. (2016). Novel Regression Models For High-Dimensional Survival Analysis. (Doctoral Dissertation). Wayne State University. Retrieved from https://digitalcommons.wayne.edu/oa_dissertations/1555
Chicago Manual of Style (16th Edition):
Li, Yan. “Novel Regression Models For High-Dimensional Survival Analysis.” 2016. Doctoral Dissertation, Wayne State University. Accessed April 10, 2021.
https://digitalcommons.wayne.edu/oa_dissertations/1555.
MLA Handbook (7th Edition):
Li, Yan. “Novel Regression Models For High-Dimensional Survival Analysis.” 2016. Web. 10 Apr 2021.
Vancouver:
Li Y. Novel Regression Models For High-Dimensional Survival Analysis. [Internet] [Doctoral dissertation]. Wayne State University; 2016. [cited 2021 Apr 10].
Available from: https://digitalcommons.wayne.edu/oa_dissertations/1555.
Council of Science Editors:
Li Y. Novel Regression Models For High-Dimensional Survival Analysis. [Doctoral Dissertation]. Wayne State University; 2016. Available from: https://digitalcommons.wayne.edu/oa_dissertations/1555

University of Ottawa
30.
Andison, Christopher.
Patient-Specific Finite Element Modeling of the Mitral Valve.
Degree: 2015, University of Ottawa
URL: http://hdl.handle.net/10393/33396
► As the most commonly diseased heart valve, the mitral valve (MV) has been the subject of extensive research for many years. Unfortunately, the only treatment…
(more)
▼ As the most commonly diseased heart valve, the mitral valve (MV) has been the subject of extensive research for many years. Unfortunately, the only treatment options currently available are surgical repair and replacement. Although repair is almost always preferable to replacement, it is performed less often than it should be because of the complexity of MV repair surgeries. Consequently, there is significant interest in generating patient-specific finite element models of the MV for the purpose of simulating mitral repairs. For practical purposes, transesophageal echocardiographic (TEE) images are most commonly used to reconstruct the mitral apparatus; however, limitations in ultrasound technology have prevented the detection of leaflet thicknesses. In the current study, a method was developed to accurately model variations in leaflet thickness from TEE datasets. Nine healthy datasets were modeled, and the leaflet thicknesses were found to closely match previously reported results. As anticipated, normal valve function was also observed over the entire cardiac cycle.
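As a rough illustration of what "modeling variations in leaflet thickness" can look like computationally, the Python sketch below maps a handful of thickness measurements onto the nodes of a leaflet surface mesh by inverse-distance weighting. The function name, the weighting scheme, and all values are hypothetical; the thesis's actual reconstruction procedure from TEE data is not reproduced here.

import numpy as np

def interpolate_leaflet_thickness(node_coords, sample_coords, sample_thickness, power=2.0):
    # Assign a thickness to every mesh node by inverse-distance weighting of a few
    # thickness samples (e.g., measured along the leaflet in TEE slices).
    # node_coords:      (N, 3) mesh node positions
    # sample_coords:    (M, 3) locations where thickness was measured
    # sample_thickness: (M,)   measured thicknesses (same length unit as coordinates)
    d = np.linalg.norm(node_coords[:, None, :] - sample_coords[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                      # avoid division by zero at coincident points
    wts = 1.0 / d ** power
    wts /= wts.sum(axis=1, keepdims=True)
    return wts @ sample_thickness                # (N,) per-node thickness field

# Toy usage: three hypothetical thickness samples mapped onto ten mesh nodes.
nodes = np.random.rand(10, 3)
samples = np.array([[0.1, 0.1, 0.0], [0.5, 0.5, 0.0], [0.9, 0.2, 0.0]])
thick = np.array([1.2, 0.8, 1.0])                # mm, placeholder values
print(interpolate_leaflet_thickness(nodes, samples, thick))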
Subjects/Keywords: Mitral valve; Finite element analysis; Three-dimensional echocardiography
APA (6th Edition):
Andison, C. (2015). Patient-Specific Finite Element Modeling of the Mitral Valve. (Thesis). University of Ottawa. Retrieved from http://hdl.handle.net/10393/33396
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Andison, Christopher. “Patient-Specific Finite Element Modeling of the Mitral Valve.” 2015. Thesis, University of Ottawa. Accessed April 10, 2021.
http://hdl.handle.net/10393/33396.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Andison, Christopher. “Patient-Specific Finite Element Modeling of the Mitral Valve.” 2015. Web. 10 Apr 2021.
Vancouver:
Andison C. Patient-Specific Finite Element Modeling of the Mitral Valve. [Internet] [Thesis]. University of Ottawa; 2015. [cited 2021 Apr 10].
Available from: http://hdl.handle.net/10393/33396.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Andison C. Patient-Specific Finite Element Modeling of the Mitral Valve. [Thesis]. University of Ottawa; 2015. Available from: http://hdl.handle.net/10393/33396
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation