You searched for +publisher:"University of Saskatchewan" +contributor:("Li, Longhai").
Showing records 1 – 24 of 24 total matches. No search limiters apply to these results.
1.
Achath, Sudhakar 1955-.
Computational Determination of Coherence of Financial Risk Measure as a Lower Prevision of Imprecise Probability.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/7888
This study develops some further ideas in imprecise probability models of financial risk measures. A financial risk measure has been interpreted as an upper prevision of imprecise probability, which through the conjugacy relationship can be seen as a lower prevision. The risk measures selected for the study are value-at-risk (VaR) and conditional value-at-risk (CVaR). The notion of coherence of risk measures is explained. Stocks traded in the financial markets (the risky assets) are viewed as the gambles. The study determines, through computation on actual asset data, whether the risk-measure assessments of the gambles (assets) are coherent as an imprecise probability. It is observed that the coherence of the assessments depends on the characteristics of the asset's return distribution.
Advisors/Committee Members: Bickis, Mikelis, Samei, Ebrahim, Li, Longhai, Wilson, Craig.
Subjects/Keywords: Imprecise Probability; Lower Prevision; Risk Measure; Coherence
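The thesis above works with value-at-risk and conditional value-at-risk. As a point of reference, here is a minimal sketch of the standard empirical versions of these two measures; the function name and the ceiling-quantile convention are illustrative choices, not taken from the thesis:

```python
import math

def var_cvar(losses, alpha=0.95):
    """Empirical value-at-risk (VaR) and conditional value-at-risk (CVaR).

    VaR is the empirical alpha-quantile of the losses (ceiling
    convention); CVaR is the average of the losses at or beyond VaR.
    """
    xs = sorted(losses)
    k = math.ceil(alpha * len(xs)) - 1  # index of the alpha-quantile
    tail = xs[k:]                       # losses at or beyond VaR
    return xs[k], sum(tail) / len(tail)
```

For losses 1, 2, …, 100 and alpha = 0.95 this gives VaR = 95 and CVaR = 97.5, the mean of the six largest losses.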
2.
Achath, Sudhakar 1955-.
Computational Determination of Coherence of Financial Risk Measure as a Lower Prevision of Imprecise Probability.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/7889
This study develops some further ideas in imprecise probability models of financial risk measures. A financial risk measure has been interpreted as an upper prevision of imprecise probability, which through the conjugacy relationship can be seen as a lower prevision. The risk measures selected for the study are value-at-risk (VaR) and conditional value-at-risk (CVaR). The notion of coherence of risk measures is explained. Stocks traded in the financial markets (the risky assets) are viewed as the gambles. The study determines, through computation on actual asset data, whether the risk-measure assessments of the gambles (assets) are coherent as an imprecise probability. It is observed that the coherence of the assessments depends on the characteristics of the asset's return distribution.
Advisors/Committee Members: Bickis, Mikelis, Samei, Ebrahim, Li, Longhai, Wilson, Craig.
Subjects/Keywords: Imprecise Probability; Lower Prevision; Risk Measure; Coherence
3.
Dong, Yue.
A Simulation Study to Evaluate Bayesian LASSO’s Performance in Zero-Inflated Poisson (ZIP) Models.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7313
When modelling count data, excessive zeros can occur in many applications. My thesis concentrates on variable selection in zero-inflated Poisson (ZIP) models. This work is motivated by Brown et al. (2015), who accounted for the excess zeros and the site-specific random effects in their data, and used the Bayesian LASSO method for variable selection in their post-fire tree recruitment study in interior Alaska, USA and north Yukon, Canada. That study, however, did not carry out systematic simulation studies to evaluate Bayesian LASSO's performance under different scenarios. My thesis therefore conducts a series of simulation studies to evaluate Bayesian LASSO's performance under different settings of several simulation factors.
My thesis considers three simulation factors: the number of subjects (N), the number of repeated measurements (R), and the true values of the regression coefficients in the ZIP models. Under different settings of these three factors, Bayesian LASSO's performance is evaluated using three indicators: sensitivity, specificity and the exact fit rate. For applied practitioners, my thesis offers a useful example of the circumstances under which Bayesian LASSO can be expected to perform well in ZIP models. The simulation results show that Bayesian LASSO's performance is jointly affected by all three simulation factors, and that this method of variable selection is more reliable when the true coefficients are not close to zero.
My thesis also has some limitations. Primarily, given its time constraints, it was impossible to consider all the factors that can potentially affect the simulation results, and exploring penalty forms other than the L1 penalty is left for future research. Moreover, the current variable selection method is only for fixed-effects selection; variable selection for the mixed effects in ZIP models is a direction for future work.
Advisors/Committee Members: Liu, Juxin, Li, Longhai, Sowa, Artur, Lamb, Eric.
Subjects/Keywords: Variable selection; Zero-inflated model; Bayesian LASSO
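For readers unfamiliar with the model class, a zero-inflated Poisson mixes a point mass at zero with an ordinary Poisson count. A small illustrative sketch of the ZIP probability mass function and sampler (hypothetical helper names; the thesis itself fits these models with Bayesian LASSO via MCMC):

```python
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under a zero-inflated Poisson (ZIP) model: with
    probability pi the count is a structural zero, otherwise
    Y ~ Poisson(lam)."""
    pois = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * pois

def zip_sample(n, lam, pi, rng):
    """Draw n ZIP counts; rng is a random.Random instance."""
    out = []
    for _ in range(n):
        if rng.random() < pi:
            out.append(0)          # structural zero
        else:
            # Poisson draw by inversion of the CDF
            u, y = rng.random(), 0
            p = c = math.exp(-lam)
            while u > c:
                y += 1
                p *= lam / y
                c += p
            out.append(y)
    return out
```

Note that P(Y = 0) = pi + (1 - pi)·e^(-lam) exceeds the zero probability of a plain Poisson, which is exactly the "excessive zeros" feature the abstract describes.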
4.
Rijal, Sanjeev 1979-.
Complex Survey And Design Effect With Rao-Scott Correction And Log-Linear Analysis.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/7975
Collection of data through sample surveys involves a wide range of techniques and procedures. Data are collected with the aim of maximizing the accuracy of the information while minimizing cost and effort, and various sampling techniques are used to achieve this objective.
When data are collected through a complex survey, techniques for the analysis of categorical data (e.g., the chi-squared test for association between pairs of variables) have to be modified from the procedures used when the sample design is assumed to be a simple random sample. When data are acquired using various sampling techniques in a complex design, there is a sample design effect on whatever analysis is used. Rao and Scott showed the effect of the design on tests of fit, homogeneity and independence. The log-linear model is used for the analysis of higher-dimensional categorical data, and modifications of this analysis for complex survey designs were also proposed by Rao and Scott. The simultaneous test procedure, another method for testing homogeneity across multiple categories, can also be linked with log-likelihood ratio statistics.
The fundamental concepts of data collection are explained with examples. The basic concepts of the test procedures, along with ways to obtain log-linear models, are discussed, leading up to the general log-linear model in multiple dimensions. The examples that follow the concepts demonstrate the validity of the calculations.
Advisors/Committee Members: Laverty, William H, Li, Longhai, Kelly, Ivan, Soteros, Chris.
Subjects/Keywords: Design Effect; Rao-Scott Correction; Simultaneous Test Procedure
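The first-order Rao–Scott correction can be summarized in a few lines: the ordinary Pearson chi-squared statistic is divided by an estimate of the mean design effect, so the corrected statistic can again be referred to a chi-squared reference distribution. A hypothetical sketch, assuming the design effect `deff` has been estimated elsewhere from the survey design:

```python
def rao_scott_chi2(observed, expected, deff):
    """First-order Rao-Scott corrected chi-squared statistic.

    observed/expected are cell counts under the complex design and the
    null hypothesis; deff is the mean design effect (deff = 1 recovers
    the ordinary Pearson statistic for simple random sampling).
    """
    x2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return x2 / deff
```

With observed counts [30, 70], expected counts [50, 50] and deff = 2, the Pearson statistic of 16 is halved to 8, reflecting the reduced effective information in a clustered sample.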
5.
Bai, Wei.
Randomized Quantile Residual for Assessing Generalized Linear Mixed Models with Application to Zero-Inflated Microbiome Data.
Degree: 2018, University of Saskatchewan
URL: http://hdl.handle.net/10388/9216
In microbiome research, it is often of interest to investigate the impact of clinical and environmental factors on microbial abundance, which is often quantified as the total number of unique operational taxonomic units (OTUs). The important features of OTU count data are the presence of a large number of zeros and skewness in the positive counts. A common strategy for handling excessive zeros is to use zero-inflated or zero-modified (hurdle) models. Moreover, subjects in microbiome data often have a clustering structure, for example humans from the same family or plants from the same plot; as a result, random effects should be included to account for the clustering effects.
Model diagnosis is an essential step to ensure that a fitted model is adequate for the data. However, diagnosing zero-inflated count models is still a challenging research problem. Pearson and deviance residuals are often used in practice for diagnosing count models, despite wide recognition that these residuals are far from normality when applied to count data. The randomized quantile residual (RQR) was proposed in the literature to circumvent these problems with traditional residuals. The key idea of the RQR is to randomize the lower tail probability into a uniform random number within the discontinuity gap of the cumulative distribution function (CDF). It can be shown that RQRs are normally distributed under the true model. To the best of our knowledge, RQR has not been applied to diagnose zero-inflated or zero-modified mixed-effects models. In this thesis project, we developed generic R functions that compute RQRs for zero-inflated and zero-modified mixed-effects models based on the fitting outputs of glmmTMB. We tested our functions using datasets generated from zero-modified Poisson (ZMP) and zero-modified negative binomial (ZMNB) models. Our simulation studies show that RQRs are normally distributed under the true model. In goodness-of-fit (GOF) tests, the type I error rates are close to the nominal level 0.05, and the power to reject the wrong models is very good. We also applied RQR to assess 8 models for a real human microbiome OTU dataset and concluded that the ZMNB and zero-inflated negative binomial (ZINB) models provide adequate fits to the dataset.
Advisors/Committee Members: Li, Longhai, Feng, Cindy, Khan, Shahedul, Wright, Laura.
Subjects/Keywords: Operational taxonomic units (OTUs); randomized quantile residual (RQR); cumulative density function (CDF); zero-modified Poisson (ZMP); zero-modified negative binomial (ZMNB); zero-inflated negative binomial (ZINB).
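The randomization step described in the abstract can be made concrete for a plain Poisson count: draw a uniform number on the CDF's discontinuity gap (F(y−1), F(y)) and push it through the standard-normal quantile function. The thesis works with zero-inflated and zero-modified mixed models via glmmTMB in R; this Poisson-only Python sketch only illustrates the mechanism:

```python
import math
from statistics import NormalDist

def poisson_cdf(y, lam):
    """P(Y <= y) for Y ~ Poisson(lam)."""
    if y < 0:
        return 0.0
    p = total = math.exp(-lam)
    for k in range(1, y + 1):
        p *= lam / k
        total += p
    return total

def rqr(y, lam, rng):
    """Randomized quantile residual for one Poisson count: draw u
    uniformly on the discontinuity gap (F(y-1), F(y)) of the CDF,
    then map u through the standard-normal quantile function."""
    lo, hi = poisson_cdf(y - 1, lam), poisson_cdf(y, lam)
    u = rng.uniform(lo, hi)
    return NormalDist().inv_cdf(u)
```

Under the true model, u is exactly Uniform(0, 1), so the residuals are exactly standard normal; this is what makes formal normality-based GOF tests possible for discrete data.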
6.
Wu, Tingxuan 1987-.
Randomized Survival Probability Residual for Assessing Parametric Survival Models.
Degree: 2018, University of Saskatchewan
URL: http://hdl.handle.net/10388/11696
Traditional residuals for diagnosing accelerated failure time models in survival analysis, such as Cox-Snell, martingale and deviance residuals, have been widely used. However, those residuals are usually examined only visually, which can be subjective. Therefore, the lack of an objective measure for examining model adequacy has been a long-standing issue in survival analysis. In this thesis, a new type of residual, the normal-transformed randomized survival probability (NRSP) residual, is proposed. A comprehensive review of the traditional residuals, including Cox-Snell and deviance residuals, is first presented, highlighting their disadvantages for examining model adequacy. We then introduce the NRSP residual. Simulation studies were conducted to compare the performance of NRSP residuals with the traditional residuals; they demonstrate that NRSP residuals are approximately normally distributed when the fitted model is correctly specified and have great statistical power in detecting model inadequacies. We also apply NRSP residuals to a real dataset to check the goodness-of-fit of three plausible models.
Advisors/Committee Members: Li, Longhai, Feng, Cindy, Khan, Shahedul, Pahwa, Punam, Shao, Enchuan.
Subjects/Keywords: RSP; UCS; MCS; NCS
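The idea of a randomized survival probability can be sketched for the simplest parametric case, an exponential model. The function below is a hypothetical illustration (names and the exponential choice are mine, not the thesis code, which develops the residual for general parametric survival models): uncensored times use the fitted survival probability directly, while right-censored times draw a uniform value below it to randomize the unknown tail position.

```python
import math
from statistics import NormalDist

def nrsp_exponential(t, censored, rate, rng):
    """Normal-transformed randomized survival probability for one
    observation under a fitted exponential model, S(t) = exp(-rate * t).
    Uncensored time: use S(t) itself (uniform under the true model);
    right-censored time: draw uniformly on (0, S(t)) to stand in for
    the survival probability of the unobserved event time."""
    s = math.exp(-rate * t)
    rsp = s if not censored else rng.uniform(0.0, s)
    return NormalDist().inv_cdf(rsp)
```

Because S(T) is Uniform(0, 1) when T follows the fitted model, these transformed values are standard normal under a correctly specified model, which is what licenses formal normality checks.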
7.
Rana, Md Masud.
Spatial-Longitudinal Bent-Cable Model with an Application to Atmospheric CFC Data.
Degree: 2012, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2012-08-636
Spatial data (also called georeferenced data) arise in a wide range of scientific studies, including geography, agriculture, criminology, geology, and urban and regional economics. The underlying spatial effects (the measurement error caused by any spatial pattern embedded in the data) may affect both the validity and robustness of traditional descriptive and inferential techniques. Therefore, it is of paramount importance to take spatial effects into account when analysing spatially dependent data. In particular, the two major aspects of modelling spatial data are addressing the spatial association among attribute values observed at different locations and the systematic variation of phenomena by location.
The bent-cable is a parametric regression model to study data that exhibits a trend change over time. It comprises two linear segments to describe the incoming and outgoing phases, joined by a quadratic bend to model the transition period. For spatial longitudinal data, measurements taken over time are nested within spatially dependent locations. In this thesis, we extend the existing longitudinal bent-cable regression model to handle spatial effects. We do so in a hierarchical Bayesian framework by allowing the error terms to be correlated across space. We illustrate our methodology with an application to atmospheric chlorofluorocarbon (CFC) data. We also present a simulation study to demonstrate the performance of our proposed methodology.
Although we have tailored our work for the CFC data, our modelling framework may be applicable to a wide variety of other situations across the range of the econometrics, transportation, social, health and medical sciences. In addition, our methodology can be further extended by taking into account interaction between temporal and spatial effects. With the current model, this could be done with a spatial correlation structure that changes as a function of time.
Advisors/Committee Members: Khan, Shahedul A., Li, Longhai, Guo, Xulin, Soteros, Chris, Bickis, Mikelis.
Subjects/Keywords: atmospheric ozone depletion; Bayesian inference; bent cable regression; chlorofluorocarbon; longitudinal data; spatial effects
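The bent-cable mean function described in the abstract is compact enough to state directly: two linear phases joined by a quadratic bend of half-width gamma centred at the change point tau. A sketch using the common bent-cable parameterization (this is the mean function only, not the thesis's hierarchical Bayesian spatial implementation):

```python
def bent_cable(t, b0, b1, b2, tau, gamma):
    """Bent-cable mean function: an incoming linear phase, a quadratic
    bend of half-width gamma centred at the change point tau, and an
    outgoing linear phase with slope b1 + b2."""
    if t < tau - gamma:
        q = 0.0                                   # before the bend
    elif t <= tau + gamma:
        q = (t - tau + gamma) ** 2 / (4 * gamma)  # inside the bend
    else:
        q = t - tau                               # after the bend
    return b0 + b1 * t + b2 * q
```

With b0 = 0, b1 = 1, b2 = -2, tau = 5 and gamma = 1, the incoming slope is 1 and the outgoing slope is b1 + b2 = -1; the quadratic piece makes the function and its first derivative continuous at the bend's endpoints.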
8.
Ashleik, Naeima Abdallah 1980-.
Learning for Contingency Tables and Survival Data using Imprecise Probabilities.
Degree: 2018, University of Saskatchewan
URL: http://hdl.handle.net/10388/8501
Bayesian inference is a method of statistical inference in which all forms of uncertainty are expressed in terms of probability. Classical Bayesian inference has some limitations: we may have little to no information about the experiment; we may face computational or time constraints; and there may be conflict in choosing a prior distribution when different experts give different prior information, which results in less precise posterior probabilities. Because of these limitations, the imprecise Bayesian approach is used in place of classical Bayesian inference.
Upper and lower posterior expectations are computed in order to calculate the degree of imprecision of the log-odds ratio. This is implemented for two-way contingency tables and then generalized to three-way tables using different families of prior distributions, which is the core of this work. Survival data including right-censored observations are generated and converted to a sequence of 2 × 2 tables (a three-way contingency table), with one 2 × 2 table built at each observed death time; we assume only one death happens at each time, with no ties.
To implement imprecise Bayesian inference, two choices of imprecise priors are considered: a set of four normal priors and a set of four beta priors, each used with a non-central hypergeometric likelihood to update the posterior families, after which the degree of imprecision is calculated. The method is also applied to real data on ovarian cancer survival, where upper and lower posterior expectations are estimated in order to calculate the degree of imprecision.
We conduct simulation studies to sample from the posterior distribution and estimate the log-odds ratio using upper and lower posterior expectations. For three-way contingency tables, a set of priors is updated to a set of posteriors sequentially at each table by running MCMC in JAGS from R via the rjags and runjags packages. Four factors (sample size, censoring rate, true parameter, and balancing rate) are studied to see how they affect the degree of imprecision under the two choices of imprecise priors. A fractional factorial design of 27 runs is constructed to see which of these four factors is most significant. For each of the 27 combinations, the upper and lower posterior expectations and the degree of imprecision of the log-odds ratio are calculated.
The findings show that the smallest degree of imprecision appears at the combination with a large sample size (n = 200) and a small number of censored times. In contrast, the largest degree of imprecision is observed at the combination with a small sample size (n = 40) and a large number of censored times. These conclusions are supported by ANOVA findings showing that the main effects of the four factors are significant. In summary, the results of this work indicate that having more information (more data) leads to less…
Advisors/Committee Members: Bickis, Mike, Soteros, Chris, Khan, Shahedul, Li, Longhai, Neufeld, Eric.
Subjects/Keywords: Bayesian inference; Contingency tables; Survival data; Imprecise probability
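The "degree of imprecision" idea can be illustrated on a single 2 × 2 table: carry a set of priors through to a set of posterior estimates and report the spread. The sketch below uses independent beta priors on the two row probabilities and a plug-in posterior-mean estimate, a deliberate simplification of the thesis's upper and lower posterior expectations computed by MCMC; all names are illustrative.

```python
import math

def log_odds_ratio_plugin(a, b, c, d, prior):
    """Plug-in posterior log-odds ratio for a 2x2 table (rows: a,b and
    c,d) with independent Beta(alpha, beta) priors on the two row
    probabilities; posterior means stand in for full posterior
    expectations."""
    alpha, beta = prior
    p1 = (a + alpha) / (a + b + alpha + beta)
    p2 = (c + alpha) / (c + d + alpha + beta)
    return math.log(p1 / (1 - p1)) - math.log(p2 / (1 - p2))

def imprecision(a, b, c, d, priors):
    """Degree of imprecision: spread of the estimate across a set of priors."""
    vals = [log_odds_ratio_plugin(a, b, c, d, p) for p in priors]
    return max(vals) - min(vals)
```

With a single prior the spread is zero by construction, and scaling all cell counts up by a factor of 10 shrinks the spread, matching the thesis's finding that more data leads to less imprecision.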
9.
Rostamiforooshani, Mehdi 1988-.
Modeling the Risk of Hip Fracture among Residents in the Long Term Care Facilities in British Columbia, Canada: Impact of Misspecification of the Correlation Structure on the Parameter Estimates.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7324
► In practice, survival data are often grouped into clusters, such as clinical sites, geographical regions and so on. This clustering imposes correlation among individuals within…
(more)
▼ In practice, survival data are often grouped into clusters, such as clinical sites or geographical regions. This clustering imposes correlation among individuals within each cluster, known as within-cluster correlation. For instance, in our motivating example, within each long term care facility (LTCF), the elderly are likely from nearby areas with a similar quality of life and access to similar health care. As such, individuals sharing the same hidden features may be correlated with each other. The shared frailty model is therefore often used to take into account the correlation among individuals from the same cluster. In some applications, when the survival data are collected over geographical regions, random effects corresponding to regions in closer proximity to each other might also be similar in magnitude, due to underlying environmental characteristics. A shared spatial frailty model can then be adopted to model the spatial correlation among the clusters; such models are often implemented using Bayesian Markov chain Monte Carlo (MCMC) methods. This approach comes at the price of slow mixing rates and heavy computational cost, which may render it impractical for data-intensive applications.
In this thesis, motivated by the computational challenges encountered in modelling spatial correlation in a real application involving large-scale survival data, we used simulations to assess the efficiency loss in parameter estimates when residual spatial correlation is present but a spatially uncorrelated random effect term is used in the model. Our simulation study indicates that the shared frailty model with only the spatially correlated random effect term may not be sufficient to capture the total residual variation, whereas the simpler model with only the spatially uncorrelated random effect term performs surprisingly well in estimating the model parameters compared with the true model containing both the spatially correlated and uncorrelated random effect terms. As such, using the shared frailty model with an independent frailty term should be reliable for estimating the effects of covariates, especially when the percentage of censoring is not high and the number of clusters is large. Such a model is also advantageous because it can be easily and efficiently implemented in standard statistical software. This is not to say that the shared frailty model with an independent frailty term should be preferred over the spatial frailty model in all cases. Indeed, when the primary goal of inference is predicting the hazard for a specific covariate group, additional care is needed because of the bias in the scale parameter of the Weibull distribution when the correlation structure is misspecified.
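Clustered survival data with a shared (spatially uncorrelated) gamma frailty, of the kind such a simulation study generates, can be sketched as follows. All parameter values, the censoring scheme, and the function name are illustrative assumptions, not the thesis's actual simulation design.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_clustered_weibull(n_clusters=50, cluster_size=20, shape=1.5,
                               scale=1.0, frailty_var=0.5, censor_time=2.0):
    """Shared gamma frailty with a Weibull baseline hazard: given a cluster
    frailty z_i, survival times are T_ij = scale * (E_ij / z_i)**(1/shape)
    with E_ij ~ Exp(1), so all members of a cluster share the same z_i."""
    z = rng.gamma(1.0 / frailty_var, frailty_var, size=n_clusters)  # mean-1 frailties
    e = rng.exponential(size=(n_clusters, cluster_size))
    t = scale * (e / z[:, None]) ** (1.0 / shape)
    event = t <= censor_time                # administrative censoring indicator
    return np.minimum(t, censor_time), event, z

times, event, z = simulate_clustered_weibull()
# High-frailty clusters fail earlier, which is exactly the within-cluster
# correlation that the shared frailty term is meant to capture.
print(times.shape, round(event.mean(), 2))
```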
Advisors/Committee Members: Feng, Cindy Xin., Whiting, Susan, Li, Longhai, Guo, Xulin, Szafron, Michael.
Subjects/Keywords: Regression; Random-effect term; Likelihood function; Bias; Bayesian Statistics; MCMC

University of Saskatchewan
10.
Sajobi, Tolulope.
Descriptive discriminant analysis for repeated measures data.
Degree: 2012, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2012-03-361
► Background: Linear discriminant analysis (DA) encompasses procedures for classifying observations into groups (predictive discriminant analysis, PDA) and describing the relative importance of variables for distinguishing…
(more)
▼ Background: Linear discriminant analysis (DA) encompasses procedures for classifying observations into groups (predictive discriminant analysis, PDA) and describing the relative importance of variables for distinguishing between groups (descriptive discriminant analysis, DDA) in multivariate data. In recent years, there has been increased interest in DA procedures for repeated measures data. PDA procedures that assume parsimonious repeated measures mean and covariance structures have been developed, but corresponding DDA procedures have not been proposed. Most DA procedures for repeated measures data rest on the assumption of multivariate normality, which may not be satisfied in biostatistical applications. For example, health-related quality of life (HRQOL) measures, which are increasingly being used as outcomes in clinical trials and cohort studies, are likely to exhibit skewed or heavy-tailed distributions. As well, measures of relative importance based on discriminant function coefficients (DFCs) for DDA procedures have not been proposed for repeated measures data. Purpose: The purpose of this research is to develop repeated measures discriminant analysis (RMDA) procedures based on parsimonious covariance structures, including compound symmetric and first order autoregressive structures, and that are robust (i.e., insensitive) to multivariate non-normal distributions. It also extends these methods to evaluate the relative importance of variables in multivariate repeated measures (i.e., doubly multivariate) data. Method: Monte Carlo studies were conducted to investigate the performance of the proposed RMDA procedures under various degrees of group mean separation, repeated measures correlation structures, departure from multivariate normality, and magnitude of covariance mis-specification. 
Data from the Manitoba Inflammatory Bowel Disease Cohort Study, a prospective longitudinal cohort study of the psychosocial determinants of health and well-being, are used to illustrate their applications. Results: The conventional maximum likelihood (ML) estimates of DFCs for RMDA procedures based on parsimonious covariance structures exhibited substantial bias and error when the covariance structure was mis-specified or when the data followed a multivariate skewed or heavy-tailed distribution. The DFCs of RMDA procedures based on robust estimators, obtained from coordinatewise trimmed means and Winsorized variances, were less biased and more efficient when the data followed a multivariate non-normal distribution, but were sensitive to the effects of covariance mis-specification. Measures of relative importance for doubly multivariate data based on linear combinations of the within-variable DFCs resulted in the highest proportion of correctly ranked variables. Conclusions: DA procedures based on parsimonious covariance structures and robust estimators will produce unbiased and efficient estimates of the relative importance of variables in repeated measures data and can be used to test for change in relative importance…
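The robust estimators mentioned above can be sketched from the standard definitions of the coordinatewise trimmed mean and Winsorized variance; the 20% trimming proportion here is an illustrative choice, not necessarily the thesis's setting.

```python
import numpy as np

def trimmed_mean(x, prop=0.20):
    """Mean after discarding the lowest and highest prop of observations."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(np.floor(prop * x.size))
    return x[g:x.size - g].mean()

def winsorized_variance(x, prop=0.20):
    """Sample variance after pulling each tail's prop of values in to the
    nearest retained order statistic (Winsorizing)."""
    x = np.sort(np.asarray(x, dtype=float))
    g = int(np.floor(prop * x.size))
    x[:g] = x[g]
    x[x.size - g:] = x[x.size - g - 1]
    return x.var(ddof=1)

data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 100]   # one gross outlier
print(trimmed_mean(data), winsorized_variance(data))
```

Unlike the ordinary mean and variance, both statistics are barely moved by the outlier 100, which is why DFCs built from them stay stable under heavy-tailed or skewed distributions.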
Advisors/Committee Members: Lix, Lisa M., Muhajarine, Nazeem, Laverty, William, Jones, Jennifer, Li, Longhai.
Subjects/Keywords: discriminant analysis; repeated measurements; covariance structure; health-related quality of life

University of Saskatchewan
11.
Li, Zhaoqin.
QUANTIFYING GRASSLAND NON-PHOTOSYNTHETIC VEGETATION BIOMASS USING REMOTE SENSING DATA.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/8130
► Non-photosynthetic vegetation (NPV) refers to vegetation that cannot perform a photosynthetic function. NPV, including standing dead vegetation and surface plant litter, plays a vital role…
(more)
▼ Non-photosynthetic vegetation (NPV) refers to vegetation that cannot perform a photosynthetic function. NPV, including standing dead vegetation and surface plant litter, plays a vital role in maintaining ecosystem function through controlling carbon, water and nutrient uptake as well as natural fire frequency and intensity in diverse ecosystems such as forest, savannah, wetland, cropland, and grassland. Due to its ecological importance, NPV has been selected as an indicator of grassland ecosystem health by the Alberta Public Lands Administration in Canada. The ecological importance of NPV has driven considerable research on quantifying NPV biomass with remote sensing approaches in various ecosystems. Although remote images, especially hyperspectral images, have demonstrated potential for use in NPV estimation, there has not been a way to quantify NPV biomass in semiarid grasslands where NPV biomass is affected by green vegetation (PV), bare soil and biological soil crust (BSC). The purpose of this research is to find a solution to quantitatively estimate NPV biomass with remote sensing approaches in semiarid mixed grasslands. Research was conducted in Grasslands National Park (GNP), a parcel of semiarid mixed prairie grassland in southern
Saskatchewan, Canada. Multispectral images, including newly operational Landsat 8 Operational Land Imager (OLI) and Sentinel-2A Multi-spectral Instrument (MSI) images, and Fine Quad-Pol Radarsat-2 images were used for estimating NPV biomass in the early, middle, and peak growing seasons via a simple linear regression approach. The results indicate that multispectral Landsat 8 OLI and Sentinel-2A MSI have the potential to quantify NPV biomass in the peak and early-senescence growing seasons. Radarsat-2 can also provide a solution for NPV biomass estimation; however, the performance of Radarsat-2 images is greatly affected by the incidence angle of image acquisition. This research filled a critical gap in applying remote sensing approaches to quantify NPV biomass in grassland ecosystems. NPV biomass estimates and approaches for estimating NPV biomass will contribute to grassland ecosystem health assessment (EHA) and natural resource (i.e., land, soil, water, plant, and animal) management.
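The simple linear regression approach described above amounts to fitting field-measured NPV biomass against a spectral index by least squares. The sketch below is purely illustrative: the index and biomass values are hypothetical, not data from the thesis.

```python
import numpy as np

# Hypothetical pairs of a dead-vegetation spectral index and field-measured
# NPV biomass (g/m^2); the numbers are invented for illustration only.
index = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
biomass = np.array([35.0, 60.0, 80.0, 110.0, 130.0, 160.0])

slope, intercept = np.polyfit(index, biomass, 1)   # ordinary least squares
pred = intercept + slope * index
r2 = 1 - ((biomass - pred) ** 2).sum() / ((biomass - biomass.mean()) ** 2).sum()
print(f"biomass = {intercept:.1f} + {slope:.1f} * index, R^2 = {r2:.3f}")
```

The fitted slope and R² are the quantities one would compare across sensors (OLI, MSI, Radarsat-2) and growing seasons to judge which image source best predicts NPV biomass.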
Advisors/Committee Members: Guo, Xulin, Noble, Bram, de Boer, Dirk, Li, Longhai, Akkerman, Avi.
Subjects/Keywords: non-photosynthetic vegetation; biomass; green vegetation; biological soil crust; bare soil; multispectral image; Landsat 8; Sentinel-2A; Radarsat-2; ecosystem health; vegetation phenology
12.
Qiu, Shi.
Cross-validatory Model Comparison and Divergent Regions Detection using iIS and iWAIC for Disease Mapping.
Degree: 2015, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2015-03-1988
► The well-documented problems associated with mapping raw rates of disease have resulted in an increased use of Bayesian hierarchical models to produce maps of "smoothed''…
(more)
▼ The well-documented problems associated with mapping raw rates of disease have resulted in an increased use of Bayesian hierarchical models to produce maps of "smoothed" estimates of disease rates. Two statistical problems arise in using Bayesian hierarchical models for disease mapping. The first is comparing the goodness of fit of various models, which can be used to test different hypotheses. The second is identifying outliers/divergent regions with unusually high or low residual risk of disease, or regions whose disease rates are not well fitted. The results of outlier detection may generate further hypotheses as to what additional covariates might be necessary for explaining the disease. Leave-one-out cross-validatory (LOOCV) model assessment has been used for both problems. However, exact LOOCV is time-consuming. This thesis introduces two methods, namely iIS and iWAIC, for approximating LOOCV using only Markov chain samples simulated from a posterior distribution based on the full data set. In iIS and iWAIC, we first integrate out the latent variables without reference to the holdout observation, then apply the IS and WAIC approximations to the integrated predictive density and evaluation function. We apply iIS and iWAIC to two real data sets. Our empirical results show that iIS and iWAIC can provide significantly better estimates of LOOCV model assessment than existing methods, including DIC, importance sampling, WAIC, posterior checking, and ghosting methods.
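Plain importance-sampling LOOCV, the baseline that iIS refines by first integrating out the latent variables, can be sketched on a conjugate normal toy model where exact posterior draws are available without MCMC. All model choices below are illustrative assumptions, not the thesis's disease-mapping model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: y_i ~ N(mu, 1) with prior mu ~ N(0, 10^2),
# so posterior draws of mu come from a closed-form normal.
y = rng.normal(1.0, 1.0, size=30)
n = y.size
post_var = 1.0 / (n + 1.0 / 100.0)
post_mean = post_var * y.sum()
draws = rng.normal(post_mean, np.sqrt(post_var), size=4000)

def normal_logpdf(x, mean, sd=1.0):
    return -0.5 * np.log(2.0 * np.pi * sd ** 2) - (x - mean) ** 2 / (2.0 * sd ** 2)

# Importance-sampling LOOCV: p(y_i | y_{-i}) is approximated by the harmonic
# mean of p(y_i | mu_s) over draws mu_s from the full-data posterior.
loglik = normal_logpdf(y[:, None], draws[None, :])        # (n, S) matrix
loo_lpd = -np.log(np.mean(np.exp(-loglik), axis=1))       # log p(y_i | y_{-i})
elpd_loo = loo_lpd.sum()
print(round(elpd_loo, 2))
```

In models with observation-specific latent variables (such as region-level random effects in disease mapping), these raw importance weights become unstable; iIS addresses this by integrating the latent variable out of the likelihood before weighting.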
Advisors/Committee Members: Li, Longhai, Feng, Cindy.Xin, Bickis, Mikelis G..
Subjects/Keywords: cross-validation; disease mapping; importance sampling; WAIC
13.
Wang, Yunyang 1986-.
Comparison of Stochastic Volatility Models Using Integrated Information Criteria.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7591
► Stochastic volatility (SV) models are a family of models that commonly used in the modeling of stock prices. In all SV models, volatility is treated…
(more)
▼ Stochastic volatility (SV) models are a family of models commonly used in modeling stock prices. In all SV models, volatility is treated as a stochastic time series. However, SV models can differ considerably from one another in both their underlying principles and their parameter layouts. Therefore, selecting the most appropriate SV model for a given set of stock price data is important for making future predictions of the stock market. To achieve this goal, leave-one-out cross-validation (LOOCV) methods could be used. However, LOOCV methods are computationally expensive, so their use is very limited in practice. In our studies of SV models, we proposed two new model-selection approaches, the integrated widely applicable information criterion (iWAIC) and the integrated importance sampling information criterion (iIS-IC), as alternatives for approximating LOOCV results. In the iWAIC and iIS-IC methods, we first calculate the expected likelihood of each observation as an integral with respect to the corresponding latent variable (the current log-volatility parameter). Since the observations are highly correlated with their corresponding latent variables, the integrated likelihood of each observation y_t^obs is expected to approximate the expected likelihood of y_t^obs calculated from the model with y_t^obs as its holdout data. Second, the integrated expected likelihood is used, as a replacement for the expected likelihood, in the calculation of the information criteria. Since the integration with respect to the latent variable largely reduces the model's bias towards the corresponding observation, the integrated information criteria are expected to approximate LOOCV results. To evaluate the performance of iWAIC and iIS-IC, we first conducted an empirical study using simulated data sets. The results from this study show that the iIS-IC method improves on the traditional IS-IC, but iWAIC does not outperform the non-integrated WAIC method.
A further empirical study using real-world stock market return data was subsequently carried out. According to the model-selection results, the best model for the given data is either the SV model with two independent autoregressive processes, or the SV model with nonzero expected returns.
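A basic SV model of the kind compared here pairs an AR(1) log-volatility process with normally distributed returns. The sketch below simulates such a series; the parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_sv(T=500, mu=-1.0, phi=0.95, sigma=0.2):
    """Basic SV model: log-volatility follows an AR(1) process,
    h_t = mu + phi * (h_{t-1} - mu) + sigma * eta_t, and returns are
    y_t = exp(h_t / 2) * eps_t, with independent standard normal shocks."""
    h = np.empty(T)
    # Draw h_0 from the stationary distribution of the AR(1) process.
    h[0] = mu + sigma / np.sqrt(1.0 - phi ** 2) * rng.standard_normal()
    for t in range(1, T):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma * rng.standard_normal()
    y = np.exp(h / 2.0) * rng.standard_normal(T)
    return y, h

returns, logvol = simulate_sv()
print(returns.shape, round(logvol.mean(), 2))
```

Because h_t is latent and strongly tied to its own observation y_t, a criterion computed from the joint likelihood is biased toward the observed data, which is exactly why the integrated criteria marginalize h_t out before scoring each y_t.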
Advisors/Committee Members: Li, Longhai, Samei, Ebrahim, Liu, Juxin, Chaban, Maxym.
Subjects/Keywords: Model selection criteria; Stochastic volatility models; Integrated information criteria
14.
ABDULLAH CHISTI, K B M.
Texture Analysis of Diffraction Enhanced Synchrotron Images of Trabecular Bone at the Wrist.
Degree: 2013, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2013-08-1194
► The purpose of this study is to determine the correlation between texture features of Diffraction Enhanced Imaging (DEI) images and trabecular properties of human…
(more)
▼ The purpose of this study is to determine the correlation between texture features of Diffraction Enhanced Imaging (DEI) images and trabecular properties of human wrist bone in the assessment of osteoporosis. Osteoporosis is a metabolic bone disorder characterized by reduced bone mass and a deterioration of bone structure, which results in an increased fracture risk. Since the disease is preventable, diagnostic techniques are of major importance. Bone micro-architecture and bone mineral density (BMD) are the two main factors related to osteoporotic fractures. Trabecular properties such as bone volume (BV), trabecular number (Tb.N), trabecular thickness (Tb.Th), bone surface (BS), and other properties of bone characterize the bone architecture. Currently, however, BMD is the only measurement carried out to assess osteoporosis. Researchers suggest that bone micro-architecture and texture analysis of bone images, along with BMD, can provide more accuracy in the assessment.
We applied texture analysis to DEI images and extracted texture features. In our study, we used fractal analysis, the gray level co-occurrence matrix (GLCM), the texture feature coding method (TFCM), and local binary patterns (LBP) as texture analysis methods to extract texture features. 3D Micro-CT trabecular properties were extracted using SkyScan CTAN software. We then determined the correlation between texture features and trabecular properties. The GLCM energy feature of DEI images explained more than 39% of the variance in bone surface by volume ratio (BS/BV), 38% of the variance in percent bone volume (BV/TV), and 37% of the variance in trabecular number (Tb.N). The TFCM homogeneity feature of DEI images explained more than 42% of the variance in the bone surface (BS) parameter. The LBP operator LBP 11 of DEI images explained more than 34% of the variance in bone surface (BS) and 30% of the variance in bone surface density (BS/TV). The fractal dimension parameter of DEI images explained more than 47% of the variance in bone surface (BS) and 32% of the variance in bone volume (BV). This study will facilitate the quantification of osteoporosis beyond conventional BMD.
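The GLCM energy feature reported above can be sketched from its standard definition: the sum of squared entries of a normalized gray-level co-occurrence matrix. Quantization to a small number of levels and the single-offset form below are illustrative simplifications, not the thesis's exact pipeline.

```python
import numpy as np

def glcm_energy(img, levels=8, offset=(0, 1)):
    """Energy (angular second moment) of a normalized gray-level co-occurrence
    matrix for one pixel offset: the sum of squared co-occurrence probabilities.
    `img` is assumed to hold intensities in [0, 1]."""
    q = np.minimum((np.asarray(img, dtype=float) * levels).astype(int), levels - 1)
    dr, dc = offset
    # Align each pixel with its neighbor at the given (row, col) offset.
    a = q[max(0, -dr):q.shape[0] - max(0, dr), max(0, -dc):q.shape[1] - max(0, dc)]
    b = q[max(0, dr):, max(0, dc):][:a.shape[0], :a.shape[1]]
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)   # count co-occurring level pairs
    glcm /= glcm.sum()
    return (glcm ** 2).sum()

# A uniform image has maximal energy (1.0); a checkerboard has lower energy.
flat = np.zeros((4, 4))
board = (np.indices((4, 4)).sum(axis=0) % 2) * 0.99
print(glcm_energy(flat), glcm_energy(board, levels=2))
```

High energy indicates a homogeneous texture, which is why it tracks structural measures such as BS/BV in trabecular bone.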
Advisors/Committee Members: Eramian, Mark, Cooper, David, Neufeld, Eric, Chapman, Dean, Li, Longhai.
Subjects/Keywords: Texture; Osteoporosis; TFCM; GLCM; LBP; Fractal; Bone micro architecture; DEI
15.
Islam, Naorin 1990-.
Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7632
► Background of the Study. Repeated use of a substance (alcohol or drug) may lead to mental and physical sickness, personality changes, insomnia, nausea, mood swings…
(more)
▼ Background of the Study. Repeated use of a substance (alcohol or drugs) may lead to mental and physical sickness, personality changes, insomnia, nausea, mood swings and other disturbances. The number of people addicted to alcohol and/or drugs has been increasing every year at an alarming rate. Although the extent of abuse is not directly measurable (i.e., latent), statistical techniques allow us to describe such a hypothetical construct using available information.
Objective. There are many factors potentially associated with substance abuse (e.g., smoking, education, cultural background). Although these variables are readily available in many studies, the cause (e.g., a measure of drug or alcohol abuse) is latent, with the observed variables being its manifestations. A measure of a latent health factor index could also be of particular interest. In this study, we investigate the effects of socio-demographic variables on substance (drug and alcohol) abuse and health in the Canadian population. In particular, the objective is to address the following questions: (a) What would be a reasonable hypothesis to explain causes of substance abusive behavior (i.e., cause and effect relationship)? (b) What model would adequately describe the cause-and-effect relationship between the observed variables and health and substance-related latent variables? (c) What covariates are significantly associated with alcohol and drug abusive environments and health status?
Method. To describe the cause-and-effect relationship among substance abuse, health and socio-demographic variables, we consider structural equation modeling. One of the appealing features of this technique is that it provides a concise assessment of complex relationships. The idea is to formulate a hypothesis regarding such relationships based on prior knowledge about the problem at hand, and then evaluate this hypothesis using statistical techniques. The main goal is to develop a model/hypothesis which can adequately describe the interrelationships among these variables.
Summary Results. The study is based on a survey conducted by Health Canada. We consider 2012 survey data for
Saskatchewan and Manitoba, and then develop models to describe the complex relationships among three hypothetical constructs (drug and alcohol abusive environments and health) and socio-demographic variables. One of the important findings of the study is that an increase in the severity of a drug abusive environment may worsen the health of individuals. Another interesting finding is that smoking has no direct effect on health, but it may lead to an environment (alcohol or drug abusive) that could have a negative impact on health. Based on our findings, we conclude that substance abuse may significantly deteriorate health. This research will provide policy-makers as well as the public with an understanding of the extent of the impacts of substance abuse and relevant socio-demographic variables on health.
Advisors/Committee Members: Khan, Shahedul, Li, Longhai, Srinivasan, Raj, Guo, Xulin.
Subjects/Keywords: Structural equation modeling
APA (6th Edition):
Islam, N. 1. (2016). Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/7632
Chicago Manual of Style (16th Edition):
Islam, Naorin 1990-. “Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects.” 2016. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/7632.
MLA Handbook (7th Edition):
Islam, Naorin 1990-. “Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects.” 2016. Web. 16 Feb 2019.
Vancouver:
Islam N1. Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects. [Internet] [Thesis]. University of Saskatchewan; 2016. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/7632.
Council of Science Editors:
Islam N1. Substance Abuse and Health: A Structural Equation Modeling Approach to Assess Latent Health Effects. [Thesis]. University of Saskatchewan; 2016. Available from: http://hdl.handle.net/10388/7632
16.
Jiang, Lai.
Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure.
Degree: 2015, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2015-09-2232
▼ Feature selection is demanded in many modern scientific research problems that
use high-dimensional data. A typical example is to find the genes that are most related to a certain disease (e.g., cancer) from high-dimensional gene expression profiles. There are tremendous difficulties in eliminating a large number of useless or redundant features. The expression levels of genes have structure; for example, a group of co-regulated genes that have similar biological functions tend to have similar mRNA expression levels. Many statistical
methods have been proposed to take the grouping structure into consideration in feature selection and regression, including Group LASSO, Supervised Group LASSO, and regression on group representatives. In this thesis, we propose to use a sophisticated Markov chain Monte Carlo method (Hamiltonian Monte Carlo with restricted Gibbs sampling) to fit T-probit regression
with heavy-tailed priors to make selection in the features with grouping structure. We will
refer to this method as fully Bayesian T-probit. The main feature of fully Bayesian T-probit is that it can make feature selection within groups automatically, without pre-specification of the grouping structure, and discard noise features more efficiently than LASSO (Least Absolute Shrinkage and Selection Operator). Therefore, the feature subsets selected by fully Bayesian T-probit are significantly sparser than the subsets
selected by many other methods in the literature. Such succinct feature subsets are much easier to interpret or understand based on existing biological knowledge and further experimental investigations. In this thesis, we
use simulated and real datasets to demonstrate that the predictive performance of the sparser feature subsets selected by fully Bayesian T-probit is comparable with that of the much larger feature subsets selected by plain LASSO, Group LASSO, Supervised
Group LASSO, random forest, penalized logistic regression and t-test. In addition,
we demonstrate that the succinct feature subsets selected by fully Bayesian T-probit have significantly better predictive power than the feature subsets of the same size taken from the top features selected by the aforementioned methods.
Advisors/Committee Members: Li, Longhai, Bickis, Mik, Liu, Juxin, Kusalik, Anthony, Stephens, David.
Subjects/Keywords: Bayesian methods; probit; MCMC; gene expression data; grouping structure
APA (6th Edition):
Jiang, L. (2015). Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2015-09-2232
Chicago Manual of Style (16th Edition):
Jiang, Lai. “Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure.” 2015. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2015-09-2232.
MLA Handbook (7th Edition):
Jiang, Lai. “Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure.” 2015. Web. 16 Feb 2019.
Vancouver:
Jiang L. Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure. [Internet] [Thesis]. University of Saskatchewan; 2015. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2015-09-2232.
Council of Science Editors:
Jiang L. Fully Bayesian T-probit Regression with Heavy-tailed Priors for Selection in High-Dimensional Features with Grouping Structure. [Thesis]. University of Saskatchewan; 2015. Available from: http://hdl.handle.net/10388/ETD-2015-09-2232
17.
Sadeghpour, Alireza.
Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7513
▼ Traditional tools for model diagnosis of generalized linear models (GLMs), such as deviance and Pearson residuals, have often been utilized to examine the goodness of fit of GLMs. In normal linear regression, both of these residuals coincide and are normally distributed; however, in non-normal regression models, such as logistic or Poisson regression, the residuals are far from normality, aligning along nearly parallel curves corresponding to distinct response values, which poses great challenges for visual inspection. As such, residual plots for models of discrete outcome variables convey very limited meaningful information, which renders them of limited practical use.
Randomized quantile residuals were proposed in the literature to circumvent the above-mentioned problems with the traditional residuals in modeling discrete outcomes. However, this approach has not gained the awareness and attention it deserves. Therefore, in this thesis, we theoretically justify the normality of randomized quantile residuals and compare their performance with that of the traditional Pearson and deviance residuals through a set of simulation studies. Our simulation studies demonstrate the normality of randomized quantile residuals when the fitted model is true. Further, we show that randomized quantile residuals are able to detect many kinds of model inadequacy. For instance, the linearity assumption of a covariate effect in a GLM can be examined by visually checking the plots of randomized quantile residuals against the predicted values or the covariates. Randomized quantile residuals can also be used to detect overdispersion and zero-inflation, two commonly occurring problems with count data. We advocate examining the normality of randomized quantile residuals as a unifying way to examine the goodness of fit of regression models, especially for modeling discrete outcomes. We also demonstrate this approach in a real application studying the independent association between air pollution and daily influenza incidence in Beijing, China.
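The residual construction this abstract describes is simple to sketch: for a discrete outcome, draw a uniform value inside the jump of the fitted CDF at each observed count and map it to the standard-normal scale. The following is an editorial illustration with simulated Poisson data; the function name and test data are assumptions, not code from the thesis:

```python
import numpy as np
from scipy import stats

def randomized_quantile_residuals(y, mu, rng):
    """Randomized quantile residuals for a Poisson fit:
    y are observed counts, mu the fitted means."""
    lower = stats.poisson.cdf(y - 1, mu)      # P(Y <= y - 1); 0 when y = 0
    upper = stats.poisson.cdf(y, mu)          # P(Y <= y)
    u = rng.uniform(lower, upper)             # randomize within the jump
    u = np.clip(u, 1e-12, 1 - 1e-12)          # guard against ppf(0) or ppf(1)
    return stats.norm.ppf(u)                  # map to the standard-normal scale

rng = np.random.default_rng(0)
mu = np.full(5000, 3.0)                       # true model: Poisson with mean 3
y = rng.poisson(mu)
r = randomized_quantile_residuals(y, mu, rng)
# When the fitted model is true, r behaves like an i.i.d. N(0, 1) sample.
```

Plotting r against fitted values or covariates then gives the visual checks the abstract advocates.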
Advisors/Committee Members: Li, Longhai, Feng, Cindy, Khan, Shahedul, Samei, Ebrahim, Eramian, Mark.
Subjects/Keywords: Regression; GLM; Residual; Pearson; Deviance; Randomized Quantile; ZIP
APA (6th Edition):
Sadeghpour, A. (2016). Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/7513
Chicago Manual of Style (16th Edition):
Sadeghpour, Alireza. “Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models.” 2016. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/7513.
MLA Handbook (7th Edition):
Sadeghpour, Alireza. “Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models.” 2016. Web. 16 Feb 2019.
Vancouver:
Sadeghpour A. Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models. [Internet] [Thesis]. University of Saskatchewan; 2016. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/7513.
Council of Science Editors:
Sadeghpour A. Empirical Investigation of Randomized Quantile Residuals for Diagnosis of Non-Normal Regression Models. [Thesis]. University of Saskatchewan; 2016. Available from: http://hdl.handle.net/10388/7513
18.
Wang, Yunyang 1986-.
Comparison of Stochastic Volatility Models Using Integrated Information Criteria.
Degree: 2016, University of Saskatchewan
URL: http://hdl.handle.net/10388/7590
▼ Stochastic volatility (SV) models are a family of models commonly used in the modeling of stock prices. In all SV models, volatility is treated as a stochastic time series. However, SV models still differ considerably from each other in both underlying principles and parameter layouts. Therefore, selecting the most appropriate SV model for a given set of stock price data is important for making future predictions of the stock market. To achieve this goal, leave-one-out cross-validation (LOOCV) methods could be used. However, LOOCV methods are computationally expensive, so their use is very limited in practice. In our studies of SV models, we propose two new model-selection approaches, the integrated widely applicable information criterion (iWAIC) and the integrated importance sampling information criterion (iIS-IC), as alternatives for approximating LOOCV results. In the iWAIC and iIS-IC methods, we first calculate the expected likelihood of each observation as an integral with respect to the corresponding latent variable (the current log-volatility parameter). Since the observations are highly correlated with their corresponding latent variables, the integrated likelihood of each observation y_t^obs is expected to approximate the expected likelihood of y_t^obs calculated from the model with y_t^obs as its holdout data. Second, the integrated expected likelihood is used, as a replacement for the expected likelihood, in the calculation of the information criteria. Since the integration with respect to the latent variable largely reduces the model's bias towards the corresponding observation, the integrated information criteria are expected to approximate LOOCV results. To evaluate the performance of iWAIC and iIS-IC, we first conducted an empirical study using simulated data sets. The results from this study show that the iIS-IC method improves on the traditional IS-IC, but iWAIC does not outperform the non-integrated WAIC method.
A further empirical study using real-world stock market return data was subsequently carried out. According to the model-selection results, the best model for the given data is either the SV model with two independent autoregressive processes, or the SV model with nonzero expected returns.
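For orientation, the basic SV model underlying this family treats the log-volatility as a latent AR(1) process. The simulation sketch below is an editorial illustration; the function name and parameter values are assumptions, not the thesis's specification:

```python
import numpy as np

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
    """Simulate a basic stochastic-volatility series:
    h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t  (latent log-volatility)
    y_t = exp(h_t / 2) * eps_t                           (observed return)"""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    # start the log-volatility at its stationary distribution
    h[0] = mu + rng.normal(0.0, sigma_eta / np.sqrt(1.0 - phi ** 2))
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
    y = np.exp(h / 2) * rng.normal(size=n)
    return y, h

y, h = simulate_sv(10000)
# Model selection then asks which variant of this structure (e.g., two
# independent AR processes, nonzero expected returns) best fits the data.
```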
Advisors/Committee Members: Li, Longhai, Samei, Ebrahim, Liu, Juxin, Chaban, Maxym.
Subjects/Keywords: Model selection criteria; Stochastic volatility models; Integrated information criteria
APA (6th Edition):
Wang, Y. 1. (2016). Comparison of Stochastic Volatility Models Using Integrated Information Criteria. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/7590
Chicago Manual of Style (16th Edition):
Wang, Yunyang 1986-. “Comparison of Stochastic Volatility Models Using Integrated Information Criteria.” 2016. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/7590.
MLA Handbook (7th Edition):
Wang, Yunyang 1986-. “Comparison of Stochastic Volatility Models Using Integrated Information Criteria.” 2016. Web. 16 Feb 2019.
Vancouver:
Wang Y1. Comparison of Stochastic Volatility Models Using Integrated Information Criteria. [Internet] [Thesis]. University of Saskatchewan; 2016. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/7590.
Council of Science Editors:
Wang Y1. Comparison of Stochastic Volatility Models Using Integrated Information Criteria. [Thesis]. University of Saskatchewan; 2016. Available from: http://hdl.handle.net/10388/7590
19.
Kar, Setu Chandra.
Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach.
Degree: 2017, University of Saskatchewan
URL: http://hdl.handle.net/10388/8313
▼ The choice of the model framework in a regression setting depends on the nature of the data.
The focus of this study is on changepoint data, exhibiting three phases: incoming and outgoing,
both of which are linear, joined by a curved transition. These types of data can arise in many
applications, including medical, health and environmental sciences. Piecewise linear models have
been extensively utilized to characterize such changepoint trajectories in different scientific fields.
However, although appealing due to its simple structure, a piecewise linear model is not realistic
in many applications where data exhibit a gradual change over time.
The most important aspect of characterizing a changepoint trajectory involves identifying the
transition zone accurately. It is not only because the location of the transition zone is of particular
interest in many areas of study, but also because it plays an important role in adequately describing
the incoming and the outgoing phases of a changepoint trajectory. Note that once the transition is
detected, the incoming and the outgoing phases can be modeled using linear functions. Overall, it
is desirable to formulate a model in such a way that it can capture all three phases satisfactorily,
while being parsimonious with highly interpretable regression coefficients. Since data may exhibit
either a gradual or an abrupt transition, it is also important for the transition model to be flexible.
Bent-cable regression is an appealing statistical tool to characterize such trajectories, quantifying
the nature of the transition between the two linear phases by modeling the transition as a quadratic
phase with unknown width. We demonstrate that a quadratic function may not be appropriate to
adequately describe many changepoint data. In practice, the quadratic function of the bent-cable
model may lead to a wider or narrower interval than what could possibly be necessary to adequately
describe a transition phase. We propose a generalization of the bent-cable model by relaxing the
assumption of the quadratic bend. Specifically, an additional transition parameter is included in the
bent-cable model to provide sufficient flexibility so that inference about the transition zone (i.e.,
shape and width of the bend) can be data driven, rather than pre-assumed as a specific type.
We discuss the properties of the generalized model, and then propose a Bayesian approach for
statistical inference. The generalized model is then demonstrated with applications to three data sets taken from environmental science and economics. We also consider a comparison among the
quadratic bent-cable, generalized bent-cable and piecewise linear models in terms of goodness of
fit in analyzing both real-world and simulated data. Moreover, we supplement the motivation for
our generalized bent-cable methodology via extensive simulations – we simulate changepoint data
under some realistic assumptions, and then fit the quadratic bent-cable, generalized bent-cable
and piecewise linear models to each of the simulated data sets to…
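For reference, the quadratic bent-cable mean function that this work generalizes can be sketched as follows; this is an editorial illustration under one common parameterization (the names b0, b1, b2, tau and gamma are assumptions, not the thesis's notation):

```python
import numpy as np

def bent_cable(t, b0, b1, b2, tau, gamma):
    """Quadratic bent-cable mean function: an incoming line with slope b1
    and an outgoing line with slope b1 + b2, joined by a quadratic bend
    centred at tau with half-width gamma."""
    t = np.asarray(t, dtype=float)
    q = np.where(t < tau - gamma, 0.0,
                 np.where(t > tau + gamma, t - tau,
                          (t - tau + gamma) ** 2 / (4.0 * gamma)))
    return b0 + b1 * t + b2 * q

t = np.linspace(0.0, 10.0, 11)
f = bent_cable(t, b0=1.0, b1=0.5, b2=-0.8, tau=5.0, gamma=1.0)
# The quadratic bend joins the two lines continuously with matching slopes;
# the generalization in this thesis replaces the quadratic bend with a more
# flexible transition whose shape is inferred from the data.
```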
Advisors/Committee Members: Khan, Shahedul Ahsan, Li, Longhai, Samei, Ebrahim, Chaban, Maxym.
Subjects/Keywords: Bayesian; bent-cable; changepoint; Markov Chain Monte Carlo; transition
APA (6th Edition):
Kar, S. C. (2017). Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/8313
Chicago Manual of Style (16th Edition):
Kar, Setu Chandra. “Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach.” 2017. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/8313.
MLA Handbook (7th Edition):
Kar, Setu Chandra. “Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach.” 2017. Web. 16 Feb 2019.
Vancouver:
Kar SC. Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach. [Internet] [Thesis]. University of Saskatchewan; 2017. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/8313.
Council of Science Editors:
Kar SC. Generalized Bent-Cable Methodology for Changepoint Data: A Bayesian Approach. [Thesis]. University of Saskatchewan; 2017. Available from: http://hdl.handle.net/10388/8313
20.
Kendall, Courtney.
Factor scoring methods affected by response shift in patient-reported outcomes.
Degree: 2014, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2014-07-1626
▼ Objective: Patient-reported outcomes (PROs) are measures collected from a patient to determine how he/she feels or functions with regard to a health condition. Longitudinal PROs, which are collected on multiple occasions from the same individual, may be affected by response shift (RS). RS is a change in a person's self-evaluation of a target construct. Latent variable models (LVMs) are statistical models that relate observed variables to latent variables (LVs). LVMs are used to analyze PROs and detect RS. LVs are random variables whose realizations are not observable. Factor scores are estimates of the LVs for each individual and can be computed from the parameter estimates of LVMs. Factor scoring methods include the Thurstone, Bartlett, and sum-scores methods. This simulation study examines the effects of RS on factor scores used to test for change in the LV means and recommends a factor scoring method least affected by RS.
Methods: Data from two time points were fit to three confirmatory factor analysis (CFA) models. CFA models are a type of LVM. Each CFA model had different sets of parameters that were invariant over time. The unconstrained (Uncon) CFA model had no invariant parameters, the constrained (Con) model had all the parameters invariant, and the partially constrained (Pcon) model had some of the parameters invariant over time. Factor scores were estimated and tested for change over time via paired t-test. The Type I error, power, and factor loading (the regression coefficient between an observed and LV) and factor score bias were estimated to determine if RS influenced the test of change over time and factor score estimation.
Results: The results depended on the true LV mean. The Type I error and power were similar for all factor scoring methods and CFA models when the LV mean was 0 at time 1. For an LV mean of 0.5 at time 1, the Type I error and power increased as RS increased for all factor scores except those estimated from the Uncon model with the Bartlett method. The biases of the factor loadings were unaffected by RS when estimated from an Uncon model. The factor scores estimated from the Uncon model with the Bartlett and sum-scores methods had the smallest factor score biases.
Conclusion: The factor scores estimated from the Uncon model with the Bartlett method were least affected by RS and performed best on all measures of Type I error, statistical power, factor loading bias and factor score bias. Estimating factor scores from PRO data while ignoring RS may result in erroneous (or biased) estimates.
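Two of the scoring methods compared in this study can be illustrated with standard formulas (an editorial sketch, not the thesis code): Bartlett scores weight the centred items by their loadings and unique variances, while sum scores simply add the items. The single-factor data below are illustrative assumptions:

```python
import numpy as np

def bartlett_scores(Y, loadings, uniquenesses):
    """Bartlett factor scores for the model Y = f @ loadings.T + e,
    with e ~ N(0, diag(uniquenesses)).
    Y: (n, p) centred items; loadings: (p, k); uniquenesses: length p."""
    L = np.asarray(loadings, dtype=float)
    Psi_inv = np.diag(1.0 / np.asarray(uniquenesses, dtype=float))
    W = np.linalg.solve(L.T @ Psi_inv @ L, L.T @ Psi_inv)  # (k, p) weights
    return Y @ W.T

def sum_scores(Y):
    """Unweighted sum scores: add the observed items."""
    return Y.sum(axis=1)

# Illustrative single-factor data with known loadings and uniquenesses.
rng = np.random.default_rng(0)
loadings = np.array([[1.0], [0.8], [0.6]])
psi = np.array([0.3, 0.4, 0.5])
f = rng.normal(size=(2000, 1))
Y = f @ loadings.T + rng.normal(size=(2000, 3)) * np.sqrt(psi)
fhat = bartlett_scores(Y, loadings, psi)
```

In the study itself, the weights come from CFA parameter estimates at each time point, which is where response shift can distort the resulting scores.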
Advisors/Committee Members: Lix, Lisa, Liu, Juxin, Li, Longhai, Sarty, Gordon, Martin, John.
Subjects/Keywords: Factor scores; response shift; patient-reported outcomes
APA (6th Edition):
Kendall, C. (2014). Factor scoring methods affected by response shift in patient-reported outcomes. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2014-07-1626
Chicago Manual of Style (16th Edition):
Kendall, Courtney. “Factor scoring methods affected by response shift in patient-reported outcomes.” 2014. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2014-07-1626.
MLA Handbook (7th Edition):
Kendall, Courtney. “Factor scoring methods affected by response shift in patient-reported outcomes.” 2014. Web. 16 Feb 2019.
Vancouver:
Kendall C. Factor scoring methods affected by response shift in patient-reported outcomes. [Internet] [Thesis]. University of Saskatchewan; 2014. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2014-07-1626.
Council of Science Editors:
Kendall C. Factor scoring methods affected by response shift in patient-reported outcomes. [Thesis]. University of Saskatchewan; 2014. Available from: http://hdl.handle.net/10388/ETD-2014-07-1626
21.
Obeidat, Mohammed.
Bayesian analysis for time series of count data.
Degree: 2014, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2014-07-1589
▼ Time series of count data arise in a wide variety of applications. In many applications, the observed counts are usually small and dependent. Failure to take these facts into account can lead to misleading inferences and the detection of false relationships. To tackle such issues, a Poisson parameter-driven model is assumed for the time series at hand. This model can account for the time dependence between observations by introducing an autoregressive latent process.
In this thesis, we consider Bayesian approaches for estimating the Poisson parameter-driven model. The main challenge is that the likelihood function for the observed counts involves a high-dimensional integral after integrating out the latent variables. The main contributions of this thesis are threefold. First, I develop a new single-move (SM) Markov chain Monte Carlo (MCMC) method to sample the latent variables one by one. Second, I adapt the idea of the particle Gibbs sampler (PGS) method of Andrieu et al. to our model setting and compare its performance with that of the SM method. Third, I consider Bayesian composite likelihood methods and compare three different adjustment methods with the unadjusted method and the SM method. The comparisons provide a practical guide to which method to use.
We conduct simulation studies to compare the latter two methods with the SM method. We conclude that the SM method outperforms the PGS method for small sample size, while they perform almost the same for large sample size. However, the SM method is much faster than the PGS method. The adjusted Bayesian composite methods provide closer results to the SM than the unadjusted one. The PGS and the selected adjustment method from simulation studies are compared with the SM method via a real data example. Similar results are obtained: first, the PGS method provides results very close to those of the SM method. Second, the adjusted composite likelihood methods provide closer results to the SM than the unadjusted one.
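The Poisson parameter-driven model at the heart of this thesis is easy to simulate: a latent AR(1) process drives the log-mean of the counts. The sketch below is an editorial illustration; the parameter names and values are assumptions, not taken from the thesis:

```python
import numpy as np

def simulate_poisson_pd(n, beta0=0.5, phi=0.7, sigma=0.3, seed=1):
    """Poisson parameter-driven model:
    alpha_t = phi * alpha_{t-1} + sigma * eta_t   (latent AR(1) process)
    y_t ~ Poisson(exp(beta0 + alpha_t))           (observed counts)"""
    rng = np.random.default_rng(seed)
    alpha = np.empty(n)
    # start the latent process at its stationary distribution
    alpha[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi ** 2))
    for t in range(1, n):
        alpha[t] = phi * alpha[t - 1] + sigma * rng.normal()
    y = rng.poisson(np.exp(beta0 + alpha))
    return y, alpha

y, alpha = simulate_poisson_pd(5000)
# Integrating the latent alpha_t out of the likelihood is what makes
# estimation hard, motivating the MCMC and composite likelihood methods.
```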
Advisors/Committee Members: Liu, Juxin, Soteros, Christine, Bickis, Mik, Li, Longhai, Osgood, Nathaniel, Gao, Xin.
Subjects/Keywords: Time series; count data; Poisson parameter-driven model; composite likelihood; particle Gibbs sampler; car crashes
APA (6th Edition):
Obeidat, M. (2014). Bayesian analysis for time series of count data. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2014-07-1589
Chicago Manual of Style (16th Edition):
Obeidat, Mohammed. “Bayesian analysis for time series of count data.” 2014. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2014-07-1589.
MLA Handbook (7th Edition):
Obeidat, Mohammed. “Bayesian analysis for time series of count data.” 2014. Web. 16 Feb 2019.
Vancouver:
Obeidat M. Bayesian analysis for time series of count data. [Internet] [Thesis]. University of Saskatchewan; 2014. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2014-07-1589.
Council of Science Editors:
Obeidat M. Bayesian analysis for time series of count data. [Thesis]. University of Saskatchewan; 2014. Available from: http://hdl.handle.net/10388/ETD-2014-07-1589
22.
Naseri, Mahsa.
A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS.
Degree: 2013, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2013-12-1419
In service-oriented environments, services are put together in the form of a workflow with the aim of distributed problem solving.
Capturing the execution details of the services' transformations is a significant advantage of using workflows. These execution details, referred to as provenance information, are usually traced automatically and stored in provenance stores. Provenance data comprises the information recorded by a workflow engine during a workflow execution: it identifies what data is passed between services, which services are involved, and how results are eventually generated for particular sets of input values.
Provenance information is of great importance and has found applications across areas of computer science such as bioinformatics, databases, and social and sensor networks.
Current exploitation and application of provenance data are nevertheless very limited, as provenance systems were originally developed for specific applications. Applying learning and knowledge discovery methods to provenance data can therefore provide rich and useful information on workflows and services.
Therefore, in this work, the challenges with workflows and services are studied to discover the possibilities and benefits of providing solutions by using provenance data.
A multifunctional architecture is presented which addresses the workflow and service issues by exploiting provenance data. These challenges include workflow composition, abstract workflow selection, refinement, evaluation, and graph model extraction. The specific contribution of the proposed architecture is its novelty in providing a basis for taking advantage of the previous execution details of services and workflows along with artificial intelligence and knowledge management techniques to resolve the major challenges regarding workflows. The presented architecture is application-independent and could be deployed in any area.
The requirements for such an architecture along with its building components are discussed. Furthermore, the responsibility of the components, related works and the implementation details of the architecture along with each component are presented.
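The per-service execution detail the abstract describes (which service ran, what data passed in and out, which services preceded it) can be sketched as a minimal provenance record with a lineage query over it. The field and function names are hypothetical, chosen only to illustrate the idea, not taken from the thesis architecture.

```python
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    service: str                                       # which service ran
    inputs: dict                                       # data passed into the service
    outputs: dict                                      # data the service produced
    predecessors: list = field(default_factory=list)   # upstream services

def lineage(records, service):
    """Trace all upstream services that contributed to `service`."""
    by_name = {r.service: r for r in records}
    seen, stack = set(), [service]
    while stack:
        for pred in by_name[stack.pop()].predecessors:
            if pred not in seen:
                seen.add(pred)
                stack.append(pred)
    return seen

# A hypothetical three-step workflow: ingest -> clean -> analyze.
records = [
    ProvenanceRecord("ingest", {}, {"raw": "data"}),
    ProvenanceRecord("clean", {"raw": "data"}, {}, predecessors=["ingest"]),
    ProvenanceRecord("analyze", {}, {}, predecessors=["clean"]),
]
upstream = lineage(records, "analyze")
```

Queries like `lineage` are the kind of primitive on which learning and knowledge discovery over provenance stores (e.g. workflow refinement or graph model extraction) could be built.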
Advisors/Committee Members: Ludwig, Simone A., Osgood, Nathaniel, Horsch, Michael, McCalla, Gord, Li, Longhai.
Subjects/Keywords: Workflow; Provenance; Workflow Evaluation; Service Composition; Service Selection; Hidden Markov Model; Partially Observable Markov Decision Process; Bayesian Structure Learning
APA (6th Edition):
Naseri, M. (2013). A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2013-12-1419
Chicago Manual of Style (16th Edition):
Naseri, Mahsa. “A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS.” 2013. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2013-12-1419.
MLA Handbook (7th Edition):
Naseri, Mahsa. “A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS.” 2013. Web. 16 Feb 2019.
Vancouver:
Naseri M. A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS. [Internet] [Thesis]. University of Saskatchewan; 2013. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2013-12-1419.
Council of Science Editors:
Naseri M. A MULTI-FUNCTIONAL PROVENANCE ARCHITECTURE: CHALLENGES AND SOLUTIONS. [Thesis]. University of Saskatchewan; 2013. Available from: http://hdl.handle.net/10388/ETD-2013-12-1419
23.
Schmirler, Matthew.
Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model.
Degree: 2012, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2012-09-670
The work presented in this thesis develops a new model for local strand passage in a ring polymer in a dilute salt solution. This model, called the Interacting Local Strand Passage (ILSP) model, models ring polymers via Theta-SAPs, which are self-avoiding polygons (SAPs) in the simple cubic lattice that contain a fixed structure Theta. This fixed structure represents two segments of the self-avoiding polygon being brought "close" together for the purpose of performing a strand passage. Theta-SAPs were first studied in the Local Strand Passage (LSP) model developed by Szafron (2000, 2009), where each Theta-SAP is considered equally likely in order to model good solvent conditions. In the ILSP model, each Theta-SAP has a modified Yukawa potential energy which contains an attractive term as well as a screened Coulomb potential that accounts for the effect of salt in the model. The energy function used in this model was first proposed by Tesi et al. (1994) for studying self-avoiding polygons in the simple cubic lattice.
The ILSP model is studied in this thesis using the Interacting Theta-BFACF (I-Theta-BFACF) Algorithm, an algorithm which is developed in this thesis and is proven to be ergodic on the set of all Theta-SAPs of a particular knot type and connection class. The I-Theta-BFACF algorithm was created by modifying the Theta-BFACF algorithm developed by Szafron (2000, 2009) to include energy-based Metropolis sampling. This modification allows one to sample Theta-SAPs of a particular knot type and connection class based on a priori chosen solvent conditions.
Multiple simulations (each consisting of 40 billion time steps) of composite Markov Chain Monte Carlo implementations of the I-Theta-BFACF algorithm are performed on unknotted connection class II Theta-SAPs over a wide range of salt concentrations. The data from these simulations are used to estimate, as a function of polygon length, the probability of an unknotted Theta-SAP remaining an unknot after a strand passage, as well as the probability of it becoming a positive trefoil knot. The results strongly suggest that as the length of a Theta-SAP goes to infinity, the probability of the Theta-SAP becoming knotted after a strand passage increases as the salt concentration in the model increases. These results serve as a first step towards studying how the knot reduction factor (studied by Liu et al. (2006) and Szafron and Soteros (2011)) of a ring polymer varies in differing solvent conditions. The goal of this future research is to find solvent conditions and a local geometry of the strand passage site that yield a knot reduction factor comparable to that observed by Rybenkov et al. (1997), who report an 80-fold reduction of knotting after type II topoisomerase enzymes act on DNA.
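The energy-based Metropolis sampling mentioned above can be sketched generically. The pair energy below uses only a screened-Coulomb (Yukawa-type) term with illustrative constants; the actual ILSP energy of Tesi et al. (1994) also includes an attractive term, and the constants here are assumptions, not the thesis's values.

```python
import math
import random

def yukawa_energy(vertices, kappa=1.0):
    """Sum a screened-Coulomb pair term exp(-kappa*r)/r over all
    vertex pairs of a polygon (vertices given as 3-tuples)."""
    e = 0.0
    n = len(vertices)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(vertices[i], vertices[j])
            e += math.exp(-kappa * r) / r
    return e

def metropolis_accept(e_old, e_new, beta=1.0, rng=random.random):
    """Standard Metropolis rule: accept with prob min(1, exp(-beta*dE))."""
    return e_new <= e_old or rng() < math.exp(-beta * (e_new - e_old))

# Energy of a tiny illustrative configuration (not a valid lattice polygon).
e = yukawa_energy([(0, 0, 0), (1, 0, 0), (0, 1, 0)])
```

In a BFACF-style chain, a proposed polygon deformation would be accepted or rejected by a rule of this form, which is how a priori chosen solvent conditions (via kappa and beta) bias the sampled ensemble.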
Advisors/Committee Members: Soteros, Christine E., Bickis, Miķelis G., Szafron, Michael L., Li, Longhai, Kusalik, Anthony J..
Subjects/Keywords: Mathematics; Statistics; Topoisomerase; MCMC; Markov Chain Monte Carlo; Metropolis Sampling; DNA; Self Avoiding Polygons
APA (6th Edition):
Schmirler, M. (2012). Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2012-09-670
Chicago Manual of Style (16th Edition):
Schmirler, Matthew. “Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model.” 2012. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2012-09-670.
MLA Handbook (7th Edition):
Schmirler, Matthew. “Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model.” 2012. Web. 16 Feb 2019.
Vancouver:
Schmirler M. Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model. [Internet] [Thesis]. University of Saskatchewan; 2012. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2012-09-670.
Council of Science Editors:
Schmirler M. Strand Passage And Knotting Probabilities In An Interacting Self-Avoiding Polygon Model. [Thesis]. University of Saskatchewan; 2012. Available from: http://hdl.handle.net/10388/ETD-2012-09-670
24.
Janzen, Michael.
Virtual camera selection using a semiring constraint satisfaction approach.
Degree: 2012, University of Saskatchewan
URL: http://hdl.handle.net/10388/ETD-2012-06-417
Players and viewers of three-dimensional computer-generated games and worlds view renderings from the viewpoint of a virtual camera. As such, determining a good view of the scene is important to presenting a good game or three-dimensional world. Previous research has developed technologies to find good positions for the virtual camera, but little work has been done to automatically select between multiple virtual cameras, similar to a human director at a sporting event. This thesis describes a software tool to select among camera feeds from multiple virtual cameras in a virtual environment using semiring-based constraint satisfaction techniques (SCSP), a soft constraint approach. The system encodes a designer's preferences and selects the best camera feed even in over-constrained or under-constrained environments. The system functions in real time for dynamic scenes using only current information (i.e., no prediction). To reduce the camera selection time, the SCSP evaluation can be cached and converted to native code. This SCSP approach is implemented in two virtual environments: a virtual hockey game using a spectator viewpoint, and a virtual 3D maze game using a third-person perspective. Comparisons against hard constraints are made using constraint satisfaction problems.
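The soft-constraint selection described above can be sketched with one common semiring instance, the fuzzy semiring ([0, 1], max, min): each soft constraint scores a camera in [0, 1], scores combine via min, and the best camera is chosen via max. The cameras and preference functions below are hypothetical, not taken from the thesis.

```python
def scsp_best(cameras, constraints):
    """Return the camera maximizing the combined (min) constraint score,
    using the fuzzy semiring ([0, 1], max, min)."""
    def combined(cam):
        return min(c(cam) for c in constraints)
    return max(cameras, key=combined)

# Hypothetical designer preferences: stay near the action, keep the
# subject visible. Even if no camera satisfies both perfectly
# (an over-constrained case), the semiring still ranks them.
cameras = [
    {"name": "wide", "distance": 30, "visibility": 0.9},
    {"name": "close", "distance": 5, "visibility": 0.6},
]
constraints = [
    lambda cam: max(0.0, 1.0 - cam["distance"] / 50.0),  # prefer near cameras
    lambda cam: cam["visibility"],                        # prefer visible subject
]
best = scsp_best(cameras, constraints)
```

Other semirings (e.g. weighted costs with (min, +)) slot into the same skeleton, which is the appeal of the semiring formulation over hard constraints.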
Advisors/Committee Members: Neufeld, Eric, Horsch, Michael C., McQuillan, Ian, Stanley, Kevin G., Kusalik, Anthony J., Li, Longhai, Goodwin, Scott.
Subjects/Keywords: virtual camera selection; SCSP
APA (6th Edition):
Janzen, M. (2012). Virtual camera selection using a semiring constraint satisfaction approach. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/ETD-2012-06-417
Chicago Manual of Style (16th Edition):
Janzen, Michael. “Virtual camera selection using a semiring constraint satisfaction approach.” 2012. Thesis, University of Saskatchewan. Accessed February 16, 2019.
http://hdl.handle.net/10388/ETD-2012-06-417.
MLA Handbook (7th Edition):
Janzen, Michael. “Virtual camera selection using a semiring constraint satisfaction approach.” 2012. Web. 16 Feb 2019.
Vancouver:
Janzen M. Virtual camera selection using a semiring constraint satisfaction approach. [Internet] [Thesis]. University of Saskatchewan; 2012. [cited 2019 Feb 16].
Available from: http://hdl.handle.net/10388/ETD-2012-06-417.
Council of Science Editors:
Janzen M. Virtual camera selection using a semiring constraint satisfaction approach. [Thesis]. University of Saskatchewan; 2012. Available from: http://hdl.handle.net/10388/ETD-2012-06-417