You searched for subject:(Markov chains). Showing records 1 – 30 of 376 total matches.
1.
Wicks, John Randolph.
An Algorithm to Compute the Stochastically Stable
Distribution of a Perturbed Markov Matrix.
Degree: PhD, Computer Science, 2009, Brown University
URL: https://repository.library.brown.edu/studio/item/bdr:92/
Recently, some researchers have attempted to exploit state-aggregation techniques to compute stable distributions of high-dimensional Markov matrices. While these researchers have devised an efficient, recursive algorithm, their results are only approximate. We improve upon past results by presenting a novel state-aggregation technique, which we use to give the first (to our knowledge) scalable, exact algorithm for computing the stochastically stable distribution of a perturbed Markov matrix. Since it is not combinatorial in nature, our algorithm is computationally feasible even for high-dimensional models.
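The stochastically stable distribution reduces, in the unperturbed case, to an ordinary stationary distribution π with πP = π. As an illustration only (this is not Wicks's aggregation algorithm), a minimal sketch of computing that baseline quantity for a small invented row-stochastic matrix:

```python
import numpy as np

def stationary_distribution(P):
    """Return pi with pi @ P = pi and sum(pi) = 1,
    for an irreducible row-stochastic matrix P."""
    n = P.shape[0]
    # Stack the balance equations pi (P - I) = 0 with the
    # normalization sum(pi) = 1, then solve by least squares.
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Invented 2-state example.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = stationary_distribution(P)
```

For a perturbed family P(ε), the stochastically stable distribution is the limit of the stationary distributions as ε → 0, which this baseline sketch does not attempt to compute.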
Advisors/Committee Members: Greenwald, Amy (director), Serrano, Roberto (reader), Centintemel, Ugur (reader).
Subjects/Keywords: Markov chains
APA (6th Edition):
Wicks, J. R. (2009). An Algorithm to Compute the Stochastically Stable
Distribution of a Perturbed Markov Matrix. (Doctoral Dissertation). Brown University. Retrieved from https://repository.library.brown.edu/studio/item/bdr:92/
2.
Sudyko, Elena.
Dollarisation finançière en Russie : Financial dollarization in Russia.
Degree: Docteur es, Sciences de gestion, 2018, Université Paris-Saclay (ComUE)
URL: http://www.theses.fr/2018SACLE032
This thesis develops a portfolio model of financial dollarization (FD) and estimates it for Russia. Its contribution is to construct the first theoretical mean-variance-skewness-kurtosis model of financial dollarization and to validate it empirically. The work builds on previous research which found that adding higher moments, such as skewness and kurtosis, to the minimum-variance portfolio (MVP) enables better modelling of portfolio choice, and develops such a model for FD. We then use Markov-switching methods on monthly data for bank deposits in Russia since the late 1990s to document the dominant influence of inflation and currency depreciation, and their moments, as the main determinants of deposit dollarization in a mean-variance-skewness-kurtosis framework during crisis as opposed to normal periods.
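The regime-switching machinery can be illustrated with a toy simulation. This is not the thesis's fitted model: the two regimes, transition probabilities, means and volatilities below are all invented for the sketch.

```python
import numpy as np

# Two-regime ("normal" vs "crisis") Markov-switching simulation.
# All numbers are invented for illustration.
rng = np.random.default_rng(1)
T = 1000
P = np.array([[0.95, 0.05],    # normal -> (normal, crisis)
              [0.20, 0.80]])   # crisis -> (normal, crisis)
mu = [0.01, -0.03]             # regime-dependent mean
sigma = [0.02, 0.08]           # crisis regime is more volatile

s = 0                          # start in the normal regime
states, y = [], []
for _ in range(T):
    s = rng.choice(2, p=P[s])  # draw the next regime
    states.append(int(s))
    y.append(rng.normal(mu[s], sigma[s]))
```

Estimation (recovering P, mu, sigma from y alone) is the hard part the thesis addresses; the sketch only shows the data-generating side.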
Advisors/Committee Members: Jawadi, Fredj (thesis director), Girardin, Eric (thesis director).
Subjects/Keywords: Chaines de Markov; Markov chains
APA (6th Edition):
Sudyko, E. (2018). Dollarisation finançière en Russie : Financial dollarization in Russia. (Doctoral Dissertation). Université Paris-Saclay (ComUE). Retrieved from http://www.theses.fr/2018SACLE032

Universiteit Utrecht
3.
Schelling, W.D.
Analyzing a queueing network.
Degree: 2015, Universiteit Utrecht
URL: http://dspace.library.uu.nl:8080/handle/1874/307053
The purpose of the thesis is to help ING Bank get a better understanding of their ICT landscape. Since the ICT landscape of ING Bank is very large, we focus on a specific part of it. ING Bank wants to identify bottlenecks more quickly, so we develop two mathematical tools which can help identify the bottlenecks of such a complex queueing system. The first is a simulation tool using the program Rockwell Arena; the second is an analytical approach based on Markov chains. Both tools have advantages and disadvantages, which we also discuss in the thesis.
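As a rough illustration of the analytical approach, a single-server queue can be modelled as a birth-death Markov chain and its steady-state distribution computed directly. The rates below are invented and unrelated to ING's systems.

```python
import numpy as np

# Truncated M/M/1 queue as a continuous-time Markov chain on states 0..N.
lam, mu, N = 1.0, 2.0, 50      # arrival rate, service rate, truncation level

# Generator matrix Q of the birth-death chain.
Q = np.zeros((N + 1, N + 1))
for k in range(N):
    Q[k, k + 1] = lam          # arrival: queue length k -> k+1
    Q[k + 1, k] = mu           # service completion: k+1 -> k
np.fill_diagonal(Q, -Q.sum(axis=1))

# Stationary distribution: pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

rho = lam / mu                 # utilization; untruncated M/M/1 has
                               # pi[k] = (1 - rho) * rho**k
```

A real network of queues (as in the thesis) couples many such chains; this sketch only shows the single-node building block.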
Advisors/Committee Members: Dajani, Karma.
Subjects/Keywords: queueing theory; markov chains; simulation
APA (6th Edition):
Schelling, W. D. (2015). Analyzing a queueing network. (Masters Thesis). Universiteit Utrecht. Retrieved from http://dspace.library.uu.nl:8080/handle/1874/307053

University of Edinburgh
4.
Apted, William D.F.
Modelling Tourist Movements and Trends in the City of Edinburgh.
Degree: 2012, University of Edinburgh
URL: http://hdl.handle.net/1842/6353
There is a great deal of socioeconomic interest in knowing the patterns and trends of the movements of groups of people, including the probability of where they are likely to move based on where they are currently located. The ability to predict the most probable movement of large numbers of people within a cityscape can be of great benefit in providing data with which cities can be adapted to accommodate and manage people en masse. This includes aspects with a direct influence on tourists, such as knowing where to provide advice, help or additional information, as well as the best locations for marketing through areas that promote activities, products and events. This project looks at the methodology of implementing Markov chains to model tourist movements within the City of Edinburgh, covering the 17 most popular tourist sites near the heart of the city. Focusing on a city-wide scale, different applicable representations of tourist movement are investigated, and the data developed from the model is mapped to gain visual insight into movement patterns within Edinburgh. The study showed that Markov chains and topographic modelling can be a valuable aid in identifying the probability of tourist movements between attractions, based on current location, and the volume of tourists using different routes between them. This information would enable better decisions in a wide range of areas, including where to locate other attractions or public amenities, advertising, focused maintenance of routes, and where investment can be made to improve the overall tourist experience and encourage repeat visits to the city.
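The core mechanic here, a transition matrix over sites, can be sketched in a few lines. The three sites and all probabilities below are invented stand-ins for the 17 Edinburgh attractions in the thesis.

```python
import numpy as np

# Hypothetical 3-site tourist-movement chain (names and numbers invented).
sites = ["Castle", "Royal Mile", "Holyrood"]
P = np.array([[0.1, 0.7, 0.2],   # from Castle
              [0.3, 0.2, 0.5],   # from Royal Mile
              [0.4, 0.4, 0.2]])  # from Holyrood

# Distribution of tourists after repeated moves, starting at the Castle.
x = np.array([1.0, 0.0, 0.0])
for _ in range(50):
    x = x @ P
# x now approximates the long-run share of tourists at each site.
```

Row i of P answers "given a tourist is at site i, where next?", which is exactly the current-location-conditional probability the abstract describes.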
Advisors/Committee Members: Mackaness, William.
Subjects/Keywords: Markov Chains; Tourist Modelling
APA (6th Edition):
Apted, W. D. F. (2012). Modelling Tourist Movements and Trends in the City of Edinburgh. (Thesis). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/6353
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Boston University
5.
Stevens, Roger T.
An application of Markov chains.
Degree: MA, Mathematics, 1959, Boston University
URL: http://hdl.handle.net/2144/24603
Probability problems in which a time parameter is involved are known as stochastic processes. The simplest time-dependent stochastic processes are those in which the probabilities of a system changing to various states depend solely upon the present state of the system. These processes are known as Markov processes or, in the case where only discrete time intervals are considered, as Markov chains. A Markov chain may be completely defined by the matrix of its transition probabilities. This matrix is called a stochastic matrix and is characterized by the facts that it is square, that the elements of each column sum to one, and that all the elements are non-negative.
An important consideration in most Markov chain problems is the effect of a number of transitions as defined by the stochastic matrix. This requires determining the higher powers of the stochastic matrix. Two modal matrices are defined, where k is the matrix of the column characteristic vectors of the stochastic matrix and K is the matrix of the row characteristic vectors. It is shown that with proper normalization of these vectors, the stochastic matrix P is equal to kAK, where A is the matrix of the characteristic roots along the diagonal and zeroes elsewhere. The higher powers of the stochastic matrix, P^m, are then found to be equal to kA^mK. The stochastic matrix is found always to have a characteristic root one, and all the other roots are shown to be less than one in absolute value. The limiting transition matrix P^∞ is found to have identical columns, each consisting of the characteristic column vector associated with the characteristic root one. The limiting distribution is the same vector and is independent of the initial conditions. [TRUNCATED]
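The decomposition P = kAK and the limit P^∞ can be checked numerically. This sketch uses NumPy's eigendecomposition in place of the thesis's explicit normalization, on an invented 2×2 column-stochastic matrix.

```python
import numpy as np

# Column-stochastic example (columns sum to one, matching the thesis's convention).
P = np.array([[0.8, 0.3],
              [0.2, 0.7]])

# Diagonalize: P = k A K, with A the diagonal matrix of characteristic roots.
roots, k = np.linalg.eig(P)   # columns of k are the column characteristic vectors
K = np.linalg.inv(k)          # rows of K are the row characteristic vectors
A = np.diag(roots)

# Higher powers via P^m = k A^m K.
m = 40
Pm = k @ np.diag(roots**m) @ K

# Limiting matrix: identical columns, each the characteristic column vector
# for the root 1, normalized to sum to one.
v = k[:, np.argmax(roots.real)]
v = v / v.sum()
```

Because the other root (here 0.5) has absolute value below one, A^m collapses to a single nonzero entry as m grows, which is exactly why P^∞ has identical columns.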
Subjects/Keywords: Markov chains
APA (6th Edition):
Stevens, R. T. (1959). An application of Markov chains. (Masters Thesis). Boston University. Retrieved from http://hdl.handle.net/2144/24603
6.
Tifenbach, Ryan M.
A Combinatorial Approach to Nearly Uncoupled Markov Chains.
Degree: 2011, RIAN
URL: http://eprints.maynoothuniversity.ie/3730/
A discrete-time Markov chain on a state space S is a sequence of random variables X = {x_0, x_1, ...} that take on values in S. A Markov chain is a model of a system which changes or evolves over time; the random variable x_t is the state of the system at time t.

A subset E ⊆ S is referred to as an almost invariant aggregate if whenever x_t ∈ E, then with high probability x_{t+1} ∈ E as well. That is, if there is a small positive value ε such that if x_t ∈ E then the probability that x_{t+1} ∉ E is less than or equal to ε, then E is an almost invariant aggregate. If E is such an aggregate and x_t ∈ E, then the probability that x_{t+1}, ..., x_{t+s} ∈ E is at least (1 - ε)^s. A Markov chain tends to remain within its almost invariant aggregates (if it possesses any) for long periods of time.

We refer to the Markov chain X as nearly uncoupled (with respect to some positive ε) if its associated state space contains two or more disjoint almost invariant aggregates. Nearly uncoupled Markov chains are characterised by long periods of relatively constant behaviour, punctuated by occasional drastic changes in state.

We present a series of algorithms intended to construct almost invariant aggregates of a given Markov chain. These algorithms are iterative processes which utilise a concept known as the stochastic complement. The stochastic complement is a method by which a Markov chain on a state space S can be reduced to a random process on a proper subset S' ⊂ S, while preserving many of the algebraic properties of the original Markov chain.

We pay special attention to the reversible case. A Markov chain is reversible if it is symmetric in time, by which we mean that if we were to reverse the order of the variables x_1, ..., x_t, for some relatively large t, the resulting process would be essentially indistinguishable from the original Markov chain.
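The stochastic complement mentioned above has a standard closed form: for a row-stochastic P partitioned over a subset A and its complement B, the reduced chain on A is P_AA + P_AB (I - P_BB)^(-1) P_BA. A sketch of that formula (the 3-state matrix is invented, and this is only the reduction step, not the thesis's full aggregation algorithms):

```python
import numpy as np

def stochastic_complement(P, A):
    """Reduce the row-stochastic matrix P to the states in A via the
    standard stochastic-complement formula; the result is again row-stochastic."""
    A = np.asarray(A)
    B = np.setdiff1d(np.arange(P.shape[0]), A)
    PAA = P[np.ix_(A, A)]
    PAB = P[np.ix_(A, B)]
    PBB = P[np.ix_(B, B)]
    PBA = P[np.ix_(B, A)]
    # Paths that leave A, wander through B, and return are folded back in.
    return PAA + PAB @ np.linalg.inv(np.eye(len(B)) - PBB) @ PBA

# Invented 3-state chain reduced to states {0, 1}.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
S = stochastic_complement(P, [0, 1])
```

The (I - P_BB)^(-1) factor sums over all excursion lengths through B, which is what "preserving many of the algebraic properties" refers to.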
Subjects/Keywords: Hamilton Institute; Markov Chains
APA (6th Edition):
Tifenbach, R. M. (2011). A Combinatorial Approach to Nearly Uncoupled Markov Chains. (Thesis). RIAN. Retrieved from http://eprints.maynoothuniversity.ie/3730/
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
7.
Tifenbach, Ryan M.
A Combinatorial Approach to Nearly Uncoupled Markov Chains.
Degree: 2011, RIAN
URL: http://mural.maynoothuniversity.ie/3730/
A discrete-time Markov chain on a state space S is a sequence of random variables X = {x_0, x_1, ...} that take on values in S. A Markov chain is a model of a system which changes or evolves over time; the random variable x_t is the state of the system at time t.

A subset E ⊆ S is referred to as an almost invariant aggregate if whenever x_t ∈ E, then with high probability x_{t+1} ∈ E as well. That is, if there is a small positive value ε such that if x_t ∈ E then the probability that x_{t+1} ∉ E is less than or equal to ε, then E is an almost invariant aggregate. If E is such an aggregate and x_t ∈ E, then the probability that x_{t+1}, ..., x_{t+s} ∈ E is at least (1 - ε)^s. A Markov chain tends to remain within its almost invariant aggregates (if it possesses any) for long periods of time.

We refer to the Markov chain X as nearly uncoupled (with respect to some positive ε) if its associated state space contains two or more disjoint almost invariant aggregates. Nearly uncoupled Markov chains are characterised by long periods of relatively constant behaviour, punctuated by occasional drastic changes in state.

We present a series of algorithms intended to construct almost invariant aggregates of a given Markov chain. These algorithms are iterative processes which utilise a concept known as the stochastic complement. The stochastic complement is a method by which a Markov chain on a state space S can be reduced to a random process on a proper subset S' ⊂ S, while preserving many of the algebraic properties of the original Markov chain.

We pay special attention to the reversible case. A Markov chain is reversible if it is symmetric in time, by which we mean that if we were to reverse the order of the variables x_1, ..., x_t, for some relatively large t, the resulting process would be essentially indistinguishable from the original Markov chain.
Subjects/Keywords: Hamilton Institute; Markov Chains
APA (6th Edition):
Tifenbach, R. M. (2011). A Combinatorial Approach to Nearly Uncoupled Markov Chains. (Thesis). RIAN. Retrieved from http://mural.maynoothuniversity.ie/3730/
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Georgia Tech
8.
Paarporn, Keith.
Dynamics of epidemic spreading over networks with agent awareness.
Degree: MS, Electrical and Computer Engineering, 2016, Georgia Tech
URL: http://hdl.handle.net/1853/55676
We study an SIS (susceptible-infected-susceptible) model of disease spread over a contact network of n agents. The agents receive personalized information about the epidemic through their social network and a global broadcast of the current infected fraction among the population. They reduce interactions with their neighbors when they believe the epidemic is prevalent. The epidemic dynamics are described by a Markov chain, from which a mean-field approximation (MFA) is derived. We derive a threshold condition above which the epidemic is expected to persist for a long time, and below which it dies out quickly. Through a coupling argument, we also establish stochastic domination properties between the awareness model and the model with no awareness. The effect awareness has on the disease dynamics is studied on various random graph families.
Advisors/Committee Members: Shamma, Jeff (committee member), Egerstedt, Magnus B. (committee member), Wardi, Yorai (committee member).
Subjects/Keywords: Epidemics; Networks; Markov chains
APA (6th Edition):
Paarporn, K. (2016). Dynamics of epidemic spreading over networks with agent awareness. (Masters Thesis). Georgia Tech. Retrieved from http://hdl.handle.net/1853/55676

Georgia Tech
9.
Bhakta, Prateek Jayeshbhai.
Markov chains for weighted lattice structures.
Degree: PhD, Computer Science, 2016, Georgia Tech
URL: http://hdl.handle.net/1853/55689
Markov chains are an essential tool for sampling from large sets, and are ubiquitous across many scientific fields, including statistical physics, industrial engineering, and computer science. To be a useful tool for sampling, the number of steps needed for a Markov chain to converge approximately to the target probability distribution, also known as the mixing time, should be a small polynomial in n, the size of a state. We study problems that arise from the design and analysis of Markov chains that sample from configurations of lattice structures. Specifically, we will be interested in settings where each state is sampled with a non-uniform weight that depends on the structure of the configuration. These weighted lattice models arise naturally in many contexts, and are typically more difficult to analyze than their unweighted counterparts. Our focus will be on exploiting these weightings both to develop new efficient algorithms for sampling and to prove new mixing time bounds for existing Markov chains. First, we present an efficient algorithm for sampling fixed-rank elements from a graded poset, which includes sampling integer partitions of n as a special case. Then, we study the problem of sampling weighted perfect matchings on lattices using a natural Markov chain based on "rotations", and provide evidence towards understanding why this Markov chain has empirically been observed to converge slowly. Finally, we present and analyze a generalized version of the Schelling segregation model, first proposed in 1971 by economist Thomas Schelling to explain possible causes of racial segregation in cities. We identify conditions under which segregation, or clustering, is likely or unlikely to occur. Our analysis techniques for all three problems are drawn from the interface of theoretical computer science with discrete mathematics and statistical physics.
Advisors/Committee Members: Randall, Dana (advisor), Mihail, Milena (committee member), Goldberg, David (committee member), Vigoda, Eric (committee member), Tetali, Prasad (committee member).
Subjects/Keywords: Markov chains; Mixing rates
APA (6th Edition):
Bhakta, P. J. (2016). Markov chains for weighted lattice structures. (Doctoral Dissertation). Georgia Tech. Retrieved from http://hdl.handle.net/1853/55689

Universidade Estadual de Campinas
10.
Vieira, Francisco Zuilton Gonçalves.
Cadeias de Markov homogêneas discretas: Discrete homogeneous Markov chains.
Degree: 2011, Universidade Estadual de Campinas
URL: http://repositorio.unicamp.br/jspui/handle/REPOSIP/306581
Abstract: This dissertation studies discrete Markov chains taking values in a countable state space. Markov chains are stochastic processes in the following sense: given the present moment, the future does not depend on the past, but only on the present. Our study concerns discrete homogeneous Markov chains (HMC). We first introduce the definition and basic concepts of discrete HMC. This leads us to the topology of the transition matrices associated with an HMC, a necessary tool for the study of recurrent and transient sets, which are of great importance in this theory. Stationary states and the strong Markov property are also addressed; the latter serves to build the concept of a recurrent state, from which we work with the notions of positive and null recurrence. Finally, we study the important concept of absorption time, understood as the time at which a state is absorbed into a recurrent set.
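The absorption time discussed at the end has a simple computational form for finite chains: if Q is the transient-to-transient block of the transition matrix, the expected absorption times solve (I - Q)t = 1. A sketch on an invented 3-state chain with one absorbing state:

```python
import numpy as np

# Toy chain: states 0 and 1 are transient, state 2 is absorbing.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.1, 0.5],
              [0.0, 0.0, 1.0]])

# Q = transient block; expected steps to absorption from each transient state.
Q = P[:2, :2]
t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
```

The identity follows by conditioning on the first step: one step is always taken, plus the expected remaining time weighted by where the chain moves within the transient set.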
Advisors/Committee Members: UNIVERSIDADE ESTADUAL DE CAMPINAS (CRUESP), Stelmastchuk, Simão Nicolau, 1977- (advisor), Universidade Estadual de Campinas. Instituto de Matemática, Estatística e Computação Científica (institution), Programa de Pós-Graduação em Matemática (nameofprogram), Torezzan, Cristiano (committee member), Veloso, Marcelo Oliveira (committee member).
Subjects/Keywords: Probabilidades; Markov, Cadeias de; Probabilities; Markov chains
APA (6th Edition):
Vieira, F. Z. G. (2011). Cadeias de Markov homogêneas discretas: Discrete homogeneous Markov chains. (Thesis). Universidade Estadual de Campinas. Retrieved from http://repositorio.unicamp.br/jspui/handle/REPOSIP/306581
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
11.
Ashton, Stephen.
The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks.
Degree: PhD, 2019, University of Sussex
URL: http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603
▼ In this thesis, I provide a statistical analysis of high-resolution contact pattern data within primary and secondary schools as collected by the SocioPatterns collaboration. Students are graphically represented as nodes in a temporally evolving network, in which links represent proximity or interaction between students. I focus on link- and node-level statistics, such as the on- and off-durations of links as well as the activity potential of nodes and links. Parametric models are fitted to the on- and off-durations of links, inter-event times and node activity potentials, and, based on these, I propose a number of theoretical models that reproduce the collected data with varying levels of accuracy. By doing so, I aim to identify the minimal network-level properties needed to closely match the real-world data, with the aim of combining this contact pattern model with epidemic models in future work. I also provide Bayesian methods for parameter estimation using exact Bayesian and Markov chain Monte Carlo methods, applying these to Mittag-Leffler distributed data, both artificially generated and from real-world examples. Additionally, I present probabilistic methods for model selection, namely the Akaike and Bayesian Information Criteria, and apply them to the data and examples of the previous sections.
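As a sketch of the Markov chain Monte Carlo estimation this abstract mentions, here is a minimal random-walk Metropolis sampler. It targets the posterior of an exponential rate parameter under a flat prior, a deliberately simple stand-in for the thesis's Mittag-Leffler fits; all names and parameters below are illustrative:

```python
import math
import random

def log_post(lam, data):
    """Log-posterior for an exponential rate `lam` under a flat prior
    (a simple illustrative target, not the thesis's actual model)."""
    if lam <= 0:
        return -math.inf
    return len(data) * math.log(lam) - lam * sum(data)

def metropolis(data, n_iter=20000, step=0.2, seed=1):
    """Random-walk Metropolis sampler for the rate parameter."""
    rng = random.Random(seed)
    lam = 1.0
    lp = log_post(lam, data)
    samples = []
    for _ in range(n_iter):
        prop = lam + rng.gauss(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop, data)
        if math.log(rng.random()) < lp_prop - lp:  # accept w.p. min(1, ratio)
            lam, lp = prop, lp_prop
        samples.append(lam)
    return samples[n_iter // 2:]                   # drop burn-in

rng = random.Random(0)
data = [rng.expovariate(2.0) for _ in range(500)]  # synthetic data, true rate 2
post = metropolis(data)
print(sum(post) / len(post))                       # posterior mean, close to 2
```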
Subjects/Keywords: 510; QA0274.7 Markov processes. Markov chains
APA (6th Edition):
Ashton, S. (2019). The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603
12.
Sherborne, Neil.
Non-Markovian epidemic dynamics on networks.
Degree: PhD, 2018, University of Sussex
URL: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572
▼ The use of networks to model the spread of epidemics through structured populations is widespread. However, epidemics on networks lead to intractable exact systems, creating the need to coarse-grain and focus on certain average quantities. Often, the underlying stochastic processes are Markovian, and so are the resulting mean-field models constructed as systems of ordinary differential equations (ODEs). However, the lack of memory (or memorylessness) does not accurately describe real disease dynamics. For instance, many epidemiological studies have shown that the true distribution of the infectious period is rather centred around its mean, whereas the memoryless assumption imposes an exponential distribution on the infectious period. Assumptions such as these greatly affect the predicted course of an epidemic and can lead to inaccurate predictions about disease spread. Such limitations of existing approaches to modelling epidemics on networks motivated my efforts to develop non-Markovian models better suited to capturing essential realistic features of disease dynamics. In the first part of my thesis I developed a pairwise, multi-stage SIR (susceptible-infected-recovered) model. Each infectious node goes through K ∈ ℕ infectious stages, which for K > 1 means that the infectious period is gamma-distributed. Analysis of the model provided analytic expressions for the epidemic threshold and the expected final epidemic size. Using available epidemiological data on the infectious periods of various diseases, I demonstrated the importance of considering the shape of the infectious period distribution. The second part of the thesis expanded the framework of non-Markovian dynamics to networks with heterogeneous degree distributions and non-negligible levels of clustering. These properties are ubiquitous in many real-world networks and make model development and analysis much more challenging.
To this end, I derived and analysed a compact pairwise model whose number of equations is independent of the range of node degrees, and investigated the effects of clustering on epidemic dynamics. My thesis culminated in the third part, where I explored the relationships between several different modelling methodologies and derived an original non-Markovian Edge-Based Compartmental Model (EBCM) which allows both transmission and recovery to be arbitrary independent stochastic processes. The major result is a rigorous mathematical proof that the message passing (MP) model and the EBCM are equivalent, and thus that the EBCM is statistically exact on the ensemble of configuration-model networks. From this consideration I derived a generalised pairwise-like model, which I then used to build a model hierarchy and to show that, given corresponding parameters and initial conditions, these models are identical to the MP model or the EBCM. In the final part of my thesis I considered the important problem of coupling epidemic dynamics with changes in network structure in response to the perceived risk of the epidemic. This was framed…
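The multi-stage construction the abstract describes (K sequential exponential stages, each with rate K·γ) yields a Gamma(K, Kγ)-distributed infectious period with fixed mean 1/γ and variance 1/(Kγ²), so larger K concentrates the period around its mean. A quick sketch, with made-up rates rather than the thesis's fitted values:

```python
import random

def infectious_period(K, gamma, rng):
    """One infectious period drawn as K sequential exponential stages,
    each with rate K*gamma, i.e. a Gamma(K, K*gamma) variable with
    mean 1/gamma and variance 1/(K*gamma**2)."""
    return sum(rng.expovariate(K * gamma) for _ in range(K))

rng = random.Random(42)
gamma = 0.5                   # mean infectious period 1/gamma = 2 (illustrative)
for K in (1, 5, 20):
    draws = [infectious_period(K, gamma, rng) for _ in range(20000)]
    mean = sum(draws) / len(draws)
    var = sum((x - mean) ** 2 for x in draws) / len(draws)
    print(K, round(mean, 3), round(var, 3))  # mean stays ~2, variance shrinks as 4/K
```

K = 1 recovers the memoryless (exponential) special case the abstract criticises; K > 1 gives the more realistic peaked distribution.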
Subjects/Keywords: 510; QA0274.7 Markov processes. Markov chains
APA (6th Edition):
Sherborne, N. (2018). Non-Markovian epidemic dynamics on networks. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572

Cornell University
13.
Zeber, David.
Extremal Properties Of Markov Chains And The Conditional Extreme Value Model.
Degree: PhD, Statistics, 2012, Cornell University
URL: http://hdl.handle.net/1813/31016
▼ Multivariate extreme value theory has proven useful for modeling multivariate data in fields such as finance and environmental science, where one is interested in accounting for the tendency of observations to exceed an extremely high (or low) threshold. Recent work has developed extremal models by studying the conditional distribution of a random vector, conditional on one of the components becoming extreme. This provides a way to handle situations such as asymptotic dependence, where traditional techniques may be uninformative. In this thesis, we explore the implications of the assumption that such a conditional distribution is well approximated by a limiting probability distribution when the conditioning component is extreme. We consider a version of the conditional distribution specified by a transition function. If the transition kernel of a Markov chain satisfies our assumption, then a process known as the tail chain approximates the Markov chain over extreme states. We characterize the class of chains which admit such an approximation, and investigate the properties of the tail chain in relation to the distinction between extreme and non-extreme states. We find that, in general, the tail chain approximates a portion of the original process we term the "extremal component". We further derive the limit in distribution of a point process consisting of normalized Markov chain observations, expressing the limit in terms of the tail chain. We also consider the case where a transition function satisfying our assumption describes the dependence structure of a random vector. We establish conditions under which a conditional extreme value model is appropriate, and derive the form of the limiting measure.
Advisors/Committee Members: Resnick, Sidney Ira (chair), Nussbaum, Michael (committee member), Samorodnitsky, Gennady (committee member).
Subjects/Keywords: Extreme Value Theory; Markov Chains; Point Processes
APA (6th Edition):
Zeber, D. (2012). Extremal Properties Of Markov Chains And The Conditional Extreme Value Model. (Doctoral Dissertation). Cornell University. Retrieved from http://hdl.handle.net/1813/31016

Cornell University
14.
Murugan, Mathav.
Random Walks On Metric Measure Spaces.
Degree: PhD, Applied Mathematics, 2015, Cornell University
URL: http://hdl.handle.net/1813/40684
▼ In this thesis, we study transition probability estimates for Markov chains and their relationship to the geometry of the underlying state space. The thesis is divided into two parts. In the first part (Chapter 1) we consider Markov chains with bounded range, that is, there exists R > 0 such that the Markov chain (Xn)n∈N satisfies d(Xn, Xn+1) < R for all n ∈ N. In the second part (Chapters 2 and 3) we consider Markov chains with heavy-tailed jumps. In Chapter 1, we characterize Gaussian estimates for the transition probability of a discrete-time Markov chain in terms of geometric properties of the underlying state space. In particular, we show that the following are equivalent: (1) two-sided Gaussian bounds on the heat kernel; (2) a scale-invariant parabolic Harnack inequality; (3) the volume doubling property together with a scale-invariant Poincaré inequality. The underlying state space is a metric measure space, which includes both manifolds and graphs as special cases. Various applications and examples are provided. An important feature of our work is that our techniques are robust to small perturbations of the underlying space. In Chapter 2, we study the long-term behaviour of random walks with heavy-tailed jumps. We focus on the case where the 'index of tail heaviness' (or jump index) β ∈ (0, 2). Extending several existing works by other authors, we prove global upper and lower bounds for the n-step transition probability density that are sharp up to constants. In Chapter 3, we study random walks with heavy-tailed jumps where the index of tail heaviness β is allowed to take any positive value. We assume that the state space in this case is a graph satisfying a sub-Gaussian estimate, which is typical of many fractal-like graphs. On such graphs, we establish a threshold behaviour of heavy-tailed Markov chains when the index governing the tail heaviness equals the escape-time exponent of the simple random walk. In a certain sense, this generalizes the classical threshold corresponding to the second moment condition. This thesis is based on joint work with Laurent Saloff-Coste.
Advisors/Committee Members: Saloff-Coste,Laurent Pascal (chair), Levine,Lionel (committee member), Samorodnitsky,Gennady (committee member).
Subjects/Keywords: Markov chains; transition probability estimates; anomalous diffusion
APA (6th Edition):
Murugan, M. (2015). Random Walks On Metric Measure Spaces. (Doctoral Dissertation). Cornell University. Retrieved from http://hdl.handle.net/1813/40684
15.
Geetha Antony Pullen.
Bayesian methods in genetics: a graph theoretic approach.
Degree: Statistics, 2012, Kannur University
URL: http://shodhganga.inflibnet.ac.in/handle/10603/6056
Appendices p.166-181; Bibliography p.151-165
Advisors/Committee Members: Kumaran, M.
Subjects/Keywords: Markov Chains; Statistics
APA (6th Edition):
Pullen, G. A. (2012). Bayesian methods in genetics: a graph theoretic
approach; -. (Thesis). Kannur University. Retrieved from http://shodhganga.inflibnet.ac.in/handle/10603/6056
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

North Carolina State University
16.
Liu, Ning.
Spectral Clustering for Graphs and Markov Chains.
Degree: PhD, Computer Science, 2010, North Carolina State University
URL: http://www.lib.ncsu.edu/resolver/1840.16/4886
▼ Spectral graph partitioning has become a popular clustering method over the last few years. The starting point is the work of Fiedler, who showed that an eigenvector of the Laplacian matrix of an undirected graph (a symmetric system) provides the minimum cut of the graph's nodes. The spectral technique can also be applied to a Markov chain to cluster states and, in general, is more broadly applicable to nonsymmetric systems. Motivated by these facts, we combine them to show that Markov chains, through the two different clustering techniques they offer, are effective approaches for clustering in more general situations. In this dissertation, we advance the state of the art of spectral clustering and introduce a new algorithm to decompose matrices into blocks. We first prove that the second eigenvector of the signless Laplacian provides a heuristic solution to the NP-complete state clustering problem, the dual of graph partitioning. A new method for clustering nodes of a graph with negative edge weights is also proposed. Second, a connection between the singular vectors obtained from an SVD and the eigenvectors used by spectral clustering algorithms is revealed: we show that the singular vectors of the node-edge incidence matrix generate clusters not only on the nodes but also on the edges. Third, relating spectral clustering and state clustering of Markov chains, we present two clustering techniques for Markov chains based on two different measures and suggest a means of combining both to obtain comprehensive information about state clusters. Fourth, we display the connection between spectral clustering and dimension-reduction techniques in statistical clustering, and show that the results obtained from spectral and statistical clustering are related. Finally, we develop a new, improved spectral clustering procedure for decomposing matrices into blocks. This algorithm works well in several applications, especially in detecting communities in complex networks, where some existing methods, e.g. MARCA and TPABLO, fail.
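Fiedler's result cited in this abstract can be sketched in a few lines: the sign pattern of the Laplacian's second eigenvector (the Fiedler vector) bisects the graph. The toy example below approximates that eigenvector by power iteration on B = c·I − L with the all-ones vector projected out; the graph, constant c and iteration count are invented for illustration and are not from the dissertation:

```python
def fiedler_sign_split(edges, n, iters=500):
    """Approximate the Fiedler vector (eigenvector of the second-smallest
    Laplacian eigenvalue) by power iteration on B = c*I - L, projecting
    out the constant vector; the sign pattern gives a two-way cut."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    c = 2 * max(deg)                      # c >= largest Laplacian eigenvalue

    def apply_B(x):
        # (B x)_i = (c - deg_i) x_i + sum of x over neighbours of i
        y = [(c - deg[i]) * x[i] for i in range(n)]
        for u, v in edges:
            y[u] += x[v]
            y[v] += x[u]
        return y

    x = [float(i) for i in range(n)]      # generic starting vector
    for _ in range(iters):
        m = sum(x) / n                    # project out the all-ones vector
        x = [xi - m for xi in x]
        norm = sum(xi * xi for xi in x) ** 0.5
        x = [xi / norm for xi in x]
        x = apply_B(x)
    return ([i for i in range(n) if x[i] < 0],
            [i for i in range(n) if x[i] >= 0])

# Two triangles bridged by one edge: the cut separates the triangles.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(fiedler_sign_split(edges, 6))
```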
Advisors/Committee Members: Dr. William J. Stewart, Committee Chair (advisor), Dr. Harry G. Perros, Committee Co-Chair (advisor), Dr. Laurie Williams, Committee Member (advisor), Dr. Michael Devetsikiotis, Committee Member (advisor).
Subjects/Keywords: spectral clustering; graph partitioning; markov chains; eigenvalue
APA (6th Edition):
Liu, N. (2010). Spectral Clustering for Graphs and Markov Chains. (Doctoral Dissertation). North Carolina State University. Retrieved from http://www.lib.ncsu.edu/resolver/1840.16/4886

University of Adelaide
17.
Teo, Mingmei.
Optimal allocation of vaccines in metapopulations.
Degree: 2017, University of Adelaide
URL: http://hdl.handle.net/2440/112808
▼ Infectious diseases have had a devastating impact on society and the world's population throughout the years, for example the Spanish flu in 1918 and, most recently, the Ebola epidemic in West Africa. The introduction of vaccines has kept some infectious diseases under control and, in the case of smallpox, eradicated it from the world's population. Vaccines are thus a very effective method of controlling the spread of an infectious disease, and where possible the world's population should be vaccinated against all possible diseases. However, the production of vaccines is expensive, and if there is a novel strain of a disease, it is often unlikely that a vaccine has already been developed to combat the outbreak immediately; instead, a vaccine must be developed during the epidemic. Both of these situations therefore result in a limited supply of vaccines. Thus, it is of great interest and importance to public health officials to know how best to allocate a limited supply of vaccines to a population. This is the main theme of this thesis. We investigate two questions that arise from this problem. The first considers the case where there is a novel strain of a disease and vaccines are developed during an epidemic. For many public health officials in countries or cities around the world, the question is how best to allocate a limited supply of vaccines to the population to minimise the number of people that become infected, after the infection is already present in the population. That is, they are interested in determining the optimal allocation of limited vaccines, taking into account the changing dynamics of the epidemic. This naturally lends itself to dynamic programming, which determines optimal actions while accounting for the dynamics of a process. Hence, we explore the use of two dynamic programming techniques, backward dynamic programming and approximate dynamic programming, to attempt to solve this problem.
We observe that backward dynamic programming does not scale well with the size of the population and so an alternative method is to consider approximate dynamic programming. The approximate dynamic programming algorithms we consider fall under the category of lookup tables. We find, through order calculations, that these methods do not work efficiently for our problem and so other types of algorithms need to be explored in order for approximate dynamic programming to be applied. The second question of interest is to consider an epidemic occurring in some part of the world and government officials in a currently uninfected country have access to a limited supply of vaccines. Then, their interest lies in how best to distribute this supply of vaccines to the population to minimise the mean final epidemic size, that is, the number of people that become infected over the course of an epidemic, should an epidemic arise. Further, we also investigate whether vaccines should be withheld until the first onset of infection in the population or be distributed before infection is…
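As a sketch of the backward dynamic programming the abstract says does not scale, here is a deliberately tiny finite-horizon vaccine-allocation DP. The state space, infection mechanism and parameters are all invented for illustration and are far simpler than the thesis's metapopulation model:

```python
from functools import lru_cache

# Toy backward dynamic program for spreading a vaccine stockpile over a
# finite horizon (all states, rates and parameters are made up).
# State: (stage t, susceptibles s, doses left k).
N, P, T = 10, 0.8, 5     # population size, infection pressure, horizon

@lru_cache(maxsize=None)
def V(t, s, k):
    """Minimum expected number of future infections from stage t on."""
    if t == T or s == 0:
        return 0.0
    best = float("inf")
    for a in range(min(s, k) + 1):       # doses administered now
        s2 = s - a                       # susceptibles after vaccination
        if s2 == 0:
            best = min(best, 0.0)
            continue
        q = P * s2 / N                   # chance of one infection this step
        exp_cost = (q * (1.0 + V(t + 1, s2 - 1, k - a))
                    + (1 - q) * V(t + 1, s2, k - a))
        best = min(best, exp_cost)
    return best

print(V(0, 10, 10), V(0, 10, 0))  # full stockpile gives 0.0; no doses, positive
```

Even in this toy, the table size grows with the product of horizon, population and stockpile, which illustrates why exact backward DP becomes infeasible for realistic populations.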
Advisors/Committee Members: Bean, Nigel Geoffrey (advisor), Ross, Joshua (advisor), School of Mathematical Sciences (school).
Subjects/Keywords: vaccination; continuous-time Markov chains; mathematical epidemiology
APA (6th Edition):
Teo, M. (2017). Optimal allocation of vaccines in metapopulations. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/112808
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

University of Edinburgh
18.
Milios, Dimitrios.
On approximating the stochastic behaviour of Markovian process algebra models.
Degree: PhD, 2014, University of Edinburgh
URL: http://hdl.handle.net/1842/8930
▼ Markov chains offer a rigorous mathematical framework to describe systems that exhibit stochastic behaviour, as they are supported by a plethora of methodologies to analyse their properties. Stochastic process algebras are high-level formalisms, where systems are represented as collections of interacting components. This compositional approach to modelling allows us to describe complex Markov chains using a compact high-level specification. There is an increasing need to investigate the properties of complex systems, not only in the field of computer science, but also in computational biology. To explore the stochastic properties of large Markov chains is a demanding task in terms of computational resources. Approximating the stochastic properties can be an effective way to deal with the complexity of large models. In this thesis, we investigate methodologies to approximate the stochastic behaviour of Markovian process algebra models. The discussion revolves around two main topics: approximate state-space aggregation and stochastic simulation. Although these topics are different in nature, they are both motivated by the need to efficiently handle complex systems. Approximate Markov chain aggregation constitutes the formulation of a smaller Markov chain that approximates the behaviour of the original model. The principal hypothesis is that states that can be characterised as equivalent can be adequately represented as a single state. We discuss different notions of approximate state equivalence, and how each of these can be used as a criterion to partition the state-space accordingly. Nevertheless, approximate aggregation methods typically require an explicit representation of the transition matrix, a fact that renders them impractical for large models. We propose a compositional approach to aggregation, as a means to efficiently approximate complex Markov models that are defined in a process algebra specification, PEPA in particular. 
Regarding our contributions to Markov chain simulation, we propose an accelerated method that can be characterised as almost exact, in the sense that it can be arbitrarily precise. We discuss how it is possible to sample from the trajectory space rather than the transition space. This approach requires fewer random samples than a typical simulation algorithm. Most importantly, our approach does not rely on particular assumptions with respect to the model properties, in contrast to otherwise more efficient approaches.
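For contrast with the accelerated trajectory-space sampler the abstract proposes, a standard exponential-clock (Gillespie-style) simulation of a continuous-time Markov chain looks like the following sketch; the birth-death rates are illustrative, not from the thesis:

```python
import random

def simulate_ctmc(rates, x0, t_end, seed=0):
    """Simulate a continuous-time Markov chain with the standard
    exponential-clock (Gillespie-style) algorithm. `rates(x)` returns
    a list of (rate, next_state) pairs for the current state x."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(0.0, x0)]
    while t < t_end:
        pairs = rates(x)
        total = sum(r for r, _ in pairs)
        if total == 0.0:
            break                        # absorbing state reached
        t += rng.expovariate(total)      # exponential holding time
        u, acc = rng.random() * total, 0.0
        for r, nxt in pairs:             # choose a transition with prob. ∝ rate
            acc += r
            if u < acc:
                x = nxt
                break
        path.append((t, x))
    return path

# Birth-death chain: constant birth rate 1.0, death rate 0.5 per individual.
path = simulate_ctmc(lambda n: [(1.0, n + 1), (0.5 * n, n - 1)], x0=1, t_end=10.0)
print(len(path), path[-1])
```

Each step draws one holding time and one transition, i.e. it samples from the transition space; the thesis's point is that sampling trajectories directly can need fewer random draws.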
Subjects/Keywords: 519.2; process algebras; Markov chains; simulation
APA (6th Edition):
Milios, D. (2014). On approximating the stochastic behaviour of Markovian process algebra models. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/8930

Delft University of Technology
19.
Shelat, Sanmay (author).
Developing an Integrated Pedestrian Behaviour Model for Office Buildings.
Degree: 2017, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:6a52306a-2747-450e-89e4-390f5db96bf5
▼ As an increasing number of people work in office buildings and new sophisticated sensor technologies become available, there is both a need and the potential to develop more complex building service controls that increase the energy efficiency of buildings as well as the well-being of employees. The data required for the testing and evaluation of such control systems is usually in the form of movements and locations of office building occupants collected over long periods of time. However, such data is generally difficult to obtain for reasons ranging from the need to evaluate un-commissioned buildings to privacy concerns related to data sharing. Therefore, this study develops a pedestrian behaviour model that can simulate office occupants’ movements and locations, thereby acting as a research platform that produces data for external applications. The model is integrated in that it simulates not only the movements of occupants between different locations in the building but also the decisions that drive those movements, such as which activities occupants want to carry out throughout the day, and when and where they want to perform these activities. Furthermore, the model is based on the guidelines of (i) flexibility – the model is able to simulate movements in different building plans and represent movement patterns of different organizations; (ii) extensibility – the model uses a modular framework that enables easy adoption of more complex components and integration with other workplace-related studies as and when required; and (iii) data parsimony – the model itself has low and simple data requirements.
Advisors/Committee Members: Hoogendoorn, Serge (graduation committee), Daamen, Winnie (mentor), van der Spek, Stefan (graduation committee), Duives, Dorine (graduation committee), Kaag, Bjorn (graduation committee), Delft University of Technology (degree granting institution).
Subjects/Keywords: Pedestrian simulation models; Office buildings; Markov chains
APA (6th Edition):
Shelat, S. (. (2017). Developing an Integrated Pedestrian Behaviour Model for Office Buildings. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:6a52306a-2747-450e-89e4-390f5db96bf5
Chicago Manual of Style (16th Edition):
Shelat, Sanmay (author). “Developing an Integrated Pedestrian Behaviour Model for Office Buildings.” 2017. Masters Thesis, Delft University of Technology. Accessed March 01, 2021.
http://resolver.tudelft.nl/uuid:6a52306a-2747-450e-89e4-390f5db96bf5.
MLA Handbook (7th Edition):
Shelat, Sanmay (author). “Developing an Integrated Pedestrian Behaviour Model for Office Buildings.” 2017. Web. 01 Mar 2021.
Vancouver:
Shelat S(. Developing an Integrated Pedestrian Behaviour Model for Office Buildings. [Internet] [Masters thesis]. Delft University of Technology; 2017. [cited 2021 Mar 01].
Available from: http://resolver.tudelft.nl/uuid:6a52306a-2747-450e-89e4-390f5db96bf5.
Council of Science Editors:
Shelat S(. Developing an Integrated Pedestrian Behaviour Model for Office Buildings. [Masters Thesis]. Delft University of Technology; 2017. Available from: http://resolver.tudelft.nl/uuid:6a52306a-2747-450e-89e4-390f5db96bf5

Delft University of Technology
20.
Krishnan, Vishruth (author).
Exploring the Potential of Uber Movement Data: An Amsterdam case study.
Degree: 2019, Delft University of Technology
URL: http://resolver.tudelft.nl/uuid:77ebadc7-8af9-4b42-8f36-a94755eb5009
▼ With the increasing use of big data in varied applications to improve decision making and provide new insights, this research explores the potential of the Uber Movement data set released by Uber, comprising travel times from one zone to another. A better understanding of the potential of the dataset could add to the existing toolkit of transport planners and city officials at the municipality of Amsterdam. Moreover, it would be a first-of-its-kind data set enabling an understanding of taxi movement in the city. The Uber Movement travel time data comprise the average travel time between two wijken, where the ‘sourceid’ and ‘dstid’ do not correspond to the origin and destination of a trip but simply represent the directionality of the travel time measured. The data is aggregated across different levels of temporal detail, and the number of data points directly corresponds to the level of temporal aggregation. For instance, if the quarterly aggregated data for the different days of the week is downloaded, the number of data points between a ‘sourceid’ and ‘dstid’ cannot exceed seven. Three aspects of the data set were explored: 1) ability to capture the demand for Ubers, 2) ability to capture recurrent congestion, and 3) ability to capture non-recurrent congestion. According to Uber Movement and previously used instances, the data is suited for performance-related (recurrent and non-recurrent congestion) and impact-related studies of the network. The absence of route-related information limits the applications of the data, as does its sparsity. The potential of the data was best revealed through demand studies, which indicated a skewed user group of tourists, airport users (to and fro), work-related trips and users taking Ubers late at night.
In addition, to support the goals of the municipality in managing traffic activity across different zones and time periods, an existing model was implemented and extended with ‘occupancy related measures’ and ‘shortest path’ components. Based on the data penetration levels and travel time data, the model developed offers insights at a strategic level to the city, in the form of the spatio-temporal concentration of Uber vehicles and occupancy levels through the day. The potential of the data lies in its ability to offer strategic insights to the city of Amsterdam and the greater Amsterdam region in the form of the unique spatio-temporal spread of Uber vehicles across different hours of the day.
Civil Engineering | Transport and Planning
Advisors/Committee Members: van Lint, J.W.C. (mentor), Calvert, S.C. (mentor), Bozzon, Alessandro (graduation committee), Knijff, Tom (graduation committee), Delft University of Technology (degree granting institution).
Subjects/Keywords: Uber Movement; Amsterdam; Markov chains; Travel time
APA (6th Edition):
Krishnan, V. (. (2019). Exploring the Potential of Uber Movement Data: An Amsterdam case study. (Masters Thesis). Delft University of Technology. Retrieved from http://resolver.tudelft.nl/uuid:77ebadc7-8af9-4b42-8f36-a94755eb5009
Chicago Manual of Style (16th Edition):
Krishnan, Vishruth (author). “Exploring the Potential of Uber Movement Data: An Amsterdam case study.” 2019. Masters Thesis, Delft University of Technology. Accessed March 01, 2021.
http://resolver.tudelft.nl/uuid:77ebadc7-8af9-4b42-8f36-a94755eb5009.
MLA Handbook (7th Edition):
Krishnan, Vishruth (author). “Exploring the Potential of Uber Movement Data: An Amsterdam case study.” 2019. Web. 01 Mar 2021.
Vancouver:
Krishnan V(. Exploring the Potential of Uber Movement Data: An Amsterdam case study. [Internet] [Masters thesis]. Delft University of Technology; 2019. [cited 2021 Mar 01].
Available from: http://resolver.tudelft.nl/uuid:77ebadc7-8af9-4b42-8f36-a94755eb5009.
Council of Science Editors:
Krishnan V(. Exploring the Potential of Uber Movement Data: An Amsterdam case study. [Masters Thesis]. Delft University of Technology; 2019. Available from: http://resolver.tudelft.nl/uuid:77ebadc7-8af9-4b42-8f36-a94755eb5009

Washington University in St. Louis
21.
Cook, Scott.
Markov Chains Derived From Lagrangian Mechanical Systems.
Degree: PhD, Mathematics, 2011, Washington University in St. Louis
URL: https://openscholarship.wustl.edu/etd/75
▼ The theory of Markov chains with countable state spaces is a greatly developed and successful area of probability theory and statistics. There is much interest in continuing to develop the theory of Markov chains beyond countable state spaces. One needs good and well-motivated model systems in this effort. In this thesis, we propose to produce such systems by introducing randomness into familiar deterministic systems, so that we can draw upon the existing (deterministic) results to aid the analysis of our Markov chains. We will focus most heavily on models drawn from Lagrangian mechanical systems with collisions (billiards).
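A toy illustration of the idea, far simpler than the systems the thesis studies: introducing randomness at each collision turns a deterministic bouncing particle into a Markov chain on speeds. The collision kernel below (partial thermalisation against a wall) and all parameters are invented for the example.

```python
import random

def random_billiard_speeds(n, v0=1.0, wall_temp=1.0, mixing=0.5, rng=random.Random(1)):
    """Markov chain of post-collision speeds on the continuous state space (0, inf).

    At each wall collision the outgoing speed is a mixture of the incoming
    speed and a fresh draw from the wall's (exponential) speed distribution.
    """
    v, chain = v0, []
    for _ in range(n):
        thermal = rng.expovariate(1.0 / wall_temp)   # speed offered by the wall, mean wall_temp
        v = (1 - mixing) * v + mixing * thermal      # partial thermalisation at the collision
        chain.append(v)
    return chain

speeds = random_billiard_speeds(10_000)
print(sum(speeds) / len(speeds))   # long-run mean speed settles near wall_temp
```

Because the state space is continuous, questions such as existence of a stationary distribution require the uncountable-state-space theory the abstract refers to; here the chain is a simple contraction plus noise, so it equilibrates quickly.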
Advisors/Committee Members: Renato Feres.
Subjects/Keywords: Mathematics; Mechanics; dynamical systems, Markov chains, probability
APA (6th Edition):
Cook, S. (2011). Markov Chains Derived From Lagrangian Mechanical Systems. (Doctoral Dissertation). Washington University in St. Louis. Retrieved from https://openscholarship.wustl.edu/etd/75
Chicago Manual of Style (16th Edition):
Cook, Scott. “Markov Chains Derived From Lagrangian Mechanical Systems.” 2011. Doctoral Dissertation, Washington University in St. Louis. Accessed March 01, 2021.
https://openscholarship.wustl.edu/etd/75.
MLA Handbook (7th Edition):
Cook, Scott. “Markov Chains Derived From Lagrangian Mechanical Systems.” 2011. Web. 01 Mar 2021.
Vancouver:
Cook S. Markov Chains Derived From Lagrangian Mechanical Systems. [Internet] [Doctoral dissertation]. Washington University in St. Louis; 2011. [cited 2021 Mar 01].
Available from: https://openscholarship.wustl.edu/etd/75.
Council of Science Editors:
Cook S. Markov Chains Derived From Lagrangian Mechanical Systems. [Doctoral Dissertation]. Washington University in St. Louis; 2011. Available from: https://openscholarship.wustl.edu/etd/75

University of Tennessee – Knoxville
22.
Kodituwakku, Hansaka Angel Dias Edirisinghe.
InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data.
Degree: MS, Computer Engineering, 2017, University of Tennessee – Knoxville
URL: https://trace.tennessee.edu/utk_gradthes/4885
▼ Monitoring systems are paramount to the proactive detection and mitigation of problems in computer networks related to performance and security. Degraded performance and compromised end-nodes can cost computer networks downtime, data loss and reputation. InSight2 is a platform that models, analyzes and visualizes large-scale Argus network flow data using up-to-date geographical data, organizational information, and emerging threats. It is engineered to meet the needs of network administrators with flexibility and modularity in mind. Scalability is ensured through multi-core processing built on a robust software architecture. Extensibility is achieved by enabling the end user to enrich flow records using additional user-provided databases. Deployment is streamlined by an automated installation script. State-of-the-art visualizations are presented in a secure, user-friendly web interface, giving the end user greater insight into the network.
Advisors/Committee Members: Jens Gregor, Mark E. Dean, Audris Mockus.
Subjects/Keywords: insight2; noc; performance; analytics; visualization; markov chains
APA (6th Edition):
Kodituwakku, H. A. D. E. (2017). InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data. (Thesis). University of Tennessee – Knoxville. Retrieved from https://trace.tennessee.edu/utk_gradthes/4885
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Chicago Manual of Style (16th Edition):
Kodituwakku, Hansaka Angel Dias Edirisinghe. “InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data.” 2017. Thesis, University of Tennessee – Knoxville. Accessed March 01, 2021.
https://trace.tennessee.edu/utk_gradthes/4885.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
MLA Handbook (7th Edition):
Kodituwakku, Hansaka Angel Dias Edirisinghe. “InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data.” 2017. Web. 01 Mar 2021.
Vancouver:
Kodituwakku HADE. InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data. [Internet] [Thesis]. University of Tennessee – Knoxville; 2017. [cited 2021 Mar 01].
Available from: https://trace.tennessee.edu/utk_gradthes/4885.
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
Council of Science Editors:
Kodituwakku HADE. InSight2: An Interactive Web Based Platform for Modeling and Analysis of Large Scale Argus Network Flow Data. [Thesis]. University of Tennessee – Knoxville; 2017. Available from: https://trace.tennessee.edu/utk_gradthes/4885
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Virginia Tech
23.
Lambeth, Jacob Nelson.
Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces.
Degree: MS, Mechanical Engineering, 2013, Virginia Tech
URL: http://hdl.handle.net/10919/19329
▼ Accurate terrain models provide the chassis designer with a powerful tool to make informed design decisions early in the design process. During this stage, engineers are challenged with predicting vehicle loads through modeling and simulation. The accuracy of these simulation results depends not only on the fidelity of the model, but also on the excitation to the model. It is clear that the terrain is the main excitation to the vehicle [1]. The inputs to these models are often based directly on physical measurements (terrain profiles); therefore, the terrain measurements must be as accurate as possible. A collection of novel methods can be developed to aid in the study and application of 3D terrain measurements, which are dense and non-uniform, including efficient gridding, stochastic modeling, and compact characterization. Terrain measurements are not collected with uniform spacing, which is necessary for efficient data storage and simulation. Many techniques are developed to help effectively grid dense terrain point clouds in a curved regular grid (CRG) format, including center and random vehicle paths, sorted gridding methods, and software implementation. In addition, it is beneficial to characterize the terrain as a realization of an underlying stochastic process and to develop a mathematical model of that process. A method is developed to represent a continuous-state Markov chain as a collection of univariate distributions, to be applied to terrain road profiles. The resulting form is extremely customizable and significantly more compact than a discrete-state Markov chain, yet it still provides a viable alternative for stochastically modeling terrain. Many new simulation techniques take advantage of 3D gridded roads along with traditional 2D terrain profiles. A technique is developed to model and synthesize 3D terrain surfaces by applying a variety of 2D stochastic models to the topological components of terrain, which are also decomposed into frequency bandwidths and down-sampled. The quality of the synthetic surface is determined using many statistical tests, and the entire work is implemented into a powerful software suite. Engineers from many disciplines who work with terrain surfaces need to describe the overall physical characteristics compactly and consistently. A method is developed to characterize terrain surfaces with a few coefficients by performing a principal component analysis, via singular value decomposition (SVD), on the parameter sets that define a collection of surface models.
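The continuous-state idea can be sketched as follows: each next elevation is drawn from a univariate distribution conditioned on the current elevation, here a Gaussian, giving an AR(1)-style chain rather than a discrete-state transition matrix. All parameters are invented, not taken from the thesis.

```python
import random

def synthesize_profile(n, dz=0.01, persistence=0.95, rng=random.Random(42)):
    """Generate n elevation samples (metres) along a synthetic road profile.

    The chain's state is the current elevation z; the next state is drawn from
    a Gaussian centred near z, so the model is one univariate conditional
    distribution rather than a large discrete transition matrix.
    """
    z, profile = 0.0, []
    for _ in range(n):
        z = persistence * z + rng.gauss(0.0, dz)   # conditional draw given the current state
        profile.append(z)
    return profile

profile = synthesize_profile(5000)
print(min(profile), max(profile))
```

Swapping the Gaussian for other univariate families (and letting their parameters vary with the state) recovers the customisability the abstract describes, while storage stays a handful of parameters.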
Advisors/Committee Members: Ferris, John B. (committeechair), Tjhung, Tana (committee member), Taheri, Saied (committee member).
Subjects/Keywords: Terrain; Surfaces; Gridding; Modeling; Characterization; Markov Chains
APA (6th Edition):
Lambeth, J. N. (2013). Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces. (Masters Thesis). Virginia Tech. Retrieved from http://hdl.handle.net/10919/19329
Chicago Manual of Style (16th Edition):
Lambeth, Jacob Nelson. “Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces.” 2013. Masters Thesis, Virginia Tech. Accessed March 01, 2021.
http://hdl.handle.net/10919/19329.
MLA Handbook (7th Edition):
Lambeth, Jacob Nelson. “Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces.” 2013. Web. 01 Mar 2021.
Vancouver:
Lambeth JN. Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces. [Internet] [Masters thesis]. Virginia Tech; 2013. [cited 2021 Mar 01].
Available from: http://hdl.handle.net/10919/19329.
Council of Science Editors:
Lambeth JN. Improved Methods for Gridding, Stochastic Modeling, and Compact Characterization of Terrain Surfaces. [Masters Thesis]. Virginia Tech; 2013. Available from: http://hdl.handle.net/10919/19329

University of New South Wales
24.
Tjakra, Javan Dave.
Modeling and analysis of particulate system collective dynamical features.
Degree: Chemical Sciences & Engineering, 2013, University of New South Wales
URL: http://handle.unsw.edu.au/1959.4/52903
;
https://unsworks.unsw.edu.au/fapi/datastream/unsworks:11581/SOURCE01?view=true
▼ The majority of industrial particulate operations are highly energy intensive, which leads to expensive operational costs. The fundamental particulate behavior mechanisms, which determine the system's collective dynamical behavior, are not fully understood. Hence, it is difficult to optimize and control particulate processes. This thesis aims to develop a systematic approach to modeling and analyzing the overall/collective dynamical features of particulate systems. The dynamics of particulate systems are modeled based on a stochastic approach in the form of Markov chains. The models can be developed using particle behavior data obtained from either experimental or numerical approaches. A numerical approach, in particular the Discrete Element Method, is used in this work. The collective dynamics of particle movement influences the effectiveness of particulate operations. The Markov chains approach is used to model the collective movement of monodisperse particulate systems under constant operating conditions. The key operator represents the probability of particle movement from one location to another, which can estimate particle trajectory. In addition, an approach to analyzing the collective dynamics of particle movement is also developed, in particular the oscillatory behavior and spatial distribution of particle movements. The proposed model is then extended to systems with time-varying operating conditions. This provides a way to optimize and control the system behavior by manipulating the operating conditions. The Markov chains models for polydisperse particulate systems under both constant and time-varying operating conditions are also developed. The model performs a parallel analysis of each type of particle. This opens a pathway to monitor and analyze polydisperse particulate systems. Additionally, the model has the potential to aid the implementation of process control of polydisperse systems. The development of a Markov chains model for a non-spatial distribution analysis is also introduced. The operator represents the probability of non-locational movements of a particle property between or within arbitrary intervals. This can be used to model the collective dynamics of particle energy distributions. Additionally, a measure to relate particle impact energy (which is unmeasurable during operation) to kinetic energy (which can be estimated during operation) is proposed. This provides a foundation for the development of an indirect impact energy sensor, which is useful for real-time monitoring and process control.
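The key operator the abstract describes, the probability of particle movement from one location (cell) to another, can be estimated from trajectory data by counting observed cell-to-cell transitions and row-normalising. The sketch below uses a synthetic trajectory in place of DEM output.

```python
import numpy as np

def estimate_transition_matrix(cells, n_cells):
    """Row-normalised count matrix from a sequence of visited cell indices."""
    counts = np.zeros((n_cells, n_cells))
    for a, b in zip(cells[:-1], cells[1:]):
        counts[a, b] += 1                    # one observed transition a -> b
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0                    # leave never-visited rows as all zeros
    return counts / rows

# Stand-in trajectory; in the thesis this would come from DEM particle positions
# binned into spatial cells.
rng = np.random.default_rng(0)
trajectory = rng.integers(0, 4, size=1000)
P = estimate_transition_matrix(trajectory, 4)
print(P)
```

Repeating the estimate per particle type gives the parallel polydisperse analysis mentioned above, and replacing spatial cells with intervals of a particle property (e.g. energy) gives the non-spatial operator.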
Advisors/Committee Members: Bao, Jie, Chemical Sciences & Engineering, Faculty of Engineering, UNSW, Yang, Runyu, Materials Science & Engineering, Faculty of Science, UNSW.
Subjects/Keywords: DEM; Collective Dynamics; Particulate System; Markov Chains
APA (6th Edition):
Tjakra, J. D. (2013). Modeling and analysis of particulate system collective dynamical features. (Doctoral Dissertation). University of New South Wales. Retrieved from http://handle.unsw.edu.au/1959.4/52903 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:11581/SOURCE01?view=true
Chicago Manual of Style (16th Edition):
Tjakra, Javan Dave. “Modeling and analysis of particulate system collective dynamical features.” 2013. Doctoral Dissertation, University of New South Wales. Accessed March 01, 2021.
http://handle.unsw.edu.au/1959.4/52903 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:11581/SOURCE01?view=true.
MLA Handbook (7th Edition):
Tjakra, Javan Dave. “Modeling and analysis of particulate system collective dynamical features.” 2013. Web. 01 Mar 2021.
Vancouver:
Tjakra JD. Modeling and analysis of particulate system collective dynamical features. [Internet] [Doctoral dissertation]. University of New South Wales; 2013. [cited 2021 Mar 01].
Available from: http://handle.unsw.edu.au/1959.4/52903 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:11581/SOURCE01?view=true.
Council of Science Editors:
Tjakra JD. Modeling and analysis of particulate system collective dynamical features. [Doctoral Dissertation]. University of New South Wales; 2013. Available from: http://handle.unsw.edu.au/1959.4/52903 ; https://unsworks.unsw.edu.au/fapi/datastream/unsworks:11581/SOURCE01?view=true

University of Cambridge
25.
Thomas, Samuel.
Universality of cutoff for random walks on random Cayley graphs.
Degree: PhD, 2020, University of Cambridge
URL: https://doi.org/10.17863/CAM.58736
;
https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.818190
▼ Consider the random Cayley graph of a finite group G with respect to k generators chosen uniformly at random. This draws a Cayley graph uniformly amongst all degree-k Cayley graphs of G. A conjecture of Aldous and Diaconis (1985) asserts, for k ≫ log |G|, the following: • the random walk on this (random) graph exhibits cutoff with high probability (whp); • the cutoff time depends only on k and |G| asymptotically (up to smaller order terms). The cutoff time should not depend (strongly) on the choice of generators. In other words, "cutoff is universal for the random walk on the random Cayley graph". Restricted to Abelian groups, this was verified in the '90s; the cutoff time T(k, |G|) was found explicitly. In fact, T(k, |G|) was shown to be an upper bound on mixing for arbitrary groups. First we extend the conjecture to 1 ≪ k ≲ log |G|. Write d(G) for the minimal size of a generating set of G. We establish cutoff for the random walk on all Abelian groups under the condition k - d(G) ≫ 1, verifying the occurrence-of-cutoff part of the Aldous–Diaconis conjecture. This condition is almost optimal to guarantee that the group is generated whp. For the cutoff time to depend only on k and |G|, not the algebraic structure of G, we show that d(G) ≪ log |G| and k - d(G) ≍ k ≫ 1 is sufficient. However, the result does not hold if k ≍ log |G| ≍ d(G); there are even regimes with 1 ≪ k ≪ log |G| for which it does not hold if we allow 1 ≪ k - d(G) ≪ k. Next we consider the (non-Abelian) Heisenberg group H := H_p,d of d × d matrices with entries in Z_p, with p prime and d ≥ 3 not diverging too quickly. We establish cutoff for any k ≫ 1 with log k ≪ log |H|. Except for k growing super-polylogarithmically in |G| (i.e. log k ≫ log log |G|), this is the first example where cutoff has been established for any non-Abelian group.
Further, even restricting to k ≫ log |G|, the cutoff time cannot be written as a function of only k and |G|; rather, one needs |H^ab|, the size of the Abelianisation, as well. In fact, taking d → ∞ sufficiently slowly, the mixing time is of smaller order (not just a constant smaller) than T(k, |H|), the universal upper bound. When k ≳ log |H^ab|, we can remove the primality assumption on p. Our next sequence of results still regards mixing, but this time determines upper bounds which hold for large classes of groups, rather than establishing cutoff. From a nilpotent group G, we construct an Abelian group G' (from the lower central series of G) of the same size. We show that the mixing time for G is at least as fast (asymptotically) as that for G' whp. Wilson (1997) conjectured that, amongst all groups of size at most 2^d, the group Z_2^d gives rise to the slowest mixing time. When restricted to Abelian groups, we deduce this from the explicit description of the mixing time which we obtain. As a corollary of the above nilpotent-to-Abelian comparison, this is extended from the Abelian to the nilpotent set-up. The spirit of the Aldous–Diaconis conjecture is that certain properties of the random Cayley graph…
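A small numerical illustration of the object under study (the group, k, and the number of sample walks are chosen here for the example, not taken from the thesis): simulate the random walk on a random Cayley graph of the cyclic group Z_n and track the empirical total-variation distance to the uniform distribution as the number of steps grows.

```python
import random
from collections import Counter

def tv_to_uniform(n, k, steps, walks=2000, rng=random.Random(7)):
    """Empirical TV distance to uniform for the walk on a random Cayley graph of Z_n.

    The k generators are chosen uniformly at random, as in the Aldous-Diaconis
    setting; each of `walks` independent walkers takes `steps` generator steps.
    """
    gens = [rng.randrange(n) for _ in range(k)]
    counts = Counter()
    for _ in range(walks):
        x = 0
        for _ in range(steps):
            x = (x + rng.choice(gens)) % n        # step by a uniformly chosen generator
        counts[x] += 1
    return 0.5 * sum(abs(counts[v] / walks - 1 / n) for v in range(n))

n, k = 101, 10
tvs = [tv_to_uniform(n, k, t) for t in (1, 2, 4, 8)]
print(tvs)    # TV distance decays as the walk mixes
```

Cutoff is the statement that, as |G| grows, this curve drops from near 1 to near 0 within a window that is negligible compared with the mixing time itself; a fixed small n can only hint at the decay, not exhibit the sharp transition.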
Subjects/Keywords: Markov chains; mixing times; cutoff; universality
APA (6th Edition):
Thomas, S. (2020). Universality of cutoff for random walks on random Cayley graphs. (Doctoral Dissertation). University of Cambridge. Retrieved from https://doi.org/10.17863/CAM.58736 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.818190
Chicago Manual of Style (16th Edition):
Thomas, Samuel. “Universality of cutoff for random walks on random Cayley graphs.” 2020. Doctoral Dissertation, University of Cambridge. Accessed March 01, 2021.
https://doi.org/10.17863/CAM.58736 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.818190.
MLA Handbook (7th Edition):
Thomas, Samuel. “Universality of cutoff for random walks on random Cayley graphs.” 2020. Web. 01 Mar 2021.
Vancouver:
Thomas S. Universality of cutoff for random walks on random Cayley graphs. [Internet] [Doctoral dissertation]. University of Cambridge; 2020. [cited 2021 Mar 01].
Available from: https://doi.org/10.17863/CAM.58736 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.818190.
Council of Science Editors:
Thomas S. Universality of cutoff for random walks on random Cayley graphs. [Doctoral Dissertation]. University of Cambridge; 2020. Available from: https://doi.org/10.17863/CAM.58736 ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.818190

University of Cambridge
26.
Thomas, Samuel.
Universality of Cutoff for Random Walks on Random Cayley Graphs.
Degree: PhD, 2020, University of Cambridge
URL: https://www.repository.cam.ac.uk/handle/1810/311645
▼ Consider the random Cayley graph of a finite group G with respect to k generators chosen uniformly at random. This draws a Cayley graph uniformly amongst all degree-k Cayley graphs of G. A conjecture of Aldous and Diaconis (1985) asserts, for k ≫ log |G|, the following: • the random walk on this (random) graph exhibits cutoff with high probability (whp); • the cutoff time depends only on k and |G| asymptotically (up to smaller order terms). The cutoff time should not depend (strongly) on the choice of generators. In other words, "cutoff is universal for the random walk on the random Cayley graph". Restricted to Abelian groups, this was verified in the '90s; the cutoff time T(k, |G|) was found explicitly. In fact, T(k, |G|) was shown to be an upper bound on mixing for arbitrary groups. First we extend the conjecture to 1 ≪ k ≲ log |G|. Write d(G) for the minimal size of a generating set of G. We establish cutoff for the random walk on all Abelian groups under the condition k - d(G) ≫ 1, verifying the occurrence-of-cutoff part of the Aldous–Diaconis conjecture. This condition is almost optimal to guarantee that the group is generated whp. For the cutoff time to depend only on k and |G|, not the algebraic structure of G, we show that d(G) ≪ log |G| and k - d(G) ≍ k ≫ 1 is sufficient. However, the result does not hold if k ≍ log |G| ≍ d(G); there are even regimes with 1 ≪ k ≪ log |G| for which it does not hold if we allow 1 ≪ k - d(G) ≪ k. Next we consider the (non-Abelian) Heisenberg group H := H_p,d of d × d matrices with entries in Z_p, with p prime and d ≥ 3 not diverging too quickly. We establish cutoff for any k ≫ 1 with log k ≪ log |H|. Except for k growing super-polylogarithmically in |G| (i.e. log k ≫ log log |G|), this is the first example where cutoff has been established for any non-Abelian group.
Further, even restricting to k ≫ log |G|, the cutoff time cannot be written as a function of only k and |G|; rather, one needs |H^ab|, the size of the Abelianisation, as well. In fact, taking d → ∞ sufficiently slowly, the mixing time is of smaller order (not just a constant smaller) than T(k, |H|), the universal upper bound. When k ≳ log |H^ab|, we can remove the primality assumption on p. Our next sequence of results still regards mixing, but this time determines upper bounds which hold for large classes of groups, rather than establishing cutoff. From a nilpotent group G, we construct an Abelian group G' (from the lower central series of G) of the same size. We show that the mixing time for G is at least as fast (asymptotically) as that for G' whp. Wilson (1997) conjectured that, amongst all groups of size at most 2^d, the group Z_2^d gives rise to the slowest mixing time. When restricted to Abelian groups, we deduce this from the explicit description of the mixing time which we obtain. As a corollary of the above nilpotent-to-Abelian comparison, this is extended from the Abelian to the nilpotent set-up. The spirit of the Aldous–Diaconis conjecture is that certain properties of the random Cayley graph…
Subjects/Keywords: Markov chains; mixing times; cutoff; universality
APA (6th Edition):
Thomas, S. (2020). Universality of Cutoff for Random Walks on Random Cayley Graphs. (Doctoral Dissertation). University of Cambridge. Retrieved from https://www.repository.cam.ac.uk/handle/1810/311645

Louisiana State University
27.
Parrott, Chester Ira.
Distributed Load Testing by Modeling and Simulating User Behavior.
Degree: PhD, Artificial Intelligence and Robotics, 2020, Louisiana State University
URL: https://digitalcommons.lsu.edu/gradschool_dissertations/5436
▼ Modern human-machine systems such as microservices rely upon agile engineering practices which require changes to be tested and released more frequently than classically engineered systems. A critical step in the testing of such systems is the generation of realistic workloads or load testing. Generated workload emulates the expected behaviors of users and machines within a system under test in order to find potentially unknown failure states. Typical testing tools rely on static testing artifacts to generate realistic workload conditions. Such artifacts can be cumbersome and costly to maintain; however, even model-based alternatives can prevent adaptation to changes in a system or its usage. Lack of adaptation can prevent the integration of load testing into system quality assurance, leading to an incomplete evaluation of system quality.
The goal of this research is to improve the state of software engineering by addressing open challenges in load testing of human-machine systems with a novel process that a) models and classifies user behavior from streaming and aggregated log data, b) adapts to changes in system and user behavior, and c) generates distributed workload by realistically simulating user behavior. This research contributes a Learning, Online, Distributed Engine for Simulation and Testing based on the Operational Norms of Entities within a system (LODESTONE): a novel process to distributed load testing by modeling and simulating user behavior. We specify LODESTONE within the context of a human-machine system to illustrate distributed adaptation and execution in load testing processes. LODESTONE uses log data to generate and update user behavior models, cluster them into similar behavior profiles, and instantiate distributed workload on software systems. We analyze user behavioral data having differing characteristics to replicate human-machine interactions in a modern microservice environment. We discuss tools, algorithms, software design, and implementation in two different computational environments: client-server and cloud-based microservices. We illustrate the advantages of LODESTONE through a qualitative comparison of key feature parameters and experimentation based on shared data and models. LODESTONE continuously adapts to changes in the system to be tested which allows for the integration of load testing into the quality assurance process for cloud-based microservices.
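The modelling step described above (fitting user-behaviour models from log data, then replaying them as workload) can be sketched minimally. The session format, state names, and first-order-chain assumption below are the editor's simplifications for illustration, not LODESTONE's actual design:

```python
import random
from collections import defaultdict

def fit_markov_model(sessions):
    """Estimate first-order transition probabilities from logged user
    sessions. Each session is a list of action names; the synthetic states
    'START' and 'END' mark session boundaries."""
    counts = defaultdict(lambda: defaultdict(int))
    for s in sessions:
        for a, b in zip(['START'] + s, s + ['END']):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def simulate_session(model, rng, max_len=50):
    """Generate one synthetic session by walking the fitted chain."""
    out, state = [], 'START'
    while len(out) < max_len:
        nxt = model[state]
        state = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        if state == 'END':
            break
        out.append(state)
    return out
```

A load generator built on this idea would refit the model as new log data streams in, so the synthetic workload adapts to changing user behaviour, which is the adaptation property the abstract emphasises.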
Subjects/Keywords: Markov Chains; Load Testing; Clustering; Machine Learning
APA (6th Edition):
Parrott, C. I. (2020). Distributed Load Testing by Modeling and Simulating User Behavior. (Doctoral Dissertation). Louisiana State University. Retrieved from https://digitalcommons.lsu.edu/gradschool_dissertations/5436
28.
Lopes, Fabio Marcellus Lima Sá Makiyama.
Limite do fluído para o grafo aleatório de Erdos-Rényi.
Degree: Mestrado, Estatística, 2010, University of São Paulo
URL: http://www.teses.usp.br/teses/disponiveis/45/45133/tde-05052010-155151/
;
▼ In this work, we apply the Breadth-First Search algorithm to find the size of a connected component of the Erdős–Rényi random graph. A Markov chain is obtained from this procedure. We present some well-known results about the behavior of this Markov chain, and combine some of these results to obtain a proposition about the probability that the component reaches a certain size and a convergence result about the state of the chain at that time. Next, we apply the convergence theorem of Darling (2002) to the sequence of rescaled Markov chains indexed by N, the number of vertices of the graph, to show that the trajectories of these chains converge uniformly in probability to the solution of an ordinary differential equation. From the latter result follows the well-known weak law of large numbers for the giant component of the Erdős–Rényi random graph, in the supercritical case. Moreover, we obtain the fluid limit for an epidemic model which is an extension of that proposed in Kurtz et al. (2008).
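The exploration chain in the abstract can be written down directly. The sketch below is an illustrative reconstruction (not the thesis's code): it runs the breadth-first exploration of G(n, p) from a single vertex, where the number of active vertices is the Markov chain in question, and the component is fully explored when that count hits zero.

```python
import random

def explore_component(n, p, seed=0):
    """Breadth-first exploration of one component of G(n, p): at each step,
    remove one active vertex and activate each still-neutral vertex
    independently with probability p. Returns the component size."""
    rng = random.Random(seed)
    active, neutral, explored = 1, n - 1, 0
    while active > 0:
        newly = sum(rng.random() < p for _ in range(neutral))  # Bin(neutral, p)
        neutral -= newly
        active += newly - 1
        explored += 1
    return explored
```

In the supercritical case p = c/n with c > 1, the start vertex lies in the giant component with probability about ρ, where 1 − ρ = e^{−cρ}; rescaling the chain by N and letting N → ∞ gives the fluid-limit ODE the abstract refers to.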
Advisors/Committee Members: Machado, Fabio Prates.
Subjects/Keywords: Cadeias de Markov; Convergence; Convergência; Grafos aleatórios; Markov chains; Random graphs
APA (6th Edition):
Lopes, F. M. L. S. M. (2010). Limite do fluído para o grafo aleatório de Erdos-Rényi. (Masters Thesis). University of São Paulo. Retrieved from http://www.teses.usp.br/teses/disponiveis/45/45133/tde-05052010-155151/ ;

Universidade Estadual de Campinas
29.
Ramos, Yuri Tobias Aquiel Correa, 1991-.
Aplicações de cadeias de Markov no ensino médio: Applications of Markov chains in high school.
Degree: 2017, Universidade Estadual de Campinas
URL: http://repositorio.unicamp.br/jspui/handle/REPOSIP/322382
▼ Abstract: This study proposes two instigating problems for application in high school. Several aspects of these problems are discussed, with a proposed solution always focused on the use of Markov chains and their transition matrices. The proposed practical activities and lesson plans were planned and discussed so as to offer the teacher the best way to apply these concepts. High-school mathematics is often rigid, and the Gambler's Ruin and Cave Escape problems demonstrate, through computational, theoretical, and investigative practice, the use of a stochastic process applied with knowledge that a high-school student can master. The last chapter presents some more advanced expected-value calculations, leaving the teacher the possibility of going deeper into the so-called Drunkard's Walk, leading the more interested students to a deeper study of the transition matrices of a Markov chain
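For the ruin problem mentioned above, absorption probabilities can be read off the transition structure. The following sketch is the editor's illustration (the state space and parameters are assumptions, not taken from the dissertation): it computes, for each starting fortune, the probability of reaching N before 0 by iterating the harmonic equation h = Ph.

```python
def ruin_probabilities(N, p=0.5, iters=20000):
    """Probability of reaching fortune N before 0, from each starting state,
    for the Gambler's Ruin chain on {0, ..., N} (win one unit w.p. p)."""
    h = [0.0] * (N + 1)
    h[N] = 1.0                       # boundary conditions: h[0] = 0, h[N] = 1
    for _ in range(iters):
        for i in range(1, N):        # Gauss-Seidel sweep of h = P h
            h[i] = p * h[i + 1] + (1 - p) * h[i - 1]
    return h
```

For a fair coin the iteration recovers the classical answer h[i] = i/N; for p ≠ 1/2 it recovers (1 − (q/p)^i)/(1 − (q/p)^N) with q = 1 − p, which is the kind of worked computation the lesson plans build toward.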
Advisors/Committee Members: UNIVERSIDADE ESTADUAL DE CAMPINAS (CRUESP), Rifo, Laura Leticia Ramos, 1970- (advisor), Universidade Estadual de Campinas. Instituto de Matemática, Estatística e Computação Científica (institution), Programa de Pós-Graduação em Matemática em Rede Nacional (nameofprogram), Ruffino, Paulo Regis Caron (committee member), Peixoto, Claudia Monteiro (committee member).
Subjects/Keywords: Probabilidades; Matrizes (Matemática); Markov, Cadeias de; Probabilities; Matrices; Markov chains
APA (6th Edition):
Ramos, Y. T. A. C. (2017). Aplicações de cadeias de Markov no ensino médio: Applications of Markov chains in high school. (Thesis). Universidade Estadual de Campinas. Retrieved from http://repositorio.unicamp.br/jspui/handle/REPOSIP/322382
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
30.
Zacharias, Leena.
An adiabatic approach to analysis of time inhomogeneous Markov chains : a queueing policy application.
Degree: MS, Electrical and Computer Engineering, 2011, Oregon State University
URL: http://hdl.handle.net/1957/23293
▼ In this thesis, convergence of time inhomogeneous Markov chains is studied using an adiabatic approach. The adiabatic framework considers slowly changing systems, and the adiabatic time quantifies the time required for the change such that the final state of the system is close to some equilibrium state. This approach is used in Markov chains to measure the time to converge to a stationary distribution. Continuous-time reversible Markov chains on a finite state space with generators changing at fixed time intervals are studied. This characterization is applied to a Markovian queueing model with unknown arrival rate. The time inhomogeneous Markov chain is induced by a queueing policy dependent on uncertainties in arrival-rate estimation. It is shown that the above convergence happens with high probability after a sufficiently large time. The above evolution is studied via simulations as well and compared to the bounds suggested by the analysis. These results give the sufficient amount of time one must wait for the queue to reach a stationary, stable distribution under our queueing policy.
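The adiabatic picture can be illustrated with a toy two-state chain whose transition matrix drifts slowly. Everything below (the specific rates and schedule) is an editor's illustration, not the construction analysed in the thesis: when the per-step change is small, the running distribution stays close to the current stationary distribution.

```python
def tv(p, q):
    """Total-variation distance between two distributions."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

def track_stationary(T=2000):
    """Two-state chain whose transition matrix is updated at each step;
    returns the worst TV distance (after a short burn-in) between the
    running law and the stationary law of the current matrix."""
    mu = [1.0, 0.0]
    worst = 0.0
    for t in range(T):
        a = 0.2 + 0.6 * t / T            # P(0 -> 1), drifting slowly upward
        b = 0.3                          # P(1 -> 0), fixed
        mu = [mu[0] * (1 - a) + mu[1] * b,     # one step: mu <- mu P_t
              mu[0] * a + mu[1] * (1 - b)]
        pi = [b / (a + b), a / (a + b)]  # stationary law of the current P_t
        if t > 50:                       # skip the initial relaxation
            worst = max(worst, tv(mu, pi))
    return worst
```

Shrinking the drift (larger T) shrinks the tracking error, which is the qualitative content of an adiabatic bound: slow enough change keeps the system near equilibrium throughout.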
Advisors/Committee Members: Nguyen, Thinh (advisor), Kovchegov, Yevgeniy (committee member).
Subjects/Keywords: Markov chains; Markov processes
APA (6th Edition):
Zacharias, L. (2011). An adiabatic approach to analysis of time inhomogeneous Markov chains : a queueing policy application. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/23293