
You searched for `subject:(Markov processes)`

Showing records 1 – 30 of 614 total matches.


Search Limiters

Dates

- 2015 – 2019 (149)
- 2010 – 2014 (206)
- 2005 – 2009 (111)
- 2000 – 2004 (57)
- 1995 – 1999 (32)
- 1990 – 1994 (20)
- 1985 – 1989 (28)
- 1980 – 1984 (22)
- 1975 – 1979 (15)
- 1970 – 1974 (18)

Universities

- Georgia Tech (50)
- Hong Kong University of Science and Technology (26)
- University of Hong Kong (23)
- Michigan State University (19)
- The Ohio State University (19)
- McGill University (16)
- University of Florida (15)
- Virginia Tech (14)
- ETH Zürich (13)
- MIT (13)
- Indian Institute of Science (12)
- Oregon State University (11)
- Rutgers University (10)
- University of Texas – Austin (10)

Degrees

- PhD (203)
- MS (48)
- Docteur es (43)
- M. Phil. (12)

Languages

- English (415)
- French (29)
- Portuguese (13)
- Greek (10)

Country

- US (291)
- Canada (53)
- Hong Kong (49)
- France (43)
- Australia (29)
- UK (23)
- Netherlands (21)
- South Africa (17)
- Switzerland (14)
- India (14)
- Greece (13)
- Brazil (12)


Oregon State University

1. Alkaee Taleghan, Majid. Simulator-Defined MDP Planning with Applications in Natural Resource Management.

Degree: PhD, Computer Science, 2017, Oregon State University

URL: http://hdl.handle.net/1957/60125

► This work is inspired by problems in natural resource management centered on the challenge of invasive species. Computing optimal management policies for maintaining ecosystem sustainable…
(more)

Subjects/Keywords: Markov Decision Processes; Markov processes

APA (6th Edition):

Alkaee Taleghan, M. (2017). Simulator-Defined MDP Planning with Applications in Natural Resource Management. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60125

Chicago Manual of Style (16th Edition):

Alkaee Taleghan, Majid. “Simulator-Defined MDP Planning with Applications in Natural Resource Management.” 2017. Doctoral Dissertation, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/60125.

MLA Handbook (7th Edition):

Alkaee Taleghan, Majid. “Simulator-Defined MDP Planning with Applications in Natural Resource Management.” 2017. Web. 25 May 2019.

Vancouver:

Alkaee Taleghan M. Simulator-Defined MDP Planning with Applications in Natural Resource Management. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/60125.

Council of Science Editors:

Alkaee Taleghan M. Simulator-Defined MDP Planning with Applications in Natural Resource Management. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60125

University of Hong Kong

2. 朱冬梅; Zhu, Dongmei. Construction of non-standard Markov chain models with applications.

Degree: PhD, 2014, University of Hong Kong

URL: http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358

► In this thesis, the properties of some non-standard Markov chain models and their corresponding parameter estimation methods are investigated. Several practical applications and extensions are…
(more)

Subjects/Keywords: Markov processes

APA (6th Edition):

朱冬梅; Zhu, D. (2014). Construction of non-standard Markov chain models with applications. (Doctoral Dissertation). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358

Chicago Manual of Style (16th Edition):

朱冬梅; Zhu, Dongmei. “Construction of non-standard Markov chain models with applications.” 2014. Doctoral Dissertation, University of Hong Kong. Accessed May 25, 2019. http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358.

MLA Handbook (7th Edition):

朱冬梅; Zhu, Dongmei. “Construction of non-standard Markov chain models with applications.” 2014. Web. 25 May 2019.

Vancouver:

朱冬梅; Zhu D. Construction of non-standard Markov chain models with applications. [Internet] [Doctoral dissertation]. University of Hong Kong; 2014. [cited 2019 May 25]. Available from: http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358.

Council of Science Editors:

朱冬梅; Zhu D. Construction of non-standard Markov chain models with applications. [Doctoral Dissertation]. University of Hong Kong; 2014. Available from: http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358

University of Johannesburg

3. Swarts, Francis. Markov characterization of fading channels.

Degree: 2014, University of Johannesburg

URL: http://hdl.handle.net/10210/11665

► M. Ing. (Electrical and Electronic Engineering). This thesis investigates various methods of modeling fading communication channels. These modeling methods include various techniques for the modeling… (more)

Subjects/Keywords: Markov processes

APA (6th Edition):

Swarts, F. (2014). Markov characterization of fading channels. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/11665

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Swarts, Francis. “Markov characterization of fading channels.” 2014. Thesis, University of Johannesburg. Accessed May 25, 2019. http://hdl.handle.net/10210/11665.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Swarts, Francis. “Markov characterization of fading channels.” 2014. Web. 25 May 2019.

Vancouver:

Swarts F. Markov characterization of fading channels. [Internet] [Thesis]. University of Johannesburg; 2014. [cited 2019 May 25]. Available from: http://hdl.handle.net/10210/11665.

Note: this citation may be lacking information needed for this citation format:

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Swarts F. Markov characterization of fading channels. [Thesis]. University of Johannesburg; 2014. Available from: http://hdl.handle.net/10210/11665

Not specified: Masters Thesis or Doctoral Dissertation

University of Nevada – Las Vegas

4. Metz, Brandon John. A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation.

Degree: MS, Mathematical Sciences, 2015, University of Nevada – Las Vegas

URL: https://digitalscholarship.unlv.edu/thesesdissertations/2389

► In this thesis we discuss the positive integer solutions to the equation known as the Markoff equation x^2 + y^2 + z^2 = 3xyz.…
(more)
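The abstract above states the Markoff equation explicitly, so it can be sanity-checked numerically. The helper below is purely illustrative and not part of the thesis record; `is_markoff_triple` is a hypothetical name chosen for this sketch.

```python
# Numeric sanity check of the Markoff equation quoted above:
#   x^2 + y^2 + z^2 = 3xyz
# is_markoff_triple is a hypothetical helper written for this illustration.
def is_markoff_triple(x: int, y: int, z: int) -> bool:
    """Return True if (x, y, z) is a positive integer solution."""
    return min(x, y, z) > 0 and x * x + y * y + z * z == 3 * x * y * z

# (1, 1, 1), (1, 1, 2) and (1, 2, 5) are the first Markoff triples.
print(is_markoff_triple(1, 1, 1))  # True
print(is_markoff_triple(1, 2, 5))  # True: 1 + 4 + 25 = 30 = 3·1·2·5
print(is_markoff_triple(2, 2, 2))  # False: 12 ≠ 24
```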

Subjects/Keywords: Markov processes; Markov spectrum; Mathematics

APA (6th Edition):

Metz, B. J. (2015). A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation. (Masters Thesis). University of Nevada – Las Vegas. Retrieved from https://digitalscholarship.unlv.edu/thesesdissertations/2389

Chicago Manual of Style (16th Edition):

Metz, Brandon John. “A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation.” 2015. Masters Thesis, University of Nevada – Las Vegas. Accessed May 25, 2019. https://digitalscholarship.unlv.edu/thesesdissertations/2389.

MLA Handbook (7th Edition):

Metz, Brandon John. “A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation.” 2015. Web. 25 May 2019.

Vancouver:

Metz BJ. A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation. [Internet] [Masters thesis]. University of Nevada – Las Vegas; 2015. [cited 2019 May 25]. Available from: https://digitalscholarship.unlv.edu/thesesdissertations/2389.

Council of Science Editors:

Metz BJ. A Comparison of Recent Results on the Unicity Conjecture of the Markoff Equation. [Masters Thesis]. University of Nevada – Las Vegas; 2015. Available from: https://digitalscholarship.unlv.edu/thesesdissertations/2389

5. Haugomat, Tristan. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.

Degree: Docteur es, Mathématiques et leurs interactions, 2018, Rennes 1

URL: http://www.theses.fr/2018REN1S046

► In this thesis, we give a space localisation of the theory of Feller processes. A first objective is to obtain results that are simple and… (more)

Subjects/Keywords: Processus de Markov; Markov processes

APA (6th Edition):

Haugomat, T. (2018). Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. (Doctoral Dissertation). Rennes 1. Retrieved from http://www.theses.fr/2018REN1S046

Chicago Manual of Style (16th Edition):

Haugomat, Tristan. “Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.” 2018. Doctoral Dissertation, Rennes 1. Accessed May 25, 2019. http://www.theses.fr/2018REN1S046.

MLA Handbook (7th Edition):

Haugomat, Tristan. “Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.” 2018. Web. 25 May 2019.

Vancouver:

Haugomat T. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. [Internet] [Doctoral dissertation]. Rennes 1; 2018. [cited 2019 May 25]. Available from: http://www.theses.fr/2018REN1S046.

Council of Science Editors:

Haugomat T. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. [Doctoral Dissertation]. Rennes 1; 2018. Available from: http://www.theses.fr/2018REN1S046

University of Johannesburg

6. Zhou, Wenge. An experimental evaluation of Markov channel models.

Degree: 2012, University of Johannesburg

URL: http://hdl.handle.net/10210/7039

► M.Ing. The main contribution of this thesis can be summarized as follows. Firstly, we implemented a high speed error gap recording system, which can run… (more)

Subjects/Keywords: Markov processes - Evaluation

APA (6th Edition):

Zhou, W. (2012). An experimental evaluation of Markov channel models. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/7039

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhou, Wenge. “An experimental evaluation of Markov channel models.” 2012. Thesis, University of Johannesburg. Accessed May 25, 2019. http://hdl.handle.net/10210/7039.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhou, Wenge. “An experimental evaluation of Markov channel models.” 2012. Web. 25 May 2019.

Vancouver:

Zhou W. An experimental evaluation of Markov channel models. [Internet] [Thesis]. University of Johannesburg; 2012. [cited 2019 May 25]. Available from: http://hdl.handle.net/10210/7039.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhou W. An experimental evaluation of Markov channel models. [Thesis]. University of Johannesburg; 2012. Available from: http://hdl.handle.net/10210/7039

Not specified: Masters Thesis or Doctoral Dissertation

7. Birmpa, Panagiota. Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems.

Degree: PhD, 2018, University of Sussex

URL: http://sro.sussex.ac.uk/id/eprint/76622/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.751856

► The purpose of this PhD thesis is to study mesoscopic and macroscopic fluctuations in Interacting Particle Systems. The thesis is split into two main parts.…
(more)

Subjects/Keywords: 510; QA0274.7 Markov processes. Markov chains

APA (6th Edition):

Birmpa, P. (2018). Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/76622/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.751856

Chicago Manual of Style (16th Edition):

Birmpa, Panagiota. “Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems.” 2018. Doctoral Dissertation, University of Sussex. Accessed May 25, 2019. http://sro.sussex.ac.uk/id/eprint/76622/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.751856.

MLA Handbook (7th Edition):

Birmpa, Panagiota. “Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems.” 2018. Web. 25 May 2019.

Vancouver:

Birmpa P. Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems. [Internet] [Doctoral dissertation]. University of Sussex; 2018. [cited 2019 May 25]. Available from: http://sro.sussex.ac.uk/id/eprint/76622/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.751856.

Council of Science Editors:

Birmpa P. Quantification of mesoscopic and macroscopic fluctuations in interacting particle systems. [Doctoral Dissertation]. University of Sussex; 2018. Available from: http://sro.sussex.ac.uk/id/eprint/76622/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.751856

8. Sherborne, Neil. Non-Markovian epidemic dynamics on networks.

Degree: PhD, 2018, University of Sussex

URL: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572

► The use of networks to model the spread of epidemics through structured populations is widespread. However, epidemics on networks lead to intractable exact systems with…
(more)

Subjects/Keywords: 510; QA0274.7 Markov processes. Markov chains

APA (6th Edition):

Sherborne, N. (2018). Non-Markovian epidemic dynamics on networks. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572

Chicago Manual of Style (16th Edition):

Sherborne, Neil. “Non-Markovian epidemic dynamics on networks.” 2018. Doctoral Dissertation, University of Sussex. Accessed May 25, 2019. http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572.

MLA Handbook (7th Edition):

Sherborne, Neil. “Non-Markovian epidemic dynamics on networks.” 2018. Web. 25 May 2019.

Vancouver:

Sherborne N. Non-Markovian epidemic dynamics on networks. [Internet] [Doctoral dissertation]. University of Sussex; 2018. [cited 2019 May 25]. Available from: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572.

Council of Science Editors:

Sherborne N. Non-Markovian epidemic dynamics on networks. [Doctoral Dissertation]. University of Sussex; 2018. Available from: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572

Simon Fraser University

9. Phillips, Gary Leslie. Two classification theorems of states of discrete Markov chains.

Degree: 1970, Simon Fraser University

URL: http://summit.sfu.ca/item/4178

Subjects/Keywords: Markov processes.

APA (6th Edition):

Phillips, G. L. (1970). Two classification theorems of states of discrete Markov chains. (Thesis). Simon Fraser University. Retrieved from http://summit.sfu.ca/item/4178

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Phillips, Gary Leslie. “Two classification theorems of states of discrete Markov chains.” 1970. Thesis, Simon Fraser University. Accessed May 25, 2019. http://summit.sfu.ca/item/4178.

Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Phillips, Gary Leslie. “Two classification theorems of states of discrete Markov chains.” 1970. Web. 25 May 2019.

Vancouver:

Phillips GL. Two classification theorems of states of discrete Markov chains. [Internet] [Thesis]. Simon Fraser University; 1970. [cited 2019 May 25]. Available from: http://summit.sfu.ca/item/4178.

Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Phillips GL. Two classification theorems of states of discrete Markov chains. [Thesis]. Simon Fraser University; 1970. Available from: http://summit.sfu.ca/item/4178

Not specified: Masters Thesis or Doctoral Dissertation

Oregon State University

10. Raghavan, Aswin. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.

Degree: PhD, Computer Science, 2017, Oregon State University

URL: http://hdl.handle.net/1957/60559

► Markov Decision Processes (MDPs) are the de-facto formalism for studying sequential decision making problems with uncertainty, ranging from classical problems such as inventory control and…
(more)
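Record 10's abstract describes MDPs as the standard formalism for sequential decision making under uncertainty. As a toy illustration only (the transition and reward numbers below are invented, not taken from the dissertation), value iteration on a two-state, two-action MDP looks like this:

```python
import numpy as np

# Toy 2-state, 2-action MDP; all numbers are invented for illustration.
# P[a, s, s'] = probability of moving from state s to s' under action a.
P = np.array([
    [[0.9, 0.1], [0.2, 0.8]],  # action 0
    [[0.5, 0.5], [0.0, 1.0]],  # action 1
])
R = np.array([[1.0, 0.0],      # R[a, s] = expected immediate reward
              [0.0, 2.0]])
gamma = 0.9                    # discount factor

# Value iteration: repeatedly apply the Bellman optimality update
# V(s) <- max_a [ R(a, s) + gamma * sum_s' P(a, s, s') V(s') ].
V = np.zeros(2)
for _ in range(500):
    Q = R + gamma * (P @ V)    # Q[a, s] = one-step lookahead value
    V = Q.max(axis=0)          # greedy over actions

policy = Q.argmax(axis=0)      # optimal action per state
```

Because the update is a gamma-contraction, 500 sweeps are far more than enough for this two-state example to converge to the optimal values.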

Subjects/Keywords: Planning under uncertainty; Markov processes

APA (6th Edition):

Raghavan, A. (2017). Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60559

Chicago Manual of Style (16th Edition):

Raghavan, Aswin. “Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.” 2017. Doctoral Dissertation, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/60559.

MLA Handbook (7th Edition):

Raghavan, Aswin. “Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.” 2017. Web. 25 May 2019.

Vancouver:

Raghavan A. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/60559.

Council of Science Editors:

Raghavan A. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60559

Oregon State University

11. Yang, Minghui. A hazardous-inspection model with costly repair.

Degree: PhD, Statistics, 1988, Oregon State University

URL: http://hdl.handle.net/1957/40009

► An inspection-repair model is developed that presumes inspection is hazardous to the system being inspected. The form of the optimal inspection-repair policy is determined for…
(more)

Subjects/Keywords: Markov processes

APA (6th Edition):

Yang, M. (1988). A hazardous-inspection model with costly repair. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/40009

Chicago Manual of Style (16th Edition):

Yang, Minghui. “A hazardous-inspection model with costly repair.” 1988. Doctoral Dissertation, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/40009.

MLA Handbook (7th Edition):

Yang, Minghui. “A hazardous-inspection model with costly repair.” 1988. Web. 25 May 2019.

Vancouver:

Yang M. A hazardous-inspection model with costly repair. [Internet] [Doctoral dissertation]. Oregon State University; 1988. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/40009.

Council of Science Editors:

Yang M. A hazardous-inspection model with costly repair. [Doctoral Dissertation]. Oregon State University; 1988. Available from: http://hdl.handle.net/1957/40009

University of Hong Kong

12. 郭慈安; Kwok, Chi-on, Michael. Some results on higher order Markov Chain models.

Degree: M. Phil., Statistics, 1988, University of Hong Kong

URL: http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847


Subjects/Keywords: Markov processes.

APA (6th Edition):

郭慈安; Kwok, Chi-on, M. (1988). Some results on higher order Markov Chain models. (Masters Thesis). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847

Chicago Manual of Style (16th Edition):

郭慈安; Kwok, Chi-on, Michael. “Some results on higher order Markov Chain models.” 1988. Masters Thesis, University of Hong Kong. Accessed May 25, 2019. http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847.

MLA Handbook (7th Edition):

郭慈安; Kwok, Chi-on, Michael. “Some results on higher order Markov Chain models.” 1988. Web. 25 May 2019.

Vancouver:

郭慈安; Kwok, Chi-on M. Some results on higher order Markov Chain models. [Internet] [Masters thesis]. University of Hong Kong; 1988. [cited 2019 May 25]. Available from: http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847.

Council of Science Editors:

郭慈安; Kwok, Chi-on M. Some results on higher order Markov Chain models. [Masters Thesis]. University of Hong Kong; 1988. Available from: http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847

University of Hong Kong

13. 葉錦元; Yip, Kam-yuen, William. Simulation and inference of aggregated Markov processes.

Degree: Master of Social Sciences, Applied Statistics, 1993, University of Hong Kong

URL: http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668


Subjects/Keywords: Markov processes.

APA (6th Edition):

葉錦元; Yip, Kam-yuen, W. (1993). Simulation and inference of aggregated Markov processes. (Masters Thesis). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668

Chicago Manual of Style (16th Edition):

葉錦元; Yip, Kam-yuen, William. “Simulation and inference of aggregated Markov processes.” 1993. Masters Thesis, University of Hong Kong. Accessed May 25, 2019. http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668.

MLA Handbook (7th Edition):

葉錦元; Yip, Kam-yuen, William. “Simulation and inference of aggregated Markov processes.” 1993. Web. 25 May 2019.

Vancouver:

葉錦元; Yip, Kam-yuen W. Simulation and inference of aggregated Markov processes. [Internet] [Masters thesis]. University of Hong Kong; 1993. [cited 2019 May 25]. Available from: http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668.

Council of Science Editors:

葉錦元; Yip, Kam-yuen W. Simulation and inference of aggregated Markov processes. [Masters Thesis]. University of Hong Kong; 1993. Available from: http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668

Oregon State University

14. Hostetler, Jesse A. Monte Carlo Tree Search with Fixed and Adaptive Abstractions.

Degree: PhD, Computer Science, 2017, Oregon State University

URL: http://hdl.handle.net/1957/60635

► Monte Carlo tree search (MCTS) is a class of online planning algorithms for Markov decision processes (MDPs) and related models that has found success in…
(more)

Subjects/Keywords: Artificial intelligence; Markov processes

APA (6th Edition):

Hostetler, J. A. (2017). Monte Carlo Tree Search with Fixed and Adaptive Abstractions. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60635

Chicago Manual of Style (16th Edition):

Hostetler, Jesse A. “Monte Carlo Tree Search with Fixed and Adaptive Abstractions.” 2017. Doctoral Dissertation, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/60635.

MLA Handbook (7th Edition):

Hostetler, Jesse A. “Monte Carlo Tree Search with Fixed and Adaptive Abstractions.” 2017. Web. 25 May 2019.

Vancouver:

Hostetler JA. Monte Carlo Tree Search with Fixed and Adaptive Abstractions. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/60635.

Council of Science Editors:

Hostetler JA. Monte Carlo Tree Search with Fixed and Adaptive Abstractions. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60635

McGill University

15. Nair, G. Gopalakrishnan. Functions of Markov chains.

Degree: MS, Department of Mathematics, 1969, McGill University

URL: http://digitool.library.mcgill.ca/thesisfile46505.pdf

Subjects/Keywords: Markov processes.

APA (6th Edition):

Nair, G. G. (1969). Functions of Markov chains. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile46505.pdf

Chicago Manual of Style (16th Edition):

Nair, G Gopalakrishnan. “Functions of Markov chains.” 1969. Masters Thesis, McGill University. Accessed May 25, 2019. http://digitool.library.mcgill.ca/thesisfile46505.pdf.

MLA Handbook (7th Edition):

Nair, G Gopalakrishnan. “Functions of Markov chains.” 1969. Web. 25 May 2019.

Vancouver:

Nair GG. Functions of Markov chains. [Internet] [Masters thesis]. McGill University; 1969. [cited 2019 May 25]. Available from: http://digitool.library.mcgill.ca/thesisfile46505.pdf.

Council of Science Editors:

Nair GG. Functions of Markov chains. [Masters Thesis]. McGill University; 1969. Available from: http://digitool.library.mcgill.ca/thesisfile46505.pdf

McGill University

16. Bose, A. (Amitava). Quantum chains.

Degree: MS, Department of Mathematics, 1968, McGill University

URL: http://digitool.library.mcgill.ca/thesisfile47148.pdf

Subjects/Keywords: Markov processes.

APA (6th Edition):

Bose, A. (1968). Quantum chains. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile47148.pdf

Chicago Manual of Style (16th Edition):

Bose, A (Amitava). “Quantum chains.” 1968. Masters Thesis, McGill University. Accessed May 25, 2019. http://digitool.library.mcgill.ca/thesisfile47148.pdf.

MLA Handbook (7th Edition):

Bose, A (Amitava). “Quantum chains.” 1968. Web. 25 May 2019.

Vancouver:

Bose A. Quantum chains. [Internet] [Masters thesis]. McGill University; 1968. [cited 2019 May 25]. Available from: http://digitool.library.mcgill.ca/thesisfile47148.pdf.

Council of Science Editors:

Bose A. Quantum chains. [Masters Thesis]. McGill University; 1968. Available from: http://digitool.library.mcgill.ca/thesisfile47148.pdf

McGill University

17. Dansereau, Maryse. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.

Degree: MS, Department of Mathematics, 1974, McGill University

URL: http://digitool.library.mcgill.ca/thesisfile48492.pdf

Subjects/Keywords: Markov processes.

APA (6th Edition):

Dansereau, M. (1974). Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile48492.pdf

Chicago Manual of Style (16^{th} Edition):

Dansereau, Maryse. “Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.” 1974. Masters Thesis, McGill University. Accessed May 25, 2019. http://digitool.library.mcgill.ca/thesisfile48492.pdf.

MLA Handbook (7^{th} Edition):

Dansereau, Maryse. “Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.” 1974. Web. 25 May 2019.

Vancouver:

Dansereau M. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. [Internet] [Masters thesis]. McGill University; 1974. [cited 2019 May 25]. Available from: http://digitool.library.mcgill.ca/thesisfile48492.pdf.

Council of Science Editors:

Dansereau M. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. [Masters Thesis]. McGill University; 1974. Available from: http://digitool.library.mcgill.ca/thesisfile48492.pdf

Montana Tech

18. Culbertson, Denny Durfee. Semi-Markov chains.

Degree: MA, 1963, Montana Tech

URL: https://scholarworks.umt.edu/etd/8340

Subjects/Keywords: Markov processes.

APA (6th Edition):

Culbertson, D. D. (1963). Semi-Markov chains. (Masters Thesis). Montana Tech. Retrieved from https://scholarworks.umt.edu/etd/8340

Chicago Manual of Style (16th Edition):

Culbertson, Denny Durfee. “Semi-Markov chains.” 1963. Masters Thesis, Montana Tech. Accessed May 25, 2019. https://scholarworks.umt.edu/etd/8340.

MLA Handbook (7th Edition):

Culbertson, Denny Durfee. “Semi-Markov chains.” 1963. Web. 25 May 2019.

Vancouver:

Culbertson DD. Semi-Markov chains. [Internet] [Masters thesis]. Montana Tech; 1963. [cited 2019 May 25]. Available from: https://scholarworks.umt.edu/etd/8340.

Council of Science Editors:

Culbertson DD. Semi-Markov chains. [Masters Thesis]. Montana Tech; 1963. Available from: https://scholarworks.umt.edu/etd/8340

McGill University

19. Solvason, Diane Lynn. Maximum likelihood estimation for Markov renewal processes.

Degree: MS, Department of Mathematics, 1977, McGill University

URL: http://digitool.library.mcgill.ca/thesisfile53842.pdf

Subjects/Keywords: Markov processes.

APA (6th Edition):

Solvason, D. L. (1977). Maximum likelihood estimation for Markov renewal processes. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile53842.pdf

Chicago Manual of Style (16th Edition):

Solvason, Diane Lynn. “Maximum likelihood estimation for Markov renewal processes.” 1977. Masters Thesis, McGill University. Accessed May 25, 2019. http://digitool.library.mcgill.ca/thesisfile53842.pdf.

MLA Handbook (7th Edition):

Solvason, Diane Lynn. “Maximum likelihood estimation for Markov renewal processes.” 1977. Web. 25 May 2019.

Vancouver:

Solvason DL. Maximum likelihood estimation for Markov renewal processes. [Internet] [Masters thesis]. McGill University; 1977. [cited 2019 May 25]. Available from: http://digitool.library.mcgill.ca/thesisfile53842.pdf.

Council of Science Editors:

Solvason DL. Maximum likelihood estimation for Markov renewal processes. [Masters Thesis]. McGill University; 1977. Available from: http://digitool.library.mcgill.ca/thesisfile53842.pdf

Oregon State University

20. Ching, Brenton S. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.

Degree: MS, Nuclear Engineering, 2000, Oregon State University

URL: http://hdl.handle.net/1957/32893

► The Adams-Larsen-Pomraning coupled transport model has been used to describe neutral particle transport in binary stochastic mixtures. Here, the mixing statistics are considered to be…
(more)

Subjects/Keywords: Markov processes

APA (6th Edition):

Ching, B. S. (2000). Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/32893

Chicago Manual of Style (16th Edition):

Ching, Brenton S. “Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.” 2000. Masters Thesis, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/32893.

MLA Handbook (7th Edition):

Ching, Brenton S. “Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.” 2000. Web. 25 May 2019.

Vancouver:

Ching BS. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. [Internet] [Masters thesis]. Oregon State University; 2000. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/32893.

Council of Science Editors:

Ching BS. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. [Masters Thesis]. Oregon State University; 2000. Available from: http://hdl.handle.net/1957/32893

Oregon State University

21. Ott, Melvin Leroy. Optimal policies in continuous Markov decision chains.

Degree: PhD, Statistics, 1974, Oregon State University

URL: http://hdl.handle.net/1957/44137

► For continuous time, finite state and action, Markov decision chains, optimal policies are studied; (i) a procedure for transforming the terminal reward vector is given…
(more)

Subjects/Keywords: Markov processes

APA (6th Edition):

Ott, M. L. (1974). Optimal policies in continuous Markov decision chains. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/44137

Chicago Manual of Style (16th Edition):

Ott, Melvin Leroy. “Optimal policies in continuous Markov decision chains.” 1974. Doctoral Dissertation, Oregon State University. Accessed May 25, 2019. http://hdl.handle.net/1957/44137.

MLA Handbook (7th Edition):

Ott, Melvin Leroy. “Optimal policies in continuous Markov decision chains.” 1974. Web. 25 May 2019.

Vancouver:

Ott ML. Optimal policies in continuous Markov decision chains. [Internet] [Doctoral dissertation]. Oregon State University; 1974. [cited 2019 May 25]. Available from: http://hdl.handle.net/1957/44137.

Council of Science Editors:

Ott ML. Optimal policies in continuous Markov decision chains. [Doctoral Dissertation]. Oregon State University; 1974. Available from: http://hdl.handle.net/1957/44137

University of British Columbia

22. Salisbury, Thomas S. Construction of strong Markov processes through excursions, and a related Martin boundary.

Degree: 1983, University of British Columbia

URL: http://hdl.handle.net/2429/24354

► For certain Markov processes, K. Ito has defined the Poisson point process of excursions away from a fixed point. The law of this process is…
(more)

Subjects/Keywords: Markov processes

APA (6th Edition):

Salisbury, T. S. (1983). Construction of strong Markov processes through excursions, and a related Martin boundary. (Thesis). University of British Columbia. Retrieved from http://hdl.handle.net/2429/24354

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Salisbury, Thomas S. “Construction of strong Markov processes through excursions, and a related Martin boundary.” 1983. Thesis, University of British Columbia. Accessed May 25, 2019. http://hdl.handle.net/2429/24354.

MLA Handbook (7th Edition):

Salisbury, Thomas S. “Construction of strong Markov processes through excursions, and a related Martin boundary.” 1983. Web. 25 May 2019.

Vancouver:

Salisbury TS. Construction of strong Markov processes through excursions, and a related Martin boundary. [Internet] [Thesis]. University of British Columbia; 1983. [cited 2019 May 25]. Available from: http://hdl.handle.net/2429/24354.

Council of Science Editors:

Salisbury TS. Construction of strong Markov processes through excursions, and a related Martin boundary. [Thesis]. University of British Columbia; 1983. Available from: http://hdl.handle.net/2429/24354

Columbia University

23. Ruiz Lacedelli, Octavio. Essays in information relaxations and scenario analysis for partially observable settings.

Degree: 2019, Columbia University

URL: https://doi.org/10.7916/d8-mwkk-mr35

► This dissertation consists of three main essays in which we study important problems in engineering and finance. In the first part of this dissertation, we…
(more)

Subjects/Keywords: Operations research; Finance; Markov processes

APA (6th Edition):

Ruiz Lacedelli, O. (2019). Essays in information relaxations and scenario analysis for partially observable settings. (Doctoral Dissertation). Columbia University. Retrieved from https://doi.org/10.7916/d8-mwkk-mr35

Chicago Manual of Style (16th Edition):

Ruiz Lacedelli, Octavio. “Essays in information relaxations and scenario analysis for partially observable settings.” 2019. Doctoral Dissertation, Columbia University. Accessed May 25, 2019. https://doi.org/10.7916/d8-mwkk-mr35.

MLA Handbook (7th Edition):

Ruiz Lacedelli, Octavio. “Essays in information relaxations and scenario analysis for partially observable settings.” 2019. Web. 25 May 2019.

Vancouver:

Ruiz Lacedelli O. Essays in information relaxations and scenario analysis for partially observable settings. [Internet] [Doctoral dissertation]. Columbia University; 2019. [cited 2019 May 25]. Available from: https://doi.org/10.7916/d8-mwkk-mr35.

Council of Science Editors:

Ruiz Lacedelli O. Essays in information relaxations and scenario analysis for partially observable settings. [Doctoral Dissertation]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-mwkk-mr35

Western Carolina University

24. Shouse, Kirke. Activity recognition using Grey-Markov model.

Degree: 2011, Western Carolina University

URL: http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032

► Activity Recognition (AR) is a process of identifying actions and goals of one or more agents of interest. AR techniques have been applied to both…
(more)

Subjects/Keywords: Human activity recognition; Markov processes

APA (6th Edition):

Shouse, K. (2011). Activity recognition using Grey-Markov model. (Masters Thesis). Western Carolina University. Retrieved from http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032

Chicago Manual of Style (16th Edition):

Shouse, Kirke. “Activity recognition using Grey-Markov model.” 2011. Masters Thesis, Western Carolina University. Accessed May 25, 2019. http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032.

MLA Handbook (7th Edition):

Shouse, Kirke. “Activity recognition using Grey-Markov model.” 2011. Web. 25 May 2019.

Vancouver:

Shouse K. Activity recognition using Grey-Markov model. [Internet] [Masters thesis]. Western Carolina University; 2011. [cited 2019 May 25]. Available from: http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032.

Council of Science Editors:

Shouse K. Activity recognition using Grey-Markov model. [Masters Thesis]. Western Carolina University; 2011. Available from: http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032

Rutgers University

25. Parag, Toufiq U, 1979-. Labeling hypergraph-structured data using Markov network.

Degree: PhD, Computer Science, 2011, Rutgers University

URL: http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655

► The goal of this dissertation is to label datapoints into two groups utilizing higher order information among them. More specifically, given likelihood (or error) measures… (more)

Subjects/Keywords: Markov processes; Computer science – Mathematics

Record Details Similar Records

❌

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6^{th} Edition):

Parag, Toufiq U, 1. (2011). Labeling hypergraph-structured data using Markov network. (Doctoral Dissertation). Rutgers University. Retrieved from http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655

Chicago Manual of Style (16^{th} Edition):

Parag, Toufiq U, 1979-. “Labeling hypergraph-structured data using Markov network.” 2011. Doctoral Dissertation, Rutgers University. Accessed May 25, 2019. http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655.

MLA Handbook (7^{th} Edition):

Parag, Toufiq U, 1979-. “Labeling hypergraph-structured data using Markov network.” 2011. Web. 25 May 2019.

Vancouver:

Parag TU. Labeling hypergraph-structured data using Markov network. [Internet] [Doctoral dissertation]. Rutgers University; 2011. [cited 2019 May 25]. Available from: http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655.

Council of Science Editors:

Parag TU. Labeling hypergraph-structured data using Markov network. [Doctoral Dissertation]. Rutgers University; 2011. Available from: http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655

University of Johannesburg

26. Marcon, Sinclair Antony. Markov chains : a graph theoretical approach.

Degree: 2013, University of Johannesburg

URL: http://hdl.handle.net/10210/8363

► M.Sc. (Mathematics)

In chapter 1, we give the reader some background concerning digraphs that are used in the discussion of Markov chains; namely, their Markov…
(more)

Subjects/Keywords: Markov processes; Graph theory

APA (6th Edition):

Marcon, S. A. (2013). Markov chains : a graph theoretical approach. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/8363

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Marcon, Sinclair Antony. “Markov chains : a graph theoretical approach.” 2013. Thesis, University of Johannesburg. Accessed May 25, 2019. http://hdl.handle.net/10210/8363.

MLA Handbook (7th Edition):

Marcon, Sinclair Antony. “Markov chains : a graph theoretical approach.” 2013. Web. 25 May 2019.

Vancouver:

Marcon SA. Markov chains : a graph theoretical approach. [Internet] [Thesis]. University of Johannesburg; 2013. [cited 2019 May 25]. Available from: http://hdl.handle.net/10210/8363.

Council of Science Editors:

Marcon SA. Markov chains : a graph theoretical approach. [Thesis]. University of Johannesburg; 2013. Available from: http://hdl.handle.net/10210/8363

University of Adelaide

27. Falzon, Lucia. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.

Degree: 1997, University of Adelaide

URL: http://hdl.handle.net/2440/19096

The subject of this thesis is the joint probability density of the accumulated sojourn time in each state of a Markov process when the initial state is known.
*Advisors/Committee Members: Dept. of Applied Mathematics (school).*

Subjects/Keywords: Markov processes

APA (6th Edition):

Falzon, L. (1997). On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/19096

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Falzon, Lucia. “On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.” 1997. Thesis, University of Adelaide. Accessed May 25, 2019. http://hdl.handle.net/2440/19096.

MLA Handbook (7th Edition):

Falzon, Lucia. “On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.” 1997. Web. 25 May 2019.

Vancouver:

Falzon L. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. [Internet] [Thesis]. University of Adelaide; 1997. [cited 2019 May 25]. Available from: http://hdl.handle.net/2440/19096.

Council of Science Editors:

Falzon L. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. [Thesis]. University of Adelaide; 1997. Available from: http://hdl.handle.net/2440/19096

University of Adelaide

28. Setiawaty, Berlian. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.

Degree: 1999, University of Adelaide

URL: http://hdl.handle.net/2440/19570

In this thesis a maximum compensated log-likelihood method is proposed for estimating the order of general hidden Markov models.
*Advisors/Committee Members: Dept. of Applied Mathematics (school).*

Subjects/Keywords: Markov processes.

APA (6th Edition):

Setiawaty, B. (1999). Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/19570

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Setiawaty, Berlian. “Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.” 1999. Thesis, University of Adelaide. Accessed May 25, 2019. http://hdl.handle.net/2440/19570.

MLA Handbook (7th Edition):

Setiawaty, Berlian. “Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.” 1999. Web. 25 May 2019.

Vancouver:

Setiawaty B. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. [Internet] [Thesis]. University of Adelaide; 1999. [cited 2019 May 25]. Available from: http://hdl.handle.net/2440/19570.

Council of Science Editors:

Setiawaty B. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. [Thesis]. University of Adelaide; 1999. Available from: http://hdl.handle.net/2440/19570

29. Vanessa Rocha, Andréa. Substitution operators.

Degree: 2009, Universidade Federal de Pernambuco

URL: http://repositorio.ufpe.br/handle/123456789/7073

► We study a new kind of discrete-time stochastic process, which we call substitution processes. Since time is discrete, we can…
(more)

Subjects/Keywords: Substitution operators; Substitution Processes; Markov Processes.

APA (6th Edition):

Vanessa Rocha, A. (2009). Substitution operators. (Thesis). Universidade Federal de Pernambuco. Retrieved from http://repositorio.ufpe.br/handle/123456789/7073

Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Vanessa Rocha, Andréa. “Substitution operators.” 2009. Thesis, Universidade Federal de Pernambuco. Accessed May 25, 2019. http://repositorio.ufpe.br/handle/123456789/7073.

MLA Handbook (7th Edition):

Vanessa Rocha, Andréa. “Substitution operators.” 2009. Web. 25 May 2019.

Vancouver:

Vanessa Rocha A. Substitution operators. [Internet] [Thesis]. Universidade Federal de Pernambuco; 2009. [cited 2019 May 25]. Available from: http://repositorio.ufpe.br/handle/123456789/7073.

Council of Science Editors:

Vanessa Rocha A. Substitution operators. [Thesis]. Universidade Federal de Pernambuco; 2009. Available from: http://repositorio.ufpe.br/handle/123456789/7073

Rutgers University

30. Puranam, Srinivasa Kartikeya, 1981-. Stochastic analysis of bidding in sequential auctions and related problems.

Degree: PhD, Management, 2010, Rutgers University

URL: http://hdl.rutgers.edu/1782.1/rucore10002600001.ETD.000056033

► In this thesis we study bidding in sequential auctions and taboo optimization criteria for Markov Decision Processes. In the second chapter we study the…
(more)

Subjects/Keywords: Markov processes; Stochastic processes; Stochastic analysis

APA (6th Edition):

Puranam, Srinivasa Kartikeya, 1981-. (2010). Stochastic analysis of bidding in sequential auctions and related problems. (Doctoral Dissertation). Rutgers University. Retrieved from http://hdl.rutgers.edu/1782.1/rucore10002600001.ETD.000056033

Chicago Manual of Style (16th Edition):

Puranam, Srinivasa Kartikeya, 1981-. “Stochastic analysis of bidding in sequential auctions and related problems.” 2010. Doctoral Dissertation, Rutgers University. Accessed May 25, 2019. http://hdl.rutgers.edu/1782.1/rucore10002600001.ETD.000056033.

MLA Handbook (7th Edition):

Puranam, Srinivasa Kartikeya, 1981-. “Stochastic analysis of bidding in sequential auctions and related problems.” 2010. Web. 25 May 2019.

Vancouver:

Puranam SK. Stochastic analysis of bidding in sequential auctions and related problems. [Internet] [Doctoral dissertation]. Rutgers University; 2010. [cited 2019 May 25]. Available from: http://hdl.rutgers.edu/1782.1/rucore10002600001.ETD.000056033.

Council of Science Editors:

Puranam SK. Stochastic analysis of bidding in sequential auctions and related problems. [Doctoral Dissertation]. Rutgers University; 2010. Available from: http://hdl.rutgers.edu/1782.1/rucore10002600001.ETD.000056033