
You searched for subject:(Markov processes). Showing records 1 – 30 of 647 total matches.



Oregon State University

1. Alkaee Taleghan, Majid. Simulator-Defined MDP Planning with Applications in Natural Resource Management.

Degree: PhD, Computer Science, 2017, Oregon State University

 This work is inspired by problems in natural resource management centered on the challenge of invasive species. Computing optimal management policies for maintaining ecosystem sustainable… (more)

Subjects/Keywords: Markov Decision Processes; Markov processes


APA (6th Edition):

Alkaee Taleghan, M. (2017). Simulator-Defined MDP Planning with Applications in Natural Resource Management. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60125

Chicago Manual of Style (16th Edition):

Alkaee Taleghan, Majid. “Simulator-Defined MDP Planning with Applications in Natural Resource Management.” 2017. Doctoral Dissertation, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/60125.

MLA Handbook (7th Edition):

Alkaee Taleghan, Majid. “Simulator-Defined MDP Planning with Applications in Natural Resource Management.” 2017. Web. 28 Feb 2020.

Vancouver:

Alkaee Taleghan M. Simulator-Defined MDP Planning with Applications in Natural Resource Management. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/60125.

Council of Science Editors:

Alkaee Taleghan M. Simulator-Defined MDP Planning with Applications in Natural Resource Management. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60125


University of Hong Kong

2. 朱冬梅; Zhu, Dongmei. Construction of non-standard Markov chain models with applications.

Degree: PhD, 2014, University of Hong Kong

In this thesis, the properties of some non-standard Markov chain models and their corresponding parameter estimation methods are investigated. Several practical applications and extensions are… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

朱冬梅; Zhu, D. (2014). Construction of non-standard Markov chain models with applications. (Doctoral Dissertation). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358

Chicago Manual of Style (16th Edition):

朱冬梅; Zhu, Dongmei. “Construction of non-standard Markov chain models with applications.” 2014. Doctoral Dissertation, University of Hong Kong. Accessed February 28, 2020. http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358.

MLA Handbook (7th Edition):

朱冬梅; Zhu, Dongmei. “Construction of non-standard Markov chain models with applications.” 2014. Web. 28 Feb 2020.

Vancouver:

朱冬梅; Zhu D. Construction of non-standard Markov chain models with applications. [Internet] [Doctoral dissertation]. University of Hong Kong; 2014. [cited 2020 Feb 28]. Available from: http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358.

Council of Science Editors:

朱冬梅; Zhu D. Construction of non-standard Markov chain models with applications. [Doctoral Dissertation]. University of Hong Kong; 2014. Available from: http://dx.doi.org/10.5353/th_b5295517 ; http://hdl.handle.net/10722/202358


University of Johannesburg

3. Swarts, Francis. Markov characterization of fading channels.

Degree: 2014, University of Johannesburg

M. Ing. (Electrical and Electronic Engineering)

This thesis investigates various methods of modeling fading communication channels. These modeling methods include various techniques for the modeling… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Swarts, F. (2014). Markov characterization of fading channels. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/11665

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Swarts, Francis. “Markov characterization of fading channels.” 2014. Thesis, University of Johannesburg. Accessed February 28, 2020. http://hdl.handle.net/10210/11665.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Swarts, Francis. “Markov characterization of fading channels.” 2014. Web. 28 Feb 2020.

Vancouver:

Swarts F. Markov characterization of fading channels. [Internet] [Thesis]. University of Johannesburg; 2014. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/10210/11665.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Swarts F. Markov characterization of fading channels. [Thesis]. University of Johannesburg; 2014. Available from: http://hdl.handle.net/10210/11665

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rutgers University

4. Bo, Xiao. A strategy for evaluating the quality of trace alignment tools based on a Markov model.

Degree: MS, Electrical and Computer Engineering, 2014, Rutgers University

 Trace alignment of event logs is used to understand and improve business processes. A key missing component of current approaches for performing trace alignment is… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Bo, X. (2014). A strategy for evaluating the quality of trace alignment tools based on a Markov model. (Masters Thesis). Rutgers University. Retrieved from https://rucore.libraries.rutgers.edu/rutgers-lib/45206/

Chicago Manual of Style (16th Edition):

Bo, Xiao. “A strategy for evaluating the quality of trace alignment tools based on a Markov model.” 2014. Masters Thesis, Rutgers University. Accessed February 28, 2020. https://rucore.libraries.rutgers.edu/rutgers-lib/45206/.

MLA Handbook (7th Edition):

Bo, Xiao. “A strategy for evaluating the quality of trace alignment tools based on a Markov model.” 2014. Web. 28 Feb 2020.

Vancouver:

Bo X. A strategy for evaluating the quality of trace alignment tools based on a Markov model. [Internet] [Masters thesis]. Rutgers University; 2014. [cited 2020 Feb 28]. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/45206/.

Council of Science Editors:

Bo X. A strategy for evaluating the quality of trace alignment tools based on a Markov model. [Masters Thesis]. Rutgers University; 2014. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/45206/

5. Haugomat, Tristan. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.

Degree: Docteur es, Mathématiques et leurs interactions, 2018, Rennes 1

In this thesis, we give a space localisation of the theory of Feller processes. A first objective is to obtain results that are simple and… (more)

Subjects/Keywords: Processus de Markov; Markov processes


APA (6th Edition):

Haugomat, T. (2018). Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. (Doctoral Dissertation). Rennes 1. Retrieved from http://www.theses.fr/2018REN1S046

Chicago Manual of Style (16th Edition):

Haugomat, Tristan. “Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.” 2018. Doctoral Dissertation, Rennes 1. Accessed February 28, 2020. http://www.theses.fr/2018REN1S046.

MLA Handbook (7th Edition):

Haugomat, Tristan. “Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes.” 2018. Web. 28 Feb 2020.

Vancouver:

Haugomat T. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. [Internet] [Doctoral dissertation]. Rennes 1; 2018. [cited 2020 Feb 28]. Available from: http://www.theses.fr/2018REN1S046.

Council of Science Editors:

Haugomat T. Localisation en espace de la propriété de Feller avec application aux processus de type Lévy : Space localisation of the Feller property with application to Lévy-type processes. [Doctoral Dissertation]. Rennes 1; 2018. Available from: http://www.theses.fr/2018REN1S046

6. Ashton, Stephen. The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks.

Degree: PhD, 2019, University of Sussex

 In this thesis, I provide a statistical analysis of high-resolution contact pattern data within primary and secondary schools as collected by the SocioPatterns collaboration. Students… (more)

Subjects/Keywords: QA0274.7 Markov processes. Markov chains


APA (6th Edition):

Ashton, S. (2019). The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603

Chicago Manual of Style (16th Edition):

Ashton, Stephen. “The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks.” 2019. Doctoral Dissertation, University of Sussex. Accessed February 28, 2020. http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603.

MLA Handbook (7th Edition):

Ashton, Stephen. “The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks.” 2019. Web. 28 Feb 2020.

Vancouver:

Ashton S. The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks. [Internet] [Doctoral dissertation]. University of Sussex; 2019. [cited 2020 Feb 28]. Available from: http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603.

Council of Science Editors:

Ashton S. The mathematics of human contact : developing stochastic algorithms for the generation of time-varying dynamic human contact networks. [Doctoral Dissertation]. University of Sussex; 2019. Available from: http://sro.sussex.ac.uk/id/eprint/88850/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.793603


University of Johannesburg

7. Zhou, Wenge. An experimental evaluation of Markov channel models.

Degree: 2012, University of Johannesburg

M.Ing.

The main contribution of this thesis can be summarized as follows. Firstly, we implemented a high speed error gap recording system, which can run… (more)

Subjects/Keywords: Markov processes - Evaluation


APA (6th Edition):

Zhou, W. (2012). An experimental evaluation of Markov channel models. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/7039

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Zhou, Wenge. “An experimental evaluation of Markov channel models.” 2012. Thesis, University of Johannesburg. Accessed February 28, 2020. http://hdl.handle.net/10210/7039.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Zhou, Wenge. “An experimental evaluation of Markov channel models.” 2012. Web. 28 Feb 2020.

Vancouver:

Zhou W. An experimental evaluation of Markov channel models. [Internet] [Thesis]. University of Johannesburg; 2012. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/10210/7039.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Zhou W. An experimental evaluation of Markov channel models. [Thesis]. University of Johannesburg; 2012. Available from: http://hdl.handle.net/10210/7039

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

8. Sherborne, Neil. Non-Markovian epidemic dynamics on networks.

Degree: PhD, 2018, University of Sussex

 The use of networks to model the spread of epidemics through structured populations is widespread. However, epidemics on networks lead to intractable exact systems with… (more)

Subjects/Keywords: 510; QA0274.7 Markov processes. Markov chains


APA (6th Edition):

Sherborne, N. (2018). Non-Markovian epidemic dynamics on networks. (Doctoral Dissertation). University of Sussex. Retrieved from http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572

Chicago Manual of Style (16th Edition):

Sherborne, Neil. “Non-Markovian epidemic dynamics on networks.” 2018. Doctoral Dissertation, University of Sussex. Accessed February 28, 2020. http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572.

MLA Handbook (7th Edition):

Sherborne, Neil. “Non-Markovian epidemic dynamics on networks.” 2018. Web. 28 Feb 2020.

Vancouver:

Sherborne N. Non-Markovian epidemic dynamics on networks. [Internet] [Doctoral dissertation]. University of Sussex; 2018. [cited 2020 Feb 28]. Available from: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572.

Council of Science Editors:

Sherborne N. Non-Markovian epidemic dynamics on networks. [Doctoral Dissertation]. University of Sussex; 2018. Available from: http://sro.sussex.ac.uk/id/eprint/79084/ ; https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.759572


Oregon State University

9. Yang, Minghui. A hazardous-inspection model with costly repair.

Degree: PhD, Statistics, 1988, Oregon State University

 An inspection-repair model is developed that presumes inspection is hazardous to the system being inspected. The form of the optimal inspection-repair policy is determined for… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Yang, M. (1988). A hazardous-inspection model with costly repair. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/40009

Chicago Manual of Style (16th Edition):

Yang, Minghui. “A hazardous-inspection model with costly repair.” 1988. Doctoral Dissertation, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/40009.

MLA Handbook (7th Edition):

Yang, Minghui. “A hazardous-inspection model with costly repair.” 1988. Web. 28 Feb 2020.

Vancouver:

Yang M. A hazardous-inspection model with costly repair. [Internet] [Doctoral dissertation]. Oregon State University; 1988. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/40009.

Council of Science Editors:

Yang M. A hazardous-inspection model with costly repair. [Doctoral Dissertation]. Oregon State University; 1988. Available from: http://hdl.handle.net/1957/40009


University of Hong Kong

10. 郭慈安; Kwok, Chi-on, Michael. Some results on higher order Markov Chain models.

Degree: M. Phil., 1988, University of Hong Kong

Statistics

Master of Philosophy

Subjects/Keywords: Markov processes.


APA (6th Edition):

郭慈安; Kwok, Chi-on, M. (1988). Some results on higher order Markov Chain models. (Masters Thesis). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847

Chicago Manual of Style (16th Edition):

郭慈安; Kwok, Chi-on, Michael. “Some results on higher order Markov Chain models.” 1988. Masters Thesis, University of Hong Kong. Accessed February 28, 2020. http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847.

MLA Handbook (7th Edition):

郭慈安; Kwok, Chi-on, Michael. “Some results on higher order Markov Chain models.” 1988. Web. 28 Feb 2020.

Vancouver:

郭慈安; Kwok, Chi-on M. Some results on higher order Markov Chain models. [Internet] [Masters thesis]. University of Hong Kong; 1988. [cited 2020 Feb 28]. Available from: http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847.

Council of Science Editors:

郭慈安; Kwok, Chi-on M. Some results on higher order Markov Chain models. [Masters Thesis]. University of Hong Kong; 1988. Available from: http://dx.doi.org/10.5353/th_b3120865 ; http://hdl.handle.net/10722/32847


University of Hong Kong

11. 葉錦元; Yip, Kam-yuen, William. Simulation and inference of aggregated Markov processes.

Degree: Master of Social Sciences, 1993, University of Hong Kong

Applied Statistics

Master of Social Sciences

Subjects/Keywords: Markov processes.


APA (6th Edition):

葉錦元; Yip, Kam-yuen, W. (1993). Simulation and inference of aggregated Markov processes. (Masters Thesis). University of Hong Kong. Retrieved from http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668

Chicago Manual of Style (16th Edition):

葉錦元; Yip, Kam-yuen, William. “Simulation and inference of aggregated Markov processes.” 1993. Masters Thesis, University of Hong Kong. Accessed February 28, 2020. http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668.

MLA Handbook (7th Edition):

葉錦元; Yip, Kam-yuen, William. “Simulation and inference of aggregated Markov processes.” 1993. Web. 28 Feb 2020.

Vancouver:

葉錦元; Yip, Kam-yuen W. Simulation and inference of aggregated Markov processes. [Internet] [Masters thesis]. University of Hong Kong; 1993. [cited 2020 Feb 28]. Available from: http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668.

Council of Science Editors:

葉錦元; Yip, Kam-yuen W. Simulation and inference of aggregated Markov processes. [Masters Thesis]. University of Hong Kong; 1993. Available from: http://dx.doi.org/10.5353/th_b3197754 ; http://hdl.handle.net/10722/28668


University of Johannesburg

12. Marcon, Sinclair Antony. Markov chains : a graph theoretical approach.

Degree: 2013, University of Johannesburg

M.Sc. (Mathematics)

In chapter 1, we give the reader some background concerning digraphs that are used in the discussion of Markov chains; namely, their Markov(more)
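
As a generic aside (not material from this thesis), the digraph view mentioned in the abstract can be made concrete: a finite Markov chain's row-stochastic transition matrix defines a weighted directed graph with an arc i → j whenever the one-step probability P[i][j] is positive. The sketch below uses an invented 3-state matrix purely for illustration.

```python
# A hedged sketch of the digraph view of a finite Markov chain.
# The 3-state row-stochastic matrix P is a made-up example, not data or code
# from the thesis.

P = [
    [0.5, 0.5, 0.0],   # transition probabilities out of state 0
    [0.2, 0.3, 0.5],   # out of state 1
    [0.0, 0.4, 0.6],   # out of state 2
]

# Adjacency-list digraph: an arc i -> j exists whenever P[i][j] > 0, and the
# arc is weighted by that transition probability.
digraph = {
    i: [(j, p) for j, p in enumerate(row) if p > 0]
    for i, row in enumerate(P)
}

for state, arcs in digraph.items():
    for target, prob in arcs:
        print(f"{state} -> {target}  (p = {prob})")
```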

Subjects/Keywords: Markov processes; Graph theory


APA (6th Edition):

Marcon, S. A. (2013). Markov chains : a graph theoretical approach. (Thesis). University of Johannesburg. Retrieved from http://hdl.handle.net/10210/8363

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Marcon, Sinclair Antony. “Markov chains : a graph theoretical approach.” 2013. Thesis, University of Johannesburg. Accessed February 28, 2020. http://hdl.handle.net/10210/8363.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Marcon, Sinclair Antony. “Markov chains : a graph theoretical approach.” 2013. Web. 28 Feb 2020.

Vancouver:

Marcon SA. Markov chains : a graph theoretical approach. [Internet] [Thesis]. University of Johannesburg; 2013. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/10210/8363.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Marcon SA. Markov chains : a graph theoretical approach. [Thesis]. University of Johannesburg; 2013. Available from: http://hdl.handle.net/10210/8363

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Oregon State University

13. Raghavan, Aswin. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.

Degree: PhD, Computer Science, 2017, Oregon State University

Markov Decision Processes (MDPs) are the de-facto formalism for studying sequential decision making problems with uncertainty, ranging from classical problems such as inventory control and… (more)
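
For readers unfamiliar with the formalism named in this abstract, the sketch below runs plain value iteration on an invented two-state, two-action MDP; it only illustrates what an MDP and its Bellman backup are, and is in no way the factored-space planning method developed in the dissertation.

```python
# A minimal, generic value-iteration sketch for a flat (non-factored) MDP.
# All states, actions, transition probabilities, and rewards below are
# invented toy values; they illustrate the basic MDP formalism only.

# transitions[s][a] = list of (next_state, probability); rewards[s][a] = immediate reward
transitions = {
    0: {"stay": [(0, 1.0)], "go": [(1, 0.9), (0, 0.1)]},
    1: {"stay": [(1, 1.0)], "go": [(0, 1.0)]},
}
rewards = {
    0: {"stay": 0.0, "go": 1.0},
    1: {"stay": 2.0, "go": 0.0},
}
gamma = 0.95  # discount factor

# Repeated Bellman optimality backups: V(s) <- max_a [ r(s,a) + gamma * E[V(s')] ]
V = {s: 0.0 for s in transitions}
for _ in range(500):
    V = {
        s: max(
            rewards[s][a] + gamma * sum(p * V[s2] for s2, p in transitions[s][a])
            for a in transitions[s]
        )
        for s in transitions
    }

print({s: round(v, 2) for s, v in V.items()})  # approximate optimal values
```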

Subjects/Keywords: Planning under uncertainty; Markov processes


APA (6th Edition):

Raghavan, A. (2017). Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60559

Chicago Manual of Style (16th Edition):

Raghavan, Aswin. “Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.” 2017. Doctoral Dissertation, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/60559.

MLA Handbook (7th Edition):

Raghavan, Aswin. “Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces.” 2017. Web. 28 Feb 2020.

Vancouver:

Raghavan A. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/60559.

Council of Science Editors:

Raghavan A. Domain-Independent Planning for Markov Decision Processes with Factored State and Action Spaces. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60559


Oregon State University

14. Hostetler, Jesse A. Monte Carlo Tree Search with Fixed and Adaptive Abstractions.

Degree: PhD, Computer Science, 2017, Oregon State University

 Monte Carlo tree search (MCTS) is a class of online planning algorithms for Markov decision processes (MDPs) and related models that has found success in… (more)

Subjects/Keywords: Artificial intelligence; Markov processes


APA (6th Edition):

Hostetler, J. A. (2017). Monte Carlo Tree Search with Fixed and Adaptive Abstractions. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/60635

Chicago Manual of Style (16th Edition):

Hostetler, Jesse A. “Monte Carlo Tree Search with Fixed and Adaptive Abstractions.” 2017. Doctoral Dissertation, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/60635.

MLA Handbook (7th Edition):

Hostetler, Jesse A. “Monte Carlo Tree Search with Fixed and Adaptive Abstractions.” 2017. Web. 28 Feb 2020.

Vancouver:

Hostetler JA. Monte Carlo Tree Search with Fixed and Adaptive Abstractions. [Internet] [Doctoral dissertation]. Oregon State University; 2017. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/60635.

Council of Science Editors:

Hostetler JA. Monte Carlo Tree Search with Fixed and Adaptive Abstractions. [Doctoral Dissertation]. Oregon State University; 2017. Available from: http://hdl.handle.net/1957/60635


McGill University

15. Nair, G. Gopalakrishnan. Functions of Markov chains.

Degree: MS, Department of Mathematics, 1969, McGill University

Subjects/Keywords: Markov processes.


APA (6th Edition):

Nair, G. G. (1969). Functions of Markov chains. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile46505.pdf

Chicago Manual of Style (16th Edition):

Nair, G Gopalakrishnan. “Functions of Markov chains.” 1969. Masters Thesis, McGill University. Accessed February 28, 2020. http://digitool.library.mcgill.ca/thesisfile46505.pdf.

MLA Handbook (7th Edition):

Nair, G Gopalakrishnan. “Functions of Markov chains.” 1969. Web. 28 Feb 2020.

Vancouver:

Nair GG. Functions of Markov chains. [Internet] [Masters thesis]. McGill University; 1969. [cited 2020 Feb 28]. Available from: http://digitool.library.mcgill.ca/thesisfile46505.pdf.

Council of Science Editors:

Nair GG. Functions of Markov chains. [Masters Thesis]. McGill University; 1969. Available from: http://digitool.library.mcgill.ca/thesisfile46505.pdf


McGill University

16. Bose, A. (Amitava). Quantum chains.

Degree: MS, Department of Mathematics, 1968, McGill University

Subjects/Keywords: Markov processes.


APA (6th Edition):

Bose, A. (1968). Quantum chains. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile47148.pdf

Chicago Manual of Style (16th Edition):

Bose, A (Amitava). “Quantum chains.” 1968. Masters Thesis, McGill University. Accessed February 28, 2020. http://digitool.library.mcgill.ca/thesisfile47148.pdf.

MLA Handbook (7th Edition):

Bose, A (Amitava). “Quantum chains.” 1968. Web. 28 Feb 2020.

Vancouver:

Bose A. Quantum chains. [Internet] [Masters thesis]. McGill University; 1968. [cited 2020 Feb 28]. Available from: http://digitool.library.mcgill.ca/thesisfile47148.pdf.

Council of Science Editors:

Bose A. Quantum chains. [Masters Thesis]. McGill University; 1968. Available from: http://digitool.library.mcgill.ca/thesisfile47148.pdf


McGill University

17. Dansereau, Maryse. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.

Degree: MS, Department of Mathematics, 1974, McGill University

Subjects/Keywords: Markov processes.


APA (6th Edition):

Dansereau, M. (1974). Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile48492.pdf

Chicago Manual of Style (16th Edition):

Dansereau, Maryse. “Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.” 1974. Masters Thesis, McGill University. Accessed February 28, 2020. http://digitool.library.mcgill.ca/thesisfile48492.pdf.

MLA Handbook (7th Edition):

Dansereau, Maryse. “Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante.” 1974. Web. 28 Feb 2020.

Vancouver:

Dansereau M. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. [Internet] [Masters thesis]. McGill University; 1974. [cited 2020 Feb 28]. Available from: http://digitool.library.mcgill.ca/thesisfile48492.pdf.

Council of Science Editors:

Dansereau M. Techniques d'estimation pour les chaînes de Markov y compris les chaînes avec matrice causative constante. [Masters Thesis]. McGill University; 1974. Available from: http://digitool.library.mcgill.ca/thesisfile48492.pdf


Montana Tech

18. Culbertson, Denny Durfee. Semi-Markov chains.

Degree: MA, 1963, Montana Tech

Subjects/Keywords: Markov processes.


APA (6th Edition):

Culbertson, D. D. (1963). Semi-Markov chains. (Masters Thesis). Montana Tech. Retrieved from https://scholarworks.umt.edu/etd/8340

Chicago Manual of Style (16th Edition):

Culbertson, Denny Durfee. “Semi-Markov chains.” 1963. Masters Thesis, Montana Tech. Accessed February 28, 2020. https://scholarworks.umt.edu/etd/8340.

MLA Handbook (7th Edition):

Culbertson, Denny Durfee. “Semi-Markov chains.” 1963. Web. 28 Feb 2020.

Vancouver:

Culbertson DD. Semi-Markov chains. [Internet] [Masters thesis]. Montana Tech; 1963. [cited 2020 Feb 28]. Available from: https://scholarworks.umt.edu/etd/8340.

Council of Science Editors:

Culbertson DD. Semi-Markov chains. [Masters Thesis]. Montana Tech; 1963. Available from: https://scholarworks.umt.edu/etd/8340


McGill University

19. Solvason, Diane Lynn. Maximum likelihood estimation for Markov renewal processes.

Degree: MS, Department of Mathematics, 1977, McGill University

Subjects/Keywords: Markov processes.


APA (6th Edition):

Solvason, D. L. (1977). Maximum likelihood estimation for Markov renewal processes. (Masters Thesis). McGill University. Retrieved from http://digitool.library.mcgill.ca/thesisfile53842.pdf

Chicago Manual of Style (16th Edition):

Solvason, Diane Lynn. “Maximum likelihood estimation for Markov renewal processes.” 1977. Masters Thesis, McGill University. Accessed February 28, 2020. http://digitool.library.mcgill.ca/thesisfile53842.pdf.

MLA Handbook (7th Edition):

Solvason, Diane Lynn. “Maximum likelihood estimation for Markov renewal processes.” 1977. Web. 28 Feb 2020.

Vancouver:

Solvason DL. Maximum likelihood estimation for Markov renewal processes. [Internet] [Masters thesis]. McGill University; 1977. [cited 2020 Feb 28]. Available from: http://digitool.library.mcgill.ca/thesisfile53842.pdf.

Council of Science Editors:

Solvason DL. Maximum likelihood estimation for Markov renewal processes. [Masters Thesis]. McGill University; 1977. Available from: http://digitool.library.mcgill.ca/thesisfile53842.pdf


University of British Columbia

20. Salisbury, Thomas S. Construction of strong Markov processes through excursions, and a related Martin boundary .

Degree: 1983, University of British Columbia

 For certain Markov processes, K. Ito has defined the Poisson point process of excursions away from a fixed point. The law of this process is… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Salisbury, T. S. (1983). Construction of strong Markov processes through excursions, and a related Martin boundary . (Thesis). University of British Columbia. Retrieved from http://hdl.handle.net/2429/24354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Salisbury, Thomas S. “Construction of strong Markov processes through excursions, and a related Martin boundary .” 1983. Thesis, University of British Columbia. Accessed February 28, 2020. http://hdl.handle.net/2429/24354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Salisbury, Thomas S. “Construction of strong Markov processes through excursions, and a related Martin boundary .” 1983. Web. 28 Feb 2020.

Vancouver:

Salisbury TS. Construction of strong Markov processes through excursions, and a related Martin boundary . [Internet] [Thesis]. University of British Columbia; 1983. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/2429/24354.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Salisbury TS. Construction of strong Markov processes through excursions, and a related Martin boundary . [Thesis]. University of British Columbia; 1983. Available from: http://hdl.handle.net/2429/24354

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Oregon State University

21. Ching, Brenton S. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.

Degree: MS, Nuclear Engineering, 2000, Oregon State University

 The Adams-Larsen-Pomraning coupled transport model has been used to describe neutral particle transport in binary stochastic mixtures. Here, the mixing statistics are considered to be… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Ching, B. S. (2000). Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. (Masters Thesis). Oregon State University. Retrieved from http://hdl.handle.net/1957/32893

Chicago Manual of Style (16th Edition):

Ching, Brenton S. “Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.” 2000. Masters Thesis, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/32893.

MLA Handbook (7th Edition):

Ching, Brenton S. “Analysis of iteration schemes for deterministic transport in binary Markovian mixtures.” 2000. Web. 28 Feb 2020.

Vancouver:

Ching BS. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. [Internet] [Masters thesis]. Oregon State University; 2000. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/32893.

Council of Science Editors:

Ching BS. Analysis of iteration schemes for deterministic transport in binary Markovian mixtures. [Masters Thesis]. Oregon State University; 2000. Available from: http://hdl.handle.net/1957/32893


Oregon State University

22. Ott, Melvin Leroy. Optimal policies in continuous Markov decision chains.

Degree: PhD, Statistics, 1974, Oregon State University

 For continuous time, finite state and action, Markov decision chains, optimal policies are studied; (i) a procedure for transforming the terminal reward vector is given… (more)

Subjects/Keywords: Markov processes


APA (6th Edition):

Ott, M. L. (1974). Optimal policies in continuous Markov decision chains. (Doctoral Dissertation). Oregon State University. Retrieved from http://hdl.handle.net/1957/44137

Chicago Manual of Style (16th Edition):

Ott, Melvin Leroy. “Optimal policies in continuous Markov decision chains.” 1974. Doctoral Dissertation, Oregon State University. Accessed February 28, 2020. http://hdl.handle.net/1957/44137.

MLA Handbook (7th Edition):

Ott, Melvin Leroy. “Optimal policies in continuous Markov decision chains.” 1974. Web. 28 Feb 2020.

Vancouver:

Ott ML. Optimal policies in continuous Markov decision chains. [Internet] [Doctoral dissertation]. Oregon State University; 1974. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/1957/44137.

Council of Science Editors:

Ott ML. Optimal policies in continuous Markov decision chains. [Doctoral Dissertation]. Oregon State University; 1974. Available from: http://hdl.handle.net/1957/44137


Columbia University

23. Ruiz Lacedelli, Octavio. Essays in information relaxations and scenario analysis for partially observable settings.

Degree: 2019, Columbia University

 This dissertation consists of three main essays in which we study important problems in engineering and finance. In the first part of this dissertation, we… (more)

Subjects/Keywords: Operations research; Finance; Markov processes


APA (6th Edition):

Ruiz Lacedelli, O. (2019). Essays in information relaxations and scenario analysis for partially observable settings. (Doctoral Dissertation). Columbia University. Retrieved from https://doi.org/10.7916/d8-mwkk-mr35

Chicago Manual of Style (16th Edition):

Ruiz Lacedelli, Octavio. “Essays in information relaxations and scenario analysis for partially observable settings.” 2019. Doctoral Dissertation, Columbia University. Accessed February 28, 2020. https://doi.org/10.7916/d8-mwkk-mr35.

MLA Handbook (7th Edition):

Ruiz Lacedelli, Octavio. “Essays in information relaxations and scenario analysis for partially observable settings.” 2019. Web. 28 Feb 2020.

Vancouver:

Ruiz Lacedelli O. Essays in information relaxations and scenario analysis for partially observable settings. [Internet] [Doctoral dissertation]. Columbia University; 2019. [cited 2020 Feb 28]. Available from: https://doi.org/10.7916/d8-mwkk-mr35.

Council of Science Editors:

Ruiz Lacedelli O. Essays in information relaxations and scenario analysis for partially observable settings. [Doctoral Dissertation]. Columbia University; 2019. Available from: https://doi.org/10.7916/d8-mwkk-mr35


Western Carolina University

24. Shouse, Kirke. Activity recognition using Grey-Markov model.

Degree: 2011, Western Carolina University

 Activity Recognition (AR) is a process of identifying actions and goals of one or more agents of interest. AR techniques have been applied to both… (more)

Subjects/Keywords: Human activity recognition; Markov processes


APA (6th Edition):

Shouse, K. (2011). Activity recognition using Grey-Markov model. (Masters Thesis). Western Carolina University. Retrieved from http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032

Chicago Manual of Style (16th Edition):

Shouse, Kirke. “Activity recognition using Grey-Markov model.” 2011. Masters Thesis, Western Carolina University. Accessed February 28, 2020. http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032.

MLA Handbook (7th Edition):

Shouse, Kirke. “Activity recognition using Grey-Markov model.” 2011. Web. 28 Feb 2020.

Vancouver:

Shouse K. Activity recognition using Grey-Markov model. [Internet] [Masters thesis]. Western Carolina University; 2011. [cited 2020 Feb 28]. Available from: http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032.

Council of Science Editors:

Shouse K. Activity recognition using Grey-Markov model. [Masters Thesis]. Western Carolina University; 2011. Available from: http://libres.uncg.edu/ir/listing.aspx?styp=ti&id=9032


University of Adelaide

25. Falzon, Lucia. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.

Degree: 1997, University of Adelaide

The subject of this thesis is the joint probability density of the accumulated sojourn time in each state of a Markov process when the initial state is known. Advisors/Committee Members: Dept. of Applied Mathematics (school).
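
As a rough illustration of the quantity studied here (and not the thesis's analytical treatment of its joint density), the sketch below uses Monte Carlo simulation of a small continuous-time Markov chain with an invented generator matrix to estimate the mean time accumulated in each state up to a fixed horizon, given the initial state.

```python
# A hedged Monte Carlo sketch of the quantity described above: the time a
# finite-state continuous-time Markov chain accumulates in each state up to a
# horizon T, starting from a known initial state. The generator matrix Q,
# horizon, and initial state are invented toy values, not taken from the thesis.
import random

Q = [
    [-1.0, 0.7, 0.3],   # rows sum to zero; off-diagonal entries are jump rates
    [0.4, -0.9, 0.5],
    [0.2, 0.8, -1.0],
]
T, start = 10.0, 0

def accumulated_sojourn(Q, start, T, rng):
    n = len(Q)
    occupied = [0.0] * n          # time accumulated in each state
    state, t = start, 0.0
    while t < T:
        rate = -Q[state][state]
        hold = rng.expovariate(rate) if rate > 0 else float("inf")
        stay = min(hold, T - t)   # truncate the last holding time at the horizon
        occupied[state] += stay
        t += stay
        if t >= T:
            break
        # Jump: choose the next state with probability Q[state][j] / rate.
        u, acc = rng.random() * rate, 0.0
        for j in range(n):
            if j != state:
                acc += Q[state][j]
                nxt = j
                if u <= acc:
                    break
        state = nxt
    return occupied

rng = random.Random(0)
runs = [accumulated_sojourn(Q, start, T, rng) for _ in range(5000)]
means = [sum(r[i] for r in runs) / len(runs) for i in range(len(Q))]
print([round(m, 2) for m in means])  # estimated mean sojourn time per state
```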

Subjects/Keywords: Markov processes


APA (6th Edition):

Falzon, L. (1997). On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/19096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Falzon, Lucia. “On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.” 1997. Thesis, University of Adelaide. Accessed February 28, 2020. http://hdl.handle.net/2440/19096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Falzon, Lucia. “On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon.” 1997. Web. 28 Feb 2020.

Vancouver:

Falzon L. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. [Internet] [Thesis]. University of Adelaide; 1997. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/2440/19096.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Falzon L. On the accumulated sojourn time in finite-state Markov processes / Lucia Falzon. [Thesis]. University of Adelaide; 1997. Available from: http://hdl.handle.net/2440/19096

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Adelaide

26. Setiawaty, Berlian. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.

Degree: 1999, University of Adelaide

In this thesis a maximum compensated log-likelihood method is proposed for estimating the order of general hidden Markov models. Advisors/Committee Members: Dept. of Applied Mathematics (school).

Subjects/Keywords: Markov processes.


APA (6th Edition):

Setiawaty, B. (1999). Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/19570

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Setiawaty, Berlian. “Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.” 1999. Thesis, University of Adelaide. Accessed February 28, 2020. http://hdl.handle.net/2440/19570.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Setiawaty, Berlian. “Consistent estimation of the order for hidden Markov models / Berlian Setiawaty.” 1999. Web. 28 Feb 2020.

Vancouver:

Setiawaty B. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. [Internet] [Thesis]. University of Adelaide; 1999. [cited 2020 Feb 28]. Available from: http://hdl.handle.net/2440/19570.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Setiawaty B. Consistent estimation of the order for hidden Markov models / Berlian Setiawaty. [Thesis]. University of Adelaide; 1999. Available from: http://hdl.handle.net/2440/19570

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Rutgers University

27. Parag, Toufiq U, 1979-. Labeling hypergraph-structured data using Markov network.

Degree: PhD, Computer Science, 2011, Rutgers University

The goal of this dissertation is to label datapoints into two groups utilizing higher order information among them. More specifically, given likelihood (or error) measures… (more)

Subjects/Keywords: Markov processes; Computer science – Mathematics


APA (6th Edition):

Parag, T. U. (2011). Labeling hypergraph-structured data using Markov network. (Doctoral Dissertation). Rutgers University. Retrieved from http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655

Chicago Manual of Style (16th Edition):

Parag, Toufiq U, 1979-. “Labeling hypergraph-structured data using Markov network.” 2011. Doctoral Dissertation, Rutgers University. Accessed February 28, 2020. http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655.

MLA Handbook (7th Edition):

Parag, Toufiq U, 1979-. “Labeling hypergraph-structured data using Markov network.” 2011. Web. 28 Feb 2020.

Vancouver:

Parag TU. Labeling hypergraph-structured data using Markov network. [Internet] [Doctoral dissertation]. Rutgers University; 2011. [cited 2020 Feb 28]. Available from: http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655.

Council of Science Editors:

Parag TU. Labeling hypergraph-structured data using Markov network. [Doctoral Dissertation]. Rutgers University; 2011. Available from: http://hdl.rutgers.edu/1782.1/rucore10001600001.ETD.000057655


Rutgers University

28. Dai, Wei. First passage times and relaxation times of unfolded proteins and the funnel model of protein folding.

Degree: PhD, Physics and Astronomy, 2016, Rutgers University

Protein folding has been a challenging puzzle for decades but it is still not fully understood. One important way to gain insights of the mechanism… (more)

Subjects/Keywords: Protein folding; Markov processes


APA (6th Edition):

Dai, W. (2016). First passage times and relaxation times of unfolded proteins and the funnel model of protein folding. (Doctoral Dissertation). Rutgers University. Retrieved from https://rucore.libraries.rutgers.edu/rutgers-lib/49952/

Chicago Manual of Style (16th Edition):

Dai, Wei. “First passage times and relaxation times of unfolded proteins and the funnel model of protein folding.” 2016. Doctoral Dissertation, Rutgers University. Accessed February 28, 2020. https://rucore.libraries.rutgers.edu/rutgers-lib/49952/.

MLA Handbook (7th Edition):

Dai, Wei. “First passage times and relaxation times of unfolded proteins and the funnel model of protein folding.” 2016. Web. 28 Feb 2020.

Vancouver:

Dai W. First passage times and relaxation times of unfolded proteins and the funnel model of protein folding. [Internet] [Doctoral dissertation]. Rutgers University; 2016. [cited 2020 Feb 28]. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/49952/.

Council of Science Editors:

Dai W. First passage times and relaxation times of unfolded proteins and the funnel model of protein folding. [Doctoral Dissertation]. Rutgers University; 2016. Available from: https://rucore.libraries.rutgers.edu/rutgers-lib/49952/


Texas Christian University

29. Huff, Edward Martin. Markov analysis of response timing on a DRL schedule / by Edward Martin Huff.

Degree: 1966, Texas Christian University

Subjects/Keywords: Markov processes


APA (6th Edition):

Huff, E. M. (1966). Markov analysis of response timing on a DRL schedule / by Edward Martin Huff. (Thesis). Texas Christian University. Retrieved from https://repository.tcu.edu/handle/116099117/34629

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Huff, Edward Martin. “Markov analysis of response timing on a DRL schedule / by Edward Martin Huff.” 1966. Thesis, Texas Christian University. Accessed February 28, 2020. https://repository.tcu.edu/handle/116099117/34629.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Huff, Edward Martin. “Markov analysis of response timing on a DRL schedule / by Edward Martin Huff.” 1966. Web. 28 Feb 2020.

Vancouver:

Huff EM. Markov analysis of response timing on a DRL schedule / by Edward Martin Huff. [Internet] [Thesis]. Texas Christian University; 1966. [cited 2020 Feb 28]. Available from: https://repository.tcu.edu/handle/116099117/34629.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Huff EM. Markov analysis of response timing on a DRL schedule / by Edward Martin Huff. [Thesis]. Texas Christian University; 1966. Available from: https://repository.tcu.edu/handle/116099117/34629

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Simon Fraser University

30. Phillips, Gary Leslie. Two classification theorems of states of discrete Markov chains.

Degree: 1970, Simon Fraser University

Subjects/Keywords: Markov processes.


APA (6th Edition):

Phillips, G. L. (1970). Two classification theorems of states of discrete Markov chains. (Thesis). Simon Fraser University. Retrieved from http://summit.sfu.ca/item/4178

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Phillips, Gary Leslie. “Two classification theorems of states of discrete Markov chains.” 1970. Thesis, Simon Fraser University. Accessed February 28, 2020. http://summit.sfu.ca/item/4178.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Phillips, Gary Leslie. “Two classification theorems of states of discrete Markov chains.” 1970. Web. 28 Feb 2020.

Vancouver:

Phillips GL. Two classification theorems of states of discrete Markov chains. [Internet] [Thesis]. Simon Fraser University; 1970. [cited 2020 Feb 28]. Available from: http://summit.sfu.ca/item/4178.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Phillips GL. Two classification theorems of states of discrete Markov chains. [Thesis]. Simon Fraser University; 1970. Available from: http://summit.sfu.ca/item/4178

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
