
You searched for subject:(Information Bottleneck). Showing records 1 – 12 of 12 total matches.


No search limiters apply to these results.



Louisiana State University

1. Bayat, Farhang. Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference.

Degree: PhD, Systems and Communications, 2020, Louisiana State University

  My primary objective in this dissertation is to establish a framework under which I launch a systematic study of the fundamental tradeoff between deliverable… (more)
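The dissertation's own formulation sits behind the truncated abstract, but as general background the deliverable-versus-private tradeoff in the information bottleneck family is usually posed as a constrained program over a released representation T of the data X, with a useful variable Y and a private variable S (the variable names here are generic, not taken from the thesis):

    \max_{p(t \mid x)} \; I(T; Y) \qquad \text{subject to} \qquad I(T; S) \le \epsilon

Sweeping the leakage budget \epsilon traces out a tradeoff curve between deliverable and private information; the ADMM keyword suggests the resulting constrained problems are solved numerically by operator splitting, though the truncated abstract does not confirm this.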

Subjects/Keywords: disclosable information; private information; privacy; information bottleneck; ADMM


APA (6th Edition):

Bayat, F. (2020). Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference. (Doctoral Dissertation). Louisiana State University. Retrieved from https://digitalcommons.lsu.edu/gradschool_dissertations/5234

Chicago Manual of Style (16th Edition):

Bayat, Farhang. “Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference.” 2020. Doctoral Dissertation, Louisiana State University. Accessed November 28, 2020. https://digitalcommons.lsu.edu/gradschool_dissertations/5234.

MLA Handbook (7th Edition):

Bayat, Farhang. “Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference.” 2020. Web. 28 Nov 2020.

Vancouver:

Bayat F. Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference. [Internet] [Doctoral dissertation]. Louisiana State University; 2020. [cited 2020 Nov 28]. Available from: https://digitalcommons.lsu.edu/gradschool_dissertations/5234.

Council of Science Editors:

Bayat F. Study of Fundamental Tradeoff Between Deliverable and Private Information in Statistical Inference. [Doctoral Dissertation]. Louisiana State University; 2020. Available from: https://digitalcommons.lsu.edu/gradschool_dissertations/5234


University of California – Berkeley

2. Marzen, Sarah. Bio-inspired problems in rate-distortion theory.

Degree: Physics, 2016, University of California – Berkeley

 Designed and evolved sensors are faced with a difficult optimization problem: storing more information about the environment and reaping greater “rewards” costs time, energy, and… (more)
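As background for the framework named in the title (standard information theory, not a result of this thesis), the classical rate-distortion function gives the smallest rate at which a source X can be represented by X̂ while keeping the expected distortion below D:

    R(D) \;=\; \min_{p(\hat{x} \mid x)\,:\,\mathbb{E}[d(X,\hat{X})] \le D} \; I(X; \hat{X})

The predictive information bottleneck listed in the keywords can be read as the special case in which the "distortion" is the loss of information about a relevance variable, such as the future of the sensed process.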

Subjects/Keywords: Physics; causal states; predictive information bottleneck; rate-distortion theory


APA (6th Edition):

Marzen, S. (2016). Bio-inspired problems in rate-distortion theory. (Thesis). University of California – Berkeley. Retrieved from http://www.escholarship.org/uc/item/8234q7f8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Marzen, Sarah. “Bio-inspired problems in rate-distortion theory.” 2016. Thesis, University of California – Berkeley. Accessed November 28, 2020. http://www.escholarship.org/uc/item/8234q7f8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Marzen, Sarah. “Bio-inspired problems in rate-distortion theory.” 2016. Web. 28 Nov 2020.

Vancouver:

Marzen S. Bio-inspired problems in rate-distortion theory. [Internet] [Thesis]. University of California – Berkeley; 2016. [cited 2020 Nov 28]. Available from: http://www.escholarship.org/uc/item/8234q7f8.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Marzen S. Bio-inspired problems in rate-distortion theory. [Thesis]. University of California – Berkeley; 2016. Available from: http://www.escholarship.org/uc/item/8234q7f8

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Michigan

3. Noushad Iranzad, Mortaza. Estimation of Information Measures and Its Applications in Machine Learning.

Degree: PhD, Computer Science & Engineering, 2019, University of Michigan

Information-theoretic measures such as Shannon entropy, mutual information, and the Kullback-Leibler (KL) divergence have a broad range of applications in information and coding theory, statistics,… (more)
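To make the quantities named in this abstract concrete, here is a minimal plug-in (histogram) estimator of Shannon entropy and mutual information from samples. It is only an illustration of what is being estimated; it is not one of the estimators studied in the dissertation, whose methods lie behind the truncated abstract.

    # Illustrative plug-in (histogram) estimators of Shannon entropy and
    # mutual information from samples, in nats. Purely for illustration;
    # not the estimators analyzed in the dissertation.
    import numpy as np

    def entropy_plugin(x, bins=16):
        """Plug-in estimate of H(X) from 1-D samples."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def mutual_information_plugin(x, y, bins=16):
        """Plug-in estimate of I(X;Y) = H(X) + H(Y) - H(X,Y) from paired samples."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = joint / joint.sum()
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)
        def h(q):
            q = q[q > 0]
            return -np.sum(q * np.log(q))
        return h(px) + h(py) - h(pxy)

    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = x + 0.5 * rng.normal(size=5000)   # correlated pair, so I(X;Y) > 0
    print(entropy_plugin(x), mutual_information_plugin(x, y))

Plug-in estimators of this kind are simple but biased for small samples and high dimensions, which is exactly the regime where more careful estimators, the subject of theses like this one, are needed.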

Subjects/Keywords: Estimation; Information Theory; Bayes Error; Deep Learning; Information Bottleneck; Divergence; Computer Science; Engineering


APA (6th Edition):

Noushad Iranzad, M. (2019). Estimation of Information Measures and Its Applications in Machine Learning. (Doctoral Dissertation). University of Michigan. Retrieved from http://hdl.handle.net/2027.42/153472

Chicago Manual of Style (16th Edition):

Noushad Iranzad, Mortaza. “Estimation of Information Measures and Its Applications in Machine Learning.” 2019. Doctoral Dissertation, University of Michigan. Accessed November 28, 2020. http://hdl.handle.net/2027.42/153472.

MLA Handbook (7th Edition):

Noushad Iranzad, Mortaza. “Estimation of Information Measures and Its Applications in Machine Learning.” 2019. Web. 28 Nov 2020.

Vancouver:

Noushad Iranzad M. Estimation of Information Measures and Its Applications in Machine Learning. [Internet] [Doctoral dissertation]. University of Michigan; 2019. [cited 2020 Nov 28]. Available from: http://hdl.handle.net/2027.42/153472.

Council of Science Editors:

Noushad Iranzad M. Estimation of Information Measures and Its Applications in Machine Learning. [Doctoral Dissertation]. University of Michigan; 2019. Available from: http://hdl.handle.net/2027.42/153472


UCLA

4. Achille, Alessandro. Emergent Properties of Deep Neural Networks.

Degree: Computer Science, 2019, UCLA

 We show that information theoretic quantities can be used to control and describe the training process of Deep Neural Networks, and can explain how properties,… (more)

Subjects/Keywords: Computer science; Critical Periods; Information Bottleneck; Information in the Weights; Learning dynamics


APA (6th Edition):

Achille, A. (2019). Emergent Properties of Deep Neural Networks. (Thesis). UCLA. Retrieved from http://www.escholarship.org/uc/item/8gb8x6w9

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Achille, Alessandro. “Emergent Properties of Deep Neural Networks.” 2019. Thesis, UCLA. Accessed November 28, 2020. http://www.escholarship.org/uc/item/8gb8x6w9.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Achille, Alessandro. “Emergent Properties of Deep Neural Networks.” 2019. Web. 28 Nov 2020.

Vancouver:

Achille A. Emergent Properties of Deep Neural Networks. [Internet] [Thesis]. UCLA; 2019. [cited 2020 Nov 28]. Available from: http://www.escholarship.org/uc/item/8gb8x6w9.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Achille A. Emergent Properties of Deep Neural Networks. [Thesis]. UCLA; 2019. Available from: http://www.escholarship.org/uc/item/8gb8x6w9

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Adelaide

5. Williams, Jerome Oskar. [EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression.

Degree: 2019, University of Adelaide

 Improving efficiency in deep learning models implies achieving a more accurate model for a given computational budget, or conversely a faster, leaner model without losing… (more)

Subjects/Keywords: Machine learning; neural network; deep learning; computer vision; regularization; compression; information bottleneck; autoencoder; pedestrian detection; region of interest; convolutional; statistics; efficiency


APA (6th Edition):

Williams, J. O. (2019). [EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression. (Thesis). University of Adelaide. Retrieved from http://hdl.handle.net/2440/120659

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Williams, Jerome Oskar. “[EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression.” 2019. Thesis, University of Adelaide. Accessed November 28, 2020. http://hdl.handle.net/2440/120659.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Williams, Jerome Oskar. “[EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression.” 2019. Web. 28 Nov 2020.

Vancouver:

Williams JO. [EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression. [Internet] [Thesis]. University of Adelaide; 2019. [cited 2020 Nov 28]. Available from: http://hdl.handle.net/2440/120659.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Williams JO. [EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression. [Thesis]. University of Adelaide; 2019. Available from: http://hdl.handle.net/2440/120659

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


ETH Zürich

6. Roh, Philip. Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen.

Degree: 2019, ETH Zürich

 The aim of this scientific treatise is to support manufacturing companies in increasing the productivity of their production processes, one of their classic challenges. The… (more)

Subjects/Keywords: PRODUCTION MANAGEMENT (PRODUCTION); Information Stream Mapping; Bottleneck Detection; info:eu-repo/classification/ddc/620; Engineering & allied operations


APA (6th Edition):

Roh, P. (2019). Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen. (Doctoral Dissertation). ETH Zürich. Retrieved from http://hdl.handle.net/20.500.11850/370352

Chicago Manual of Style (16th Edition):

Roh, Philip. “Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen.” 2019. Doctoral Dissertation, ETH Zürich. Accessed November 28, 2020. http://hdl.handle.net/20.500.11850/370352.

MLA Handbook (7th Edition):

Roh, Philip. “Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen.” 2019. Web. 28 Nov 2020.

Vancouver:

Roh P. Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen. [Internet] [Doctoral dissertation]. ETH Zürich; 2019. [cited 2020 Nov 28]. Available from: http://hdl.handle.net/20.500.11850/370352.

Council of Science Editors:

Roh P. Methoden zur datengetriebenen Lokalisierung des Verbesserungspotentials in Produktionsabläufen. [Doctoral Dissertation]. ETH Zürich; 2019. Available from: http://hdl.handle.net/20.500.11850/370352

7. Sinkkonen, Janne. Learning Metrics and Discriminative Clustering.

Degree: 2003, Helsinki University of Technology

In this work methods have been developed to extract relevant information from large, multivariate data sets in a flexible, nonlinear way. The techniques are applicable… (more)

Subjects/Keywords: clustering; discriminative clustering; exploratory data analysis; feature extraction; information bottleneck; information geometry; learning metrics; mutual information; supervised and unsupervised learning


APA (6th Edition):

Sinkkonen, J. (2003). Learning Metrics and Discriminative Clustering. (Thesis). Helsinki University of Technology. Retrieved from http://lib.tkk.fi/Diss/2003/isbn9512267977/

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sinkkonen, Janne. “Learning Metrics and Discriminative Clustering.” 2003. Thesis, Helsinki University of Technology. Accessed November 28, 2020. http://lib.tkk.fi/Diss/2003/isbn9512267977/.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sinkkonen, Janne. “Learning Metrics and Discriminative Clustering.” 2003. Web. 28 Nov 2020.

Vancouver:

Sinkkonen J. Learning Metrics and Discriminative Clustering. [Internet] [Thesis]. Helsinki University of Technology; 2003. [cited 2020 Nov 28]. Available from: http://lib.tkk.fi/Diss/2003/isbn9512267977/.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sinkkonen J. Learning Metrics and Discriminative Clustering. [Thesis]. Helsinki University of Technology; 2003. Available from: http://lib.tkk.fi/Diss/2003/isbn9512267977/

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Vrije Universiteit Amsterdam

8. Peer, S. The economics of trip scheduling, travel time variability and traffic information.

Degree: 2013, Vrije Universiteit Amsterdam

Subjects/Keywords: transport economics; trip scheduling; traffic information; value of time; value of schedule delay; bottleneck model


APA (6th Edition):

Peer, S. (2013). The economics of trip scheduling, travel time variability and traffic information. (Doctoral Dissertation). Vrije Universiteit Amsterdam. Retrieved from http://hdl.handle.net/1871/40171

Chicago Manual of Style (16th Edition):

Peer, S. “The economics of trip scheduling, travel time variability and traffic information.” 2013. Doctoral Dissertation, Vrije Universiteit Amsterdam. Accessed November 28, 2020. http://hdl.handle.net/1871/40171.

MLA Handbook (7th Edition):

Peer, S. “The economics of trip scheduling, travel time variability and traffic information.” 2013. Web. 28 Nov 2020.

Vancouver:

Peer S. The economics of trip scheduling, travel time variability and traffic information. [Internet] [Doctoral dissertation]. Vrije Universiteit Amsterdam; 2013. [cited 2020 Nov 28]. Available from: http://hdl.handle.net/1871/40171.

Council of Science Editors:

Peer S. The economics of trip scheduling, travel time variability and traffic information. [Doctoral Dissertation]. Vrije Universiteit Amsterdam; 2013. Available from: http://hdl.handle.net/1871/40171


University of Southern California

9. Srinivasamurthy, Naveen. Compression algorithms for distributed classification with applications to distributed speech recognition.

Degree: PhD, Electrical Engineering, 2007, University of Southern California

 With wide proliferation of mobile devices and the explosion of new multimedia applications, there is a need for client-server architectures to enable low complexity/memory clients… (more)

Subjects/Keywords: distributed classification; distributed speech recognition, Joint compression and classification; DPCM; quantization; scalable encoding; scalable DPCM; product vector quantization; mutual information loss; information bottleneck


APA (6th Edition):

Srinivasamurthy, N. (2007). Compression algorithms for distributed classification with applications to distributed speech recognition. (Doctoral Dissertation). University of Southern California. Retrieved from http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/482798/rec/1536

Chicago Manual of Style (16th Edition):

Srinivasamurthy, Naveen. “Compression algorithms for distributed classification with applications to distributed speech recognition.” 2007. Doctoral Dissertation, University of Southern California. Accessed November 28, 2020. http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/482798/rec/1536.

MLA Handbook (7th Edition):

Srinivasamurthy, Naveen. “Compression algorithms for distributed classification with applications to distributed speech recognition.” 2007. Web. 28 Nov 2020.

Vancouver:

Srinivasamurthy N. Compression algorithms for distributed classification with applications to distributed speech recognition. [Internet] [Doctoral dissertation]. University of Southern California; 2007. [cited 2020 Nov 28]. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/482798/rec/1536.

Council of Science Editors:

Srinivasamurthy N. Compression algorithms for distributed classification with applications to distributed speech recognition. [Doctoral Dissertation]. University of Southern California; 2007. Available from: http://digitallibrary.usc.edu/cdm/compoundobject/collection/p15799coll127/id/482798/rec/1536

10. Liu, Feiyang. Implementation and verification of the Information Bottleneck interpretation of deep neural networks.

Degree: Electrical Engineering and Computer Science (EECS), 2018, KTH

Although deep neural networks (DNNs) have made remarkable achievements in various fields, there is still not a matching practical theory that is able to explain DNNs'… (more)

Subjects/Keywords: The information bottleneck (IB) method; Mutual information; Deep neural networks; Binning; Electrical Engineering, Electronic Engineering, Information Engineering; Electrical engineering and electronics

…[10] is the Information bottleneck (IB) method, proposed by Tishby… …networks with the information bottleneck method by validating Tishby’s work. • Visualize… …information bottleneck method and several mutual information estimators. We will go through neural… …information bottleneck interpretation of deep neural networks, where the IB method sets an optimal… …Learning and Information Bottleneck In this chapter, we will summarize the essential background… 
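The indexed snippets refer to the information bottleneck (IB) method of Tishby, Pereira and Bialek. For reference, its usual Lagrangian form looks for a compressed representation T of the input X that remains predictive of the target Y:

    \min_{p(t \mid x)} \; I(X; T) \;-\; \beta \, I(T; Y)

In the IB interpretation of deep networks examined in this thesis, each hidden layer is treated as a candidate T, and the two mutual information terms are estimated, for example by binning activations as the snippets mention, and tracked over training.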


APA (6th Edition):

Liu, F. (2018). Implementation and verification of the Information Bottleneck interpretation of deep neural networks. (Thesis). KTH. Retrieved from http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235744

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Liu, Feiyang. “Implementation and verification of the Information Bottleneck interpretation of deep neural networks.” 2018. Thesis, KTH. Accessed November 28, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235744.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Liu, Feiyang. “Implementation and verification of the Information Bottleneck interpretation of deep neural networks.” 2018. Web. 28 Nov 2020.

Vancouver:

Liu F. Implementation and verification of the Information Bottleneck interpretation of deep neural networks. [Internet] [Thesis]. KTH; 2018. [cited 2020 Nov 28]. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235744.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Liu F. Implementation and verification of the Information Bottleneck interpretation of deep neural networks. [Thesis]. KTH; 2018. Available from: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-235744

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

11. Sodhani, Shagun. Learning competitive ensemble of information-constrained primitives.

Degree: 2019, Université de Montréal

Subjects/Keywords: Reinforcement Learning; Hierarchical Reinforcement Learning; Information Bottleneck; Compositionality; Modular network; Applied Sciences - Artificial Intelligence (UMI: 0800)

information bottleneck to design an information-theoretic objective which leads to the specialization… …Bottleneck Tishby et al. [2000] introduced the concept of Information Bottleneck as an… …thus creating a bottleneck for the information in e can be learnt by maximizing I(X, e Y… …applied the information bottleneck principle to deep neural networks and showed that it can be… …using a variational approximation to obtain a lower bound on the information bottleneck… 
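The last snippet refers to a variational lower bound on the IB objective. The standard bound of this kind, stated here generically since the thesis's exact notation is not visible in the snippets, replaces the intractable mutual information terms with a variational decoder q(y | t) and a variational marginal r(t):

    I(T; Y) - \beta\, I(T; X)
      \;\ge\; \mathbb{E}_{p(x,y)\,p(t \mid x)}\big[\log q(y \mid t)\big] + H(Y)
      \;-\; \beta\, \mathbb{E}_{p(x)}\big[ D_{\mathrm{KL}}\big(p(t \mid x)\,\big\|\,r(t)\big) \big]

Because H(Y) is a constant, maximizing the right-hand side yields a tractable training objective for an information-constrained encoder p(t | x), which is presumably the sense in which the ensemble's primitives are "information-constrained".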


APA (6th Edition):

Sodhani, S. (2019). Learning competitive ensemble of information-constrained primitives. (Thesis). Université de Montréal. Retrieved from http://hdl.handle.net/1866/22537

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sodhani, Shagun. “Learning competitive ensemble of information-constrained primitives.” 2019. Thesis, Université de Montréal. Accessed November 28, 2020. http://hdl.handle.net/1866/22537.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sodhani, Shagun. “Learning competitive ensemble of information-constrained primitives.” 2019. Web. 28 Nov 2020.

Vancouver:

Sodhani S. Learning competitive ensemble of information-constrained primitives. [Internet] [Thesis]. Université de Montréal; 2019. [cited 2020 Nov 28]. Available from: http://hdl.handle.net/1866/22537.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sodhani S. Learning competitive ensemble of information-constrained primitives. [Thesis]. Université de Montréal; 2019. Available from: http://hdl.handle.net/1866/22537

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Florida

12. Schipper, Joel Daniel. A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders.

Degree: PhD, Electrical and Computer Engineering, 2008, University of Florida

 Every year, toxic exposures kill twelve hundred Americans. More than half of these deaths are the result of exposures to multiple substances. In addition to… (more)

Subjects/Keywords: Databases; Expert systems; Mining; Physicians; Poisons; Reasoning; Symptomatology; Systems design; Toxicology; Toxins; acquisition, adjusted, artificial, automated, automatic, automatically, based, bottleneck, case, center, centers, clinical, computerized, consultant, consultation, contributor, contributors, control, data, database, databases, decision, diagnose, diagnoses, diagnosing, diagnosis, diagnostic, differential, discovery, disorder, disorders, drug, drugs, effect, effects, expert, exposure, exposures, fault, faults, florida, generate, generated, generation, information, intelligence, intelligent, knowledge, learning, likelihood, machine, medical, medicine, mining, multiple, poison, poisons, primary, ratio, ratios, reasoning, rule, rules, sign, signs, substance, substances, support, symptom, symptoms, system, systems, toxic, toxicology, toxin, toxins


APA (6th Edition):

Schipper, J. D. (2008). A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders. (Doctoral Dissertation). University of Florida. Retrieved from https://ufdc.ufl.edu/UFE0021958

Chicago Manual of Style (16th Edition):

Schipper, Joel Daniel. “A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders.” 2008. Doctoral Dissertation, University of Florida. Accessed November 28, 2020. https://ufdc.ufl.edu/UFE0021958.

MLA Handbook (7th Edition):

Schipper, Joel Daniel. “A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders.” 2008. Web. 28 Nov 2020.

Vancouver:

Schipper JD. A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders. [Internet] [Doctoral dissertation]. University of Florida; 2008. [cited 2020 Nov 28]. Available from: https://ufdc.ufl.edu/UFE0021958.

Council of Science Editors:

Schipper JD. A Knowledge-Based Toxicology Consultant for Diagnosing Multiple Disorders. [Doctoral Dissertation]. University of Florida; 2008. Available from: https://ufdc.ufl.edu/UFE0021958
