
You searched for +publisher:"University of Arkansas" +contributor:("Michael S. Gashler"). Showing records 1 – 3 of 3 total matches.

No search limiters apply to these results.


University of Arkansas

1. Hammer, Jon C. Enabling Usage Pattern-based Logical Status Inference for Mobile Phones.

Degree: MS, 2016, University of Arkansas

Logical statuses of mobile users, such as isBusy and isAlone, are the key enabler for a plethora of context-aware mobile applications. While on-board hardware sensors (such as motion, proximity, and location sensors) have been extensively studied for logical status inference, continuously sampling them typically incurs formidable energy consumption, which degrades the user experience. In this thesis, we argue that smartphone usage statistics can be used for logical status inference at negligible energy cost. To validate this argument, we present a continuous inference engine that (1) intercepts multiple operating system events, in particular foreground app, notifications, screen states, and connected networks; (2) extracts informative features from these OS events; and (3) efficiently infers the logical status of mobile users. The proposed inference engine is implemented for unmodified Android phones, and an evaluation over a four-week trial shows that it identifies four logical statuses of mobile users with over 87% accuracy, while its average impact on battery life is less than 0.5%. Advisors/Committee Members: Tingxin Yan, Michael S. Gashler, John Gauch.
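The abstract describes a three-stage pipeline: intercept OS events, extract features, classify the user's logical status. The sketch below is only a rough illustration of that flow in plain Python over a hypothetical event log; the event schema, the feature set, and the hand-written rule standing in for the learned classifier are all assumptions for illustration, not details from the thesis.

```python
from collections import Counter

# Hypothetical usage-statistics events (assumed schema, not from the thesis):
# each event is (timestamp_sec, kind, value), covering the four event sources
# the abstract names: foreground app, notifications, screen state, network.
events = [
    (0,   "screen",       "on"),
    (5,   "foreground",   "com.example.email"),
    (40,  "notification", "com.example.chat"),
    (41,  "notification", "com.example.chat"),
    (90,  "foreground",   "com.example.calendar"),
    (300, "screen",       "off"),
    (900, "network",      "office-wifi"),
]

def extract_features(window):
    """Turn raw OS events in a time window into simple usage features."""
    kinds = Counter(kind for _, kind, _ in window)
    apps = [v for _, k, v in window if k == "foreground"]
    return {
        "notif_count": kinds["notification"],
        "app_switches": max(len(apps) - 1, 0),
        "screen_toggles": kinds["screen"],
        "on_work_network": any(k == "network" and "office" in v
                               for _, k, v in window),
    }

def infer_is_busy(features):
    """Toy stand-in for the thesis's learned classifier: a hand-written rule."""
    return (features["notif_count"] >= 2 and features["on_work_network"]) \
        or features["app_switches"] >= 3

feats = extract_features(events)
print(feats, "->", "isBusy" if infer_is_busy(feats) else "not busy")
```

Because the inputs are events the OS already generates for free, a pipeline of this shape avoids the continuous sensor polling that dominates the energy cost of hardware-sensor approaches.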

Subjects/Keywords: Applied sciences; Logical status inference; Mobile computing; Usage statistics; Graphics and Human Computer Interfaces


APA (6th Edition):

Hammer, J. C. (2016). Enabling Usage Pattern-based Logical Status Inference for Mobile Phones. (Masters Thesis). University of Arkansas. Retrieved from https://scholarworks.uark.edu/etd/1552

Chicago Manual of Style (16th Edition):

Hammer, Jon C. “Enabling Usage Pattern-based Logical Status Inference for Mobile Phones.” 2016. Masters Thesis, University of Arkansas. Accessed October 24, 2020. https://scholarworks.uark.edu/etd/1552.

MLA Handbook (7th Edition):

Hammer, Jon C. “Enabling Usage Pattern-based Logical Status Inference for Mobile Phones.” 2016. Web. 24 Oct 2020.

Vancouver:

Hammer JC. Enabling Usage Pattern-based Logical Status Inference for Mobile Phones. [Internet] [Masters thesis]. University of Arkansas; 2016. [cited 2020 Oct 24]. Available from: https://scholarworks.uark.edu/etd/1552.

Council of Science Editors:

Hammer JC. Enabling Usage Pattern-based Logical Status Inference for Mobile Phones. [Masters Thesis]. University of Arkansas; 2016. Available from: https://scholarworks.uark.edu/etd/1552


University of Arkansas

2. Smith, Josh Reeves. Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging.

Degree: MSCmpE, 2017, University of Arkansas

We investigate a method, weighted average model fusion, that enables neural networks to learn from the experiences of other networks as well as from their own. The method is inspired by the social nature of humans, which has been shown to be one of the biggest factors in the development of our cognitive abilities. Modern machine learning has focused predominantly on learning from direct training and has largely ignored learning through social engagement with peers, i.e., neural networks with the same topology. To explore learning through engagement with peers, we have created a way for neural networks to teach each other. Our method allows neural networks to exchange knowledge by combining their weights: it calculates a pairwise weighted average of the weights of two neural networks and then replaces the existing weights with the new values. We find that weighted average model fusion successfully enables neural networks to learn from the experiences of their peers and to combine that knowledge with what each network gains from its own individual experiences. Additionally, we explore the effects that several meta-parameters have on model fusion to provide deeper insight into how the method behaves in a variety of scenarios. Advisors/Committee Members: Michael S. Gashler, Susan Gauch, Wing Ning Li.
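The fusion step itself is simple enough to sketch. The NumPy fragment below is a minimal illustration of a pairwise weighted average over the parameter arrays of two networks with identical topology; the layer shapes, the choice of alpha = 0.5, and the decision to write the fused values back into both peers are assumptions for illustration, not details from the thesis.

```python
import numpy as np

def fuse(params_a, params_b, alpha=0.5):
    """Weighted average model fusion: combine two same-topology networks
    layer by layer. alpha is one of the meta-parameters such a method
    exposes; 0.5 (an even blend) is an arbitrary choice here."""
    return [alpha * wa + (1.0 - alpha) * wb
            for wa, wb in zip(params_a, params_b)]

rng = np.random.default_rng(0)
shapes = [(4, 8), (8,), (8, 2), (2,)]          # weights and biases of a tiny MLP
net_a = [rng.normal(size=s) for s in shapes]   # stand-in: trained on one experience
net_b = [rng.normal(size=s) for s in shapes]   # stand-in: trained on another

fused = fuse(net_a, net_b, alpha=0.5)
net_a = [w.copy() for w in fused]              # each peer replaces its existing
net_b = [w.copy() for w in fused]              # weights with the fused values
```

Note that averaging parameters element-wise is only meaningful because the two networks share a topology, which is exactly the constraint the abstract places on the peers.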

Subjects/Keywords: Applied sciences; Machine learning; Neural networks; Weight averaging; Civic and Community Engagement


APA (6th Edition):

Smith, J. R. (2017). Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging. (Masters Thesis). University of Arkansas. Retrieved from https://scholarworks.uark.edu/etd/1877

Chicago Manual of Style (16th Edition):

Smith, Josh Reeves. “Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging.” 2017. Masters Thesis, University of Arkansas. Accessed October 24, 2020. https://scholarworks.uark.edu/etd/1877.

MLA Handbook (7th Edition):

Smith, Josh Reeves. “Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging.” 2017. Web. 24 Oct 2020.

Vancouver:

Smith JR. Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging. [Internet] [Masters thesis]. University of Arkansas; 2017. [cited 2020 Oct 24]. Available from: https://scholarworks.uark.edu/etd/1877.

Council of Science Editors:

Smith JR. Investigation of How Neural Networks Learn From the Experiences of Peers Through Periodic Weight Averaging. [Masters Thesis]. University of Arkansas; 2017. Available from: https://scholarworks.uark.edu/etd/1877


University of Arkansas

3. Godfrey, Luke Benjamin. Parameterizing and Aggregating Activation Functions in Deep Neural Networks.

Degree: PhD, 2018, University of Arkansas

The nonlinear activation functions applied by each neuron in a neural network are essential for making neural networks powerful representational models. If these are omitted, even deep neural networks reduce to simple linear regression, because a linear combination of linear combinations is still a linear combination. In much of the existing literature on neural networks, just one or two activation functions are selected for the entire network, even though the use of heterogeneous activation functions has been shown to produce superior results in some cases. Even less often employed are activation functions that can adapt their nonlinearities as network parameters along with standard weights and biases. This dissertation presents a collection of papers that advance the state of heterogeneous and parameterized activation functions. Contributions of this dissertation include three novel parametric activation functions and applications of each, a study evaluating the utility of the parameters in parametric activation functions, an aggregated activation approach to modeling time-series data as an alternative to recurrent neural networks, and an improvement upon existing work that aggregates neuron inputs using products instead of sums. Advisors/Committee Members: Michael S. Gashler, Wing Ning Li, Xintao Wu.
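The collapse the abstract mentions is easy to verify: without a nonlinearity, W2(W1 x + b1) + b2 = (W2 W1) x + (W2 b1 + b2), a single linear map. As a concrete example of an activation whose nonlinearity is itself a trainable parameter, the sketch below implements a PReLU-style unit in NumPy, exposing the gradient with respect to its shape parameter so it can be updated alongside ordinary weights and biases. PReLU is a standard example from the wider literature (He et al., 2015), not one of the dissertation's three novel functions.

```python
import numpy as np

class ParametricReLU:
    """PReLU-style activation: f(x) = x for x >= 0, a*x otherwise.
    The slope 'a' is a trainable parameter, updated like any weight.
    (A well-known example; not one of the dissertation's novel functions.)"""

    def __init__(self, a=0.25):
        self.a = a

    def forward(self, x):
        return np.where(x >= 0.0, x, self.a * x)

    def grad_input(self, x):
        # df/dx, used for ordinary backpropagation through the unit
        return np.where(x >= 0.0, 1.0, self.a)

    def grad_param(self, x):
        # df/da: zero on the positive side, x on the negative side,
        # so upstream gradients can also adapt the nonlinearity itself
        return np.where(x >= 0.0, 0.0, x)

act = ParametricReLU(a=0.1)
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(act.forward(x))     # [-0.2  -0.05  0.    1.5 ]
print(act.grad_param(x))  # [-2.  -0.5  0.   0. ]
```

Training 'a' per neuron, rather than fixing it network-wide, is one simple route to the heterogeneous, adaptive nonlinearities the abstract argues for.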

Subjects/Keywords: Activation function; Deep learning; Forecasting; Machine learning; Neural network; Parametric function; Artificial Intelligence and Robotics


APA (6th Edition):

Godfrey, L. B. (2018). Parameterizing and Aggregating Activation Functions in Deep Neural Networks. (Doctoral Dissertation). University of Arkansas. Retrieved from https://scholarworks.uark.edu/etd/2655

Chicago Manual of Style (16th Edition):

Godfrey, Luke Benjamin. “Parameterizing and Aggregating Activation Functions in Deep Neural Networks.” 2018. Doctoral Dissertation, University of Arkansas. Accessed October 24, 2020. https://scholarworks.uark.edu/etd/2655.

MLA Handbook (7th Edition):

Godfrey, Luke Benjamin. “Parameterizing and Aggregating Activation Functions in Deep Neural Networks.” 2018. Web. 24 Oct 2020.

Vancouver:

Godfrey LB. Parameterizing and Aggregating Activation Functions in Deep Neural Networks. [Internet] [Doctoral dissertation]. University of Arkansas; 2018. [cited 2020 Oct 24]. Available from: https://scholarworks.uark.edu/etd/2655.

Council of Science Editors:

Godfrey LB. Parameterizing and Aggregating Activation Functions in Deep Neural Networks. [Doctoral Dissertation]. University of Arkansas; 2018. Available from: https://scholarworks.uark.edu/etd/2655
