You searched for subject:(Crop scouting). Showing records 1 – 3 of 3 total matches.

Kansas State University

1. Chopra, Shubh. Development of mobile applications for crop scouting with small unmanned aircraft systems.

Degree: MS, Department of Computer Science, 2017, Kansas State University

Small unmanned aircraft systems (sUAS) have been in commercial use since the 1980s, and roughly 8–12% of their current uses are in the agricultural sector, largely limited to surveying, mapping, and imaging; according to AUVSI, that share is expected to grow to 47% over the next decade as Artificial Intelligence is incorporated. Our research is one such effort to help farmers use advanced sUAS technology coupled with Artificial Intelligence and deliver meaningful results through a widely used, user-friendly interface: a mobile application. The vision for this application is a fully automated experience for repetitive, periodic analysis of a farmer's crops, where the only instruction needed is the push of a button on a once-configured application and results are returned in seconds. This would help farmers scout their crops, assess yield potential, and determine whether additional inputs are needed to increase grain yield and profit per acre. In building the application we focused on user-friendliness by abstracting the crop algorithms, minimizing the required user inputs, and automating the construction of flight paths. Because an internet connection is not always available at farm fields, processing was kept on the aircraft's on-board compute systems and the mobile device, giving farmers live results without reliance on cloud-based analytics. The application is built to work with DJI aircraft, using OpenCV for video processing and mobile vision, GIS and GPS data for accurate mapping and for locating the device and the sUAS within the mobile application, and FFmpeg for encoding and decoding compressed video data. An algorithm developed by the Precision-Ag Lab in the K-State Department of Agronomy was implemented in the sUAS application to provide real-time yield estimation and nitrogen recommendations for winter wheat. Advisors/Committee Members: Antonio R. Asebedo, Mitchell L. Neilsen.
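
The abstract names the tools (OpenCV on the device, FFmpeg, the Precision-Ag Lab's yield and nitrogen algorithm) but not the processing details. As a rough illustration only, the hypothetical Python/OpenCV sketch below computes a per-frame excess-green vegetation fraction of the kind such an on-device pipeline might feed into a crop model; the index choice, threshold, and file name are assumptions, not the thesis's implementation.

import cv2
import numpy as np

def excess_green_fraction(frame_bgr, threshold=0.1):
    """Fraction of pixels in one video frame that look like green vegetation.

    Uses the excess-green index ExG = 2g - r - b on chromatic coordinates.
    The 0.1 threshold is an illustrative assumption, not a calibrated value.
    """
    f = frame_bgr.astype(np.float32)
    b, g, r = cv2.split(f)
    total = b + g + r + 1e-6                     # avoid division by zero
    exg = 2.0 * (g / total) - (r / total) - (b / total)
    return float(np.mean(exg > threshold))

# Hypothetical usage: analyze a recorded sUAS video entirely on the device.
cap = cv2.VideoCapture("flight.mp4")             # placeholder file name
coverage = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    coverage.append(excess_green_fraction(frame))
cap.release()
if coverage:
    print(f"mean canopy coverage estimate: {np.mean(coverage):.2%}")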

Subjects/Keywords: sUAS; iOS application; Crop scouting; Mobile applications


APA (6th Edition):

Chopra, S. (2017). Development of mobile applications for crop scouting with small unmanned aircraft systems. (Masters Thesis). Kansas State University. Retrieved from http://hdl.handle.net/2097/35507

Chicago Manual of Style (16th Edition):

Chopra, Shubh. “Development of mobile applications for crop scouting with small unmanned aircraft systems.” 2017. Masters Thesis, Kansas State University. Accessed June 20, 2019. http://hdl.handle.net/2097/35507.

MLA Handbook (7th Edition):

Chopra, Shubh. “Development of mobile applications for crop scouting with small unmanned aircraft systems.” 2017. Web. 20 Jun 2019.

Vancouver:

Chopra S. Development of mobile applications for crop scouting with small unmanned aircraft systems. [Internet] [Masters thesis]. Kansas State University; 2017. [cited 2019 Jun 20]. Available from: http://hdl.handle.net/2097/35507.

Council of Science Editors:

Chopra S. Development of mobile applications for crop scouting with small unmanned aircraft systems. [Masters Thesis]. Kansas State University; 2017. Available from: http://hdl.handle.net/2097/35507


Kansas State University

2. Schmitz, Austin. Row crop navigation by autonomous ground vehicle for crop scouting.

Degree: MS, Department of Biological & Agricultural Engineering, 2017, Kansas State University

Robotic vehicles have the potential to play a key role in the future of agriculture. For this to happen, designs that are cost-effective, robust, and easy to use will be necessary. Robotic vehicles that can scout for pests, monitor crop health, and potentially plant and harvest crops will provide new ways to increase agricultural production. At this time, the use of robotic vehicles to plant and harvest crops poses many challenges, including complexity and power consumption. Incorporating small robotic vehicles for monitoring and scouting fields has the potential to allow easier integration of robotic systems into current farming practices as the technology continues to develop. Benefits of using unmanned ground vehicles (UGVs) for crop scouting include higher-resolution, real-time mapping, measuring, and monitoring of pest location and density, crop nutrient levels, and soil moisture levels. The focus of this research is the ability of a UGV to scout pest populations and patterns, complementing the existing scouting technology used on UAVs to capture information about nutrient and water levels. There are many challenges to integrating UGVs into conventionally planted row-crop fields, including intra-row and inter-row maneuvering. For intra-row maneuvering, i.e., between two rows of corn, cost-effective sensors are needed to keep the UGV between straight rows, follow contoured rows, and avoid local objects. Inter-row maneuvering involves navigating from long straight rows to the headlands by moving through the space between two plants in a row. Headland rows are often perpendicular to the row the UGV is within, and if the crop is corn, the spacing between plants can be as narrow as 5 inches. A vehicle design that minimizes or eliminates crop damage during inter-row maneuvering will be very beneficial and allow earlier integration of robotic crop scouting into conventional farming practices. Using three fixed HC-SR04 ultrasonic sensors with LabVIEW programming proved to be a cost-effective, simple solution for intra-row maneuvering of an unmanned ground vehicle through a simulated corn row. Inter-row maneuvering was accomplished by designing a transformable tracked vehicle whose two track configurations are parallel and linear. The vehicle operates with the tracks parallel to each other, using skid steering as the method of control for traveling between rows of corn. When it needs to move through narrow spaces or from one row to the next, two motors rotate the track frames into a linear configuration in which one track follows the other. In the linear configuration the vehicle is 5 inches wide, which allows it to move between corn plants in high-population fields for minimally invasive maneuvers. Fleets of robotic vehicles will be required to perform scouting operations on large fields, and some operations will require coordination between machines to complete the assigned tasks. Simulation of the path planning for coordination… Advisors/Committee Members: Daniel Flippo.
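
The intra-row controller in the thesis is implemented in LabVIEW against three HC-SR04 ultrasonic sensors; the Python sketch below only illustrates the underlying idea of centering a skid-steer vehicle between two rows from left/right/front distance readings. The gains, thresholds, and the read_ultrasonics/set_tracks names are hypothetical, not the thesis's code.

def _clamp(x, lo=-1.0, hi=1.0):
    return max(lo, min(hi, x))

def steering_command(left_cm, right_cm, front_cm,
                     base_speed=0.5, gain=0.02, stop_dist_cm=20.0):
    """Skid-steer track speeds for staying centered between two crop rows.

    left_cm / right_cm are ultrasonic distances to the neighbouring rows and
    front_cm is the distance to anything ahead. Returns (left_track,
    right_track) speeds in [-1, 1]. All values here are illustrative.
    """
    if front_cm < stop_dist_cm:          # obstacle or end of row: stop
        return 0.0, 0.0
    error = right_cm - left_cm           # > 0 means drifting toward the left row
    correction = gain * error            # steer away from the nearer row
    return _clamp(base_speed + correction), _clamp(base_speed - correction)

# Hypothetical control loop; read_ultrasonics() and set_tracks() stand in for
# whatever sensor and motor interfaces the real vehicle exposes.
# while True:
#     left, right, front = read_ultrasonics()
#     set_tracks(*steering_command(left, right, front))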

Subjects/Keywords: Inter-row maneuvering; Intra-row maneuvering; Robotic crop scouting; Row crop navigation; Unmanned ground vehicle; Simulated path planning


APA (6th Edition):

Schmitz, A. (2017). Row crop navigation by autonomous ground vehicle for crop scouting. (Masters Thesis). Kansas State University. Retrieved from http://hdl.handle.net/2097/36237

Chicago Manual of Style (16th Edition):

Schmitz, Austin. “Row crop navigation by autonomous ground vehicle for crop scouting.” 2017. Masters Thesis, Kansas State University. Accessed June 20, 2019. http://hdl.handle.net/2097/36237.

MLA Handbook (7th Edition):

Schmitz, Austin. “Row crop navigation by autonomous ground vehicle for crop scouting.” 2017. Web. 20 Jun 2019.

Vancouver:

Schmitz A. Row crop navigation by autonomous ground vehicle for crop scouting. [Internet] [Masters thesis]. Kansas State University; 2017. [cited 2019 Jun 20]. Available from: http://hdl.handle.net/2097/36237.

Council of Science Editors:

Schmitz A. Row crop navigation by autonomous ground vehicle for crop scouting. [Masters Thesis]. Kansas State University; 2017. Available from: http://hdl.handle.net/2097/36237


University of Saskatchewan

3. Fontaine, Veronique. Development of a vision-based local positioning system for weed detection.

Degree: 2004, University of Saskatchewan

Herbicide applications could possibly be reduced if they were targeted. Targeting the applications requires prior identification and quantification of the weed population, a task that could be performed by a weed-scouting robot. The ability to position a camera over the inter-row space of densely seeded crops will help simplify the task of automatically quantifying weed infestations. As part of the development of an autonomous weed scout, a vision-based local positioning system for weed detection has been developed and tested in a laboratory setting. Four line-detection algorithms were tested, and a robotic positioning device, or XYZθ-table, was developed and tested. The line-detection algorithms were based, respectively, on stripe analysis, blob analysis, linear regression, and the Hough transform; the last two also included an edge-detection step. Images of parallel line patterns representing crop rows were collected at different angles, with and without weed-simulating noise, and processed by the four programs. The ability of the programs to determine the angle of the rows and the location of an inter-row space centreline was evaluated in a laboratory setting. All algorithms behaved approximately the same when determining the rows' angle in the noise-free images, with a mean error of 0.5°. In the same situation, all algorithms could find the centreline of an inter-row space within 2.7 mm. Generally, the mean errors increased when noise was added to the images, up to 1.1° and 8.5 mm for the linear regression algorithm. Specific dispersions of the weeds were identified as possible causes of the increased error in noisy images. Because of its insensitivity to noise, the stripe analysis algorithm was considered the best overall. The fastest program was the blob analysis algorithm, with a mean processing time of 0.35 s per image. Future work involves evaluating the line-detection algorithms on field images. The XYZθ-table consisted of rails allowing movement of a camera in the three orthogonal directions and of a rotational table that could rotate the camera about a vertical axis. Its ability to accurately move the camera within the XY-space and rotate it to a desired angle was evaluated in a laboratory setting; the XYZθ-table was able to move the camera to within 7 mm of a target and to rotate it with a mean error of 0.07°. The positioning accuracy could be improved by simple mechanical modifications to the XYZθ-table. Advisors/Committee Members: Crowe, Trever G., Guo, Huiqing, Bolton, Ronald J., Roberge, Martin.
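
The abstract states that one of the four detectors pairs edge detection with the Hough transform to recover the row angle; a minimal Python/OpenCV sketch of that general approach is given below. The Canny and Hough parameter values, and the use of the median line orientation, are illustrative assumptions rather than the thesis's settings.

import cv2
import numpy as np

def row_angle_deg(image_gray):
    """Median Hough angle (degrees) of line features in a grayscale row image.

    Edge detection followed by the standard Hough transform, in the spirit of
    the Hough-based detector described above. The returned theta is the normal
    direction of the detected lines; the row direction is offset from it by
    90 degrees. All parameter values are illustrative guesses.
    """
    edges = cv2.Canny(image_gray, 50, 150)                 # edge-detection step
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 100)     # (rho, theta) pairs
    if lines is None:
        return None
    thetas = lines[:, 0, 1]                                # normal angle of each detected line
    return float(np.degrees(np.median(thetas)))

# Hypothetical usage on an image of parallel line patterns:
# img = cv2.imread("rows.png", cv2.IMREAD_GRAYSCALE)
# print(row_angle_deg(img))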

Subjects/Keywords: machine vision; automation; precision agriculture; pattern recognition; agricultural applications; robotics; image processing; line detection; weed scouting; crop row detection


APA (6th Edition):

Fontaine, V. (2004). Development of a vision-based local positioning system for weed detection. (Thesis). University of Saskatchewan. Retrieved from http://hdl.handle.net/10388/etd-05182004-051557

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Fontaine, Veronique. “Development of a vision-based local positioning system for weed detection.” 2004. Thesis, University of Saskatchewan. Accessed June 20, 2019. http://hdl.handle.net/10388/etd-05182004-051557.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Fontaine, Veronique. “Development of a vision-based local positioning system for weed detection.” 2004. Web. 20 Jun 2019.

Vancouver:

Fontaine V. Development of a vision-based local positioning system for weed detection. [Internet] [Thesis]. University of Saskatchewan; 2004. [cited 2019 Jun 20]. Available from: http://hdl.handle.net/10388/etd-05182004-051557.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Fontaine V. Development of a vision-based local positioning system for weed detection. [Thesis]. University of Saskatchewan; 2004. Available from: http://hdl.handle.net/10388/etd-05182004-051557

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
