Perception System Development for Automated Combine-to-Cart Unloading During Harvest of Grain Crops
Abstract
During harvest, the combine harvester must frequently unload grain into another container to empty its hopper for the next round of crop intake. An efficient farming practice is unloading on-the-go, in which the combine harvester continues harvesting while transferring material to a cart drawn by a tractor. Although desirable, unloading on-the-go requires highly skilled labor and complete concentration, as the combine operator must simultaneously monitor the cart fullness, control the combine's movement, and communicate with the tractor operator to coordinate the vehicles. Automating the grain unloading on-the-go process can ease the operator's burden, lower the demand for skilled labor, and enhance harvesting productivity. This work is part of the Auto Unload project, a collaboration between Purdue University and John Deere to develop a fully automated grain unloading on-the-go system. Full automation of the unloading on-the-go operation requires a vision-based perception system to track the grain cart location and monitor the fill status inside the cart. This work develops novel tools, frameworks, and algorithms for the development and evaluation of perception systems used in automatic grain unloading applications.

First, a simulation environment for vision-based perception systems was developed using the Unreal Engine. The virtual environment is created specifically for grain unloading scenarios with the combine and the tractor-drawn cart, and it simulates perception sensors including the camera and LiDAR sensor. The simulation tool is used to simulate sensor behavior in the unloading environment and to determine a preferable sensor placement for the automatic unloading system design.

The fill status information extracted by the perception system is used by the controller for decision making. To investigate the impact of the perception system on unloading control, a novel co-simulation (CoSim) framework was developed to achieve an integrated simulation in which models of the perception system and the controller run at the same time and interact with each other. The proposed CoSim framework includes perception simulation modules in Unreal Engine and simulates system models and dynamics in MATLAB/Simulink. Two-way communication was established between Unreal and MATLAB/Simulink to mimic the complex interactions between the perception module and the control module, and between the automatic unloading system and the external environment. Two simulation cases were conducted to demonstrate that CoSim can be used for both system design and system evaluation. Given the versatility of the software packages used, this CoSim scheme can also be transferred to other applications. To the best of the author's knowledge, the presented CoSim work was the first to demonstrate closed-loop simulation of a complex automation task using Unreal and MATLAB/Simulink.

To develop a reliable perception system, experimental validation is necessary. For the Auto Unload project, a stereo-camera-based perception system was chosen as the production-intent system due to its ability to generate high-resolution 3D images at low cost. This work proposes a novel LiDAR-based benchmark system to evaluate the in-situ performance of combine harvester grain unloading on-the-go automation systems that incorporate stereo-camera-based perception. The LiDAR-based benchmark system incorporates a dust filtering strategy to enable its use in dusty conditions.
The benchmark system runs simultaneously with the stereo-camera-based system to provide benchmark data from in-field testing. Experimental results demonstrated that the proposed benchmark system provides consistent benchmark data in both clear and dusty environments.
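For illustration, the two-way Unreal/Simulink coupling described above can be realized with a lightweight socket bridge between the two processes. The Python sketch below shows one plausible pattern for such an exchange, seen from the perception side, which sends the perceived cart state each tick and waits for the resulting control command. The addresses, ports, field names, and JSON schema are all hypothetical and do not reflect the dissertation's actual interface.

```python
import json
import socket

# Hypothetical endpoints for the two simulation halves; the actual
# CoSim transport and message format are not specified in the abstract.
SIMULINK_ADDR = ("127.0.0.1", 9001)  # controller model (MATLAB/Simulink side)
UNREAL_ADDR = ("127.0.0.1", 9002)    # perception simulation (Unreal side)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(UNREAL_ADDR)
sock.settimeout(0.1)  # one control period; tune to the simulation step


def cosim_step(fill_fraction: float, cart_offset_m: float):
    """One co-simulation tick from the perception side.

    Sends the latest perceived cart state to the controller and waits
    for the resulting command (e.g. spout angle, combine speed).
    """
    state = json.dumps({"fill": fill_fraction, "offset": cart_offset_m})
    sock.sendto(state.encode(), SIMULINK_ADDR)
    try:
        reply, _ = sock.recvfrom(1024)
        return json.loads(reply)
    except socket.timeout:
        return None  # controller missed this tick; caller holds last command


# Example tick: cart 40% full, 0.25 m behind the nominal position.
print(cosim_step(0.40, 0.25))
```

A datagram exchange of this kind keeps the two simulators loosely coupled, which matters when the perception renderer and the plant model run at different rates.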
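The low-cost, high-resolution 3D capability attributed to the stereo camera rests on standard triangulation: depth is recovered from per-pixel disparity between the rectified image pair. As a reminder, the standard rectified-stereo relation (textbook geometry, not specific to this work) is:

```latex
% Z = depth, f = focal length in pixels, B = camera baseline,
% d = disparity in pixels between the left and right images.
Z = \frac{f \, B}{d}
```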
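The abstract does not detail the benchmark system's dust filtering strategy. As a minimal sketch of the kind of heuristic commonly used for dust rejection in LiDAR point clouds, the Python example below combines an intensity threshold with a neighbor-count (statistical outlier) test; the function name, thresholds, and toy data are illustrative only and should not be read as the dissertation's method.

```python
import numpy as np


def filter_dust(points, intensity, intensity_min=0.15,
                radius=0.3, min_neighbors=5):
    """Generic dust-rejection heuristic for a LiDAR point cloud.

    points    : (N, 3) array of x, y, z coordinates in meters
    intensity : (N,) array of per-point return intensities in [0, 1]

    Dust returns tend to be weak and spatially sparse, so this sketch
    keeps a point only if its intensity exceeds a threshold AND it has
    enough neighbors within a small radius. The brute-force O(N^2)
    distance matrix is fine for a toy cloud; a KD-tree (e.g.
    scipy.spatial.cKDTree) would replace it in practice.
    """
    strong = intensity >= intensity_min
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbors = (dists < radius).sum(axis=1) - 1  # exclude the point itself
    dense = neighbors >= min_neighbors
    return points[strong & dense]


# Toy usage: a dense "grain surface" plus sparse, low-intensity "dust".
rng = np.random.default_rng(0)
surface = rng.normal(0.0, 0.05, size=(200, 3))
dust = rng.uniform(-2.0, 2.0, size=(30, 3))
cloud = np.vstack([surface, dust])
inten = np.concatenate([rng.uniform(0.4, 1.0, 200), rng.uniform(0.0, 0.1, 30)])
clean = filter_dust(cloud, inten)
print(f"kept {len(clean)} of {len(cloud)} points")
```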
Degree
Ph.D.
Advisors
Shaver, Purdue University.
Subject Area
Mathematics|Systems science