We create partially annotated datasets from field measurements for developing models and algorithms for the perception and control of forest machines, using artificial intelligence, simulation, and experiments on physical testbeds. The datasets, algorithms, and trained models for object identification, 3D perception, and motion planning and control will be made publicly available through data- and code-sharing repositories.
The data are recorded using forest machines and other equipment with suitable sensors operating in the forest environment. The data include high-resolution machine and crane-tip positions and event time logs (StanForD), recorded while the vehicle operates in high-resolution laser-scanned forest areas. For annotation, we plan to use both CAN-bus data and audiovisual data from operators who are willing to participate in the research. Also, by fusing visual perception with the operator's input or decisions on tree characteristics, we aim to develop a method for auto-annotation, facilitating a rapid increase in labeled training data for computer vision. In other activities, images of tree plants and bark are collected.
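The auto-annotation idea above can be illustrated with a minimal sketch. All data structures and names here are hypothetical assumptions for illustration, not the project's actual pipeline: detections from a vision system are paired with time-stamped operator decisions (e.g. from the CAN bus or StanForD event log), and a detection inherits the operator's label when the two events are close enough in time.

```python
# Hypothetical auto-annotation sketch: label vision detections using
# nearby time-stamped operator decisions. Names and fields are assumed.
from dataclasses import dataclass

@dataclass
class Detection:
    timestamp: float  # seconds since log start
    bbox: tuple       # (x, y, w, h) in image coordinates

@dataclass
class OperatorEvent:
    timestamp: float  # e.g. from CAN-bus / StanForD event log
    species: str      # tree characteristic chosen by the operator

def auto_annotate(detections, events, max_dt=1.0):
    """Label each detection with the temporally nearest operator event,
    provided the gap is at most max_dt seconds."""
    labeled = []
    for det in detections:
        best = min(events, default=None,
                   key=lambda e: abs(e.timestamp - det.timestamp))
        if best and abs(best.timestamp - det.timestamp) <= max_dt:
            labeled.append((det.bbox, best.species))
    return labeled

dets = [Detection(10.2, (50, 40, 120, 300)),
        Detection(55.0, (10, 5, 80, 200))]
evts = [OperatorEvent(10.5, "pine"), OperatorEvent(80.0, "spruce")]
print(auto_annotate(dets, evts))  # only the first detection is within 1 s
```

In practice the matching would also have to account for spatial correspondence (which tree the operator actually handled), but temporal alignment is the core of the fusion step described here.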
Research questions include: How can the process of creating annotated datasets be automated, and how can models be trained to identify and position forestry objects, such as plants, tree species, logs, and terrain obstacles, and to perform 3D reconstruction for motion planning and control? How large and varied must the datasets be for the models to handle the variability in forests, weather, light conditions, etc.? Would additional synthetic data increase model inference accuracy?
In part, we focus on forwarders traversing terrain, avoiding obstacles, and loading or unloading logs, with consideration for efficiency, safety, and environmental impact. We explore how to auto-generate and calibrate forestry machine simulators and automation scenario descriptions using the data recorded in the field. The demonstrated automation solutions serve as proofs of concept and references, important both for developing commercial prototypes and for identifying where future research should focus.
2024. Article ID: T5.10
IUFRO 2024 - XXVI IUFRO World Congress, Stockholm, Sweden, June 23-29, 2024