Detection and classification of Root and Butt-Rot (RBR) in Stumps of Norway Spruce Using RGB Images and Machine Learning
Umeå University, Faculty of Science and Technology, Department of Computing Science. Division of Forestry and Forest Resources, Norwegian Institute of Bioeconomy Research (NIBIO), P.O. Box 115, 1431 Ås, Norway. ORCID iD: 0000-0003-0830-5303
2019 (English). In: Sensors, E-ISSN 1424-8220, Vol. 19, no. 7, article id 1579. Article in journal (Refereed). Published
Abstract [en]

Root and butt-rot (RBR) has a significant impact on both the material and economic outcome of timber harvesting, and thereby on the individual forest owner and, collectively, on the forest and wood-processing industries. An accurate recording of the presence of RBR during timber harvesting would enable a mapping of the location and extent of the problem, providing a basis for evaluating its spread in a climate anticipated to enhance pathogenic growth in the future. Therefore, a system to automatically identify and detect the presence of RBR would constitute an important contribution to addressing the problem without increasing workload complexity for the machine operator. In this study, we developed and evaluated an approach based on RGB images to automatically detect tree stumps and classify them according to the absence or presence of rot. Furthermore, since knowledge of the extent of RBR is valuable in categorizing logs, we also classify stumps into three classes of infestation: rot = 0%, 0% < rot < 50%, and rot ≥ 50%. In this work we used deep-learning approaches and conventional machine-learning algorithms for the detection and classification tasks. The results showed that tree stumps were detected with a precision of 95% and a recall of 80%. Using only the true-positive (TP) output of the stump detector, stumps without and with RBR were correctly classified with accuracies of 83.5% and 77.5%, respectively. Classifying rot into three classes resulted in 79.4%, 72.4%, and 74.1% accuracy for stumps with rot = 0%, 0% < rot < 50%, and rot ≥ 50%, respectively. With some modifications, the developed algorithm could be used either during the harvesting operation to detect RBR regions on the tree stumps or as an RBR detector for post-harvest assessment of tree stumps and logs.
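
As an illustration only (the record contains no code), the following Python sketch shows the general shape of the two-stage pipeline the abstract describes: a detector proposes stump regions in an RGB image, and each detected crop is then assigned one of the three rot classes. The generic torchvision Faster R-CNN detector, the ResNet-18 classifier, the 0.5 score threshold, and the 224x224 crop size are all assumptions made for the sketch, not the authors' implementation.

# Illustrative sketch only: a two-stage pipeline in the spirit of the abstract,
# with stump detection followed by three-class rot classification of each crop.
# The detector, classifier, threshold and input size are assumptions, not the
# authors' implementation, and both networks would need task-specific training.

import torch
import torchvision
from torchvision.transforms import functional as TF
from PIL import Image

ROT_CLASSES = ["rot = 0%", "0% < rot < 50%", "rot >= 50%"]  # classes used in the paper

# Stage 1: a generic pretrained detector (torchvision >= 0.13) stands in for the
# stump detector; in practice it would be fine-tuned on annotated stump images.
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

# Stage 2: a small CNN classifier for the three rot classes (untrained here).
classifier = torchvision.models.resnet18(num_classes=len(ROT_CLASSES))
classifier.eval()

def detect_and_classify(image_path, score_threshold=0.5):
    """Return a list of (box, rot_class) pairs for detections above the threshold."""
    image = Image.open(image_path).convert("RGB")
    tensor = TF.to_tensor(image)                  # HWC uint8 -> CHW float in [0, 1]

    with torch.no_grad():
        detections = detector([tensor])[0]        # dict with boxes, labels, scores

    results = []
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < score_threshold:
            continue
        x1, y1, x2, y2 = box.int().tolist()
        if x2 <= x1 or y2 <= y1:                  # skip degenerate boxes
            continue
        crop = tensor[:, y1:y2, x1:x2]            # crop the detected stump region
        crop = torch.nn.functional.interpolate(   # resize to the classifier input size
            crop.unsqueeze(0), size=(224, 224), mode="bilinear", align_corners=False)
        with torch.no_grad():
            logits = classifier(crop)
        results.append(((x1, y1, x2, y2), ROT_CLASSES[int(logits.argmax(dim=1))]))
    return results

if __name__ == "__main__":
    for box, rot in detect_and_classify("stump_image.jpg"):
        print(box, rot)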

Place, publisher, year, edition, pages
MDPI, 2019. Vol. 19, no. 7, article id 1579
Keywords [en]
deep learning; forest harvesting; tree stumps; automatic detection and classification
National subject category
Computer graphics and computer vision
Research subject
computerized image analysis
Identifiers
URN: urn:nbn:se:umu:diva-157716
DOI: 10.3390/s19071579
ISI: 000465570700098
PubMed ID: 30939827
Scopus ID: 2-s2.0-85064193099
OAI: oai:DiVA.org:umu-157716
DiVA id: diva2:1301359
Project
PRECISION
Research funder
Norges forskningsråd, NFR281140
Available from: 2019-04-01 Created: 2019-04-01 Last updated: 2025-02-07 Bibliographically approved
Part of thesis
1. Object Detection and Recognition in Unstructured Outdoor Environments
2019 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Computer vision and machine-learning-based systems are often developed to replace humans in harsh, dangerous, or tedious situations, as well as to reduce the time required to accomplish a task. Another goal is to increase performance by introducing automation to tasks such as inspections in manufacturing applications, sorting timber during harvesting, surveillance, fruit grading, yield prediction, and harvesting operations. Depending on the task, a variety of object detection and recognition algorithms can be applied, including both conventional and deep-learning-based approaches. Moreover, when developing image analysis algorithms, it is essential to consider environmental challenges, e.g. illumination changes, occlusion, shadows, and divergence in the colour, shape, texture, and size of objects.

The goal of this thesis is to address these challenges to support the development of autonomous agricultural and forestry systems with enhanced performance and a reduced need for human involvement. This thesis provides algorithms and techniques based on adaptive image segmentation for tree detection in forest environments as well as yellow pepper recognition in greenhouses. For segmentation, seed point generation and a region-growing method were used to detect trees. An algorithm based on reinforcement learning was developed to detect yellow peppers. RGB and depth data were integrated and used in classifiers to detect trees, bushes, stones, and humans in forest environments. Another part of the thesis describes deep-learning-based approaches to detect stumps and classify the level of rot based on images.
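
As a rough illustration of the region-growing idea mentioned above (the thesis' actual seed-point generation and growing criteria are not reproduced here), a minimal intensity-based region grower could look like the following sketch; the 4-connectivity and the fixed intensity tolerance are assumptions made for the example.

# Minimal sketch of intensity-based region growing from a single seed point.
# Not the thesis' method: connectivity and tolerance are illustrative choices.

from collections import deque
import numpy as np

def region_grow(gray, seed, tolerance=10):
    """Grow a region from `seed` (row, col), adding 4-connected neighbours whose
    intensity differs from the seed intensity by at most `tolerance`."""
    h, w = gray.shape
    seed_value = int(gray[seed])
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(int(gray[nr, nc]) - seed_value) <= tolerance:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask

# Example: segment a bright blob in a synthetic image from a seed inside it.
if __name__ == "__main__":
    img = np.zeros((100, 100), dtype=np.uint8)
    img[30:60, 40:70] = 200
    print(region_grow(img, seed=(45, 55)).sum())  # 900 pixels of the 30x30 blob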

Another major contribution of this thesis is a method using infrared images to detect humans in forest environments. To detect humans, one shape-dependent and one shape-independent method were proposed.

Algorithms to recognize the intention of humans based on hand gestures were also developed. 3D hand gestures were recognized by first detecting and tracking hands in a sequence of depth images, and then utilizing optical flow constraint equations.
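
For reference, the optical flow constraint referred to here is presumably the classical brightness-constancy equation

\[ I_x\,u + I_y\,v + I_t = 0, \]

where \(I_x\), \(I_y\), and \(I_t\) are the spatial and temporal derivatives of the image intensity and \((u, v)\) is the flow vector at a pixel. Since this gives one equation per pixel for two unknowns, additional constraints (e.g. local smoothness) are needed to recover the motion.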

The thesis also presents methods to answer human queries about objects and their spatial relations in images. The solution was developed by merging a deep-learning-based method for object detection and recognition with natural language processing techniques.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2019. p. 88
Series
Report / UMINF, ISSN 0348-0542 ; 19.08
Keywords
Computer vision, Deep Learning, Harvesting Robots, Automatic Detection and Recognition
National subject category
Computer graphics and computer vision
Research subject
computer science
Identifiers
URN: urn:nbn:se:umu:diva-165069
ISBN: 978-91-7855-147-7
Public defence
2019-12-05, MA121, MIT Building, Umeå, 13:00 (English)
Opponent
Supervisors
Available from: 2019-11-14 Created: 2019-11-08 Last updated: 2025-02-07 Bibliographically approved

Open Access in DiVA

fulltext (12726 kB), 363 downloads
File information
File name: FULLTEXT01.pdf; File size: 12726 kB; Checksum: SHA-512
ef253ade8b81efeee773c78e8d6b78506fe8c1b2069c55b2690986cdc26706c465c11147a22efdda8b57e923b2d5a0150b0aa056c3f787ca5ad0aa4634278a6e
Type: fulltext; MIME type: application/pdf

Other links

Publisher's full text | PubMed | Scopus

Person

Ostovar, Ahmad; Ringdahl, Ola
