Performance of RGB-D camera for different object types in greenhouse conditions
Ringdahl, Ola (Umeå University, Faculty of Science and Technology, Department of Computing Science). ORCID iD: 0000-0002-4600-8652
Kurtser, Polina (Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel; Centre for Applied Autonomous Sensor Systems, Örebro University, Örebro, Sweden). ORCID iD: 0000-0003-4685-379X
Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
2019 (English). In: 2019 European Conference on Mobile Robots (ECMR): conference proceedings, September 4-6, 2019, Prague, Czech Republic / [ed] Libor Přeučil, Sven Behnke, Miroslav Kulich, IEEE, 2019, article id 8870935. Conference paper, published paper (refereed).
Abstract [en]

RGB-D cameras play an increasingly important role in localization and autonomous navigation of mobile robots. Reasonably priced commercial RGB-D cameras have recently been developed for operation in greenhouse and outdoor conditions. They can be employed for different agricultural and horticultural operations such as harvesting, weeding, pruning and phenotyping. However, the depth information extracted from the cameras varies significantly between objects and sensing conditions. This paper presents an evaluation protocol applied to a commercially available Fotonic F80 time-of-flight RGB-D camera for eight different object types. A case study of autonomous sweet pepper harvesting was used as an exemplary agricultural task. Each of the chosen objects is an item that an autonomous agricultural robot must detect and localize in order to perform well. A total of 340 rectangular regions of interest (ROIs), 30-100 per object type, were marked for the extraction of performance measures of point cloud density and variability around the center of mass. An additional 570 ROIs were generated (57 manually and 513 replicated) to evaluate the repeatability and accuracy of the point cloud. A statistical analysis was performed to evaluate the significance of differences between object types. The results show that different objects have significantly different point densities. Specifically, metallic materials and black-colored objects had significantly lower point density than organic and other artificial materials introduced to the scene, as expected. The point cloud variability measures showed no significant differences between object types, except for the metallic knife, which presented significant outliers in the collected measures. The accuracy and repeatability analysis showed that errors of 1-3 cm stem from the difficulty for a human annotator to mark exactly the same area twice, and that errors of up to ±4 cm stem from the sensor not generating exactly the same point cloud when sensing a fixed object.
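To make the per-ROI measures described in the abstract concrete, the sketch below computes a point cloud density and a variability-around-center-of-mass measure for one rectangular ROI. This is a minimal illustration in Python/NumPy, not the authors' actual protocol implementation: the function name, the data layout (an N×3 array of depth points already cropped to the ROI), and the choice of mean distance to the center of mass as the variability measure are all assumptions made here for illustration.

```python
# Minimal sketch (not the authors' implementation) of per-ROI point cloud
# measures: density and variability around the center of mass, as named in
# the abstract. Assumes the valid depth points falling inside a rectangular
# ROI are given as an (N, 3) array of x/y/z coordinates in metres; the
# function name and the pixel-based density definition are hypothetical.
import numpy as np

def roi_measures(points: np.ndarray, roi_area_px: int) -> dict:
    """Compute density and center-of-mass variability for one ROI.

    points      -- (N, 3) array of valid depth points inside the ROI
    roi_area_px -- number of image pixels covered by the rectangular ROI
    """
    n_points = points.shape[0]
    # Density: fraction of ROI pixels that yielded a valid depth point.
    density = n_points / roi_area_px
    # Center of mass of the ROI's point cloud.
    com = points.mean(axis=0)
    # Variability: mean Euclidean distance of the points from the center of mass.
    variability = np.linalg.norm(points - com, axis=1).mean()
    return {"n_points": n_points, "density": density,
            "center_of_mass": com, "variability": variability}

# Example with synthetic data: 500 points scattered around (0.1, 0.2, 1.5) m
# inside a 40x50-pixel ROI.
rng = np.random.default_rng(0)
pts = rng.normal(loc=[0.1, 0.2, 1.5], scale=0.01, size=(500, 3))
print(roi_measures(pts, roi_area_px=40 * 50))
```

Under this reading, a black or metallic object would show up as a low `density` value (few valid depth returns per ROI pixel), while an object like the metallic knife would show up as outliers in `variability` across its ROIs.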

Place, publisher, year, edition, pages
IEEE, 2019. Article id 8870935
Keywords [en]
agriculture, cameras, feature extraction, greenhouses, image colour analysis, image sensors, industrial robots, mobile robots, object tracking, robot vision, statistical analysis, pruning, sensing conditions, evaluation protocol, object types, autonomous sweet pepper harvesting, exemplary agricultural task, autonomous agricultural robot, ROI, point cloud density, object type, point density, black colored objects, point cloud variability measures, fixed object, greenhouse conditions, autonomous navigation, agricultural operations, horticultural operations, commercial RGB-D cameras, Fotonic F80 time-of-flight RGB-D camera, size 4.0 cm, size 1.0 cm to 3.0 cm, Cameras, Three-dimensional displays, Robot vision systems, End effectors, Green products
National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Computing Science
Identifiers
URN: urn:nbn:se:umu:diva-165545
DOI: 10.1109/ECMR.2019.8870935
ISI: 000558081900031
Scopus ID: 2-s2.0-85074395978
ISBN: 978-1-7281-3606-6 (print)
ISBN: 978-1-7281-3605-9 (digital)
OAI: oai:DiVA.org:umu-165545
DiVA id: diva2:1373093
Conference
European Conference on Mobile Robots (ECMR), Prague, Czech Republic, September 4–6, 2019.
Research funder
KK-stiftelsen; EU, Horizon 2020, 66313
Available from: 2019-11-26. Created: 2019-11-26. Last updated: 2022-08-24. Bibliographically approved.

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text
Scopus

