In-field grape cluster size assessment for vine yield estimation using a mobile robot and a consumer level RGB-D camera
The Centre for Applied Autonomous Sensor Systems, Örebro University, Örebro, Sweden. ORCID iD: 0000-0003-4685-379X
Umeå University, Faculty of Science and Technology, Department of Computing Science. ORCID iD: 0000-0002-4600-8652
Department of Industrial Engineering and Management, Ben-Gurion University of the Negev, Beer Sheva, Israel.
The Institute of Agricultural Engineering, Agricultural Research Organization, The Volcani Center, Rishon Lezion, Israel.
2020 (English). In: IEEE Robotics and Automation Letters, E-ISSN 2377-3766, Vol. 5, no. 2, pp. 2031-2038. Article in journal (Refereed). Published.
Abstract [en]

Current practice for vine yield estimation is based on RGB cameras and has limited performance. In this paper we present a method for outdoor vine yield estimation using a consumer-grade RGB-D camera mounted on a mobile robotic platform. An algorithm for automatic grape cluster size estimation using depth information is evaluated both in controlled outdoor conditions and in commercial vineyard conditions. Ten video scans (3 camera viewpoints with 2 different backgrounds and 2 natural light conditions), acquired from a controlled outdoor experiment and a commercial vineyard setup, are used for analysis. The collected dataset (GRAPES3D) is released to the public. A total of 4542 regions of 49 grape clusters were manually labeled by a human annotator for comparison. Eight variations of the algorithm are assessed, both for manually labeled and auto-detected regions. The effect of viewpoint, presence of an artificial background, and the human annotator are analyzed using statistical tools. Results show 2.8-3.5 cm average error for all acquired data and reveal the potential of using low-cost commercial RGB-D cameras for improved robotic yield estimation.
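The paper's own size-estimation algorithm is not reproduced here; as a minimal sketch of the underlying idea, the snippet below shows how a cluster's pixel extent in an RGB-D frame can be converted to a metric size estimate via the pinhole camera model. The function name, focal length, and measurements are illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the paper's algorithm): converting a detected grape
# cluster's pixel extent into a metric size estimate using the per-pixel
# depth from an RGB-D camera and the pinhole camera model.

def cluster_width_m(pixel_width: float, depth_m: float, fx: float) -> float:
    """Metric width of an object spanning `pixel_width` pixels at distance
    `depth_m` metres, for a camera with horizontal focal length `fx` in
    pixels (pinhole model: width = pixels * depth / focal_length)."""
    return pixel_width * depth_m / fx

# Illustrative example: a cluster spanning 120 px at 0.8 m, fx = 600 px
width = cluster_width_m(120, 0.8, 600.0)
print(round(width, 3))  # 0.16 (metres)
```

In practice the depth value would be aggregated (e.g. a median) over the detected cluster region to be robust to sensor noise, which is one reason controlled versus in-field conditions can yield different errors.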

Place, publisher, year, edition, pages
IEEE, 2020. Vol. 5, no. 2, pp. 2031-2038
Keywords [en]
Field Robots, RGB-D Perception, Agricultural Automation, Robotics in Agriculture and Forestry
HSV category
Research subject
Computer and Systems Sciences
Identifiers
URN: urn:nbn:se:umu:diva-167778
DOI: 10.1109/LRA.2020.2970654
ISI: 000526520700001
Scopus ID: 2-s2.0-85079829054
OAI: oai:DiVA.org:umu-167778
DiVA id: diva2:1390934
Available from: 2020-02-03. Created: 2020-02-03. Last updated: 2024-01-17. Bibliographically checked.

Open Access in DiVA

Full text not available in DiVA

Other links

Publisher's full text; Scopus

Authors

Kurtser, Polina; Ringdahl, Ola
