Publications (10 of 36)
Ostovar, A., Talbot, B., Puliti, S., Astrup, R. & Ringdahl, O. (2019). Detection and classification of Root and Butt-Rot (RBR) in Stumps of Norway Spruce Using RGB Images and Machine Learning. Sensors, 19(7), Article ID 1579.
Detection and classification of Root and Butt-Rot (RBR) in Stumps of Norway Spruce Using RGB Images and Machine Learning
2019 (English) In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 19, no. 7, article id 1579. Article in journal (Refereed) Published
Abstract [en]

Root and butt-rot (RBR) has a significant impact on both the material and economic outcome of timber harvesting, and thereby on the individual forest owner and collectively on the forest and wood processing industries. An accurate recording of the presence of RBR during timber harvesting would enable a mapping of the location and extent of the problem, providing a basis for evaluating spread in a climate anticipated to enhance pathogenic growth in the future. Therefore, a system to automatically identify and detect the presence of RBR would constitute an important contribution to addressing the problem without increasing workload complexity for the machine operator. In this study, we developed and evaluated an approach based on RGB images to automatically detect tree stumps and classify them according to the absence or presence of rot. Furthermore, since knowledge of the extent of RBR is valuable in categorizing logs, we also classify stumps into three classes of infestation: rot = 0%, 0% < rot < 50%, and rot ≥ 50%. In this work, we used deep-learning approaches and conventional machine-learning algorithms for the detection and classification tasks. The results showed that tree stumps were detected with a precision of 95% and a recall of 80%. Using only the correct output (true positives) of the stump detector, stumps without and with RBR were correctly classified with an accuracy of 83.5% and 77.5%, respectively. Classifying rot into three classes resulted in 79.4%, 72.4%, and 74.1% accuracy for stumps with rot = 0%, 0% < rot < 50%, and rot ≥ 50%, respectively. With some modifications, the developed algorithm could be used either during the harvesting operation to detect RBR regions on the tree stumps or as an RBR detector for post-harvest assessment of tree stumps and logs.
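The two-stage approach described in the abstract (detect stumps, then classify each detected stump into one of three rot classes) can be summarized in a short sketch. This is a minimal illustration assuming a generic detector and classifier; the function and variable names are hypothetical, not taken from the paper's code.

```python
import numpy as np

# Hypothetical labels for the paper's three infestation classes.
CLASSES = ["rot = 0%", "0% < rot < 50%", "rot >= 50%"]

def classify_stumps(image, detections, classifier):
    """Two-stage pipeline: crop each detected stump from the RGB image,
    then classify the crop into one of the three rot classes.

    `detections` holds (x0, y0, x1, y1) boxes from a stump detector;
    `classifier` maps an RGB crop to three class probabilities.
    """
    results = []
    for (x0, y0, x1, y1) in detections:
        crop = image[y0:y1, x0:x1]        # keep only the detected region,
        probs = classifier(crop)          # mirroring the use of TP detections
        results.append(CLASSES[int(np.argmax(probs))])
    return results

# Usage with a dummy classifier that guesses uniformly:
frame = np.zeros((480, 640, 3), dtype=np.uint8)
uniform = lambda crop: np.ones(3) / 3.0
print(classify_stumps(frame, [(10, 10, 100, 100)], uniform))
```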

Place, publisher, year, edition, pages
MDPI, 2019
Keywords
deep learning; forest harvesting; tree stumps; automatic detection and classification
National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
computerized image analysis
Identifiers
urn:nbn:se:umu:diva-157716 (URN), 10.3390/s19071579 (DOI), 000465570700098 (), 30939827 (PubMedID)
Project
PRECISION
Research funder
Norges forskningsråd, NFR281140
Available from: 2019-04-01 Created: 2019-04-01 Last updated: 2019-11-11 Bibliographically approved
Ringdahl, O., Kurtser, P. & Edan, Y. (2019). Performance of RGB-D camera for different object types in greenhouse conditions. In: Libor Přeučil, Sven Behnke, Miroslav Kulich (Ed.), 2019 European Conference on Mobile Robots (ECMR). Paper presented at European Conference on Mobile Robots (ECMR), September 4–6, 2019, Prague, Czech Republic. IEEE.
Performance of RGB-D camera for different object types in greenhouse conditions
2019 (English) In: 2019 European Conference on Mobile Robots (ECMR) / [ed] Libor Přeučil, Sven Behnke, Miroslav Kulich, IEEE, 2019. Conference paper, Published paper (Refereed)
Abstract [en]

RGB-D cameras play an increasingly important role in localization and autonomous navigation of mobile robots. Reasonably priced commercial RGB-D cameras have recently been developed for operation in greenhouse and outdoor conditions. They can be employed for different agricultural and horticultural operations such as harvesting, weeding, pruning and phenotyping. However, the depth information extracted from the cameras varies significantly between objects and sensing conditions. This paper presents an evaluation protocol applied to a commercially available Fotonic F80 time-of-flight RGB-D camera for eight different object types. A case study of autonomous sweet pepper harvesting was used as an exemplary agricultural task. Each of the chosen objects is an item that an autonomous agricultural robot must detect and localize to perform well. A total of 340 rectangular regions of interest (ROIs), 30-100 per object type, were marked for the extraction of two performance measures: point cloud density and variability around the center of mass. An additional 570 ROIs were generated (57 manually and 513 replicated) to evaluate the repeatability and accuracy of the point cloud. A statistical analysis was performed to evaluate the significance of differences between object types. The results show that different objects have significantly different point density. Specifically, metallic materials and black colored objects had significantly lower point density than the organic and other artificial materials introduced to the scene, as expected. The point cloud variability measures showed no significant differences between object types, except for the metallic knife, which presented significant outliers in the collected measures. The accuracy and repeatability analysis showed that errors of 1-3 cm are due to the difficulty for a human to annotate exactly the same area, and that up to ±4 cm of error is due to the sensor not generating exactly the same point cloud when sensing a fixed object.
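As a worked example of the two per-ROI measures named above (point cloud density and variability around the center of mass), here is a minimal numpy sketch. The exact measure definitions are assumptions for illustration; the paper's protocol may compute them differently.

```python
import numpy as np

def roi_measures(points):
    """Two per-ROI measures: point count (a density proxy) and mean
    distance of the points from their center of mass (variability).
    `points` is an (N, 3) array of valid 3-D points inside one ROI."""
    com = points.mean(axis=0)                      # center of mass
    spread = np.linalg.norm(points - com, axis=1)  # per-point distance
    return len(points), float(spread.mean())

# Synthetic example: 500 points scattered ~1 cm around (0, 0, 1) m.
rng = np.random.default_rng(0)
pts = rng.normal(loc=[0.0, 0.0, 1.0], scale=0.01, size=(500, 3))
density, variability = roi_measures(pts)
print(f"{density} points, mean spread {variability * 100:.2f} cm")
```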

Place, publisher, year, edition, pages
IEEE, 2019
Keywords
agriculture, cameras, feature extraction, greenhouses, image colour analysis, image sensors, industrial robots, mobile robots, object tracking, robot vision, statistical analysis, pruning, sensing conditions, evaluation protocol, object types, autonomous sweet pepper harvesting, exemplary agricultural task, autonomous agricultural robot, ROI, point cloud density, object type, point density, black colored objects, point cloud variability measures, fixed object, greenhouse conditions, autonomous navigation, agricultural operations, horticultural operations, commercial RGB-D cameras, Fotonic F80 time-of-flight RGB-D camera, size 4.0 cm, size 1.0 cm to 3.0 cm, Cameras, Three-dimensional displays, Robot vision systems, End effectors, Green products
National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
computing science
Identifiers
urn:nbn:se:umu:diva-165545 (URN), 10.1109/ECMR.2019.8870935 (DOI), 2-s2.0-85074395978 (Scopus ID)
Conference
European Conference on Mobile Robots (ECMR), September 4–6, 2019, Prague, Czech Republic
Research funder
KK-stiftelsen; EU, Horizon 2020, 66313
Available from: 2019-11-26 Created: 2019-11-26 Last updated: 2019-11-27 Bibliographically approved
Ringdahl, O., Ostovar, A., Talbot, B., Puliti, S. & Astrup, R. (2019). Using RGB images and machine learning to detect and classify Root and Butt-Rot (RBR) in stumps of Norway spruce. In: Forest Operations in Response to Environmental Challenges. Paper presented at NB Nord Conference: Forest Operations in Response to Environmental Challenges, Honne, Norway, June 3-5, 2019.
Using RGB images and machine learning to detect and classify Root and Butt-Rot (RBR) in stumps of Norway spruce
2019 (English) In: Forest Operations in Response to Environmental Challenges, 2019. Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

Root and butt-rot (RBR) has a significant impact on both the material and economic outcome of timber harvesting. An accurate recording of the presence of RBR during timber harvesting would enable a mapping of the location and extent of the problem, providing a basis for evaluating spread in a climate anticipated to enhance pathogenic growth in the future. Therefore, a system to automatically identify and detect the presence of RBR would constitute an important contribution to addressing the problem without increasing workload complexity for the machine operator. In this study, we developed and evaluated an approach based on RGB images to automatically detect tree-stumps and classify them according to the absence or presence of rot. Furthermore, since knowledge of the extent of RBR is valuable in categorizing logs, we also classify stumps into three classes of infestation: rot = 0%, 0% < rot < 50%, and rot ≥ 50%. We used deep-learning approaches and conventional machine-learning algorithms for the detection and classification tasks. The results showed that tree-stumps were detected with a precision of 95% and a recall of 80%. Stumps without and with root and butt-rot were correctly classified with an accuracy of 83.5% and 77.5%, respectively. Classifying rot into three classes resulted in 79.4%, 72.4% and 74.1% accuracy, respectively. With some modifications, the developed algorithm could be used either during the harvesting operation to detect RBR regions on the tree-stumps or as an RBR detector for post-harvest assessment of tree-stumps and logs.

National subject category
Forest Science; Robotics and Automation; Signal Processing; Computer Vision and Robotics (Autonomous Systems)
Research subject
computer and systems sciences
Identifiers
urn:nbn:se:umu:diva-159977 (URN)
Conference
NB Nord Conference: Forest Operations in Response to Environmental Challenges, Honne, Norway, June 3-5, 2019
Research funder
Norges forskningsråd, NFR281140
Available from: 2019-06-11 Created: 2019-06-11 Last updated: 2019-06-25
Ostovar, A., Ringdahl, O. & Hellström, T. (2018). Adaptive Image Thresholding of Yellow Peppers for a Harvesting Robot. Robotics, 7(1), Article ID 11.
Adaptive Image Thresholding of Yellow Peppers for a Harvesting Robot
2018 (English) In: Robotics, E-ISSN 2218-6581, Vol. 7, no. 1, article id 11. Article in journal (Refereed) Published
Abstract [en]

The presented work is part of the H2020 project SWEEPER, with the overall goal to develop a sweet pepper harvesting robot for use in greenhouses. As part of the solution, visual servoing is used to direct the manipulator towards the fruit. This requires accurate and stable fruit detection based on video images. To segment an image into background and foreground, thresholding techniques are commonly used. The varying illumination conditions in the unstructured greenhouse environment often cause shadows and overexposure. Furthermore, the color of the fruits to be harvested varies over the season. All this makes it sub-optimal to use fixed pre-selected thresholds. In this paper we suggest an adaptive, image-dependent thresholding method. A variant of reinforcement learning (RL) is used, with a reward function that computes the similarity between the segmented image and the labeled image to give feedback for action selection. The RL-based approach requires less computational resources than exhaustive search, which is used as a benchmark, and results in higher performance compared to a Lipschitzian-based optimization approach. The proposed method also requires fewer labeled images compared to other methods. Several exploration-exploitation strategies are compared, and the results indicate that the Decaying Epsilon-Greedy algorithm gives the highest performance for this task. The highest performance with the Epsilon-Greedy algorithm (ϵ = 0.7) reached 87% of the performance achieved by exhaustive search, with 50% fewer iterations than the benchmark. The performance increased to 91.5% using the Decaying Epsilon-Greedy algorithm, with 73% fewer iterations than the benchmark.
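A minimal sketch of the decaying ϵ-greedy selection loop the abstract describes, framed here as a single-state bandit over a discrete set of candidate thresholds with a pixel-similarity reward. The bandit framing and all names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def decaying_eps_greedy_threshold(images, labels, thresholds,
                                  eps0=0.7, decay=0.99, seed=0):
    """Pick the best global threshold with decaying epsilon-greedy.
    Each action is one candidate threshold; the reward is the pixel-wise
    similarity between the thresholded image and its labeled mask."""
    rng = np.random.default_rng(seed)
    q = np.zeros(len(thresholds))   # running value estimate per action
    n = np.zeros(len(thresholds))   # visit count per action
    eps = eps0
    for img, mask in zip(images, labels):
        if rng.random() < eps:                      # explore
            a = int(rng.integers(len(thresholds)))
        else:                                       # exploit
            a = int(np.argmax(q))
        seg = img > thresholds[a]                   # apply the threshold
        reward = float((seg == mask).mean())        # similarity in [0, 1]
        n[a] += 1
        q[a] += (reward - q[a]) / n[a]              # incremental mean update
        eps *= decay                                # shrink exploration rate
    return thresholds[int(np.argmax(q))]
```

With eps0 = 0.7 and decay < 1, early episodes mostly explore while later episodes exploit the best-valued threshold, which is the behaviour the compared Decaying Epsilon-Greedy strategy relies on.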

Place, publisher, year, edition, pages
MDPI, 2018
Keywords
reinforcement learning, Q-Learning, image thresholding, ϵ-greedy strategies
National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
computerized image analysis
Identifiers
urn:nbn:se:umu:diva-144513 (URN), 10.3390/robotics7010011 (DOI), 000432680200008 ()
Research funder
EU, Horizon 2020, 644313
Available from: 2018-02-05 Created: 2018-02-05 Last updated: 2019-11-11 Bibliographically approved
Ringdahl, O., Kurtser, P. & Edan, Y. (2017). Strategies for selecting best approach direction for a sweet-pepper harvesting robot. In: Yang Gao, Saber Fallah, Yaochu Jin, Constantina Lekakou (Ed.), Towards Autonomous Robotic Systems: 18th Annual Conference, TAROS 2017, Guildford, UK, July 19–21, 2017, Proceedings. Paper presented at TAROS 2017: the 18th Towards Autonomous Robotic Systems (TAROS) Conference, University of Surrey, Guildford, UK, July 19–21, 2017 (pp. 516-525). Cham: Springer.
Strategies for selecting best approach direction for a sweet-pepper harvesting robot
2017 (English) In: Towards Autonomous Robotic Systems: 18th Annual Conference, TAROS 2017, Guildford, UK, July 19–21, 2017, Proceedings / [ed] Yang Gao, Saber Fallah, Yaochu Jin, Constantina Lekakou, Cham: Springer, 2017, pp. 516-525. Conference paper, Published paper (Refereed)
Abstract [en]

An autonomous sweet pepper harvesting robot must perform several tasks to successfully harvest a fruit. Due to the highly unstructured environment in which the robot operates and the presence of occlusions, the current challenges are to improve the detection rate and lower the risk of losing sight of the fruit while approaching it for harvest. Therefore, it is crucial to choose the approach direction with the least occlusion from obstacles.

The value of ideal information regarding the best approach direction was evaluated by comparing it to a method that attempts several directions until harvesting succeeds. A laboratory experiment was conducted on artificial sweet pepper plants using an eye-in-hand configuration comprising a 6DOF robotic manipulator equipped with an RGB camera. Performance was evaluated under laboratory conditions using descriptive statistics of the average harvesting times and harvesting success as well as regression models. The results show a roughly 40–45% increase in average harvest time when no a-priori information on the correct harvesting direction is available, with a nearly linear increase in overall harvesting time for each failed harvesting attempt. The variability of the harvesting times grows with the number of approaches required, reducing the ability to predict them.

Tests show that occlusion of the front of the peppers significantly impacts the harvesting times. The major reason is the limited workspace of the robot, which often makes the paths to positions at the side of the peppers significantly longer than those to positions in front of the fruit, where the space is more open.
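The reported near-linear growth of harvest time with failed approach attempts can be written as a one-line model. The constants below are hypothetical placeholders, not measured values from the experiment.

```python
# Hypothetical constants, not measured values from the experiment.
def expected_harvest_time(failed_attempts, base_s=20.0, penalty_s=8.0):
    """Total time = base cycle time + fixed penalty per failed approach."""
    return base_s + failed_attempts * penalty_s

for k in range(4):
    print(f"{k} failed approaches -> {expected_harvest_time(k):.0f} s")
```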

Place, publisher, year, edition, pages
Cham: Springer, 2017
Series
Lecture Notes in Computer Science: Lecture Notes in Artificial Intelligence, ISSN 0302-9743, E-ISSN 1611-3349; 10454
National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
computer and systems sciences
Identifiers
urn:nbn:se:umu:diva-138381 (URN), 10.1007/978-3-319-64107-2_41 (DOI), 000453217100041 (), 978-3-319-64107-2 (ISBN), 978-3-319-64106-5 (ISBN)
Conference
TAROS 2017: the 18th Towards Autonomous Robotic Systems (TAROS) Conference, University of Surrey, Guildford, UK, July 19–21, 2017
Research funder
EU, Horizon 2020, 66313
Available from: 2017-08-21 Created: 2017-08-21 Last updated: 2019-01-08 Bibliographically approved
Ostovar, A., Hellström, T. & Ringdahl, O. (2016). Human Detection Based on Infrared Images in Forestry Environments. In: Image Analysis and Recognition (ICIAR 2016): 13th International Conference, ICIAR 2016, in Memory of Mohamed Kamel, Póvoa de Varzim, Portugal, July 13-15, 2016, Proceedings. Paper presented at 13th International Conference on Image Analysis and Recognition, ICIAR 2016, July 13-15, 2016, Póvoa de Varzim, Portugal (pp. 175-182).
Human Detection Based on Infrared Images in Forestry Environments
2016 (English) In: Image Analysis and Recognition (ICIAR 2016): 13th International Conference, ICIAR 2016, in Memory of Mohamed Kamel, Póvoa de Varzim, Portugal, July 13-15, 2016, Proceedings, 2016, pp. 175-182. Conference paper, Published paper (Refereed)
Abstract [en]

A reliable system for detecting humans in close range of forestry machines is essential, so that cutting or carrying operations can be stopped before anyone is harmed. Due to the lighting conditions and the high occlusion from vegetation, human detection using RGB cameras is difficult. This paper introduces two human detection methods for forestry environments using a thermal camera: one shape-dependent and one shape-independent approach. Our segmentation algorithm estimates the location of the human by extracting the vertical and horizontal borders of regions of interest (ROIs). Based on the segmentation results, features such as the ratio of height to width and the location of the hottest spot are extracted for the shape-dependent method. For the shape-independent method, all extracted ROIs are resized to the same size, and the pixel values (temperatures) are used as a set of features. The features from both methods are fed into different classifiers, and the results are evaluated using side-accuracy and side-efficiency. The results show that by using shape-independent features, based on three consecutive frames, we reach a precision of 80% and a recall of 76%.
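A sketch of the shape-independent feature extraction as described: each thermal ROI is resized to a fixed size and its pixel values (temperatures) are used directly as features. The 16×16 target size and the nearest-neighbour resize are assumptions made to keep the example dependency-free.

```python
import numpy as np

def shape_independent_features(roi, size=(16, 16)):
    """Resize a thermal ROI to a fixed size and return the pixel values
    (temperatures) as a flat feature vector, ready for any classifier."""
    h, w = roi.shape
    rows = np.arange(size[0]) * h // size[0]   # source row per target row
    cols = np.arange(size[1]) * w // size[1]   # source col per target col
    return roi[np.ix_(rows, cols)].ravel().astype(np.float32)

# A fake 60x25 ROI with temperatures between 20 and 37 degrees C.
roi = np.random.default_rng(1).uniform(20.0, 37.0, size=(60, 25))
print(shape_independent_features(roi).shape)   # -> (256,)
```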

Series
Lecture Notes in Computer Science, ISSN 0302-9743; 9730
Keywords
Human detection, Thermal images, Shape-dependent, Shape-independent, Side-accuracy, Side-efficiency
National subject category
Robotics and Automation
Identifiers
urn:nbn:se:umu:diva-124428 (URN), 10.1007/978-3-319-41501-7_20 (DOI), 000386604000020 (), 978-3-319-41501-7 (ISBN), 978-3-319-41500-0 (ISBN)
Conference
13th International Conference on Image Analysis and Recognition, ICIAR 2016, July 13-15, 2016, Póvoa de Varzim, Portugal
Available from: 2016-08-10 Created: 2016-08-10 Last updated: 2019-11-11 Bibliographically approved
Ringdahl, O., Kurtser, P., Barth, R. & Edan, Y. (2016). Operational flow of an autonomous sweetpepper harvesting robot. Paper presented at The 5th Israeli Conference on Robotics 2016, 13-14 April 2016, Air Force Conference Center, Herzliya, Israel.
Operational flow of an autonomous sweetpepper harvesting robot
2016 (English) Conference paper, Oral presentation with published abstract (Refereed)
Abstract [en]

Advanced automation is required for greenhouse production systems due to the lack of a skilled workforce and increasing labour costs [1]. As part of the EU project SWEEPER, we are working on developing an autonomous robot able to harvest sweet pepper fruits in greenhouses. This paper focuses on the operational flow of the robot for high-level task planning.

In the SWEEPER project, an RGB camera is mounted on the end effector to detect fruits. Due to the dense plant rows, the camera is located at most 40 cm from the plants and hence cannot provide an overview of all fruit locations; only a few ripe fruits can be seen at each acquisition. This implies that the robot must incorporate a search pattern to look for fruits. When at least one fruit has been detected in the image, the search is aborted and a harvesting phase is initiated. The phase starts with directing the manipulator to a point close to the fruit and then activating a visual servo control loop. This motion approach ensures that the fruit is grasped despite the occlusions caused by stems and leaves. When the manipulator has reached the fruit, the fruit is harvested and automatically released into a container. If more fruits have already been detected, the system continues to pick them. When all detected fruits have been harvested, the system resumes the search pattern. When the search pattern is finished and no more fruits are detected, the robot base is advanced along the row to the next plant and the operations above are repeated.

To support implementation of the workflow in a program controlling the actual robot, a generic software framework for developing agricultural and forestry robots was used [2]. The framework is constructed as a hybrid robot architecture, using a state machine that implements the workflow as a flowchart.
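The flowchart itself is not reproduced in this record, but the operational flow described above maps naturally onto a small state machine. The sketch below is a plain-Python reading of that text, not the actual framework from [2]; the states and input signals are illustrative.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH = auto()    # run the search pattern, looking for ripe fruits
    APPROACH = auto()  # move close to a fruit, then visual servoing
    HARVEST = auto()   # cut the fruit and release it into the container
    ADVANCE = auto()   # move the robot base to the next plant

def step(state, fruits_detected, fruits_pending, search_done):
    """One transition of the harvesting workflow described above."""
    if state is State.SEARCH:
        if fruits_detected:
            return State.APPROACH              # abort search, start harvesting
        return State.ADVANCE if search_done else State.SEARCH
    if state is State.APPROACH:
        return State.HARVEST                   # servo loop reached the fruit
    if state is State.HARVEST:
        # Keep picking already-detected fruits, otherwise resume the search.
        return State.APPROACH if fruits_pending else State.SEARCH
    return State.SEARCH                        # after advancing, search again
```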

National subject category
Computer Vision and Robotics (Autonomous Systems)
Research subject
computer and systems sciences
Identifiers
urn:nbn:se:umu:diva-121333 (URN)
Conference
The 5th Israeli Conference on Robotics 2016, 13-14 April 2016, Air Force Conference Center, Herzliya, Israel
Research funder
EU, Horizon 2020, 66313
Available from: 2016-05-31 Created: 2016-05-31 Last updated: 2018-08-10
Bontsema, J., Hemming, J., Pekkeriet, E., Saeys, W., Edan, Y., Shapiro, A., . . . Ringdahl, O. (2015). CROPS: Clever Robots for Crops. Engineering & Technology Reference, 1(1).
CROPS: Clever Robots for Crops
2015 (English) In: Engineering & Technology Reference, ISSN 2056-4007, Vol. 1, no. 1. Article in journal (Refereed) Published
Abstract [en]

In the EU-funded CROPS project, robots are developed for site-specific spraying and selective harvesting of fruit and fruit vegetables. The robots are being designed to harvest crops such as greenhouse vegetables, apples and grapes, to perform canopy spraying in orchards, and to perform precision target spraying in grapevines. Attention is paid to the detection of obstacles for safe autonomous navigation in plantations and forests. Platforms were built for the different applications, and sensing systems and vision algorithms have been developed. For software, the Robot Operating System is used. A 9-degrees-of-freedom manipulator was designed and tested for sweet-pepper harvesting, apple harvesting and close-range spraying. Different end-effectors were designed and tested for the applications. For sweet pepper, a platform was built that can move between the crop rows on the common greenhouse rail system, which also serves as heating pipes. The apple harvesting platform is based on a current mechanical grape harvester. In discussion with growers, so-called 'walls of fruit trees' have been designed, which bring the robots closer to practical use. A canopy-optimised sprayer has been designed as a trailed sprayer with a centrifugal blower. All the applications have been tested under practical conditions.

National subject category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
urn:nbn:se:umu:diva-108099 (URN), 10.1049/etr.2015.0015 (DOI)
Research funder
EU, FP7, Seventh Framework Programme, 246252
Available from: 2015-09-03 Created: 2015-09-03 Last updated: 2018-06-07 Bibliographically approved
Lindroos, O., Ringdahl, O., Pedro, L. H., Hohnloser, P. & Hellström, T. (2015). Estimating the position of the harvester head: a key step towards the precision forestry of the future? Croatian Journal of Forest Engineering, 36(2), 147-164.
Estimating the position of the harvester head: a key step towards the precision forestry of the future?
2015 (English) In: Croatian Journal of Forest Engineering, ISSN 1845-5719, E-ISSN 1848-9672, Vol. 36, no. 2, pp. 147-164. Article in journal (Refereed) Published
Abstract [en]

Modern harvesters are technologically sophisticated, with many useful features such as the ability to automatically measure stem diameters and lengths. This information is processed in real time to support value optimization when cutting stems into logs. It can also be transferred from the harvesters to centralized systems and used for wood supply management. Such information management systems have been available since the 1990s in Sweden and Finland, and are constantly being upgraded. However, data on the position of the harvester head relative to the machine are generally not recorded during harvesting. The routine acquisition and analysis of such data could offer several opportunities to improve forestry operations and related processes in the future. Here, we analyze the possible benefits of having this information, as well as the steps required to collect and process it. The benefits and drawbacks of different sensing technologies are discussed in terms of potential applications, accuracy and cost. We also present the results of preliminary testing using two of the proposed methods. Our analysis indicates that an improved scope for mapping and controlling machine movement is the main benefit that is directly related to the conduct of forestry operations. In addition, there are important indirect benefits relating to ecological mapping. Our analysis suggests that both of these benefits can be realized by measuring the angles of crane joints or the locations of crane segments and using the resulting information to compute the head's position. In keeping with our findings, two companies have recently introduced sensor-equipped crane solutions.
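As the abstract notes, the head position can be computed from measured crane joint angles and known segment geometry, i.e. forward kinematics. Below is a minimal planar sketch under simplifying assumptions (revolute joints only, rigid segments of known length); a real harvester crane also has telescopic extension and full 3-D motion.

```python
import numpy as np

def head_position(joint_angles_rad, segment_lengths_m):
    """Planar forward kinematics: accumulate joint rotations along the
    crane and sum each segment's contribution to the head position."""
    x = z = 0.0
    total = 0.0
    for angle, length in zip(joint_angles_rad, segment_lengths_m):
        total += angle                 # orientation of this segment
        x += length * np.cos(total)    # horizontal reach
        z += length * np.sin(total)    # height
    return x, z

# Example: a two-segment crane with the pillar joint at the origin.
print(head_position([0.6, -0.9], [4.0, 3.5]))
```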

Place, publisher, year, edition, pages
Zagreb, Croatia: Croatian Journal of Forest Engineering, 2015
Keywords
boom tip control, automation, ALS, sensors, harvester data
National subject category
Forest Science; Robotics and Automation; Computer Vision and Robotics (Autonomous Systems)
Research subject
computer and systems sciences
Identifiers
urn:nbn:se:umu:diva-109881 (URN), 000363907900001 ()
Available from: 2015-10-08 Created: 2015-10-08 Last updated: 2018-06-07 Bibliographically approved
Bontsema, J., Hemming, J., Pekkeriet, E., Saeys, W., Edan, Y., Shapiro, A., . . . Ringdahl, O. (2014). CROPS: high tech agricultural robots. Paper presented at AgEng, International Conference of Agricultural Engineering, Zurich, July 6-10, 2014. The European Society of Agricultural Engineers.
CROPS: high tech agricultural robots
2014 (English) Conference paper, Published paper (Other academic)
Place, publisher, year, edition, pages
The European Society of Agricultural Engineers, 2014
National subject category
Robotics and Automation
Research subject
computer and systems sciences
Identifiers
urn:nbn:se:umu:diva-95232 (URN)
Conference
AgEng, International Conference of Agricultural Engineering, Zurich, July 6-10, 2014
Project
Crops
Research funder
EU, FP7, Seventh Framework Programme, 246252
Available from: 2014-10-24 Created: 2014-10-24 Last updated: 2019-06-25 Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-4600-8652