
A pollinating nanodrone

29 April '22

One of the main challenges to achieving sustainable food production on Mars is developing an innovative approach to pollinating crops. On Mars, relying on biological agents such as bee colonies is undesirable or even impossible because of the difficulty of setting up the required ecosystems. A more promising approach is to pollinate crops with autonomous machines. In the SpaceBakery project, Magics Technology leverages its expertise in robotic systems and machine learning to build a prototype robotic pollinator that can locate and approach crops without any manual intervention. To this end, nanodrones were selected as the most suitable robots: their versatility and small form factor let them navigate through crops and narrow spaces.

Drone

Problem Definition

To demonstrate a solution to this problem, we define a scenario in which a nanodrone is tasked with pollinating sunflowers. The flowers are initially out of sight, and the nanodrone has to locate them, approach them, and pollinate one flower at a time. Pollination counts as successful when the nanodrone touches the flower with its pollination probe and retracts afterwards. The scenario requires that all sensing and processing operations be performed on board.

Nanodrone

Solution & Achievement

The hardware consists of a nanodrone platform measuring 20 cm in diameter (see the figure below). On top, a neural network processor equipped with a camera and a distance sensor is mounted. The software consists of two main components: computer vision and the flight controller. The computer vision component receives an image from the camera and computes the locations of the flowers, which are then passed to the flight controller. Based on this input, the controller performs different flying behaviors, ranging from searching and approaching to retracting.
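
To give a rough picture of how the two components interact, the sketch below shows one perception-control cycle. All names and thresholds are illustrative assumptions; the actual detector and controller run on the onboard processor and are not public.

def detect_flower(frame):
    """Stub for the neural-network detector.
    Returns (cx, cy, size) in image coordinates, or None if no flower is seen."""
    return None  # placeholder: the real detector runs on the edge-AI processor

def choose_behavior(detection, near_threshold=0.4):
    """Map the detector output to one of the flying behaviors."""
    if detection is None:
        return "search"        # rotate and vary altitude until a flower appears
    _, _, size = detection
    if size < near_threshold:  # apparent size as a rough distance proxy
        return "approach"
    return "retract"           # close enough: probe has touched, back away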

We leverage computer vision techniques to detect the sunflower as well as its stamen. Specifically, we train two neural networks, one for sunflower detection and one for stamen detection. We use a tool based on Blender, a 3D engine, to render synthetic training images. Thanks to the use of a 3D sunflower model, the synthetic images look quite realistic; see the figures below, where the green box indicates the location of the sunflower. The tool also provides the coordinates of the sunflower in each image, which serve as labels for neural network training.
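
The label-generation step can be pictured with a standard pinhole projection: given the camera intrinsics and the 3D position of the flower in the rendered scene, its 2D image coordinates follow directly. The snippet below is a generic illustration of this idea, not the project's actual Blender tooling; the resolution and focal parameters are made up for the example.

def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3D point in camera coordinates onto the image plane
    using the pinhole model: u = fx * x / z + cx, v = fy * y / z + cy."""
    x, y, z = p_cam
    return fx * x / z + cx, fy * y / z + cy

# Example: a flower center 1.5 m in front of a 324x244 px camera
# (illustrative intrinsics, not the real sensor's calibration).
u, v = project_point((0.1, -0.05, 1.5), fx=180.0, fy=180.0, cx=162.0, cy=122.0)
print(f"flower center maps to pixel ({u:.1f}, {v:.1f})")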

Sunflowers

The flight controller performs visual servoing, that is, it controls the drone's position and orientation based on the flower location. Each propeller can be individually driven at a desired speed to achieve a given behavior. To successfully pollinate a flower, the drone performs the sequence of behaviors illustrated in the figure below. First, the drone takes off and searches for the flower; the searching behavior consists of rotating and increasing or decreasing the altitude. Once the detector finds and locates a sunflower, the drone gradually approaches it. The detected coordinates of the sunflower guide the flight controller, which adjusts the drone's position so that the sunflower stays centered in the camera view. Once the sunflower is close enough, the stamen detector is activated; it provides the flight controller with the location of the stamen rather than the entire sunflower.
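
In its simplest form, this kind of visual servoing reduces to a proportional law: the pixel offset between the detection and the image center becomes a yaw and altitude correction. The sketch below illustrates that idea; the gains and frame size are illustrative assumptions, not the values tuned on the drone.

def servo_command(cx, cy, img_w=324, img_h=244, k_yaw=0.002, k_alt=0.002):
    """Proportional visual servoing: steer so the detection stays centered.

    cx, cy : detected flower center in pixels
    Returns (yaw_rate, climb_rate); positive means turn right / climb.
    """
    err_x = cx - img_w / 2   # horizontal pixel error
    err_y = img_h / 2 - cy   # vertical pixel error (image y grows downward)
    return k_yaw * err_x, k_alt * err_y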

The result of this solution is promising: the drone can robustly locate and pollinate sunflowers placed against diverse backgrounds and from different takeoff locations. We believe our algorithm can tackle more complex scenarios once the processing chip becomes more powerful and the drone's power consumption drops. For deployment in a realistic setting, an external global navigation system might be used to help each nanodrone locate itself within the farm.

Nanodrone to sunflower

Currently, our algorithm can handle single- and double-flower scenarios. In the single-sunflower scenario, illustrated in the figure above, the sequence is: 1) take off, 2) search, 3) approach, guided by the detected sunflower, 4) approach and touch the stamen, guided by the detected stamen, and 5) move backwards. In the double-sunflower scenario, the drone pollinates the flowers one by one. The procedure is similar to pollinating a single flower, but an additional algorithm distinguishes the two flowers. This algorithm enables the flight controller to approach and pollinate the flowers individually even when they look identical.
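
The five-step sequence maps naturally onto a small finite-state machine. The sketch below is a minimal rendering of that idea, with illustrative transition conditions rather than the controller's actual logic.

def next_state(state, flower_seen, flower_near, stamen_touched):
    """One transition of the pollination state machine (conditions illustrative)."""
    if state == "takeoff":
        return "search"
    if state == "search":
        return "approach_flower" if flower_seen else "search"
    if state == "approach_flower":
        return "approach_stamen" if flower_near else "approach_flower"
    if state == "approach_stamen":
        return "retract" if stamen_touched else "approach_stamen"
    return "retract"  # move backwards; the next flower can then be handled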

Nanodrone ASIC

The Nanodrone ASIC

To further improve the compute performance and efficiency of the nanodrone, we have developed a customized edge-AI dataflow processor for this application. The nanodrone ASIC is an edge-AI chip capable of real-time inference of AI models on camera images for robotics applications. To acquire sensory information from its environment, it is equipped with common industry-standard interfaces such as SPI, I2C, UART, a parallel camera interface, SDIO, and I2S. The chip can act on this sensory information through its GPIOs, each of which can serve as a simple input, an output, or a PWM output to control actuators and motors. To speed up inference of edge-AI models, the chip includes an accelerator capable of running common neural network operations (fully connected and convolutional layers, pooling, activation functions, residual connections, …) efficiently. A system-level diagram of the chip is shown in the figure below.
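
As a concrete example of what the PWM-capable GPIOs do, translating a motor command into timer settings is simple arithmetic. The sketch below assumes an illustrative timer clock and switching frequency; it does not reflect the chip's actual register layout.

def pwm_settings(duty, freq_hz=20_000, timer_clk_hz=100_000_000):
    """Translate a 0..1 motor command into PWM timer values.

    duty         : requested duty cycle (0.0 = off, 1.0 = full speed)
    freq_hz      : PWM switching frequency (illustrative)
    timer_clk_hz : timer input clock (illustrative)
    Returns (period_ticks, compare_ticks) to program into the timer.
    """
    period = timer_clk_hz // freq_hz                  # ticks per PWM period
    compare = int(max(0.0, min(1.0, duty)) * period)  # ticks spent high
    return period, compare

# Example: 60% throttle -> period 5000 ticks, compare 3000 ticks
print(pwm_settings(0.6))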

Summary

We believe that by combining energy-efficient yet powerful AI hardware with advanced computer vision algorithms, we can truly automate many labor-intensive tasks in agriculture, such as inspection and pollination, thereby improving crop yield while reducing production costs. This benefits the sustainability of our agriculture industry. And if humanity ever becomes an interplanetary species and sets foot on Mars, intelligent robots will be our best companions in colonizing unknown worlds.
