Farmaid: Plant Disease Detection Robot

A robot that drives around autonomously in a greenhouse environment and identifies plant diseases.

Inspired by the work of plantvillage.psu.edu and iita.org, we wanted to use the DonkeyCar platform to build an autonomous robot that can move through a farm environment without damaging existing plants or soil, and use object detection to find and mark diseased crops with an environmentally safe color. Traditionally, humans have to inspect large farms manually; even in high-tech operations they use their phones to mark the crops. This takes a lot of time and effort. In addition, the variety of phones in use means a given device may lack the features needed to do the task efficiently, or workers have to wait for someone with the proper device. A uniform robotic platform travelling around the farm solves these problems and makes the marking much faster. That speed can also make it easier to share the platform between multiple farms.

Challenges:

  • Keeping the size/weight of the robot small enough that it doesn't damage the crops itself.
  • Navigating without damaging existing crops.
  • Finding a way to safely mark diseased crops.
  • Finding a dataset and a farm on which to test the platform.

Our Teamato team came together because we are all members of the Detroit Autonomous Vehicle Group and the Ann Arbor Autonomous Vehicle Group, both Meetup groups. Our team member Sohaib entered the challenge with the above concept and created a post asking if anyone was interested in participating. Alex, Juanito, and David joined with Sohaib, and so began a common quest among individuals who had never worked together before. Beyond finding common ground on approach, tech, and timing, we had to lay down a framework of meeting schedules, repositories, conferencing tools, and so on. Essentially, all of the components that go into a professional project had to be put in place, except no one was getting paid, we had no budget, and everyone had work, school, and family commitments. Not a problem, as we shared a mutual vision and the will to execute. Interestingly, our group of four represented an international community: each member of our team was multilingual and had direct family ties to one or more of China, Germany, Pakistan, the Philippines, and Russia. We all had a great time and it was an amazing learning experience.

Building the Robot:

Work on the chassis, autonomous navigation, and image classification began immediately and progressed at a good pace. Where we ran into major unexpected challenges and delays was with our chassis and drive system. Simply put, we did not anticipate such varying terrain among the test greenhouses, and motors, wheels, wiring, and controls that were fine in scenario A were overwhelmed in scenario B. We went through a large number of modifications to dial in a chassis that was workable in all of our environments. We had to work within tight time and budget constraints, but the end product exceeded our initial goal of a minimum viable configuration. The final design at the time of submission is described below.

Camera Pole:

To be able to look at raised beds of plants, and potentially upgrade to a moving camera that could look at the top and bottom of tomato plants, we built a camera pole from a carbon fiber rod bought at a garage sale. The rod was fitted with two 3D-printed clamps for the navigation and classification cameras. We also added 1.2 V solar lighting to the pole, as well as 12 V multicolor status lights on top of the pole. Yes, that is a repurposed pill container painted black on top of the pole, one of our many zero-budget accommodations that worked just great!

The cameras were Raspberry Pi Cameras attached to two different Pis powered by USB chargers. The reason for using two Pis is that both classification and navigation run a neural network, which takes a lot of processing power. Additionally, the classification camera had to point toward the plants while the navigation camera had to point ahead. The top of the pole also had to carry lights to serve as status indicators. When we searched for RGB lights that would be bright enough, we found they would cost upwards of $100, so we made our own using lights from a speaker, a small plastic bag for reflection, encased in an empty pill bottle. Since the lights required 12 volts and our Arduino output was 5 volts, we connected them through a relay. The connection required a common ground with the Arduino and three wires for the red, green, and blue lights, which we placed on pins 7, 8, and 11 of the Arduino. We could simulate the RGB spectrum on these lights by using the analogWrite() function to give different values to all three wires. Note that for correct coloring, all three pins need to be written; otherwise a value previously written to any one pin can produce unexpected results.
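As a rough illustration of the setup described above, here is a minimal Arduino sketch for driving the three channels. The pin numbers (7, 8, and 11) come from our build; the setStatusColor() helper name and the example colors are just for illustration.

// Minimal sketch of the status-light control described above.
// Pins 7, 8, and 11 drive the red, green, and blue channels
// through the relay board; the 12 V side shares a common ground.

const int RED_PIN   = 7;
const int GREEN_PIN = 8;
const int BLUE_PIN  = 11;

void setStatusColor(int red, int green, int blue) {
  // Always write all three channels; leaving one at a previously
  // written value can produce an unexpected mixed color.
  analogWrite(RED_PIN, red);
  analogWrite(GREEN_PIN, green);
  analogWrite(BLUE_PIN, blue);
}

void setup() {
  pinMode(RED_PIN, OUTPUT);
  pinMode(GREEN_PIN, OUTPUT);
  pinMode(BLUE_PIN, OUTPUT);
  setStatusColor(0, 0, 0);     // start with the lights off
}

void loop() {
  setStatusColor(255, 0, 0);   // example: red for a flagged plant
  delay(1000);
  setStatusColor(0, 255, 0);   // example: green for all clear
  delay(1000);
}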

Chassis:

Our experiments with a plastic chassis, using both wheels and tracks driven by low-power motors, had proven unsuccessful at our test locations at Stone Coop and Growing Hope farms: both options trenched into the sandy soil that is beneficial for the plants. On one of our interim chassis versions we stripped a lot of plastic gears before upgrading to metal gears and the ability to handle higher current:


