Autonomous Response using Intelligence 4 Emergency Scenarios

Project ARIES – A system of intelligent and connected autonomous vehicles to provide an immediate response and action to emergency scenarios

In Uttarakhand, Nepal and, most recently, Kerala, floods have caused widespread devastation, resulting in tremendous damage to people and property. The heaps of debris spread over large areas after the floods made it difficult to carry out search and rescue operations or to navigate manually, and many lives were lost because emergency support was not available in time. In another recent incident, in Thailand, 13 people were trapped in a cave. With the increasing rate of natural and man-made calamities, we need to re-examine our ability to carry out an emergency response operation without delay.

With the help of the latest technology, it is now possible to reduce the delay in providing help and support in an emergency scenario such as a disaster, calamity or accident. It also drastically reduces the workforce required to carry out highly sophisticated tasks. Using a connected autonomous system consisting of an aerial vehicle (drone) and a ground vehicle (Donkey Car), a scene can be monitored in real time and instant support can be provided.

Keywords: Connected autonomous systems, drone, donkey car, emergency response

Project Goals:

  • Autonomous aerial and ground operation
  • Aerial and terrain mapping
  • Image segmentation and analysis for safe zones
  • Path planning and autonomous navigation
  • Obstacle detection and avoidance
  • Emergency assist and payload transportation
  • Connected systems with multiple units
  • Automatic solar charging
  • Live monitoring and control
  • Mobile application support

Workflow and Explanation:

As the project unfolded, we ran into complexity, design, time and resource constraints, which caused some variations from the originally proposed idea. However, the project will remain active until the requirements are met, and beyond that if additional requirements are discovered.

This project guide explains all the basic steps and procedures performed towards attaining the final goals of the project. I won't be writing in a highly technical style; instead, I will try to keep this as simple as possible so that anyone referring to this guide can follow and understand the concepts. Please also keep in mind that I would like to keep this guide as compact as possible for the sake of readability, so wherever applicable I will refer to resources on other websites where detailed information can be obtained instead of repeating everything here. The most important points, however, will be mentioned here.

This project is divided into two main sections:

1. Aerial drone (Eye In The Sky)

2. Ground vehicle (Ground Scout)

First of all, let me introduce our eye in the sky. We call this machine… the ASPIRE.

ASPIRE is tasked to follow a set of way-points, scan the area and, if it finds anything of concern, report it back to the home base. In our case, we provide the way-point information. The drone traverses each of these way-points automatically and, with the help of the attached camera, checks whether any humans are present in the area. If it finds someone, their location is reported back to the base.

Hardware implementation of ASPIRE:

ASPIRE can be any multi-rotor equipped with a Raspberry Pi 3 as a companion computer running Ubuntu MATE, in addition to the Pixhawk 4 flight controller. We have used an S500 frame to build ASPIRE. The BLDC motors are rated at 920 KV, each carrying a 9045 propeller. Details of building a multi-rotor are available online, so I will not repeat them here. The connection between the Raspberry Pi and the Pixhawk, and how to configure them, is explained in detail here. For creating automatic way-point missions, a variety of software such as QGroundControl, Mission Planner etc. is available. DroneKit was used for the simulations that were done to finalize the mission capabilities of the drone. There is a great set of tutorials by Tiziano Fiorenzani on setting up and using DroneKit in various drone applications. We used FlytOS and its APIs to define and execute way-point missions on the real hardware, as it is based on ROS. The Raspberry Pi is used to manage and execute these tasks, detect humans and report the location back to the home base. A Logitech C270 HD camera is used for image capture; the Raspberry Pi camera module can also be used. A u-blox NEO-M8N GPS module with compass is used for localisation and navigation.
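Since DroneKit was used only to simulate and validate the mission logic, a minimal sketch of such a test against an ArduPilot SITL instance is shown below. The connection string, altitude and way-point coordinates are illustrative placeholders (the default SITL home area), not the values used in the actual missions.

#!/usr/bin/python
#-- Minimal DroneKit sketch used to rehearse the mission in SITL.
#-- The connection string and coordinates are illustrative placeholders.
import time
from dronekit import connect, VehicleMode, LocationGlobalRelative

vehicle = connect('udp:127.0.0.1:14550', wait_ready=True)  # ArduPilot SITL

#-- Arm and take off to 5 m in GUIDED mode
vehicle.mode = VehicleMode('GUIDED')
vehicle.armed = True
while not vehicle.armed:
	time.sleep(1)
vehicle.simple_takeoff(5)
while vehicle.location.global_relative_frame.alt < 4.5:
	time.sleep(1)

#-- Fly to a test way-point, wait, then land
vehicle.simple_goto(LocationGlobalRelative(-35.3622810, 149.1651620, 5))
time.sleep(30)
vehicle.mode = VehicleMode('LAND')
vehicle.close()

Once a mission behaves as expected in SITL, the equivalent way-point logic is carried over to FlytOS for the real hardware, as described above.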

Software implementation of ASPIRE:

The software implementation for ASPIRE is purely based on Python, ROS and FlytOS. FlytOS is a software framework and platform that can be used to develop custom APIs for controlling a variety of drones. The backbone of this framework is built on the ROS, MAVROS and MAVLink modules. Using the FlytOS APIs, we can call functions that carry out specific tasks such as drone takeoff, landing, position control, way-point execution etc.

#!/usr/bin/python
#-------------------------------------------------------------------------------#
#-- PROJECT ARIES                                                               #
#                                                                               #
#-- PROGRAM GOALS                                                               #
#-- 1. Control a multi-copter autonomously using Pixhawk, RPi, FlytOS           #
#-- 2. Detect humans and provide warnings                                       #
#                                                                               #
#-- Programmers: Cris Thomas, Jiss Joseph Thomas                                #
#-- References: FlytOS, OpenCV HAAR                                             #
#-- Contact: [email protected], [email protected]                     #
#-------------------------------------------------------------------------------#
import time
import cv2
from flyt_python import api
#-- Setup people detection using the OpenCV HAAR full-body cascade
person_cascade = cv2.CascadeClassifier('haarcascade_fullbody.xml')
cap = cv2.VideoCapture(0)
#-- People detection function using HAAR
def do_people_detect():
	detected = False
	r, frame = cap.read()
	if r:
		frame = cv2.resize(frame, (640, 360)) # Downscaling for faster detection
		gray_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
		rects = person_cascade.detectMultiScale(gray_frame)
		for (x, y, w, h) in rects:
			detected = True
			cv2.rectangle(frame, (x, y), (x+w, y+h), (0, 255, 0), 2)
		cv2.imshow("preview", frame)
		cv2.waitKey(1)
		if detected:
			print 'human(s) detected'
	return detected
#-- instance of flyt drone navigation class
drone = api.navigation(timeout=120000)
#-- allow a few seconds for the drone interface to initialise properly
time.sleep(7)
print 'Drone ready'
print 'taking off'
drone.take_off(5.0)
print 'Executing pre-defined setpoints'
#-- Fly a square survey pattern (5 m legs), checking for people after each leg.
#-- A fixed number of laps is used so that the drone eventually lands;
#-- the lap count here is an arbitrary choice.
for lap in range(3):
	drone.position_set(5, 0, 0, relative=True)
	do_people_detect()
	drone.position_set(0, 5, 0, relative=True)
	do_people_detect()
	drone.position_set(-5, 0, 0, relative=True)
	do_people_detect()
	drone.position_set(0, -5, 0, relative=True)
	do_people_detect()
print 'Landing'
drone.land(async=False)
#-- shutdown the instance
drone.disconnect()

What this code basically does is create an instance of the drone navigation class defined in the flyt_python module and carry out predefined tasks such as takeoff, survey etc. in a sequential manner. It also synchronously checks for people after each movement and, when someone is identified, prints a warning on the ground control terminal screen.
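Note that the script above only prints the warning locally; the step of reporting the detection location back to the home base still has to be wired in. Below is a minimal sketch of one possible way to do this with a plain UDP socket from the Python standard library. The ground-station address, port and message format are assumptions made purely for illustration and are not part of the FlytOS API or the original code.

#-- Sketch: forward a detection event to the ground station over UDP.
#-- The address/port and message layout below are illustrative assumptions.
import json
import socket
import time

BASE_STATION = ('192.168.1.10', 9000)  # hypothetical ground-station address
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def report_detection(lat, lon, alt):
	#-- Package the detection with the drone's current position and a timestamp
	msg = {'event': 'human_detected', 'lat': lat, 'lon': lon, 'alt': alt, 'time': time.time()}
	sock.sendto(json.dumps(msg).encode('utf-8'), BASE_STATION)

In the survey loop, report_detection() would be called whenever do_people_detect() returns True, with the current latitude, longitude and altitude taken from the drone's telemetry.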


