An Approach to Semi-Autonomous Indoor Drone System: Software Architecture and Integration Testing

INTRODUCTION

An Unmanned Aerial Vehicle (UAV), commonly known as a drone, is an aircraft without
a human pilot aboard. UAVs are a component of an unmanned aircraft system (UAS), which also
includes a ground-based controller and a system of communications between the two (Watts et al.,
2012). The commercial successes of DJI, GoPro, 3D Robotics and other drone companies have
pushed researchers toward technological advances that achieve a higher degree of autonomy (Canis,
2015). Current commercial drone systems are automatically controlled but are manually driven,
and their degree of autonomy lies below a score of 4 on the scale defined by Proud et al. (2003) from the NASA
Spaceship Mission Assessment and Re-planning Tool (SMART) team. To push toward a higher level on this
scale, however, the available open software libraries offer only limited support for integrating navigation
and control algorithms. At a higher level of autonomy, we should also be able to integrate machine
learning and artificial intelligence algorithms. To achieve such a UAS,
we must work in a direction that enables not only automatic control but also automatic drive
functionality.

There have also been some commercial efforts to fly drones indoors, using
Pozyx, OptiTrack and other technologies (3D Robotics, 2019b). Outdoor drone systems face reliability problems related to localization when used indoors. Barometers are inconsistent indoors
because of artificially modified conditions, e.g., temperature and pressure (Li et al., 2013).
GPS is also unreliable indoors due to signal loss, even though it offers centimeter-level precision
outdoors (Dedes and Dempster, 2005). For these reasons, an indoor drone system becomes
highly unstable; it is best to remove the barometer and GPS sensors and look for alternatives. In
our case, we settled on sonar and LiDAR (Light Detection and Ranging) sensors as alternatives
to the barometer and GPS, respectively, as proposed in previous studies (Gossett, 2018),
to enable indoor localization.
As our study progressed, we faced problems related to sensor integration with the existing
drone system. Therefore, we revisited our approach and switched to one that is loosely coupled
and scalable (Parnas, 1972). We observed that the existing open source software libraries were
not modular enough when interacting with each other. Open source companion computer software is
available to enable communication between the companion computer and
the flight controller, e.g., DroneKit-Python. It provides navigational components (e.g., goto/waypoint
commands), GUI implementations, mission management and direct control over vehicle movement
and operations (3D Robotics, 2019a). At a high level, DroneKit-Python wraps ArduPilot embedded C++ methods as Python methods. These Python methods are atomic and immutable in nature
and act as the basic control units for a drone. Each of these Python methods sends pre-defined standard MAVLink messages to the Flight Controller; e.g., the set_roi(location) Python method creates and
sends a MAV_CMD_DO_SET_ROI MAVLink message, which points the camera gimbal at a specified
location. If there is no supporting message in the MAVLink communication protocol, we cannot
add new operations or fine-tune existing operations with DroneKit-Python. Although developers
tend to favor this convenient Python implementation, we opted for a more permanent solution by
updating the embedded C++ firmware code on the Flight Controller.
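As an illustration of this wrapping, the following is a minimal DroneKit-Python sketch, modeled on the library's documented pattern, that builds and sends a MAV_CMD_DO_SET_ROI command; the connection string and baud rate are assumptions for an RPI3 serial link and may differ on other setups.

    from dronekit import connect
    from pymavlink import mavutil

    # Connect from the companion computer to the flight controller.
    # The port and baud rate are assumptions for this sketch.
    vehicle = connect('/dev/serial0', baud=57600, wait_ready=True)

    def set_roi(location):
        # Build a MAV_CMD_DO_SET_ROI command pointing the gimbal at `location`
        # (an object with lat, lon and alt attributes).
        msg = vehicle.message_factory.command_long_encode(
            0, 0,                                  # target system, target component
            mavutil.mavlink.MAV_CMD_DO_SET_ROI,    # command
            0,                                     # confirmation
            0, 0, 0, 0,                            # params 1-4 (unused)
            location.lat, location.lon, location.alt)
        vehicle.send_mavlink(msg)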
On a similar note, we found the library is not scalable when new sensors are added. We
cannot perform the integration if the MAVLink communication protocol has no standard messages for such newer sensors. To resolve this, we used a customized MAVLink message and
used a Ground Control Software (GCS) to send the message to the firmware on the Flight Controller. There could also be a need for subtle changes within the atomic functionality of the drone,
e.g., reading sonar sensor data and then disarming/checking for obstacles, i.e., forking within the
wrapped embedded methods. In that case, we need to look back into the embedded flight controller
source code. Note that the firmware itself has limitations on hardware connections, and this
sensor integration technique provides a workaround toward a scalable model. Note also that
we are pursuing the more challenging implementation of a software patch for the embedded C++
firmware code that enables such new operations/functionality in the drone.
We also found that the current scheme of sending and receiving sensor data between the companion computer and the flight controller is not the most efficient way of communicating. Instead, collecting the
data at the companion computer and then sending it as MAVLink commands saves us from this
two-way communication. This unidirectional communication also halves the amount of processing
needed to send and receive data.
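A minimal sketch of this one-way push, assuming the sonar reading has already been processed on the RPI3 and using pymavlink's standard DISTANCE_SENSOR message (the serial port, baud rate and sensor limits are assumptions):

    import time
    from pymavlink import mavutil

    # Open the RPI3-to-PX4 serial link (port and baud rate are assumptions).
    master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
    master.wait_heartbeat()

    boot_time = time.time()
    distance_cm = 120  # e.g., a sonar reading already processed on the RPI3

    # One-way push of the processed reading; no reply from the firmware is required.
    master.mav.distance_sensor_send(
        int((time.time() - boot_time) * 1000),  # time since boot in ms
        2,                                       # minimum measurable distance (cm)
        400,                                     # maximum measurable distance (cm)
        distance_cm,                             # current reading (cm)
        mavutil.mavlink.MAV_DISTANCE_SENSOR_ULTRASOUND,
        0,                                       # sensor id
        mavutil.mavlink.MAV_SENSOR_ROTATION_PITCH_270,  # facing down
        0)                                       # covariance (unknown)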
The goal of this thesis is to expand on existing drone systems at the Heterogeneous Systems
research (HEROES) lab and improve the autonomy of the drone. The major contributions
of this thesis are:
• Removal of the RC Transmitter and arming of the motors so that Autonomous Drive –
Autonomous Control functionality can be achieved in the future.
• Implementation of a modular architecture for development, deployment and testing of the
drone system. This system follows a unidirectional flow of data which makes it more efficient.
• Instructions to set up the software stack and the proof of concept using integration testing
over the two interfaces: ground station to companion computer interface and companion
computer to flight controller interface.
• Integration of the sonar sensor attached to the companion computer with the flight controller
using MAVLink protocol messages.

BACKGROUND

The initial drone research at the HEROES Lab converged on guiding the drone using an RC
Controller. The software stack in the preceding studies (Gossett, 2018) was implemented using
Python development on an on-board Raspberry Pi 3 (RPI3). The Pulse Width Modulation (PWM)
value to the rotors was set automatically by an on-board autopilot flight control unit using the
DroneKit-Python library on the RPI3. Dronekit-Python uses MAVLink messages to communicate
to the flight control unit. Figure 2.1 describes the data flow for the preceding Drone System.

Figure 2.1. Data flow in initial design.

The software development includes using RC overrides and feedback loops to control the
values of the RC channels. Autonomous operations such as taking off, holding altitude and landing
used the throttle channel (Channel 3), which would run a feedback loop on the throttle value until
the target altitude was reached, hold the target altitude, or purge the throttle channel
value to zero, respectively. A similar approach was taken to control attitude, where the
corresponding channels, viz., channel 1 for roll, channel 2 for pitch and channel 4 for yaw, are
used in a feedback loop (Gossett, 2018).
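The sketch below, assuming DroneKit-Python and the RPI3 serial connection used in that study, shows the channel-override mechanism behind this feedback loop; the values and connection string are illustrative only.

    from dronekit import connect, VehicleMode

    # Connect from the RPI3 to the flight controller (assumed port and baud rate).
    vehicle = connect('/dev/serial0', baud=57600, wait_ready=True)
    vehicle.mode = VehicleMode("STABILIZE")

    # One step of the feedback loop: override the throttle channel (channel 3).
    # The earlier study adjusted this value until the target altitude was reached.
    vehicle.channels.overrides['3'] = 1500

    # Clearing the overrides returns control of the channels.
    vehicle.channels.overrides = {}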

Although the Python libraries are easy to pick up and are well documented, the preceding
study faced a problem with the stability of the drone. The drone remains uncontrollable for an
initial period and may then flip over due to the feedback delays. There are also concerns,
discussed earlier in the Introduction section, about the delay in to-and-from data communication, about adding
logic within the fundamental control methods such as takeoff() and landing(), and about disabling
sensor libraries.
The ArduPilot firmware has updated its source code to include ESC re-calibration, which
removes the need to calibrate the ESCs before every mission. By default, this code is executed
every time the flight control unit is powered up; we also changed when it executes by modifying the source code.
As stated earlier in the Introduction section, NASA, through its SMART team (Proud et al.,
2003), has defined a scale for the degree of autonomy to answer the question, "How autonomous should an AFM
(Autonomous Flight Management) System be?" For the autonomous drone study at the HEROES Lab,
the objective is to achieve a score of 4 or higher, where the computer takes prime responsibility in
making decisions and taking actions.

Figure 2.2. Level of autonomy assessment scale (Proud et al., 2003).

OBJECTIVE

In order to communicate with the Drone System, we considered a new headless approach
where input is transmitted from a remote PC to the on-board computer unit (RPI3). The
RPI3 then communicates with the Flight Controller Unit (PX4) using MAVLink messages to drive
the rotors. Figure 3.1 describes the data flow for the proposed Drone System.

Figure 3.1. Data Flow in Proposed Design.

The design supports a data flow that is unidirectional in nature. Although the RPI3 communicates using MAVLink messages, these are user-customized messages, unlike the ones provided/leveraged
by the companion computer development software. There are no changes to how the flight controller
generates PWM data, but there are flags that we need to set in order to enable this communication.
To summarize, this approach has considerable advantages:

  1. Unidirectional data flow saves a considerable amount of time by using one-way communication.
  2. Inputs can be communicated programmatically to the Drone System instead of being controlled using the RC Transmitter.
  3. The existing home Wi-Fi system can be leveraged to communicate between the remote computer and the Drone System.
  4. Integration of a testing and debugging mechanism into the system.

Overall, the drone research converges on two broad concepts — semi-autonomous and fully
autonomous. Semi-autonomous describes the software development pertaining to the responsiveness
of the drone. Fully autonomous integrates the artificial intelligence, navigational algorithms and recommendation engine. The following is a summary of the autonomous concepts:

Semi-Autonomous (Software part):
  1. Setting up a streamlined data flow. This creates a robust I/O stream on both interfaces, GS-RPI3
     and RPI3-PX4.
  2. Drone responsiveness.
  3. Precision in the control system to achieve attitude and altitude.
  4. Fundamental understanding and interpretation of the software systems available, e.g., ROS, PyDrone, DroneKit-Python, ArduPilot, MAVProxy and other libraries.

Fully Autonomous (Algorithm part):
  1. Traveling from A to B knowing the obstacles before the start of the mission.
  2. Reading 2D/3D data and implementing a pathway.
  3. Real-time handling of moving obstacles such as a person or a ball.

For this study, our objective is to create a semi-autonomous Drone System which could be
later developed into a fully autonomous system.

DRONE TECHNOLOGY

Drone technology is developed on two independent fronts — the drone's build and materials, and the built-in control system. In this project we mostly deal with the control system,
which inherently controls the drone's motor movements. To enable 3D movement, the drone is
equipped with four motors — two running in the clockwise (CW) direction and two running in the
counter-clockwise (CCW) direction, assembled in an alternating fashion.
The physical movement of a drone can be understood using two concepts — altitude and
attitude. Altitude is related to the height of the drone above the ground; it is controlled by
powering up all motors to go up and powering them down to go down. Attitude is related to the yaw,
roll, and pitch of the drone. Considering right-hand Cartesian coordinates, the three rotational
components yaw, roll and pitch rotate about the Y, Z, and X axes, respectively. Figure 4.1 describes
the rotational components of the Drone System.

Figure 4.1. Attitude control in drone system (Sopasakis, 2017).

In order to yaw, the two counter-clockwise motors power up to turn left and the two
clockwise motors power up to turn right. Similarly, for roll, the right two motors power up to move
left and the left two motors power up to move right. Pitch controls forward movement, where the back two
motors power up, and backward movement, where the front two motors power up. Figure 4.2 describes
the possible movements of the Drone System.

Figure 4.2. Possible movements in drone system. (Sopasakis, 2017)
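The following toy mixer illustrates these movement rules in Python; the motor layout, spin directions and sign conventions are assumptions for illustration only and do not reflect the exact ArduPilot motor ordering.

    def mix(throttle, roll, pitch, yaw):
        # throttle in [0, 1]; roll, pitch, yaw in [-1, 1].
        # Positive roll = move right, positive pitch = move forward,
        # positive yaw = rotate right, per the rules described above.
        front_right_ccw = throttle - roll - pitch - yaw
        back_left_ccw   = throttle + roll + pitch - yaw
        front_left_cw   = throttle + roll - pitch + yaw
        back_right_cw   = throttle - roll + pitch + yaw
        return [front_right_ccw, back_left_ccw, front_left_cw, back_right_cw]

    # Example: pure yaw to the left raises the CCW motors and lowers the CW motors.
    print(mix(0.5, 0.0, 0.0, -0.2))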

Apart from motor control, the drone is equipped with various sensors to control and verify
movement. Common sensors used on a drone system include a gyroscope that controls
attitude, a barometer that controls altitude, a GPS that verifies localization, and cameras that enable
object avoidance.

HARDWARE STACK
The foundation of the drone is the DJI Flame Wheel F450 Dronekit, which comes
with its own frame, motor unit, propellers and power distribution board. Most of the connections
are wired and soldered as instructed in the kit manual. These connections, as well as the connections with newer modules such as the sonar, Companion Computer, Ground Station, etc., are shown
in Figure 5.1 below. Most of the research on parts and assembly of the Drone System was done
in earlier studies at the HEROES lab (Gossett, 2018). Each part is described in the following sections.

Figure 5.1. Connection on Hardware Interfaces.

5.1 Frame

As suggested earlier, the frame is part of the "Almost Ready to Fly" Flame Wheel Dronekit
F450 and, hence, was easy to assemble. The frame is made of hard "ultra-strength" plastic
components. The diagonal wheelbase (motor-to-motor distance) of the frame is 450 mm and the naked
frame weight is 282 g. The propellers are 250 mm long, which adds roughly another 100 mm to the
diagonal wheelbase per propeller, bringing the drone's safe diameter to about 650 mm.

Figure 5.2. Frame for DJI Flame Wheel F450 Dronekit (QuadcopterGarage.com, 2018).

5.2 Flight Control Unit

As suggested earlier, the Flight Control Unit controls the motors in order to keep the quadcopter in the air. Overall, the Flight Controller is a circuit board with sensors that detect the
orientation changes of the drone, and it receives user commands. In our study, we used the Pixhawk
Mini (PX4) Flight Controller from 3DR, which is equipped with an ARM Cortex-M4 processor.
All the sensors and command flows are connected directly (wired) or indirectly (via the companion
computer) to the PX4. The detailed specifications of the Flight Controller Unit can be found in
PX4 Dev Team (2019).

Figure 5.3. PX4 Autopilot Unit with an External Compass Unit (PX4 Dev Team, 2019).

5.3 External Compass Unit

Although there is an inbuilt compass unit in the PX4, it has been deprecated due to a hardware manufacturing defect (davids5, 2017). The defect makes the internal MPU9250
IMU unreliable, so the MPU9250 is disabled by default in the PX4 firmware. An external GPS/compass unit is provided which solves this problem: a secondary sensor, the ICM20608, provides
both the accelerometer and the gyroscope, while the external GPS unit provides the magnetometer (PX4
Dev Team, 2019). Figure 5.3 shows the wired connection made between the PX4 and the External
Compass Unit.

5.4 Companion Computer

For our study, we have used the Raspberry Pi 3 (RPI3) as the companion computer, which
connects to the Flight Control Unit using a wired connection between the only telemetry port
available on the PX4 and the UART built into the RPI3 GPIO. The RPI3 is equipped with an
ARM Cortex-A53 processor, which reads and interprets the sensor data and sends commands to the
PX4. In later chapters we describe the RPI3 setup procedures in more detail.

5.5 Motor Unit and Electronic Speed Controller (ESC)

The Dronekit comes with four DJI 2312 motor units as shown in Figure 5.4. Each motor
weighs 60 g, provides 960 rpm/V, and has a maximum recommended load of 400 g for a 4S LiPo
battery setting. The Drone System is also equipped with four DJI 420 LITE ESCs, as shown in
Figure 5.4. The ESC regulates power to the motor and is paired with each motor. It is compatible
with the CW and CCW motors from the E310 Tuned Propulsion System (2312). Its maximum recommended voltage is 17.4 V and its maximum allowable current is 30 A. The ESC sends
PWM voltage values as the output to the motors. The detailed specification of the propulsion system
can be found in DJI (2014).

Figure 5.4. E310 Tuned Propulsion System.

5.6 Sonar Sensor

For this study, we have used the HC-SR04 Ultrasonic Range Finder, which points down to give
distance measurements from the ground. The sensor has four pins: Ground (GND), Echo Pulse
Output (ECHO), Trigger Pulse Input (TRIG), and 5V Supply (Vcc). Its connections to the RPI3
are made as described in Figure 5.5. The ECHO output is reduced to 3.3 V using a voltage divider
circuit in order to match the recommended GPIO port rating (Gossett, 2018).

Figure 5.5. HC-SR04 Ultrasonic Range Finder Sensor and Connection with Raspberry Pi.

5.7 Safety Switch

The Flight Controller has an integrated safety switch that can be used to enable/disable the
outputs to the motors and servos once the autopilot is ready to take off. If this switch is hard to access,
an external safety button is also attached to the Flight Controller.

5.8 Battery and Power Module

The Drone System is internally powered using an 1800 mAh 4S LiPo battery rated at 14.8 V.
Normally, the battery is discharged within 15 to 20 minutes of flying. It should
be noted that if the battery is discharged to less than 13.5 V (approximately 90% of the rated voltage), it
could be permanently damaged. The battery is charged using an EV-Peak Synchronous Balance
Charger/Discharger.
Although the ESC can run on the 14.8 V delivered directly from the LiPo battery, the radio, telemetry and flight controller run on 5 V. In order to deliver this additional 5 V, we can either use another
5 V battery or use a Battery Eliminator Circuit (BEC). Some ESCs have a built-in BEC
and put 5 V on the signal cable, but the DJI 420 LITE ESC, which we are using, does not have
a built-in BEC. Therefore, the Drone System is equipped with a Micro Power Module, a PM06-
v1.0 UBEC (Universal BEC), which converts the main LiPo battery pack voltage, 14.8 V, to a lower
voltage, 5 V. The power module can deliver a maximum current of 3 A on a DC input of 7 V-42 V. The
power output of the module is 5.1 V-5.3 V.

5.9 Ground Station

The ground station is a general-purpose computer used as the development and testing
machine. In our case it was a 4th generation Intel i3 with 12 GB of RAM. The ground station is
equipped with Windows 10, a Cygwin VM, the Eclipse IDE, Mission Planner, a Python development
framework, and Wi-Fi and USB capabilities.

ARCHITECTURE

With technological advances, drone software development now targets
more powerful systems, which has enabled more processing power and support for graphical user
interfaces. There are many such software packages available commercially, and we have divided
them into four broad groups based on functionality – Flight Control Firmware, GUI-based GCS,
Terminal-based GCS, and Companion Computer Software.
The Flight Control Firmware, viz. ArduPilot based on the GPL license, the PX4 Flight Stack
based on the BSD license, iNav, LibrePilot, etc., mostly interprets sensor data to control the output
to the motor units. GUI-based GCS, viz. Mission Planner, QGroundControl, etc., are ground-station-based software packages that send user commands to the firmware and display the drone's current status;
they are computation intensive in nature. Terminal-based GCS, viz. MAVProxy, also sends user and
status commands to the firmware, but it consumes fewer resources in that it does not use a GUI and
does not provide additional functionality such as maps, testing and simulations.
The Companion Computer Software group includes libraries and frameworks such as DroneKit-Python,
APSync, Maverick, etc., which create apps that run on an on-board companion computer and
communicate with the Flight Controller.

The current Drone System is divided into three major development modules:

  1. Desktop module, which contains the dev and test box
  2. Flight Controller module
  3. RPI3 module

During this study, efforts have been made to streamline the process of development,
deployment and testing of new features. As a result, we selected an architectural pattern that
tends to be easier to debug when the system errors out.
In the current setup, shown in Figure 6.1, the Flight Controller module runs the
ArduPilot firmware on the PX4 hardware, the RPI3 module runs the MAVProxy GCS software,
and the Desktop module runs Mission Planner for deployment to the firmware, Secure Shell
for testing (test box) and a C++ development framework (dev box).
The RPI3 acts as a companion computer sharing processed data with the firmware over
MAVLink messages. Additional sensors are attached to the RPI3 to create a scalable integration
of sensor technology into the Drone System. In the future, the scope of such integration is not
limited to sensors but may include other technologies as well.
The MAVProxy-based RPI3 acts as true middleware with two-way communication to the
firmware, enabling the integration of sensor technology, providing a testing/user command interface, and
converging on a modular approach to software development.

Figure 6.1. Current Environment Setup

6.1 ArduPilot

ArduPilot is an open source autopilot software and is capable of controlling vehicle systems, including conventional airplanes, multirotors, and helicopters, boats and submarines. It
is one of the distinguished autopilot suites and is used in many OEM UAV companies, such as
3DR, jDrones, PrecisionHawk, AgEagle and Kespry; several large institutions and corporations
such as NASA, Intel and Insitu/Boeing; and by countless colleges and universities around the
world. (ArduPilot, 2016a)
ArduPilot (including Copter, Plane, Rover, Sub and AntennaTracker) and the ground control software (including Mission Planner, APM Planner 2 and MAVProxy) are free software under
the terms of the GNU General Public License version 3 as published by the Free Software Foundation. We may copy, distribute and modify the software as long as we track changes/dates
in the source files and keep modifications under the GPL. We can distribute an application using a GPL
library commercially, but we must also provide the source code (ArduPilot Dev Team, 2019f).
The ArduPilot code base is quite large (about 700k lines for the core ardupilot git tree)
and can be quite intimidating to a new user. The Dev Team has documented the procedures and
libraries well; the documentation is available online on the ArduPilot Dev Team (2019b) website.
At an abstract level, the code base runs two types of loops: a single open
loop that runs all the 'core' libraries and methods, and closed loops for every sensor that execute
the 'sensor' libraries and methods. For the single open loop, when an input is sent to ArduPilot, setter
methods are called to set flags so that action methods can be executed during the next loop cycles.
The multiple closed loops call sensor methods regularly to fetch/read data from the sensors and set the
corresponding float variables. As an example, the AP_Baro and AP_Camera libraries contain
the 'sensor' methods for the barometer and camera sensors. It should be noted that print methods are
also available in the AP_HAL and GCS libraries.
Figure 6.2 explains the overall ArduPilot architecture using data flow patterns in the system,
starting at the main loop() in the ArduCopter.cpp library and ending at the PWM output to the motors.

Figure 6.2. ArduPilot Architecture.

The UAV flies in various pre-defined flight modes, such as Alt Hold, which holds altitude and
self-levels the roll and pitch; Flip, which completes an automated flip; and Guided, which navigates the drone
to single points commanded by the GCS. Currently, there are more than 20 flight modes (ArduPilot
Dev Team, 2019e). A flight mode provides an abstraction, as well as a coding space where we can
write custom code for flight stabilization, autopilot behavior, setting of variables, setting of sensor flags, etc.
Figure 6.3 shows the architecture of the manual modes, i.e., modes that require manual control, viz.,
Stabilize, Acro, Drift, etc.

Figure 6.3. Sample ArduPilot Code Overview for Flight Mode

6.2 MAVProxy

MAVProxy is a terminal-based GCS for MAVLink-based UAV systems. It was first
developed by CanberraUAV to enable the use of companion computing and multiple datalinks
with ArduPilot. It supports loadable modules, and has modules to support consoles, moving
maps, joysticks, antenna trackers, etc. MAVProxy is also released under the GNU General Public
License v3 or later (ArduPilot, 2016b).

The source code for MAVProxy is written in Python. The MAVProxy Python-based modules run in threads parallel to the main MAVProxy program in order to maintain overall stability.
Modules need to be loaded using management commands before they can be used.
As suggested earlier, the software has modules that manage related functionality, and each
functionality uses MAVLink messages to communicate with the ArduPilot firmware. Definitions of these
MAVLink messages are kept in the common.xml and ardupilot.xml files. In order to be reusable,
the definitions of these messages are converted to both Python and C++ header code using the Python
pymavlink library, so that they can be referenced in both the MAVProxy and ArduPilot source code.
Note, therefore, that the Python MAVProxy library has a dependency on the pymavlink library.

6.3 MAVLink

Micro Air Vehicle Link (MAVLink) is a lightweight messaging protocol for communicating with small unmanned vehicles (and between on-board drone components). MAVLink follows a modern hybrid publish-subscribe and point-to-point design pattern: data streams are sent
/ published as topics, while configuration sub-protocols such as the mission protocol or parameter
protocol are point-to-point with retransmission.
It is designed as a header-only message marshaling library. MAVLink 2 has just 14 bytes
of overhead and is a much more secure and extensible protocol than its predecessor,
MAVLink 1. As a result, it is highly efficient and well suited for applications with very limited
communication bandwidth. It also provides methods for detecting packet drops and corruption,
and for packet authentication.
As suggested earlier, messages are defined within XML files. Internally, the Python
pymavlink library uses the mavgen and mavgenerate tools provided by the MAVLink project to
convert these XML files to Python and C++ header code. The mavgen tool and its derivatives
also support other languages such as C, C#, Objective-C, Java, JavaScript, Lua, Swift and Clojure (Dronecode Project, 2018).

Figure 6.4 shows a sample MAVLink message, in XML format, with a predefined message id, message name,
variable types and variable names sent/received during communication.

Figure 6.4. Sample MAVLink message.
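As a sketch of how these generated bindings are used from Python (the serial port and baud rate are assumptions; STATUSTEXT and HEARTBEAT are standard messages defined in common.xml):

    from pymavlink import mavutil

    # Open a MAVLink connection (assumed serial port and baud rate).
    master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
    master.wait_heartbeat()

    # Each message defined in the XML becomes a generated class plus a
    # matching *_send() helper on the connection, e.g. STATUSTEXT:
    master.mav.statustext_send(mavutil.mavlink.MAV_SEVERITY_INFO,
                               b'hello from the companion computer')

    # Received bytes are decoded back into the same generated classes.
    msg = master.recv_match(type='HEARTBEAT', blocking=True, timeout=10)
    if msg is not None:
        print(msg.get_type(), msg.to_dict())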

SOFTWARE INTEGRATION

The software integration involves setting up the two interfaces between the three software
modules: the RPI3-PX4 interface and the GS-RPI3 interface. We also need to set up the Desktop
Test/Dev Box on the GS and integrate the sonar sensor into the Drone System.

7.1 Setting up GS-RPI Interface

Assuming that the previously stated hardware setup and connections have been made, enabling the on-board RPI3 integration with the Desktop Ground Station (GS) module requires the
following steps (ArduPilot Dev Team, 2019c).

  1. Download Raspbian, provided by the Raspberry Pi Foundation (2018). The current
     version installed on the RPI3 is Raspbian Stretch, updated 2018-11-26.
  2. If the Operating System (OS) image has a .iso extension, use Rufus, provided by
     Batard (2019), to copy the OS image to an SD card. If the OS image has a .img
     extension, use Etcher, provided by Balena (2019), to copy the OS image to an SD
     card. Using Win32 Disk Imager, provided by Davis (2019), might require formatting
     the SD card using SDCardFormatterv5, provided by SD Association (2019). Select the
     time-consuming overwrite format option and not the quick format option.
     Note: By default, we cannot read a Linux file system/drive on Windows. Reading the
     SD card will give an error in Windows Explorer. Also, if we are able to see a
     partition, it is the boot partition, which contains the system settings, and not
     the user data on the OS. The user data is kept on Linux-based partitions.
  3. Plug in the RPI3. Booting up for the first time might take some ten minutes.
     Connect it to Wi-Fi.
     Note: Currently the connection is set up as dynamic. The Ole Miss network does not
     allow us to make static IPs. We would need our own router to make static IPs or
     get the network administrator on board.
  4. Disable the OS control of the serial port: Enter sudo raspi-config on the terminal,
     select Interfacing Options / Serial, then specify whether you want a Serial Login
     Shell (select No) and whether you want the Serial Port hardware enabled (select Yes).
     Note: As a rule of thumb, use /dev/serial0 in the configuration for any code that
     accesses the Serial Port. In the latest RPI3 updates, defining /dev/serial0
     automatically selects the appropriate device, so you can replace /dev/ttyAMA0 with
     /dev/serial0 and use the same software configuration on the Pi 3 and earlier
     models. In more detail: /dev/ttyAMA0, which was previously used to access the
     UART, now connects to Bluetooth. Hence, we connect the PX4 (Flight Control Unit)
     to the miniUART, now available on /dev/ttyS0.
  5. Enable UART: For some reason the default for the RPI3 using the latest 4.4.9 kernel
     is to disable the UART. To enable it we need to set enable_uart=1 in
     /boot/config.txt.
     Note: This step solves two problems. First, it enables the UART. Second, on the
     RPI3 the "baudrate is derived from the system clock", which makes the miniUART
     unreliable because this clock can change dynamically when the system goes into a
     reduced or low power mode. To fix this problem we would modify /boot/config.txt
     and remove this dependency by adding the line core_freq=250; this change is not
     needed anymore. The SPI clock frequency and ARM Timer are also dependent on the
     system clock. (Raspberry Pi Foundation, 2019a)

  6. Enable SSH: Enter sudo raspi-config on the terminal, select Interfacing Options and
     Enable SSH, and then execute the following commands:
     $ sudo apt-get update
     $ sudo apt-get install openssh-server openssh-client
     We can verify the active status of the ssh service by executing:
     $ sudo systemctl status ssh
  7. Check IP Address: Use the ifconfig CLI command to find the IP address, which will be
     used for secure login (SSH) from the remote ground station (GS).
  8. Install Mission Planner: Follow the instructions provided by ArduPilot Dev Team
     (2019g) in order to install the Mission Planner GCS.

7.2 Setting up RPI-PX4 Interface

Currently we leverage the ssh connection enabled in the last section to communicate with the
RPI3 module. In the future, a connection over TCP should be researched; it would be important
to know how the drone behaves over TCP/IP, given that the protocol is highly reliable but memory
consuming. The following is the software we need:

  1. Install Libraries: From the GS CLI, enter ssh pi@ followed by the RPI3 IP address and
     execute these commands successively (change ardupilot.xml in the message definitions
     folder and copy the folder into the pymavlink clone):
     $ sudo apt-get install python-wxgtk2.8 python-matplotlib python-opencv python-pip python-numpy python-dev libxml2-dev libxslt-dev python-lxml
     $ sudo pip install future
     $ git clone https://github.com/ArduPilot/MAVProxy.git
     $ git clone https://github.com/ArduPilot/pymavlink.git

Copy the message definitions folder from the newly created MAVProxy folder to the pymavlink folder.
This is done to remove the dependency of the pymavlink library on the MAVProxy library. We
are not using pip to install these libraries because the pip versions would be outdated and have dependency constraints.
Instead, we install the most recent version from git. Execute these further commands:
$ cd ./pymavlink
$ sudo python setup.py build
$ sudo python setup.py install
$ cd ../MAVProxy
$ sudo python setup.py build
$ sudo python setup.py install
We might get errors related to modules not found, but these are only warnings
and can be skipped. To remove the warnings, comment out the modules in the setup.py file.

  2. Verify the RPI-PX4 Connection: Execute this command on the GS CLI (logged into the
     RPI3 over SSH):
     $ sudo mavproxy.py --master=/dev/serial0 --baudrate 57600 --aircraft MyCopter
     Whenever the RPI3 connects to the PX4, three files are created in the folder where the
     command was executed. If the logs were set up, the files are created in the
     /home/pi/MyCopter/logs/YYYY-MM-DD directory.
     • mav.parm : a text file with all the parameter values from the Pixhawk.
     • flight.tlog : a telemetry log including the vehicle's altitude, attitude, etc., which can be
       opened using Mission Planner (and other tools).
     • flight.tlog.raw : all the data in the flight.tlog mentioned above, plus any other serial data
       received from the Pixhawk, which might include non-MAVLink formatted messages
       like startup strings or debug output.

Note: If MAVProxy is set to a higher baud rate, we could get an error on the receiving end. In my
case I got a MAV > Link 1 Down error when communicating with the PX4 at a high baud
rate (denoting a PX4 TELEM port Pin 1 RCV / handshake error). Here, we set the baud rate
at the TELEM default rate of 57600, which is half of the highest baud rate
for most current industry microcontrollers.
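As an optional programmatic check of the same link, a short pymavlink script can wait for a heartbeat instead of starting the full MAVProxy console; the port and baud rate match the command above, and the timeout value is an arbitrary choice.

    from pymavlink import mavutil

    # Open the RPI3-PX4 serial link with the same settings as mavproxy.py above.
    master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
    hb = master.wait_heartbeat(timeout=10)
    if hb is None:
        print("No heartbeat received - check wiring, baud rate and enable_uart")
    else:
        print("Heartbeat from system %u component %u"
              % (master.target_system, master.target_component))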

7.3 Setting up Testing Box

Testing for the drone system is set up on a Windows machine (i3 4th gen, 12 GB RAM) and
the development/builds are done with the Eclipse IDE (Photon 2018-12). The steps followed are based
on the documentation available on the ArduPilot development site (ArduPilot Dev Team, 2019a). It is
also important to note that Python 2.7 will reach its end of life on January 1st, 2020 and will
be deprecated, so there could be changes to the libraries that we need to reconsider. The following are the
updated steps used for this project:

  1. Download Eclipse: Download the Eclipse IDE for C/C++.
  2. Install Cygwin: Follow the documentation provided by Cygwin Authors (2019) and then
     install the following in-software packages: autoconf, automake, ccache, gcc-g++, git,
     libtool, make, gawk, libexpat-devel, libxml2-devel, libxslt-devel, python2-devel,
     python2-future, python2-libxml2, procps-ng, gdb, ddd, zip, libcrypt-devel.
  3. Start Cygwin Terminal: Open and close the Cygwin Terminal. This creates the
     initialization files for the user in the Cygwin home directory.
  4. Install pip: We will now configure the Cygwin-based Linux VM. Execute these commands
     in the Cygwin terminal:
     $ curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
     $ python get-pip.py
  5. Install the GCC compiler: firmware.ardupilot.org/Tools/STM32-tools. Remember to check
     the option Add path to environment variable while installing.
  6. Install packages: Install the following packages in the Cygwin Terminal:
     $ pip install empy
     $ pip install pyserial
     $ pip install pymavlink
  7. Clone the ArduPilot project from Git: Execute the following command:
     $ git clone https://github.com/ArduPilot/ardupilot.git
     It is possible that some files are missing after the clone (in my case, the
     ardupilot/Module folder was empty). In that case, run the following command to
     update recursively:
     $ git submodule update --init --recursive
  8. Set up the Eclipse project: Follow the manual provided by ArduPilot Dev Team (2019d).
     Change the target name to configure --board Pixhawk1
     --no-submodule-update when creating the New Build Target.
  9. Build the ArduPilot project: The PX4 target currently compiles with waf instead of
     make commands. Building with the waf library is described in ardupilot/BUILD.md.
     For compiling in Eclipse, open it with Administrative Privileges, right click the
     project and click Build. The first build might take a long time (in my case, more
     than an hour). For this reason, it is not recommended to do a clean before every
     build. For errors in Eclipse related to 'Unresolved Inclusion Error', follow the
     instructions provided by Hock-Chuan (2013).
     Alternatively, we can build using waf commands in the Cygwin Terminal. This was the
     standard process in our study. In the git/ardupilot folder execute the following
     commands:
     $ ./waf configure --board Pixhawk1
     $ ./waf build --all or ./waf build copter

10. Installing Mission Planner: Follow the installation instructions provided by ArduPilot Dev Team (2019g). Issues related to no port found on COM# (USB) are mostly because the USB serial device is not set up or is disconnected. To set up the device, open Device Manager > View > Show hidden devices, go to Ports, right-click the USB Serial Device (COM6, in my case) and select Update Driver. Another cause could be a bad USB cable. See the section on challenges for further details.

11. Loading Firmware: Connect the PX4 to your Windows system using the USB port. Open Mission Planner, click the Initial Setup tab and select Install Firmware. Click Load custom firmware at the bottom right of the window and select the newest arducopter build (usually in the build//bin folder). Follow the instructions during installation.

7.4 Setting up Sonar Sensor

The Raspberry Pi is equipped with a row of 40 GPIO (general-purpose input/output) pins along the
top edge of the board, as shown in Figure 7.1. As well as serving as simple input and output lines, the GPIO
pins can be used with a variety of alternative functions, viz. PWM, SPI, I2C and Serial (Raspberry Pi
Foundation, 2019b).

Figure 7.1. Raspberry Pi GPIO Layout Pi 3 Model B. (Hawkins, 2018)

Follow Figure 5.5 in the earlier sections for the hardware connections between the RPI3 and the sonar
sensor. The sensor's four pins, Ground (GND), Echo Pulse Output (ECHO), Trigger Pulse Input
(TRIG), and 5V Supply (Vcc), are connected to Pin 39, Pin 18, Pin 16 and Pin 2 of the GPIO, respectively.
For the software setup and changes to the Python code, see the section on integration testing
for the sonar measurement test; a minimal reading sketch is also given below.
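The sketch assumes BOARD pin numbering and the trigger/echo pins given above; it follows the common HC-SR04 timing recipe rather than the exact code used in the earlier study.

    import time
    import RPi.GPIO as GPIO

    TRIG, ECHO = 16, 18          # physical pin numbers, matching the wiring above

    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

    def read_distance_cm():
        # A 10 microsecond pulse on TRIG starts one measurement.
        GPIO.output(TRIG, True)
        time.sleep(0.00001)
        GPIO.output(TRIG, False)

        pulse_start = pulse_end = time.time()
        # ECHO stays high for a time proportional to the distance.
        while GPIO.input(ECHO) == 0:
            pulse_start = time.time()
        while GPIO.input(ECHO) == 1:
            pulse_end = time.time()

        # Sound travels roughly 34300 cm/s; halve for the round trip.
        return (pulse_end - pulse_start) * 34300 / 2

    try:
        print("Distance: %.1f cm" % read_distance_cm())
    finally:
        GPIO.cleanup()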

INTEGRATION TESTING


For testing the system, we have taken an approach that combines individual modules and
tests them as a group. These tests verify the feasibility of the proposed architecture along with
the viability of communication links for a future scalable model for development of the Drone
System.
8.1 Testing Motor Arm using MAVProxy on GCS

This test executes the Motor Arm command from the GS terminal, which powers up the motor unit
remotely. This test verifies two aspects:

  1. Communication between GS and RPI
  2. Communication between RPI and Firmware/Motor Unit.
Figure 8.1. Data Flow in Motor Arm Test.

There were two ways the test was completed: through a change in configuration parameters,
shown as 1 in Figure 8.1, or by setting boolean flags in the firmware, shown as 2 in Figure 8.1,
to make the system armable. The reason we had to do two alternative tests is described in detail
in the following chapter 9 on Challenges; we were not able to reproduce the results by setting
configuration parameters. The following steps were undertaken:

  1. Connect to the firmware: Open Mission Planner on the desktop. Connect to the firmware
     via USB. Click Connect at the top right of the window.
  2. Set Configuration Parameters: In the Mission Planner window, click the
     Config/Tuning tab. On the left pane, select Full Parameter Tree. Use the search
     box on the left pane and set the values of the following parameters: GPS_ENABLE to 0,
     EKF_ENABLE to 0, ADHS_ENABLE to 0, ARMING_CHECK to 0.
  3. Update Firmware: In the Mission Planner window, on the right pane, click the
     Save to file button. Alternatively, this can also be done from the MAVProxy GCS
     using the param fetch and param set commands.
  4. Execute Test: Execute the following commands on the MAVProxy GCS:
     $ sudo mavproxy.py
     $ motortest 1 0 30 2
     This will power motor 1 at 30% of maximum output for 2 seconds. Figure 8.2 shows the
     results on the command prompt.
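For reference, the same motor test can be requested without MAVProxy by sending MAV_CMD_DO_MOTOR_TEST directly with pymavlink; the sketch below assumes the RPI3 serial link used earlier, and the MAVProxy motortest command should exercise the same firmware handler.

    from pymavlink import mavutil

    # Connect over the RPI3-PX4 serial link (assumed port and baud rate).
    master = mavutil.mavlink_connection('/dev/serial0', baud=57600)
    master.wait_heartbeat()

    # MAV_CMD_DO_MOTOR_TEST: param1 = motor instance, param2 = throttle type,
    # param3 = throttle value, param4 = timeout in seconds.
    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_DO_MOTOR_TEST, 0,
        1,                                            # motor instance 1
        mavutil.mavlink.MOTOR_TEST_THROTTLE_PERCENT,  # throttle given in percent
        30,                                           # 30% throttle
        2,                                            # run for 2 seconds
        0, 0, 0)                                      # unused params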
