Design of Remote Datalogger Connection and Live Data Tweeting System

I. INTRODUCTION

Global average temperatures have increased alongside increases in population and consumption. These changes pose serious threats to the current standard of living as weather events become less predictable and more intense (Westra et al., 2014). Increases in population are also accompanied by corresponding increases in urbanization. Peak runoff is known to increase with urbanization due to the conversion of infiltrating soils to pavements and other impervious surfaces (Miguez et al., 2015). Low-impact development (LID) is a strategy to sustainably address the stormwater runoff hazards posed by climate change and urbanization (PGCo, 1999). Green roofs, also commonly referred to as vegetative or living roofs, are representative of LID through their capture of stormwater and their capacity to insulate. An Honors College research grant was awarded to members of the Biological Engineering Department to study the impacts of a green roof on the University of Arkansas campus. The site selected was the large, extensive green roof above Hillside Auditorium because of its high visibility to passing students. The ongoing study collects environmental data on the green roof via a network of sensors connected to a Campbell Scientific CR1000 datalogger (Image 1; Image 2). Along with examining the physical impacts of the green roof, the project also seeks to educate the public on how LID designs can be worthwhile investments. As a complementary activity, this thesis seeks to design a method of transmitting the collected environmental data to the public in an open, accessible way on a live basis.

During a study abroad course at the University of Gent, Belgium in my freshman year, I encountered a tree with embedded sensors that collect and post biological data to Twitter on a live basis. The system struck me as an ingenious way of enabling public engagement in science. That tree is a member of a large network of trees across European universities that operate on Twitter under the name TreeWatch (Steppe et al., 2016). The network was designed to study how trees are responding to climate change in their daily biological functions. It is believed that trees can serve as early warning signals of climate change through die-offs (Neumann et al., 2017). The TreeWatch network seeks to understand more about trees' reactions to climate change before they die while also raising public awareness of the hazards posed by climate change. TreeWatch believes that tweeting sensor arrays have an intrinsic educational power (Steppe et al., 2016). This project intends to use that power to have a lasting impact on the public's understanding of LID and care for sustainable initiatives.

Disseminating data to the public is the primary goal of this project. However, the design process also produced new insights into the competitive hold that scientific instrumentation companies keep on their products and markets. Campbell Scientific offers a manual on connecting internet-capable devices to the CR1000 datalogger, but every option in that manual involves compatible products sold by Campbell Scientific. This limits open access by researchers and imposes increased system costs. Frustration has been expressed in online forums by individuals trying to establish connections without purchasing Campbell's compatible hardware.

One aspect of the design is to program the tweets coming from the green roof in the first person. This is inspired by the TreeWatch network as a means of further engaging the public. By personifying the roof – giving it a name and a voice – it is hoped that the study will become more personable and impactful. Getting the public engaged in LID infrastructure research is likely a challenge on its own. Offering the research in a fun and approachable manner may ease this challenge. The first step in personifying a non-sentient object is to name it. It was decided that the roof would tweet under the name Rufus the Roof, and the selected Twitter handle for the account was @UAGreenRoof.

In considering how the datalogger would connect to the internet and compile collected data into tweets, the decision was made to use a Raspberry Pi microcomputer. This option was chosen over alternatives such as Arduino microcontrollers and Campbell Scientific's own connection device because of its low price, high accessibility, and ease of use. The Biological Engineering Department has several Raspberry Pis that are used in the Measurements and Controls lab, so one was readily available for this project. However, an individual without the privilege of lab access would only see a cost of $35 for the device (not including monitor, keyboard, or mouse). Once set up and connected to a monitor, the Raspberry Pi behaves like any other computer, so programming on it is more intuitive than on an Arduino, which lacks a desktop environment. The Raspberry Pi also comes with a built-in Wi-Fi receiver, which makes it well suited for on-campus remote studies.

II. LITERATURE REVIEW

Low-impact development (LID) is a relatively new concept in infrastructure design. Therefore, it is unlikely that the general public fully grasps the functionality of some designs (Bowman et al., 2009). This limited understanding extends to LID green roofs like the one on Hillside Auditorium. Raising public awareness of LID systems is an important step in their promotion and development on a wider scale.

The proposed live data tweeting system attempts to engage the public in the study of LID benefits. Extending findings to the community in an open and accessible manner is a form of citizen science.

The primary objective of the live data system is to educate the public about the benefits of LID green roofs. This community science initiative will not receive direct scientific contributions from the community. Rather, it will provide a means of engagement with an ongoing study through promotion. Promoting the science will allow a broad segment of the public to better understand water-based LID through regular interaction (Mercer, 2018). An increased level of engagement could also potentially spark new scientific ideas from the general public.

The study also aims to exhibit the value of widespread livestream sensor systems – specifically, how these systems can be both affordable and effective. Some universities are considering the deployment of environmental sensing arrays in order to become a "smart campus," where students are given live information about building occupancy and other environmental data. These studies have found that Raspberry Pi microcomputers are capable of accomplishing this at a low cost (Hentschel, 2016). Decreasing cost is essential to the design of large-scale sensing systems, and the cost of computing has decreased dramatically since the technology's inception. A Raspberry Pi is a credit-card-sized computer priced affordably at $35 USD. This affordability makes the widespread deployment of sensing systems an economically reasonable venture.

Widespread sensing is ultimately necessary as a detector of climate change issues. Biological systems are often our first sign of negative climatic change (Steppe et al., 2016). However, the way they present their data is through wilting, discoloration, and eventually die-offs. Live remote sensing and aggregation of data allow us to detect the stressors that organisms are experiencing in real time – as they are experiencing them. Sensing does not prevent die-offs or climate change as a whole, but it does allow for early warnings and a better understanding of the rate of change. Tweeting the collected data conveys it to a broad audience so that they too can witness the change. Tweeting data is not absolutely crucial – the broad audience likely cannot make direct use of the data – but having larger numbers of people collectively witness change is important, and the public is much more acquainted with social media than with scientific publications.

Twitter specifically possesses an ease and familiarity that makes it a fun way to process information. One example is the University of Arkansas Student Union, which has a Twitter account (@ArkansasUnion) and uses it for event planning and organization as well as student outreach. If a student union can have a personality that serves an educational purpose, why can't a tree, or a green roof? It is believed that bringing the roof to life and giving it a voice on our campus (even if it simply provides data) will give students a better understanding of the sustainable development inherent in its design. Specifically, it should be able to show students quantitatively how sustainable design is a good thing and how it impacts the environment we inhabit.

III. METHOD

The initial plans for the design involved automated data fetching, organization of the data into graphs, and tweeting with an accompanying randomized message. The first step in designing the live tweeting system was to establish a stable and consistent connection between the Raspberry Pi microcomputer and the Campbell CR1000 datalogger. However, because of Campbell Scientific's aversion to third-party compatibility, a great deal of time was spent merely establishing the connection from the Raspberry Pi to the datalogger. Because of this challenge, and the accompanying difficulty of coding automated data fetching and organization programs, the design was shifted to a remote file transfer approach. The final design included a set protocol for connection and file transfer from the datalogger as well as a series of Python scripts that allow for convenient dissemination of information regarding the green roof. The overall connection protocol and the series of commands necessary for the remote publication of a tweet from the green roof are summarized in Figure 1 below.

Figure 1: Protocol diagram for remote tweet publication

1. Raspberry Pi Setup:

Raspberry Pis require a 5 V, 2.5 A power supply, as well as a keyboard, mouse, and monitor for initial setup.

The Linux-based operating system, Raspbian, is loaded manually onto a blank Pi via a micro-SD card and installs automatically on the first startup. After that, the computer and its pre-installed software are ready to use. The first step in making the device remotely accessible was establishing it as a Virtual Network Computing (VNC) server. VNC is free software that allows any computer with approved credentials to access a remote server and operate it as a normal desktop. If the initial design plans of full automation had been realized, VNC would have been unnecessary. However, because the design shifted to manual operations, it became a key component. Logging on to the VNC software from a remote location is the first step in fetching data and operating the tweeting scripts.

Providing power to the Raspberry Pi from the CR1000 proved particularly difficult. The CR1000 offers multiple ports for powering external devices, including two 12 V ports, one switching 12 V port, and a 5 V port. Because the Raspberry Pi requires 5 V, the initial decision was to use the 5 V port. However, the CR1000 does not provide enough current through this port: the Raspberry Pi requires upwards of 2.5 A, while the CR1000 was shown with a multimeter to produce only around 250 mA. This low power output made the 5 V port ineffective. The CR1000 operation manual states that the 12 V ports can supply up to 3 A. Supplying 12 V directly to the Raspberry Pi could seriously damage its circuitry; however, it is possible to step the voltage down from 12 V to 5 V while maintaining the needed current. A linear voltage regulator was tested as the solution to this problem.

A linear voltage regulator takes an input of 12 V and provides an output of 5 V. However, during testing the small device reached an extremely high temperature, slightly melted, and then caused the Raspberry Pi to lose power. The high temperature occurs because a linear regulator dissipates the entire voltage drop as heat: at 3 A, the 7 V difference amounts to roughly 7 V × 3 A = 21 W of wasted power. A better solution was found by dismantling a 12 V-to-5 V USB car charger intended for use in cigarette lighter ports (Image 3; Image 4). Inside these devices one can find a positive and a negative terminal for connection, and because they convert the voltage with a switching circuit rather than a linear regulator, they do not produce excess heat. A 12 V feed from the CR1000 was therefore attached to the positive terminal (spring) of the car charger, while another wire connected the negative terminal to the "ground" port of the CR1000 (Image 5).

Image 1: Sensing station on Hillside Auditorium Green Roof
Image 2: Instrumentation inside sensing station
Image 3: 12V to 5V USB car charger with casing removed
Image 4: Voltage reducer with lead wires and USB assembled
Image 5: Power adapter connected CR1000 and battery via switching 12V port

The Raspberry Pi requires significantly more power to operate than the environmental sensors attached to the CR1000. In order to avoid overusing the battery power supplied by the solar panel on the sensing station, it is necessary to power down the Raspberry Pi when it is not in use. Luckily, the CR1000 comes equipped with a "switching 12V" port that can be controlled by the datalogger's operating program, written in CRBasic. This addition is simple and can be adjusted so that the Raspberry Pi is powered on only when needed for file transfer. The switching 12V port was programmed to be active only from 11:00 am to 4:00 pm every day via the CRBasic code in Figure 2 below.

Figure 2: Switching 12V power CRBasic code

One challenge with the switching 12V power supply is that the Raspberry Pi requires a proper shutdown before power is disconnected in order to avoid corruption of the operating system. This was accomplished with a utility built into the Raspberry Pi called "crontab," which allows terminal commands to be scheduled based on the Raspberry Pi's onboard clock. To schedule a shutdown for 3:50 pm, the following line must be added to the "/etc/crontab" file:

50 15 * * *   root   /sbin/shutdown -h now

2. CR1000 Connection:

Once the Raspberry Pi and its VNC software are functional, a connection to the CR1000 datalogger must be established. This connection proved to be the most difficult portion of the project. The CR1000 connects to computers via an RS-232 cable, and the Raspberry Pi has no RS-232 port. Luckily, the device is built to be adapted to a wide variety of tasks. The first attempted solution was installing an RS-232 adapter in the form of a HAT (Hardware Attached on Top) that can be soldered to the Raspberry Pi's pins. This device showed no promise, however, and all attempts at establishing a connection with it failed; the cause of its ineffectiveness is unknown. As an alternative, it was decided that a USB-to-RS-232 adapter would be most effective, and this adapter was included in the final design.

This project is not the first attempt to connect a Raspberry Pi – or any other computer – to a CR1000 datalogger. Therefore, a wide array of predeveloped code exists online from sources such as GitHub. However, these sources consistently failed to establish steady connections, likely because they are out of date and incompatible with current Python and Linux software versions. The creators of the source code were unresponsive to troubleshooting requests.

At this point in the project I teamed up with Biological Engineering Ph.D. student Colby Reavis to establish a stable connection. Colby and I tried more incompatible, predeveloped code before deciding to establish the connection ourselves. Colby suggested a connection protocol I was unaware of called Point-to-Point Protocol (PPP). The CR1000 datalogger has PPP embedded in its design, so we simply had to assign an IP address to the datalogger and manually tell the Raspberry Pi, via a PPP command, to establish a link between its own IP address and the one assigned to the datalogger. The command is as follows:

sudo pppd /dev/ttyUSB0 115200 debug noauth nodetach 192.168.27.236:10.0.0.1

  • sudo – runs the command with full administrator privileges
  • pppd – starts the Point-to-Point Protocol daemon
  • /dev/ttyUSB0 – tells pppd to use the serial port created by the USB-to-RS-232 adapter
  • 115200 – the baud rate (communication speed) of the CR1000
  • debug noauth nodetach – PPP options that log debugging output, bypass authentication requirements, and keep the connection running in the foreground for stability
  • 192.168.27.236:10.0.0.1 – the Raspberry Pi IP address and the assigned CR1000 IP address

Once this command is entered, the two devices are effectively communicating. Communication can be confirmed by pinging the datalogger with the following command:

ping 10.0.0.1

If the Raspberry Pi terminal shows ping replies, the two devices have been successfully connected.
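For future automation, this connection step could be scripted rather than typed by hand. The following is a minimal sketch of how that might look, assuming the adapter still appears as /dev/ttyUSB0, the same IP addresses as above, and that the script is run with administrator privileges; it simply launches pppd and then pings the CR1000 to verify the link.

# ppp_connect.py – illustrative sketch: bring up PPP to the CR1000 and verify with ping.
# Assumes the USB-to-RS-232 adapter is /dev/ttyUSB0 and the script is run with sudo.
import subprocess
import time

PPP_COMMAND = [
    "pppd", "/dev/ttyUSB0", "115200",
    "debug", "noauth", "nodetach",
    "192.168.27.236:10.0.0.1",
]

# Start pppd; "nodetach" keeps it attached to this process instead of forking.
ppp = subprocess.Popen(PPP_COMMAND)

try:
    time.sleep(10)  # give the link a few seconds to negotiate
    # Send three pings to the CR1000 and report whether any reply came back.
    result = subprocess.run(["ping", "-c", "3", "10.0.0.1"])
    if result.returncode == 0:
        print("CR1000 is reachable over PPP.")
    else:
        print("No ping reply from the CR1000; check the serial link.")
finally:
    ppp.terminate()  # drop the PPP link when finished

In the final design these commands were entered manually in the terminal; a wrapper like this would only matter if the connection step were folded into a fully automated workflow.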

3. Twitter Application:

Twitter has a section of its website that serves as a platform for application developers. In order to have automated scripts write and publish tweets, it was first necessary to register the design as an application on that platform. Once the profile for @UAGreenRoof was operational, an application request was sent to Twitter under the name "UAGreenRoofBot" in order to acquire access to the developer platform. The request was accepted immediately, and access was made available to important information required by the Python scripts. To bypass the typical steps of posting a tweet, such as logging in with a password, the Twitter developer platform grants access to a profile's specific tokens and keys that allow remote access from external programs. These tokens and keys are essentially long strings of letters and numbers that act as a username and password, allowing external programs to log in to the account. With this information in hand, the next step was to create a Python module that saved these tokens and keys so that other Python modules could simply import them in order to access the account. The module was titled "auth.py" and looked like the script below (tokens and keys redacted to maintain privacy):

consumer_key = 'insert_key'
consumer_secret = 'insert_secret'
access_token = 'insert_token'
access_token_secret = 'insert_token_secret'
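The thesis does not state which Twitter library the other modules use, so purely for illustration the sketch below assumes the widely used tweepy package. It shows how a script might import the redacted credentials from auth.py and obtain an authenticated connection to the account.

# twitter_client.py – hypothetical helper showing how auth.py could be consumed.
# Assumes the tweepy library (pip install tweepy); the actual scripts may use a different package.
import tweepy

import auth  # the module holding the redacted tokens and keys

def get_api():
    # OAuth 1.0a handshake using the consumer key/secret and access token/secret
    handler = tweepy.OAuthHandler(auth.consumer_key, auth.consumer_secret)
    handler.set_access_token(auth.access_token, auth.access_token_secret)
    return tweepy.API(handler)

if __name__ == "__main__":
    api = get_api()
    print("Authenticated as:", api.verify_credentials().screen_name)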

4. Python Script:

With "auth.py" saved, it could be conveniently referenced by any new Python script. Because full automation was not reached in this design, several different scripts were written for different occasions in order to make the device easily accessible by any future students who take up the mantle of maintaining the Twitter account. These Python scripts operate independently of the datalogger and are built as a way to easily publish graphs, captured-water-volume figures, and randomized messages regarding water capture. The initial design had these scripts pulling data from the datalogger, drawing graphs of daily soil water balances, and posting them as images. However, this objective proved too complex for the time available, so it was scrapped in favor of simple captured-rain-volume calculations. Below is an example of the Python script titled "Rain2Volume_random.py":

Figure 3: Python script for random captured rainfall tweet

This script operates by calculating a captured volume based on rainfall depth, rounding it to two decimal places, selecting a random caption to be tweeted from a list of strings, inserting the calculated value into the empty braces of the randomly selected caption via the command "random_caption.format(output)", and posting the resulting message to Twitter. The ".format(output)" call takes any "{}" in the caption and replaces it with the assigned value. The script is primarily used for quickly calculating and publishing a randomized tweet the day after a rainfall event.
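Since the actual script appears only as Figure 3, the sketch below reconstructs its logic from the description above. The roof area, the captions, and the use of the tweepy library are assumptions made for illustration, not values from the thesis.

# Rain2Volume_random.py – sketch reconstructed from the description above.
# The roof area, captions, and tweepy usage are placeholders; the real constants
# and captions appear only in Figure 3.
import random

import tweepy

import auth

ROOF_AREA_M2 = 500.0      # placeholder green roof area, in square meters
rainfall_depthcm = 1.27   # entered manually from the NWS Drake Field gauge

# Captured volume in liters: depth (m) x area (m^2) x 1000 L/m^3, rounded to two decimals
output = round((rainfall_depthcm / 100.0) * ROOF_AREA_M2 * 1000.0, 2)

captions = [
    "I soaked up {} liters of stormwater yesterday!",
    "Yesterday's rain left {} liters in my soil instead of in the storm drain.",
    "That's {} liters of runoff that never reached the street.",
]

# Pick a caption at random and drop the calculated value into its empty braces
random_caption = random.choice(captions)
message = random_caption.format(output)

# Authenticate with the redacted tokens and keys, then publish the tweet
handler = tweepy.OAuthHandler(auth.consumer_key, auth.consumer_secret)
handler.set_access_token(auth.access_token, auth.access_token_secret)
api = tweepy.API(handler)
api.update_status(message)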

The following script, "Rain2Volume_nonrandom.py", is used when a tweet requires more customization:

Figure 4: Python script for custom captured rainfall tweet

This script operates like "Rain2Volume_random.py" except that it forgoes the randomized messages and allows more control. In the example above it was used to give the total captured volume for a series of rainy days.
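Under the same assumptions as the previous sketch, this variant would differ only in that the operator writes the caption by hand:

# Rain2Volume_nonrandom.py – sketch of the customizable variant (same assumptions as above).
import tweepy

import auth

ROOF_AREA_M2 = 500.0      # placeholder area, square meters
rainfall_depthcm = 4.32   # total depth over the series of rainy days

output = round((rainfall_depthcm / 100.0) * ROOF_AREA_M2 * 1000.0, 2)

# The operator writes the caption, leaving {} where the calculated volume belongs
caption = "Over the last three rainy days I captured a total of {} liters of stormwater."

handler = tweepy.OAuthHandler(auth.consumer_key, auth.consumer_secret)
handler.set_access_token(auth.access_token, auth.access_token_secret)
api = tweepy.API(handler)
api.update_status(caption.format(output))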

The above Python scripts act independently of the green roof datalogger and use rainfall data from the National Weather Service rain gauge at Drake Field. They are used to educate the public on the primary function of green roofs: decreasing the impervious area in a watershed and reducing the amount of runoff from a precipitation event.

The final Python script is a simple program that uploads an image file to Twitter with an accompanying caption. The example, "Image_tweet.py", is displayed below:

Figure 5: Python script for custom image tweet

The function of this script could just as easily be replicated manually on the Twitter app or website without any complicated Python syntax. However, it was designed with an automated system in mind – one that would create the image file from data fetched from the datalogger and post it on a daily basis to update and educate the public on the environmental conditions of the green roof. The remaining challenge was therefore not automated tweeting but automated fetching and organization of data. Data fetching was achieved at a manual level via a 47-year-old protocol called File Transfer Protocol.
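Because the script itself appears only as Figure 5, the following sketch illustrates what such an image tweet might look like, again assuming tweepy and the auth.py module; the file path and caption are placeholders.

# Image_tweet.py – sketch of a custom image tweet (assuming tweepy and auth.py as above).
import tweepy

import auth

IMAGE_PATH = "/home/pi/GreenRoofBot/daily_graph.jpg"  # hypothetical path to the saved graph
CAPTION = "Here is yesterday's soil moisture data from my roof!"  # written by the operator

handler = tweepy.OAuthHandler(auth.consumer_key, auth.consumer_secret)
handler.set_access_token(auth.access_token, auth.access_token_secret)
api = tweepy.API(handler)

# Upload the image first, then attach its media ID to the status update
media = api.media_upload(IMAGE_PATH)
api.update_status(status=CAPTION, media_ids=[media.media_id])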

5.  File Transfer Protocol:

The most significant breakthrough of this project was the successful transfer of a file from the datalogger to the Raspberry Pi. This was significant because remote file transfer became the primary goal of the design after the pivot away from full automation. File Transfer Protocol (FTP) is a decades-old method of sending and receiving files between two devices over any stable connection. For this project it was first successfully tested via a graphical user interface (GUI) called FileZilla, which allows the user to input the IP address and port of the server (in this case the CR1000) to which the Raspberry Pi is connected. After that, the program runs an FTP session and allows the free transfer of files between the two devices.

Once it was known that FTP was effective and file transfer was possible, operating FTP from the Raspberry Pi command terminal proved faster and simpler than using FileZilla. The Linux commands and responses for FTP are as follows (a scripted equivalent is sketched after the list):

Figure 6: File Transfer Protocol (FTP) command example
  • ftp – starts the FTP program
  • open 10.0.0.1 – establishes a connection between the Raspberry Pi and the CR1000
  • Enter the username and password of the FTP server when prompted
  • cd CRD – changes the remote directory from which files are pulled to the CR1000 memory card
  • lcd /home/pi/GreenRoofBot – sets the location on the Raspberry Pi to which the files will transfer
  • mget – downloads the specified files (each transfer must be confirmed with y/n)
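The same transfer could also be scripted with Python's built-in ftplib module, which is one route toward folding this manual step into an automated workflow. The sketch below assumes the PPP link is up, the CR1000 answers at 10.0.0.1 with valid FTP credentials, and uses a placeholder file name.

# fetch_data.py – sketch of the FTP step using Python's standard ftplib module.
# The credentials and file name below are placeholders.
from ftplib import FTP

ftp = FTP("10.0.0.1")
ftp.login(user="username", passwd="password")  # credentials configured on the CR1000
ftp.cwd("CRD")                                 # memory card directory on the datalogger

print(ftp.nlst())  # list the available data files

# Download one data table into the GreenRoofBot folder on the Raspberry Pi
with open("/home/pi/GreenRoofBot/GreenRoof_Daily.dat", "wb") as local_file:
    ftp.retrbinary("RETR GreenRoof_Daily.dat", local_file.write)

ftp.quit()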

6. Operation Steps:

This project is designed to continue past the date of thesis publication because the honors research project it accompanies will continue through the 2019 calendar year. The operation of this design must therefore be made accessible to any future students who wish to use it. It is also apparent that this project is not alone in the quest to establish a consistent and stable connection between a Raspberry Pi and a CR1000. I will therefore detail the steps necessary to operate the system below:

File Transfer and CR1000 data tweet:

  1. Log into VNC Viewer on your personal computer
  2. Log into the VNC server of the Raspberry Pi
  3. Run the PPP command (given in section III.2) in the Raspberry Pi command terminal
  4. Establish an FTP connection
  5. Transfer the needed data files from the CR1000 to the Raspberry Pi
  6. Open the ".dat" data file as a CSV file delimited by spaces and create a graph of the data (a scripted sketch of this step follows these lists)
  7. Save the graph as a .jpg image
  8. Open the Image_tweet.py module from the folder titled "GreenRoofBot," craft a caption to accompany the data, and direct the script to open and upload the .jpg image
  9. Run the Python script

Volume of water captured tweet:

If rain occurred only on the previous day:
  1. Open the Rain2Volume_random.py module from the folder titled "GreenRoofBot"
  2. Find the total rainfall (cm) from the NWS page for the Drake Field weather station
  3. Input the total depth of rain (cm) as the rainfall_depthcm value
  4. Run the Python script

If rain occurred over a series of days, or if attempting to send a customized tweet:
  1. Open the Rain2Volume_nonrandom.py module from the folder titled "GreenRoofBot"
  2. Find the total rainfall (cm) from the NWS page for the Drake Field weather station
  3. Input the total depth of rain (cm) as the rainfall_depthcm value
  4. Write the accompanying tweet with "{}" as a placeholder for the total volume value
  5. Run the Python script
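Step 6 of the file transfer procedure is currently done by hand in a spreadsheet; the sketch below shows one way it could eventually be scripted. It assumes the pandas and matplotlib libraries are installed and uses placeholder file and column names, since the actual CR1000 table layout is not reproduced in this document.

# plot_dat.py – sketch of step 6: turn a transferred .dat file into a .jpg graph.
# The file path and column name are placeholders for the real CR1000 table.
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # render without a display (useful over VNC or when headless)
import matplotlib.pyplot as plt

# Read the space-delimited data file transferred from the CR1000
data = pd.read_csv("/home/pi/GreenRoofBot/GreenRoof_Daily.dat", sep=r"\s+", header=0)

# Plot a hypothetical soil moisture column against the record number
data.plot(y="SoilMoisture", legend=False)
plt.xlabel("Record")
plt.ylabel("Soil moisture")
plt.title("Hillside Auditorium Green Roof")

plt.savefig("/home/pi/GreenRoofBot/daily_graph.jpg")  # image later posted by Image_tweet.py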

IV. RESULTS

If a stable connection is in place and any necessary troubleshooting is complete, following the file transfer operation steps in section III.6 delivers a successful remote data collection. This procedure was proven successful multiple times. The procedures for randomized and customized tweeting were also shown to be consistently effective. While an automated data collection, organization, and tweeting system was not realized, the structure is in place for its establishment: all of the operational Python scripts were designed to be built upon toward full automation. The possible procedure for continuing this design is discussed in the Discussion and Future Opportunities section.
