DXpedition Project


This project is about creating a special build of the Raspberry Pi for a Ham Radio Operator Club. Periodically, Ham Radio Organizations partner with other organizations and government agencies to embark on “expeditions”. They travel to remote locations, set up a communications operation, and try to contact other Ham Radio Operators worldwide within a very specific time frame. These guys go to seemingly impossible locations, bringing their own power generation and resources. They leverage many laptop computers, a network, and a satellite uplink to record all the radio contacts made during the expedition. With power being so precious, using Raspberry Pi’s where possible is a huge help in saving energy. This story is about a configuration of Raspberry Pi’s to support such an adventure.

Purpose

This project centered on the development of a set of ‘applications’ and the network design to automate the collection, processing, storage and uploading of ham radio contact information. As with most projects, refinements and changes happen. This project was no different, so I am editing the post to reflect those refinements.

Requirements

Data Collection

The information being collected represents the date and time, the operators on each ‘side’ of the connection, and the specifics of the connection used. The information captured to the local database includes:

  • Contact Date & Time
  • Call Sign of Contact
  • Operator Call Sign
  • Contact Mode (CW or voice)
  • Band
  • Frequency

Each PC on the private network runs an application that collects this information as entered by ham radio operators. After each contact event, the data are transferred to the ‘Collector’ over the network using UDP/IP. The format of the data in the packet conforms to the ADIF specification, a standard format used by ham radio logging programs with a syntax similar to XML. Information on ADIF can be found at http://www.adif.org/.
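
For illustration, a single contact rendered in ADIF might look like the record below (the call signs and values are placeholders, and the exact field set sent by the logging software may differ):

    <QSO_DATE:8>20231015 <TIME_ON:4>1230 <CALL:5>K1ABC <STATION_CALLSIGN:4>W1AW <MODE:2>CW <BAND:3>20M <FREQ:6>14.025 <EOR>

Each field is written as <NAME:length>value, and <EOR> marks the end of a record.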

Database

We are using a mysql database to record the events in real time on a pair of Raspberry Pi systems. The workstations actually send their data to the network broadcast address, but they can also send the data to multiple specific IP addresses. This provides data protection through redundant Pi Collectors. The contact workstations generate more data points than are used in reporting, but we store everything in case other analysis is required after the fact.
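
As a rough sketch, the contact table on each Pi Collector could look something like the following. The table and column names here are illustrative only, not the actual schema:

    CREATE TABLE contacts (
        id            INT AUTO_INCREMENT PRIMARY KEY,
        qso_date      DATE NOT NULL,           -- Contact date
        time_on       TIME NOT NULL,           -- Contact time
        call_sign     VARCHAR(16) NOT NULL,    -- Call sign of contact
        operator_call VARCHAR(16) NOT NULL,    -- Operator call sign
        mode          VARCHAR(8),              -- CW, voice, etc.
        band          VARCHAR(8),
        freq          DECIMAL(10,4),           -- MHz
        raw_adif      TEXT,                    -- full record kept for later analysis
        uploaded      TINYINT(1) DEFAULT 0     -- set after a successful FTP transfer
    );

Keeping the raw ADIF text alongside the parsed columns is one way to preserve the extra data points mentioned above.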

Data Transfer

One goal of the application is to feed the contact data back to a main server over a satellite Internet connection so that the data can be analyzed throughout the event. A cron job on the Pi Collectors runs a contact upload script every five minutes. The script performs a quick check for an active satellite uplink to the Internet by trying to reach the FTP server. If that succeeds, the mysql database is queried for the new records stored since the last uplink transfer; the record set is formatted for ADIF compliance, saved as a data file, and transmitted to the central FTP server over the satellite link. A lot of redundant data is kept deliberately: the locally stored database and the FTP transfer file are saved on each Pi Collector as a backup in case the satellite uplink equipment fails. If both Pi Collectors upload their data, record de-duplication takes place at the central site.
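
On the Pi Collectors this is an ordinary crontab entry, something along these lines (the script and log paths are hypothetical):

    # m h dom mon dow  command
    */5 * * * * /home/pi/bin/upload_contacts.py >> /home/pi/logs/upload.log 2>&1

A sketch of the upload script itself appears in the Pi Transmission Process section below.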

Other Functions

  • The Pi Collector will run an NTP service to serve as a time reference for the PCs on the network; the initial time will be set manually (see the configuration sketch after this list).
  • The Pi Collector will support a firewall between the network connections.
  • Redundant Pi Collectors operate in parallel.
  • The Pi will run SSH and xrdp for headless operation to save energy.
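
For the NTP piece, a minimal sketch of the relevant ntp.conf lines might look like this, assuming the stock ntpd package (the workstation subnet shown is an assumption):

    # Use the Pi's own clock (set manually at startup) as the reference
    server 127.127.1.0              # local clock driver
    fudge  127.127.1.0 stratum 10
    # Let the workstation subnet query time but not modify the server
    restrict 192.168.1.0 mask 255.255.255.0 nomodify notrap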

Reports

During and after the event, a number of reports will need to be available. The Raspberry Pi’s run an Apache server with PHP support to facilitate the generation of reports from the mysql database. Some experimentation is being done at this time with Splunk to create dashboards for immediate report creation. These are some of the key reporting metrics; a sketch of the corresponding queries follows the list.

  • Number of total records stored in the database
  • Number of records written into the database in the last full 24-hour period
  • The number of times files were uploaded via FTP
  • Total number of SQL records saved / uploaded
  • Date/time of each upload
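
Against the illustrative table sketched in the Database section, the first two metrics reduce to queries roughly like these (a sketch, not the production report code):

    -- Total records stored in the database
    SELECT COUNT(*) FROM contacts;

    -- Records written in the last full 24-hour period
    SELECT COUNT(*) FROM contacts
    WHERE TIMESTAMP(qso_date, time_on) >= NOW() - INTERVAL 1 DAY;

The upload counts and upload timestamps would likely come from a small upload-history table or from the transfer script’s log.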

Testing the application

Testing is a little tricky because we don’t want to use the satellite system as our Internet connection until absolutely necessary. Until then, a typical non-business Internet connection (a cable modem) will be used to test the uplink functionality.

Data generation testing is not as critical, since the PC application has been used many times in other expeditions to create the XML data that is sent to a collection system. These systems will be simulated with a Python socket client script that generates records and sends them via UDP/IP to a Python socket server on the Raspberry Pi’s, at the pace that would be expected in the field.
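
A minimal sketch of that simulator, with the broadcast address, port, and record contents all as placeholders, might look like this:

    #!/usr/bin/env python
    # Hypothetical test client: broadcasts ADIF-style contact records over UDP
    # at roughly the pace expected in the field.
    import socket
    import time

    BROADCAST_ADDR = ("192.168.1.255", 2237)   # assumed broadcast address and port
    SAMPLE_RECORD = ("<QSO_DATE:8>20231015 <TIME_ON:4>1230 <CALL:5>K1ABC "
                     "<STATION_CALLSIGN:4>W1AW <MODE:2>CW <BAND:3>20M "
                     "<FREQ:6>14.025 <EOR>")

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

    for _ in range(100):                       # send 100 test contacts
        sock.sendto(SAMPLE_RECORD.encode("ascii"), BROADCAST_ADDR)
        time.sleep(15)                         # about one contact every 15 seconds

The matching server side is sketched in the next section.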

The Pi Collector Process

The Pi’s will have a wired connection to the workstation network used by the operators’ PC’s to facilitate data collection. The data from the PC’s will be sent to the network broadcast address on a specific UDP port so that both Pi’s see the data packets. Alternatively, the PC’s can send the data to both Pi’s by IP address, which would require static IP’s.

Both Pi’s collect the same data, storing it in raw format in a log file as well as parsing the XML and writing records to the mysql database.
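
A stripped-down sketch of the collector side, assuming the same placeholder port and the illustrative table from the Database section (the ADIF parsing here is deliberately simplistic and the credentials are placeholders):

    #!/usr/bin/env python
    # Hypothetical collector: listens for UDP packets, appends each raw record
    # to a log file, parses the ADIF fields, and inserts a row into mysql.
    import re
    import socket
    import MySQLdb   # assumes the MySQL-python package is installed

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", 2237))                      # placeholder port, must match the senders

    db = MySQLdb.connect(user="collector", passwd="secret", db="dxpedition")

    while True:
        data, addr = sock.recvfrom(4096)
        record = data.decode("ascii", "replace")

        # Raw copy first, so nothing is lost if parsing fails
        with open("/home/pi/logs/raw_contacts.log", "a") as log:
            log.write(record + "\n")

        # Crude ADIF parse: pull <NAME:length>value pairs into a dict
        fields = dict((name.upper(), value[:int(length)])
                      for name, length, value
                      in re.findall(r"<(\w+):(\d+)>([^<]*)", record))
        time_on = fields.get("TIME_ON", "0000")
        time_on = time_on[:2] + ":" + time_on[2:4]   # "1230" -> "12:30"

        cur = db.cursor()
        cur.execute(
            "INSERT INTO contacts (qso_date, time_on, call_sign, operator_call,"
            " mode, band, freq, raw_adif) VALUES (%s,%s,%s,%s,%s,%s,%s,%s)",
            (fields.get("QSO_DATE"), time_on, fields.get("CALL"),
             fields.get("STATION_CALLSIGN"), fields.get("MODE"),
             fields.get("BAND"), fields.get("FREQ"), record))
        db.commit()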

The Pi Transmission Process

Each Pi will act as an Access Point for a wireless connection used for remote management and report viewing. A simple Internet gateway, like the router you might use at home, sits between this network and the satellite uplink terminal. The satellite uplink terminal is not online constantly, as that would be cost prohibitive. Instead, an operator will sign on to the uplink connection 3 or 4 times per day for various administrative reasons.

The Pi Collector’s transfer application is set up as a cron job that executes every 5 minutes. At each interval the script runs and attempts to connect to the FTP server in the cloud. If the FTP connection succeeds, the Pi queries the database for the records that have not yet been transferred, wraps them in XML according to the ADIF specification, and saves the result as a gzip-compressed file, which is then transferred to the FTP server. This processing minimizes the amount of bandwidth used over the satellite uplink.
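
A sketch of that transfer script, with the FTP host, credentials, paths, and table layout all as placeholders, could look roughly like this:

    #!/usr/bin/env python
    # Hypothetical upload script run from cron every 5 minutes.
    import ftplib
    import gzip
    import time
    import MySQLdb   # assumes the MySQL-python package is installed

    FTP_HOST = "ftp.example.org"               # placeholder central server

    # Step 1: bail out quietly unless the satellite uplink is active.
    try:
        ftp = ftplib.FTP(FTP_HOST, timeout=15)
        ftp.login("dxpedition", "secret")
    except Exception:
        raise SystemExit(0)

    # Step 2: pull the records that have not been uploaded yet.
    db = MySQLdb.connect(user="collector", passwd="secret", db="dxpedition")
    cur = db.cursor()
    cur.execute("SELECT id, raw_adif FROM contacts WHERE uploaded = 0")
    rows = cur.fetchall()
    if not rows:
        ftp.quit()
        raise SystemExit(0)

    # Step 3: write the ADIF records to a gzip file and push it over the uplink.
    filename = "contacts_%s.adi.gz" % time.strftime("%Y%m%d%H%M%S")
    path = "/home/pi/uploads/" + filename
    with gzip.open(path, "wb") as out:
        for _, raw in rows:
            out.write(raw.encode("ascii") + b"\n")
    with open(path, "rb") as f:
        ftp.storbinary("STOR " + filename, f)
    ftp.quit()

    # Step 4: mark the transferred records so they are not sent again.
    cur.executemany("UPDATE contacts SET uploaded = 1 WHERE id = %s",
                    [(row_id,) for row_id, _ in rows])
    db.commit()

Keeping the gzip file on the Pi doubles as the local backup copy described in the Requirements section.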


In Part 2 we discuss building the DX-Pi’s.