
GTM Stack: Exploring IoT Data Analytics at the Edge with Grafana, Mosquitto, and TimescaleDB on ARM-based Architectures

In the following post, we will explore the integration of several open-source software applications to build an IoT edge analytics stack, designed to operate on ARM-based edge nodes. We will use the stack to collect, analyze, and visualize IoT data without first shipping the data to the Cloud or other external systems.

GTM IoT Edge Analytics Stack architecture (Image by author)

The Edge

Edge computing is a fast-growing technology trend, which involves pushing compute capabilities to the edge. Wikipedia describes edge computing as a distributed computing paradigm that brings computation and data storage closer to the location where it is needed, to improve response times and save bandwidth. The term edge commonly refers to a compute node at the edge of a network (edge device), sitting in close proximity to the source of data and between that data source and external systems, such as the Cloud.

In his recent post, 3 Advantages (And 1 Disadvantage) Of Edge Computing, well-known futurist Bernard Marr cites reduced bandwidth requirements, reduced latency, and enhanced security and privacy as three primary advantages of edge computing. Marr notes one potential disadvantage of edge computing: due to techniques like data downsampling, important data could end up being overlooked and discarded in the quest to save bandwidth and reduce latency.

David Ricketts, Head of Marketing at Quiss Technology PLC, estimates in his post, Cloud and Edge Computing — The Stats You Need to Know for 2018, that the global edge computing market is expected to reach USD 6.72 billion by 2022, at a compound annual growth rate of a whopping 35.4 percent. Realizing the market potential, many major Cloud providers, edge device manufacturers, and integrators are rapidly expanding their edge compute capabilities. AWS, for example, currently offers more than a dozen services in the edge computing category.

Internet of Things

Edge computing is frequently associated with the Internet of Things (IoT). IoT devices, industrial equipment, and sensors generate data, which is transmitted to other internal and external systems, often by way of edge nodes, such as an IoT Gateway. IoT devices typically generate time-series data. According to Wikipedia, a time series is a set of data points indexed in time order — a sequence taken at successive equally spaced points in time. IoT devices typically generate continuous high-volume streams of time-series data, often on the scale of millions of data points per second. IoT data characteristics require IoT platforms to minimally support temporal accuracy, high-volume ingestion and processing, efficient data compression and downsampling, and real-time querying capabilities.

The IoT devices and the edge devices, such as IoT Gateways, which aggregate and transmit IoT data from these devices to external systems, are generally lower-powered, with limited processor, memory, and storage capabilities. Accordingly, IoT platforms must satisfy all the requirements of IoT data while simultaneously supporting resource-constrained environments.

IoT Analytics at the Edge

Leading Cloud providers AWS, Azure, Google Cloud, IBM Cloud, Oracle Cloud, and Alibaba Cloud all offer IoT services. Many offer IoT services with edge computing capabilities. AWS offers AWS IoT Greengrass. Greengrass provides local compute, messaging, data management, sync, and ML inference capabilities to edge devices. Azure offers Azure IoT Edge. Azure IoT Edge provides the ability to run AI, Azure and third-party services, and custom business logic on edge devices using standard containers. Google Cloud offers Edge TPU. Edge TPU (Tensor Processing Unit) is Google’s purpose-built application-specific integrated circuit (ASIC), designed to run AI at the edge.

IoT Analytics

Many Cloud providers also offer IoT analytics as part of their suite of IoT services, although not at the edge. AWS offers AWS IoT Analytics, while Azure has Azure Time Series Insights. Google provides IoT analytics, indirectly, through downstream analytic systems and ad hoc analysis using Google BigQuery or advanced analytics and machine learning with Cloud Machine Learning Engine. These services generally all require data to be transmitted to the Cloud for analytics.

Cloud-centric IoT platform data flow (Image by author)

The ability to analyze IoT data at the edge, as data is streamed in real-time, is critical to a rapid feedback loop. IoT edge analytics can accelerate anomaly detection, improve predictive maintenance capabilities, and expedite proactive inventory replenishment.

The IoT Edge Analytics Stack

In my opinion, the ideal IoT edge analytics stack is comprised of lightweight, purpose-built, easily deployable and manageable, platform- and programming language-agnostic, open-source software components. The minimal IoT edge analytics stack should include a lightweight message broker, a time-series database, an ANSI-standard ad-hoc query engine, and a data visualization tool. Each component should be purpose-built for IoT.

Lightweight Message Broker

We will use Eclipse Mosquitto as our message broker. According to the project’s description, Mosquitto is an open-source message broker that implements the Message Queuing Telemetry Transport (MQTT) protocol versions 5.0, 3.1.1, and 3.1. Mosquitto is lightweight and suitable for use on all devices from low power single board computers (SBCs) to full servers.

MQTT Client Library

We will interact with Mosquitto using Eclipse Paho. According to the project, the Eclipse Paho project provides open-source, mainly client-side implementations of MQTT and MQTT-SN in a variety of programming languages. MQTT and MQTT for Sensor Networks (MQTT-SN) are lightweight publish/subscribe messaging transports for TCP/IP and connectionless protocols, such as UDP, respectively.

We will be using Paho’s Python Client. The Paho Python Client provides a client class with support for both MQTT v3.1 and v3.1.1 on Python 2.7 or 3.x. The client also provides helper functions to make publishing messages to an MQTT server straightforward.
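
As a quick illustration of those helper functions, below is a minimal sketch (not part of this post's project code) that publishes a single JSON message with Paho's publish.single; the broker host, port, and topic are placeholders for your own Mosquitto instance.

import json

import paho.mqtt.publish as publish

# Publish one JSON message to a Mosquitto broker (placeholder host and topic)
publish.single(
    topic="sensor/output",
    payload=json.dumps({"temperature": 20.5}),
    hostname="192.168.1.12",  # edge node running Mosquitto
    port=1883,
    qos=0,
)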

Time-Series Database

Time-series databases are optimal for storing IoT data. According to InfluxData, makers of a leading time-series database, InfluxDB, a time-series database (TSDB), is a database optimized for time-stamped or time-series data. Time series data are simply measurements or events that are tracked, monitored, downsampled, and aggregated over time. Jiao Xian, of Alibaba Cloud, has authored an insightful post on the time-series database ecosystem, What Are Time Series Databases? A few leading Cloud providers offer purpose-built time-series databases, though they are not available at the edge. AWS offers Amazon Timestream and Alibaba Cloud offers Time Series Database.

InfluxDB is an excellent choice for a time-series database. It was my first choice, along with TimescaleDB, when developing this stack. However, InfluxDB Flux’s apparent incompatibilities with some ARM-based architectures ruled it out for inclusion in the stack for this particular post.

We will use TimescaleDB as our time-series database. TimescaleDB is the leading open-source relational database for time-series data. Described as ‘PostgreSQL for time-series,’ TimescaleDB is based on PostgreSQL, which provides full ANSI SQL, rock-solid reliability, and a massive ecosystem. TimescaleDB claims to achieve 10–100x faster queries than PostgreSQL, InfluxDB, and MongoDB, with native optimizations for time-series analytics.

TimescaleDB is designed for performing analytical queries, both through its native support for PostgreSQL’s full range of SQL functionality, as well as additional functions native to TimescaleDB. These time-series optimized functions include Median/Percentile, Cumulative Sum, Moving Average, Increase, Rate, Delta, Time Bucket, Histogram, and Gap Filling.
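
As a quick, hedged illustration, the sketch below (not part of the project's scripts) calls one of these functions, time_bucket, from Python using psycopg2, the same driver the edge node's script uses later in this post; the connection string simply mirrors the stack's default credentials and the sensor_data table created further down.

import psycopg2

# Average temperature per device in 5-minute buckets, newest first (sample query)
sql = """
    SELECT time_bucket('5 minutes', time) AS bucket,
           device_id,
           AVG(temperature) AS avg_temp
    FROM sensor_data
    GROUP BY bucket, device_id
    ORDER BY bucket DESC
    LIMIT 10;
"""

with psycopg2.connect("postgres://postgres:postgres1234@192.168.1.12:5432/demo_iot") as conn:
    with conn.cursor() as curs:
        curs.execute(sql)
        for bucket, device_id, avg_temp in curs.fetchall():
            print(bucket, device_id, avg_temp)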

Ad-hoc Data Query Engine

We have the option of using psql, the terminal-based front-end to PostgreSQL, to execute ad-hoc queries against TimescaleDB. The psql front-end enables you to enter queries interactively, issue them to PostgreSQL, and see the query results.

View of psql terminal-based interface for querying the TimescaleDB database

We also have the option of using pgAdmin, specifically the biarms/pgadmin4 Docker version, to execute ad-hoc queries and perform most other database tasks. pgAdmin is the most popular open-source administration and development platform for PostgreSQL. While several popular Docker versions of pgAdmin only support Linux AMD64 architectures, the biarms/pgadmin4 Docker version supports ARM-based devices.

Dashboard view of TimescaleDB database from within pgAdmin UI

Executing a query against the TimescaleDB database using pgAdmin’s Query Tool

Data Visualization

For data visualization, we will use Grafana. Grafana allows you to query, visualize, alert on, and understand metrics no matter where they are stored. With Grafana, you can create, explore, and share dashboards, fostering a data-driven culture. Grafana allows you to define thresholds visually and get notified via Slack, PagerDuty, and more. Grafana supports dozens of data sources, including MySQL, PostgreSQL, Elasticsearch, InfluxDB, TimescaleDB, Graphite, Prometheus, Google BigQuery, GraphQL, and Oracle. Grafana is extensible through a large collection of plugins.

Example of Grafana dashboard showing the post’s IoT sensor data

Edge Deployment and Management Platform

Docker introduced the current industry standard for containers in 2013. Docker containers are a standardized unit of software that allows developers to isolate apps from their environment. We will use Docker to deploy the IoT edge analytics stack, referred to herein as the GTM Stack, composed of containerized versions of Eclipse Mosquitto, TimescaleDB, and Grafana, along with pgAdmin, to an ARM-based edge node. The acronym GTM comes from the three primary OSS projects composing the stack. The acronym also suggests Greenwich Mean Time, relating to the precise time-series nature of IoT data.

GTM IoT Edge Analytics Stack architecture (Image by author)

Running Docker Engine in swarm mode, we can use Docker to deploy the complete IoT edge analytics stack to the swarm, running on the edge node. The deploy command accepts a stack description in the form of a Docker Compose file, a YAML file used to configure the application’s services. With a single command, we can create and start all the services from the configuration file.

Source Code

All source code for this post is available on GitHub. Use the following command to git clone a local copy of the project.

git clone --branch main --single-branch --depth 1 --no-tags \
    https://github.com/garystafford/iot-analytics-at-the-edge.git

IoT Devices

For this post, I have deployed three Linux ARM-based IoT devices, each connected to a sensor array. Each sensor array contains multiple analog and digital sensors. The sensors record temperature, humidity, air quality (liquefied petroleum gas (LPG), carbon monoxide (CO), and smoke), light, and motion. For more information on the IoT device and sensor hardware involved, please see my previous post: Getting Started with IoT Analytics on AWS.

Each ARM-based IoT device is running a small Python3-based script, sensor_data_to_mosquitto.py, shown below.

import argparse
import json
import logging
import sys
import time
from datetime import datetime

import paho.mqtt.client as client
import paho.mqtt.publish as publish
from Sensors import Sensors
from getmac import get_mac_address
from pytz import timezone

# Author: Gary A. Stafford
# Date: 10/11/2020
# Usage: python3 sensor_data_to_mosquitto.py \
#   --host "192.168.1.12" --port 1883 \
#   --topic "sensor/output" --frequency 10

sensors = Sensors()

logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)


def main():
    args = parse_args()
    publish_message_to_db(args)


def get_readings():
    sensors.led_state(0)

    # Retrieve sensor readings
    payload_dht = sensors.get_sensor_data_dht()
    payload_gas = sensors.get_sensor_data_gas()
    payload_light = sensors.get_sensor_data_light()
    payload_motion = sensors.get_sensor_data_motion()

    message = {
        "device_id": get_mac_address(),
        "time": datetime.now(timezone("UTC")),
        "data": {
            "temperature": payload_dht["temperature"],
            "humidity": payload_dht["humidity"],
            "lpg": payload_gas["lpg"],
            "co": payload_gas["co"],
            "smoke": payload_gas["smoke"],
            "light": payload_light["light"],
            "motion": payload_motion["motion"]
        }
    }

    return message


def date_converter(o):
    if isinstance(o, datetime):
        return o.__str__()


def publish_message_to_db(args):
    while True:
        message = get_readings()
        message_json = json.dumps(message, default=date_converter, sort_keys=True,
                                  indent=None, separators=(',', ':'))
        logger.debug(message_json)

        try:
            publish.single(args.topic, payload=message_json, qos=0, retain=False,
                           hostname=args.host, port=args.port, client_id="",
                           keepalive=60, will=None, auth=None, tls=None,
                           protocol=client.MQTTv311, transport="tcp")
        except Exception as error:
            logger.error("Exception: {}".format(error))
        finally:
            time.sleep(args.frequency)


# Read in command-line parameters
def parse_args():
    parser = argparse.ArgumentParser(description='Script arguments')
    parser.add_argument('--host', help='Mosquitto host', default='localhost')
    parser.add_argument('--port', help='Mosquitto port', type=int, default=1883)
    parser.add_argument('--topic', help='Mosquitto topic', default='paho/test')
    parser.add_argument('--frequency', help='Message frequency in seconds', type=int, default=5)

    return parser.parse_args()


if __name__ == "__main__":
    main()

Each IoT device's script uses the Eclipse Paho MQTT Python client library. An MQTT message containing simultaneous readings from each sensor is sent to a Mosquitto topic on the edge node, at a configurable frequency.

message = {
    "device_id": get_mac_address(),
    "time": datetime.now(timezone("UTC")),
    "data": {
        "temperature": payload_dht["temperature"],
        "humidity": payload_dht["humidity"],
        "lpg": payload_gas["lpg"],
        "co": payload_gas["co"],
        "smoke": payload_gas["smoke"],
        "light": payload_light["light"],
        "motion": payload_motion["motion"]
    }
}

IoT Edge Node

For this post, I have deployed a single Linux ARM-based edge node. The three IoT devices, containing sensor arrays, communicate with the edge node over Wi-Fi. The IoT devices could easily use an alternative communication protocol, such as BLE, LoRaWAN, or Ethernet. For more information on BLE and LoRaWAN, please see some of my previous posts: LoRa and LoRaWAN for IoT: Getting Started with LoRa and LoRaWAN Protocols for Low Power, Wide Area Networking of IoT and BLE and GATT for IoT: Getting Started with Bluetooth Low Energy (BLE) and Generic Attribute Profile (GATT) Specification for IoT.

The edge node is also running a small Python3-based script, mosquitto_to_timescaledb.py, shown below.

import argparse
import json
import logging
import sys
from datetime import datetime

import paho.mqtt.client as mqtt
import psycopg2

# Author: Gary A. Stafford
# Date: 10/11/2020
# Usage: python3 mosquitto_to_timescaledb.py \
#   --msqt_topic "sensor/output" --msqt_host "192.168.1.12" --msqt_port 1883 \
#   --ts_host "192.168.1.12" --ts_port 5432 \
#   --ts_username postgres --ts_password postgres1234 --ts_database demo_iot

logger = logging.getLogger(__name__)
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)

args = argparse.Namespace
ts_connection: str = ""


def main():
    global args
    args = parse_args()

    global ts_connection
    ts_connection = "postgres://{}:{}@{}:{}/{}".format(args.ts_username, args.ts_password, args.ts_host,
                                                       args.ts_port, args.ts_database)
    logger.debug("TimescaleDB connection: {}".format(ts_connection))

    client = mqtt.Client()
    client.on_connect = on_connect
    client.on_message = on_message

    client.connect(args.msqt_host, args.msqt_port, 60)

    # Blocking call that processes network traffic, dispatches callbacks and
    # handles reconnecting.
    # Other loop*() functions are available that give a threaded interface and a
    # manual interface.
    client.loop_forever()


# The callback for when the client receives a CONNACK response from the server.
def on_connect(client, userdata, flags, rc):
    logger.debug("Connected with result code {}".format(str(rc)))

    # Subscribing in on_connect() means that if we lose the connection and
    # reconnect then subscriptions will be renewed.
    client.subscribe(args.msqt_topic)


# The callback for when a PUBLISH message is received from the server.
def on_message(client, userdata, msg):
    logger.debug("Topic: {}, Message Payload: {}".format(msg.topic, str(msg.payload)))
    publish_message_to_db(msg)


def date_converter(o):
    if isinstance(o, datetime):
        return o.__str__()


def publish_message_to_db(message):
    message_payload = json.loads(message.payload)
    # logger.debug("message.payload: {}".format(json.dumps(message_payload, default=date_converter)))

    sql = """INSERT INTO sensor_data(time, device_id, temperature, humidity, lpg, co, smoke, light, motion)
             VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s);"""

    data = (
        message_payload["time"], message_payload["device_id"], message_payload["data"]["temperature"],
        message_payload["data"]["humidity"], message_payload["data"]["lpg"], message_payload["data"]["co"],
        message_payload["data"]["smoke"], message_payload["data"]["light"], message_payload["data"]["motion"])

    try:
        with psycopg2.connect(ts_connection, connect_timeout=3) as conn:
            with conn.cursor() as curs:
                try:
                    curs.execute(sql, data)
                except psycopg2.Error as error:
                    logger.error("Exception: {}".format(error.pgerror))
                except Exception as error:
                    logger.error("Exception: {}".format(error))
    except psycopg2.OperationalError as error:
        logger.error("Exception: {}".format(error.pgerror))
    finally:
        conn.close()


# Read in command-line parameters
def parse_args():
    parser = argparse.ArgumentParser(description='Script arguments')
    parser.add_argument('--msqt_topic', help='Mosquitto topic', default='paho/test')
    parser.add_argument('--msqt_host', help='Mosquitto host', default='localhost')
    parser.add_argument('--msqt_port', help='Mosquitto port', type=int, default=1883)
    parser.add_argument('--ts_host', help='TimescaleDB host', default='localhost')
    parser.add_argument('--ts_port', help='TimescaleDB port', type=int, default=5432)
    parser.add_argument('--ts_username', help='TimescaleDB username', default='postgres')
    parser.add_argument('--ts_password', help='TimescaleDB password', default='postgres1234')
    parser.add_argument('--ts_database', help='TimescaleDB database', default='demo_iot')

    return parser.parse_args()


if __name__ == "__main__":
    main()

Similar to the IoT devices, the edge node's script uses the Eclipse Paho MQTT Python client library. The script pulls MQTT messages off a Mosquitto topic(s), deserializes the JSON message payload, and writes the payload's data to the TimescaleDB database. The edge node's script accepts several arguments, which allow you to configure the necessary Mosquitto and TimescaleDB variables.

Why not use Telegraf?

Telegraf is a plugin-driven agent that collects, processes, aggregates, and writes metrics. There is a Telegraf output plugin, the PostgreSQL and TimescaleDB Output Plugin for Telegraf, produced by TimescaleDB. The plugin could replace the need to manage and maintain the above script. However, I chose not to use it because it is not yet an official Telegraf plugin. If the plugin were included in a Telegraf release, I would certainly encourage its use.

Script Management

Both the Linux-based IoT devices and the edge node run the systemd system and service manager. To ensure the Python scripts keep running in the case of a system restart, we define a systemd unit for each. Units are the objects that systemd knows how to manage; they are basically a standardized representation of system resources that can be managed by the suite of daemons and manipulated by the provided utilities. Each script has its own systemd unit file. Below, we see the gtm_stack_mosquitto unit file, gtm_stack_mosquitto.service.

[Unit]
Description=GTM Stack – Mosquitto Script
After=network.target
[Service]
ExecStart=/usr/bin/python3 -u sensor_data_to_mosquitto.py \
    --host ${MOSQUITTO_HOST} --port ${MOSQUITTO_PORT} --topic ${MOSQUITTO_TOPIC}
WorkingDirectory=/home/pi/iot-analytics-at-the-edge
StandardOutput=inherit
StandardError=inherit
Restart=always
User=pi
[Install]
WantedBy=multi-user.target

The gtm_stack_mosq_to_tmscl unit file, gtm_stack_mosq_to_tmscl.service, is nearly identical.

To install the gtm_stack_mosquitto.service systemd unit file on each IoT device, use the following commands.

SERVICE=gtm_stack_mosquitto
sudo cp ${SERVICE}.service /etc/systemd/system/
sudo systemctl start ${SERVICE}.service
sudo systemctl enable ${SERVICE}.service
# check status
systemctl status ${SERVICE}

Installing the gtm_stack_mosq_to_tmscl.service unit file on the edge node is nearly identical.

Docker Stack

The edge node runs the GTM Docker stack, stack.yml, in a swarm. As discussed earlier, the stack contains four containers: Eclipse Mosquitto, TimescaleDB, Grafana, and pgAdmin. The Mosquitto, TimescaleDB, and Grafana containers have paths within the containers bind-mounted to directories on the edge device. With bind-mounting, the containers' data will persist if the containers are removed and re-created. The containers run on their own isolated overlay network.

version: "3.8"
services:
timescaledb:
image: timescale/timescaledb:1.7.4-pg12
ports:
"5432:5432/tcp"
networks:
demo-iot-net
environment:
POSTGRES_USERNAME: postgres
POSTGRES_PASSWORD: postgres1234
POSTGRES_DB: demo_iot
deploy:
restart_policy:
condition: on-failure
volumes:
$HOME/data/postgres:/var/lib/postgresql/data
grafana:
image: grafana/grafana:7.1.5
ports:
"3000:3000/tcp"
networks:
demo-iot-net
deploy:
restart_policy:
condition: on-failure
volumes:
$HOME/data/grafana:/var/lib/grafana
user: $ID
mosquitto:
image: eclipse-mosquitto:1.6.12
ports:
"1883:1883/tcp"
# – "9001:9001/tcp"
networks:
demo-iot-net
deploy:
restart_policy:
condition: on-failure
volumes:
$HOME/data/mosquitto:/mosquitto
pgadmin:
image: biarms/pgadmin4:4.21
ports:
"5050:5050/tcp"
networks:
demo-iot-net
deploy:
restart_policy:
condition: on-failure
networks:
demo-iot-net:

The GTM Docker stack is installed using the following commands on the edge node. We will assume Docker and git are pre-installed on the edge node for this post.

# on edge node
git clone https://github.com/garystafford/iot-analytics-at-the-edge.git
# build directories
mkdir -p ~/data/postgres
mkdir -p ~/data/grafana
mkdir -p ~/data/mosquitto/config
# move mosquitto config
cd iot-analytics-at-the-edge
cp mosquitto.conf ~/data/mosquitto/config/
# deploy stack
docker swarm init
docker stack deploy -c stack.yml iot
# check status of stack
docker stack ps iot --no-trunc
docker stack services iot

First, we create the proper local directories on the edge device, which will be used to bind-mount to the container’s directories. Below, we see the bind-mounted local directories with the eventual container’s contents stored within them.

The bind-mounted local directories on the edge device from the stack

Next, we copy the custom Mosquitto configuration file, mosquitto.conf, included in the project, to the correct location on the edge device. Lastly, we initialize the Docker swarm and deploy the stack.

Output of docker container ls command, showing the running GTM Stack containers
Output of docker stats command, showing the resource consumption of GTM Stack containers

TimescaleDB Setup

With the GTM stack running, we need to create a single TimescaleDB hypertable, sensor_data, in the TimescaleDB demo_iot database, to hold the incoming IoT sensor data. Hypertables, according to TimescaleDB, are designed to be easy to manage and to behave like standard PostgreSQL tables. Hypertables are comprised of many interlinked “chunk” tables. Commands made to the hypertable automatically propagate changes down to all of the chunks belonging to that hypertable.

CREATE TABLE IF NOT EXISTS sensor_data (
time timestamptz NOT NULL,
device_id text NOT NULL,
temperature double PRECISION NOT NULL,
humidity double PRECISION NOT NULL,
lpg double PRECISION NOT NULL,
co double PRECISION NOT NULL,
smoke double PRECISION NOT NULL,
light boolean NOT NULL,
motion boolean NOT NULL
);
SELECT create_hypertable('sensor_data', 'time');

I suggest using psql to execute the required DDL statements, which will create the hypertable, as well as the views and database user permissions that follow. All SQL statements are included in the project's statements.sql file. One way to use psql is to install it on your local workstation, then use psql to connect to the remote edge node. I prefer to instantiate a local PostgreSQL Docker container instance running psql. I then use the local container's psql client to connect to the edge node's TimescaleDB database. For example, from my local machine, I run the following docker run command to connect to the edge node's TimescaleDB database, located at 192.168.1.12.

docker run -it --rm postgres psql \
  -U postgres -h 192.168.1.12 -p 5432 -d demo_iot

Although not always practical, we could also access psql from within the TimescaleDB Docker container, running on the actual edge node, using the following docker exec command.

TIMESCALEDB_CONTAINER=$(docker ps -q \
  --filter='name=iot_timescaledb.1' --format '{{.Names}}')
docker exec -it ${TIMESCALEDB_CONTAINER} psql \
-U postgres -h localhost -d demo_iot

TimescaleDB Continuous Aggregates

For this post’s demonstration, we need to create four TimescaleDB database views, which will be queried from an eventual Grafana Dashboard. The views are TimescaleDB Continuous Aggregates. According to Timescale, aggregate queries that touch large swathes of time-series data can take a long time to compute because the system needs to scan large amounts of data on every query execution. TimescaleDB continuous aggregates automatically calculate the results of a query in the background and materialize the results.

For example, in this post, we generate sensor data every five seconds from the three IoT devices. When visualizing a 24-hour period in Grafana, using continuous aggregates with an interval of one minute, we would reduce the total volume of data queried from ~51,840 rows to ~4,320 rows, a reduction of over 91%. The larger the time period or the number of IoT devices being analyzed, the more significant these savings will positively impact query performance.
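
A quick back-of-the-envelope check of those numbers, assuming exactly three devices reporting every five seconds, aggregated into one-minute buckets over a 24-hour period:

# Sanity-check the row-count reduction described above (assumptions: 3 devices,
# one reading every 5 seconds, 1-minute continuous aggregate buckets, 24 hours)
devices = 3
raw_rows = devices * (24 * 60 * 60) // 5      # 51,840 raw rows per day
aggregated_rows = devices * (24 * 60)         # 4,320 aggregated rows per day
reduction = 1 - (aggregated_rows / raw_rows)
print(f"{raw_rows} -> {aggregated_rows} rows ({reduction:.1%} reduction)")
# 51840 -> 4320 rows (91.7% reduction)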

time_bucket on the time partitioning column of the hypertable is required in all continuous aggregate views. The time_bucket function, in this case, has a bucket width (interval) of 1 minute. The interval is configurable.

-- temperature and humidity
CREATE VIEW temperature_humidity_summary_minute WITH (timescaledb.continuous)
AS
SELECT device_id,
time_bucket(INTERVAL '1 minute', time) AS bucket,
AVG(temperature) AS avg_temp,
AVG(humidity) AS avg_humidity
FROM sensor_data
GROUP BY device_id,
bucket;
-- air quality (lpg, co, smoke)
CREATE VIEW air_quality_summary_minute WITH (timescaledb.continuous)
AS
SELECT device_id,
time_bucket(INTERVAL '1 minute', time) AS bucket,
AVG(lpg) AS avg_lpg,
AVG(co) AS avg_co,
AVG(smoke) AS avg_smoke
FROM sensor_data
GROUP BY device_id,
bucket;
-- light
CREATE VIEW light_summary_minute WITH (timescaledb.continuous)
AS
SELECT device_id,
time_bucket(INTERVAL '1 minute', time) AS bucket,
AVG(
case
when light = 't' then 1
else 0
end
) AS avg_light
FROM sensor_data
GROUP BY device_id,
bucket;
-- motion
CREATE VIEW motion_summary_minute WITH (timescaledb.continuous)
AS
SELECT device_id,
time_bucket(INTERVAL '1 minute', time) AS bucket,
AVG(
case
when motion = 't' then 1
else 0
end
) AS avg_motion
FROM sensor_data
GROUP BY device_id,
bucket;

Limiting Grafana’s Access to IoT Data

Following the Grafana recommendation for database user permissions, we create a grafanareader PostgreSQL user and limit the user's access to the sensor_data table and the four views we created. Grafana will use this user's credentials to perform SELECT queries of the TimescaleDB demo_iot database.

CREATE USER grafanareader WITH PASSWORD 'grafana1234';
GRANT USAGE ON SCHEMA public TO grafanareader;
GRANT SELECT ON public.sensor_data TO grafanareader;
GRANT SELECT ON public.temperature_humidity_summary_minute TO grafanareader;
GRANT SELECT ON public.air_quality_summary_minute TO grafanareader;
GRANT SELECT ON public.light_summary_minute TO grafanareader;
GRANT SELECT ON public.motion_summary_minute TO grafanareader;

Grafana Dashboards

Using the TimescaleDB continuous aggregates we have created, we can quickly build a richly-featured dashboard in Grafana. Below we see a typical IoT Dashboard you might build to monitor the post’s IoT sensor data in near real-time. An exported version, dashboard_external_export.json, is included in the GitHub project.

Example of Grafana dashboard showing the post’s IoT sensor data
Example of Grafana IoT Demo Dashboard showing sensor data

Grafana’s documentation includes a comprehensive set of instructions for Using PostgreSQL in Grafana. To connect to the TimescaleDB database from Grafana, we use a PostgreSQL data source.

Configuring the TimescaleDB database connection in Grafana

The data displayed in each Panel in the Grafana Dashboard is based on a SQL query. For example, the Average Temperature Panel might use a query similar to the example below. This particular query also converts Celsius to Fahrenheit. Note the use of Grafana macros (e.g., $__time() and $__timeFilter()). Macros can be used within a query to simplify syntax and allow for dynamic parts.

SELECT
$__time(bucket),
device_id AS metric,
((avg_temp * 1.8) + 32) AS avg_temp
FROM temperature_humidity_summary_minute
WHERE
$__timeFilter(bucket)
ORDER BY 1,2

Below, we see another example from the Average Humidity Panel. In this particular query, we validate that the humidity data is within an acceptable range of 0%–100%.

SELECT
$__time(bucket),
device_id AS metric,
avg_humidity
FROM temperature_humidity_summary_minute
WHERE
$__timeFilter(bucket)
AND avg_humidity >= 0.0
AND avg_humidity <= 100.0
ORDER BY 1,2

Mobile Friendly

Grafana dashboards are mobile-friendly. Below we see two views of the dashboard, using the Chrome mobile browser on an Apple iPhone.


Grafana Alerts

Grafana allows Alerts to be created based on Rules you define in each Panel. If data values match the Rule's conditions, which you pre-define, such as a temperature reading above a certain threshold for a set amount of time, an alert is sent to your choice of destinations. According to the Rule shown below, if the average temperature exceeds 75°F for a period of 5 minutes, an alert is sent.

High-temperature rule configuration

As demonstrated below, when the temperature in the laboratory began to exceed 75°F, the alert entered a ‘Pending’ state. If the temperature exceeded 75°F for the pre-determined period of 5 minutes, the alert status changed to ‘Alerting’, and an alert was sent. When the temperature dropped back below 75°F for the pre-determined period of 5 minutes, the alert status changed from ‘Alerting’ to ‘OK’, and a subsequent notification was sent.

Average temperature graph showing the various alert status changes

There are currently 18 destinations available out-of-the-box with Grafana, including Slack, email, PagerDuty, webhooks, HipChat, and Microsoft Teams. We can use Grafana Alerts to notify the proper resources, in near real-time, if an issue is detected, based on the data. Below, we see an actual series of high-temperature alerts sent by Grafana to a Slack channel, followed by subsequent notifications as the temperature returned to normal.

Grafana alert notifications in Slack channel

Ad-hoc Queries

The ability to perform ad-hoc queries on the time-series IoT data is an essential feature of the IoT edge analytics stack. We can use psql or pgAdmin to perform ad-hoc queries against the TimescaleDB database. Below are examples of typical ad-hoc queries we might perform on the IoT sensor data. These example queries demonstrate TimescaleDB’s advanced analytical capabilities for working with time-series data, including Moving Average, Delta, Time Bucket, and Histogram.

-- find max temperature (°C) and humidity (%) for last 3 hours in 15 minute time periods
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#select
SELECT time_bucket('15 minutes', time) AS fifteen_min,
       device_id,
       COUNT(*),
       MAX(temperature) AS max_temp,
       MAX(humidity) AS max_hum
FROM sensor_data
WHERE time > NOW() - INTERVAL '3 hours'
  AND humidity BETWEEN 0 AND 100
GROUP BY fifteen_min, device_id
ORDER BY fifteen_min DESC, max_temp DESC;

-- find temperature (°C) anomalies (delta > ~5°F)
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#delta
SELECT ht.time, ht.temperature, ht.delta
FROM (
    SELECT time,
           temperature,
           abs(temperature - LAG(temperature) OVER (ORDER BY time)) AS delta
    FROM sensor_data) AS ht
WHERE ht.delta > 2.63
ORDER BY ht.time;

-- find three minute moving average of temperature (°F) for last day
-- (5 sec. interval * 36 rows = 3 min.)
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#moving-average
SELECT time,
       AVG((temperature * 1.8) + 32) OVER (ORDER BY time
           ROWS BETWEEN 35 PRECEDING AND CURRENT ROW)
           AS smooth_temp
FROM sensor_data
WHERE device_id = 'Manufacturing Plant'
  AND time > NOW() - INTERVAL '1 day'
ORDER BY time DESC;

-- find average humidity (%) for last 12 hours in 5-minute time periods
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#time-bucket
SELECT time_bucket('5 minutes', time) AS time_period,
       AVG(humidity) AS avg_humidity
FROM sensor_data
WHERE device_id = 'Main Warehouse'
  AND humidity BETWEEN 0 AND 100
  AND time > NOW() - INTERVAL '12 hours'
GROUP BY time_period
ORDER BY time_period DESC;

-- calculate histograms of avg. temperature (°F) between 55-85°F in 5°F buckets during last 2 days
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#histogram
SELECT device_id,
       COUNT(*),
       histogram((temperature * 1.8) + 32, 55.0, 85.0, 5)
FROM sensor_data
WHERE temperature IS NOT NULL
  AND time > NOW() - INTERVAL '2 days'
GROUP BY device_id;

-- find average light value for last 90 minutes in 5-minute time periods
-- https://docs.timescale.com/latest/using-timescaledb/reading-data#time-bucket
SELECT device_id,
       time_bucket('5 minutes', time) AS five_min,
       AVG(case when light = 't' then 1 else 0 end) AS avg_light
FROM sensor_data
WHERE device_id = 'Manufacturing Plant'
  AND time > NOW() - INTERVAL '90 minutes'
GROUP BY device_id, five_min
ORDER BY five_min DESC;

Conclusion

In this post, we have explored the development of an IoT edge analytics stack comprised of lightweight, purpose-built, easily deployable and manageable, platform- and programming language-agnostic, open-source software components. These components included Docker containerized versions of Eclipse Mosquitto, TimescaleDB, Grafana, and pgAdmin, referred to as the GTM Stack. Using the GTM stack, we collected, analyzed, and visualized IoT data without first shipping the data to the Cloud or other external systems.


This blog represents my own viewpoints and not of my employer, Amazon Web Services. All product names, logos, and brands are the property of their respective owners.



Collecting and Analyzing IoT Data in Near Real-Time with AWS IoT, LoRa, and LoRaWAN

Introduction

In a recent post published on ITNEXT, LoRa and LoRaWAN for IoT: Getting Started with LoRa and LoRaWAN Protocols for Low Power, Wide Area Networking of IoT, we explored the use of the LoRa (Long Range) and LoRaWAN protocols to transmit and receive sensor data, over a substantial distance, between an IoT device, containing several embedded sensors, and an IoT gateway. In this post, we will extend that architecture to the Cloud, using AWS IoT, a broad and deep set of IoT services, from the edge to the Cloud. We will securely collect, transmit, and analyze IoT data using the AWS cloud platform.

LoRa and LoRaWAN

According to the LoRa Alliance, Low-Power, Wide-Area Networks (LPWAN) are projected to support a major portion of the billions of devices forecasted for the Internet of Things (IoT). LoRaWAN is designed from the bottom up to optimize LPWANs for battery lifetime, capacity, range, and cost. LoRa and LoRaWAN permit long-range connectivity for IoT devices in different types of industries. According to Wikipedia, LoRaWAN defines the communication protocol and system architecture for the network, while the LoRa physical layer enables the long-range communication link.

AWS IoT

AWS describes AWS IoT as a set of managed services that enable ‘internet-connected devices to connect to the AWS Cloud and lets applications in the cloud interact with internet-connected devices.’ AWS IoT services span three categories: Device Software, Connectivity and Control, and Analytics.

In this post, we will focus on three AWS IoT services, one from each category: AWS IoT Device SDKs, AWS IoT Core, and AWS IoT Analytics. According to AWS, the AWS IoT Device SDKs include open-source libraries and developer and porting guides with samples to help you build innovative IoT products or solutions on your choice of hardware platforms. AWS IoT Core is a managed cloud service that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT Core can process and route messages to AWS endpoints and other devices reliably and securely. Finally, AWS IoT Analytics is a fully-managed IoT analytics service, designed specifically for IoT, which collects, pre-processes, enriches, stores, and analyzes IoT device data at scale.

To learn more about AWS IoT, specifically the AWS IoT services we will be exploring within this post, I recommend reading my recent post published on Towards Data Science, Getting Started with IoT Analytics on AWS.

Hardware Selection

In this post, we will use the following hardware.

IoT Device with Embedded Sensors

An Arduino single-board microcontroller will serve as our IoT device. The 3.3V AI-enabled Arduino Nano 33 BLE Sense board (Amazon: USD 36.00), released in August 2019, comes with the powerful nRF52840 processor from Nordic Semiconductors, a 32-bit ARM Cortex-M4 CPU running at 64 MHz, 1MB of CPU Flash Memory, 256KB of SRAM, and a NINA-B306 stand-alone Bluetooth 5 low energy (BLE) module.

The Sense contains an impressive array of embedded sensors:

  • 9-axis Inertial Sensor (LSM9DS1): 3D digital linear acceleration sensor, a 3D digital
    angular rate sensor, and a 3D digital magnetic sensor
  • Humidity and Temperature Sensor (HTS221): Capacitive digital sensor for relative humidity and temperature
  • Barometric Sensor (LPS22HB): MEMS nano pressure sensor: 260–1260 hectopascal (hPa) absolute digital output barometer
  • Microphone (MP34DT05): MEMS audio sensor omnidirectional digital microphone
  • Gesture, Proximity, Light Color, and Light Intensity Sensor (APDS9960): Advanced Gesture detection, Proximity detection, Digital Ambient Light Sense (ALS), and Color Sense (RGBC).

The Arduino Sense is an excellent, low-cost single-board microcontroller for learning about the collection and transmission of IoT sensor data.

IoT Gateway

An IoT Gateway, according to TechTarget, is a physical device or software program that serves as the connection point between the Cloud and controllers, sensors, and intelligent devices. All data moving to the Cloud, or vice versa, goes through the gateway, which can be either a dedicated hardware appliance or software program.

LoRa Gateways, to paraphrase The Things Network, form the bridge between devices and the Cloud. Devices use low power networks like LoRaWAN to connect to the Gateway, while the Gateway uses high bandwidth networks like WiFi, Ethernet, or Cellular to connect to the Cloud.

A third-generation Raspberry Pi 3 Model B+ single-board computer (SBC) will serve as our LoRa IoT Gateway. This Raspberry Pi model features a 1.4GHz Cortex-A53 (ARMv8) 64-bit quad-core processor System on a Chip (SoC), 1GB LPDDR2 SDRAM, dual-band wireless LAN, Bluetooth 4.2 BLE, and Gigabit Ethernet (Amazon: USD 42.99).

LoRa Transceiver Modules

To transmit the IoT sensor data between the IoT device, containing the embedded sensors, and the IoT gateway, I have used the REYAX RYLR896 LoRa transceiver module (Amazon: USD 19.50 x 2). The transceiver modules are commonly referred to as a universal asynchronous receiver-transmitter (UART). A UART is a computer hardware device for asynchronous serial communication in which the data format and transmission speeds are configurable.

According to the manufacturer, REYAX, the RYLR896 contains the Semtech SX1276 long-range, low power transceiver. The RYLR896 module provides ultra-long range spread spectrum communication and high interference immunity while minimizing current consumption. Each RYLR896 module contains a small, PCB integrated, helical antenna. This transceiver operates at both the 868 and 915 MHz frequency ranges. In this demonstration, we will be transmitting at 915 MHz for North America.

The Arduino Sense (IoT device) transmits data, using one of the RYLR896 modules (shown below front). The Raspberry Pi (IoT Gateway), connected to the other RYLR896 module (shown below rear), receives the data.

LoRaWAN Security

The RYLR896 is capable of AES 128-bit data encryption. Using the Advanced Encryption Standard (AES), we will encrypt the data sent from the IoT device to the IoT gateway, using a 32 hex digit password (128 bits / 4 bits/hex digit).
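
For reference, a 32-hex-digit key is exactly 16 bytes, or 128 bits. A minimal sketch of generating such a password in Python (my own illustration, not part of the post's code) might look like this:

import secrets

# 16 random bytes -> 32 hexadecimal digits -> a 128-bit AES password
password = secrets.token_hex(16).upper()
print(password, len(password))  # prints a 32-character hex string and 32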

Provisioning AWS Resources

To start, we will create the necessary AWS IoT and associated resources on the AWS cloud platform. Once these resources are in place, we can then proceed to configure the IoT device and IoT gateway to securely transmit the sensor data to the Cloud.

All the source code for this post is on GitHub. Use the following command to git clone a local copy of the project.

git clone \
  --branch master --single-branch --depth 1 --no-tags \
  https://github.com/garystafford/aws-iot-analytics-demo.git

AWS CloudFormation

The CloudFormation template, iot-analytics.yaml, will create an AWS IoT CloudFormation stack containing the following resources.

  • AWS IoT Thing
  • AWS IoT Thing Policy
  • AWS IoT Core Topic Rule
  • AWS IoT Analytics Channel, Pipeline, Data store, and Data set
  • AWS Lambda and Lambda Permission
  • Amazon S3 Bucket
  • Amazon SageMaker Notebook Instance
  • AWS IAM Roles

Please be aware of the costs involved with the AWS resources used in the CloudFormation template before continuing. To create the AWS CloudFormation stack from the included CloudFormation template, execute the following AWS CLI command.

aws cloudformation create-stack \
  --stack-name lora-iot-demo \
  --template-body file://cloudformation/iot-analytics.yaml \
  --parameters ParameterKey=ProjectName,ParameterValue=lora-iot-demo \
    ParameterKey=IoTTopicName,ParameterValue=lora-iot-demo \
  --capabilities CAPABILITY_NAMED_IAM

The resulting CloudFormation stack should contain 16 AWS resources.

Additional Resources

Unfortunately, AWS CloudFormation cannot create all the AWS IoT resources we require for this demonstration. To complete the AWS provisioning process, execute the following series of AWS CLI commands, aws_cli_commands.md. These commands will create the remaining resources, including an AWS IoT Thing Type, Thing Group, Thing Billing Group, and an X.509 Certificate.

# LoRaWAN / AWS IoT Demo
# Author: Gary Stafford
# Run AWS CLI commands after CloudFormation stack completes successfully
# variables
thingName=lora-iot-gateway-01
thingPolicy=LoRaDevicePolicy
thingType=LoRaIoTGateway
thingGroup=LoRaIoTGateways
thingBillingGroup=LoRaIoTGateways
mkdir ${thingName}
aws iot create-keys-and-certificate \
  --certificate-pem-outfile "${thingName}/${thingName}.cert.pem" \
  --public-key-outfile "${thingName}/${thingName}.public.key" \
  --private-key-outfile "${thingName}/${thingName}.private.key" \
  --set-as-active

# assuming you only have one certificate registered
certificate=$(aws iot list-certificates | jq '.[][] | .certificateArn')

## alternately, for a specific certificate if you have more than one
# aws iot list-certificates
## then change the value below
# certificate=arn:aws:iot:us-east-1:123456789012:cert/<certificate>

aws iot attach-policy \
  --policy-name $thingPolicy \
  --target $certificate

aws iot attach-thing-principal \
  --thing-name $thingName \
  --principal $certificate

aws iot create-thing-type \
  --thing-type-name $thingType \
  --thing-type-properties "thingTypeDescription=LoRaWAN IoT Gateway"

aws iot create-thing-group \
  --thing-group-name $thingGroup \
  --thing-group-properties "thingGroupDescription=\"LoRaWAN IoT Gateway Thing Group\", attributePayload={attributes={Manufacturer=RaspberryPiFoundation}}"

aws iot add-thing-to-thing-group \
  --thing-name $thingName \
  --thing-group-name $thingGroup

aws iot create-billing-group \
  --billing-group-name $thingBillingGroup \
  --billing-group-properties "billingGroupDescription=\"Gateway Billing Group\""

aws iot add-thing-to-billing-group \
  --thing-name $thingName \
  --billing-group-name $thingBillingGroup

aws iot update-thing \
  --thing-name $thingName \
  --thing-type-name $thingType \
  --attribute-payload "{\"attributes\": {\"GatewayMfr\":\"RaspberryPiFoundation\", \"LoRaMfr\":\"REYAX\", \"LoRaModel\":\"RYLR896\"}}"

aws iot describe-thing \
  --thing-name $thingName

IoT Device Configuration

With the AWS resources deployed, we can configure the IoT device and IoT Gateway.

Arduino Sketch

For those not familiar with Arduino, a sketch is the name that Arduino uses for a program. It is the unit of code that is uploaded into non-volatile flash memory and runs on an Arduino board. The Arduino language is a set of C and C++ functions. All standard C and C++ constructs supported by the avr-g++ compiler should work in Arduino.

For this post, the sketch, lora_iot_demo_aws.ino, contains the code necessary to collect and securely transmit the environmental sensor data, including temperature, relative humidity, barometric pressure, Red, Green, and Blue (RGB) color, and ambient light intensity, using the LoRaWAN protocol.

/*
Description: Transmits Arduino Nano 33 BLE Sense sensor telemetry over LoRaWAN,
including temperature, humidity, barometric pressure, and color,
using REYAX RYLR896 transceiver modules
http://reyax.com/wp-content/uploads/2020/01/Lora-AT-Command-RYLR40x_RYLR89x_EN.pdf
Author: Gary Stafford
*/
#include <Arduino_HTS221.h>
#include <Arduino_LPS22HB.h>
#include <Arduino_APDS9960.h>
const int UPDATE_FREQUENCY = 5000; // update frequency in ms
const float CALIBRATION_FACTOR = -4.0; // temperature calibration factor (Celsius)
const int ADDRESS = 116;
const int NETWORK_ID = 6;
const String PASSWORD = "92A0ECEC9000DA0DCF0CAAB0ABA2E0EF";
const String DELIMITER = "|";
String uid = "";
void setup()
{
Serial.begin(9600);
Serial1.begin(115200); // default baud rate of module is 115200
delay(1000); // wait for LoRa module to be ready
// get unique transceiver id to identify iot device on network
Serial1.print((String)"AT+UID?\r\n");
uid = Serial1.readString();
uid.replace("+UID=", ""); // trim off '+UID=' at start of line
uid.replace("\r\n", ""); // trim off CR/LF at end of line
// these values need to be the same for receiver and transmitter
Serial1.print((String)"AT+ADDRESS=" + ADDRESS + "\r\n");
delay(200);
Serial1.print((String)"AT+NETWORKID=" + NETWORK_ID + "\r\n");
delay(200);
Serial1.print("AT+CPIN=" + PASSWORD + "\r\n");
delay(200);
Serial1.print("AT+CPIN?\r\n"); // confirm password is set
if (!HTS.begin())
{ // initialize HTS221 sensor
Serial.println("Failed to initialize humidity temperature sensor!");
while (1);
}
if (!BARO.begin())
{ // initialize LPS22HB sensor
Serial.println("Failed to initialize pressure sensor!");
while (1);
}
// avoid bad readings to start bug
// https://forum.arduino.cc/index.php?topic=660360.0
BARO.readPressure();
delay(1000);
if (!APDS.begin())
{ // initialize APDS9960 sensor
Serial.println("Failed to initialize color sensor!");
while (1);
}
}
void loop()
{
updateReadings();
delay(UPDATE_FREQUENCY);
}
void updateReadings()
{
float temperature = getTemperature(CALIBRATION_FACTOR);
float humidity = getHumidity();
float pressure = getPressure();
int colors[4];
getColor(colors);
String payload = buildPayload(temperature, humidity, pressure, colors);
Serial.println("Payload: " + payload); // display the payload for debugging
Serial1.print(payload); // send the payload over LoRaWAN
displayResults(temperature, humidity, pressure, colors); // display the results for debugging
}
float getTemperature(float calibration)
{
return HTS.readTemperature() + calibration;
}
float getHumidity()
{
return HTS.readHumidity();
}
float getPressure()
{
return BARO.readPressure();
}
void getColor(int c[])
{
// check if a color reading is available
while (!APDS.colorAvailable())
{
delay(5);
}
int r, g, b, a;
APDS.readColor(r, g, b, a);
c[0] = r;
c[1] = g;
c[2] = b;
c[3] = a;
}
// display for debugging purposes
void displayResults(float t, float h, float p, int c[])
{
Serial.println((String)"UID: " + uid);
Serial.print("Temperature: ");
Serial.println(t);
Serial.print("Humidity: ");
Serial.println(h);
Serial.print("Pressure: ");
Serial.println(p);
Serial.print("Color (r, g, b, a): ");
Serial.print(c[0]);
Serial.print(", ");
Serial.print(c[1]);
Serial.print(", ");
Serial.print(c[2]);
Serial.print(", ");
Serial.println(c[3]);
Serial.println("———-");
}
String buildPayload(float t, float h, float p, int c[])
{
String readings = "";
readings += uid;
readings += DELIMITER;
readings += t;
readings += DELIMITER;
readings += h;
readings += DELIMITER;
readings += p;
readings += DELIMITER;
readings += c[0];
readings += DELIMITER;
readings += c[1];
readings += DELIMITER;
readings += c[2];
readings += DELIMITER;
readings += c[3];
String payload = "";
payload += "AT+SEND=";
payload += ADDRESS;
payload += ",";
payload += readings.length();
payload += ",";
payload += readings;
payload += "\r\n";
return payload;
}

AT Commands

Communication with the RYLR896's long-range modem is done using AT commands. AT commands are instructions used to control a modem. AT is the abbreviation of ATtention. Every command line starts with AT, which is why modem commands are called AT commands, according to Developer's Home. A complete list of AT commands can be downloaded as a PDF from the RYLR896 product page.

To efficiently transmit the environmental sensor data from the IoT sensor to the IoT gateway, the sketch concatenates the sensor ID and the sensor values together in a single string. The string will be incorporated into an AT command, sent to the RYLR896 LoRa transceiver module. To make it easier to parse the sensor data on the IoT gateway, we will delimit the sensor values with a pipe (|), as opposed to a comma. According to REYAX, the maximum length of the LoRa payload is approximately 330 bytes.

Below, we see an example of an AT command used to send the sensor data from the IoT sensor and the corresponding unencrypted data received by the IoT gateway. Both contain the LoRa transmitter Address ID, payload length (62 bytes in the example), and the payload. The data received by the IoT gateway also has the Received signal strength indicator (RSSI), and Signal-to-noise ratio (SNR).
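
To make the framing concrete, the short sketch below uses illustrative values only; the AT+SEND command follows the format built by the Arduino sketch's buildPayload() function, and the +RCV line mirrors the example parsed by the gateway script's parse_payload() function.

# Illustrative framing only; values are made up, format follows the post's code
readings = "0447383033363932003C0034|23.94|37.71|99.89|16|38|53|80"

# What the Arduino writes to its RYLR896 module (destination address 116)
at_send = "AT+SEND=116,{},{}\r\n".format(len(readings), readings)

# What the gateway's RYLR896 module emits: address, length, payload, RSSI, SNR
rcv_line = "+RCV=116,{},{},-61,56".format(len(readings), readings)

# Parsing mirrors parse_payload(): strip the metadata, then split on the pipe
values = rcv_line.split(",")[2].split("|")
print(at_send.strip())
print(values[0], values[1:])  # device UID, then the seven sensor readings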

Receiving Data on IoT Gateway

The Raspberry Pi will act as a LoRa IoT gateway, receiving the environmental sensor data from the IoT device, the Arduino, and sending the data to AWS. The Raspberry Pi runs a Python script, rasppi_lora_receiver_aws.py, which will receive the data from the Arduino Sense, decrypt the data, parse the sensor values, serialize the data to a JSON payload, and finally, transmit the payload in an MQTT-protocol message to AWS. The script uses pyserial, the Python Serial Port Extension, which encapsulates access to the serial port for communication with the RYLR896 module. The script uses the AWS IoT Device SDK for Python v2 to communicate with AWS.

import json
import logging
import sys
import threading
import time
from argparse import ArgumentParser
import serial
from awscrt import io, mqtt, auth, http, exceptions
from awsiot import mqtt_connection_builder
# LoRaWAN IoT Sensor Demo
# Using REYAX RYLR896 transceiver modules
# http://reyax.com/wp-content/uploads/2020/01/Lora-AT-Command-RYLR40x_RYLR89x_EN.pdf
# Author: Gary Stafford
# Requirements: python3 -m pip install --user -r requirements.txt
# Usage:
# sh ./rasppi_lora_receiver_aws.sh \
# a1d0wxnxn1hs7m-ats.iot.us-east-1.amazonaws.com
# constants
ADDRESS = 116
NETWORK_ID = 6
PASSWORD = "92A0ECEC9000DA0DCF0CAAB0ABA2E0EF"
# global variables
count = 0 # from args
received_count = 0
received_all_event = threading.Event()
def main():
# get args
logging.basicConfig(filename='output.log',
filemode='w', level=logging.DEBUG)
args = get_args() # get args
payload = ""
lora_payload = {}
# set log level
io.init_logging(getattr(io.LogLevel, args.verbosity), 'stderr')
# spin up resources
event_loop_group = io.EventLoopGroup(1)
host_resolver = io.DefaultHostResolver(event_loop_group)
client_bootstrap = io.ClientBootstrap(event_loop_group, host_resolver)
# set MQTT connection
mqtt_connection = set_mqtt_connection(args, client_bootstrap)
logging.debug("Connecting to {} with client ID '{}'…".format(
args.endpoint, args.client_id))
connect_future = mqtt_connection.connect()
# future.result() waits until a result is available
connect_future.result()
logging.debug("Connecting to REYAX RYLR896 transceiver module…")
serial_conn = serial.Serial(
port=args.tty,
baudrate=int(args.baud_rate),
timeout=5,
parity=serial.PARITY_NONE,
stopbits=serial.STOPBITS_ONE,
bytesize=serial.EIGHTBITS
)
if serial_conn.isOpen():
logging.debug("Connected!")
set_lora_config(serial_conn)
check_lora_config(serial_conn)
while True:
# read data from serial port
serial_payload = serial_conn.readline()
logging.debug(serial_payload)
if len(serial_payload) >= 1:
payload = serial_payload.decode(encoding="utf-8")
payload = payload[:-2]
try:
data = parse_payload(payload)
lora_payload = {
"ts": time.time(),
"data": {
"device_id": str(data[0]),
"gateway_id": str(args.gateway_id),
"temperature": float(data[1]),
"humidity": float(data[2]),
"pressure": float(data[3]),
"color": {
"red": float(data[4]),
"green": float(data[5]),
"blue": float(data[6]),
"ambient": float(data[7])
}
}
}
logging.debug(lora_payload)
except IndexError:
logging.error("IndexError: {}".format(payload))
except ValueError:
logging.error("ValueError: {}".format(payload))
# publish mqtt message
message_json = json.dumps(
lora_payload,
sort_keys=True,
indent=None,
separators=(',', ':'))
try:
mqtt_connection.publish(
topic=args.topic,
payload=message_json,
qos=mqtt.QoS.AT_LEAST_ONCE)
except mqtt.SubscribeError as err:
logging.error(".SubscribeError: {}".format(err))
except exceptions.AwsCrtError as err:
logging.error("AwsCrtError: {}".format(err))
def set_mqtt_connection(args, client_bootstrap):
if args.use_websocket:
proxy_options = None
if args.proxy_host:
proxy_options = http.HttpProxyOptions(
host_name=args.proxy_host, port=args.proxy_port)
credentials_provider = auth.AwsCredentialsProvider.new_default_chain(
client_bootstrap)
mqtt_connection = mqtt_connection_builder.websockets_with_default_aws_signing(
endpoint=args.endpoint,
client_bootstrap=client_bootstrap,
region=args.signing_region,
credentials_provider=credentials_provider,
websocket_proxy_options=proxy_options,
ca_filepath=args.root_ca,
on_connection_interrupted=on_connection_interrupted,
on_connection_resumed=on_connection_resumed,
client_id=args.client_id,
clean_session=False,
keep_alive_secs=6)
else:
mqtt_connection = mqtt_connection_builder.mtls_from_path(
endpoint=args.endpoint,
cert_filepath=args.cert,
pri_key_filepath=args.key,
client_bootstrap=client_bootstrap,
ca_filepath=args.root_ca,
on_connection_interrupted=on_connection_interrupted,
on_connection_resumed=on_connection_resumed,
client_id=args.client_id,
clean_session=False,
keep_alive_secs=6)
return mqtt_connection
def get_args():
parser = ArgumentParser(
description="Send and receive messages through and MQTT connection.")
parser.add_argument("–tty", required=True,
help="serial tty", default="/dev/ttyAMA0")
parser.add_argument("–baud-rate", required=True,
help="serial baud rate", default=1152000)
parser.add_argument('–endpoint', required=True, help="Your AWS IoT custom endpoint, not including a port. " +
"Ex: \"abcd123456wxyz-ats.iot.us-east-1.amazonaws.com\"")
parser.add_argument('–cert', help="File path to your client certificate, in PEM format.")
parser.add_argument('–key', help="File path to your private key, in PEM format.")
parser.add_argument('–root-ca', help="File path to root certificate authority, in PEM format. " +
"Necessary if MQTT server uses a certificate that's not already in " +
"your trust store.")
parser.add_argument('–client-id', default='samples-client-id',
help="Client ID for MQTT connection.")
parser.add_argument('–topic', default="samples/test",
help="Topic to subscribe to, and publish messages to.")
parser.add_argument('–message', default="Hello World!", help="Message to publish. " +
"Specify empty string to publish nothing.")
parser.add_argument('–count', default=0, type=int, help="Number of messages to publish/receive before exiting. " +
"Specify 0 to run forever.")
parser.add_argument('–use-websocket', default=False, action='store_true',
help="To use a websocket instead of raw mqtt. If you specify this option you must "
"specify a region for signing, you can also enable proxy mode.")
parser.add_argument('–signing-region', default='us-east-1',
help="If you specify –use-web-socket, this is the region that will be used for computing "
"the Sigv4 signature")
parser.add_argument('–proxy-host', help="Hostname for proxy to connect to. Note: if you use this feature, " +
"you will likely need to set –root-ca to the ca for your proxy.")
parser.add_argument('–proxy-port', type=int, default=8080,
help="Port for proxy to connect to.")
parser.add_argument('–verbosity', choices=[x.name for x in io.LogLevel], default=io.LogLevel.NoLogs.name,
help='Logging level')
parser.add_argument("–gateway-id", help="IoT Gateway serial number")
args = parser.parse_args()
return args
def parse_payload(payload):
# input: +RCV=116,29,0447383033363932003C0034|23.94|37.71|99.89|16|38|53|80,-61,56
# output: [0447383033363932003C0034, 23.94, 37.71, 99.89, 16.0, 38.0, 53.0, 80.0]
payload = payload.split(",")
payload = payload[2].split("|")
payload = [i for i in payload]
return payload
def set_lora_config(serial_conn):
# configures the REYAX RYLR896 transceiver module
serial_conn.write(str.encode("AT+ADDRESS=" + str(ADDRESS) + "\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Address set? {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+NETWORKID=" + str(NETWORK_ID) + "\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Network Id set? {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+CPIN=" + PASSWORD + "\r\n"))
time.sleep(1)
serial_payload = (serial_conn.readline())[:-2]
logging.debug("AES-128 password set? {}".format(serial_payload.decode(encoding="utf-8")))
def check_lora_config(serial_conn):
serial_conn.write(str.encode("AT?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Module responding? {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+ADDRESS?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Address: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+NETWORKID?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Network id: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+IPR?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("UART baud rate: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+BAND?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("RF frequency: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+CRFOP?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("RF output power: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+MODE?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("Work mode: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+PARAMETER?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("RF parameters: {}".format(serial_payload.decode(encoding="utf-8")))
serial_conn.write(str.encode("AT+CPIN?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
logging.debug("AES128 password of the network: {}".format(serial_payload.decode(encoding="utf-8")))
# Callback when connection is accidentally lost.
def on_connection_interrupted(connection, error, **kwargs):
logging.error("Connection interrupted. error: {}".format(error))
# Callback when an interrupted connection is re-established.
def on_connection_resumed(connection, return_code, session_present, **kwargs):
logging.warning("Connection resumed. return_code: {} session_present: {}".format(
return_code, session_present))
if return_code == mqtt.ConnectReturnCode.ACCEPTED and not session_present:
logging.warning("Session did not persist. Resubscribing to existing topics…")
resubscribe_future, _ = connection.resubscribe_existing_topics()
# Cannot synchronously wait for resubscribe result because we're on the connection's event-loop thread,
# evaluate result with a callback instead.
resubscribe_future.add_done_callback(on_resubscribe_complete)
def on_resubscribe_complete(resubscribe_future):
resubscribe_results = resubscribe_future.result()
logging.warning("Resubscribe results: {}".format(resubscribe_results))
for topic, qos in resubscribe_results['topics']:
if qos is None:
sys.exit("Server rejected resubscribe to topic: {}".format(topic))
# Callback when the subscribed topic receives a message
def on_message_received(topic, payload, **kwargs):
logging.debug("Received message from topic '{}': {}".format(topic, payload))
global received_count
received_count += 1
if received_count == count:
received_all_event.set()
if __name__ == "__main__":
sys.exit(main())

Running the IoT Gateway Python Script

To run the Python script on the Raspberry Pi, we will use a helper shell script, rasppi_lora_receiver_aws.sh. The shell script helps construct the arguments required to execute the Python script.

#!/bin/bash
# Author: Gary A. Stafford
# Start IoT data collector script and tails output
# Usage:
# sh ./rasppi_lora_receiver_aws.sh \
# a1b2c3d4e5678f-ats.iot.us-east-1.amazonaws.com
if [[ $# -ne 1 ]]; then
echo "Script requires 1 parameter!"
exit 1
fi
# input parameters
ENDPOINT=$1 # e.g. a1b2c3d4e5678f-ats.iot.us-east-1.amazonaws.com
DEVICE="lora-iot-gateway-01" # matches CloudFormation thing name
CERTIFICATE="${DEVICE}-certificate.pem.crt" # e.g. lora-iot-gateway-01-certificate.pem.crt
KEY="${DEVICE}-private.pem.key" # e.g. lora-iot-gateway-01-private.pem.key
GATEWAY_ID=$(< /proc/cpuinfo grep Serial | grep -oh "[a-z0-9]*$") # e.g. 00000000f62051ce
# output for debugging
echo "DEVICE: ${DEVICE}"
echo "ENDPOINT: ${ENDPOINT}"
echo "CERTIFICATE: ${CERTIFICATE}"
echo "KEY: ${KEY}"
echo "GATEWAY_ID: ${GATEWAY_ID}"
# call the python script
nohup python3 rasppi_lora_receiver_aws.py \
--endpoint "${ENDPOINT}" \
--cert "${DEVICE}-creds/${CERTIFICATE}" \
--key "${DEVICE}-creds/${KEY}" \
--root-ca "${DEVICE}-creds/AmazonRootCA1.pem" \
--client-id "${DEVICE}" \
--topic "lora-iot-demo" \
--gateway-id "${GATEWAY_ID}" \
--verbosity "Info" \
--tty "/dev/ttyAMA0" \
--baud-rate 115200 \
>collector.log 2>&1 </dev/null &
sleep 2
# tail the log (Control-C to exit)
tail -f collector.log

To run the helper script, we execute the following command, substituting the input parameter, the AWS IoT endpoint, with your endpoint.

sh ./rasppi_lora_receiver_aws.sh \
  a1b2c3d4e5678f-ats.iot.us-east-1.amazonaws.com

You should see console output similar to the following.

The script starts by configuring the RYLR896 module and outputting that configuration to a log file, output.log. If successful, we should see the following debug information logged.

DEBUG:root:Connecting to a1b2c3d4e5f6-ats.iot.us-east-1.amazonaws.com with client ID 'lora-iot-gateway-01'
DEBUG:root:Connecting to REYAX RYLR896 transceiver module
DEBUG:root:Connected!
DEBUG:root:Address set? +OK
DEBUG:root:Network Id set? +OK
DEBUG:root:AES-128 password set? +OK
DEBUG:root:Module responding? +OK
DEBUG:root:Address: +ADDRESS=116
DEBUG:root:Network id: +NETWORKID=6
DEBUG:root:UART baud rate: +IPR=115200
DEBUG:root:RF frequency: +BAND=915000000
DEBUG:root:RF output power: +CRFOP=15
DEBUG:root:Work mode: +MODE=0
DEBUG:root:RF parameters: +PARAMETER=12,7,1,4
DEBUG:root:AES128 password of the network: +CPIN=92A0ECEC9000DA0DCF0CAAB0ABA2E0EF

The sensor data is also written to the log file for debugging purposes. The first line in each pair below is the raw decrypted data received from the IoT device via LoRaWAN. The second line is the JSON-serialized payload, sent securely to AWS using the MQTT protocol.

DEBUG:root:b'+RCV=116,59,0447383033363932003C0034|23.46|41.89|99.38|230|692|833|1116,-48,39\r\n'

DEBUG:root:{'ts': 1598305503.7041512, 'data': {'humidity': 41.89, 'temperature': 23.46, 'device_id': '0447383033363932003C0034', 'gateway_id': '00000000f62051ce', 'pressure': 99.38, 'color': {'red': 230.0, 'blue': 833.0, 'ambient': 1116.0, 'green': 692.0}}}

DEBUG:root:b'+RCV=116,59,0447383033363932003C0034|23.46|41.63|99.38|236|696|837|1127,-49,35\r\n'

DEBUG:root:{'ts': 1598305513.7918658, 'data': {'humidity': 41.63, 'temperature': 23.46, 'device_id': '0447383033363932003C0034', 'gateway_id': '00000000f62051ce', 'pressure': 99.38, 'color': {'red': 236.0, 'blue': 837.0, 'ambient': 1127.0, 'green': 696.0}}}

DEBUG:root:b'+RCV=116,59,0447383033363932003C0034|23.44|41.57|99.38|232|686|830|1113,-48,32\r\n'

DEBUG:root:{'ts': 1598305523.8556132, 'data': {'humidity': 41.57, 'temperature': 23.44, 'device_id': '0447383033363932003C0034', 'gateway_id': '00000000f62051ce', 'pressure': 99.38, 'color': {'red': 232.0, 'blue': 830.0, 'ambient': 1113.0, 'green': 686.0}}}

DEBUG:root:b'+RCV=116,59,0447383033363932003C0034|23.51|41.44|99.38|205|658|802|1040,-48,36\r\n'

DEBUG:root:{'ts': 1598305528.8890748, 'data': {'humidity': 41.44, 'temperature': 23.51, 'device_id': '0447383033363932003C0034', 'gateway_id': '00000000f62051ce', 'pressure': 99.38, 'color': {'red': 205.0, 'blue': 802.0, 'ambient': 1040.0, 'green': 658.0}}}

AWS IoT Core

The Raspberry Pi-based IoT gateway will be registered with AWS IoT Core. IoT Core allows users to connect devices quickly and securely to AWS.

Things

According to AWS, IoT Core can reliably scale to billions of devices and trillions of messages. Registered devices are referred to as things in AWS IoT Core. A thing is a representation of a specific device or logical entity. Information about a thing is stored in the registry as JSON data.

Below, we see an example of the Thing created by CloudFormation. The Thing, lora-iot-gateway-01, represents the physical IoT gateway. We have assigned the IoT gateway a Thing Type, LoRaIoTGateway, a Thing Group, LoRaIoTGateways, and a Thing Billing Group, IoTGateways.

In a real IoT environment, containing hundreds, thousands, even millions of IoT devices, gateways, and sensors, these classification mechanisms, Thing Type, Thing Group, and Thing Billing Group, will help to organize IoT assets.
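
For illustration only, the same registration and classification could be scripted with boto3 (the demonstration creates these resources with CloudFormation). The sketch below is not the demo's actual provisioning code; the custom attributes mirror the device metadata shown later in this post.

import boto3

iot = boto3.client("iot", region_name="us-east-1")

# classification resources (created by CloudFormation in the demonstration)
iot.create_thing_type(thingTypeName="LoRaIoTGateway")
iot.create_thing_group(thingGroupName="LoRaIoTGateways")
iot.create_billing_group(billingGroupName="IoTGateways")

# register the IoT gateway as a Thing with custom attributes
iot.create_thing(
    thingName="lora-iot-gateway-01",
    thingTypeName="LoRaIoTGateway",
    attributePayload={"attributes": {"loramfr": "REYAX", "loramodel": "RYLR896"}},
)

# organize the Thing into its group and billing group
iot.add_thing_to_thing_group(thingGroupName="LoRaIoTGateways", thingName="lora-iot-gateway-01")
iot.add_thing_to_billing_group(billingGroupName="IoTGateways", thingName="lora-iot-gateway-01")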

Device Gateway and Message Broker

IoT Core provides a Device Gateway, which manages all active device connections. The Gateway currently supports MQTT, WebSockets, and HTTP 1.1 protocols. Behind the Device Gateway is a high-throughput pub/sub Message Broker, which securely transmits messages to and from all IoT devices and applications with low latency. Below, we see a typical AWS IoT Core architecture containing multiple Topics, Rules, and Actions.

AWS IoT Security

AWS IoT Core provides mutual authentication and encryption, ensuring all data exchanged between AWS and the devices is secure by default. In the demonstration, all data is sent securely using Transport Layer Security (TLS) 1.2 with X.509 digital certificates on port 443. Below, we see an example of an X.509 certificate assigned to the Thing, lora-iot-gateway-01, which represents the physical IoT gateway. The X.509 certificate and private key, generated previously using the AWS CLI, are installed on the IoT gateway.

Authorization of the device to access any resource on AWS is controlled by AWS IoT Core Policies. These policies are similar to AWS IAM Policies. Below, we see an example of an AWS IoT Core Policy, LoRaDevicePolicy, which is assigned to the IoT gateway.
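
As a rough, illustrative sketch only, a minimal policy permitting the gateway to connect and publish could be created with boto3 as follows. The statements and account number shown are assumptions, not the demonstration's actual LoRaDevicePolicy.

import json
import boto3

iot = boto3.client("iot", region_name="us-east-1")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "iot:Connect",
         "Resource": "arn:aws:iot:us-east-1:123456789012:client/lora-iot-gateway-01"},
        {"Effect": "Allow", "Action": "iot:Publish",
         "Resource": "arn:aws:iot:us-east-1:123456789012:topic/lora-iot-demo"},
    ],
}

iot.create_policy(policyName="LoRaDevicePolicy",
                  policyDocument=json.dumps(policy_document))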

AWS IoT Core Rules

Once an MQTT message is received from the IoT gateway (a thing), we use AWS IoT Rules to send message data to an AWS IoT Analytics Channel. Rules give your devices the ability to interact with AWS services. Rules are analyzed, and Actions are performed based on the MQTT topic stream. Below, we see an example rule that forwards our messages to an IoT Analytics Channel.

Rule query statements are written in standard Structured Query Language (SQL). The data source for the Rule query is an IoT topic.

SELECT
data.device_id,
data.gateway_id,
data.temperature,
data.humidity,
data.pressure,
data.color.red,
data.color.green,
data.color.blue,
data.color.ambient,
ts,
clientid() AS device,
parse_time("yyyy-MM-dd'T'HH:mm:ss.SSSZ", timestamp(), "UTC") AS msg_received
FROM
"${IoTTopicName}"

AWS IoT Analytics

AWS IoT Analytics is composed of five primary components: Channels, Pipelines, Data stores, Data sets, and Notebooks. These components enable you to collect, prepare, store, analyze, and visualize your IoT data.

Below, we see a typical AWS IoT Analytics architecture. IoT messages are received from AWS IoT Core through a Rule Action. Amazon QuickSight provides business intelligence and visualization, while Amazon QuickSight ML Insights adds anomaly detection and forecasting.

IoT Analytics Channel

An AWS IoT Analytics Channel pulls messages or data into IoT Analytics from other AWS sources, such as Amazon S3, Amazon Kinesis, or AWS IoT Core. Channels store data for IoT Analytics Pipelines. Both Channels and Data stores support storing data in your own Amazon S3 bucket or an IoT Analytics service-managed S3 bucket. In the demonstration, we are using a service-managed S3 bucket.

When creating a Channel, you also decide how long to retain the data. For the demonstration, we have set the data retention period to 21 days. Channels are generally not used for long-term storage of data. Typically, you would only retain data in the Channel for the period you need to analyze. For long-term storage of IoT message data, I recommend using an AWS IoT Core Rule to send a copy of the raw IoT data to Amazon S3, using a service such as Amazon Kinesis Data Firehose.
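
A minimal boto3 sketch of creating such a Channel follows; the service-managed storage and 21-day retention match the demonstration, while the channel name is an assumption.

import boto3

iot_analytics = boto3.client("iotanalytics", region_name="us-east-1")

iot_analytics.create_channel(
    channelName="iot_analytics_channel",       # assumed name
    channelStorage={"serviceManagedS3": {}},   # service-managed S3 bucket
    retentionPeriod={"unlimited": False, "numberOfDays": 21},
)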

IoT Analytics Pipeline

An AWS IoT Analytics Pipeline consumes messages from one or more Channels. Pipelines transform, filter, and enrich the messages before storing them in IoT Analytics Data stores. A Pipeline is composed of an ordered list of activities. Logically, you must specify both a Channel (source) and a Datastore (destination) activity. Optionally, you may choose as many as 23 additional activities in the pipelineActivities array.

In our demonstration’s Pipeline, iot_analytics_pipeline, we have specified three additional activities, including DeviceRegistryEnrich, Filter, and Lambda. Other activity types include Math, SelectAttributes, RemoveAttributes, and AddAttributes.
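
To make the ordered activity list more concrete, here is a hypothetical boto3 sketch of such a Pipeline. The activity names, filter expression, Lambda function name, and role ARN are illustrative assumptions, not the demonstration's actual definitions.

import boto3

iot_analytics = boto3.client("iotanalytics", region_name="us-east-1")

iot_analytics.create_pipeline(
    pipelineName="iot_analytics_pipeline",
    pipelineActivities=[
        {"channel": {"name": "channel_activity", "channelName": "iot_analytics_channel",
                     "next": "filter_activity"}},
        {"filter": {"name": "filter_activity",  # illustrative sanity-check expression
                    "filter": "temperature > -100 AND humidity > 0 AND pressure > 0",
                    "next": "lambda_activity"}},
        {"lambda": {"name": "lambda_activity", "lambdaName": "iot-analytics-transform",
                    "batchSize": 10, "next": "enrich_activity"}},
        {"deviceRegistryEnrich": {"name": "enrich_activity", "attribute": "metadata",
                                  "thingName": "lora-iot-gateway-01",
                                  "roleArn": "arn:aws:iam::123456789012:role/iot-analytics-role",
                                  "next": "datastore_activity"}},
        {"datastore": {"name": "datastore_activity",
                       "datastoreName": "iot_analytics_data_store"}},
    ],
)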

The Filter activity checks that the sensor values are not null or otherwise erroneous; if they are, the message is dropped. The Lambda Pipeline activity executes an AWS Lambda function to transform the messages in the pipeline. Messages are sent in an event object to the Lambda function. The messages are modified, and the event object is returned to the activity.

The Python-based Lambda function easily handles typical IoT data transformation tasks, including converting the temperature from Celsius to Fahrenheit, pressure from kilopascals (kPa) to inches of mercury (inHg), and 12-bit RGBA values to 8-bit color values (0–255). The Lambda function also rounds all the values to at most two decimal places of precision.

def lambda_handler(event, context):
    for e in event:
        e['temperature'] = round((e['temperature'] * 1.8) + 32, 2)
        e['humidity'] = round(e['humidity'], 2)
        e['pressure'] = round((e['pressure'] / 3.3864), 2)
        e['red'] = int(round(e['red'] / (4097 / 255), 0))
        e['green'] = int(round(e['green'] / (4097 / 255), 0))
        e['blue'] = int(round(e['blue'] / (4097 / 255), 0))
        e['ambient'] = int(round(e['ambient'] / (4097 / 255), 0))
    return event

The demonstration’s Pipeline also enriches the IoT data with metadata from the IoT device’s AWS IoT Core Registry. The metadata includes additional information about the device that generated the IoT data, including the custom attributes such as LoRa transceiver manufacturer and model, and the IoT gateway manufacturer.

A notable feature of Pipelines is the ability to reprocess messages. If you make changes to the Pipeline, which often happens during the data preparation stage, you can reprocess any or all the IoT data in the associated Channel, and overwrite the IoT data in the Data set.

IoT Analytics Data store

An AWS IoT Analytics Data store stores prepared data from an AWS IoT Analytics Pipeline, in a fully-managed database. Both Channels and Data store support storing IoT data in your own Amazon S3 bucket or an IoT Analytics managed S3 bucket. In the demonstration, we are using a service-managed S3 bucket to store the IoT data in our Data store, iot_analytics_data_store.

IoT Analytics Data set

An AWS IoT Analytics Data set automatically provides regular, up-to-date insights for data analysts by querying a Data store using standard SQL. Periodic updates are implemented using a cron expression. For the demonstration, we are updating our Data set, iot_analytics_data_set, at a 15-minute interval. The time interval can be increased or reduced, depending on the desired ‘near real-time’ nature of the IoT data being analyzed.
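
A boto3 sketch of the Data set definition, pairing the SQL action with the 15-minute cron trigger, might look like the following; the select-all query is an assumption, not copied from the demonstration.

import boto3

iot_analytics = boto3.client("iotanalytics", region_name="us-east-1")

iot_analytics.create_dataset(
    datasetName="iot_analytics_data_set",
    actions=[{
        "actionName": "sql_action",
        "queryAction": {"sqlQuery": "SELECT * FROM iot_analytics_data_store"},
    }],
    triggers=[{"schedule": {"expression": "cron(0/15 * * * ? *)"}}],  # every 15 minutes
)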

Below, we see messages in the Result preview pane of the Data set. Note the SQL query used to obtain the messages, which queries the Data store. The Data store, as you will recall, contains the transformed messages from the Pipeline.

IoT Analytics Data sets also support sending content results, which are materialized views of your IoT Analytics data, to an Amazon S3 bucket.

The CloudFormation stack created an encrypted Amazon S3 Bucket. This bucket receives a copy of the messages from the IoT Analytics Data set whenever the cron expression runs the scheduled update.

IoT Analytics Notebook

An AWS IoT Analytics Notebook allows users to perform statistical analysis and machine learning on IoT Analytics Data sets using Jupyter Notebooks. The IoT Analytics Notebook service includes a set of notebook templates that contain AWS-authored machine learning models and visualizations. Notebook Instances can be linked to GitHub or another source code repository. Notebooks created with IoT Analytics Notebook can also be accessed directly through Amazon SageMaker. For the demonstration, the Notebook Instance is cloned from our project's GitHub repository.

The repository contains a sample Jupyter Notebook, LoRa_IoT_Analytics_Demo.ipynb, based on the conda_python3 kernel. This preinstalled environment includes the default Anaconda installation and Python 3.

The Notebook uses pandas, matplotlib, and plotly to manipulate and visualize the sample IoT data stored in the IoT Analytics Data set.
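
For example, a notebook cell similar to the following sketch could pull the latest Data set content into a pandas DataFrame; the IoT Analytics client returns a pre-signed URI to the query results in CSV form.

import boto3
import pandas as pd

iot_analytics = boto3.client("iotanalytics")

content = iot_analytics.get_dataset_content(
    datasetName="iot_analytics_data_set", versionId="$LATEST")
data_uri = content["entries"][0]["dataURI"]  # pre-signed S3 URL to the CSV results

df = pd.read_csv(data_uri)
df["msg_received"] = pd.to_datetime(df["msg_received"])
print(df[["temperature", "humidity", "pressure", "ambient"]].describe())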

The Notebook can be modified, and the changes pushed back to GitHub. You could easily fork the demonstration’s GitHub repository and modify the CloudFormation template to point to your source code repository.

Amazon QuickSight

Amazon QuickSight provides business intelligence (BI) and visualization. Amazon QuickSight ML Insights adds anomaly detection and forecasting. We can use Amazon QuickSight to visualize the IoT message data, stored in the IoT Analytics Data set.

Amazon QuickSight has both a Standard and an Enterprise Edition. AWS provides a detailed product comparison of each edition. For the post, I am demonstrating the Enterprise Edition, which includes additional features, such as ML Insights, hourly refreshes of SPICE (Super-fast, Parallel, In-memory Calculation Engine), and theme customization.

Please be aware of the costs of Amazon QuickSight if you choose to follow along with this part of the demo. Although there is an Amazon QuickSight API, Amazon QuickSight is not automatically enabled or configured with CloudFormation or using the AWS CLI in this demonstration.

QuickSight Data Sets

Amazon QuickSight has a wide variety of data source options for creating Amazon QuickSight Data sets, including the ones shown below. Do not confuse Amazon QuickSight Data sets with IoT Analytics Data sets; they are two different service features.

For the demonstration, we will create an Amazon QuickSight Data set that will use our IoT Analytics Data set, iot_analytics_data_set.

Amazon QuickSight gives you the ability to view and modify QuickSight Data sets before visualizing. QuickSight even provides a wide variety of functions, enabling us to perform dynamic calculations on the field values. For this demonstration, we will leave the data unchanged since all transformations were already completed in the IoT Analytics Pipeline.

QuickSight Analysis

Using the QuickSight Data set, built from the IoT Analytics Data set as a data source, we create a QuickSight Analysis. The QuickSight Analysis console is shown below. An Analysis is primarily a collection of Visuals (aka Visual types). QuickSight provides several Visual types. Each visual is associated with a Data set. Data for the QuickSight Analysis or each visual within the Analysis can be filtered. For the demo, I have created a simple QuickSight Analysis, including a few typical QuickSight visuals.

QuickSight Dashboards

To share a QuickSight Analysis, we can create a QuickSight Dashboard. Below, we see a few views of the QuickSight Analysis, shown above, as a Dashboard. Although viewers of the Dashboard cannot edit the visuals, they can apply filtering and interactively drill-down into data in the Visuals.

Amazon QuickSight ML Insights

According to Amazon, ML Insights leverages AWS’s machine learning (ML) and natural language capabilities to gain deeper insights from data. QuickSight’s ML-powered Anomaly Detection continuously analyzes data to discover anomalies and variations inside the aggregates, giving you the insights to act when business changes occur. QuickSight’s ML-powered Forecasting can be used to predict your business metrics accurately and perform interactive what-if analysis with point-and-click simplicity. QuickSight’s built-in algorithms make it easy for anyone to use ML that learns from your data patterns to provide you with accurate predictions based on historical trends.

Below, we see the ML Insights tab (left) in the demonstration’s QuickSight Analysis. Individually detected anomalies can be added to the QuickSight Analysis, like Visuals, and configured to tune the detection parameters. Observe the temperature, humidity, and barometric pressure anomalies identified by ML Insights, based on their Anomaly Score and a minimum delta of five percent. These anomalies accurately reflected an actual failure of the IoT device, caused by overheating during testing, which resulted in abnormal sensor readings.

Receiving the Messages on AWS

To confirm the IoT gateway is sending messages, we can use a packet analyzer, like tcpdump, on the IoT gateway. Running tcpdump on the IoT gateway, below, we see outbound encrypted MQTT messages being sent to AWS on port 443.

To confirm those messages are being received from the IoT gateway on AWS, we can use the AWS IoT Core Test feature and subscribe to the lora-iot-demo topic. We should see messages flowing in from the IoT gateway at approximately 5-second intervals.
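
As an alternative to the console's Test feature, a short script using the same AWS IoT Device SDK can subscribe to the topic and print incoming messages. The sketch below assumes an mqtt_connection built and connected with mtls_from_path, as in the gateway script shown earlier.

import json
from awscrt import mqtt

def print_message(topic, payload, **kwargs):
    # payload is the raw MQTT message body (JSON bytes)
    print("Received from '{}': {}".format(topic, json.loads(payload)))

# assumes mqtt_connection was created and connected as shown earlier
subscribe_future, packet_id = mqtt_connection.subscribe(
    topic="lora-iot-demo",
    qos=mqtt.QoS.AT_LEAST_ONCE,
    callback=print_message)
print("Subscribed with QoS {}".format(subscribe_future.result()["qos"]))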

The JSON payload structure of the incoming MQTT messages will look similar to the below example. The device_id is the unique id of the IoT device that transmitted the message using LoRaWAN. The gateway_id is the unique id of the IoT gateway that received the message using LoRaWAN and sent it to AWS. A single IoT gateway would usually manage messages from multiple IoT devices, each with a unique id.

{
"data": {
"color": {
"ambient": 1057,
"blue": 650,
"green": 667,
"red": 281
},
"device_id": "0447383033363932003C0034",
"gateway_id": "00000000f62051ce",
"humidity": 45.73,
"pressure": 98.65,
"temperature": 23.6
},
"ts": 1598543131.9861386
}

The SQL query used by the AWS IoT Rule described earlier transforms and flattens the nested JSON payload structure before passing it to the AWS IoT Analytics Channel, as shown below.

{
"ambient": 1057,
"blue": 650,
"green": 667,
"red": 281,
"device_id": "0447383033363932003C0034",
"gateway_id": "00000000f62051ce",
"humidity": 45.73,
"pressure": 98.65,
"temperature": 23.6,
"ts": 1598543131.9861386,
"msg_received": "2020-08-27T11:45:32.074+0000",
"device": "lora-iot-gateway-01"
}

We can measure the near real-time nature of the IoT data using the ts and msg_received data fields. The ts data field is the date and time when the sensor reading occurred on the IoT device, while the msg_received data field is the date and time when the message was received on AWS. The delta between the two values is a measure of how near real-time the sensor readings are being streamed to the AWS IoT Analytics Channel. In the below example, the difference between ts (2020–08–27T11:45:31.986) and msg_received (2020–08–27T11:45:32.074) is 88 ms.
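
Assuming both fields are recorded against the same UTC clock, the delta can be computed with a few lines of Python, for example:

from datetime import datetime, timezone

def ingest_latency_ms(ts, msg_received):
    # ts is epoch seconds from the device; msg_received is ISO-8601 with an offset
    received = datetime.strptime(msg_received, "%Y-%m-%dT%H:%M:%S.%f%z")
    sent = datetime.fromtimestamp(ts, tz=timezone.utc)
    return (received - sent).total_seconds() * 1000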

Final IoT Data Message Structure

Once the message payload passes through the AWS IoT Analytics Pipeline and lands in the AWS IoT Analytics Data set, its final data structure looks as follows. Note that the device’s attribute metadata has been added from the AWS IoT Core device registry. Regrettably, the metadata is not well-formatted JSON and will require additional transformation to be usable.

{
"device_id": "0447383033363932003C0034",
"gateway_id": "00000000f62051ce",
"temperature": 74.48,
"humidity": 45.73,
"pressure": 29.13,
"red": 17,
"green": 42,
"blue": 40,
"ambient": 66,
"ts": 1598543131.9861386,
"device": "lora-iot-gateway-01",
"msg_received": "2020-08-27T15:45:32.024+0000",
"metadata": {
"defaultclientid": "lora-iot-gateway-01",
"thingname": "lora-iot-gateway-01",
"thingid": "017db4b8-7fca-4617-aa58-7125dd94ab36",
"thingarn": "arn:aws:iot:us-east-1:123456789012:thing/lora-iot-gateway-01",
"thingtypename": "LoRaIoTGateway",
"attributes": {
"loramfr": "REYAX",
"gatewaymfr": "RaspberryPiFoundation",
"loramodel": "RYLR896"
},
"version": "2",
"billinggroupname": "LoRaIoTGateways"
},
"__dt": "2020-08-27 00:00:00.000"
}

A set of sample messages is included in the GitHub project’s sample_messages directory.

Conclusion

In this post, we explored the use of the LoRa and LoRaWAN protocols to transmit environmental sensor data from an IoT device to an IoT gateway. Given its low energy consumption, long-distance transmission capabilities, and well-developed protocols, LoRaWAN is an ideal long-range wireless protocol for IoT devices. We then demonstrated how to use AWS IoT Device SDKs, AWS IoT Core, and AWS IoT Analytics to securely collect, analyze, and visualize streaming messages from the IoT device, in near real-time.


This blog represents my own viewpoints and not of my employer, Amazon Web Services.


LoRa and LoRaWAN for IoT: Getting Started with LoRa and LoRaWAN Protocols for Low Power, Wide Area Networking of IoT

 

Introduction

According to the LoRa Alliance, Low-Power, Wide-Area Networks (LPWAN) are projected to support a major portion of the billions of devices forecasted for the Internet of Things (IoT). LoRaWAN is designed from the bottom up to optimize LPWANs for battery lifetime, capacity, range, and cost. LoRa and LoRaWAN permit long-range connectivity for the Internet of Things (IoT) devices in different types of industries. According to Wikipedia, LoRaWAN defines the communication protocol and system architecture for the network, while the LoRa physical layer enables the long-range communication link.

LoRa

Long Range (LoRa), the low-power wide-area network (LPWAN) protocol developed by Semtech, sits at layer 1, the physical layer, of the seven-layer OSI model (Open Systems Interconnection model) of computer networking. The physical layer defines the means of transmitting raw bits over a physical data link connecting network nodes. LoRa uses license-free sub-gigahertz radio frequency (RF) bands, including 433 MHz, 868 MHz (Europe), 915 MHz (Australia and North America), and 923 MHz (Asia). LoRa enables long-range transmissions with low power consumption.

LoRaWAN

LoRaWAN, maintained by the LoRa Alliance, is a cloud-based medium access control (MAC) sublayer (layer 2) protocol, but it acts mainly as a network layer (layer 3) protocol, managing communication between LPWAN gateways and end-node devices as a routing protocol. The MAC sublayer and the logical link control (LLC) sublayer together make up layer 2, the data link layer, of the OSI model.

LoRaWAN is often cited as having greater than a 10-km-wide coverage area in rural locations. However, according to other sources, it is generally more limited. According to the Electronic Design article, 11 Myths About LoRaWAN, a typical LoRaWAN network range depends on numerous factors—indoor or outdoor gateways, the payload of the message, the antenna used, etc. On average, in an urban environment with an outdoor gateway, you can expect up to 2- to 3-km-wide coverage, while in the rural areas it can reach beyond 5 to 7 km. LoRa’s range depends on the “radio line-of-sight.” Radio waves in the 400 MHz to 900 MHz range may pass through some obstructions, depending on their composition, but will be absorbed or reflected otherwise. This means that the signal can potentially reach as far as the horizon, as long as there are no physical barriers to block it.

In the following hands-on post, we will explore the use of the LoRa and LoRaWAN protocols to transmit and receive sensor data, over a substantial distance, between an IoT device, containing a number of embedded sensors, and an IoT gateway.

python_script_running

Recommended Hardware

For this post, I have used the following hardware.

IoT Device with Embedded Sensors

I have used an Arduino single-board microcontroller as an IoT sensor, actually an array of sensors. The 3.3V AI-enabled Arduino Nano 33 BLE Sense board (Amazon: USD 36.00), released in August 2019, comes with the powerful nRF52840 processor from Nordic Semiconductors, a 32-bit ARM Cortex-M4 CPU running at 64 MHz, 1MB of CPU Flash Memory, 256KB of SRAM, and a NINA-B306 stand-alone Bluetooth 5 low energy (BLE) module.

IMG_7016

The Sense also contains an impressive array of embedded sensors:

  • 9-axis Inertial Sensor (LSM9DS1): 3D digital linear acceleration sensor, a 3D digital
    angular rate sensor, and a 3D digital magnetic sensor
  • Humidity and Temperature Sensor (HTS221): Capacitive digital sensor for relative humidity and temperature
  • Barometric Sensor (LPS22HB): MEMS nano pressure sensor: 260–1260 hectopascal (hPa) absolute digital output barometer
  • Microphone (MP34DT05): MEMS audio sensor omnidirectional digital microphone
  • Gesture, Proximity, Light Color, and Light Intensity Sensor (APDS9960): Advanced Gesture detection, Proximity detection, Digital Ambient Light Sense (ALS), and Color Sense (RGBC).

The Arduino Sense is an excellent, low-cost single-board microcontroller for learning about the collection and transmission of IoT sensor data.

IoT Gateway

An IoT Gateway, according to TechTarget, is a physical device or software program that serves as the connection point between the Cloud and controllers, sensors, and intelligent devices. All data moving to the Cloud, or vice versa goes through the gateway, which can be either a dedicated hardware appliance or software program.

lora_network

I have used a third-generation Raspberry Pi 3 Model B+ single-board computer (SBC) to serve as an IoT Gateway. This Raspberry Pi model features a 1.4GHz Cortex-A53 (ARMv8) 64-bit quad-core processor System on a Chip (SoC), 1GB LPDDR2 SDRAM, dual-band wireless LAN, Bluetooth 4.2 BLE, and Gigabit Ethernet (Amazon: USD 42.99).

To follow along with the post, you could substitute the Raspberry Pi for any Linux-based machine to run the included sample Python script.

IMG_7025

LoRa Transceiver Modules

To transmit the IoT sensor data between the IoT device, containing the embedded sensors, and the IoT gateway, I have used the REYAX RYLR896 LoRa transceiver module (Amazon: USD 19.50 x 2). The transceiver modules interface with the host device as a universal asynchronous receiver-transmitter (UART). A UART is a computer hardware device for asynchronous serial communication in which the data format and transmission speeds are configurable.

IMG_7116

According to the manufacturer, REYAX, the RYLR896 contains the Semtech SX1276 long-range, low power transceiver. The RYLR896 module provides ultra-long range spread spectrum communication and high interference immunity while minimizing current consumption. This transceiver operates at both the 868 and 915 MHz frequency ranges. We will be transmitting at 915 MHz for North America, in this post. Each RYLR896 module contains a small, PCB integrated, helical antenna.

Security

The RYLR896 is capable of AES 128-bit data encryption. Using the Advanced Encryption Standard (AES), we will encrypt the data sent from the IoT device to the IoT gateway, using a 32 hex digit password (128 bits / 4 bits/hex digit = 32 hex digits). Using hexadecimal notation, the password is limited to digits 0–9 and characters A–F.
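
Any source of 16 random bytes will produce a valid password; for example, Python's secrets module can generate one.

import secrets

# 16 random bytes rendered as 32 uppercase hex digits (0-9, A-F)
password = secrets.token_hex(16).upper()
print(password, len(password))  # 32 hex characters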

USB to TTL Serial Converter Adapter

Optionally, to configure, test, and debug the RYLR896 LoRa transceiver module directly from your laptop, you can use a USB to TTL serial converter adapter. I currently use the IZOKEE FT232RL FTDI USB to TTL Serial Converter Adapter Module for 3.3V and 5V (Amazon: USD 9.49 for 2). The 3.3V RYLR896 module easily connects to the USB to TTL Serial Converter Adapter using the TXD/TX, RXD/RX, VDD/VCC, and GND pins. We use serial communication to send and receive data through TX (transmit) and RX (receive) pins. The wiring is shown below: VDD to VCC, GND to GND, TXD to RX, and RXD to TX.

UART Diagram

The FT232RL has support for baud rates up to 115,200 bps, which is the speed we will use to communicate with the RYLR896 module.

Arduino Sketch

For those not familiar with Arduino, a sketch is the name that Arduino uses for a program. It is the unit of code that is uploaded into non-volatile flash memory and runs on an Arduino board. The Arduino language is a set of C and C++ functions. All standard C and C++ constructs supported by the avr-g++ compiler should work in Arduino.

For this post, the sketch, lora_iot_demo.ino, contains all the code necessary to collect and securely transmit the environmental sensor data, including temperature, relative humidity, barometric pressure, RGB color, and ambient light intensity, using the LoRaWAN protocol. All code for this post, including the sketch, can be found on GitHub.

/*
Description: Transmits Arduino Nano 33 BLE Sense sensor telemetry over LoRaWAN,
including temperature, humidity, barometric pressure, and color,
using REYAX RYLR896 transceiver modules
http://reyax.com/wp-content/uploads/2020/01/Lora-AT-Command-RYLR40x_RYLR89x_EN.pdf
Author: Gary Stafford
*/
#include <Arduino_HTS221.h>
#include <Arduino_LPS22HB.h>
#include <Arduino_APDS9960.h>
const int UPDATE_FREQUENCY = 5000; // update frequency in ms
const float CALIBRATION_FACTOR = -4.0; // temperature calibration factor (Celsius)
const int ADDRESS = 116;
const int NETWORK_ID = 6;
const String PASSWORD = "92A0ECEC9000DA0DCF0CAAB0ABA2E0EF";
const String DELIMITER = "|";
void setup()
{
Serial.begin(9600);
Serial1.begin(115200); // default baud rate of module is 115200
delay(1000); // wait for LoRa module to be ready
// address, network ID, and password need to be the same for receiver and transmitter
Serial1.print((String)"AT+ADDRESS=" + ADDRESS + "\r\n");
delay(200);
Serial1.print((String)"AT+NETWORKID=" + NETWORK_ID + "\r\n");
delay(200);
Serial1.print("AT+CPIN=" + PASSWORD + "\r\n");
delay(200);
Serial1.print("AT+CPIN?\r\n"); // confirm password is set
if (!HTS.begin())
{ // initialize HTS221 sensor
Serial.println("Failed to initialize humidity temperature sensor!");
while (1);
}
if (!BARO.begin())
{ // initialize LPS22HB sensor
Serial.println("Failed to initialize pressure sensor!");
while (1);
}
// avoid bad readings to start bug
// https://forum.arduino.cc/index.php?topic=660360.0
BARO.readPressure();
delay(1000);
if (!APDS.begin())
{ // initialize APDS9960 sensor
Serial.println("Failed to initialize color sensor!");
while (1);
}
}
void loop()
{
updateReadings();
delay(UPDATE_FREQUENCY);
}
void updateReadings()
{
float temperature = getTemperature(CALIBRATION_FACTOR);
float humidity = getHumidity();
float pressure = getPressure();
int colors[4];
getColor(colors);
String payload = buildPayload(temperature, humidity, pressure, colors);
// Serial.println("Payload: " + payload); // display the payload for debugging
Serial1.print(payload); // send the payload over LoRaWAN
displayResults(temperature, humidity, pressure, colors); // display the results for debugging
}
float getTemperature(float calibration)
{
return HTS.readTemperature() + calibration;
}
float getHumidity()
{
return HTS.readHumidity();
}
float getPressure()
{
return BARO.readPressure();
}
void getColor(int c[])
{
// check if a color reading is available
while (!APDS.colorAvailable())
{
delay(5);
}
int r, g, b, a;
APDS.readColor(r, g, b, a);
c[0] = r;
c[1] = g;
c[2] = b;
c[3] = a;
}
void displayResults(float t, float h, float p, int c[])
{
Serial.print("Temperature: ");
Serial.println(t);
Serial.print("Humidity: ");
Serial.println(h);
Serial.print("Pressure: ");
Serial.println(p);
Serial.print("Color (r, g, b, a): ");
Serial.print(c[0]);
Serial.print(", ");
Serial.print(c[1]);
Serial.print(", ");
Serial.print(c[2]);
Serial.print(", ");
Serial.println(c[3]);
Serial.println("———-");
}
String buildPayload(float t, float h, float p, int c[])
{
String readings = "";
readings += t;
readings += DELIMITER;
readings += h;
readings += DELIMITER;
readings += p;
readings += DELIMITER;
readings += c[0];
readings += DELIMITER;
readings += c[1];
readings += DELIMITER;
readings += c[2];
readings += DELIMITER;
readings += c[3];
String payload = "";
payload += "AT+SEND=";
payload += ADDRESS;
payload += ",";
payload += readings.length();
payload += ",";
payload += readings;
payload += "\r\n";
return payload;
}


AT Commands

Communication with the RYLR896’s long-range modem is done using AT commands. AT commands are instructions used to control a modem. AT is the abbreviation of ATtention. Every command line starts with “AT”. That is why modem commands are called AT commands, according to Developer’s Home. A complete list of AT commands can be downloaded as a PDF from the RYLR896 product page.

To efficiently transmit the environmental sensor data from the IoT sensor to the IoT gateway, the sketch concatenates the sensor values together in a single string. The string is incorporated into an AT command, which sends the data to the RYLR896 LoRa transceiver module. To make it easier to parse the sensor data on the IoT gateway, we will delimit the sensor values with a pipe (|), as opposed to a comma. The maximum length of the payload (sensor data) is 240 bytes.

Below, we see an example of an AT command used to send the sensor data from the IoT sensor and the corresponding unencrypted data received by the IoT gateway. Both strings contain the LoRa transmitter Address ID, payload length, and the payload. The data received by the IoT gateway also contains the Received signal strength indicator (RSSI), and Signal-to-noise ratio (SNR).

Message Diagram
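
For reference, a small helper like the following sketch can split a received +RCV line into its parts, including the RSSI and SNR values, which the parse_payload() function shown later in this post discards.

def parse_rcv(line):
    # parse '+RCV=<address>,<length>,<payload>,<RSSI>,<SNR>'
    address, length, payload, rssi, snr = line.strip().split("=", 1)[1].split(",")
    return {
        "address": int(address),
        "length": int(length),
        "readings": payload.split("|"),
        "rssi": int(rssi),
        "snr": int(snr),
    }

print(parse_rcv("+RCV=116,25,22.18|41.57|99.74|2343|1190|543|4011,-34,47"))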

Configure, Test, and Debug

As discussed earlier, to configure, test, and debug the RYLR896 LoRa transceiver modules without the use of the IoT gateway, you can use a USB to TTL serial converter adapter. The sketch is loaded on the Arduino Sense (the IoT device) and actively transmits data through one of the RYLR896 modules (shown below right). The other RYLR896 module is connected to your laptop’s USB port, via the USB to TTL serial converter adapter (shown below left). Using a terminal and the screen command, or the Arduino desktop application’s Serial Terminal, we can receive the sensor data from the Arduino Sense.

IMG_7027

Using a terminal on your laptop, we first need to locate the correct virtual console (aka virtual terminal). On Linux or Mac, the virtual consoles are represented by device special files, such as /dev/tty1, /dev/tty2, and so forth. To find the virtual console for the USB to TTL serial converter adapter plugged into the laptop, use the following command.

ls -alh /dev/tty.*

We should see a virtual console with a name similar to /dev/tty.usbserial-.

... /dev/tty.Bluetooth-Incoming-Port
... /dev/tty.GarysBoseQC35II-SPPDev
... /dev/tty.a483e767cbac-Bluetooth-
... /dev/tty.usbserial-A50285BI

To connect to the RYLR896 module via the USB to TTL serial converter adapter, using the virtual terminal, we use the screen command and connect at a baud rate of 115,200 bps.

screen /dev/tty.usbserial-A50285BI 115200

If everything is configured and working correctly, we should see data being transmitted from the Arduino Sense and received by the local machine, at five second intervals. Each line of unencrypted data transmitted will look similar to the following, +RCV=116,25,22.18|41.57|99.74|2343|1190|543|4011,-34,47. In the example below, the AES 128-bit data encryption is not enabled on the Arduino, yet. With encryption turned on the sensor data (the payload) would appear garbled.

scree_mac_uart

Even easier than the screen command, we can also use the Arduino desktop application’s Serial Terminal, as shown in the following short screen recording. Select the correct Port (virtual console) from the Tools menu and open the Serial Terminal. Since the transmitted data should be secured using AES 128-bit data encryption, we need to send an AT command (AT+CPIN) containing the transceiver module’s common password, to correctly decrypt the data on the receiving device (e.g., AT+CPIN=92A0ECEC9000DA0DCF0CAAB0ABA2E0EF).

Receiving Data on IoT Gateway

The Raspberry Pi will act as an IoT gateway, receiving the environmental sensor data from the IoT device, the Arduino. The Raspberry Pi will run a Python script, rasppi_lora_receiver.py, which will receive and decrypt the data payload, parse the sensor values, and display the values in the terminal. The script uses the pyserial, the Python Serial Port Extension. This Python module encapsulates the access for the serial port.

import logging
import time
from argparse import ArgumentParser
from datetime import datetime
import serial
from colr import color as colr
# LoRaWAN IoT Sensor Demo
# Using REYAX RYLR896 transceiver modules
# Author: Gary Stafford
# Requirements: python3 -m pip install --user -r requirements.txt
# To Run: python3 ./rasppi_lora_receiver.py --tty /dev/ttyAMA0 --baud-rate 115200
# constants
ADDRESS = 116
NETWORK_ID = 6
PASSWORD = "92A0ECEC9000DA0DCF0CAAB0ABA2E0EF"
def main():
logging.basicConfig(filename='output.log', filemode='w', level=logging.DEBUG)
args = get_args() # get args
payload = ""
print("Connecting to REYAX RYLR896 transceiver module…")
serial_conn = serial.Serial(
port=args.tty,
baudrate=int(args.baud_rate),
timeout=5,
parity=serial.PARITY_NONE,
stopbits=serial.STOPBITS_ONE,
bytesize=serial.EIGHTBITS
)
if serial_conn.isOpen():
set_lora_config(serial_conn)
check_lora_config(serial_conn)
while True:
serial_payload = serial_conn.readline() # read data from serial port
if len(serial_payload) > 0:
try:
payload = serial_payload.decode(encoding="utf-8")
except UnicodeDecodeError: # receiving corrupt data?
logging.error("UnicodeDecodeError: {}".format(serial_payload))
payload = payload[:-2]  # strip the trailing '\r\n'
try:
data = parse_payload(payload)
print("\n———-")
print("Timestamp: {}".format(datetime.now()))
print("Payload: {}".format(payload))
print("Sensor Data: {}".format(data))
display_temperature(data[0])
display_humidity(data[1])
display_pressure(data[2])
display_color(data[3], data[4], data[5], data[6])
except IndexError:
logging.error("IndexError: {}".format(payload))
except ValueError:
logging.error("ValueError: {}".format(payload))
# time.sleep(2) # transmission frequency set on IoT device
def eight_bit_color(value):
return int(round(value / (4097 / 255), 0))
def celsius_to_fahrenheit(value):
return (value * 1.8) + 32
def display_color(r, g, b, a):
print("12-bit Color values (r,g,b,a): {},{},{},{}".format(r, g, b, a))
r = eight_bit_color(r)
g = eight_bit_color(g)
b = eight_bit_color(b)
a = eight_bit_color(a) # ambient light intensity
print(" 8-bit Color values (r,g,b,a): {},{},{},{}".format(r, g, b, a))
print("RGB Color")
print(colr("\t\t", fore=(127, 127, 127), back=(r, g, b)))
print("Light Intensity")
print(colr("\t\t", fore=(127, 127, 127), back=(a, a, a)))
def display_pressure(value):
print("Barometric Pressure: {} kPa".format(round(value, 2)))
def display_humidity(value):
print("Humidity: {}%".format(round(value, 2)))
def display_temperature(value):
temperature = celsius_to_fahrenheit(value)
print("Temperature: {}°F".format(round(temperature, 2)))
def get_args():
arg_parser = ArgumentParser(description="BLE IoT Sensor Demo")
arg_parser.add_argument("–tty", required=True, help="serial tty", default="/dev/ttyAMA0")
arg_parser.add_argument("–baud-rate", required=True, help="serial baud rate", default=1152000)
args = arg_parser.parse_args()
return args
def parse_payload(payload):
# input: +RCV=116,29,23.94|37.71|99.89|16|38|53|80,-61,56
# output: [23.94, 37.71, 99.89, 16.0, 38.0, 53.0, 80.0]
payload = payload.split(",")
payload = payload[2].split("|")
payload = [float(i) for i in payload]
return payload
def set_lora_config(serial_conn):
# configures the REYAX RYLR896 transceiver module
serial_conn.write(str.encode("AT+ADDRESS=" + str(ADDRESS) + "\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Address set?", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+NETWORKID=" + str(NETWORK_ID) + "\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Network Id set?", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+CPIN=" + PASSWORD + "\r\n"))
time.sleep(1)
serial_payload = (serial_conn.readline())[:-2]
print("AES-128 password set?", serial_payload.decode(encoding="utf-8"))
def check_lora_config(serial_conn):
serial_conn.write(str.encode("AT?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Module responding?", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+ADDRESS?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Address:", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+NETWORKID?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Network id:", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+IPR?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("UART baud rate:", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+BAND?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("RF frequency", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+CRFOP?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("RF output power", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+MODE?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("Work mode", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+PARAMETER?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("RF parameters", serial_payload.decode(encoding="utf-8"))
serial_conn.write(str.encode("AT+CPIN?\r\n"))
serial_payload = (serial_conn.readline())[:-2]
print("AES128 password of the network",
serial_payload.decode(encoding="utf-8"))
if __name__ == "__main__":
main()

Prior to running the Python script, we can test and debug the connection from the Arduino Sense to the Raspberry Pi using a serial terminal application such as Minicom. Minicom is a text-based modem control and terminal emulator program. To install Minicom on the Raspberry Pi, use the following command.

sudo apt-get install minicom

To run Minicom or the Python script, we will need to know the virtual console of the serial connection (Serial1 in the script) used to communicate with the RYLR896 module, wired to the Raspberry Pi. This can be found using the following command.

dmesg | grep -E --color 'serial|tty'

Search for a line, similar to the last line, shown below. Note the name of the virtual console, in my case, ttyAMA0.

[    0.000000] Kernel command line: coherent_pool=1M bcm2708_fb.fbwidth=656 bcm2708_fb.fbheight=416 bcm2708_fb.fbswap=1 vc_mem.mem_base=0x1ec00000 vc_mem.mem_size=0x20000000  dwc_otg.lpm_enable=0 console=tty1 root=PARTUUID=509d1565-02 rootfstype=ext4 elevator=deadline fsck.repair=yes rootwait quiet splash plymouth.ignore-serial-consoles
[    0.000637] console [tty1] enabled
[    0.863147] uart-pl011 20201000.serial: cts_event_workaround enabled
[    0.863289] 20201000.serial: ttyAMA0 at MMIO 0x20201000 (irq = 81, base_baud = 0) is a PL011 rev2

To view the data received from the Arduino Sense, using Minicom, use the following command, substituting the virtual console value, found above.

minicom -b 115200 -o -D /dev/ttyAMA0

If successful, we should see output similar to the lower right terminal window. Data is being transmitted by the Arduino Sense and being received by the Raspberry Pi, via LoRaWAN. In the below example, the AES 128-bit data encryption is not enabled on the Arduino, yet. With encryption turned on the sensor data (the payload) would appear garbled.

lora_tx_rx

IoT Gateway Python Script

To run the Python script on the Raspberry Pi, use the following command, substituting the name of the virtual console (e.g., /dev/ttyAMA0).

python3 ./rasppi_lora_receiver.py \
  --tty /dev/ttyAMA0 --baud-rate 115200

The script starts by configuring the RYLR896 and outputting that configuration to the terminal. If successful, we should see the following informational output.

Connecting to REYAX RYLR896 transceiver module...

Address set? +OK
Network Id set? +OK
AES-128 password set? +OK
Module responding? +OK

Address: +ADDRESS=116
Firmware version: +VER=RYLR89C_V1.2.7
Network Id: +NETWORKID=6
UART baud rate: +IPR=115200
RF frequency +BAND=915000000
RF output power +CRFOP=15
Work mode +MODE=0
RF parameters +PARAMETER=12,7,1,4
AES-128 password of the network +CPIN=92A0ECEC9000DA0DCF0CAAB0ABA2E0EF

Once configured, the script will receive the data from the Arduino Sense, decrypt the data, parse the sensor values, and format and display the values within the terminal.

python_script_running

The following screen recording shows a parallel view of both the Arduino Serial Monitor (upper right window) and the Raspberry Pi’s terminal output (lower right window). The Raspberry Pi (receiver) receives data from the Arduino (transmitter). The Raspberry Pi successfully reads, decrypts, interprets, and displays the sensor data, including displaying color swatches for the RGB and light intensity sensor readings.

Conclusion

In this post, we explored the use of the LoRa and LoRaWAN protocols to transmit environmental sensor data from an IoT device to an IoT gateway. Given its low energy consumption, long-distance transmission capabilities, and well-developed protocols, LoRaWAN is an ideal long-range wireless protocol for IoT devices.

This blog represents my own viewpoints and not of my employer, Amazon Web Services.


BLE and GATT for IoT: Getting Started with Bluetooth Low Energy and the Generic Attribute Profile Specification for IoT

Introduction

According to Wikipedia, Bluetooth is a wireless technology standard used for exchanging data between fixed and mobile devices over short distances. Bluetooth Low Energy (Bluetooth LE or BLE) is a wireless personal area network (WPAN) technology designed and marketed by the Bluetooth Special Interest Group (Bluetooth SIG). According to the Bluetooth SIG, BLE is designed for very low power operation. BLE supports data rates from 125 Kb/s to 2 Mb/s, with multiple power levels from 1 milliwatt (mW) to 100 mW. Several key factors influence the effective range of a reliable Bluetooth connection, which can vary from a kilometer down to less than a meter. The newer generation Bluetooth 5 provides a theoretical 4x range improvement over Bluetooth 4.2, from approximately 200 feet (60 meters) to 800 feet (240 meters).

Wikipedia currently lists 36 definitions of Bluetooth profiles defined and adopted by the Bluetooth SIG, including the Generic Attribute Profile (GATT) Specification. According to the Bluetooth SIG, GATT is built on top of the Attribute Protocol (ATT) and establishes common operations and a framework for the data transported and stored by the ATT. GATT provides profile discovery and description services for the BLE protocol. It defines how ATT attributes are grouped together into sets to form services.

Given its low energy consumption and well-developed profiles, such as GATT, BLE is an ideal short-range wireless protocol for Internet of Things (IoT) devices, when compared to competing protocols, such as ZigBee, Bluetooth classic, and Wi-Fi. In this post, we will explore the use of BLE and the GATT specification to transmit environmental sensor data from an IoT Sensor to an IoT Gateway.

IoT Sensor

In this post, we will use an Arduino single-board microcontroller to serve as an IoT sensor, actually an array of sensors. The 3.3V AI-enabled Arduino Nano 33 BLE Sense board, released in August 2019, comes with the powerful nRF52840 processor from Nordic Semiconductors, a 32-bit ARM Cortex-M4 CPU running at 64 MHz, 1MB of CPU Flash Memory, 256KB of SRAM, and a NINA-B306 stand-alone Bluetooth 5 low energy module.

IMG_7016

The Sense also contains an impressive array of embedded sensors:

  • 9-axis Inertial Sensor (LSM9DS1): 3D digital linear acceleration sensor, a 3D digital
    angular rate sensor, and a 3D digital magnetic sensor
  • Humidity and Temperature Sensor (HTS221): Capacitive digital sensor for relative humidity and temperature
  • Barometric Sensor (LPS22HB): MEMS nano pressure sensor: 260–1260 hectopascal (hPa) absolute digital output barometer
  • Microphone (MP34DT05): MEMS audio sensor omnidirectional digital microphone
  • Gesture, Proximity, Light Color, and Light Intensity Sensor (APDS9960): Advanced Gesture detection, Proximity detection, Digital Ambient Light Sense (ALS), and Color Sense (RGBC).

The Sense is an excellent, low-cost single-board microcontroller for learning about collecting and transmitting IoT sensor data.

IoT Gateway

An IoT Gateway, according to TechTarget, is a physical device or software program that serves as the connection point between the Cloud and controllers, sensors, and intelligent devices. All data moving to the Cloud, or vice versa goes through the gateway, which can be either a dedicated hardware appliance or software program.

ble_aws_iot_2

In this post, we will use a recent Raspberry Pi 3 Model B+ single-board computer (SBC), to serve as an IoT Gateway. This Raspberry Pi model features a 1.4GHz Cortex-A53 (ARMv8) 64-bit quad-core processor System on a Chip (SoC), 1GB LPDDR2 SDRAM, dual-band wireless LAN, Bluetooth 4.2 BLE, and Gigabit Ethernet. To follow along with the post, you could substitute the Raspberry Pi for any Linux-based machine to run the included sample Python script.

IMG_7025

The Arduino will transmit IoT sensor telemetry, over BLE, to the Raspberry Pi. The Raspberry Pi, using Wi-Fi or Ethernet, is then able to securely transmit the sensor telemetry data to the Cloud. In Bluetooth terminology, the Bluetooth Peripheral device (aka GATT Server), which is the Arduino, will transmit data to the Bluetooth Central device (aka GATT Client), which is the Raspberry Pi.


Arduino Sketch

For those not familiar with Arduino, a sketch is the name that Arduino uses for a program. It is the unit of code that is uploaded into non-volatile flash memory and runs on an Arduino board. The Arduino language is a set of C/C++ functions. All standard C and C++ constructs supported by the avr-g++ compiler should work in Arduino.

For this post, the sketch, combo_sensor_ble.ino, contains all the code necessary to collect environmental sensor telemetry, including temperature, relative humidity, barometric pressure, and ambient light and RGB color. All code for this post, including the sketch, can be found on GitHub.

The sensor telemetry will be advertised by the Sense, over BLE, as a GATT Environmental Sensing Service (GATT Assigned Number 0x181A) with multiple GATT Characteristics. Each Characteristic represents a sensor reading and contains the most current sensor value(s), for example, Temperature (0x2A6E) or Humidity (0x2A6F).

Each GATT Characteristic defines how the data should be represented. To represent the data accurately, the sensor readings need to be modified. For example, using the Arduino_HTS221 library, the temperature is captured with two decimal places of precision (e.g., 22.21 °C). However, the Temperature GATT Characteristic (0x2A6E) requires a signed 16-bit value (-32,768–32,767). To maintain precision, the captured value (e.g., 22.21 °C) is multiplied by 100 to convert it to an integer (e.g., 2221). The Raspberry Pi will then handle converting the value back to the original value with the correct precision.
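As a quick illustration (a minimal sketch, not part of the project code), the round trip for the example temperature looks like this in Python:

celsius = 22.21              # reading captured by the HTS221 sensor
encoded = int(celsius * 100) # 2221, fits within the signed 16-bit characteristic value
decoded = encoded / 100      # 22.21, recovered later on the IoT Gateway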

The GATT specification has no current predefined Characteristic representing ambient light and RGB color. Therefore, I have created a custom Characteristic for the color values and assigned it a universally unique identifier (UUID).

According to the documentation, ambient light and RGB color are captured as 16-bit values (a range of 0–65,535). However, using the Arduino_APDS9960 library, I have found the scale of the readings to be within a range of 0–4097. Without diving into the weeds, the maximum count (or saturation) value is variable. It can be calculated based upon the integration time and the size of the count register (e.g., 16-bits). The ADC integration time appears to be set to 10 ms in the library's file, Arduino_APDS9960.cpp.

RGB values are typically represented as 8-bit color. We could convert the values to 8-bit before sending or handle it later on the Raspberry Pi IoT Gateway. For the sake of demonstration, at the expense of data transfer efficiency, the sketch concatenates the 12-bit values together as a string (e.g., 4097,2811,1500,4097). The string will be converted from 12-bit to 8-bit on the Raspberry Pi (e.g., 255,175,93,255).
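The scaling itself is a one-liner; here is a minimal Python sketch of the conversion the gateway script (shown later in this post) performs on the example reading:

reading = "4097,2811,1500,4097"  # concatenated 12-bit values from the Arduino
values = [round(int(v) / (4097 / 255)) for v in reading.split(",")]
print(values)  # [255, 175, 93, 255]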

/*
Description: Transmits Arduino Nano 33 BLE Sense sensor readings over BLE,
including temperature, humidity, barometric pressure, and color,
using the Bluetooth Generic Attribute Profile (GATT) Specification
Author: Gary Stafford
Reference: Source code adapted from `Nano 33 BLE Sense Getting Started`
Adapted from Arduino BatteryMonitor example by Peter Milne
*/
/*
Generic Attribute Profile (GATT) Specifications
GATT Service: Environmental Sensing Service (ESS) Characteristics
Temperature
sint16 (decimal exponent -2)
Unit is in degrees Celsius with a resolution of 0.01 degrees Celsius
https://www.bluetooth.com/xml-viewer/?src=https://www.bluetooth.com/wp-content/uploads/Sitecore-Media-Library/Gatt/Xml/Characteristics/org.bluetooth.characteristic.temperature.xml
Humidity
uint16 (decimal exponent -2)
Unit is in percent with a resolution of 0.01 percent
https://www.bluetooth.com/xml-viewer/?src=https://www.bluetooth.com/wp-content/uploads/Sitecore-Media-Library/Gatt/Xml/Characteristics/org.bluetooth.characteristic.humidity.xml
Barometric Pressure
uint32 (decimal exponent -1)
Unit is in pascals with a resolution of 0.1 Pa
https://www.bluetooth.com/xml-viewer/?src=https://www.bluetooth.com/wp-content/uploads/Sitecore-Media-Library/Gatt/Xml/Characteristics/org.bluetooth.characteristic.pressure.xml
*/
#include <ArduinoBLE.h>
#include <Arduino_HTS221.h>
#include <Arduino_LPS22HB.h>
#include <Arduino_APDS9960.h>

const int UPDATE_FREQUENCY = 2000;     // Update frequency in ms
const float CALIBRATION_FACTOR = -4.0; // Temperature calibration factor (Celsius)

int previousTemperature = 0;
unsigned int previousHumidity = 0;
unsigned int previousPressure = 0;
String previousColor = "";
int r, g, b, a;
long previousMillis = 0; // last time readings were checked, in ms

BLEService environmentService("181A"); // Standard Environmental Sensing service

BLEIntCharacteristic tempCharacteristic("2A6E",             // Standard 16-bit Temperature characteristic
        BLERead | BLENotify);                               // Remote clients can read and get updates

BLEUnsignedIntCharacteristic humidCharacteristic("2A6F",    // Unsigned 16-bit Humidity characteristic
        BLERead | BLENotify);

BLEUnsignedIntCharacteristic pressureCharacteristic("2A6D", // Unsigned 32-bit Pressure characteristic
        BLERead | BLENotify);                               // Remote clients can read and get updates

BLECharacteristic colorCharacteristic("936b6a25-e503-4f7c-9349-bcc76c22b8c3", // Custom Characteristic
        BLERead | BLENotify, 24);                           // 1234,5678,

BLEDescriptor colorLabelDescriptor("2901", "16-bit ints: r, g, b, a");

void setup() {
    Serial.begin(9600); // Initialize serial communication
    // while (!Serial); // only when connected to laptop

    if (!HTS.begin()) { // Initialize HTS221 sensor
        Serial.println("Failed to initialize humidity temperature sensor!");
        while (1);
    }

    if (!BARO.begin()) { // Initialize LPS22HB sensor
        Serial.println("Failed to initialize pressure sensor!");
        while (1);
    }

    // Avoid bad readings to start bug
    // https://forum.arduino.cc/index.php?topic=660360.0
    BARO.readPressure();
    delay(1000);

    if (!APDS.begin()) { // Initialize APDS9960 sensor
        Serial.println("Failed to initialize color sensor!");
        while (1);
    }

    pinMode(LED_BUILTIN, OUTPUT); // Initialize the built-in LED pin

    if (!BLE.begin()) { // Initialize NINA B306 BLE
        Serial.println("starting BLE failed!");
        while (1);
    }

    BLE.setLocalName("ArduinoNano33BLESense");    // Set name for connection
    BLE.setAdvertisedService(environmentService); // Advertise environment service

    environmentService.addCharacteristic(tempCharacteristic);     // Add temperature characteristic
    environmentService.addCharacteristic(humidCharacteristic);    // Add humidity characteristic
    environmentService.addCharacteristic(pressureCharacteristic); // Add pressure characteristic
    environmentService.addCharacteristic(colorCharacteristic);    // Add color characteristic
    colorCharacteristic.addDescriptor(colorLabelDescriptor);      // Add color characteristic descriptor

    BLE.addService(environmentService); // Add environment service

    tempCharacteristic.setValue(0);     // Set initial temperature value
    humidCharacteristic.setValue(0);    // Set initial humidity value
    pressureCharacteristic.setValue(0); // Set initial pressure value
    colorCharacteristic.setValue("");   // Set initial color value

    BLE.advertise(); // Start advertising

    Serial.print("Peripheral device MAC: ");
    Serial.println(BLE.address());
    Serial.println("Waiting for connections…");
}

void loop() {
    BLEDevice central = BLE.central(); // Wait for a BLE central to connect

    // If central is connected to peripheral
    if (central) {
        Serial.print("Connected to central MAC: ");
        Serial.println(central.address()); // Central's BT address

        digitalWrite(LED_BUILTIN, HIGH); // Turn on the LED to indicate the connection

        while (central.connected()) {
            long currentMillis = millis();
            // After UPDATE_FREQUENCY ms have passed, check temperature & humidity
            if (currentMillis - previousMillis >= UPDATE_FREQUENCY) {
                previousMillis = currentMillis;
                updateReadings();
            }
        }

        digitalWrite(LED_BUILTIN, LOW); // When the central disconnects, turn off the LED
        Serial.print("Disconnected from central MAC: ");
        Serial.println(central.address());
    }
}

int getTemperature(float calibration) {
    // Get calibrated temperature as signed 16-bit int for BLE characteristic
    return (int) (HTS.readTemperature() * 100) + (int) (calibration * 100);
}

unsigned int getHumidity() {
    // Get humidity as unsigned 16-bit int for BLE characteristic
    return (unsigned int) (HTS.readHumidity() * 100);
}

unsigned int getPressure() {
    // Get pressure as unsigned 32-bit int for BLE characteristic (kPa -> Pa -> 0.1 Pa resolution)
    return (unsigned int) (BARO.readPressure() * 1000 * 10);
}

void getColor() {
    // Check if a color reading is available
    while (!APDS.colorAvailable()) {
        delay(5);
    }
    // Get color as (4) unsigned 16-bit ints
    int tmp_r, tmp_g, tmp_b, tmp_a;
    APDS.readColor(tmp_r, tmp_g, tmp_b, tmp_a);
    r = tmp_r;
    g = tmp_g;
    b = tmp_b;
    a = tmp_a;
}

void updateReadings() {
    int temperature = getTemperature(CALIBRATION_FACTOR);
    unsigned int humidity = getHumidity();
    unsigned int pressure = getPressure();
    getColor();

    if (temperature != previousTemperature) { // If reading has changed
        Serial.print("Temperature: ");
        Serial.println(temperature);
        tempCharacteristic.writeValue(temperature); // Update characteristic
        previousTemperature = temperature;          // Save value
    }

    if (humidity != previousHumidity) { // If reading has changed
        Serial.print("Humidity: ");
        Serial.println(humidity);
        humidCharacteristic.writeValue(humidity);
        previousHumidity = humidity;
    }

    if (pressure != previousPressure) { // If reading has changed
        Serial.print("Pressure: ");
        Serial.println(pressure);
        pressureCharacteristic.writeValue(pressure);
        previousPressure = pressure;
    }

    // Get color reading every time
    // e.g., "12345,45678,89012,23456"
    String stringColor = "";
    stringColor += r;
    stringColor += ",";
    stringColor += g;
    stringColor += ",";
    stringColor += b;
    stringColor += ",";
    stringColor += a;

    if (stringColor != previousColor) { // If reading has changed
        byte bytes[stringColor.length() + 1];
        stringColor.getBytes(bytes, stringColor.length() + 1);
        Serial.print("r, g, b, a: ");
        Serial.println(stringColor);
        colorCharacteristic.writeValue(bytes, sizeof(bytes));
        previousColor = stringColor;
    }
}


Previewing and Debugging BLE Device Services

Before looking at the code running on the Raspberry Pi, we can use any number of mobile applications to preview and debug the Environmental Sensing service running on the Arduino and being advertised over BLE. A commonly recommended application is Nordic Semiconductor’s nRF Connect for Mobile, available on Google Play. I have found the Android version works better at correctly interpreting and displaying GATT Characteristic values than the iOS version of the app.

Below, we see a scan of my local vicinity for BLE devices being advertised, using the Android version of the nRF Connect mobile application. Note the BLE device, ArduinoNano33BLESense (indicated in red). Also, note the media access control address (MAC address) of that BLE device, in my case, d1:aa:89:0c:ee:82. The MAC address will be required later on the IoT Gateway.


Connecting to the device, we see three Services. The Environmental Sensing Service (indicated in red) contains the sensor readings.


Drilling down into the Environmental Sensing Service (0x181A), we see the four expected Characteristics: Temperature (0x2A6E), Humidity (0x2A6F), Pressure (0x2A6D), and Unknown Characteristic (936b6a25-e503-4f7c-9349-bcc76c22b8c3). Since nRF Connect cannot recognize the color sensor reading as a registered GATT Characteristic (no GATT Assigned Number), it is displayed as an Unknown Characteristic. Whereas the temperature, humidity, and pressure values (indicated in red) are interpreted and displayed correctly, the color sensor reading is left as raw hexadecimal text (e.g., 30-2c-30-2c-30-2c-30-00 or 0,0,0,0).


These results indicate everything is working as expected.

BLE Client Python Code

To act as the BLE Client (aka central device), the Raspberry Pi runs a Python script. The script, rasppi_ble_receiver.py, uses the bluepy Python module for interfacing with BLE devices through BlueZ, on Linux.

import sys
import time
from argparse import ArgumentParser

from bluepy import btle  # linux only (no mac)
from colr import color as colr

# BLE IoT Sensor Demo
# Author: Gary Stafford
# Reference: https://elinux.org/RPi_Bluetooth_LE
# Requirements: python3 -m pip install --user -r requirements.txt
# To Run: python3 ./rasppi_ble_receiver.py d1:aa:89:0c:ee:82 <- MAC address - change me!


def main():
    # get args
    args = get_args()

    print("Connecting…")
    nano_sense = btle.Peripheral(args.mac_address)

    print("Discovering Services…")
    _ = nano_sense.services
    environmental_sensing_service = nano_sense.getServiceByUUID("181A")

    print("Discovering Characteristics…")
    _ = environmental_sensing_service.getCharacteristics()

    while True:
        print("\n")
        read_temperature(environmental_sensing_service)
        read_humidity(environmental_sensing_service)
        read_pressure(environmental_sensing_service)
        read_color(environmental_sensing_service)
        # time.sleep(2) # transmission frequency set on IoT device


def byte_array_to_int(value):
    # Raw data is hexstring of int values, as a series of bytes, in little endian byte order
    # values are converted from bytes -> bytearray -> int
    # e.g., b'\xb8\x08\x00\x00' -> bytearray(b'\xb8\x08\x00\x00') -> 2232
    # print(f"{sys._getframe().f_code.co_name}: {value}")
    value = bytearray(value)
    value = int.from_bytes(value, byteorder="little")
    return value


def split_color_str_to_array(value):
    # e.g., b'2660,2059,1787,4097\x00' -> 2660,2059,1787,4097 ->
    # [2660, 2059, 1787, 4097] -> 166.0,128.0,111.0,255.0
    # print(f"{sys._getframe().f_code.co_name}: {value}")

    # remove extra null character on end ('\x00')
    value = value[0:-1]

    # split r, g, b, a values into array of 16-bit ints
    values = list(map(int, value.split(",")))

    # convert from 16-bit ints (2^16 or 0-65535) to 8-bit ints (2^8 or 0-255)
    # values[:] = [int(v) % 256 for v in values]

    # actual sensor readings are from 0 to 4097
    print(f"12-bit Color values (r,g,b,a): {values}")
    values[:] = [round(int(v) / (4097 / 255), 0) for v in values]
    return values


def byte_array_to_char(value):
    # e.g., b'2660,2059,1787,4097\x00' -> 2660,2059,1787,4097
    value = value.decode("utf-8")
    return value


def decimal_exponent_two(value):
    # e.g., 2350 -> 23.5
    return value / 100


def decimal_exponent_one(value):
    # e.g., 988343 -> 98834.3
    return value / 10


def pascals_to_kilopascals(value):
    # 1 Kilopascal (kPa) is equal to 1000 pascals (Pa)
    # to convert pascals to kPa, divide the pascal value by 1000
    # 98834.3 -> 98.8343
    return value / 1000


def celsius_to_fahrenheit(value):
    return (value * 1.8) + 32


def read_color(service):
    color_char = service.getCharacteristics("936b6a25-e503-4f7c-9349-bcc76c22b8c3")[0]
    color = color_char.read()
    color = byte_array_to_char(color)
    color = split_color_str_to_array(color)
    print(f" 8-bit Color values (r,g,b,a): {color[0]},{color[1]},{color[2]},{color[3]}")

    print("RGB Color")
    print(colr('\t\t', fore=(127, 127, 127), back=(color[0], color[1], color[2])))

    print("Light Intensity")
    print(colr('\t\t', fore=(127, 127, 127), back=(color[3], color[3], color[3])))


def read_pressure(service):
    pressure_char = service.getCharacteristics("2A6D")[0]
    pressure = pressure_char.read()
    pressure = byte_array_to_int(pressure)
    pressure = decimal_exponent_one(pressure)
    pressure = pascals_to_kilopascals(pressure)
    print(f"Barometric Pressure: {round(pressure, 2)} kPa")


def read_humidity(service):
    humidity_char = service.getCharacteristics("2A6F")[0]
    humidity = humidity_char.read()
    humidity = byte_array_to_int(humidity)
    humidity = decimal_exponent_two(humidity)
    print(f"Humidity: {round(humidity, 2)}%")


def read_temperature(service):
    temperature_char = service.getCharacteristics("2A6E")[0]
    temperature = temperature_char.read()
    temperature = byte_array_to_int(temperature)
    temperature = decimal_exponent_two(temperature)
    temperature = celsius_to_fahrenheit(temperature)
    print(f"Temperature: {round(temperature, 2)}°F")


def get_args():
    arg_parser = ArgumentParser(description="BLE IoT Sensor Demo")
    arg_parser.add_argument('mac_address', help="MAC address of device to connect")
    args = arg_parser.parse_args()
    return args


if __name__ == "__main__":
    main()

python3 ./rasppi_ble_receiver.py d1:aa:89:0c:ee:82

Unlike the nRF Connect app, the bluepy Python module is not capable of correctly interpreting and displaying the GATT Characteristic values. Hence, the script takes the raw, incoming hexadecimal text from the Arduino and coerces it to the correct values. For example, a temperature reading must be transformed from bytes, b'\xb8\x08\x00\x00', to a byte array, bytearray(b'\xb8\x08\x00\x00'), then to an integer, 2232, then to a decimal, 22.32, and finally to the Fahrenheit scale, 72.18°F.
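For reference, here is that same transformation chain as a minimal Python sketch, simply mirroring the helper functions in the script above:

raw = b'\xb8\x08\x00\x00'                                    # value read from the 0x2A6E characteristic
as_int = int.from_bytes(bytearray(raw), byteorder="little")  # 2232
celsius = as_int / 100                                       # 22.32 (decimal exponent of -2)
fahrenheit = (celsius * 1.8) + 32                            # 72.18
print(round(fahrenheit, 2))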

Sensor readings are retrieved from the BLE device every two seconds. In addition to displaying the numeric sensor readings, the Python script also displays a color swatch of the 8-bit RGB color, as well as a grayscale swatch representing the light intensity using the colr Python module.


The following screen recording shows a parallel view of both the Arduino Serial Monitor and the Raspberry Pi’s terminal output. The Raspberry Pi (central device) connects to the Arduino (peripheral device) when the Python script is started. The Raspberry Pi successfully reads and interprets the telemetry data from the Environmental Sensing Service.

Conclusion

In this post, we explored the use of BLE and the GATT specification to transmit environmental sensor data from a peripheral device to a central device. Given its low energy consumption and well-developed profiles, such as GATT, Bluetooth Low Energy (BLE) is an ideal short-range wireless protocol for IoT devices.

This blog represents my own viewpoints and not of my employer, Amazon Web Services.


Prevent Motion From Running Without a Camera Connected

Introduction

If you read my post, Raspberry Pi-Powered Dashboard Video Camera Using Motion and FFmpeg, you know Motion with FFmpeg on a Raspberry Pi makes an ideal dashboard camera system. However, an issue I still struggled with when using the dash-cam was Motion running without a webcam connected.

When I start my car, the Raspberry Pi boots up, and subsequently, Motion starts. No interaction with the Pi is required. The dash-cam starts capturing images and making the time-lapse video. However, when I get home and plug my Pi back into my local network, Motion starts up again and starts recording blank images and creating the time-lapse video, even though there is no webcam connected.

To prevent Motion from starting up without a webcam connected, I've added a simple function to the Motion startup script. When the system calls Motion on startup, the new function checks if a webcam is connected. If not, it immediately exits the script, without ever starting Motion. No blank images or empty time-lapse videos are created. This saves a lot of wasted processing on the Pi. It also saves a lot of wasted time moving videos and images off the Pi that end up being blank, because no webcam was connected.

Find Your Webcam

First, attach your webcam to the Raspberry Pi. Run the following command to list the USB devices connected to the Pi:

lsusb

You should see similar output to the example below. Note your webcam’s ID(s). I ran the command twice in this example, to identify both of my webcams.

Identifying Webcams with lsusb Command

There are several ways to detect your webcam, depending on your Linux distro. I found this post particularly helpful, The Webcam HOWTO.

Modify Motion

Next, open the Motion startup script, using the following command:

sudo nano /etc/init.d/motion

Add the following ‘check_for_webcam ()’ function to the top of the script, adjacent to the existing ‘check_daemon_enabled()’ function:

# Check if specific webcam(s) are connected to Pi
check_for_webcam () {
    if lsusb | grep -s -q -e 0000:ABCD
    then
        echo "Webcam found. Continuing..."
        return 0
    else
        echo "No webcam found? Shutting down Motion!"
        return 1
    fi
}

You will need to modify the function, changing the string ‘0000:ABCD’, to match your webcam’s ID. If you change your webcam model, remember to update the ID you entered in this function.

Next, add the following statement to the beginning of the 'start' function. This code calls the new function when Motion's 'start' command is executed. If no webcam is found, the Motion script exits without starting.

if ! check_for_webcam; then
    exit 1
fi

In my example below, I have two possible webcams that might be connected, so I search (grep) for either ID.
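For instance, a version of the check that matches either of two webcam IDs (the IDs below are placeholders only; substitute your own) might look like this:

# Check if either of two webcams is connected to the Pi
check_for_webcam () {
    if lsusb | grep -s -q -e 0000:ABCD -e 1111:EF01
    then
        echo "Webcam found. Continuing..."
        return 0
    else
        echo "No webcam found? Shutting down Motion!"
        return 1
    fi
}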

Modifying Motion Startup Script

Testing the Script Change

Save and close the Motion script. To test the script is working, run the following command to stop Motion:

sudo /etc/init.d/motion stop

Unplug your webcam from the Raspberry Pi. Then, run the following command to start Motion:

sudo /etc/init.d/motion start

You should see the following output:

No webcam found? Shutting down Motion!

Now, plug your webcam back in and run the ‘start’ command, again. You should now see the following output:

Webcam found. Continuing...

Starting Motion With and Without Webcam

Conclusion

Now, when you start the Raspberry Pi and don't have a webcam connected, Motion will no longer automatically start. Just remember, if you don't have a way to interact directly with your Pi, you will need to restart the Pi to get Motion running again after connecting a webcam.


Travel-Size Wireless Router for Your Raspberry Pi

Use a low-cost nano-size wireless router to connect to your Raspberry Pi while traveling. Set up your own private wireless network in your vehicle, hotel, or coffee shop.

Introduction

Recently, I purchased a USB-powered wireless router to use with my Raspberry Pi when traveling. In an earlier post, Raspberry Pi-Powered Dashboard Video Camera Using Motion and FFmpeg, I discussed the use of the Raspberry Pi, combined with a webcam, Motion, and FFmpeg, to create a low-cost dashboard video camera. Like many, I find one of the big challenges with the Raspberry Pi is how to connect and interact with it. Being in my car, and usually out of range of my home's wireless network, except maybe in the garage, this becomes even more of an issue. That's where adding an inexpensive travel-size router to my vehicle comes in handy.

I chose the TP-LINK TL-WR702N Wireless N150 Travel Router, sold by Amazon. The TP-LINK router, described as 'nano size', measures only 2.2 inches square by 0.7 inches wide. It has several modes of operation, including as a router, access point, client, bridge, or repeater. It operates at wireless speeds up to 150Mbps and is compatible with IEEE 802.11b/g/n networks. It supports several common network security protocols, including WEP, WPA/WPA2, and WPA-PSK/WPA2-PSK encryption. For $22 USD, what more could you ask for?

TP-LINK Nano Router

My goal with the router was to do the following:

  1. Have the Raspberry Pi auto-connect to the new TP-LINK router’s wireless network when in range, just like my home network.
  2. Since I might still be in range of my home network, have the Raspberry Pi try to connect to the TP-LINK first, before falling back to my home network.
  3. Ensure the network was relatively secure, since I would be exposed to many more potential threats when traveling.

My vehicle has two power outlets. I plug my Raspberry Pi into one outlet and the router into the other. You could daisy chain the router off the Pi. However, my Pi's ports are in use by the USB wireless adapter and the USB webcam. Using the TP-LINK router, I can easily connect to the Raspberry Pi with my mobile phone or tablet, using an SSH client.

Using Fing to Locate the Pi on the TP-LINK Wireless Network

When I arrive at my destination, I log into the Pi and do a proper shutdown. This activates my shutdown script (see my last post), which moves the newly created Motion/FFmpeg time-lapse dash-cam videos to a secure folder on my Pi, before powering down.

Using SSH Terminal for iOS to Shutdown the Pi

Of course, there are many other uses for the router. For example, I can remove the Pi and router from my car and plug them back in at the hotel while traveling, or power the router from my laptop while at work or the coffee shop. I now have my own private wireless network wherever I am to use the Raspberry Pi, or work with other users. Remember, the TP-LINK can act as a router, access point, client, bridge, or repeater.

The Raspberry Pi and Router both fit in a Small Container for Travel

Network Security

Before configuring your Raspberry Pi, the first thing you should do is change all the default security-related settings for the router. Start with the default SSID and the PSK password. Both of these default values are printed right on the router. That's motivation enough to change them!

TP-LINK Administration Console 2

Additionally, change the default IP address of the router and the username and password for the browser-based Administration Console.

TP-LINK Administration Console

Lastly, pick the most secure protocol possible. I chose ‘WPA-PSK/WPA2-PSK’. All these changes are done through the TP-LINK’s browser-based Administration Console.

Configuring Multiple Wireless Networks

In an earlier post, Installing a Miniature WiFi Module on the Raspberry Pi (w/ Roaming Enabled), I detailed the installation and configuration of a Miniature WiFi Module, from Adafruit Industries, on a Pi running Soft-float Debian "wheezy". I normally connect my Pi to my home wireless network. I wanted to continue to do this in the house, but connect to the new router when traveling.

Based on the earlier post, I was already using Jouni Malinen’s wpa_supplicant, the WPA Supplicant for Linux, BSD, Mac OS X, and Windows with support for WPA and WPA2. This made network configuration relatively simple. If you use wpa_supplicant, your ‘/etc/network/interfaces’ file should look like the following. If you’re not familiar with configuring the interfaces file for wpa_supplicant, this post on NoWiresSecurity.com is a good starting point.

Interfaces File

Note that in this example, I am using DHCP for all wireless network connections. If you choose to use static IP addresses for any of the networks, you will have to change the interfaces file accordingly. Once you add multiple networks, configuring static IP addresses for each network becomes more complex. That is my next project…
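If it helps, here is a rough sketch of what such an interfaces file might look like, assuming the wireless adapter is wlan0, DHCP is used, and wpa_supplicant's roaming mode is enabled (your file may differ):

auto lo
iface lo inet loopback

iface eth0 inet dhcp

allow-hotplug wlan0
iface wlan0 inet manual
    wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf

# networks defined in wpa_supplicant.conf fall back to DHCP
iface default inet dhcp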

First, I generated a new pre-shared key (PSK) for the router’s SSID configuration using the following command. Substitute your own SSID (‘your_ssid’) and passphrase (‘your_passphrase’).

wpa_passphrase your_ssid your_passphrase

Based on your SSID and passphrase, this command will generate a pre-shared key (PSK), similar to the following. Save or copy the PSK to the clipboard. We will need the PSK in the next step.

Creating PSK 2
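The generated network block looks similar to the following (the SSID and PSK shown here are placeholders, not real values):

network={
        ssid="your_ssid"
        #psk="your_passphrase"
        psk=24f87a7583001eb1cd1e6abfec8c8c5d43f70a2b4a6b682bb98d0e9e05c4b28e
}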

Then, I modified my wpa_supplicant configuration file with the following command:

sudo nano /etc/wpa_supplicant/wpa_supplicant.conf

I added the second network configuration, similar to the existing configuration for my home wireless network, using the newly generated PSK. Below is an example of what mine looks like (of course, not the actual PSKs).

WPA Supplicant Configuration

Depending on your Raspberry Pi and router configurations, your wpa_supplicant configuration will look slightly different. You may wish to add more settings. Don’t consider my example the absolute right way for your networks.
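As a rough sketch only (placeholder SSIDs and PSKs, and assuming WPA2-PSK on both networks), a two-network wpa_supplicant.conf might look something like this:

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

# travel router - tried first (higher priority)
network={
        ssid="tplink_travel_ssid"
        psk=24f87a7583001eb1cd1e6abfec8c8c5d43f70a2b4a6b682bb98d0e9e05c4b28e
        key_mgmt=WPA-PSK
        priority=2
}

# home router - fallback (lower priority)
network={
        ssid="home_network_ssid"
        psk=5d43f70a2b4a6b682bb98d0e9e05c4b28e24f87a7583001eb1cd1e6abfec8c8c
        key_mgmt=WPA-PSK
        priority=1
}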

Wireless Network Priority

Note the priority of the TP-LINK router is set to 2, while my home NETGEAR router is set to 1. This ensures wpa_supplicant will attempt to connect to the TP-LINK network first, before attempting the home network. The higher number gets priority. The best resource I've found, which explains all the configuration options in detail, is here. In this example wpa_supplicant configuration file, priority is explained this way: 'by default, all networks will get same priority group (0). If some of the networks are more desirable, this field can be used to change the order in which wpa_supplicant goes through the networks when selecting a BSS. The priority groups will be iterated in decreasing priority (i.e., the larger the priority value, the sooner the network is matched against the scan results). Within each priority group, networks will be selected based on security policy, signal strength, etc.'

Conclusion

If you want an easy, inexpensive, secure way to connect to your Raspberry Pi, in the vehicle or other location, a travel-size wireless router is a great solution. Best of all, configuring it for your Raspberry Pi is simple if you use wpa_supplicant.


Using a Startup Script to Save Motion/FFmpeg Videos and Images on The Raspberry Pi

Use a start-up script to overcome limitations of Motion/FFmpeg and save multiple Raspberry Pi dashboard camera timelapse videos and images, automatically.


Introduction

In my last post, Raspberry Pi-Powered Dashboard Video Camera Using Motion and FFmpeg, I demonstrated how the Raspberry Pi can be used as a low-cost dashboard video camera. One of the challenges I faced in that post was how to save the timelapse videos and individual images (frames) created by Motion and FFmpeg when the Raspberry Pi is turned on and off. Each time the car starts, the Raspberry Pi boots up, and Motion begins to run, the previous images and video, stored in the default '/tmp/motion/' directory, are removed, and new images and video are created.

Take the average daily commute: we drive to and from work. Maybe we stop for a morning coffee, or stop at the store on the way home to pick up dinner. Maybe we use our car to go out for lunch. Our car starts, stops, starts, stops, starts, and stops. Our daily commute actually encompasses a series of small trips, and therefore multiple dash-cam timelapse videos. If you are only interested in keeping the latest timelapse video in case of an accident, then this may not be a problem. When the accident occurs, simply pull the SDHC card from the Raspberry Pi and copy the video and images off to your laptop.

However, if you are interested in capturing and preserving series of dash-cam videos, such as in the daily commute example above, then the default behavior of Motion is insufficient. To preserve each video segment or series of images, we need a way to preserve the content created by Motion and FFmpeg, before they are overwritten. In this post, I will present a solution to overcome this limitation.

The process involves the following steps:

  1. Change the default location where Motion stores timelapse videos and images to somewhere other than a temporary directory;
  2. Create a startup script that will move the video and images to a safe location when restarting the Pi;
  3. Configure the Pi’s Debian operating system to run this script at startup (and optionally shutdown), before Motion starts;

Sounds pretty simple. However, understanding how startup scripts work with Debian's Init program, making sure the new move script runs before Motion starts, and knowing how to move a huge number of files, all required forethought.

Change Motion’s Default Location for Video and Images

To start, change the default location where Motion stores timelapse video and images, from ‘/tmp/motion/’ to a location outside the ‘/tmp’ directory. I chose to create a directory called ‘/motiontmp’. Make sure you set the permissions on the new ‘/motiontmp’ directory, so Motion can write to it:

sudo chmod -R 777 /motiontmp

To have Motion use this location, we need to modify the Motion configuration file:

sudo nano /etc/motion/motion.conf

Change the following setting (in bold below). Note when Motion starts for the first time, it will create the ‘motion’ sub folder inside ‘motiontmp’. You do not have to create it yourself.

# Target base directory for pictures and films
# Recommended to use absolute path. (Default: current working directory)
target_dir /motiontmp/motion

Motion Target Directory

Create the Startup Script to Move Video and Images

Next, create the new shell script that will run at startup to move Motion's video and images. The script creates a timestamped folder in the new 'motiontmp' directory for each series of images and video. The script then moves all files from the 'motion' directory to the new timestamped directory. Before moving the files, the script deletes any zero-byte jpegs, which are images that did not fully process prior to the Raspberry Pi being shut off when the car stopped. To create the new script, run the following command.

sudo nano /etc/init.d/motionStartup.sh

Copy the following contents into the script and save it.

#! /bin/sh
# /etc/init.d/motionStartup
#
### BEGIN INIT INFO
# Provides:          motionStartup
# Required-Start:    $remote_fs $syslog
# Required-Stop:     $remote_fs $syslog
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Move motion files at startup.
# Description:       Move motion files at startup.
# X-Start-Before:    motion
### END INIT INFO

# Some things that run always
#touch /var/lock/motionStartup
logger -s "Script motionStartup called"

# Carry out specific functions when asked to by the system
case "$1" in
  start)
    logger -s "Script motionStartup started"
    TIMESTAMP=$(date +%Y%m%d%H%M%S | sed 's/ //g') # No spaces
    logger -s "Script motionStartup $TIMESTAMP"
    sudo mkdir /motiontmp/$TIMESTAMP || logger -s "Error mkdir start"
    find /motiontmp/motion/. -type f -size 0 -print0 -delete
    find /motiontmp/motion/. -maxdepth 1 -type f | \
        xargs -I '{}' sudo mv {} /motiontmp/$TIMESTAMP
    ;;
  stop)
    logger -s "Script motionStartup stopped"
    ;;
  *)
    echo "Usage: /etc/init.d/motionStartup {start|stop}"
    exit 1
    ;;
esac

exit 0

Note the ‘X-Start-Before’ setting at the top of the script (in bold). An explanation of this setting is found on the Debian Wiki website. According to the site, ”There is no such standard-defined header, but there is a proposed extension implemented in the insserv package (since version 1.09.0-8). Use the X-Start-Before and X-Stop-After headers proposed by SuSe.” To make sure you have a current version of ‘insserv‘, you can run the following command:

dpkg -l insserv

Also, note how the files are moved by the script:

find /motiontmp/motion/. -maxdepth 1 -type f | \
        xargs -I '{}' sudo mv {} /motiontmp/$TIMESTAMP

It's not as simple as using 'mv *.*' when you have a few thousand files. This will likely throw an 'Argument list too long' error. According to one Stack Overflow answer, the error occurs because bash actually expands the asterisk to every matching file, producing a very long command line. Using 'find' combined with 'xargs' gets around this problem. The 'xargs' command splits up the list and issues several commands if necessary. This issue applies to several commands, including rm, cp, and mv.

Lastly, note the use of the ‘logger‘ commands throughout the script. These are optional and may be removed. I like to log the script’s progress for troubleshooting purposes. Using the above ‘logger’ commands, I can easily pinpoint issues by looking at the log with grep, such as:

tail -500 /var/log/messages | grep 'motionStartup' | grep 'logger:'

View of Log with Script Messages

You can test the script by running the following command:

/etc/init.d/./motionStartup.sh start

You should see a series of three messages output to the screen by the script, confirming the script is working. Note the new timestamped folder created by the script, below.

Testing the New Script

Below is an example of how the directory structure should look after a few videos are created by Motion, and the Raspberry Pi cycled off and on. You need to complete the rest of the steps in this post for this to work automatically.


Shutdown Script?

I know, the name of the post clearly says 'Startup Script'. Well, a little tip: if you copy the code from the 'start' method and paste it into the 'stop' method, this now also works at shutdown, as sketched below. If you do a proper shutdown (like 'sudo reboot'), the Raspberry Pi's OS will call the script's 'stop' method. The 'start' method is more useful for us in the car, where we may not be able to do a proper shutdown; we just turn the car off and kill power to the Pi. However, if you are shutting down from a mobile device via ssh, or using a micro keyboard and LCD monitor, the script will do its work on the way down.
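As a sketch, the 'stop' case would simply mirror the 'start' case from the script above, for example:

  stop)
    logger -s "Script motionStartup stopped"
    TIMESTAMP=$(date +%Y%m%d%H%M%S | sed 's/ //g') # No spaces
    logger -s "Script motionStartup $TIMESTAMP"
    sudo mkdir /motiontmp/$TIMESTAMP || logger -s "Error mkdir stop"
    find /motiontmp/motion/. -type f -size 0 -print0 -delete
    find /motiontmp/motion/. -maxdepth 1 -type f | \
        xargs -I '{}' sudo mv {} /motiontmp/$TIMESTAMP
    ;;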

Configure Debian OS to Run the New Startup Script

To have our new script run on startup, install it by running the following command:

sudo update-rc.d motionStartup.sh defaults

A full explanation of this command is too complex for this brief post. A good overview of creating startup scripts and installing them in Debian is found on the Debian Administration website. This is the source I used to start to understand runlevels. There are also a few links at the end of the post. To tell which runlevel (state) you are running at, use the following command:

runlevel

To make sure the startup script was installed properly, run the following command. This will display the contents of each 'rc*.d' folder. Each folder corresponds to a runlevel: 0, 1, 2, etc. Each folder contains symbolic links to the actual scripts. The links are named in order of execution (S01…, S02…, S03…):

ls /etc/rc*.d

Look for the new script listed under the appropriate runlevel(s). The new script should be listed before ‘motion’, as shown below.

View of Runlevels

View of Runlevels 2

If for any reason you need to uninstall the new script (not delete/remove it), run the following command. This is not a common task, but it is necessary if you need to change the order of execution of the scripts or rename a script.

sudo update-rc.d -f motionStartup.sh remove

Copy and Remove Files from the Raspberry Pi

Once the startup script is working and we are capturing images and timelapse video, the next thing we will probably want to do is copy files off the Raspberry Pi. To do this over your WiFi network, use a 'scp' command from a remote machine. The command below copies all directories starting with '2013', and their contents, to a remote machine, preserving the directory structure.

scp -rp user@ip_address_of_pi:/motiontmp/2013* ~/local_directory/

Maybe you just want all the timelapse videos Motion/FFmpeg creates; you don't care about the images. The following command copies just the MPEG videos from all '2013' folders to a single directory on your remote machine. The directory structure is ignored during the copy. This is the quickest way to just store all the videos.

scp -rp user@ip_address_of_pi:/motiontmp/2013*/*.mpg ~/local_directory/

If you are going to save all the MPEG timelapse videos in one location, I recommend changing the naming convention of the videos in the motion.conf file. I have added the hour, minute, and seconds to mine. This will ensure the names don’t conflict when moved to a common directory:

# File path for timelapse mpegs relative to target_dir
# Default: %Y%m%d-timelapse
# Default value is near equivalent to legacy oldlayout option
# For Motion 3.0 compatible mode choose: %Y/%m/%d-timelapse
# File extension .mpg is automatically added so do not include this
timelapse_filename %Y%m%d%H%M%S-timelapse

To remove all the videos and images once they have been moved off the Pi and are no longer needed, you can run an rm command. Using the '-rf' options makes sure the directories and their contents are removed.

sudo rm -rf /motiontmp/2013*

Conclusion

The only issue I have yet to overcome is maintaining the current time on the Raspberry Pi. The Pi lacks a Real Time Clock (RTC). Therefore, turning the Pi on and off in the car causes it to lose the current time. Since the Pi is not always on a WiFi network, it can't sync to the current time when restarted. The only side-effects I've seen so far caused by this are that the videos occasionally contain more than one driving event, and the time displayed in the videos is not always correct. Otherwise, the process works pretty well.

Resources

The following are some useful resources on this topic:

Debian Reference: Chapter 3. The system initialization

How-To: Managing services with update-rc.d

Files and scripts that execute on boot

Making scripts run at boot time with Debian

Finding all files and move to new directory from shell prompt

Shell scripting: Write message to a syslog / log file

“Argument list too long”: Beyond Arguments and Limitations

Linux / Unix Command: date (used for TIMESTAMP)

Time / Date Commands



Raspberry Pi-Powered Dashboard Video Camera Using Motion and FFmpeg

Demonstrate the use of the Raspberry Pi and a basic webcam, along with Motion and FFmpeg, to build a low-cost dashboard video camera for your daily commute.


Dashboard Video Cameras

Most of us remember the proliferation of dashboard camera videos of the February 2013 meteor racing across the skies of Russia. This rare astronomical event was captured on many Russian motorists' dashboard cameras. Due to the dangerous driving conditions in Russia, many drivers rely on dashboard cameras for insurance and legal purposes. In the United States, we are more used to seeing dashboard cameras used by law enforcement. Who hasn't seen those thrilling police videos of car crashes, drunk drivers, and traffic stops gone wrong?

Although driving in the United States is not as dangerous as in Russia, there is no reason we can't also use dashboard cameras. In case you are involved in an accident, you will have a video record of the event for your insurance company. If you witness an accident or other dangerous situation, your video may help law enforcement and other emergency responders. Maybe you just want to record a video diary of your next road trip.

A wide variety of dashboard video cameras, available for civilian vehicles, can be seen on Amazon's website. They range in price and quality from less than $50 USD to well over $300 USD, depending on their features. In a popular earlier post, Remote Motion-Activated Web-Based Surveillance with Raspberry Pi, I demonstrated the use of the Raspberry Pi and a webcam, along with Motion and FFmpeg, to provide low-cost web-based, remote surveillance. There are many other uses for this combination of hardware and software, including as a dashboard video camera.

Methods for Creating Dashboard Camera Videos

I've found two methods for capturing dashboard camera videos. The first and easiest method involves configuring Motion to use FFmpeg to create a video. FFmpeg creates a video from individual images (frames) taken at regular intervals while driving. The upside of the FFmpeg option is that it gives you a quick, ready-made video. The downside of the FFmpeg option is your inability to fully control the high level of video compression and the high frame rate (fps). This makes it hard to discern fine details when viewing the video.

Alternately, you can capture individual JPEG images and combine them using FFmpeg from the command line or using third-party movie-editing tools. The advantage of combining the images yourself is that you have more control over the quality and frame rate of the video. Altering the frame rate alters your perception of the speed of the vehicle recording the video. The only disadvantage of combining the images yourself is the extra steps involved to process the images into a video.

At one frame every two seconds (.5 fps), a 30 minute commute to work will generate 30 frames/minute x 30 minutes, or 900 jpeg images. At 640 x 480 pixels, depending on your jpeg compression ratio, that's a lot of data to move around and crunch into a video. If you just want a basic record of your travels, use FFmpeg. If you want a higher-quality record of the trip, maybe for a video diary, combining the frames yourself is a better way to go.

Configuring Motion for a Dashboard Camera

The installation and setup of FFmpeg and Motion are covered in my earlier post, so I won't repeat that here. Below are several Motion settings I recommend starting with for use with a dashboard video camera. To configure Motion, open its configuration file by entering the following command on your Raspberry Pi:

sudo nano /etc/motion/motion.conf

To use FFmpeg, the first method, find the ‘FFMPEG related options’ section of the configuration and locate ‘Use ffmpeg to encode a timelapse movie’. Enter a number for the ‘ffmpeg_timelapse’ setting. This is the rate at which images are captured and combined into a video. I suggest starting with 2 seconds. With a dashboard camera, you are trying to record important events as you drive. In as little as 2-3 seconds at 55 mph, you can miss a lot of action. Moving the setting down to 1 second will give more detail, but you will chew up a lot of disk space, if that is an issue for you. I would experiment with different values:

# Use ffmpeg to encode a timelapse movie
# Default value 0 = off - else save frame every Nth second
ffmpeg_timelapse 2

To use the ‘do-it-yourself’ FFmpeg method, locate the ‘Snapshots’ section. Find ‘Make automated snapshot every N seconds (default: 0 = disabled)’. Change the ‘snapshot_interval’ setting, using the same logic as the ‘ffmpeg_timelapse’ setting, above:

# Make automated snapshot every N seconds (default: 0 = disabled)
snapshot_interval 2

Regardless of which method you choose (or use them both), you will want to tweak some more settings. In the 'Text Display' section, locate 'Set to 'preview' will only draw a box in preview_shot pictures.' Change the 'locate' setting to 'off'. As shown in the video frame below, since you are moving in your vehicle most of the time, there is no sense turning on this option. Motion cannot differentiate between the highway zipping by the camera and the approaching vehicles. Everything is in motion to the camera; the box just gets in the way:

# Set to 'preview' will only draw a box in preview_shot pictures.
locate off


Optionally, I recommend turning on the time-stamp option. This is found right below the 'locate' setting. Especially in the event of an accident, you want an accurate time-stamp on the video or still images (make sure your Raspberry Pi's time is correct):

# Draws the timestamp using same options as C function strftime(3)
# Default: %Y-%m-%d\n%T = date in ISO format and time in 24 hour clock
# Text is placed in lower right corner
text_right %Y-%m-%d\n%T-%q


Starting with the largest, best-quality images will ensure the video quality is optimal. Start with a large capture size and reduce it only if you are having trouble capturing the video quickly enough. These settings are found in the 'Capture device options' section:

# Image width (pixels). Valid range: Camera dependent, default: 352
width 640

# Image height (pixels). Valid range: Camera dependent, default: 288
height 480

Similarly, I suggest starting with a low amount of jpeg compression to maximize quality and only lower if necessary. This setting is found in the ‘Image File Output’ section:

# The quality (in percent) to be used by the jpeg compression (default: 75)
quality 90

Once you have completed the configuration of Motion, restart Motion for the changes to take effect:

sudo /etc/init.d/motion restart

Since you will be powering on your Raspberry Pi in your vehicle, and may have no way to reach Motion from a command line, you will want Motion to start capturing video and images for you automatically at startup. To enable Motion (the motion daemon) on start-up, edit the /etc/default/motion file.

sudo nano /etc/default/motion

Change the ‘start_motion_daemon‘ setting to ‘yes’. If you decide to stop using the Raspberry Pi for capturing video, remember to disable this option. Motion will keep generating video and images, even without a camera connected, if the daemon process is running.
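After the change, the relevant line in /etc/default/motion should read:

# set to 'yes' to enable the motion daemon
start_motion_daemon=yes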

Capturing Dashboard Video

Although taking dashboard camera videos with your Raspberry Pi sounds easy, it presents several challenges. How will you mount your camera? How will you adjust your camera's view? How will you power your Raspberry Pi in the vehicle? How will you power down your Raspberry Pi from the vehicle? How will you make sure Motion is running? How will you get the video and images off the Raspberry Pi? Do you have a mini keyboard and LCD monitor to use in your vehicle? Or, is your Raspberry Pi on your wireless network? If so, do you know how to bring up the camera's view and Motion's admin site on your smartphone's web browser?

My start-up process is as follows:

  1. Start my car.
  2. Plug the webcam and the power cable into the Raspberry Pi.
  3. Let the Raspberry Pi boot up fully and allow Motion to start. This takes less than one minute.
  4. Open the http address Motion serves up using my mobile browser.
    (Since my Raspberry Pi has a wireless USB adapter installed and I’m still able to connect from my garage).
  5. Adjust the camera using the mobile browser view from the camera.
  6. Optionally, use Motion’s ‘HTTP Based Control’ feature to adjust any Motion configurations, on-the-fly (great option).

Logitech Webcam C210 Webcam Mounted on Car Sun Visor

Raspberry Pi in Vehicle with iPhone Preview of Dashboard Camera

Adjusting Camera using iPhone WiFi Connection to Raspberry Pi

Using Motion's HTTP Based Control on iPhone Mobile Web Browser

Once I reach my destination, I copy the video and/or still image frames off the Raspberry Pi:

  1. Let the car run for at least 1-2 minutes after you stop. The Raspberry Pi is still processing the images and video.
  2. Copy the files off the Raspberry Pi over the local network, right from car (if in range of my LAN).
  3. Alternately, shut down the Raspberry Pi by using an SSH mobile app on your smartphone, or just shut the car off (this is not the safest method!).
  4. Place the Pi’s SDHC card into my laptop and copy the video and/or still image frames.

Shutting Down Raspberry Pi Using SSH Terminal iPhone App

Here are some tips I've found to make creating dashboard camera videos easier and better quality:

  • Leave your camera in your vehicle once you mount and position it.
  • Make sure your camera is secure so the vehicle's vibrations while driving don't create bouncy images or change the position of the camera's field of view.
  • Clean your vehicle’s front window, inside and out. Bugs or other dirt are picked up by the camera and may affect the webcam’s focus.
  • Likewise, film on the window from smoking or dirt will soften the details of the video and create harsh glare when driving on sunny days.
  • Similarly, make sure your camera’s lens is clean.
  • Keep your dashboard clear of objects such as paper, as it reflects on the window and will obscure the dashboard camera’s video.
  • Constantly stopping your Raspberry Pi by shutting the vehicle off can potentially damage the Raspberry Pi and/or corrupt the operating system.
  • Make sure to keep your Raspberry Pi out of sight of potential thieves and the direct sun when you are not driving.
  • Backup your Raspberry Pi’s SDHC card before using for dashboard camera, see Duplicating Your Raspberry Pi’s SDHC Card.

Creating Video from Individual Dashboard Camera Images

FFmpeg

If you choose the second method for capturing dashboard camera videos, the easiest way to combine the individual dashboard camera images is by calling FFmpeg from the command line. To create the example #3 video, shown below, I ran two commands from a Linux Terminal prompt. The first command is a bash command to rename all the images to four-digit incremented numbers (‘0001.jpg’, ‘0002.jpg’, ‘0003.jpg’, etc.). This makes it easier to execute the second command. I found this script on stackoverflow. It requires Gawk (‘sudo apt-get install gawk’). If you are unsure about running this command, make a copy of the original images in case something goes wrong.

The second command is a basic FFmpeg command to combine the images into a 20 fps MPEG-4 video file. More information on running FFmpeg can be found on their website. There is a huge number of options available with FFmpeg from the command line. Running this command, FFmpeg processed 4,666 frames at 640 x 480 pixels in 233.30 seconds, outputting a 147.5 Mb MPEG-4 video file.

find -name '*.jpg' | sort | gawk '{ printf "mv %s %04d.jpg\n", $0, NR }' | bash 
ffmpeg -r 20 -qscale 2  -i %04d.jpg output.mp4

FFmpeg Command Line Video Creation Output


Example #3 – FFmpeg Video from Command Line

If you want to compress the video, you can chain a second FFmpeg command to the first one, similar to the one below. In my tests, this reduced the video size to 20-25% of the original uncompressed version.

ffmpeg -r 20 -qscale 2 -i %04d.jpg output.mp4 && ffmpeg -i output.mp4 -vcodec mpeg2video output_compressed.mp4

If your images are too dark (early morning or overcast) or have a color cast (poor webcam or tinted windows), you can use programs like ImageMagick to adjust all the images as a single batch. In example #5 below, I pre-processed all the images prior to making the video. With one ImageMagick command, I adjusted their levels to make them lighter and less flat.

mogrify -level 12%,98%,1.79 *.jpg


Example #5 – FFmpeg Uncompressed Video from Command Line

Windows MovieMaker

Using Windows MovieMaker was not my first choice, but I've had a tough time finding an equivalent Linux GUI-based application. If you are going to create your own video from the still images, you need to be able to import and adjust thousands of images quickly and easily. I can import, create, and export a typical video of a 30 minute trip in 10 minutes with MovieMaker. With MovieMaker, you can also add titles, special effects, and so forth.

Single Images Combined in Windows MovieMaker

Sample Videos

Below are a few dashboard video examples using a variety of methods. In the first two examples, I captured still images and created the FFmpeg video at the same time. You can compare the quality of Method #1 to Method #2.


Example #2a – Motion/FFmpeg Video


Example #2b – Windows MovieMaker


Example #5 – FFmpeg Compressed Video from Command Line


Example #6 – FFmpeg Compressed Video from Command Line

Useful Links

Renaming files in a folder to sequential numbers

Useful FFmpeg Syntax Examples

ImageMagick: Command-line Options

ImageMagick: Mogrify — in-place batch processing

http://wp.me/p1RD28-AW


Object Tracking on the Raspberry Pi with C++, OpenCV, and cvBlob

Use C++ with OpenCV and cvBlob to perform image processing and object tracking on the Raspberry Pi, using a webcam.

Source code and compiled samples are now available on GitHub. The below post describes the original code on the ‘Master’ branch. As of  May 2014, there is a revised and improved version of the project on the ‘rev05_2014’ branch, on GitHub. The README.md details the changes and also describes how to install OpenCV, cvBlob, and all dependencies!

Introduction

As part of a project with a local FIRST Robotics Competition (FRC) Team, I've been involved in developing a Computer Vision application for use on the Raspberry Pi. Our FRC team's goal is to develop an object tracking and target acquisition application that could be run on the Raspberry Pi, as opposed to the robot's primary embedded processor, a National Instruments NI cRIO-FRC II. We chose to work in C++ for its speed. We also decided to test two popular open-source Computer Vision (CV) libraries, OpenCV and cvBlob.

Due to its single ARM1176JZF-S 700 MHz ARM processor, a significant limitation of the Raspberry Pi is its ability to perform complex operations in real-time, such as image processing. In an earlier post, I discussed using Motion to detect motion with a webcam on the Raspberry Pi. Although the Raspberry Pi was capable of running Motion, it required a greatly reduced capture size and frame rate. And even then, the Raspberry Pi's ability to process the webcam's feed was very slow. I had doubts it would be able to meet the processor-intense requirements of this project.

Development for the Raspberry Pi

Using C++ in NetBeans 7.2.1 on Ubuntu 12.04.1 LTS and 12.10, I wrote several small pieces of code to demonstrate the Raspberry Pi's ability to perform basic image processing and object tracking. Parts of the following code are based on several OpenCV and cvBlob code examples found during my research. Many of those examples are linked at the end of this article. Examples of cvBlob are especially hard to find.

Project in NetBeans

The Code

There are five files: 'main.cpp', 'testfps.cpp (testfps.hpp)', and 'testcvblob.cpp (testcvblob.hpp)'. The main.cpp file's main method calls the test methods in the other two files. The cvBlob library only works with the pre-2.0 OpenCV C API. Therefore, I wrote all the code using the older objects and methods; it is not written using the newer OpenCV 2.0 conventions. For example, cvBlob uses 1.0's 'IplImage' image type instead of 2.0's newer 'cv::Mat' image type. My next project is to rewrite the cvBlob code to use OpenCV 2.0 conventions and/or find a newer library. The cvBlob library offered so many advantages that I felt not using the newer OpenCV 2.0 features was still worthwhile.
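
To illustrate the difference, below is a minimal sketch, not part of the project's code, that loads the same test image two ways: first with the OpenCV 1.x C API and 'IplImage', which cvBlob requires, and then with the OpenCV 2.x C++ API and 'cv::Mat'. It assumes OpenCV 2.x is installed, which still ships the legacy 1.x headers; the function names here are mine, for illustration only.

// Minimal sketch: OpenCV 1.x C API (required by cvBlob) vs. OpenCV 2.x C++ API
#include <cv.h>
#include <highgui.h>
#include <opencv2/highgui/highgui.hpp>

// OpenCV 1.x: images are raw pointers, allocated and released by the caller
void loadWithOldApi() {
    IplImage* image = cvLoadImage("colored_balls.jpg");
    if (image != NULL) {
        // ... pass the IplImage to the cvBlob functions here ...
        cvReleaseImage(&image);
    }
}

// OpenCV 2.x: cv::Mat is reference-counted, so no explicit release is needed
void loadWithNewApi() {
    cv::Mat image = cv::imread("colored_balls.jpg");
    if (!image.empty()) {
        // ... process the image with the C++ API here ...
    }
}

The manual allocate-and-release pattern in the first version is repeated throughout the cvBlob test code below, and it is one of the main reasons I eventually want to move to the newer conventions.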

Main Program Method (main.cpp)

/*
* File: main.cpp
* Author: Gary Stafford
* Description: Program entry point
* Created: February 3, 2013
*/
#include <stdio.h>
#include <sstream>
#include <stdlib.h>
#include <iostream>
#include "testfps.hpp"
#include "testcvblob.hpp"

using namespace std;

int main(int argc, char* argv[]) {
    int captureMethod = 0;
    int captureWidth = 0;
    int captureHeight = 0;
    if (argc == 4) { // user passed parameters with the call
        captureMethod = strtol(argv[1], NULL, 0);
        captureWidth = strtol(argv[2], NULL, 0);
        captureHeight = strtol(argv[3], NULL, 0);
    } else { // user did not pass parameters with the call
        cout << endl << "Demonstrations/Tests: " << endl;
        cout << endl << "(1) Test OpenCV - Show Webcam" << endl;
        cout << endl << "(2) Test OpenCV - No Webcam" << endl;
        cout << endl << "(3) Test cvBlob - Show Image" << endl;
        cout << endl << "(4) Test cvBlob - No Image" << endl;
        cout << endl << "(5) Test Blob Tracking - Show Webcam" << endl;
        cout << endl << "(6) Test Blob Tracking - No Webcam" << endl;
        cout << endl << "Input test # (1-6): ";
        cin >> captureMethod;
        // tests 3 and 4 don't require width and height parameters
        if (captureMethod != 3 && captureMethod != 4) {
            cout << endl << "Input capture width (pixels): ";
            cin >> captureWidth;
            cout << endl << "Input capture height (pixels): ";
            cin >> captureHeight;
            cout << endl;
            if (captureWidth <= 0) {
                cout << endl << "Width value incorrect" << endl;
                return -1;
            }
            if (captureHeight <= 0) {
                cout << endl << "Height value incorrect" << endl;
                return -1;
            }
        }
    }
    switch (captureMethod) {
        case 1:
            TestFpsShowVideo(captureWidth, captureHeight);
            break;
        case 2:
            TestFpsNoVideo(captureWidth, captureHeight);
            break;
        case 3:
            DetectBlobsShowStillImage();
            break;
        case 4:
            DetectBlobsNoStillImage();
            break;
        case 5:
            DetectBlobsShowVideo(captureWidth, captureHeight);
            break;
        case 6:
            DetectBlobsNoVideo(captureWidth, captureHeight);
            break;
        default:
            break;
    }
    return 0;
}

Tests 3-6 (testcvblob.hpp)

// -*- C++ -*-
/*
* File: testcvblob.hpp
* Author: Gary Stafford
* Created: February 3, 2013
*/
#ifndef TESTCVBLOB_HPP
#define TESTCVBLOB_HPP
int DetectBlobsNoStillImage();
int DetectBlobsShowStillImage();
int DetectBlobsNoVideo(int captureWidth, int captureHeight);
int DetectBlobsShowVideo(int captureWidth, int captureHeight);
#endif /* TESTCVBLOB_HPP */

Tests 3-6 (testcvblob.cpp)

/*
* File: testcvblob.cpp
* Author: Gary Stafford
* Description: Track blobs using OpenCV and cvBlob
* Created: February 3, 2013
*/
#include <cv.h>
#include <highgui.h>
#include <cvblob.h>
#include "testcvblob.hpp"
using namespace cvb;
using namespace std;
// Test 4: OpenCV and cvBlob, still image (no display window)
int DetectBlobsNoStillImage() {
    /// Variables /////////////////////////////////////////////////////////
    CvSize imgSize;
    IplImage *image, *segmentated, *labelImg;
    CvBlobs blobs;
    unsigned int result = 0;
    ///////////////////////////////////////////////////////////////////////
    image = cvLoadImage("colored_balls.jpg");
    if (image == NULL) {
        return -1;
    }
    imgSize = cvGetSize(image);
    cout << endl << "Width (pixels): " << image->width;
    cout << endl << "Height (pixels): " << image->height;
    cout << endl << "Channels: " << image->nChannels;
    cout << endl << "Bit Depth: " << image->depth;
    cout << endl << "Image Data Size (kB): "
            << image->imageSize / 1024 << endl << endl;
    segmentated = cvCreateImage(imgSize, 8, 1);
    cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
    labelImg = cvCreateImage(cvGetSize(image), IPL_DEPTH_LABEL, 1);
    result = cvLabel(segmentated, labelImg, blobs);
    cvFilterByArea(blobs, 500, 1000000);
    cout << endl << "Blob Count: " << blobs.size();
    cout << endl << "Pixels Labeled: " << result << endl << endl;
    cvReleaseBlobs(blobs);
    cvReleaseImage(&labelImg);
    cvReleaseImage(&segmentated);
    cvReleaseImage(&image);
    return 0;
}
// Test 3: OpenCV and cvBlob, still image (displayed in a window)
int DetectBlobsShowStillImage() {
    /// Variables /////////////////////////////////////////////////////////
    CvSize imgSize;
    IplImage *image, *frame, *segmentated, *labelImg;
    CvBlobs blobs;
    unsigned int result = 0;
    bool quit = false;
    ///////////////////////////////////////////////////////////////////////
    cvNamedWindow("Processed Image", CV_WINDOW_AUTOSIZE);
    cvMoveWindow("Processed Image", 750, 100);
    cvNamedWindow("Image", CV_WINDOW_AUTOSIZE);
    cvMoveWindow("Image", 100, 100);
    image = cvLoadImage("colored_balls.jpg");
    if (image == NULL) {
        return -1;
    }
    imgSize = cvGetSize(image);
    cout << endl << "Width (pixels): " << image->width;
    cout << endl << "Height (pixels): " << image->height;
    cout << endl << "Channels: " << image->nChannels;
    cout << endl << "Bit Depth: " << image->depth;
    cout << endl << "Image Data Size (kB): "
            << image->imageSize / 1024 << endl << endl;
    frame = cvCreateImage(imgSize, image->depth, image->nChannels);
    cvConvertScale(image, frame, 1, 0);
    segmentated = cvCreateImage(imgSize, 8, 1);
    cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
    cvSmooth(segmentated, segmentated, CV_MEDIAN, 7, 7);
    labelImg = cvCreateImage(cvGetSize(frame), IPL_DEPTH_LABEL, 1);
    result = cvLabel(segmentated, labelImg, blobs);
    cvFilterByArea(blobs, 500, 1000000);
    cvRenderBlobs(labelImg, blobs, frame, frame,
            CV_BLOB_RENDER_BOUNDING_BOX | CV_BLOB_RENDER_TO_STD, 1.);
    cvShowImage("Image", frame);
    cvShowImage("Processed Image", segmentated);
    while (!quit) {
        char k = cvWaitKey(10)&0xff;
        switch (k) {
            case 27:
            case 'q':
            case 'Q':
                quit = true;
                break;
        }
    }
    cvReleaseBlobs(blobs);
    cvReleaseImage(&labelImg);
    cvReleaseImage(&segmentated);
    cvReleaseImage(&frame);
    cvReleaseImage(&image);
    cvDestroyAllWindows();
    return 0;
}
// Test 6: Blob Tracking from the webcam feed (no display window)
int DetectBlobsNoVideo(int captureWidth, int captureHeight) {
    /// Variables /////////////////////////////////////////////////////////
    CvCapture *capture;
    CvSize imgSize;
    IplImage *image, *frame, *segmentated, *labelImg;
    int picWidth, picHeight;
    CvTracks tracks;
    CvBlobs blobs;
    CvBlob* blob;
    unsigned int result = 0;
    bool quit = false;
    ///////////////////////////////////////////////////////////////////////
    capture = cvCaptureFromCAM(-1);
    cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_WIDTH, captureWidth);
    cvSetCaptureProperty(capture, CV_CAP_PROP_FRAME_HEIGHT, captureHeight);
    cvGrabFrame(capture);
    image = cvRetrieveFrame(capture);
    if (image == NULL) {
        return -1;
    }
    imgSize = cvGetSize(image);
    cout << endl << "Width (pixels): " << image->width;
    cout << endl << "Height (pixels): " << image->height << endl << endl;
    frame = cvCreateImage(imgSize, image->depth, image->nChannels);
    while (!quit && cvGrabFrame(capture)) {
        image = cvRetrieveFrame(capture);
        cvConvertScale(image, frame, 1, 0);
        segmentated = cvCreateImage(imgSize, 8, 1);
        cvInRangeS(image, CV_RGB(155, 0, 0), CV_RGB(255, 130, 130), segmentated);
        // Can experiment with either or both smoothing filters
        cvSmooth(segmentated, segmentated, CV_MEDIAN, 7, 7);
        cvSmooth(segmentated, segmentated, CV_GAUSSIAN, 9, 9);
        labelImg = cvCreateImage(cvGetSize(frame), IPL_DEPTH_LABEL, 1);
        result = cvLabel(segmentated, labelImg, blobs);
        cvFilterByArea(blobs, 500, 1000000);
        cvRenderBlobs(labelImg, blobs, frame, frame, 0x000f, 1.);
        cvUpdateTracks(blobs, tracks, 200., 5);
        cvRenderTracks(tracks, frame, frame, 0x000f, NULL);
        picWidth = frame->width;
        picHeight = frame->height;
        if (cvGreaterBlob(blobs)) {
            blob = blobs[cvGreaterBlob(blobs)];
            cout << "Blobs found: " << blobs.size() << endl;
            cout << "Pixels labeled: " << result << endl;
            cout << "center-x: " << blob->centroid.x
                    << " center-y: " << blob->centroid.y
                    << endl;
            cout << "offset-x: " << ((picWidth / 2)-(blob->centroid.x))
                    << " offset-y: " << ((picHeight / 2)-(blob->centroid.y))
                    << endl;
            cout << "\n";
        }
        char k = cvWaitKey(10)&0xff;
        switch (k) {
            case 27:
            case 'q':
            case 'Q':
                quit = true;
                break;
        }
    }
    cvReleaseBlobs(blobs);
    cvReleaseImage(&labelImg);
    cvReleaseImage(&segmentated);
    cvReleaseImage(&frame);
    cvReleaseImage(&image);
    cvDestroyAllWindows();
    cvReleaseCapture(&capture);
    return 0;
}
// Test 5: Blob Tracking from the webcam feed (displayed in a window)
int DetectBlobsShowVideo(int captureWidth, int captureHeight) {
    /// Variables /////////////////////////////////////////////////////////
    CvCapture *capture;
    CvSize imgSize;
    IplImage *image, *frame, *segmentated, *labelImg;
    CvPoint pt1, pt2, pt3, pt4, pt5, pt6;
    CvScalar red, green, blue;
int picWidth, picHei