The IoT in the service of the environment using Apache Camel & JBoss A-MQ


This post proposes using the Internet of Things (IoT) to address the degradation of air quality in cities.

The idea is that sensors measure the air pollution level and transmit this data, which is then shared with the users of the system.

The example demonstrates how to get real-time value from IoT data by using Apache Camel, JBoss A-MQ, and Elasticsearch.

The basic architecture of the demo is as follows:



There are three common messaging protocols in the Internet of Things: MQTT, AMQP, and STOMP. In this example, all sensor devices connect and interact using MQTT (Message Queuing Telemetry Transport), which JBoss A-MQ supports.

Why MQTT? It is an OASIS standard adopted by many vendors and companies, it is an extremely lightweight transport, and it is excellent for devices and platforms with low computing power. Its messaging model is limited to publish/subscribe.

All IoT data collected by the sensors is sent to and aggregated reliably in JBoss A-MQ, then processed by Apache Camel. At the end of the Camel processing, the data is stored in Elasticsearch for real-time notification.
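As an illustration, a sensor reading could be serialised as a small JSON document before being published to the broker. This is only a sketch: the field names mirror the Elasticsearch mapping shown later in this post, and the values are made up.

```python
import json
from datetime import datetime, timezone

def build_reading(pollution_degree, latitude, longitude):
    """Build one air-quality reading as a JSON string, ready to publish over MQTT."""
    return json.dumps({
        # timestamp of the measurement, in ISO 8601 so Elasticsearch can parse it as a date
        "event_datetime": datetime.now(timezone.utc).isoformat(),
        "pollution_degree": pollution_degree,
        "latitude": latitude,
        "longitude": longitude,
        # a geo_point can be expressed as a "lat,lon" string in Elasticsearch
        "location": f"{latitude},{longitude}",
    })

payload = build_reading(7, 48.8566, 2.3522)
print(payload)
```

A device would then publish this payload to an MQTT topic on the broker using any MQTT client library.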

How can we implement this?

You have to add the MQTT transport connector to JBoss A-MQ. To do that, add the following line to the broker configuration file (etc/activemq.xml):

<transportConnector name="mqtt" uri="mqtt://${Hostname}:1883"/>
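For context, the connector sits next to the broker's existing connectors inside the <transportConnectors> element. A sketch, assuming the default OpenWire connector is present (1883 is the standard MQTT port):

```xml
<transportConnectors>
    <!-- default OpenWire connector -->
    <transportConnector name="openwire" uri="tcp://${Hostname}:61616"/>
    <!-- MQTT connector for the sensor devices -->
    <transportConnector name="mqtt" uri="mqtt://${Hostname}:1883"/>
</transportConnectors>
```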

Then install the Camel route. This route uses built-in components in JBoss Fuse to poll messages from the broker, which saves a lot of implementation time.


As you can see from the Camel route, after we retrieve messages from the IoT topic, we process them and store them in Elasticsearch, where they are picked up by Kibana and displayed in a dashboard.
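As a rough sketch, such a route could be expressed in Blueprint XML using the camel-mqtt and camel-elasticsearch components. The endpoint options, topic name, and index names here are illustrative assumptions, not the exact project code:

```xml
<!-- Sketch of a Blueprint route: MQTT topic -> Elasticsearch index.
     Host, topic, and index names are assumptions for illustration. -->
<blueprint xmlns="http://www.osgi.org/xmlns/blueprint/v1.0.0">
  <camelContext xmlns="http://camel.apache.org/schema/blueprint">
    <route id="iot-pollution">
      <!-- consume sensor readings from the broker's MQTT topic -->
      <from uri="mqtt:pollution?host=tcp://localhost:1883&amp;subscribeTopicName=iot"/>
      <convertBodyTo type="java.lang.String"/>
      <!-- index each reading into Elasticsearch -->
      <to uri="elasticsearch://local?operation=INDEX&amp;indexName=pollutionlevel&amp;indexType=command"/>
    </route>
  </camelContext>
</blueprint>
```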

At the Elasticsearch level, you have to create a new index and then add the following mapping to it:

curl -XPUT 'http://localhost:9200/pollutionlevel'

curl -XPUT 'http://localhost:9200/pollutionlevel/_mapping/command' -d '
{
  "command" : {
    "properties" : {
      "event_datetime" : { "type" : "date" },
      "pollution_degree" : { "type" : "integer" },
      "latitude" : { "type" : "double" },
      "longitude" : { "type" : "double" },
      "location" : { "type" : "geo_point" }
    }
  }
}'
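A document indexed into this mapping might look like the following; the values are illustrative, and the location field holds the geo_point as a "lat,lon" string:

```json
{
  "event_datetime": "2015-06-01T10:15:00Z",
  "pollution_degree": 7,
  "latitude": 48.8566,
  "longitude": 2.3522,
  "location": "48.8566,2.3522"
}
```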




At this point, all the data is available in Elasticsearch, and we just have to explore, discover, and visualise it via Kibana. For that, you have to create some visualisations that will be included in your dashboards.

Below is an example of a custom dashboard:



You can find the project code here:



Admittedly, this example is very simple, but I believe it is a good illustration of how I went from retrieving sensor values to analyzing and exploiting the data in real time in less than a day.

I hope that you find this post useful and good luck with your projects!


Abdellatif Bouchama

Abdellatif's main areas of expertise lie within the fields of SOA, ESB, Big Data, and IoT.
