Data transfer - PaVoS-TECO/core GitHub Wiki
Basic usage
1. Setup all needed components
In this example, we will use Graphite (with Grafana):
To install Graphite via Docker, please run the following docker command
docker run -d --name graphite --restart=always -p 80:80 -p 2003-2004:2003-2004 -p 2023-2024:2023-2024 -p 8125:8125/udp -p 8126:8126 graphiteapp/graphite-statsd
This will download the Graphite image and expose the ports 80, 2003-2004, 2023-2024, 8125 and 8126:
Port 80 - nginx (server internal communication)
Port 2003 - send data into Graphite via the Plaintext-Protocol
Port 2004 - send data into Graphite via the Pickle-Protocol (better efficiency)
Port 2023 - carbon-aggregator for Plaintext-Protocol
Port 2024 - carbon-aggregator for Pickle-Protocol
Port 8125 - statsd
Port 8126 - statsd admin
Also note that nginx will run our website on port 8080:
http://your.address.here:8080/dashboard
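Once the container is running, port 2003 accepts metrics via the Plaintext-Protocol, one line per metric. A minimal sketch for sending a single value, assuming Graphite runs on localhost and using a placeholder metric path:

```java
import java.io.PrintWriter;
import java.net.Socket;

public class GraphitePlaintextDemo {

    // Build one line of the Plaintext-Protocol: "<metric.path> <value> <unix-timestamp>"
    static String formatMetric(String path, double value, long unixTimestamp) {
        return path + " " + value + " " + unixTimestamp;
    }

    public static void main(String[] args) throws Exception {
        long now = System.currentTimeMillis() / 1000L;
        String line = formatMetric("Cluster-1.particulateMatter_PM10", 42.0, now);
        // "localhost" is an assumption; use the address of your Graphite container
        try (Socket socket = new Socket("localhost", 2003);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true)) {
            out.println(line);
        }
    }
}
```

This is only meant to verify that the container accepts data; the actual transfer in PaVoS is handled by the TransferManager described below.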
2. Workflow explanation
For now, let's set the actual code aside and look at how the data flows.
We have to start by using Kafka.
Kafka organizes data into topics, which are identified by String names.
We will use a Topic called "RandomGraphiteSenderTest" in our example here.
We are then going to produce values for the variable "particulateMatter_PM10" for our Topic.
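Producing such values can be sketched with the standard kafka-clients producer API. The broker address "localhost:9092" and the key/value layout (observation type as key, a random measurement as value) are assumptions for illustration, not something PaVoS prescribes:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class RandomGraphiteSender {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key: the observation type, value: the measured value
            ProducerRecord<String, String> record = new ProducerRecord<>(
                    "RandomGraphiteSenderTest",
                    "particulateMatter_PM10",
                    String.valueOf(Math.random() * 100));
            producer.send(record);
        }
    }
}
```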
Now, we have to understand how Grafana (the destination of our data) stores values.
Grafana uses a query to look for values from datasources.
We will nearly always use such a nested query.
In this case, we ask for all values of "RandomGraphiteSenderTest" and then filter them to show only
the ones where the next nested value equals "particulateMatter_PM10".
As you can see in this example, we have used the Kafka Topic name "RandomGraphiteSenderTest"
as a general identifier for Grafana and then subcategorized this identifier into values.
The value that was used is "particulateMatter_PM10".
You can customize the name used in Grafana as well. For example, we could create a Kafka topic "temperatureTest" and save the transferred data to "temperature".
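In Grafana's Graphite datasource, this nesting is expressed as a dot-separated metric path. For the example names above, and for the customized variant, the queries would look like:

```
RandomGraphiteSenderTest.particulateMatter_PM10
temperatureTest.temperature
```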
3. Sending data
First, you will need a Java application that produces data and writes it to a Kafka topic.
In your application, where you want to send data, include the following statements:
TransferManager tm = new TransferManager();
The TransferManager is the main object we need in order to transfer our data.
tm.startDataTransfer(kafkaTopics, graphTopic, dest);
with a List of kafkaTopics, or
tm.startDataTransfer(kafkaTopic, graphTopic, dest);
with a single one.
Now we will start to transfer data. The TransferManager will collect data from the Kafka topic and send it to the destination "dest". All data sent is stored in a folder-like structure identified by graphTopic,
which is our root and is extended from there by the different observation types.
kafkaTopic is a String (you can also use a list of topics), graphTopic is a single String, and "dest" is a value of the enum "Destination".
tm.startDataTransfer("SenderTest", "Cluster-1", Destination.GRAPHITE);
In this case, we are going to send our data from the topic "SenderTest" to Graphite. We store the data in the root folder "Cluster-1".
tm.stopDataTransfer();
When you are done sending, or before closing the application, stop the data transfer.