Solution: Eyevinn Open Analytics - EyevinnOSC/community GitHub Wiki

Eyevinn Open Analytics

Background and starting point

The Eyevinn Player Analytics Specification (EPAS) is an open-source framework and specification for tracking events from video and audio players. It is a modular framework where you can pick and choose the modules you need.

This tutorial is based on the use case where you have a streaming solution up and running and want to gather analytics. We will use Open Source Cloud as well as open source components to achieve this.


Figure 1: Eyevinn Open Analytics

This solution is based on the following open source projects made available as services or components:

Prerequisites

  • If you have not already done so, sign up for an OSC account; you can create one on osaas.io.
  • OSC Command line tool installed. You need version 0.14.2 or higher.
  • AWS CLI installed, to create a queue:
% brew install awscli

Overview

We will integrate a client (Web, Android or iOS) with a Player Analytics Eventsink that writes the events to a queue handled by SmoothMQ. This queue is processed by the Player Analytics Worker, which persists the data in a database, here served by ClickHouse.

Note

Throughout this guide it is important that you keep track of your secrets and the values assigned to them. We will use the Secret Names and Secret Values shown in the table below.

| Service | Secret Name | Secret Value | Description |
| --- | --- | --- | --- |
| Eventsink | - | mysink | The name of the Player Analytics Eventsink instance |
| SmoothMQ | - | myqueuename | The name of the message queue |
| SmoothMQ | mqaccesskey | mymqaccessvalue | The Access Key for the queue |
| SmoothMQ | mqsecretkey | mymqsecretvalue | The Secret Key for the queue |
| ClickHouse | - | myclickdbinstance | The name of the ClickHouse DB instance |
| ClickHouse | myclickdb | myclickdbname | The name of your database |
| ClickHouse | clickdbuserkey | myclickdbuservalue | The admin user for your DB |
| ClickHouse | clickdbsecretkey | myclickdbsecretvalue | The password used for your user |
| Worker | - | myworkername | The name of the Player Analytics Worker instance |

Table 1: Secrets used in this guide (Steps 2 & 3)

Feel free to fill out the name of your keys and your secret values in the table for easy access throughout this guide.

When we refer to these secret keys or secret values we will use a < >-notation.
Example:
<mysecretusername> refers to the secret key that corresponds to mysecretusername in Table 1;
likewise
<myuser> refers to the secret value that corresponds to myuser in Table 1.

Step 1: Client setup

The SDKs come with informative READMEs that describe how to incorporate them into your app. The Android, Swift, and web SDKs are all designed to be easy to integrate: you only initialize them, and the SDKs do the rest. You can find the client setups here for Web, Android and iOS.

Further information regarding the different events and the event flow can be found in the Player Analytics Specification (EPAS), which is useful for all clients.

Step 2: Smooth MQ setup

SmoothMQ is a drop-in replacement for SQS, and it is used by the Player Analytics Eventsink as a messaging queue.

You will need your OSC Personal Access Token since we are going to use the OSC CLI. You can find it by clicking the Settings link in the lower left, which opens your Account page. To the right of the Account tab is a tab called { } API; click it to see your Personal Access Token, then click the copy symbol.

In the terminal type this:

% export OSC_ACCESS_TOKEN=<Paste your OSC Personal Access Token here>

Now your OSC Personal Access Token is saved for this terminal-session.

Create the two SmoothMQ service secrets in the OSC web user interface. Click the Create message-queue + button and fill in the dialog that follows with your values: Name = Table 1.<myqueuename>, AccessKey = Table 1.<mqaccesskey>, and SecretKey = Table 1.<mqsecretkey>

Screenshot 2025-04-29 at 22 48 36

Figure 2. SmoothMQ parameters

And click the Create-button.

If you click on the tab-link "My message-queues (n)" you can see all running SmoothMQ instances. Locate your SmoothMQ and click on the copy-symbol to the right of the URL.

Screenshot 2025-05-02 at 15 12 45

Figure 3. SmoothMQ-URL for your instances

You will need this SmoothMQ-URL below.

Step 3: Player Analytics Eventsink setup

Now that the SmoothMQ instance is up and running, we can create the queue we want to use and configure the eventsink module that receives the data from the players and pushes it onto a processing queue.

Create an SQS queue

NOTE: The --endpoint-url is the SmoothMQ-URL from Figure 3 above.

% export AWS_ACCESS_KEY_ID=<mymqaccessvalue>
% export AWS_SECRET_ACCESS_KEY=<mymqsecretvalue>
% aws sqs create-queue --queue-name=events --region='eu-west-1' --endpoint-url=<SmoothMQ-URL>

We have now created a queue named events on your SmoothMQ instance. You will get an output from the command above, like this:

{
    "QueueUrl": "https://sqs.us-east-1.amazonaws.com/1/events"
}

Figure 4: SmoothMQ QueueURL

This is the QueueURL for your queue. Take note of it, as you will use it later.
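
If you lose track of the QueueURL later, you can retrieve it again from your SmoothMQ instance with the standard AWS CLI. A sketch, using the queue name events from this guide and the placeholder credentials from Table 1 (substitute your own values):

```shell
# Retrieve the QueueUrl for the queue named "events".
# <SmoothMQ-URL> is the endpoint from Figure 3.
export AWS_ACCESS_KEY_ID=<mymqaccessvalue>
export AWS_SECRET_ACCESS_KEY=<mymqsecretvalue>
aws sqs get-queue-url --queue-name events \
  --region eu-west-1 --endpoint-url <SmoothMQ-URL>
```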

Create an EPAS eventsink

Navigate to the Player Analytics Eventsink in OSC web console. Press the button Create eventsink +.

Fill out the dialog: Name = Table 1.<mysink>, SqsQueueUrl = <SmoothMQ QueueUrl from Figure 4>, AwsAccessKeyId = Table 1.<mymqaccessvalue>, AwsSecretAccessKey = Table 1.<mymqsecretvalue>, SqsEndpoint = <SmoothMQ-URL for your instance from Figure 3>

Screenshot 2025-04-30 at 01 12 41

Figure 5: Player Analytics Eventsink parameters

And click the Create-button

If you click on the tab-link "My eventsinks (n)" you can see all running Eventsink-instances. Locate your Eventsink-instance and click on the copy-symbol to the right of the URL.

Screenshot 2025-05-02 at 15 36 49

Figure 6. Eventsink-URL for your instance

Send a test event

We can test the eventsink using curl. The URL is the Eventsink-URL from Figure 6.

% curl -X POST --json '{ "event": "init", "sessionId": "3", "timestamp": 1740411580982, "playhead": -1, "duration": -1 }' <Eventsink-URL from Figure 6>
{"sessionId":"3","heartbeatInterval":5000}

We can now use the AWS CLI to check that the message was placed in the queue:

% aws sqs receive-message --queue-url=<SmoothMQ QueueURL from Figure 4> --endpoint-url <SmoothMQ-URL for your instance from Figure 3> --region eu-west-1

It can look like this:

%  aws sqs receive-message --queue-url=https://sqs.us-east-1.amazonaws.com/1/events --endpoint-url https://eyevinnlab-myqueuename.poundifdef-smoothmq.auto.prod.osaas.io --region eu-west-1

If the call is successful you will get an answer like below:

{ 
    "Messages": [
        {
            "MessageId": "1917361731521220608",
            "ReceiptHandle": "1917361731521220608",
            "MD5OfBody": "56e41ee2399ef83003d1d230e8d11212",
            "Body": "{\"event\":\"init\",\"sessionId\":\"3\",\"timestamp\":1740411580982,\"playhead\":-1,\"duration\":-1}",
            "MessageAttributes": {
                "Event": {
                    "StringValue": "init",
                    "DataType": "String"
                },
                "Time": {
                    "StringValue": "1740411580982",
                    "DataType": "String"
                }
            }
        }
    ]
}
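
Note that receive-message only hides a message for the visibility timeout; it does not remove it. To clean up the test event permanently, you can delete it using the ReceiptHandle from the response above. A sketch with the example values from this guide (substitute your own endpoint and handle):

```shell
aws sqs delete-message \
  --queue-url https://sqs.us-east-1.amazonaws.com/1/events \
  --receipt-handle 1917361731521220608 \
  --endpoint-url <SmoothMQ-URL for your instance from Figure 3> \
  --region eu-west-1
```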



Step 4: ClickHouse setup

ClickHouse is a fast, open-source columnar database. We will set up a database instance for your Player Analytics. Go to ClickHouse and create the ClickHouse secrets from Table 1.

Click on Create clickhouse-server + and fill out the dialog, by choosing a secret for each field: Name = Table1.<myclickdbinstance>, Db = Table1.<myclickdb>, User = Table1.<clickdbuserkey>, and Password = Table1.<clickdbsecretkey>

Screenshot 2025-05-02 at 16 45 01

Figure 7. ClickHouse Server parameters

Then press Create.

This will create a ClickHouse DB server instance called Table 1.<myclickdbinstance> with one database called Table 1.<myclickdbname>. You can verify this by clicking the three dots on the card for your DB server and choosing Open application. This opens a query dialog.

Provide your Table 1.<myclickdbuservalue> and Table 1.<myclickdbsecretvalue> in the upper right-hand corner, and enter the query:

select * from system.databases

Your view should be similar to the picture below.

Screenshot 2025-04-30 at 13 47 24

Figure 8. ClickHouse web query interface

You can get the URL to your ClickHouse Server either from the view above or by clicking the copy symbol on the card for your ClickHouse Server instance:

Screenshot 2025-05-05 at 11 24 32

Figure 9. URL to your ClickHouse Server instance

We have now verified that the database is up and running.
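
The same check can also be scripted against the ClickHouse HTTP interface, which is what the URL from Figure 9 exposes. A minimal sketch using curl, with placeholder credentials from Table 1 (substitute your own values):

```shell
# Returns one database name per line if the server is up and
# the credentials are valid.
curl -u '<myclickdbuservalue>:<myclickdbsecretvalue>' \
  '<URL to your ClickHouse Server instance from Figure 9>/?query=SELECT%20name%20FROM%20system.databases'
```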

Step 5: Player Analytics Worker setup

The Player Analytics Worker is the module that processes the data from the queue and stores it in a database.

Go to Player Analytics Worker and click on the "Create worker +" button.

Screenshot 2025-04-17 at 13 43 11

Figure 10: Player Analytics Worker setup dialog

| Field name | Description |
| --- | --- |
| Name | The name you want for your instance |
| ClickHouseUrl | URL to your ClickHouse instance. NOTE: embed ClickHouseUser:ClickHousePassword in the address, e.g. https://myclickdbuservalue:myclickdbsecretvalue@eyevinnlab-myclickdbinstance.clickhouse-clickhouse.auto.prod.osaas.io |
| SqsQueueUrl | SmoothMQ QueueURL from Figure 4 |
| AwsAccessKeyId | The Access Key of the SmoothMQ instance |
| AwsSecretAccessKey | The Secret Key of the SmoothMQ instance |
| SqsEndpoint | SmoothMQ-URL for your instance from Figure 3 |

Table 2: Explanation of fields
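
The ClickHouseUrl field embeds the user and password directly in the URL. A small sketch of how the address is assembled, using the hypothetical example values from Table 1 (replace them with your own):

```shell
# Hypothetical example values from Table 1; replace with your own secrets.
CLICKHOUSE_USER="myclickdbuservalue"
CLICKHOUSE_PASSWORD="myclickdbsecretvalue"
CLICKHOUSE_HOST="eyevinnlab-myclickdbinstance.clickhouse-clickhouse.auto.prod.osaas.io"
# user:password go in front of the host, separated by "@".
CLICKHOUSE_URL="https://${CLICKHOUSE_USER}:${CLICKHOUSE_PASSWORD}@${CLICKHOUSE_HOST}"
echo "$CLICKHOUSE_URL"
```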

After you have pressed Create, wait a few minutes for the worker to start and create your database tables. You can check the log to see when it is ready.

Test the full flow

One easy way to test the full flow is to build a small sample app on any of the platforms and play and pause a movie; you will then see your events appearing in the ClickHouse database.
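
You can also verify from the command line that the worker has started writing to ClickHouse. A sketch over the HTTP interface (placeholders as before; the exact tables the worker creates may differ, so start by listing them):

```shell
# List the tables the worker has created in your database.
curl -u '<myclickdbuservalue>:<myclickdbsecretvalue>' \
  '<URL to your ClickHouse Server instance from Figure 9>/?query=SHOW%20TABLES%20FROM%20<myclickdbname>'
```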

Step 6: Use ClickHouse MCP to query from Claude Desktop

It is easy to enable ClickHouse to answer natural-language questions from Claude Desktop thanks to the ClickHouse MCP Server project.

Prerequisites

Install Claude Desktop and log in. Run brew install uv, since ClickHouse MCP needs this fast Python package resolver.

Then you need to edit the claude_desktop_config.json file. You can find it at:

~/Library/Application Support/Claude/claude_desktop_config.json

If the file does not exist, create it and add the following:

{
    "mcpServers": {
        "Open Analytics": {
          "command": "uv",
          "args": [
            "run",
            "--with",
            "mcp-clickhouse",
            "--python",
            "3.13",
            "mcp-clickhouse"
          ],
          "env": {
            "CLICKHOUSE_HOST": "eyevinnlab-myclickdbinstance.clickhouse-clickhouse.auto.prod.osaas.io",
            "CLICKHOUSE_PORT": "443",
            "CLICKHOUSE_USER": "myclickdbuservalue",
            "CLICKHOUSE_PASSWORD": "myclickdbsecretvalue",
            "CLICKHOUSE_SECURE": "true",
            "CLICKHOUSE_VERIFY": "true",
            "CLICKHOUSE_CONNECT_TIMEOUT": "30",
            "CLICKHOUSE_SEND_RECEIVE_TIMEOUT": "30"
          }
        }
    }
}

Figure 11. claude_desktop_config.json with our example values

NOTE: Change the values to your own as explained below:

| Parameter | Your value |
| --- | --- |
| CLICKHOUSE_HOST | The URL to your ClickHouse Server instance from Figure 9 |
| CLICKHOUSE_USER | Your <myclickdbuservalue> |
| CLICKHOUSE_PASSWORD | Your <myclickdbsecretvalue> |

Now you can use Claude Desktop to ask questions about your Open Analytics data.

Screenshot 2025-05-05 at 12 27 27

Figure 12. Claude Desktop using your ClickHouse data.

Step 7: Grafana Integration for Analytics Pipeline

After configuring ClickHouse queries with MCP, enhance your analytics by visualizing data in Grafana. Follow these steps to connect Grafana to your ClickHouse instance and build dashboards.

Architecture Summary

[Video Player/Client] → [Eyevinn SDK] → [Eventsink] → [SmoothMQ Queue] → [Worker] → [ClickHouse DB] → [MCP Server] → [Grafana Dashboard]

Prerequisites

Before starting this guide, you should have:

  • An existing ClickHouse instance with analytics data (this is already set up as part of the analytics worker)
  • ClickHouse connection details:
    • Endpoint URL: https://<your-clickhouse-endpoint>/play
    • Database name (typically epas_default)
    • <ClickHouse-username> and <ClickHouse-password>

7.1: Provision Grafana

Option A: Create Grafana on OSC (Recommended)

  1. Launch Grafana

    • Go to OSC UI → Web Services → Grafana → Create Grafana
    • Name: grafana
    • Plugins to Preinstall: clickhouse-datasource
    • Click Create and wait until status is running.
    Screenshot 2025-05-06 at 10 55 08
Screenshot 2025-04-24 at 11 33 31
  2. Log in to Grafana

    • Open the provided Grafana URL
    • Default credentials: Username: admin, Password: admin
    • Set a new password when prompted.

Option B: Run Grafana Locally with Docker

docker run -d \
  -p 3000:3000 \
  --name=grafana \
  grafana/grafana:latest
  • Access: http://localhost:3000/

  • Default credentials: admin/admin

  • Set new password.

  • Install ClickHouse plugin:

    • In Grafana: ⚙️ Plugins → search for ClickHouse → Install → Enable
    • If needed: docker restart grafana

7.2: Configure ClickHouse Data Source

  1. In Grafana sidebar: Configuration (⚙️) → Data Sources → Add data source

  2. Select ClickHouse (or Altinity plugin).

  3. Enter connection details:

    • URL: https://<your-clickhouse-endpoint>/play
    • Default database: epas_default
    • Basic Auth: Enable
    • User: <ClickHouse-username>
    • Password: <ClickHouse-password>
  4. Click Save & Test and confirm Data source is working.

Screenshot 2025-05-07 at 11 56 36 Screenshot 2025-04-27 at 17 35 09

7.3: Import or Build Dashboards

7.3.1: Import Sample Dashboard

  1. Grafana sidebar: + → Import
  2. Upload /sample-grafana-dashboard/Clickhouse-1745399875287.json or paste its JSON
  3. Or import directly using the raw JSON from this link

7.3.2: Create Custom Panels

  • Event Frequency

    SELECT
      toStartOfHour(timestamp) AS time,
      event,
      count(*) AS count_of_events
    FROM epas_default
    WHERE $__timeFilter(timestamp)
    GROUP BY time, event
    ORDER BY time
  • Top Content Titles

    SELECT
      JSONExtractString(payload, 'contentTitle') AS title,
      count(*) AS plays
    FROM epas_default
    WHERE event = 'metadata' AND $__timeFilter(timestamp)
    GROUP BY title
    ORDER BY plays DESC
    LIMIT 10
  • Playback Errors

    SELECT
      toStartOfMinute(timestamp) AS time,
      JSONExtractString(payload, 'reason') AS error_reason,
      count(*) AS count_errors
    FROM epas_default
    WHERE event = 'stopped'
      AND JSONExtractString(payload, 'reason') = 'error'
      AND $__timeFilter(timestamp)
    GROUP BY time, error_reason
    ORDER BY time
Screenshot 2025-05-07 at 11 41 37 Screenshot 2025-05-07 at 11 41 19 Screenshot 2025-05-07 at 11 43 01