Event Forwarding Playground - CrowdStrike/logscale-community-content GitHub Wiki

The Event Forwarding Playground is a self-contained Docker environment for learning how to set up Event Forwarding in LogScale, for analyzing and testing the format of forwarded events on the Kafka topic, and for testing Kafka subscribers that collect events from the topic.

The following two diagrams show what the Playground contains and how to use it.

Containers and their interactions

Once the Playground is up, three containers are running: the latest stable releases of LogScale and Kafka from the Humio Docker repository, plus a version of Kafka Connect. These instances are not configured to persist their configuration, so you may lose settings between restarts.

Using docker-compose, the containers communicate over a shared network and are reachable via port mappings from localhost on the host environment or, if allowed, directly from external systems.

Detailed interaction and flow through the environment

When running and configured for Event Forwarding as per the steps detailed below, the Playground follows this workflow:

  1. Test events are POSTed to the LogScale raw ingest API using the configured ingest token.
  2. The associated parser is invoked to extract key/value fields.
  3. An Event Forwarding rule sends events matching the query "type=forward" to the Event Forwarder.
  4. The Event Forwarder places the event on the configured Kafka topic.
  5. The contents of the topic can be viewed with the kafka-console-consumer CLI from inside the Kafka container.
  6. Kafka Connect monitors the topic for new events; when one arrives, Kafka Connect picks it up and applies a default JSON formatter.
  7. Kafka Connect sends the event back to LogScale, into a different repository, using the HEC endpoint and its associated ingest token.

Note: The playground includes the LogScale and Splunk Kafka Connect plugins. Other plugins can be added for testing by copying them to the data directory and referencing them in the Kafka Connect configuration.
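The filtering in step 3 can be illustrated with a short Python sketch. This is a simplified model of the rule's behaviour, not LogScale code; `should_forward` is a hypothetical helper name:

```python
import json

# Hypothetical model of the Event Forwarding rule "type=forward":
# only events whose "type" field equals "forward" reach the Kafka topic.
def should_forward(event: dict) -> bool:
    return event.get("type") == "forward"

# The two test events used later in this walkthrough.
events = [
    {"level": "INFO", "type": "forward", "message": "Forward this on to the Kafka Queue"},
    {"level": "INFO", "type": "ignore", "message": "Do not forward this on"},
]

# Serialize only the matching events, as the forwarder would
# before placing them on the configured topic.
forwarded = [json.dumps(e) for e in events if should_forward(e)]
```

Of the two test events above, only the `"type": "forward"` one ends up in `forwarded`; the `"type": "ignore"` event stays in the source repository only.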

Set up the Playground

  1. Clone the LogScale Community Content repository

    git clone https://github.com/CrowdStrike/logscale-community-content.git

  2. Change to the EventForwarding-Playground directory

    cd logscale-community-content/Config-Samples/EventForwarding-Playground

  3. Follow the steps to download and build the Kafka Connect sink for HEC here - https://github.com/humio/kafka-connect-hec-sink. Once built, copy the jar file to the data/connect-jars directory

    cp target/kafka-connect-hec-sink-1.0-SNAPSHOT-jar-with-dependencies.jar logscale-community-content/Config-Samples/EventForwarding-Playground/data/connect-jars/.

  4. Start the playground

    docker-compose up

Detailed instructions for setting up Event Forwarding

Once the environment is running, access the LogScale management UI on port 8080. It should be accessible from localhost as well as the host's remote address - http://localhost:8080

  1. You will need to provide a valid Self-Hosted License file when prompted. A Trial license can be obtained from here - https://www.crowdstrike.com/products/observability/falcon-logscale/. Enterprise customers with active subscriptions should reach out to their Account Manager.

  2. Create a new Repository called "Repo1"

  3. Create an Ingest Token called "Repo1" and set the parser to "json". Take a copy of the Ingest Token; you will need it later where it is referenced as [REPO1 INGEST TOKEN]

  4. Send two test events and confirm that they appear in Repo1.

curl http://[LOGSCALE URL]:8080/api/v1/ingest/raw \
  -X POST \
  -H "Authorization: Bearer [REPO1 INGEST TOKEN]" \
  -d '{"level":"INFO","type":"forward","message":"Forward this on to the Kafka Queue"}'

curl http://[LOGSCALE URL]:8080/api/v1/ingest/raw \
  -X POST \
  -H "Authorization: Bearer [REPO1 INGEST TOKEN]" \
  -d '{"level":"INFO","type":"ignore","message":"Do not forward this on"}'
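The two curl calls above can also be built programmatically, for example with Python's standard library. This is a sketch: the URL and the token placeholder mirror the curl examples and must be replaced with your own values, and `build_raw_ingest_request` is a hypothetical helper name:

```python
import json
import urllib.request

LOGSCALE_URL = "http://localhost:8080"   # adjust to your host
INGEST_TOKEN = "[REPO1 INGEST TOKEN]"    # paste the token from step 3

def build_raw_ingest_request(event: dict) -> urllib.request.Request:
    """Build the same POST the curl commands above send to /api/v1/ingest/raw."""
    return urllib.request.Request(
        f"{LOGSCALE_URL}/api/v1/ingest/raw",
        data=json.dumps(event).encode("utf-8"),
        headers={"Authorization": f"Bearer {INGEST_TOKEN}"},
        method="POST",
    )

req = build_raw_ingest_request(
    {"level": "INFO", "type": "forward", "message": "Forward this on to the Kafka Queue"}
)
# urllib.request.urlopen(req)  # uncomment to actually send the event
```

The `urlopen` call is left commented out so the snippet can be read and run without a live Playground.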

  5. Create a new Event Forwarder

    My Organization | Event Forwarder | New Event Forwarder

    Specify the following settings:

Name: NewEventForwarder

Topic: forwardedEvents

Properties:
  * bootstrap.servers=broker:29092
  * batch.size=100

  6. Create a Forwarding rule that forwards everything with "type=forward" to the new Event Forwarder

    Repo1 | Settings | Event Forwarding | Add Forwarding Rule

    Specify the following settings:

The results of: type=forward
Are forwarded through: NewEventForwarder

If available, Test the Connection

  7. REPEAT STEP 4

    Once you have repeated Step 4, you should see 2 new log messages in Repo1

  8. DISPLAY TOPICS

    From the host running the Playground, open a shell in the Kafka container and list the topics

    docker exec -it broker bash

    /bin/kafka-topics --bootstrap-server=localhost:9092 --list

    You should see a Topic called "forwardedEvents"

  9. DISPLAY TOPIC MESSAGES

    From the same shell opened in the previous step, issue the following command

    /bin/kafka-console-consumer --bootstrap-server localhost:9092 --topic forwardedEvents --from-beginning

    You should see one event, the one with "type=forward"

  10. CREATE A NEW REPO TO RECEIVE FORWARDED LOGS

    Create a new Repository called "Repo2"

    Create an Ingest Token called "Repo2" and set the parser to "json". Take a copy of the Ingest Token; you will need it later where it is referenced as [REPO2 INGEST TOKEN]

  11. CONFIGURE KAFKA CONNECT

    Update the value of [REPO2 INGEST TOKEN] in the following command and run it on the command line of the host running the Playground

curl -s http://localhost:8083/connectors -X POST -H "Content-Type: application/json" -d '{
  "name": "kafka-connect-hec",
  "config": {
    "name": "kafka-connect-hec",
    "connector.class": "com.humio.connect.hec.HECSinkConnector",
    "tasks.max": "3",
    "topics": "forwardedEvents",
    "humio.hec.buffer_size" : 1,
    "humio.repo": "Repo2",
    "humio.hec.url": "http://humio:8080/api/v1/ingest/hec",
    "humio.hec.ingest_token": "[REPO2 INGEST TOKEN]",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "key.converter.schemas.enable": "false",
    "value.converter.schemas.enable": "false"
  }
}'
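The same connector registration can be scripted. This Python sketch builds the identical JSON payload and POST request against the Kafka Connect REST API (localhost:8083, as in the curl example); the `urlopen` call is left commented out so nothing is sent when reading along offline:

```python
import json
import urllib.request

CONNECT_URL = "http://localhost:8083/connectors"  # Kafka Connect REST API in the Playground
REPO2_TOKEN = "[REPO2 INGEST TOKEN]"              # paste the token from the previous step

# Same connector definition as the curl command above.
connector = {
    "name": "kafka-connect-hec",
    "config": {
        "name": "kafka-connect-hec",
        "connector.class": "com.humio.connect.hec.HECSinkConnector",
        "tasks.max": "3",
        "topics": "forwardedEvents",
        "humio.hec.buffer_size": 1,
        "humio.repo": "Repo2",
        "humio.hec.url": "http://humio:8080/api/v1/ingest/hec",
        "humio.hec.ingest_token": REPO2_TOKEN,
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "key.converter": "org.apache.kafka.connect.storage.StringConverter",
        "key.converter.schemas.enable": "false",
        "value.converter.schemas.enable": "false",
    },
}

req = urllib.request.Request(
    CONNECT_URL,
    data=json.dumps(connector).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to register the connector
```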

  12. REPEAT STEP 4

    Once you have repeated Step 4, you should see 2 new events in Repo1 and the single event matching "type=forward" in Repo2.