Demonstration 1 - adjacentlink/python-etce-tutorial GitHub Wiki

Purpose

Demonstration 1 contains a simple example of an ETCE Test Directory. We'll use it to show the basic ETCE workflow and to introduce these main ideas:

  • An ETCE Test defines the way a sequence of applications is executed on a set of hosts. In an EMANE context, an ETCE Test corresponds to an EMANE emulation.
  • An ETCE Test is defined in an ETCE Test Directory - a directory of application configurations, organized by host, that must contain the ETCE files test.xml and steps.xml.
  • The etce-test application operates on ETCE Test Directories - to list and execute the underlying Tests.
  • Executing a Test results in configurations and outputs written in a structured way to the ETCE Work Directory.

Learning to use ETCE means learning to create ETCE Test Directories, to run etce-test, and to navigate the ETCE Work Directory to troubleshoot or analyze test outputs.

Activity 1 - Listing Tests

Each of the demonstration subdirectories in the ETCE Tutorial is an ETCE Test Directory. Running etce-test list on the parent tutorial directory lists the names of all of the ETCE Test Directories underneath.

[etceuser@host]$ git clone https://github.com/adjacentlink/python-etce-tutorial
Cloning into 'python-etce-tutorial'...
remote: Enumerating objects: 126, done.
remote: Counting objects: 100% (126/126), done.
remote: Compressing objects: 100% (50/50), done.
remote: Total 126 (delta 69), reused 121 (delta 64), pack-reused 0
Receiving objects: 100% (126/126), 107.93 KiB | 0 bytes/s, done.
Resolving deltas: 100% (69/69), done.
Checking connectivity... done.

[etceuser@host]$ ls -1 python-etce-tutorial
01.hello_etce
02.hello_lxc
03.emane
04.templates
config
COPYING
README.md
scripts
start_demo.sh
stop_demo.sh

[etceuser@host]$ etce-test list python-etce-tutorial
01.hello_etce
02.hello_lxc
03.emane
04.templates

Running with the -v (verbose) option and adding a test name prints the test's location and description. The overlays section lists the names of template variables present in the test configuration - we'll cover these later.

[etceuser@host]$ etce-test list -v python-etce-tutorial 01.hello_etce
-------------
01.hello_etce
-------------
location:
  python-etce-tutorial/01.hello_etce
description:

    Demonstrate a minimal test with
    a single field node (localhost)
    and a single step (say.hello).

overlays:

Activity 2 - Required Files

An ETCE Test Directory requires two top level configuration files test.xml and steps.xml.

[etceuser@host]$ tree python-etce-tutorial/01.hello_etce/
python-etce-tutorial/01.hello_etce/
|__ doc
|   |__ hostfile
|__ localhost
|   |__ hello.args
|__ steps.xml
|__ test.xml

At its most basic test.xml contains name and description elements to document the test:

<test>
  <name>01.hello_etce</name>

  <description>
    Demonstrate a minimal test with
    a single field node (localhost)
    and a single step (say.hello).
  </description>
</test>

steps.xml specifies the user defined sequence of steps that make up the test. Each step element requires a name and encloses one or more run or stop elements that are associated with an application. The wrapper attribute names the ETCE Wrapper to invoke to run or stop the associated application on the test hosts. More on Wrappers later.

Demonstration 1 has one step that calls run on a wrapper named utils.hello.

<steps>
  <step name="say.hello">
    <run wrapper="utils.hello"/>
  </step>
</steps>

The 01.hello_etce Test Directory also contains two subdirectories. doc is an ETCE reserved name for a Test Directory subdirectory that stores documents not treated as test configuration. More precisely, ETCE does not treat files in doc as template configuration files that can be parsed and filled in with template variable values (more on this later). A typical use of doc is to store test related artifacts that aren't part of the test configuration - test descriptions (above and beyond what you'd include in test.xml) or past results, for example.

Here we're using doc to store another important ETCE file - the ETCE Host File. An ETCE Host File lists the names of the hosts available for running the test. The Host File is not strictly part of the test definition. Frequently one Host File will apply to several tests. For the tutorial demos, we follow a pattern of placing the Host File for each test in doc. Demonstration 1 runs on one node, localhost. Accordingly, the Host File has one entry:

[etceuser@host]$ cat 01.hello_etce/doc/hostfile
localhost

After looking at the Host File, the other subdirectory, localhost, becomes easy to understand. localhost contains configuration files for applications that will run on localhost during the test. As localhost is the only participant in this first test, it is the only host subdirectory in this Test Directory. Later we'll see more complicated definitions.

In the present case localhost contains one file, hello.args. As we'll see, hello.args corresponds to utils.hello, the single application we've specified to run in steps.xml for this test.

Activity 3 - Run The Demo

The top level tutorial directory contains a start_demo.sh script for running the demos. We'll use this script initially as it calls two or three different applications in the correct order. Later, we'll invoke the commands separately.

[etceuser@host]$ cd python-etce-tutorial

[etceuser@host]$ ./start_demo.sh -h
usage: start_demo.sh [-e ENVFILE] [-p SSHPORT] [-k SSHKEYFILE] SSHUSER DEMODIR

The script requires two parameters. SSHUSER names the user account to use to connect to the test's hosts via SSH to run the test applications. DEMODIR is the test directory to execute.

Demonstration 1 executes on localhost only and does not invoke any privileged applications, so use your user name (in place of etceuser).

[etceuser@host]$ ./start_demo.sh etceuser 01.hello_etce
sshuser=etceuser
demodir=01.hello_etce

Checking ssh connections on port 22 ...
host

TestCollection:
  01.hello_etce

Enter passphrase for /home/etceuser/.ssh/id_rsa:
===============
BEGIN "01.hello_etce" trial 1
Skipping host "localhost". Source and destination are the same.
Trial Start Time: 2019-05-09T16:01:20
----------
testprepper run 2019-05-09T16:01:20 data/etcedemo-01.hello_etce-20190509T160038/template data/etcedemo-01.hello_etce-20190509T160038/data
[localhost]
[localhost] Publishing 01.hello_etce to /tmp/etce/current_test
----------
step: say.hello 2019-05-09T16:01:20 data/etcedemo-01.hello_etce-20190509T160038/data
[localhost] /bin/echo "Hello ETCE!"
trial time: 0000001
----------
Collecting "01.hello_etce" results.
Skipping host "localhost". Source and destination are the same.
----------
END "01.hello_etce" trial 1
===============
Result Directories:
        /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038

Activity 4 - What Happened?

Each test run executes the application sequence defined by the Test Directory. Test artifacts are written and collected to the ETCE Work Directory. Let's break the test down into individual steps.

Step 1 - Publish The Test

The Test Directory is published to the current_test subdirectory of the ETCE Work Directory (/tmp/etce by default) on each test host:

----------
testprepper run 2019-05-09T16:01:20 data/etcedemo-01.hello_etce-20190509T160038/template data/etcedemo-01.hello_etce-20190509T160038/data
[localhost]
[localhost] Publishing 01.hello_etce to /tmp/etce/current_test

To publish a Test Directory means to auto-generate any test configuration that derives from template files or directories into final, usable form. There are no templates in our simple example, so the published version looks the same as the original:

[etceuser@host]$ tree /tmp/etce/current_test
/tmp/etce/current_test
|__ localhost
|   |__ hello.args
|__ nodefile.txt
|__ steps.xml
|__ test.xml

The only notable difference is that the doc subdirectory is not included in the output since it is not part of the test configuration.

This publish step also creates a data directory on each host to hold files generated by the test applications. The output directory is placed in the data subdirectory of the ETCE Work Directory. In this example:

/tmp/etce/data/etcedemo-01.hello_etce-20190509T160038

The directory name is autogenerated with the format TAG-TESTNAME-TIMESTAMP. The TAG is a required argument to etce-test run and is set to etcedemo in the start_demo.sh script. The TESTNAME comes directly from test.xml, and TIMESTAMP is the date-time at which the test is launched.
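The naming rule can be sketched in Python. The function name below is hypothetical (this is not etce-test's actual code), but the TAG-TESTNAME-TIMESTAMP pattern and the timestamp format match the demo output above:

```python
from datetime import datetime

def output_dir_name(tag, testname, when=None):
    """Build a TAG-TESTNAME-TIMESTAMP directory name, as seen in the
    demo output, e.g. etcedemo-01.hello_etce-20190509T160038.
    Illustrative sketch only, not the actual etce-test implementation."""
    when = when or datetime.now()
    return '%s-%s-%s' % (tag, testname, when.strftime('%Y%m%dT%H%M%S'))

print(output_dir_name('etcedemo', '01.hello_etce',
                      datetime(2019, 5, 9, 16, 0, 38)))
# -> etcedemo-01.hello_etce-20190509T160038
```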

Step 2 - Run The Steps

The heart of the test occurs when etce-test executes each of the steps from steps.xml. In this case, the single step say.hello:

----------
step: say.hello 2019-05-09T16:01:20 data/etcedemo-01.hello_etce-20190509T160038/data
[localhost] /bin/echo "Hello ETCE!"
trial time: 0000001

Under the hood, etce-test invokes the etce-exec-field application over the SSH connection to each host, passing it the name of the step to run. On each host etce-exec-field examines the current_test directory, switches its current working directory to the appropriate host subdirectory (the one that matches the local host name), and runs/stops the step applications through the associated ETCE Wrappers.
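The host-subdirectory selection described above can be sketched as follows. The function is hypothetical, not etce-exec-field's actual code; it just illustrates matching a subdirectory of the published test against the local host name:

```python
import os
import socket

def host_config_dir(current_test_dir):
    """Return the host subdirectory of the published test matching the
    local host name - a sketch of the behavior described above, not
    the actual etce-exec-field implementation."""
    hostname = socket.gethostname()
    candidate = os.path.join(current_test_dir, hostname)
    if os.path.isdir(candidate):
        return candidate
    raise RuntimeError('no configuration subdirectory for host %s' % hostname)
```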

One item worth noting here - each test step is run to completion on all test hosts before progressing to the next step. Because the test steps are defined by the user, this behavior is useful where test applications depend on start-up order; applications that must be launched in a specific sequence can be placed in separate steps accordingly.
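For example, a steps.xml with two steps (the utils.server and utils.client wrapper names here are hypothetical, not part of this demo) guarantees that the server application is running on every host before any client is launched:

```xml
<steps>
  <!-- step 1 completes on all hosts before step 2 begins -->
  <step name="start.server">
    <run wrapper="utils.server"/>
  </step>
  <step name="start.client">
    <run wrapper="utils.client"/>
  </step>
</steps>
```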

In the same way that the test configuration files are organized by host, the test applications write their outputs to per-host subdirectories of the data subdirectory of the output directory:

[etceuser@host]$ tree /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data
/tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data
|__ localhost
    |__ etce.store
    |__ hello.log

Step 3 - Collect The Outputs

In the final step, output directory contents from each of the test hosts are collected back to the output directory where etce-test was invoked. In our case, ETCE detects that our source and destination nodes are the same and skips the collection step:

----------
Collecting "01.hello_etce" results.
Skipping host "localhost". Source and destination are the same.
----------
END "01.hello_etce" trial 1
===============
Result Directories:
        /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038

Activity 5 - Examine The Output

Taking a look again at our output directory, we see two output files generated by localhost:

[etceuser@host]$ tree /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data
/tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data
|__ localhost
    |__ etce.store
    |__ hello.log

hello.log is generated by the utils.hello ETCE Wrapper. You can see its contents coming a mile away:

[etceuser@host]$ cat /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data/localhost/hello.log
Hello ETCE!

etce.store is a JSON format file generated by ETCE. It is a data cache available to all Wrappers to store meta information about the test. It is organized by host and Wrapper name, with the section etce reserved for ETCE.

[etceuser@host]$ cat /tmp/etce/data/etcedemo-01.hello_etce-20190509T160038/data/localhost/etce.store
{"localhost": {"etce": {"starttime": "2019-05-09T16:01:20"}}}
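Because etce.store is plain JSON, post-analysis scripts can read it with the standard library. A minimal sketch (the helper name is an assumption; only the host/wrapper layout comes from the description above):

```python
import json

def read_start_times(store_path):
    """Yield (host, starttime) pairs from an etce.store file.
    etce.store is keyed by host name, then by wrapper name, with the
    'etce' section reserved for ETCE itself."""
    with open(store_path) as f:
        store = json.load(f)
    # one top-level key per host that wrote to the cache
    for host, sections in store.items():
        yield host, sections['etce']['starttime']
```

For the demo run above, iterating this generator over the localhost etce.store file yields ('localhost', '2019-05-09T16:01:20').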

ETCE currently stores one piece of information in etce.store: the start time of the test. This time is passed to all Wrappers and is distinct from the timestamp in the output directory name. It is a synchronization point for all Wrappers to share as a common time reference. In the context of EMANE, the start time serves as the T=0 point for the emulation scenario. ETCE records the start time to aid post-analysis, making it possible to time-align data across applications.

etce-test calculates the test start time as a fixed delay from the time it is launched. The delay allows all test applications to be up and ready to run when the start time occurs. The --delaysecs option allows the user to set the delay; a common use is to extend the delay for larger tests, where more time may be required to get all applications running.
