# Getting Started
This section shows what the Simulation Tool looks like from the users' point of view, and provides a quick description of how to use it in combination with GreatSPN.
## UML Modelling
First, it is worth recalling that the modeling phase is carried out using Papyrus. Since there is extensive documentation on how to use this tool to create profiled UML models, we will not provide details on its usage.
The next figure shows a general view of the Papyrus modeling perspective.
On the left of the figure, the different explorers (`Project Explorer`, `Model Explorer` and `Outline`) are shown. The rest of the figure shows the `Model Editor` and the `Properties` view. The model itself is depicted in the canvas of the `Model Editor`.
### MARTE Profiling
Profiles, stereotypes and tagged values are defined using the `Properties` view (`Profile` tab).
The following images show, in the `Properties` view, some tagged values that are applied to model elements. Specifically:

- The first figure shows the `inserDB` element, stereotyped as `GaStep`, and its `hostDemand` tagged value. The latter is defined as `(expr=$timeAdd, unit=ms, statQ=mean, source=est)`, where `$timeAdd` is an input parameter representing a mean time duration in milliseconds (`ms`).
- The second figure shows the selected control flow, stereotyped as `GaStep`, and its `prob` tagged value. The latter is defined as `(expr=1-$probActLoop)`, where `$probActLoop` is an input parameter.
- The third figure shows the `start` element (the initial node), stereotyped as `GaWorkloadEvent`, and its `pattern` tagged value. The latter is defined as `open=(arrivalRate=(expr=$arrRate, unit=Hz, statQ=mean, source=est))`, that is, an open workload characterized by a mean arrival rate input parameter (`$arrRate`) whose unit is Hertz (`Hz`).
Tagged values are specified in Papyrus-MARTE using the Value Specification Language (VSL). As already seen in the previous figures, we can specify model input parameters (`$timeAdd`, `$probActLoop` and `$arrRate`) that will be set to actual values in the simulation configuration step (next section).
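For convenience, the three VSL annotations of the running example are collected below, exactly as described above (the `--` lines are descriptive labels added here for readability, not VSL syntax):

```
-- GaStep on the inserDB element
hostDemand = (expr=$timeAdd, unit=ms, statQ=mean, source=est)

-- GaStep on the selected control flow
prob = (expr=1-$probActLoop)

-- GaWorkloadEvent on the start node
pattern = open=(arrivalRate=(expr=$arrRate, unit=Hz, statQ=mean, source=est))
```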
## Performance analysis
An SSH connection to the simulation server must be configured before launching the simulation (see the Configure GreatSPN SSH Connection section in the First-Steps page). The simulation server hosts the GreatSPN tool, which is in charge of computing the performance metrics selected by the user. The SSH connection is configured in `Eclipse > Preferences > Simulation Tools > SSH connection`.
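The exact fields depend on the tool version, but a preference page of this kind usually asks at least for the host and the account used to reach the server; the values below are purely hypothetical placeholders:

```
Host:     greatspn.example.org   -- hypothetical address of the simulation server
Port:     22                     -- standard SSH port
Username: dice                   -- hypothetical account on the server
```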
### Simulation configuration
The first step is to set up the configuration for the simulation experiments. Therefore, we need to open the `Run Configurations...` window, either by selecting the `Run as -> Run Configurations...` option from the contextual menu associated with the UML model (`Project Explorer` view), or by clicking the clock button (marked in red in the figure below).
**Selecting the `Run as -> Run Configurations...` option**

In this case we can create a new `DICE Simulation` configuration from scratch in the `Run Configurations...` window:

**Clicking the clock button**

In this case a `DICE Simulation` configuration (called `Model`) is created and filled with the information retrieved from the UML model profiled with MARTE:
Note that, depending on the annotations defined in the UML model, the `Model` configuration can be partially filled or complete. In the running example it is complete, and a simulation experiment can be run without any changes.
The figure above shows the `Main` tab of the `Model` configuration, where it is possible to select:
- The `Model to Analyse`, i.e., a UML model annotated with MARTE, by browsing the workspace
- The `Active scenario`, by selecting one of the `GaScenario` elements available in the model (the running example includes only one)
- The `NFP to calculate`, i.e., the type of analysis: `Performance` or `Reliability`
Moreover, the tables shown at the bottom of the `Main` tab can be used to customize the values assigned to the input parameters specified in the UML model.
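For instance, a sweep over the arrival rate while keeping the other parameters fixed could be set up as follows. All values are hypothetical and only illustrate the mechanism; the actual values are entered in the tables of the `Main` tab:

```
$timeAdd     = 10.0                    -- fixed mean host demand, in ms (hypothetical)
$probActLoop = 0.2                     -- fixed loop probability (hypothetical)
$arrRate     = 0.02, 0.04, ..., 0.20   -- ten arrival rate values, in Hz (hypothetical)
```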
The `Filters` tab of the `Model` configuration is shown in the following figure:
It includes two panels:
- The `Measure` panel, where it is possible to select/deselect the metrics to be estimated during the simulation experiment. These metrics, like the input parameters, are retrieved from the UML model annotated with MARTE.
- The `Sensitivity analysis` panel, where it is possible to select/deselect the input parameter configurations. Observe that the tool generates all the possible parameter configurations from the ranges of values assigned to the input parameters in the `Main` tab.
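The number of configurations listed in this panel is the Cartesian product of the value lists assigned in the `Main` tab. Under the hypothetical assignment sketched above (ten values for `$arrRate`, one each for `$timeAdd` and `$probActLoop`), the tool would generate 10 × 1 × 1 = 10 configurations, which is consistent with the ten simulation runs of the running example.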
The `Parameters` tab of the `Model` configuration is shown in the following figure:
It includes the `General` and `Simulation parameters` of the GreatSPN simulator.
In particular, the following are relevant for controlling the duration of a simulation run:
- `Maximum simulation execution time`: a simulation run lasts at most the time set in this parameter
- `Confidence level`: the level of the confidence intervals computed for the metrics of the performance model
- `Accuracy`: the accuracy of the estimated metrics of the performance model. It is expressed as a percentage and must be an integer number (the lower the value, the higher the accuracy).
This tab also shows the path of the GreatSPN simulator executable on the simulation server (`WNSIM File Path`).
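As a hypothetical illustration of how these parameters interact (the values and the time unit are assumptions, not tool defaults):

```
Maximum simulation execution time = 600   -- cap on each run; unit as shown by the tool (hypothetical)
Confidence level                  = 95    -- 95% confidence intervals (hypothetical)
Accuracy                          = 10    -- integer percentage; lower value = higher accuracy (hypothetical)
```

With such settings, a run stops as soon as every selected metric has been estimated within the requested accuracy at the chosen confidence level, or when the execution time cap is reached, whichever comes first.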
Remember to save the changes made in each tab of the `Model` configuration window by clicking the `Apply` button before launching a simulation experiment.
### Running a simulation experiment
A simulation experiment is run by clicking the `Run` button in the `Model` configuration window; it consists of as many simulation runs as the number of configurations selected in the `Filters` tab. In the running example, a simulation experiment consists of 10 simulation runs.
The simulation can be monitored using the `DICE Simulation` perspective, which can be opened by clicking the clock button (marked in red in the figure below):
The following figure shows the `DICE Simulation` perspective while the simulation experiment is running. In the figure, three key views can be identified:
- The `Debug` view, which shows information about the `Simulation process` (identifier, state, exit value, etc.);
- The `Console` view, which shows the messages that the simulation process writes to the standard output and standard error streams. In the case of GreatSPN, these messages make it possible to monitor the accuracy achieved by the running process and the number of simulation steps performed so far. If an error occurs during the simulation, it is reported in the `Console` view;
- The `Invocation Registry` view, which shows the starting/ending times and the status of the simulation runs belonging to the simulation experiment.
In the `DICE Simulation` perspective it is also possible to stop the simulation process at any moment by using the `Stop` button of the GUI (marked in red in the figure).
When the simulation finishes, the user can still access the simulation console and the simulation process information (until the `Console` view is cleared using the corresponding button).
As the next image shows, all the simulation runs terminated correctly (exit value `0`) except the second-to-last one, which terminated with exit value `-10`, meaning that the run reached the maximum simulation execution time without achieving the requested accuracy for all the estimated metrics of the performance model (remember that the maximum simulation execution time and the accuracy are two parameters set in the `Parameters` tab of the `Model` configuration window). Therefore, the simulation results are not saved for that run; increasing the execution time cap or relaxing the accuracy may allow such a run to complete.
### Simulation results
The results of a simulation experiment are reported in both textual and graphical formats.
#### Performance metrics of a simulation run
From the `Invocation Registry` view it is possible to see the estimated performance metrics by right-clicking on a particular simulation run and selecting the `Open Simulation Result` option from the contextual menu, as shown in the following figure:
A new view, labeled with the id of the simulation run, pops up above the `Invocation Registry` view, as shown in the following figure:
For each performance metric selected in the `Filters` tab of the `Model` configuration window, the estimated mean value is shown.
#### Performance curves of the simulation experiment
When a simulation experiment consists of a set of simulation runs, it is possible to generate 2D plots showing the trends of the estimated performance metrics against an input parameter, over the range of values set during the configuration step.
To generate the 2D plots, we consider again the `Invocation Registry` view and right-click on the simulation experiment, as shown in the following figure:
The contextual menu shows the `Plot Results...` option, which launches a wizard for the plot generation. The following three figures show the windows that pop up in the wizard:
When the three steps are completed, the plot file (`data.plot`) is saved in the project and a new view pops up with the 2D plots.
The figure below shows the 2D plot of the utilization metric vs. the arrival rate. The system is clearly not stable for arrival rates greater than 0.14 Hz, where the utilization saturates; this explains the possibly long simulation runs that may occur for such parameter configurations.