Phasespace Tracking Code Overview

Overview

This section of the Wiki will cover the current implementation of the code for the PhaseSpace motion tracking system. The code is located under /SPOT/Custom_Library/PhaseSpace_Cameras/PHASESPACE/.

Device Driver

For details on how a device driver class functions, refer to the relevant Wiki page. Here, we will focus on the specifics of the PhaseSpace code. The current implementation of the cameras operates on a host/client structure: one platform is designated as the host, and the host initializes the cameras and receives ALL of the position data for both the host spacecraft and the client spacecraft(s). The host then transmits the camera data to the client(s) via UDP. Any platform can be the host when there is only one platform active. However, when more than one platform is active, only the RED platform can be the host (for now); so, when RED and BLACK are both active, RED must be the host. This will likely change in a future update with the introduction of the third BLUE platform.
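
As an illustration of this rule only (the helper below does not exist in the SPOT code), the current hosting logic can be summarized as:

/* Hypothetical sketch of the hosting rule described above. BLUE is omitted
   because it is not yet supported by the camera code. */
enum Platform { RED, BLACK };

Platform select_camera_host(bool red_active, bool black_active)
{
    /* With more than one active platform, RED must always be the host. */
    if (red_active && black_active)
        return RED;

    /* With a single active platform, that platform hosts the cameras itself. */
    return red_active ? RED : BLACK;
}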

The most salient code pieces from the driver are as follows. In this code block, the PhaseSpace system is initialized at a given samplerate (as set by the user), and then the function 'stream_phasespace' is executed at each step - more on this function later. The only change that typically needs to be made by the user is the samplerate, which is set in the Run_Initializer script. In the future, when the BLUE platform is activated, this structure is likely to change. Note that the position outputs from the C++ function are divided by 1000 to convert the units from mm to m before the user sees them in Simulink; the time output is divided by the samplerate, and the two attitude outputs are passed through unchanged.

methods (Access=protected)
    function setupImpl(obj) 
        if isempty(coder.target)
            % Place simulation setup code here
        else
            % Call C-function implementing device initialization
             coder.cinclude('owl.hpp');
             coder.cinclude('phasespace_headers.h');
             coder.ceval('initialize_phasespace',obj.platformSelection,obj.PS_SampleRate);
        end
    end
    
    function y = stepImpl(obj)
        
        y1 = double(0);
        y2 = double(0);
        y3 = double(0);
        y4 = double(0);
        y5 = double(0);
        y6 = double(0);
        y7 = double(0);
        y8 = double(0);
        y9 = double(0);
        y10 = double(0);
        y11 = double(0);
        y12 = double(0);
        y13 = double(0);
        y   = zeros(1,13,'double');

        if isempty(coder.target)
            % Place simulation output code here
        else
            
            coder.ceval('stream_phasespace',coder.ref(y1),...
                         coder.ref(y2),coder.ref(y3),coder.ref(y4),...
                         coder.ref(y5),coder.ref(y6),coder.ref(y7),...
                         coder.ref(y8),coder.ref(y9),coder.ref(y10),...
                         coder.ref(y11),coder.ref(y12),coder.ref(y13),...
                         obj.platformSelection);
            y  =[y7/obj.PS_SampleRate, y1/1000, y2/1000, y3, y4/1000, y5/1000, y6, y8/1000, y9/1000, y10/1000, y11/1000, y12/1000, y13/1000];

        end
    end
    
    function releaseImpl(obj) %#ok<MANU>
        if isempty(coder.target)
            % Place simulation termination code here
        else
            % Call C-function implementing device termination
            coder.ceval('terminate_phasespace');
        end
    end
end

Regardless of which platforms are active, this driver will always output 13 parameters in a [1,13] vector of doubles (a hypothetical indexing sketch follows the list). These are, in order:

  • Time
  • RED X-Position
  • RED Y-Position
  • RED Attitude
  • BLACK X-Position
  • BLACK Y-Position
  • BLACK Attitude
  • ARM Elbow X-Position
  • ARM Elbow Y-Position
  • ARM Wrist X-Position
  • ARM Wrist Y-Position
  • ARM End-Effector X-Position
  • ARM End-Effector Y-Position

Source Code

The only custom piece of source code is the file phasespace_functions.cpp, located under src. This file contains the three primary functions for the cameras (a sketch of how they fit together follows the list):

  • initialize_phasespace() - This function initializes the options for the camera system.
  • stream_phasespace() - This function is executed during each loop of the Simulink diagram. It checks if data is available from the cameras; if there is data, it is returned to Simulink.
  • terminate_phasespace() - This function stops all cameras from running and closes all open communication ports/clears all buffers.
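
Outside of Simulink, the intended call pattern for these three functions is simple: initialize once, stream every step, terminate at the end. The sketch below shows a hypothetical standalone host loop; in practice the generated driver code performs these calls, and phasespace_headers.h is assumed to declare the three functions.

#include "phasespace_headers.h" // assumed to declare the three functions used below

int main()
{
    /* The platform selection and samplerate would normally come from the
       Run_Initializer script; the values here are placeholders. */
    const double platformSelection = 1.0;
    const double PS_SampleRate     = 10.0;

    /* One-time camera setup (equivalent to setupImpl in the driver). */
    initialize_phasespace(platformSelection, PS_SampleRate);

    /* Streaming loop (equivalent to stepImpl). Each call fills the outputs
       with the latest rigid-body poses and marker positions, in mm. */
    for (int step = 0; step < 1000; ++step)
    {
        double t = 0, xr = 0, yr = 0, ar = 0, xb = 0, yb = 0, ab = 0;
        double ex = 0, ey = 0, wx = 0, wy = 0, eex = 0, eey = 0;

        stream_phasespace(&xr, &yr, &ar, &xb, &yb, &ab, &t,
                          &ex, &ey, &wx, &wy, &eex, &eey,
                          platformSelection);

        /* ... use or transmit the data here ... */
    }

    /* Stop streaming and close the connection (equivalent to releaseImpl). */
    terminate_phasespace();
    return 0;
}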

The initialization function is as follows:

/* initialize_phasespace() initializes the options for the phasespace cameras
   and starts streaming data. */
double initialize_phasespace(double platformSelection, double PS_SampleRate)
{
	/* The address is the IP address for the phasespace computer. */
	std::string address = "192.168.0.109";
	std::string phaseSpaceOptions;
	std::string tracker_id_RED_7_pos_string, tracker_id_RED_1_pos_string;
	std::string tracker_id_RED_3_pos_string, tracker_id_RED_5_pos_string;
	std::string tracker_id_BLACK_15_pos_string, tracker_id_BLACK_9_pos_string;
	std::string tracker_id_BLACK_11_pos_string, tracker_id_BLACK_13_pos_string;

	
	/* The options are sent to the phasespace computer. The most important 
	   setting here is the frequency, which indicates how fast the phasespace
	   sends data up to the groundstation. This frequency must be smaller than
	   the samplerate of the function reading the data. If the frequency is 
	   too high, the buffer will fill up and there will be a growing time delay
	   in the data. */
	phaseSpaceOptions = "profile=all120 frequency=" + std::to_string(PS_SampleRate);
	
	/* These position strings give the location of each LED, in mm, relative
	   to the centre of mass of the platform. */
	tracker_id_RED_5_pos_string = "pos=146.960175,124.9189470,0";
	tracker_id_RED_3_pos_string = "pos=144.960175,-154.081053,0";
	tracker_id_RED_1_pos_string = "pos=-133.039825,-153.581053,0";
	tracker_id_RED_7_pos_string = "pos=-131.539825,124.418947,0";
	tracker_id_BLACK_13_pos_string = "pos=125.944730,153.415965,0";
	tracker_id_BLACK_11_pos_string = "pos=124.944730,-125.084035,0";
	tracker_id_BLACK_9_pos_string = "pos=-153.555270,-124.584035,0";
	tracker_id_BLACK_15_pos_string = "pos=-151.805270,154.415965,0";

	const std::string myoptions = phaseSpaceOptions; 
	
	/* Open the TCP/IP port and send the options string. */
	if (owl.open(address) <= 0 || owl.initialize(myoptions) <= 0)
	{
		return 0; 
	}
	/* Create the rigid tracker for the RED satellite. The "rigid tracker"
	   refers to the syntax used by phasespace, and is used when you want to 
	   track a rigid body. */
	uint32_t tracker_id_RED = 0;
	owl.createTracker(tracker_id_RED, "rigid", "RED_rigid");

	/* Assign markers to the rigid body and indicate their positions
	   w.r.t the centre of mass (obtained from calibration text file) */
	owl.assignMarker(tracker_id_RED, 5, "5", tracker_id_RED_5_pos_string); // top left
	owl.assignMarker(tracker_id_RED, 3, "3", tracker_id_RED_3_pos_string); // top right
	owl.assignMarker(tracker_id_RED, 1, "1", tracker_id_RED_1_pos_string); // bottom right
	owl.assignMarker(tracker_id_RED, 7, "7", tracker_id_RED_7_pos_string); // bottom left 

	uint32_t tracker_id_RED_arm = 1;
	owl.createTracker(tracker_id_RED_arm, "rigid", "RED_arm_flexible");

	owl.assignMarker(tracker_id_RED_arm, 2, "2", "pos=0,0,0"); // elbow 
	owl.assignMarker(tracker_id_RED_arm, 6, "6", "pos=0,0,0"); // wrist
	owl.assignMarker(tracker_id_RED_arm, 0, "0", "pos=0,0,0"); // end-effector

	uint32_t tracker_id_BLACK = 2;
	owl.createTracker(tracker_id_BLACK, "rigid", "BLACK_rigid");

	/* Assign markers to the rigid body and indicate their positions
	   w.r.t the centre of mass (obtained from calibration text file) */
	owl.assignMarker(tracker_id_BLACK, 13, "13", tracker_id_BLACK_13_pos_string); // top left
	owl.assignMarker(tracker_id_BLACK, 11, "11", tracker_id_BLACK_11_pos_string); // top right
	owl.assignMarker(tracker_id_BLACK, 9,  "9", tracker_id_BLACK_9_pos_string); // bottom right
	owl.assignMarker(tracker_id_BLACK, 15, "15", tracker_id_BLACK_15_pos_string); // bottom left

	/* Start streaming phasespace data. Sending (1) streams data using TCP/IP,
	   sending (2) streams data using UDP, and sending (3) streams data using
	   UDP but broadcasts to all IP addresses. */
	owl.streaming(2);

	/* Return a non-zero value to indicate that initialization succeeded. */
	return 1;
}

Most of the critical sections of this code are sufficiently described with a comment. Of note, however, is the following segment:

	/* These position strings give the location of each LED, in mm, relative
	   to the centre of mass of the platform. */
	tracker_id_RED_5_pos_string = "pos=146.960175,124.9189470,0";
	tracker_id_RED_3_pos_string = "pos=144.960175,-154.081053,0";
	tracker_id_RED_1_pos_string = "pos=-133.039825,-153.581053,0";
	tracker_id_RED_7_pos_string = "pos=-131.539825,124.418947,0";
	tracker_id_BLACK_13_pos_string = "pos=125.944730,153.415965,0";
	tracker_id_BLACK_11_pos_string = "pos=124.944730,-125.084035,0";
	tracker_id_BLACK_9_pos_string = "pos=-153.555270,-124.584035,0";
	tracker_id_BLACK_15_pos_string = "pos=-151.805270,154.415965,0";

It is in this segment of code that the results from the centre of mass calibration are placed: each string gives the position of one LED, in mm, relative to the platform's centre of mass.
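
If the calibration is redone, these strings need to be regenerated from the new measurements. Purely as an illustration, a string of this form could be built from a measured LED position and a centre-of-mass offset as sketched below; the helper and the example values are hypothetical and are not part of phasespace_functions.cpp.

#include <cstdio>
#include <string>

/* Build a "pos=x,y,z" string for one LED, given its measured position and the
   platform's centre of mass, both in millimetres (placeholder values only). */
static std::string make_marker_pos_string(double marker_x, double marker_y,
                                          double com_x, double com_y)
{
    char buffer[64];
    std::snprintf(buffer, sizeof(buffer), "pos=%f,%f,0",
                  marker_x - com_x, marker_y - com_y);
    return std::string(buffer);
}

/* Example: tracker_id_RED_5_pos_string = make_marker_pos_string(150.0, 120.0, 3.0, -5.0); */

The next function is the streaming function: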

void stream_phasespace(double* XPOS_red, double* YPOS_red, 
		double* ATTI_red, double* XPOS_black, double* YPOS_black, double* ATTI_black,
		double* current_time, double* ElbowX, double* ElbowY, double* WristX, double* WristY,
		double* EndEffX, double* EndEffY, double platformSelection)
{

	/* Initialize the "event" parameter. This parameter indicates if there is
	   any data available. If there is no data, a zero is returned. */
	const OWL::Event *event = owl.nextEvent(1000);

	/* If the connection is available and the properties are initialized,
	   check if there is data available. If there is data, then check which
	   rigid body has been located. Then, loop through the different IDs 
	   found and store the data. */
	if (owl.isOpen() && owl.property<int>("initialized"))
	{
		if (!event)
		{
			// Do not do anything! There is no good data.
		}
		else if (event->type_id() == OWL::Type::FRAME)
		{
			if (event->find("rigids", rigids) > 0)
			{
				for (OWL::Rigids::iterator r = rigids.begin(); r != rigids.end(); r++)
				{
					if (r->cond > 0) 
					{
						if (r->id == 0)
						{
							*XPOS_red = r->pose[0];
							*YPOS_red = r->pose[1];
							*ATTI_red = atan2(2 * r->pose[4] * r->pose[5] 
									+ 2 * r->pose[3] * r->pose[6], 
									2 * r->pose[3] * r->pose[3] - 1
									+ 2 * r->pose[4] * r->pose[4]);
							*current_time = event->time();
						}
						else if (r->id == 2)
						{
							*XPOS_black = r->pose[0];
							*YPOS_black = r->pose[1];
							*ATTI_black = atan2(2 * r->pose[4] * r->pose[5] 
									+ 2 * r->pose[3] * r->pose[6], 
									2 * r->pose[3] * r->pose[3] - 1
									+ 2 * r->pose[4] * r->pose[4]);
							*current_time = event->time();
						}   
					}
				}
			}
			if (event->find("markers", markers) > 0)
			{
				for (OWL::Markers::iterator m = markers.begin(); m != markers.end(); m++)
				{
					if (m->cond > 0)
					{
						if (m->id == 2)
						{
							*ElbowX = m->x;
							*ElbowY = m->y;
						}
						if (m->id == 6)
						{
							*WristX = m->x;
							*WristY = m->y;
						}
						if (m->id == 0)
						{
							*EndEffX = m->x;
							*EndEffY = m->y;
						}
					}
				}
			}
		}
	}
}

This code first checks that the connection is open and initialized, and that a new frame of data is available. If data is present, it loops through the detected rigid bodies and individual markers and returns their positions via pointer. When the BLUE platform is activated, this section will need to be expanded.
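
The attitude lines deserve a short note. The pose array returned by the PhaseSpace API holds the position in its first three elements, with the remaining four elements describing the orientation as a quaternion; the atan2 expression extracts the yaw angle (the rotation about the table's vertical axis) from that quaternion. A standalone version of the same calculation, assuming a scalar-first quaternion ordering in pose[3] through pose[6], would look like this:

#include <cmath>

/* Extract the yaw angle, in radians, from a scalar-first quaternion [w, x, y, z].
   This mirrors the expression used on the pose array in stream_phasespace:
   yaw = atan2(2(xy + wz), 2(w^2 + x^2) - 1). */
static double quaternion_to_yaw(double w, double x, double y, double z)
{
    return std::atan2(2.0 * x * y + 2.0 * w * z,
                      2.0 * w * w - 1.0 + 2.0 * x * x);
}

/* Usage with a PhaseSpace pose: quaternion_to_yaw(pose[3], pose[4], pose[5], pose[6]); */

The final piece of code is very simple due to its reliance on the PhaseSpace API: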

/* terminate_phasespace() stops all cameras from running and closes all
   communication ports/clears all buffers */
void terminate_phasespace()
{ 
	owl.done();
	owl.close();   
}

This code simply tells the PhaseSpace system to stop streaming and closes the connection.

Simulink Code

Now, let's look at the Simulink code. The template files contain all of the code required for communication with the current setup.

Inside this subsystem, there are two options: either the code is being executed on a single platform (RED, BLACK, BLUE, or RED+ARM), or it is being executed on RED+BLACK or RED+BLACK+ARM.

When operating with a single platform, the device driver can be used as is; the data packet for any inactive platform will be zero-padded.

When more than one platform is active, the RED platform collects all of the data, then transmits it via UDP to the other platforms. Currently, the code only supports the RED and BLACK platforms, and RED is always the host that collects the data packets and transmits them.
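
The templates already contain everything needed to perform this UDP transfer, so the user does not need to write any of it. Purely for illustration, a minimal C++ sketch of an equivalent host-side send is shown below; the client IP address, port number, and raw 13-double framing are hypothetical placeholders, not the values or format used by the SPOT templates.

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

/* Hypothetical sketch: send the 13-double PhaseSpace packet from the host
   (RED) to one client platform over UDP. Returns 0 on success, -1 on error. */
int send_phasespace_packet(const double packet[13])
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0)
        return -1;

    sockaddr_in client_addr{};
    client_addr.sin_family = AF_INET;
    client_addr.sin_port   = htons(25000);                      // placeholder port
    inet_pton(AF_INET, "192.168.0.105", &client_addr.sin_addr); // placeholder client IP

    /* Transmit the packet as 13 consecutive doubles. */
    ssize_t sent = sendto(sock, packet, 13 * sizeof(double), 0,
                          reinterpret_cast<const sockaddr*>(&client_addr),
                          sizeof(client_addr));
    close(sock);
    return sent == static_cast<ssize_t>(13 * sizeof(double)) ? 0 : -1;
}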
