App Note: Bandwidth sharing in multi-camera systems

Introduction

Sharing bandwidth among multiple cameras often results in reduced frame rates and incomplete images due to packet collisions. This application note discusses packet interleaving as a way to avoid these collisions and describes two methods for synchronizing the cameras.

Camera synchronization is important to ensure that the packet interleaving timing does not drift over time. Two methods of synchronization are discussed:

1. PTPSync (built-in firmware)
2. External hardware trigger

For the purposes of this application note, 4 IMX265-based cameras were connected to a 1 Gbps network via a switch. However, the methods outlined in this app note apply to all LUCID cameras.

Prerequisites

  • Arena SDK 1.0.31.8 or newer
  • Firmware version with PTPSync support
  • Microsoft Visual Studio 2015

Equipment Used

  • 4 x PHX032S-C cameras
  • Netgear MS510TXPP switch
  • Intel I350 Gigabit 1000BaseT Network Ethernet Adapter
  • PC with the following specifications:
    • ASUS PRIME Z370-A motherboard
    • Intel Core i7-8700 @ 3.2 GHz
    • 16 GB DDR4 RAM

Delay Calculator

In order to interleave the packets, we will use two types of delay:

  1. Stream Channel Packet Delay (GevSCPD) – This node controls the delay (in GEV timestamp counter units) inserted between each packet on the stream channel.
  2. Stream Channel Frame Transmission Delay (GevSCFTD) – This node controls the transmission delay on the camera side, i.e. the wait time between when the camera is ready to transmit an image and when the image is actually transmitted.

The formulas to calculate these delays are as follows:

Calculating Packet Delay (GevSCPD) for 1 camera:

  Packet Delay (ns) = [Packet Size (Bytes) × 10^9] / DeviceLinkSpeed (Bytes per second) + Buffer*

  * Buffer can be anywhere between 10% and 30% of the packet delay.

Calculating Packet Delay (GevSCPD) and Transmission Delay (GevSCFTD) for x cameras (x = total number of cameras):

  Camera #1: GevSCPD = Packet Delay (ns) × (x - 1)    GevSCFTD = Packet Delay (ns) × 0
  Camera #2: GevSCPD = Packet Delay (ns) × (x - 1)    GevSCFTD = Packet Delay (ns) × 1
  Camera #3: GevSCPD = Packet Delay (ns) × (x - 1)    GevSCFTD = Packet Delay (ns) × 2
  ...
  Camera #x: GevSCPD = Packet Delay (ns) × (x - 1)    GevSCFTD = Packet Delay (ns) × (x - 1)

Delay Calculator Example

Here is an example using 4 cameras:

Device Link Speed = 1 Gbps = 125000000 Bps
Time to transfer 1 byte = 1 / 125000000 Bps = 0.000000008 seconds or 8 ns
Packet Size = 9014 Bytes
Delay for one packet = 9014 Bytes * 8 ns = 72112 ns
Buffer = 10.93%
Packet Delay for 1 camera = 72112 ns + (10.93% of 72112 ns) = 80000 ns

Packet Delay (GevSCPD) for 4 cameras = 3 × 80000 ns = 240000 ns

  • Camera 1
    • Packet Delay (GevSCPD) = 240000ns
    • Transmission Delay (GevSCFTD) = 0ns
  • Camera 2
    • Packet Delay (GevSCPD) = 240000ns
    • Transmission Delay (GevSCFTD) = 80000ns
  • Camera 3
    • Packet Delay (GevSCPD) = 240000ns
    • Transmission Delay (GevSCFTD) = 160000ns
  • Camera 4
    • Packet Delay (GevSCPD) = 240000ns
    • Transmission Delay (GevSCFTD) = 240000ns

Figure 1: This diagram shows the packet delay (GevSCPD) and transmission delay (GevSCFTD) for camera #2 in a 4xIMX265 system
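The same arithmetic can be scripted. The following is a stand-alone C++ sketch (the helper name ComputePacketDelayNs is ours, not part of the Arena SDK) that reproduces the numbers above:

#include <cstdint>
#include <iostream>

// Packet Delay (ns) = Packet Size (Bytes) * 10^9 / DeviceLinkSpeed (Bytes per second), before buffer
int64_t ComputePacketDelayNs(int64_t packetSizeBytes, int64_t linkSpeedBytesPerSec)
{
	return packetSizeBytes * 1000000000 / linkSpeedBytesPerSec;
}

int main()
{
	const int64_t packetSize = 9014;      // jumbo packet (Bytes)
	const int64_t linkSpeed = 125000000;  // 1 Gbps in Bytes per second
	const int numCameras = 4;

	int64_t basePacketDelay = ComputePacketDelayNs(packetSize, linkSpeed); // 72112 ns
	// add a 10-30% buffer and round up; the example above uses 80000 ns
	int64_t packetDelay = 80000;

	std::cout << "Base packet delay: " << basePacketDelay << " ns\n";
	for (int i = 0; i < numCameras; i++)
	{
		int64_t gevSCPD = packetDelay * (numCameras - 1); // same for every camera
		int64_t gevSCFTD = packetDelay * i;               // staggered per camera
		std::cout << "Camera " << (i + 1)
		          << ": GevSCPD = " << gevSCPD << " ns"
		          << ", GevSCFTD = " << gevSCFTD << " ns\n";
	}
	return 0;
}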

Synchronization using PTPSync

First of all, what is the Precision Time Protocol (PTP)?

Precision Time Protocol (PTP, IEEE 1588) is an IEEE standard that has been integrated into GigE Vision 2.0. It is a method of synchronizing the clocks of multiple devices on an Ethernet network. This is achieved by setting one device as the master clock and having all other devices periodically synchronize and adjust to it. The synchronization is handled automatically by the devices once the master and slave roles have been set.

PTP can be used with LUCID cameras in several ways.

PTPSync is a built-in firmware feature that schedules the advance action commands in the firmware itself, as opposed to scheduling them through the SDK. Handling action command scheduling within the firmware eliminates overhead in the application code and helps achieve a higher frame rate.

This section describes the steps to enable PTP and setup PTPSync. Complete example code that uses PTPSync can be found at the bottom of this page.

Figure 2: Wireshark logs showing the interleaving of packets achieved using PTPSync

Step 1: Enable PTP

In order to enable PTP, set the “PtpEnable” node to true.
Arena::SetNodeValue<bool>(pDevice->GetNodeMap(), "PtpEnable", true);

Step 2: Confirm PTP Status

Make sure that the PtpStatus node reports either “Master” or “Slave” on every camera.

GenICam::gcstring ptpStatus = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "PtpStatus");
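Negotiation can take tens of seconds. Before relying on the synchronized clocks, you can poll PtpStatus on every device until exactly one camera reports “Master” and the rest report “Slave”, as the full example at the bottom of this page does. A condensed sketch (assuming a std::vector<Arena::IDevice*> named devices, plus the <thread> and <chrono> headers):

bool negotiated = false;
while (!negotiated)
{
	bool masterFound = false;
	negotiated = true;
	for (size_t j = 0; j < devices.size(); j++)
	{
		GenICam::gcstring status = Arena::GetNodeValue<GenICam::gcstring>(devices.at(j)->GetNodeMap(), "PtpStatus");
		if (status == "Master")
		{
			if (masterFound)
				negotiated = false; // more than one master: negotiation not finished
			masterFound = true;
		}
		else if (status != "Slave")
		{
			negotiated = false; // still listening/uncalibrated
		}
	}
	negotiated = negotiated && masterFound;
	if (!negotiated)
		std::this_thread::sleep_for(std::chrono::seconds(1));
}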

Step 3: Turn on ‘PTPSync’ mode

Set acquisition start mode to “PTPSync”.

std::cout << TAB3 << "Set acquisition start mode to 'PTPSync'\n";
Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "AcquisitionStartMode", "PTPSync");

Step 4: Maximize AcquisitionFrameRate

When PTPSync mode is turned on, the AcquisitionFrameRate node no longer controls the actual frame rate, but it can still cap the maximum achievable frame rate. A separate node, PTPSyncFrameRate, controls the frame rate while PTPSync mode is active.

To avoid capping the frame rate, we will set AcquisitionFrameRate to its maximum value.

GenApi::CFloatPtr pAcquisitionFrameRate = pDevice->GetNodeMap()->GetNode("AcquisitionFrameRate");
pAcquisitionFrameRate->SetValue(pAcquisitionFrameRate->GetMax());

Step 5: Set PTPSyncFrameRate

The value of PTPSyncFrameRate should always be less than the value set in the AcquisitionFrameRate node.

GenApi::CFloatPtr pPTPSyncFrameRate = pDevice->GetNodeMap()->GetNode("PTPSyncFrameRate");
pPTPSyncFrameRate->SetValue(PTPSYNC_FRAME_RATE);
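As an alternative to the two lines above, a defensive sketch (our own check, not required by the SDK) is to clamp the requested rate against the current AcquisitionFrameRate value and the PTPSyncFrameRate node's own limits before writing it (std::min/std::max require <algorithm>):

GenApi::CFloatPtr pAcquisitionFrameRate = pDevice->GetNodeMap()->GetNode("AcquisitionFrameRate");
GenApi::CFloatPtr pPTPSyncFrameRate = pDevice->GetNodeMap()->GetNode("PTPSyncFrameRate");

// keep the requested rate at or below AcquisitionFrameRate and within the node's limits
double requested = PTPSYNC_FRAME_RATE;
requested = std::min(requested, pAcquisitionFrameRate->GetValue());
requested = std::min(std::max(requested, pPTPSyncFrameRate->GetMin()), pPTPSyncFrameRate->GetMax());
pPTPSyncFrameRate->SetValue(requested);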

Step 6: Set the desired StreamChannelPacketDelay and StreamChannelFrameTransmissionDelay for each camera

This code can be used for setting Packet Delay and Transmission Delay calculated using the above formula.

GenApi::CIntegerPtr pStreamChannelPacketDelay = pDevice->GetNodeMap()->GetNode("GevSCPD");
pStreamChannelPacketDelay->SetValue(240000);
GenApi::CIntegerPtr pStreamChannelFrameTransmissionDelay = pDevice->GetNodeMap()->GetNode("GevSCFTD");
pStreamChannelFrameTransmissionDelay->SetValue(80000);

Synchronization using External Hardware Trigger

If you are using an external source to trigger the cameras, you can apply the same packet delay and transmission delay settings that you calculated using the formula above.

Pay attention to the following settings so that packets are properly interleaved:

  • Only use “RisingEdge”, “FallingEdge” or “AnyEdge” for the TriggerActivation node. Using “LevelHigh” or “LevelLow” will not work to interleave the packets.
  • Make sure the frequency of the trigger source accounts for the drop in frame rate caused by the packet delay and transmission delay (see the rough estimate sketched below).
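As a rough, back-of-the-envelope sketch (our own approximation, not an Arena SDK formula), the time for one frame to leave a camera grows with the per-packet delay, so the trigger period should stay above roughly packetsPerFrame × (packet time + GevSCPD) + GevSCFTD. The payload size below is an example value, and packet headers are ignored:

#include <cstdint>
#include <iostream>

int main()
{
	const int64_t payloadBytes = 3145728;  // example: 2048 x 1536 Mono8 frame
	const int64_t packetBytes = 9014;      // jumbo packet size
	const int64_t packetTimeNs = 72112;    // time to transmit 9014 Bytes at 1 Gbps
	const int64_t gevSCPD = 240000;        // packet delay from the example above
	const int64_t gevSCFTD = 240000;       // worst-case transmission delay (last camera)

	// number of packets needed for one frame (rounded up)
	int64_t packetsPerFrame = (payloadBytes + packetBytes - 1) / packetBytes;

	// approximate time for one frame to leave the camera
	int64_t frameTimeNs = packetsPerFrame * (packetTimeNs + gevSCPD) + gevSCFTD;

	std::cout << "Approx. frame transmission time: " << frameTimeNs / 1e6 << " ms\n";
	std::cout << "Keep the trigger frequency below ~" << 1e9 / frameTimeNs << " Hz\n";
	return 0;
}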

Step 1: Enable PTP (Optional)

Enabling PTP is optional when using hardware trigger. The only benefit of turning on PTP is synchronization of internal clocks of all the connected cameras. If you want synchronized timestamps for your images then enable PTP.
In order to enable PTP, set the “PtpEnable” node to true.
Arena::SetNodeValue<bool>(pDevice->GetNodeMap(), "PtpEnable", true);

Step 2: Setup Trigger Mode

In this example we will be using Line0 as the input line. If you are using a different line, make sure it is properly configured under “Digital IO Control”.

Only use “RisingEdge”, “FallingEdge” or “AnyEdge” for the TriggerActivation node.

Arena::SetNodeValue(pDevice->GetNodeMap(), "TriggerSelector","FrameStart");
Arena::SetNodeValue(pDevice->GetNodeMap(), "TriggerSource", "Line0");
Arena::SetNodeValue(pDevice->GetNodeMap(), "TriggerActivation", "RisingEdge");

Step 3: Enable Trigger Mode

In order to enable trigger mode, set the “TriggerMode” node to “On”.
Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerMode", "On");

Step 4: Set the desired StreamChannelPacketDelay and StreamChannelFrameTransmissionDelay for each camera

This code can be used for setting Packet Delay and Transmission Delay calculated using the above formula.
GenApi::CIntegerPtr pStreamChannelPacketDelay = pDevice->GetNodeMap()->GetNode("GevSCPD");
pStreamChannelPacketDelay->SetValue(240000);
GenApi::CIntegerPtr pStreamChannelFrameTransmissionDelay = pDevice->GetNodeMap()->GetNode("GevSCFTD");
pStreamChannelFrameTransmissionDelay->SetValue(80000);

PTPSync Example Code

#include "stdafx.h"
#include "ArenaApi.h"
#include <algorithm> // for std::find
#include <thread>	 // for sleep
#include <chrono>	 // for std::chrono::seconds

#define TAB1 "  "
#define TAB2 "    "
#define TAB3 "      "
#define ERASE_LINE "                            "

#define EXPOSURE_TIME 10000.0
#define PTPSYNC_FRAME_RATE 7.0

// =-=-=-=-=-=-=-=-=-
// =-=- SETTINGS =-=-
// =-=-=-=-=-=-=-=-=-

// Image timeout
//    Timeout for grabbing images (in milliseconds). If no image is available at
//    the end of the timeout, an exception is thrown. The timeout is the maximum
//    time to wait for an image; however, getting an image will return as soon as
//    an image is available, not waiting the full extent of the timeout.
#define TIMEOUT 20000

// number of images to grab
#define NUM_IMAGES 25000


// =-=-=-=-=-=-=-=-=-
// =-=- EXAMPLE -=-=-
// =-=-=-=-=-=-=-=-=-

// 
void PTPSyncCamerasAndAcquireImages(Arena::ISystem* pSystem, std::vector<Arena::IDevice*>& devices)
{

	for (size_t i = 0; i < devices.size(); i++)
	{
		Arena::IDevice* pDevice = devices.at(i);
		GenICam::gcstring deviceSerialNumber = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "DeviceSerialNumber");

		std::cout << TAB2 << "Prepare camera " << deviceSerialNumber << "\n";

		// Manually set exposure time
		//    In order to get synchronized images, the exposure time must be
		//    synchronized as well.
		std::cout << TAB3 << "Exposure: ";

		Arena::SetNodeValue<GenICam::gcstring>(
			pDevice->GetNodeMap(),
			"ExposureAuto",
			"Off");

		Arena::SetNodeValue<double>(
			pDevice->GetNodeMap(),
			"ExposureTime",
			EXPOSURE_TIME);

		std::cout << Arena::GetNodeValue<double>(pDevice->GetNodeMap(), "ExposureTime") << "\n";

		// Synchronize devices by enabling PTP
		//    Enabling PTP on multiple devices causes them to negotiate amongst
		//    themselves so that there is a single master device while all the
		//    rest become slaves. The slaves' clocks all synchronize to the
		//    master's clock.
		std::cout << TAB3 << "PTP: ";

		Arena::SetNodeValue<bool>(
			pDevice->GetNodeMap(),
			"PtpEnable",
			true);

		std::cout << (Arena::GetNodeValue<bool>(pDevice->GetNodeMap(), "PtpEnable") ? "enabled" : "disabled") << "\n";

		// Use max supported packet size. The packet delay (GevSCPD) and frame
		// transmission delay (GevSCFTD) set below interleave the transmissions
		// so the cameras do not flood the shared link at the same time.
		std::cout << TAB3 << "StreamAutoNegotiatePacketSize: ";
		Arena::SetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamAutoNegotiatePacketSize", true);
		std::cout << Arena::GetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamAutoNegotiatePacketSize") << "\n";

		// enable stream packet resend
		std::cout << TAB3 << "StreamPacketResendEnable: ";
		Arena::SetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamPacketResendEnable", true);
		std::cout << Arena::GetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamPacketResendEnable") << "\n";

		// Set acquisition mode to 'Continuous'
		std::cout << TAB3 << "Set acquisition mode to 'Continuous'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "AcquisitionMode", "Continuous");

		// Set acquisition start mode to 'PTPSync'
		std::cout << TAB3 << "Set acquisition start mode to 'PTPSync'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "AcquisitionStartMode", "PTPSync");

		// Set StreamBufferHandlingMode to 'NewestOnly'
		std::cout << TAB3 << "Set StreamBufferHandlingMode to 'NewestOnly'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetTLStreamNodeMap(), "StreamBufferHandlingMode", "NewestOnly");

		// Set pixel format to 'Mono8'
		std::cout << TAB3 << "Set pixel format to 'Mono8'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "PixelFormat", "Mono8");
		
		// Packet delay (GevSCPD) and frame transmission delay (GevSCFTD)
		//    Values are taken from the 4-camera example above: every camera
		//    uses the same packet delay, while the frame transmission delay is
		//    staggered by one packet-delay slot per camera (0 ns, 80000 ns,
		//    160000 ns, 240000 ns) so that the packets interleave.
		GenApi::CIntegerPtr pStreamChannelPacketDelay = pDevice->GetNodeMap()->GetNode("GevSCPD");
		pStreamChannelPacketDelay->SetValue(240000);
		std::cout << TAB3 << "GevSCPD: ";
		std::cout << Arena::GetNodeValue<int64_t>(pDevice->GetNodeMap(), "GevSCPD") << "\n";

		GenApi::CIntegerPtr pStreamChannelFrameTransmissionDelay = pDevice->GetNodeMap()->GetNode("GevSCFTD");
		pStreamChannelFrameTransmissionDelay->SetValue(80000 * static_cast<int64_t>(i));
		std::cout << TAB3 << "GevSCFTD: ";
		std::cout << Arena::GetNodeValue<int64_t>(pDevice->GetNodeMap(), "GevSCFTD") << "\n";

		// Frame rate
		GenApi::CFloatPtr pAcquisitionFrameRate = pDevice->GetNodeMap()->GetNode("AcquisitionFrameRate");
		pAcquisitionFrameRate->SetValue(pAcquisitionFrameRate->GetMax());

		// PTPSyncFrameRate
		GenApi::CFloatPtr pPTPSyncFrameRate = pDevice->GetNodeMap()->GetNode("PTPSyncFrameRate");
		pPTPSyncFrameRate->SetValue(PTPSYNC_FRAME_RATE);
		
	}

	// prepare system
	std::cout << TAB2 << "Prepare system\n";
	
	// Wait for devices to negotiate their PTP relationship
	//    Before starting any PTP-dependent actions, it is important to wait for
	//    the devices to complete their negotiation; otherwise, the devices may
	//    not yet be synced. Depending on the initial PTP state of each camera,
	//    it can take about 40 seconds for all devices to autonegotiate. Below,
	//    we wait for the PTP status of each device until there is only one
	//    'Master' and the rest are all 'Slaves'. During the negotiation phase,
	//    multiple devices may initially come up as Master so we will wait until
	//    the ptp negotiation completes.
	std::cout << TAB1 << "Wait for devices to negotiate. This can take up to about 40s.\n";

	std::vector<GenICam::gcstring> serials;
	int i = 0;
	do
	{
		bool masterFound = false;
		bool restartSyncCheck = false;

		// check devices
		for (size_t j = 0; j < devices.size(); j++)
		{
			Arena::IDevice* pDevice = devices.at(j);

			// get PTP status
			GenICam::gcstring ptpStatus = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "PtpStatus");

			if (ptpStatus == "Master")
			{
				if (masterFound)
				{
					// Multiple masters -- ptp negotiation is not complete
					restartSyncCheck = true;
					break;
				}

				masterFound = true;
			}
			else if (ptpStatus != "Slave")
			{
				// Uncalibrated state -- ptp negotiation is not complete
				restartSyncCheck = true;
				break;
			}
		}

		// A single master was found and all remaining cameras are slaves
		if (!restartSyncCheck && masterFound)
			break;

		std::this_thread::sleep_for(std::chrono::seconds(1));

		// for output
		if (i % 10 == 0)
			std::cout << "\r" << ERASE_LINE << "\r" << TAB2 << std::flush;

		std::cout << "." << std::flush;

		i++;

	} while (true);


	// start stream
	std::cout << "\n"
		<< TAB1 << "Start stream\n";

	for (size_t i = 0; i < devices.size(); i++)
	{
		devices.at(i)->StartStream();
	}


	// get images and check timestamps
	std::cout << TAB1 << "Get images\n";

	for (size_t i = 0; i < NUM_IMAGES; i++)
	{
		for (size_t j = 0; j < devices.size(); j++)
		{
			Arena::IDevice* pDevice = devices.at(j);
			GenICam::gcstring deviceSerialNumber = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "DeviceSerialNumber");

			std::cout << TAB2 << "Image " << i << " from device " << deviceSerialNumber << "\n";

			// Compare timestamps
			//    Scheduling action commands amongst PTP synchronized devices results
			//    in synchronized images with synchronized timestamps.
			std::cout << TAB3 << "Timestamp: ";

			Arena::IImage* pImage = pDevice->GetImage(3000);

			std::cout << pImage->GetTimestamp() << "\n";

			// requeue buffer
			pDevice->RequeueBuffer(pImage);
		}
	}
	
	// stop stream
	std::cout << TAB1 << "Stop stream\n";

	for (size_t i = 0; i < devices.size(); i++)
	{
		devices.at(i)->StopStream();
	}
	
	
}

// =-=-=-=-=-=-=-=-=-
// =- PREPARATION -=-
// =- & CLEAN UP =-=-
// =-=-=-=-=-=-=-=-=-

int main()
{
	// flag to track when an exception has been thrown
	bool exceptionThrown = false;

	std::cout << "Cpp_PTPSync\n";

	try
	{
		// prepare example
		Arena::ISystem* pSystem = Arena::OpenSystem();
		pSystem->UpdateDevices(100);
		std::vector<Arena::DeviceInfo> deviceInfos = pSystem->GetDevices();
		if (deviceInfos.size() < 2)
		{
			if (deviceInfos.size() == 0)
				std::cout << "\nNo camera connected. Example requires at least 2 devices\n";
			else if (deviceInfos.size() == 1)
				std::cout << "\nOnly one device connected. Example requires at least 2 devices\n";

			std::cout << "Press enter to complete\n";

			// clear input
			while (std::cin.get() != '\n')
				continue;

			std::getchar();
			return 0;
		}
		std::vector<Arena::IDevice*> devices;
		for (size_t i = 0; i < deviceInfos.size(); i++)
		{
			devices.push_back(pSystem->CreateDevice(deviceInfos.at(i)));
		}

		// run example
		std::cout << "Commence example\n\n";
		PTPSyncCamerasAndAcquireImages(pSystem, devices);
		std::cout << "\nExample complete\n";

		// clean up example
		for (size_t i = 0; i < devices.size(); i++)
		{
			pSystem->DestroyDevice(devices.at(i));
		}
		Arena::CloseSystem(pSystem);
	}
	catch (GenICam::GenericException& ge)
	{
		std::cout << "\nGenICam exception thrown: " << ge.what() << "\n";
		exceptionThrown = true;
	}
	catch (std::exception& ex)
	{
		std::cout << "\nStandard exception thrown: " << ex.what() << "\n";
		exceptionThrown = true;
	}
	catch (...)
	{
		std::cout << "\nUnexpected exception thrown\n";
		exceptionThrown = true;
	}

	std::cout << "Press enter to complete\n";
	std::getchar();

	if (exceptionThrown)
		return -1;
	else
		return 0;
}

External Hardware Trigger Example Code

#include "stdafx.h"
#include "GenTL.h"
#include "ArenaApi.h"
#include <algorithm> // for std::find
#include <thread>	 // for sleep
#include <chrono>	 // for std::chrono::seconds

#ifdef __linux__
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wunused-but-set-variable"
#endif

#include "GenICam.h"

#ifdef __linux__
#pragma GCC diagnostic pop
#endif

#include "ArenaApi.h"

#define TAB1 "  "
#define TAB2 "    "
#define TAB3 "      "
#define ERASE_LINE "                            "

#define EXPOSURE_TIME 10000.0

// number of images to grab
#define NUM_IMAGES 500


// =-=-=-=-=-=-=-=-=-
// =-=- SETTINGS =-=-
// =-=-=-=-=-=-=-=-=-

// image timeout
#define TIMEOUT 20000

// =-=-=-=-=-=-=-=-=-
// =-=- EXAMPLE -=-=-
// =-=-=-=-=-=-=-=-=-

// trigger configuration and use
void ConfigureTriggerAndAcquireImage(Arena::ISystem* pSystem, std::vector<Arena::IDevice*>& devices)
{

	for (size_t i = 0; i < devices.size(); i++)
	{
		Arena::IDevice* pDevice = devices.at(i);
		GenICam::gcstring deviceSerialNumber = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "DeviceSerialNumber");

		std::cout << TAB2 << "DeviceSerialNumber " << i << " from device " << deviceSerialNumber << "\n";

		// get node values that will be changed in order to return their values at
		// the end of the example
		GenICam::gcstring triggerSelectorInitial = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerSelector");
		GenICam::gcstring triggerModeInitial = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerMode");
		GenICam::gcstring triggerSourceInitial = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerSource");

		// Set trigger selector
		//    Set the trigger selector to FrameStart. When triggered, the device will
		//    start acquiring a single frame. This can also be set to
		//    AcquisitionStart or FrameBurstStart.
		std::cout << TAB1 << "Set trigger selector to FrameStart\n";

		Arena::SetNodeValue<GenICam::gcstring>(
			pDevice->GetNodeMap(),
			"TriggerSelector",
			"FrameStart");

		// Set trigger source
		//    Set the trigger source to Line0 so that the camera is triggered by
		//    the external hardware signal on that input line.
		std::cout << TAB1 << "Set trigger source to Line0\n";

		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerSource", "Line0");

		// Use an edge trigger ('RisingEdge', 'FallingEdge' or 'AnyEdge');
		// level triggers will not interleave the packets.
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerActivation", "RisingEdge");

		/*Arena::SetNodeValue<GenICam::gcstring>(
			pDevice->GetNodeMap(),
			"TriggerOverlap",
			"PreviousFrame");*/

		// Enable trigger mode
		//    Trigger mode must be enabled before starting the stream; it cannot
		//    be turned on or off while the device is streaming.
		std::cout << TAB1 << "Enable trigger mode\n";

		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "TriggerMode", "On");


		// Manually set exposure time
		//    In order to get synchronized images, the exposure time must be
		//    synchronized as well.
		std::cout << TAB3 << "Exposure: ";

		Arena::SetNodeValue(pDevice->GetNodeMap(), "ExposureAuto", "Off");

		Arena::SetNodeValue(pDevice->GetNodeMap(), "ExposureTime", EXPOSURE_TIME);

		std::cout << Arena::GetNodeValue(pDevice->GetNodeMap(), "ExposureTime") << "\n";

		// Synchronize devices by enabling PTP
		//    Enabling PTP on multiple devices causes them to negotiate amongst
		//    themselves so that there is a single master device while all the
		//    rest become slaves. The slaves' clocks all synchronize to the
		//    master's clock.
		std::cout << TAB3 << "PTP: ";

		Arena::SetNodeValue(pDevice->GetNodeMap(), "PtpEnable", true);

		std::cout << (Arena::GetNodeValue(pDevice->GetNodeMap(), "PtpEnable") ? "enabled" : "disabled") << "\n";

		// Use max supported packet size. The packet delay (GevSCPD) and frame
		// transmission delay (GevSCFTD) set below interleave the transmissions
		// so the cameras do not flood the shared link at the same time.
		std::cout << TAB3 << "StreamAutoNegotiatePacketSize: ";
		Arena::SetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamAutoNegotiatePacketSize", true);
		std::cout << Arena::GetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamAutoNegotiatePacketSize") << "\n";

		// enable stream packet resend
		std::cout << TAB3 << "StreamPacketResendEnable: ";
		Arena::SetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamPacketResendEnable", true);
		std::cout << Arena::GetNodeValue<bool>(pDevice->GetTLStreamNodeMap(), "StreamPacketResendEnable") << "\n";

		// Set acquisition mode to 'Continuous'
		std::cout << TAB3 << "Set acquisition mode to 'Continuous'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "AcquisitionMode", "Continuous");

		// Set StreamBufferHandlingMode to 'NewestOnly'
		std::cout << TAB3 << "Set StreamBufferHandlingMode to 'NewestOnly'\n";
		Arena::SetNodeValue<GenICam::gcstring>(pDevice->GetTLStreamNodeMap(), "StreamBufferHandlingMode", "NewestOnly");


		// Packet delay (GevSCPD) and frame transmission delay (GevSCFTD)
		//    Values are taken from the 4-camera example above: every camera
		//    uses the same packet delay, while the frame transmission delay is
		//    staggered by one packet-delay slot per camera (0 ns, 80000 ns,
		//    160000 ns, 240000 ns) so that the packets interleave.
		GenApi::CIntegerPtr pStreamChannelPacketDelay = pDevice->GetNodeMap()->GetNode("GevSCPD");
		pStreamChannelPacketDelay->SetValue(240000);
		std::cout << TAB3 << "GevSCPD: ";
		std::cout << Arena::GetNodeValue<int64_t>(pDevice->GetNodeMap(), "GevSCPD") << "\n";

		GenApi::CIntegerPtr pStreamChannelFrameTransmissionDelay = pDevice->GetNodeMap()->GetNode("GevSCFTD");
		pStreamChannelFrameTransmissionDelay->SetValue(80000 * static_cast<int64_t>(i));
		std::cout << TAB3 << "GevSCFTD: ";
		std::cout << Arena::GetNodeValue<int64_t>(pDevice->GetNodeMap(), "GevSCFTD") << "\n";
	}

	// prepare system
	std::cout << TAB2 << "Prepare system\n";

	// Wait for devices to negotiate their PTP relationship
	//    Before starting any PTP-dependent actions, it is important to wait for
	//    the devices to complete their negotiation; otherwise, the devices may
	//    not yet be synced. Depending on the initial PTP state of each camera,
	//    it can take about 40 seconds for all devices to autonegotiate. Below,
	//    we wait for the PTP status of each device until there is only one
	//    'Master' and the rest are all 'Slaves'. During the negotiation phase,
	//    multiple devices may initially come up as Master so we will wait until
	//    the ptp negotiation completes.
	std::cout << TAB1 << "Wait for devices to negotiate. This can take up to about 40s.\n";

	std::vector<GenICam::gcstring> serials;
	int i = 0;
	do
	{
		bool masterFound = false;
		bool restartSyncCheck = false;

		// check devices
		for (size_t j = 0; j < devices.size(); j++)
		{
			Arena::IDevice* pDevice = devices.at(j);

			// get PTP status
			GenICam::gcstring ptpStatus = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "PtpStatus");

			if (ptpStatus == "Master")
			{
				if (masterFound)
				{
					// Multiple masters -- ptp negotiation is not complete
					restartSyncCheck = true;
					break;
				}

				masterFound = true;
			}
			else if (ptpStatus != "Slave")
			{
				// Uncalibrated state -- ptp negotiation is not complete
				restartSyncCheck = true;
				break;
			}
		}

		// A single master was found and all remaining cameras are slaves
		if (!restartSyncCheck && masterFound)
			break;

		std::this_thread::sleep_for(std::chrono::seconds(1));

		// for output
		if (i % 10 == 0)
			std::cout << "\r" << ERASE_LINE << "\r" << TAB2 << std::flush;

		std::cout << "." << std::flush;

		i++;

	} while (true);
	

	// Start stream
	//    When trigger mode is off and the acquisition mode is set to stream
	//    continuously, starting the stream will have the camera begin acquiring
	//    a steady stream of images. However, with trigger mode enabled, the
	//    device will wait for the trigger before acquiring any.
	std::cout << TAB1 << "Start stream\n";

	//pDevice->StartStream();
	
	// start stream
	std::cout << "\n"
		<< TAB1 << "Start stream\n";

	for (size_t i = 0; i < devices.size(); i++)
	{
		devices.at(i)->StartStream();
	}

	uint64_t timestampNsPrevious = 0;

	// Get image
	//    Once an image has been triggered, it can be retrieved. If no image has
	//    been triggered, trying to retrieve an image will hang for the duration
	//    of the timeout and then throw an exception.
	std::cout << TAB2 << "Get image";

	for (size_t i = 0; i < NUM_IMAGES; i++)
	{
		for (size_t j = 0; j < devices.size(); j++)
		{
			Arena::IDevice* pDevice = devices.at(j);
			GenICam::gcstring deviceSerialNumber = Arena::GetNodeValue<GenICam::gcstring>(pDevice->GetNodeMap(), "DeviceSerialNumber");

			std::cout << TAB2 << "Image " << i << " from device " << deviceSerialNumber << "\n";

			// Compare timestamps
			//    Scheduling action commands amongst PTP synchronized devices results
			//    in synchronized images with synchronized timestamps.
			std::cout << TAB3 << "Timestamp: ";

			Arena::IImage* pImage = pDevice->GetImage(30000);

			std::cout << pImage->GetTimestamp() << "\n";

			std::cout << "Grabbed FrameId " << pImage->GetFrameId() << std::endl;

			uint64_t timestampNs = pImage->GetTimestampNs();

			uint64_t difference = timestampNs - timestampNsPrevious;
			timestampNsPrevious = timestampNs;

			std::cout << " (" << "timestamp (ns): " << timestampNs << "; FrameRate : " << 1 / (difference*1E-9) << " FPS" << ")";

			// requeue buffer
			pDevice->RequeueBuffer(pImage);
		}
	}

	// stop stream
	std::cout << TAB1 << "Stop stream\n";

	for (size_t i = 0; i < devices.size(); i++)
	{
		devices.at(i)->StopStream();
	}

}

// =-=-=-=-=-=-=-=-=-
// =- PREPARATION -=-
// =- & CLEAN UP =-=-
// =-=-=-=-=-=-=-=-=-

int main()
{
	// flag to track when an exception has been thrown
	bool exceptionThrown = false;

	std::cout << "Cpp_Trigger\n";

	try
	{
		// prepare example
		Arena::ISystem* pSystem = Arena::OpenSystem();
		pSystem->UpdateDevices(1000);
		std::vector<Arena::DeviceInfo> deviceInfos = pSystem->GetDevices();
		if (deviceInfos.size() == 0)
		{
			std::cout << "\nNo camera connected\nPress enter to complete\n";
			std::getchar();
			return 0;
		}

		//Arena::IDevice* pDevice = pSystem->CreateDevice(deviceInfos[0]);

		std::vector<Arena::IDevice*> devices;

		for (size_t i = 0; i < deviceInfos.size(); i++)
		{
			if ((deviceInfos[i].SerialNumber() == "192800283") || (deviceInfos[i].SerialNumber() == "181700080") || (deviceInfos[i].SerialNumber() == "193300005") || (deviceInfos[i].SerialNumber() == "200600418"))
			{
				devices.push_back(pSystem->CreateDevice(deviceInfos.at(i)));
			}
		}

		// run example
		std::cout << "Commence example\n\n";
		ConfigureTriggerAndAcquireImage(pSystem, devices);
		std::cout << "\nExample complete\n";

		// clean up example
		for (size_t i = 0; i < devices.size(); i++)
		{
			pSystem->DestroyDevice(devices.at(i));
		}
		Arena::CloseSystem(pSystem);
	}
	catch (GenICam::GenericException& ge)
	{
		std::cout << "\nGenICam exception thrown: " << ge.what() << "\n";
		exceptionThrown = true;
	}
	catch (std::exception& ex)
	{
		std::cout << "Standard exception thrown: " << ex.what() << "\n";
		exceptionThrown = true;
	}
	catch (...)
	{
		std::cout << "Unexpected exception thrown\n";
		exceptionThrown = true;
	}

	std::cout << "Press enter to complete\n";
	std::getchar();

	if (exceptionThrown)
		return -1;
	else
		return 0;
}