As modern warfare changes, so must the technical innovations delivered by global defense sector technology partners. The changing face of military engagements—fewer troops on the ground, more reconnaissance gathered via autonomous vehicles, real-time feeds to operations centers and the emergence of network-centric warfare—is driving the solutions and applications needed to better support today’s warfighter.

At their core, today’s battlefield engagements depend on access to complex, real-time data and the ability to share it with battlefield commanders, who in turn can push select information all the way down to the front-line warfighter. As warfare adjusts to incorporate more types of autonomous vehicles, including those discussed in the Unmanned Ground Systems Roadmap developed by the US Army’s Robotics Systems Joint Project Office (RS JPO), the need to further reduce SWaP—most effectively through a standards-based footprint—while also providing High Performance Embedded Computing (HPEC) with flexible sensor I/O will once again demand a quantum leap in engineering innovation.

The support of autonomous ground mobile computing requirements for vehicle operating functions such as vision, communications and autonomous navigation, in parallel with support for payload functions such as custom sensor input or weapons management, will place a high burden on the current crop of rugged HPEC offerings. Will the answer be more custom-fit proprietary solutions, a mix of smaller dedicated processors or the evolution of standards to meet the needs of an autonomous vehicle future? The optimistic answer is that the evolution of technology standards, COTS and engineering innovation will help usher in the age of vehicle autonomy in all forms of military engagements.

Real World Robotic Systems JPO Drives the Roadmap

Today’s UGVs are either tele-operated by a remote human driver or run semi-autonomously. At this stage of UGV development, there is a range of capability for autonomous operation. For example, the UGV can either be slaved to another human-operated vehicle in a convoy scenario, or follow a tracking beacon or geographic waypoints using onboard sensors, GPS and computing power to guide progress. HPEC can play a big role in the evolution of autonomous capabilities as they head toward full independence. In addition, the needs of payloads, such as Improvised Explosive Device (IED) detection, will become ever more sophisticated. As UGVs become fully autonomous, autonomous operation will need the situational awareness provided by payload computing.
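The waypoint-following behavior described above can be sketched in a few lines. This is purely illustrative: a fielded UGV fuses GPS, inertial and perception data through far more sophisticated control loops, and the function names, step size and arrival radius here are hypothetical.

```python
import math

def next_heading(position, waypoint):
    """Bearing (radians) from the current position to the next waypoint."""
    dx = waypoint[0] - position[0]
    dy = waypoint[1] - position[1]
    return math.atan2(dy, dx)

def follow_waypoints(position, waypoints, arrival_radius=2.0):
    """Yield headings toward each waypoint, advancing when within radius.

    A stand-in for a real controller: each iteration moves one unit step
    along the computed heading (where the drive-by-wire layer would
    actuate steering and throttle).
    """
    for wp in waypoints:
        while math.dist(position, wp) > arrival_radius:
            heading = next_heading(position, wp)
            position = (position[0] + math.cos(heading),
                        position[1] + math.sin(heading))
            yield heading
```

The key idea is that waypoint following needs only position, a goal list and modest onboard compute; it is the later roadmap stages (vision, obstacle avoidance) that drive the HPEC demand discussed below.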

To support progress toward full autonomy, the US Army’s RS JPO has developed a functional plan spanning multiple classes of vehicles and unmanned ground vehicle platforms. Specifically, the classes known as self-transportable and appliqué will have the most influence over the HPEC evolution.

The RS JPO’s Unmanned Ground Systems Roadmap was created with key technology enablers for UGV growth over time. Some of these enablers will have a unique evolutionary/revolutionary HPEC requirement, especially as applied to the sub-segments of autonomous navigation, power, vision, architecture and payload support. To support this roadmap, HPEC solutions will soon require performance upgrades beyond what is available today. Within the UGV self-transportable and appliqué classes there are specific programs with unique capability sets that require technology enablers in order to adhere to the roadmap. These programs include:

· Project Workhorse: A UGV program deploying in Afghanistan that involves a self-transportable utility platform in the form of the Army-sponsored Squad Mission Support System (SMSS) from Lockheed Martin. The SMSS is an autonomous ground vehicle that can carry up to a half-ton of squad equipment and can be remotely operated via satellite to perform autonomous operations such as follow-me, go-to-point and retro-traverse. The SMSS sensor suite integrates Light Detection And Ranging (LIDAR), infrared (IR) and a color camera. The vehicle can lock on and follow any person by identifying his 3D profile captured by the onboard sensors. The SMSS can autonomously navigate through a pre-programmed route using GPS waypoints. Evolution of this class of UGV will require improvements in onboard computer power consumption and more and better sensor integration, while also providing equal or greater compute performance with a reduced detectable emission signature (See Figure 1).

· Convoy Active Safety Technology (CAST): Autonomous Mobility Applique System (AMAS) in the form of an add-on or appliqué retrofit kit to virtually any existing manned vehicle, permitting a wide range of autonomous behavior. Capabilities range from remote operation to driver assist to fully autonomous driving and navigation. The AMAS will be produced using a common open architecture and delivered in multi-kit form: an “A-Kit,” which is the universal brain; a “B-Kit,” which contains the vehicle-specific sensors, aggregation and connectors; and the “C-Kit,” which is oriented toward payload management. With the AMAS, more processing means more autonomous capability; to meet the scale of expected demand, the kits should be delivered in a smaller, standard footprint and take advantage of standardized connections, lowering system costs.
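As a rough sketch of the kit separation just described, the following hypothetical Python classes show how a universal A-Kit brain might consume aggregated data from a vehicle-specific B-Kit and command a payload-oriented C-Kit. The interfaces, field names and decision logic are invented for illustration and are not the actual AMAS design.

```python
class BKit:
    """Vehicle-specific sensors, aggregation and connectors."""
    def read_sensors(self):
        # Hypothetical aggregated sensor snapshot.
        return {"lidar_clear": True, "speed_mps": 4.2}

class CKit:
    """Payload management (e.g. an IED-detection package)."""
    def command(self, action):
        return f"payload: {action}"

class AKit:
    """Universal brain: turns aggregated sensor data into decisions."""
    def __init__(self, b_kit, c_kit):
        self.b_kit, self.c_kit = b_kit, c_kit

    def step(self):
        data = self.b_kit.read_sensors()
        # Decision logic lives only in the A-Kit, so the same brain can
        # be retrofitted to any vehicle with a conforming B-Kit.
        action = "scan" if data["lidar_clear"] else "hold"
        return self.c_kit.command(action)
```

The point of the open architecture is visible in the structure: swapping vehicles means replacing only the B-Kit, and swapping payloads only the C-Kit, while the A-Kit and its interfaces stay constant.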

A common need across programs is the function of autonomous operation and payload support. For the AMAS technology illustrated in Figure 3, autonomous operation is achieved using a combination of multiple sensors, onboard processing, drive-by-wire functionality and additional payload control.

While these programs are currently underway, the Army’s RS JPO technology roadmap demands enhanced capabilities for future revisions of these programs that support the following:

· Integration of higher definition IR cameras, more onboard image enhancement for visible spectrum cameras, future integration of both visible and IR data in real time, more camera/sensor inputs that can support higher bandwidth.

· Algorithm support for object detection and avoidance, intelligent object detection and tracking, stereographic imaging and processing (eventually reaching object identification).

· HPEC computing support for the above, along with integration of multi-sensor payloads such as IED detection, weapons management, manipulators and sensor cross-cueing.

· Future common, standards-based architecture for UGV computing (per the RS JPO and its Interoperability Initiative – currently at IOP v.0).

For UGVs to achieve improved autonomous operation, the technology roadmap calls for progress in sensor capabilities in terms of input speed, multiple sensor data aggregation, real-time data processing and results dissemination to the controller subsystems. With the sensor requirements and payload-specific support, such as side-looking radar for IED detection, the demand on a single HPEC solution is great. In addition, the push for open standards across the entire scope of product architecture will drive adoption of less proprietary physical hardware, connectivity and software solutions, thus creating the potential for more competitive, interchangeable and evolutionary options.
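The multi-sensor data aggregation mentioned above amounts to merging several time-stamped streams into one ordered feed for downstream processing. A minimal stdlib sketch, with sensor names and readings invented for illustration:

```python
from heapq import merge

def aggregate(*streams):
    """Merge (timestamp, sensor, value) streams into one time-ordered feed.

    Each input stream must already be sorted by timestamp, which lets
    heapq.merge combine them lazily without re-sorting everything.
    """
    return list(merge(*streams, key=lambda r: r[0]))

# Hypothetical readings from two sensor streams:
lidar = [(0.00, "lidar", 12.4), (0.10, "lidar", 12.1)]
radar = [(0.05, "radar", 13.0), (0.15, "radar", 12.8)]
feed = aggregate(lidar, radar)
# feed interleaves the streams in timestamp order:
# lidar, radar, lidar, radar
```

A real aggregation layer would also handle clock skew, dropouts and differing sample rates, but the time-ordered merge is the core operation the controller subsystems consume.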

Imaging and Payload Technology Drives HPEC Requirements in UGVs

Computing requirements in UGVs are being driven by imaging used in support of machine vision and the advent of complex payloads for IED detection. Some military UGV programs need the ability to perform autonomous navigation by day and by night, and to navigate in stealth mode (where no perception sensor energy is emitted). Using a pair of Thermal Infrared (TIR) cameras, stereo ranging and terrain classification can be performed to generate an annotated map of the terrain. TIR is a convenient option, since a single TIR camera may already be part of the sensor suite of many vehicles. An HPEC is provided to analyze the thermal image data and perform the terrain mapping.

For the evolution of autonomous operation relying on TIR in UGVs, the image processing that is critical to control functions like autonomous navigation will need to scale as the sensor data streams grow. To achieve useful machine vision, sensor fusion will likely combine IR, color CCD and LIDAR cameras in a single turret. Each of these cameras will operate at between 15 and 60 fps, and today can generate 516 Mbits/second of uncompressed image data per camera, growing to 1.3 Gbits/second and eventually 3.48 Gbits/second. Camera data may not be compressed at the source, so as not to degrade the level of image processing that can be rendered by the HPEC, interfaced using the RS-170 or RS-422 video signal standard. As data rates increase, Camera Link, GigE Vision or CoaXPress will replace these interfaces.
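The scale of these data rates follows from simple arithmetic: width × height × bits per pixel × frame rate. The resolutions and bit depths below are assumptions chosen for illustration, not the exact cameras behind the figures quoted above.

```python
def uncompressed_mbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) camera data rate in Mbits/second."""
    return width * height * bits_per_pixel * fps / 1e6

# An assumed XGA 24-bit camera at 30 fps already exceeds 500 Mbit/s:
xga_rate = uncompressed_mbps(1024, 768, 24, 30)    # ~566 Mbit/s
# An assumed 1080p 24-bit camera at 60 fps approaches 3 Gbit/s:
hd_rate = uncompressed_mbps(1920, 1080, 24, 60)    # ~2986 Mbit/s
```

Multiply by three or more cameras per turret and the move from RS-170/RS-422 toward Camera Link, GigE Vision or CoaXPress becomes unavoidable.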

The RS JPO roadmap calls for new obstacle and collision avoidance algorithms, which rely heavily on recursive calculations best done on GP-GPUs or specialised FPGAs. For example, recent research into UAV image processing using GP-GPU based algorithms has shown a 99.5% increase in performance over running the same algorithm on an Intel CPU. In all cases, the GP-GPU rendered the results in under 50 msec. In a fully autonomous vehicle scenario, where a human operator is not involved and vehicle operation decisions must be made in real time at speed, an HPEC equipped with GP-GPU capability that can correlate all the inputs and successfully execute the mission is imperative. Hence the use of multiple types of higher-definition cameras, running at higher resolution and bandwidth, will drive the design of the rugged HPEC computing that supports future UGVs.

A complete anti-IED payload system requires an IED-detection component, an IED-assessment component and an IED-defeat component. The payload processing must be accomplished in real-time to achieve the desired level of safety for the UGV and its mission. As with autonomous navigation and machine vision, the real-time detection of the changes in the data coming from the detector components will require a large amount of either GP-GPU or FPGA processing.  
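The real-time detection of changes in detector data can be pictured as flagging samples that deviate sharply from a sliding baseline. The window size and threshold below are hypothetical, and a fielded IED detector would apply far more sophisticated signal processing on GP-GPU or FPGA hardware; this sketch only shows the shape of the problem.

```python
from collections import deque
from statistics import mean, pstdev

def detect_changes(samples, window=5, threshold=3.0):
    """Return indices of samples that deviate sharply from a sliding baseline.

    A sample is flagged when it lies more than `threshold` standard
    deviations from the mean of the previous `window` samples.
    """
    baseline = deque(maxlen=window)
    flagged = []
    for i, s in enumerate(samples):
        if len(baseline) == window:
            mu, sigma = mean(baseline), pstdev(baseline)
            if sigma > 0 and abs(s - mu) > threshold * sigma:
                flagged.append(i)
        baseline.append(s)
    return flagged

# Hypothetical detector readings with an abrupt anomaly at index 6:
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 9.0, 1.0]
```

Even this toy version makes the compute burden obvious: the real system must run equivalent statistics per detector channel, per sample, within the vehicle's reaction-time budget.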

Today, a divide and conquer approach is used to separate vehicle control, sensors and payload processing. Separating functions into kits as described with AMAS technology is a good approach to the future growth of HPEC in UGVs. For example, a fully autonomous vehicle with a payload of ground penetrating radar could not execute all of its processing tasks with a single HPEC solution. By sub-dividing the problem into compute and function nodes, a scalable long-term solution emerges. Having standards for the UGV solutions that regularise the HPEC physical box size, supported I/O and connector types will enable interchangeability and evolution as HPEC solutions grow and change. 
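The divide-and-conquer partitioning described above can be sketched as a routing table that maps processing tasks onto dedicated compute nodes. The node and task names are invented for illustration, not an actual program architecture.

```python
# Hypothetical mapping of UGV functions onto compute nodes.
NODES = {
    "vehicle_control": ["drive_by_wire", "navigation"],
    "perception":      ["stereo_vision", "lidar_mapping"],
    "payload":         ["gpr_processing", "ied_assessment"],
}

def node_for(task):
    """Route a task to the node that owns it.

    Scaling the system means adding nodes or moving tasks between them,
    rather than growing one monolithic HPEC box.
    """
    for node, tasks in NODES.items():
        if task in tasks:
            return node
    raise KeyError(f"no node owns task {task!r}")
```

With standardised enclosures and connectors, each node in such a scheme can be upgraded or swapped independently as HPEC solutions grow and change.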

Evolution of Technology Standards, COTS and Engineering Innovation

Evolving UGV requirements demand raw processing speed and the execution of highly recursive algorithms, creating the need for HPEC solutions that combine generic COTS Intel CPU processing with a closely coupled GP-GPU in a single solution.

As mentioned, the RS JPO is promoting the use of standards in the fielding of UGV solutions; current 3U and 6U VPX products provide rugged HPEC solutions. Emerging standards for smaller footprint HPEC solutions include the VITA standard known as VITA 75. VITA 75 takes a fundamentally different approach from other small form factor standards in that it concentrates on the physical box—a set of standard enclosure dimensions, connectors and I/O pin assignments—rather than on specifying the individual computer modules inside.

VITA 75 subsystem profiles are composed of up to four separate sub-profiles:

· VITA 75.0 component of subsystem profile (base profile)

· VITA 75.11 component of subsystem profile (front panel profile)

· VITA 75.2x cooling and mounting, consisting of a VITA 75.2x dot specification followed by profile nomenclature specified by VITA 75.2x

VITA 75 solutions are especially well-suited to address UGV HPEC requirements, as they provide designers with a set of standardised footprints that are generally smaller than equivalently equipped OpenVPX 3U or 6U solutions, while also offering a standardised connector scheme that allows for sub-system interchangeability at the vehicle level and provides for evolution of the vehicle’s sub-systems in a predictable fashion. ADLINK’s HPERC (High Performance Extreme Rugged Computer) system is typical of this type of VITA 75 solution. HPERC provides a solid foundation of Intel i7 processing closely coupled to either an embedded NVIDIA or ATI GP-GPU, as well as a wealth of camera, vehicle data bus and I/O support. This solution can readily provide the necessary image processing and I/O required for UGV applications both today and in the future.


UGVs represent a force multiplier for ground forces. The challenges of true autonomous operation and adequate payload support represent a clear direction for HPEC. If the aggressive roadmap for UGVs is to be realised, a common, standards-based HPEC architecture must emerge and evolve. ADLINK, along with fellow embedded platform vendors, is working to define and develop against industry standards in order to meet the SWaP requirements of future HPEC systems and the demanding requirements of UGV and other programs that benefit the warfighter.

By Mike Jones, Rugged Systems product manager