
LODESTAR

Live Operational Data Enhancement for Situational Awareness Through Augmented Reality

LODESTAR paves the way for the introduction of advanced new technology into soldier systems, in particular AI on the edge, AR, advanced computer vision and integrated micro-UAVs. It lays the groundwork for future soldier systems to make optimal and cost-effective use of any combination of these technologies.


The aim of LODESTAR is to develop the necessary knowledge and technical frameworks to allow further, fast development and integration of emerging technologies related to AI and AR into new and existing soldier system architectures. We aim to enable a level of integration that is efficient in terms of power and weight and adaptable to the user’s needs in any tactical situation, while systematically evaluating all aspects to keep the cognitive burden to a minimum.


In addition, LODESTAR will study, define and combine individual new technologies to introduce unprecedented capabilities that are only achievable through deep integration of sensors and displays. For example, think of fused (multi-spectral) night vision images overlaid with blue force tracking symbology, dynamically presented to the soldier where needed; depending on the situation, this could be an optical see-through display on the helmet or the weapon scope.
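
To make the overlay idea concrete, the sketch below projects a teammate’s blue force tracking position, expressed in a local East-North-Up frame, into the pixel frame of a head-worn display using a simple pinhole model, and burns a marker into a fused night-vision frame. It is a minimal, hypothetical sketch: the coordinate frames, the pinhole display model and the square marker are illustrative assumptions, not LODESTAR design decisions.

```python
# Minimal sketch: overlaying a blue force tracking (BFT) symbol on a fused
# night-vision frame destined for an optical see-through display.
# Assumptions (illustrative only): positions arrive in a local East-North-Up
# (ENU) frame, the helmet pose is known from the soldier system's tracker,
# and the display optics are modelled as a pinhole camera with intrinsics K.
import numpy as np


def project_to_display(p_enu, helmet_pos_enu, R_enu_to_cam, K):
    """Project an ENU point to display pixel coordinates (u, v),
    or return None if the point lies behind the wearer."""
    p_cam = R_enu_to_cam @ (p_enu - helmet_pos_enu)   # world -> display frame
    if p_cam[2] <= 0.0:                               # behind the display plane
        return None
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]


def draw_marker(frame, uv, half_size=6):
    """Burn a simple square marker into a greyscale (uint8) fused frame;
    a real system would render proper BFT symbology instead."""
    h, w = frame.shape
    u, v = int(round(uv[0])), int(round(uv[1]))
    if 0 <= u < w and 0 <= v < h:
        frame[max(0, v - half_size):v + half_size,
              max(0, u - half_size):u + half_size] = 255
    return frame


if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 640.0],    # hypothetical display intrinsics
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    # Wearer looks due north: camera x = East, y = Down, z = North.
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, 0.0, -1.0],
                  [0.0, 1.0, 0.0]])
    fused = np.zeros((720, 1280), np.uint8)           # stand-in fused NV frame
    teammate = np.array([2.0, 50.0, 0.0])             # 2 m east, 50 m north
    uv = project_to_display(teammate, np.zeros(3), R, K)
    if uv is not None:
        draw_marker(fused, uv)
        print("symbol drawn at pixel", uv.round(1))
```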


The purpose of LODESTAR is to deliver a significant contribution towards the following overall goals:

  • increase situational awareness of the dismounted soldier, thus enhancing his or her effectiveness and improving survivability;

  • reduce the cognitive burden of the dismounted soldier in relation to information processing, which in turn translates into effectiveness, survivability and reduced mental fatigue;

  • make the process of introducing ground-breaking new technology into soldier systems more cost-effective.

The introduction of new technology into the soldier system must always serve to achieve operational impact. In our vision, focus must be on three areas of improvement relative to current soldier systems: enabling human-centric design, ensuring seamless interoperability and facilitating operations under all conditions.


The LODESTAR goals are achieved through tasks aimed at developing the following functionality and technology:

  • deeply integrated mixed/augmented reality, presenting operational and tactical data, currently available at best through a touchscreen-operated device, in a way that allows the soldier to keep his eyes and ears on his surroundings;

  • seamless integration among the multiple displays (helmet mounted and otherwise) available within the (future) soldier system, i.e. AR glasses, weapon sights, NVGs and other vision equipment, ensuring that the technology does not force the soldier to alternate between display devices but lets him simply use the one best suited for the task at hand;

  • advanced multi-spectral fusion of data acquired across multiple sensors, based on AI “on the edge”, into images and sound that are intuitive to the soldier and direct his attention to the most important information (a simplified fusion example is sketched after this list);

  • seamless integration of remote camera feeds (“eye in the sky”) with locally acquired sensor data, overlaid on the real-world view;

  • collection of meta-information on the soldier’s state (where his attention is directed, his level of alertness) and situation (such as his stance, day/night, movement speed) from multiple sensors within the soldier system, based on newly developed AI algorithms.
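
As a simplified reference point for the fusion item above, the sketch below fuses a low-light channel and a thermal channel using hand-crafted local-contrast weights. This is a classical, non-learned scheme shown only to illustrate what fusing multiple channels into one intuitive image means in practice; LODESTAR itself targets AI-based fusion running on the edge, which this sketch does not model.

```python
# Minimal sketch: fusing a low-light (image-intensified) channel and a thermal
# channel into one frame using local-contrast weights. A classical hand-crafted
# scheme for illustration only; not the AI-based fusion LODESTAR aims for.
import numpy as np
from scipy.ndimage import uniform_filter


def local_contrast(img, win=9):
    """Local standard deviation as a simple per-pixel saliency measure."""
    mean = uniform_filter(img, size=win)
    mean_sq = uniform_filter(img * img, size=win)
    return np.sqrt(np.maximum(mean_sq - mean * mean, 0.0))


def fuse(lowlight, thermal, win=9, eps=1e-6):
    """Per-pixel weighted average: each channel contributes where it carries
    the most local detail (e.g. thermal dominates on warm targets at night)."""
    ll = lowlight.astype(np.float32) / 255.0
    th = thermal.astype(np.float32) / 255.0
    w_ll = local_contrast(ll, win) + eps   # eps -> plain average in flat areas
    w_th = local_contrast(th, win) + eps
    fused = (w_ll * ll + w_th * th) / (w_ll + w_th)
    return (fused * 255.0).astype(np.uint8)


if __name__ == "__main__":
    # Synthetic stand-in frames (real input would come from the sensor suite).
    rng = np.random.default_rng(0)
    lowlight = (rng.random((480, 640)) * 60).astype(np.uint8)   # dark, noisy scene
    thermal = np.zeros((480, 640), np.uint8)
    thermal[200:280, 300:340] = 220                             # a warm "target"
    out = fuse(lowlight, thermal)
    print(out.shape, out.dtype, int(out[240, 320]))
```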

A TRL 4-5 technology demonstration system will then be designed and constructed, consisting of the following components:

  • A head-worn display platform (“smart helmet”) comprising an Optical See-Through Head-Mounted Sight and Display (OST-HMSD) and a 3D audio headset, delivering visual and auditory information to the soldier;

  • An advanced digital weapon sight with aim tracking;

  • A body-worn Vision Computer capable of advanced AI-based multi-channel image processing on a small power budget;

  • Short-range wireless communication between the body-worn unit and the weapon sight, based on Ultra-Wideband (UWB) radio technology.
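
To illustrate what such a short-range link might carry, the sketch below packs and unpacks a fixed-size aim-tracking update exchanged between the body-worn Vision Computer and the weapon sight. The field layout (timestamp, orientation quaternion, status flags that could also carry soldier-state bits) is purely hypothetical; LODESTAR does not publish a message format, and real payload sizes would be driven by the UWB link budget.

```python
# Minimal sketch: a compact, fixed-size frame for the short-range UWB link
# between the body-worn Vision Computer and the digital weapon sight.
# The layout is hypothetical: u32 timestamp (ms), 4 x f32 orientation
# quaternion from the sight's aim tracker, u8 status flags -> 21 bytes.
import struct
import time

AIM_FRAME = struct.Struct("<I4fB")   # little-endian, no padding


def pack_aim_update(timestamp_ms, quat_wxyz, flags=0):
    """Serialise one aim-tracking update for transmission over the link."""
    return AIM_FRAME.pack(timestamp_ms & 0xFFFFFFFF, *quat_wxyz, flags & 0xFF)


def unpack_aim_update(payload):
    """Deserialise a frame received on the other end of the link."""
    t, qw, qx, qy, qz, flags = AIM_FRAME.unpack(payload)
    return {"timestamp_ms": t, "quat_wxyz": (qw, qx, qy, qz), "flags": flags}


if __name__ == "__main__":
    msg = pack_aim_update(int(time.time() * 1000),
                          (1.0, 0.0, 0.0, 0.0),   # identity orientation
                          flags=0b0000_0001)      # e.g. a "sight raised" bit
    print(len(msg), "bytes:", unpack_aim_update(msg))
```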





This project has received funding from the European Defence Fund (EDF) under grant agreement no. 101102526
