CORDIS - EU research results
Content archived on 2024-05-28

Small Integrated Navigator for PLanetary EXploration

Final Report Summary - SINPLEX (Small Integrated Navigator for PLanetary EXploration)

Executive Summary:
The main goal of SINPLEX was to develop an innovative solution to significantly reduce the mass of the navigation subsystem for exploration missions which include a landing and/or a rendezvous and capture phase. It is a contribution to strengthening the European position in space exploration. It targets increasing the scientific return of exploration missions, enabling new types of missions and targets, and reducing launch cost and travel time.
Future space exploration missions target asteroids, comets, planets and planetary moons. They will bring robotic vehicles to these targets and will provide the capability to return samples to Earth. Mass is one of the most critical factors for any space mission, and in particular for missions of this kind. Reducing the mass of components or complete subsystems of an exploration vehicle is therefore a key enabling factor for the future exploration of our solar system and beyond.
The mass reduction - while maintaining good navigation performance - is achieved by applying functional integration of the different sensors, utilizing micro and nanotechnologies for compacting electronics, and using sensor hybridization approaches to improve the performance of the complete navigation subsystem.
The project delivered on its objectives: the development of a novel integrated navigation subsystem architecture, the production of a breadboard and the demonstration of its applicability for object-relative robotic navigation in space applications.

Project Context and Objectives:
The main objective of the SINPLEX project is the miniaturization of the navigation subsystem for exploration missions. The aim is to develop innovative in-space technologies which allow reducing mass and size of such onboard systems. This mass reduction enables a higher scientific return of robotic exploration. It will help strengthen Europe’s position in space exploration.
To enable a higher scientific return and to increase the envelope of reachable target bodies, the mass and size of all subsystems must be reduced. Several planned international space exploration missions target the Moon, asteroids, planets and planetary moons. They will bring humans or robotic vehicles to the Moon, Mars or asteroids and will provide the capability to return samples from cosmic bodies.
To achieve this, new concepts and technologies are required for the Guidance, Navigation and Control (GNC) subsystem. The objectives of the SINPLEX project are:
• To develop an integrated navigation subsystem architecture for exploration missions (approach, entry, descent, landing, rendezvous and capture).
• To reduce mass by miniaturization of components and functional integration of navigation sensors.
• To assess the integrated system’s applicability for space exploration mission scenarios.
• To verify and assess the performance of the integrated system.
SINPLEX combines inertial measurement units, navigation cameras, star trackers, laser altimeters and a processing unit in a compact navigation system. For each sensor, unique leading technologies in miniature optical sensors, inertial sensors and miniaturized electronics (nanotechnology) for space applications will be combined. This alone would yield a significant mass reduction.
For the navigation subsystem, different components will be integrated to share parts of the assembly between the different functions of the subsystem. This is complemented by specially tailored navigation software which fuses measurements of single detectors into an integrated navigation solution. This method compensates the weaknesses of some sensors with the strengths of others.
The small integrated navigator is an innovative solution for GNC subsystems which offers accurate and robust position and attitude determination of orbiting and landing spacecraft. It is applicable to planetary approach, soft and precise landing of a sensitive payload on an extraterrestrial body and rendezvous and docking for sample return missions. It will be demonstrated by verifying the system and assessing performance.
SINPLEX aims to progress beyond state-of-the-art architectures, reduce the mass of a navigation subsystem for in-space applications and increase Europe's competitiveness in the field of micro- and nanotechnology applications for space exploration missions.
The miniaturized navigation subsystem will be capable of providing the position (or the range), attitude and velocity of the spacecraft to guidance and control subsystems in the approach phase relative to the local surface of a celestial body or another spacecraft. The main ideas behind this proposal are driven by the requirements for future space vehicles designed for planetary exploration missions to Mars (e.g. Mars Sample Return), the Moon or asteroids (e.g. Marco Polo). These missions are key elements of the near-term European space exploration vision and future international space missions.
The highly integrated and miniaturized navigation subsystem developed by the proposed work will be a major leap in the scientific return of space exploration missions. Applying this technology and transferring the miniaturization concept to other subsystems will allow more payload, faster missions and new, more distant targets.

Project Results:
I. Summary of activities and S & T results/foregrounds

During the SINPLEX project the following work has been performed:
• Analysing potential application missions and compiling a set of mission requirements. Several planned missions were investigated by the project team: Mars Sample Return, Marco Polo and Lunar landing scenarios.
• Deriving system requirements for the miniaturized navigation system, taking into account current limitations of the technologies.
• Creating a consistent preliminary design of the miniaturized navigation system. This design was done for an internally redundant system capable of sustaining the harsh space environment of exploration missions.
• Reviewing the preliminary design (Preliminary Design Review - PDR) with the advisory board as external reviewers. The review produced several comments and actions, which were incorporated into the design.
• Designing test facilities to conduct ground-based tests for verification of function and performance of the prototype.
• Creating a consistent final design for a breadboard of the miniaturized navigation system. In contrast to the PDR design, the breadboard is not redundant, but it demonstrates the technologies needed to create a miniaturized navigation system.
• Reviewing the final design (Critical Design Review - CDR) with the advisory board and external reviewers. The review produced several comments and actions, which were incorporated into the design.
• Producing the elements for the breadboard of the integrated navigation system.
• Unit testing of the single elements of the integrated navigation system.
• Calibration and alignment measurement of the inertial sensor unit.
• Integrating the elements of the breadboard electronics and optics into a single housing.
• Alignment and calibration of the optical sensors and elements in the system.
• Functional verification of the electronics, the internal and external communication.
• Functional verification of the image processing algorithms for star tracker and navigation camera.
• Test and demonstration of the navigation system performance in the lab TRON (Testbed for Robotic Optical Navigation).
The main results of the project are:
• A set of mission and system requirements for the SINPLEX system. Following the extended review of mission scenarios and requirements, modes of operation were defined.
• A preliminary design for a flight model which includes the integrated design of optical, mechanical and electronic subsystems and parts. The preliminary design is documented in the PDR data package.
• A final design for a breadboard which contains all technologies to demonstrate the feasibility and maturity of the design. The final breadboard design is documented in the CDR data package.
• A design for test facilities to test and verify the performance of the developed navigation system prototype.
• A challenging list of requirements for a miniaturized planetary navigation system.
• Defined performance validation procedures derived from analysis of currently developed and future exploration missions.
• A prototype design of the integrated navigation subsystem which meets the requirements and thereby achieves a maximum mass reduction.
• A manufactured and assembled breadboard of the subsystem, ready to be used in test facilities for performance validation.
• Test results of the operation and capabilities of the navigation subsystem in ground-based test benches and test facilities reproducing the down-scaled dynamics and conditions of exploration missions with soft-landing or rendezvous and capture phases.
• Dissemination items promoting the miniature navigation subsystem and the miniaturization concept, as well as the project and its results, to the interested communities.

II. System Requirements

SINPLEX is designed to be used as the primary navigation subsystem for space exploration missions involving landing and/or a sample return. These missions have approach, entry/descent/landing, rendezvous and/or capture phases where the navigation system can be used. Future missions target the Moon, Mars or asteroids and some involve rendezvous and docking for sample return. In designing SINPLEX, several potential mission scenarios were analysed and four reference mission scenarios were chosen which encompass the most demanding requirements. These missions serve as a means to derive the system-level technical characteristics, operation modes and navigation performance requirements.
Scenario 1 is a typical Moon descent and landing mission, in which the spacecraft performs an autonomous, precise and safe landing at a target location. The mission scenario used for SINPLEX is derived from the trajectories used by several Moon landing missions including Surveyor, Apollo 11 and Altair [3]. The spacecraft starts in a 100km x 100km polar parking orbit around the Moon and performs a braking manoeuvre, which brings the vehicle into a descent orbit with 10km periapsis. Powered descent starts near the periapsis on a controlled trajectory towards a pre-defined landing area. The target landing location is autonomously refined based on landing safety considerations (this could be done by the SINPLEX system but is outside the scope of the project). The trajectory ends at the chosen refined landing site.
Scenario 2 is an asteroid descent and landing mission, in which the spacecraft performs an autonomous, precise and safe landing at a target location. This scenario is derived from the Marco Polo [9] and Hayabusa [10] missions. The scenario begins with the spacecraft entering into a slow collision course with a 1km diameter near Earth asteroid. The guided descent phase starts 400m above the surface aiming towards a given target landing location. The landing phase occurs at 50m altitude when vertical thrusters are turned off to avoid sample contamination and ends at touchdown.
Scenario 3 is a rendezvous and capture mission in which the spacecraft performs an autonomous, precise and safe capture of a target spacecraft. This is a Mars sample return scenario derived from the HARVD [12, 13] and FOSTERNAV [11] studies. In this scenario a spherical target sample container (20cm diameter) is in a 500km circular orbit around Mars and a chaser spacecraft with a capture mechanism executes manoeuvres to capture the target. Before the scenario starts, the spacecraft has successfully completed a search and approach phase, where the container is found and its orbit is estimated using a narrow view camera. The scenario begins in the middle of the closing phase when the spacecraft is in the same orbit as the target and is 1km behind. The spacecraft moves towards the target with a series of hopping manoeuvres, the last of which brings the spacecraft 100m behind the target. During the last 100m a forced translation manoeuvre is executed, where the spacecraft is continuously controlled to maintain a nominal velocity of 0.1m/s towards the target until the last 10m. The capture phase starts when a final thrust is used to move the spacecraft towards the target at 0.1m/s and is uncontrolled until target contact 100s later.
Scenario 4 is a Mars entry, descent and landing mission, in which the spacecraft performs an autonomous, precise and safe landing at a target location. This scenario is only used to include dynamics and environmental conditions for sensor hardware considerations which are not present in the other scenarios. Navigation performance is not considered.
The result of this study is a challenging list of requirements for a miniaturized planetary navigation system. In total, 107 mission requirements define the mission scenarios, performance requirements and environments. From this and the description of work, 40 system requirements were defined, including a mass limit of 6kg and a power limit of 60W. A detailed requirements analysis identified 14 key system requirements to drive the PDR design. A further 51 detailed subsystem requirements were derived by combining the PDR design assumptions with the other requirements. The flight model (FM) is designed to meet all of these requirements.
Additionally, a set of performance validation procedures were defined, which are derived from analysis of current and future exploration missions. These are used to evaluate the navigation system performance during hardware-in-the-loop (HIL) tests and validate the performance requirements.

III. System Design

The SINPLEX FM is a highly integrated, fully redundant, miniaturized, autonomous navigation system capable of withstanding the harsh space environment faced in exploration missions. It is designed to meet the combined requirements of all described mission scenarios. The system features a suite of redundant components, each chosen to fulfil the performance requirements with minimal mass. A star tracker (STR) provides inertial relative attitude. A navigation camera provides target relative position and attitude information. A laser altimeter/range finder (LA) provides range information relative to the target and scaling information for the navigation camera images. An inertial measurement unit (IMU) provides high-rate specific force and angular rate information. Processing of sensor measurements is distributed over a number of components. Raw sensor measurements are processed on dedicated subsystem electronics and then fused together on the navigation computer (NC) using a navigation filter to compute the vehicle state. All components have a corresponding redundant counterpart, which is only used if a subsystem failure occurs. Data is passed between components using the SPA-S (Space Plug and Play Avionics) protocol over SpaceWire. As with any real-time system, it is important to control the timing and reference time of sensor measurements. SINPLEX uses a custom triggering protocol to control sensor timing, which uses a coded word on a single RS485 line shared by all sensors. The FM is shown in Figure 1 and is estimated to be 5.3kg and use 32W with redundant systems on idle, which meets the main mass and power requirements. The overall volume is roughly 170 x 210 x 200mm. The details of this design were presented in [19, 27].
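The custom triggering protocol itself is not specified in this summary. As a rough illustration of the idea of a coded trigger word broadcast on a shared line, the following sketch packs a sensor ID and an epoch counter into a short frame; the byte layout, sensor IDs and checksum are hypothetical, not the actual SINPLEX protocol.

```python
# Hypothetical coded trigger word on a shared RS485 line.
# Sensor IDs, field widths and the checksum are illustrative assumptions.
SENSOR_IDS = {"STR": 0x1, "CAM": 0x2, "LA": 0x3, "IMU": 0x4}

def encode_trigger(sensor: str, epoch: int) -> bytes:
    """Pack a 4-bit sensor ID and a 12-bit epoch counter into two bytes,
    followed by a simple XOR checksum byte."""
    word = (SENSOR_IDS[sensor] << 12) | (epoch & 0x0FFF)
    hi, lo = word >> 8, word & 0xFF
    return bytes([hi, lo, hi ^ lo])

def decode_trigger(frame: bytes):
    """Validate the checksum and recover the sensor name and epoch."""
    hi, lo, chk = frame
    if hi ^ lo != chk:
        raise ValueError("checksum mismatch")
    word = (hi << 8) | lo
    sensor_id, epoch = word >> 12, word & 0x0FFF
    name = next(k for k, v in SENSOR_IDS.items() if v == sensor_id)
    return name, epoch
```

Each sensor listening on the shared line would decode the frame and fire its measurement when its own ID appears, tagging the result with the broadcast epoch so the NC can time-align all measurements.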

Figure 1: Top (top) and bottom (bottom) view of SINPLEX FM.

Figure 2: Overview of the hardware building blocks, their interconnections and the six areas which are miniaturized.

Six main measures are adopted to create compact miniaturized hardware as shown in Figure 2:
1. All components are integrated in one housing. This reduces the number of mechanical interfaces and provides a stable platform for all sensors. The system is designed to be very stiff, which allows it to be qualified for several missions. The high thermal conductivity of the housing ensures good thermal conduction and an even temperature distribution throughout the system. All sensors are co-aligned and verified at lab level. Apart from saving mass, this measure also saves significant costs associated with qualification and co-alignment verification at spacecraft level.
2. All sensors share (in part) the same processor, power conversion and data interface. The unnecessary duplication of these components found in conventional systems is removed.
3. All electrical subsystems are miniaturized using three main types of technologies: thin films instead of bond wires, through silicon vias instead of bond wires and mounting of bare dies.
4. Measurements from the navigation camera and LA are used in combination to update the navigation solution. Since both sensors need to be co-aligned a natural choice is that they share the same aperture, which saves the corresponding structural mass needed for creating duplicated baffled optical paths.
5. Since the navigation camera and LA take measurements of the same target they also share parts of the same optics. The optical path is split just before the detectors to differentiate the two signals.
6. All measurement data generated by the single sensors are fused together in the NC to achieve an accurate navigation solution. Sensor hybridization is a common technique for navigation which allows the weaknesses of each sensor to be compensated by the strengths of others. Additionally, sensors are aided by the navigation solution to simplify image processing measurements and reduce computational time.
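The hybridization idea in measure 6 can be illustrated with a minimal, hypothetical one-dimensional example (not the SINPLEX filter itself): a position estimate propagated from a deliberately biased velocity, standing in for IMU drift, is kept bounded by infrequent noisy range updates through a scalar Kalman gain.

```python
import random

random.seed(1)
dt = 0.1
true_v = -2.0                        # true descent rate, m/s (illustrative)
pos = 1000.0                         # true altitude, m
est_pos, P = 1000.0, 1.0             # filter estimate and its variance
Q, R = 0.5, 4.0                      # process and altimeter noise variances

for step in range(200):              # 20 s at a 10 Hz propagation rate
    pos += true_v * dt
    # Propagate with a biased velocity: on its own, the estimate drifts.
    est_pos += (true_v + 0.3) * dt
    P += Q * dt
    if step % 10 == 0:               # 1 Hz altimeter update
        z = pos + random.gauss(0.0, R ** 0.5)
        K = P / (P + R)              # scalar Kalman gain
        est_pos += K * (z - est_pos)
        P *= 1.0 - K

print(abs(est_pos - pos))            # stays bounded despite the velocity bias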

The final FM design is the product of a nine-month design phase. During the PDR design phase a number of design trade-offs were considered, including:
• Modular vs. integrated system
• Up to 3 sensors of each type
• All combinations of combining the STR, navigation camera and LA receiving optics apertures
• LA receiving field of view (FOV) size and direction
• A pointable laser vs. a static direction
• A 532nm vs. 1064nm laser
• Wishbone, SPA-1, SPA-U or SPA-S for the electronics interfaces
• STR FOV from 10deg to 20deg
• Navigation camera FOV from 40deg to 80deg
• Camera and LA measurement rates from 0.1Hz to 10Hz
• Image detectors with 512 x 512 pixels or 1024 x 1024 pixels
• IMU rates from 50Hz to 100Hz
• A full range of acceptable gyro and accelerometer error limits
• 3 options for reading out the sensors in the IMU
• 4 options for placing a redundant IMU sensor axis
• Using IMU sensors with and without integrated read-out electronics
• 4 options for IMU packaging

A direction was chosen for each design option, a preliminary FM design was created and this was reviewed at PDR by the advisory board. A detailed breadboard (BB) design was then created, which is equivalent to the FM design in terms of navigation performance. The BB design was reviewed at CDR by the advisory board. A number of components were descoped when building the BB in order to decrease costs and manufacturing time, none of which affect the ability to reach TRL 4. These include:
• Using a single set of sensors instead of a full redundant system
• Using a COTS DCPU-1 to distribute some power and act as a SPA-1 hub instead of designing a custom power distribution subsystem
• Using a COTS laser driver board instead of a custom board
• Using equivalent terrestrial components instead of space-qualified components
• Using Ethernet as the external interface
• Including a number of debug interfaces
• Not producing a cover for the SPAD
• Not implementing some image processing algorithms onboard
• Not implementing any FDIR software

The produced BB has a mass of 3.1kg and uses 16.3W, both well within the desired range.

III.1 IMU

The SINPLEX IMU is custom built using MEMS accelerometers and gyros in a 4-axis tetrahedral configuration providing redundancy on the sensor level. There are two IMUs in the FM providing redundancy on the subsystem level. Figure 3 shows the BB IMU design and a configuration with two stacked IMUs. One gyro/accelerometer pair is mounted on the main board and the other three pairs are folded up and mounted on the tetrahedral support structure. Two IMUs can be tightly stacked together with minimal volume.
Unfortunately, there are no space-qualified MEMS accelerometers or gyros available on the market today which meet the project goals for a small integrated system. This is one of the major technical challenges of building a SINPLEX FM. Of course this is not a problem for the BB, which uses Colibrys MS9002 accelerometers and Analog Devices ADXRS646BBGZ gyros. These are the most suitable candidate sensors on the market today which meet the performance requirements. The accelerometer was designed for harsh environments and is currently undergoing space qualification, but the gyro's applicability in harsh environments is unknown. For both sensors, additional research and testing would be needed to assess their performance in a space environment.
The main IMU electronics board is based on ÅAC Microtec's µRTU, which is a COTS component. This embedded computer is responsible for sampling the sensors, compensating for known errors from laboratory calibration (e.g. temperature effects) and compensating for estimated errors sent from the navigation filter on the NC. Additionally, to conserve processing power on the NC, the IMU data is integrated at a high rate within the IMU processor instead of the more common approach of doing this on the NC.
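The high-rate integration within the IMU processor can be sketched as follows; this is a simplified illustration (a flight implementation would also apply coning and sculling corrections, and the bias values are placeholders for the calibration and filter-estimated errors).

```python
import numpy as np

def integrate_increments(accel_samples, gyro_samples,
                         accel_bias, gyro_bias, dt=0.01):
    """Sum 100 Hz specific-force and angular-rate samples into one 10 Hz
    delta-velocity and delta-angle increment after bias compensation."""
    dv = np.zeros(3)   # accumulated delta-velocity, m/s
    dth = np.zeros(3)  # accumulated delta-angle, rad
    for f, w in zip(accel_samples, gyro_samples):
        dv += (np.asarray(f) - accel_bias) * dt
        dth += (np.asarray(w) - gyro_bias) * dt
    return dv, dth
```

Sending only the 10 Hz increments to the NC reduces both bus traffic and the NC processing load, while the bias terms let the navigation filter's error estimates be applied at the full sensor rate.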
The BB IMU was calibrated on a rotation table over the full range of dynamics needed for testing. All IMU requirements were passed up to the tested ranges except for two requirements related to sensor alignment, which can be easily fixed with minor design and integration procedure changes.

Figure 3: Model of single unfolded IMU (left) and stacked configuration with 2 IMUs (right).

Figure 4: Schematic overview of the LA subsystem.

III.2 Laser Altimeter

A micro-laser altimeter works by emitting short, powerful light pulses towards a target surface and measuring the time of flight of the reflected photons. By using a single photon detector, the power of the emitted light pulse can be lowered dramatically with respect to traditional laser altimeters. This makes it feasible to use a passive Q-switched microchip laser that emits short light pulses, typically on the order of a nanosecond, at repetition rates of a few to several tens of kilohertz and pulse energies of a few micro-Joule. These high repetition rates allow statistical analysis of the measured times of flight, which reduces the effect of background or noise photons on the distance measurement while still maintaining a high measurement rate. The measurement accuracy is determined by the pulse length, the timing resolution of the detectors and electronics, and the number of samples taken into account in the statistical analysis.
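The statistical analysis described above can be sketched as a simple histogram-mode estimator: background photons spread roughly uniformly over the time bins while signal photons pile up at the true round-trip time. The bin width, sample counts and noise levels below are illustrative, not the SINPLEX values.

```python
import random

C = 299_792_458.0  # speed of light, m/s

def estimate_range(tofs, bin_width=1e-9, max_tof=2e-6):
    """Histogram single-photon time-of-flight samples and take the
    peak bin as the round-trip time estimate."""
    n_bins = int(max_tof / bin_width)
    hist = [0] * n_bins
    for t in tofs:
        b = int(t / bin_width)
        if 0 <= b < n_bins:
            hist[b] += 1
    peak = hist.index(max(hist))
    tof = (peak + 0.5) * bin_width
    return C * tof / 2.0  # two-way travel time -> one-way range

random.seed(0)
true_range = 150.0                    # metres (illustrative)
true_tof = 2.0 * true_range / C       # ~1 microsecond round trip
samples = [random.uniform(0.0, 2e-6) for _ in range(2000)]        # background
samples += [random.gauss(true_tof, 0.3e-9) for _ in range(500)]   # returns
print(estimate_range(samples))
```

Even though the background photons outnumber the true returns four to one here, they are spread over 2000 bins while the returns concentrate in one or two, so the histogram peak recovers the range robustly.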
Figure 4 shows a schematic overview of the LA subsystem for the SINPLEX BB. A Teem Photonics MNG-03E laser is driven and powered by the custom designed LA laser driver board. The laser divergence is reduced from 12 mrad down to 1 mrad by the beam expander (BE) optics (Figure 5). The BE is designed so that the three mirrors which compose it fold the beam inside the SINPLEX housing, resulting in a very compact system. A Thorlabs FDS100 photodiode (PD) is located close to the laser and its output serves as the START signal for the time-of-flight measurement. The returned photons are measured by the receiving optics (RO), which are shared with the navigation camera (Figure 5) and are discussed later. A Sensl PCDMini-00020 single photon avalanche diode (SPAD) detects the returned photons and its output serves as the STOP signal for the time-of-flight measurement. The outputs of both the PD and the SPAD are fed into the custom designed LA analysis board which performs the time-of-flight measurement and statistical analysis of the signal on a Xilinx Virtex-4 FX12 FPGA. The expected measurement accuracy of the LA is 12cm 3sigma with a range up to 10km.
A custom LA board was designed, which combines the laser driver and analysis electronics into a single board which fits in the housing. However, due to time constraints the board was not produced and a COTS laser driver board was used in the BB.
The integrated BB laser was aligned with the receiver optics and functional tests were done in a laboratory environment to validate the optical and electronic subsystems. The LA was tested up to a range of 32m and validated against the expected performance results. The measured throughput of the BE is 62% and the divergence is 0.8mrad, which agree with the expected values.

Figure 5: Cutaway of laser beam expander optics (top left). Cutaway of navigation camera and laser receiver optics (top right). Cutaway of STR optics (bottom). In all images only one set of components is installed.

III.3 Navigation Camera and Laser Receiver Optics

One of the more difficult miniaturization efforts was to combine the navigation camera and LA receiver optics. For the LA a small spot of light reflecting from the target surface is detected by the SPAD. Due to the small opening angle, the preferred focal lengths are large. Furthermore, the signal to noise ratio (SNR) increases with the aperture. These properties lead to long and large optical systems. In general, navigation cameras are designed to have relatively large FOVs and assume relatively bright objects. This results in optical systems with short focal lengths and small apertures. To combine these two types of receiving optics a balance between the desired properties was needed.
In the BB design most of the optics are shared between the two receivers, as shown in Figure 5. The design specifications of the two systems are listed in Table 1. The main lens barrel is a double Gauss design, which is common in camera lenses. It consists of six lenses of different materials in order to compensate for chromatic and geometric aberrations. The green light for the LA is split out by a notch reflection filter and exits the lens barrel through a hole on the side. The remainder of the light goes through the filter to the navigation camera detector. In order to compensate the aberrations introduced by the reflection filter, the camera detector is tilted by 1deg. Since there is not enough space to fit the SPAD in the focal plane, a single lens with 4x demagnification is used to relay the green light, enabling the SPAD to be placed further away and ensuring the correct FOV. It is difficult, if not impossible, to manufacture a notch reflection filter with the narrow bandwidth desired for the LA performance (~1nm). To avoid this problem, a COTS dichroic beam splitter (Semrock NFD01-532) was used in combination with a narrow notch filter in the relay optics (Thorlabs FL05532-1). This reduces the overall transmission of the LA optics, but this can be compensated by the statistical analysis.

Table 1: Optical properties for each sensor.

Specification Navigation Camera STR Camera LA Receiver
Rectangular FOV 40deg 16 x 20deg 3.2mrad
Entry aperture diameter 10mm 10 x 30mm elliptical 10mm
Sensor HAS2 HAS2 Sensl PCDMini-00020
Spectral range 400 – 800nm 400 – 800nm 532nm
Sensor size (active) 1024 x 1024pixels 472 x 712pixels 1pixel
Pixel size 18µm 18µm 20µm

III.4 Star Tracker Optics

The STR is based on the Multiple Aperture Baffled Star Tracker (MABS) design developed by a consortium led by TNO. The main idea behind this sensor is to reduce the size of the STR baffle (the largest mechanical component of the system) by using reflective optics and integrating the baffle into the SINPLEX housing, as shown in Figure 5. The sensor's optical properties are listed in Table 1. The STR is defocused to improve centroiding accuracy. The baffles are placed to minimize stray light on the detector.
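Defocusing spreads each star image over several pixels so that an intensity-weighted centroid can locate it to a small fraction of a pixel, which is what drives the STR attitude accuracy. A minimal illustration with a synthetic Gaussian spot (the PSF width and window size are illustrative):

```python
import numpy as np

def centroid(window: np.ndarray):
    """Intensity-weighted centre of mass over a small pixel window."""
    total = window.sum()
    ys, xs = np.indices(window.shape)
    return (xs * window).sum() / total, (ys * window).sum() / total

# Synthetic defocused star: Gaussian PSF centred at (4.3, 3.7) pixels
# with a 1.5-pixel sigma on a 9 x 9 window.
ys, xs = np.indices((9, 9))
psf = np.exp(-((xs - 4.3) ** 2 + (ys - 3.7) ** 2) / (2 * 1.5 ** 2))
cx, cy = centroid(psf)
```

A perfectly focused star concentrated in one pixel would only be locatable to the nearest pixel; the spread spot lets the weighted sum recover the sub-pixel position.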

III.5 On Board Computer

An OBC Lite™ EM with Ethernet add-on board is used for the NC. Ethernet was chosen for the spacecraft data interface instead of a space qualified interface (which is dependent on the spacecraft architecture) to reduce development efforts. The BB NC (Figure 6) has the following specifications: CPU @ 18MHz, ÅAC certified fault tolerant enhanced 32bit OpenRISC 1200; MAC/DSP, FPU, MMU, Wishbone rev.B3; 8kB Instruction cache; 8kB Data cache; 64MB SDRAM @ 72MHz; 8Gbit NAND parallel FLASH storage with advanced error correction; 26x GPIO TTL; 4x master/slave I2C busses (multimaster compatible) with pull-ups; 2x SPI master @ up to 9MHz clock; 1x fast USB Host v1.1 12Mbps with DMA when using ÅAC RIA mode; 1x Ethernet BASE-T 10/100 Mbps with DMA; 1x RS422; 1x RS485; 4x 12bit Analog IO with anti-aliasing filters at 160kHz; 48bit Spacecraft Elapsed Timer; Open Source Linux OS; Open Source Universal Bootloader (U-Boot).

Figure 6: OBC Lite™ EM with Ethernet add-on board.

III.6 Camera Front End Electronics

The STR and navigation camera both use identical custom designed front end electronics (FEE). In total there are 4 sets of FEEs, one for each navigation and STR camera, which are all placed on the bottom of the housing. Each FEE consists of a processing board and a detector board. The detector board hosts a HAS2 detector and is placed in the STR or navigation camera optical path. The amount of resources needed for image processing on the FEEs was not accurately known at the time of development, so the processing board is designed with more than enough resources. This also allows the system to serve as a development platform for future image processing applications. In the BB design the processing board consists of a Xilinx XC5VLX110T FPGA (a non-rad hard model was chosen to reduce development costs), 1GB RAM and 8MB flash.

III.7 Housing

One of the crucial elements of the integrated system approach is the housing design and manufacturing, which is called Sophia casting. With this method, a single monolithic aluminum housing was created and a large part of the mechanical interfaces were already in place after manufacturing, which allows for shared functionality.
A thermal analysis was performed on the most critical point, which is the temperature of the STR detectors. Qualitatively, when these detectors become too hot, the dark current will increase to a point that magnitude 6 stars cannot be observed with sufficient precision. The thermal model contains a baseplate with two elevated detector mounts. Based on the thermal analysis performed during PDR, a housing temperature of up to 45 deg Celsius was used in the analysis. The analysis shows that the detector chip will run 5 deg Celsius cooler than the housing, assuming the baseplate of the system is bolted to the spacecraft in a conductive manner. The main performance effect is on the dark current of the HAS2 detector, which would typically rise from 190 electrons/pixel/s at 25 deg Celsius to 300 electrons/pixel/s. This increase has a negligible effect on the STR performance.

IV. Navigation Algorithm

The mission scenarios require an autonomous real-time navigation solution (vehicle position, velocity and attitude) at every mission phase. The SINPLEX system provides this by fusing sensor measurements on the NC in real-time with a delayed error state extended Kalman filter (EKF) similar to [21]. Sensor fusion aims to overcome the weaknesses of each sensor by combining all sensor measurements into a single accurate navigation solution. This effort aids in miniaturizing the system by loosening the performance requirements for each individual sensor.
The system and navigation modes of operation depend on the scenario phase, which were identified by the requirements analysis. The BB computes three types of navigation solutions, depending on the scenario. An inertial solution is calculated when the vehicle is high above the surface, as in the first half of scenario 1, by using absolute position measurements provided by the crater navigation algorithm. An inertial solution is also required for scenario 3; however there are no performance requirements for this case. A target relative (TR) navigation solution is calculated when the vehicle is close to the surface, where few mapped craters are visible, as in the second half of scenario 1 and all of scenario 2. In this case landmark recognition is used to find the target landing location and feature tracking is used to track the spacecraft's motion relative to the surface using a variant of EKF-SLAM [22]. Finally, a container relative navigation solution is calculated for scenario 3 by using object recognition measurements.
The navigation software is distributed over several subsystems, as shown in Figure 7. Additionally, the navigation algorithm is distributed over several software tasks running at three separate rates, as shown in Figure 8. Image processing algorithms are run on the camera FPGAs and only the final measurement results are sent to the NC. Raw and compensated IMU measurements are processed within the IMU subsystem, which uses the 100Hz high-rate (HR) navigation algorithm to provide 10Hz integrated delta-velocity and delta-angle increments to the NC, which are compensated for any rotations over the interval. Raw LA measurements are processed within the LA subsystem and range measurements are sent to the NC. There are two additional algorithms that are run directly on the NC: the navigation medium-rate (MR) and low-rate (LR) tasks. The MR task runs at 10Hz and is responsible for integrating the equations of motion, calibrating clocks and calculating the state transition matrix for the EKF. The LR task runs at 1Hz and is responsible for propagating the EKF with the state transition matrix and updating the EKF with the LA and image processing measurements. State corrections from the LR task are sent to the MR task to correct the navigation state. IMU error estimates (sensor biases and scale factors) are sent from the MR task to the HR task to compensate the integrated IMU measurements. A similar 2-rate approach was successfully used for the Hybrid Navigation System experiment [21], which allows for an optimal filter, reduced real-time constraints and the processing load to be spread out over time. An initial concept for the filter measurements was presented in [28].
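The two-rate split between the MR and LR tasks can be illustrated with a minimal sketch, reduced to one dimension. All names, rates and noise values here are illustrative assumptions, not the flight code: the MR task integrates IMU delta-velocity increments into the navigation state, while the slower LR task computes an EKF correction that is fed back to the MR task.

```python
import numpy as np

DT_MR = 0.1   # medium-rate task period, 10 Hz
DT_LR = 1.0   # low-rate task period, 1 Hz

def mr_propagate(state, dv):
    """Medium-rate task: integrate one IMU delta-velocity increment."""
    pos, vel = state
    vel += dv                      # apply integrated delta-velocity
    pos += vel * DT_MR             # integrate equations of motion
    return np.array([pos, vel])

def lr_update(state, meas_pos, P, R=0.01):
    """Low-rate task: EKF position update; returns a state *correction*."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + R
    K = P @ H.T / S
    dx = (K * (meas_pos - state[0])).flatten()  # error-state correction
    P = (np.eye(2) - K @ H) @ P
    return dx, P

state = np.array([0.0, 0.0])
P = np.eye(2)
for k in range(10):                # one LR period = 10 MR steps
    state = mr_propagate(state, dv=0.1 * DT_MR)  # constant 0.1 m/s^2
dx, P = lr_update(state, meas_pos=0.06, P=P)     # LR measurement update
state = state + dx                 # correction fed back to the MR task
```

Running the fast integration and the slow correction as separate loops is what spreads the processing load over time, as described above.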

Figure 7: Block diagram showing the distribution of navigation and image processing software.

Figure 8: Distribution of navigation algorithm over three separate rates.

V. Image Processing Algorithms

A number of image processing techniques are used to obtain useful measurements for each mission phase. These include star tracking, crater navigation, landmark recognition, feature tracking, and long range and close range object recognition.
STR measurements are used in every mission phase using the STR camera. The STR FEE receives a trigger from the NC, which starts the integration period of the detector. The image is preprocessed using dark image subtraction to avoid the presence of bad pixels and to reduce most of the background noise in the raw image. The centroiding algorithm runs in either lost-in-space (LIS) or tracking mode, depending on the availability of aiding information from the NC. In LIS mode the entire image is searched and in tracking mode successive image centroids are predicted using angular velocity aiding information [15]. In both cases the moment method is used to calculate centroids, which calculates the star location using a weighted average of bright pixels. Star identification is done on the NC using the pyramid algorithm [16] in LIS mode and a star neighbors approach [17] in tracking mode. Attitude determination uses the q-Method [18] and is also run on the NC.
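The moment method mentioned above can be sketched as follows. The synthetic 5x5 star window, threshold value and window handling are illustrative assumptions; the flight implementation runs on the camera FPGAs.

```python
import numpy as np

def moment_centroid(window, threshold):
    """Intensity-weighted (moment method) centroid of above-threshold pixels.
    Returns (row, col) or None if no pixel exceeds the threshold."""
    w = np.where(window > threshold, window - threshold, 0.0)
    total = w.sum()
    if total == 0.0:
        return None
    rows, cols = np.indices(window.shape)
    return (rows * w).sum() / total, (cols * w).sum() / total

# Synthetic 5x5 dark-subtracted star window, symmetric about (2, 2)
img = np.array([[0, 0, 1, 0, 0],
                [0, 4, 8, 4, 0],
                [1, 8, 20, 8, 1],
                [0, 4, 8, 4, 0],
                [0, 0, 1, 0, 0]], dtype=float)
r, c = moment_centroid(img, threshold=0.5)   # sub-pixel star location
```

Because the weighted average uses the full intensity distribution, the centroid is recovered with sub-pixel resolution, which is what the attitude determination step relies on.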
Crater navigation is only used in the descent orbit phase of scenario 1 when a high-quality database of mapped craters is available and the spacecraft is high enough above the surface to see multiple mapped craters in a single image. The crater finding algorithm runs on the FEEs and processes an image to find and measure candidate craters by matching light and dark (in shadow) sections of a crater [23]. On the NC, the crater identification algorithm uses a stochastic approach to search for the candidate craters in a database of known craters, which are then used to find the absolute position and attitude of the spacecraft [24]. The estimated navigation solution is used to aid this search effort. The crater navigation algorithm was developed within the SINPLEX project, but was not implemented onboard.
Landmark recognition is used to identify the target landing location relative to the spacecraft. Without this information the control system would not be able to interpret the TR navigation solution. However, this algorithm involves hazard avoidance, which is outside of the scope of the SINPLEX project. Therefore, this is reduced to a functional model for the BB.
Feature tracking is used during the powered descent and landing phases of scenario 1 and throughout all of scenario 2, when crater navigation is not suitable. On the FEEs, the image is searched with the ``Good Features to Track'' algorithm [25] to find suitable features with properties that would be easy to find in subsequent images. Features are re-found in subsequent images using brute force search around a position estimated from aiding data. This is the most important image processing algorithm used during HIL testing, since it provides several simultaneous terrain measurements.
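The selection criterion behind "Good Features to Track" [25] is the minimum eigenvalue of the local gradient structure tensor: a patch is a good feature only if it has strong gradients in two directions. The sketch below illustrates the score on three toy patches; patch contents and sizes are illustrative assumptions, not the FEE implementation.

```python
import numpy as np

def shi_tomasi_score(patch):
    """Minimum eigenvalue of the gradient structure tensor of a patch --
    the corner score used for 'Good Features to Track' selection."""
    gy, gx = np.gradient(patch.astype(float))
    Ixx, Iyy, Ixy = (gx * gx).sum(), (gy * gy).sum(), (gx * gy).sum()
    tr, det = Ixx + Iyy, Ixx * Iyy - Ixy * Ixy   # 2x2 tensor invariants
    return tr / 2.0 - np.sqrt(max(tr * tr / 4.0 - det, 0.0))

flat = np.zeros((7, 7))                    # no texture: score 0
edge = np.tile(np.arange(7.0), (7, 1))     # 1-D ramp: one zero eigenvalue
corner = np.zeros((7, 7)); corner[3:, 3:] = 10.0  # gradients in both axes
scores = [shi_tomasi_score(p) for p in (flat, edge, corner)]
```

Flat and edge patches score near zero while the corner scores high, which is why corner-like features remain easy to re-find in subsequent images.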
Long range object recognition is used in scenario 3 when the sample container is less than a few pixels wide in the navigation camera image (distances greater than ~50m). At these distances the width of the container cannot be accurately measured and the container image is similar to a bright star, which is not easily distinguishable from the background stars. The algorithm first uses the star centroiding algorithm to compute the centers of all bright stars (including the container). On the NC, the centroid locations are transformed into an inertial vector and stored. Comparing the list of inertial vectors over time, the container is found by identifying the one vector which moves with respect to the inertial frame, providing a bearing measurement. Due to a lack of testing time, this algorithm was not implemented on the BB.
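The moving-vector identification can be sketched as follows, assuming two epochs of inertial unit vectors and an illustrative angular tolerance (all vectors and values here are assumptions for illustration): stars keep a near-identical counterpart between epochs, while the container does not.

```python
import numpy as np

def find_moving_vector(vecs_t0, vecs_t1, tol_rad=1e-3):
    """Return indices (into vecs_t1) of vectors with no near-identical
    counterpart at t0 -- candidates for the container bearing."""
    moving = []
    for i, v1 in enumerate(vecs_t1):
        angles = [np.arccos(np.clip(np.dot(v0, v1), -1.0, 1.0))
                  for v0 in vecs_t0]
        if min(angles) > tol_rad:       # moved w.r.t. the inertial frame
            moving.append(i)
    return moving

# Three fixed 'stars' plus a slowly moving 'container' (illustrative vectors)
stars = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]
container_t0 = np.array([0.6, 0.8, 0.0])
container_t1 = np.array([0.58, 0.81, 0.05])
container_t1 /= np.linalg.norm(container_t1)
idx = find_moving_vector(stars + [container_t0], stars + [container_t1])
```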
Close range object recognition is used in scenario 3 when the sample container is more than a few pixels wide, such that shading details can be seen and the width of the image can be measured. In this case the container will be seen as a partially lit sphere. A multiple step algorithm is entirely run on the FEEs. It starts by using aiding information to reduce the search space. It then uses a magnitude threshold and ``blob'' finding algorithm to identify the pixels belonging to the container. It then fits these pixel locations to a circle by minimizing a cost function relating the circle radius and the number of pixels included inside the circle. The result is the center and diameter of the fit circle. This algorithm was implemented on the BB, but there was not enough time to test it in HIL.
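The BB algorithm fits the circle by minimizing a cost relating radius and enclosed pixels; as an illustrative substitute, the sketch below uses a standard algebraic (Kasa) least-squares circle fit on synthetic limb pixels to show how a centre and diameter can be recovered from blob pixel locations.

```python
import numpy as np

def kasa_circle_fit(x, y):
    """Algebraic least-squares circle fit (Kasa method):
    solve x^2 + y^2 = c*x + d*y + e, then recover centre and radius."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    c, d, e = np.linalg.lstsq(A, rhs, rcond=None)[0]
    a, b = c / 2.0, d / 2.0
    return a, b, np.sqrt(e + a**2 + b**2)

# Synthetic 'container limb' pixels on a circle of radius 5 centred at (3, -2)
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
a, b, r = kasa_circle_fit(3 + 5 * np.cos(theta), -2 + 5 * np.sin(theta))
```

The fit centre gives the container bearing and the fit diameter gives a range cue, which is the information the close range algorithm delivers to the filter.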

VI. Test Facility Preparation

Two test facilities were foreseen to test the BB: TRON (Testbed for Robotic Optical Navigation) and TENSOR (Test Environment for optical Navigation Systems On airfield Runway). First, a list of requirements for the test benches was created based on the mission scenarios. A design for the test facilities was then created, which addressed the needs to test and verify the performance of the BB. The facilities were then adapted in order to meet the requirements for the environment and optical target properties, including building optical targets, configuring the facilities and producing mechanical adaptors. The BB was mounted in TRON for all navigation performance tests. Tests in TENSOR were descoped due to project timeline constraints.
Terrain relative navigation tests were done in the TRON laboratory [26]. The BB was mounted to the end of a 7-DOF (degrees of freedom) robotic arm (Figure 9) which can move along a 10.5m track. The robot can then be precisely positioned in real-time using a dSPACE real-time simulator (a modular COTS real-time simulation platform) and can automatically run through any programmed trajectory, within the limits of the robot. Several 3D terrain models are present in the lab on 3 of the walls, which represent scaled Moon and asteroid landscapes. A sample return container model and a complete 3D scaled asteroid model are also available. A 5-DOF lamp and gantry system provide uniform lighting in a programmed direction and path. Figure 10 shows the BB attached to the robot and all the additional hardware components needed for HIL testing.
TRON can also provide an accurate measurement of the run trajectory by tracking the BB with a laser tracker (AT901-MR from Leica). A T-Mac device is rigidly attached to the BB, which can be tracked by the laser tracker in 6-DOF.
The position and attitude alignment between the T-Mac and navigation camera is measured by tracking the T-Mac with the laser tracker while the navigation camera views an optical calibration target in various poses. With this, the reference trajectory from the laser tracker can be transformed into the reference trajectory of the navigation camera. This provides an accurate reference trajectory to compare with the navigation data from the BB.
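Once the fixed T-Mac-to-camera offset is calibrated, transforming the reference trajectory is a composition of rigid-body poses; the numeric values below are illustrative assumptions, not the actual calibration result.

```python
import numpy as np

def pose_to_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# world <- T-Mac pose measured by the laser tracker (illustrative values)
T_world_tmac = pose_to_T(np.eye(3), np.array([10.5, 0.0, 1.2]))
# calibrated, fixed T-Mac <- camera offset (illustrative values)
Rz90 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
T_tmac_cam = pose_to_T(Rz90, np.array([0.1, 0.0, 0.05]))
# compose into the world <- camera reference pose at each trajectory sample
T_world_cam = T_world_tmac @ T_tmac_cam
```

Applying this composition at every tracker sample yields the camera-frame reference trajectory against which the BB navigation output is compared.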
A Jenoptik Optical Sky field Simulator (OSI) stimulates optical camera systems for observing objects at optically infinite distance. It mainly consists of an optical head, which projects the image, and a control computer including remote interface software. The optical head includes a micro display with 800 x 600 pixel resolution and is inserted into the STR baffle with a special adapter. The OSI host PC receives attitude commands remotely from the dSPACE system. The PC calculates and projects a simulated image of the sky (including stars, planets, the Moon, single event upsets, etc.) through the optical head as a collimated beam. The beam enters the camera, the NC triggers the camera, and the STR processes the captured image. Optical distortions in the STR image caused by the OSI orientation are calibrated and compensated in the STR and NC software.

Figure 9: SINPLEX breadboard attached to TRON robotic arm.

Figure 10: Additional hardware components attached to breadboard for HIL testing.

VII. Test Campaign

The test campaign began in the manufacturing phase with individual subsystem unit tests performed by the responsible partners. Two “flat-sat” tests were done during the summer of 2013 to functionally test the internal communications between subsystem components and external interfaces. During the design and manufacturing phase all of the image processing and navigation algorithms were tested in software-in-the-loop and on development FPGA boards. After system integration and alignment, the integrated BB passed the TRR and was ready for HIL tests in late October 2013. An aggressive testing phase then followed.
The amount of time available for HIL testing was significantly reduced compared to the original project work plan. For this reason, a number of tests were descoped including:
• Scenario 3 tests, including container finding algorithms
• Scaled trajectory testing with modelled sensor data
• TENSOR tests

The reduced HIL test plan started by testing each subsystem individually in the integrated system in order to confirm the results of previous tests and to run some subsystem performance tests. This phase lasted roughly 8 weeks, during which numerous problems were found in every subsystem. These problems were subsequently debugged, fixed, mitigated or accepted as a failure in the system. In mid-December 2013 the BB was integrated into TRON and aligned to the lab reference frame. In early 2014 initial tests in TRON showed a reasonable navigation solution in line with expectations. At this point testing with representative trajectories began. A number of lateral motion trajectories were run with the two landscape terrain models in TRON and at different relative velocities, which are representative of the DO phase of scenario 1. Additionally, a number of runs using the last 10m of the asteroid trajectory were done, which is the best trajectory to use for performance requirements validation since no trajectory scaling is needed.

VIII. Navigation Performance

VIII.1 Software-in-the-Loop

Navigation performance was first evaluated using Monte Carlo simulations with the software-in-the-loop simulation. The SINPLEX simulation environment uses the subsystem requirements to characterize the sensor errors. For each mission scenario, a Monte Carlo of 20 runs was done. Each run used random values for all sensor parameters, noises, errors, etc. The results were then combined to form a statistically significant performance estimate.
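The structure of such a Monte Carlo can be sketched as follows; the error model, parameter values and 20-run size are illustrative assumptions, far simpler than the SINPLEX simulation environment, but they show the pattern of drawing fresh random sensor parameters per run and combining the results.

```python
import numpy as np

rng = np.random.default_rng(42)

def run_once(rng):
    """One simplified run: a navigation 'error' driven by randomly drawn
    sensor parameters (bias and noise sigmas are illustrative assumptions)."""
    gyro_bias = rng.normal(0.0, 0.01)      # drawn 1-sigma gyro bias
    noise = rng.normal(0.0, 0.05, 100)     # per-sample measurement noise
    return abs(gyro_bias) * 10.0 + noise.std()

# 20-run Monte Carlo, each run with freshly drawn sensor errors
errors = np.array([run_once(rng) for _ in range(20)])
mean_err, sigma3 = errors.mean(), 3.0 * errors.std()
```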
Two different versions of the navigation algorithm were tested: the BB code and the development code. The development code is a full fidelity version of the SINPLEX navigation filter. The BB code is a lower fidelity filter which implements several approximations, lower precision variables and shortcuts in order to speed up the code so that it can run in real-time on the NC. The major changes for the BB code are as follows:
• Single precision (instead of double) for most variables
• Only one feature position is estimated and tracked by the filter
• Approximations to several equations

For scenario 1, the results show that both the development and BB codes fail several of the navigation requirements. There is still much room for improvement in altitude and velocity performance. Altitude performance can be improved by pointing the LA towards nadir instead of along the flight path. Velocity performance can be improved by tracking more features, using better accelerometers or changing the trajectory to allow better online characterization of accelerometer errors. Using more features would, however, increase the processing burden. Further analysis is needed to determine if tighter accelerometer performance requirements are necessary.
For scenario 2, the results show that both the development and BB codes fail some of the navigation requirements. The worst case 3σ value is used, which represents only a small time period during each phase. For example, during the GD phase, the velocity errors are the worst just after each burn and after 50s this error is significantly lower. Excluding these transient phases greatly improves the reported performance. The development code only fails the TR attitude requirement at landing, which can be improved by changing the trajectory or increasing the number of tracked features.
For scenario 3, the results show that both the development and BB codes fail several of the navigation requirements. Again, the worst case 3σ value is shown here, which represents only a small time period during each phase. Excluding these transient phases greatly improves the reported performance.

VIII.2 Hardware-in-the-Loop

Without using scaled trajectories only some of the navigation performance requirements can be verified. Due to time and technical constraints, testing in TRON focused on DO representative trajectories which fly laterally over a terrain and on the last portion of the asteroid landing trajectory.
Initially, a spiral calibration trajectory was used to check system stability and get an initial performance estimate. The results show a stable filter and nominal behaviour. These initial results are published in [20, 29].
16 different runs were done using a linear lateral trajectory with different velocities and terrain models. The results cannot be directly compared to most of the scenario 1 DO requirements since the HIL test setup is too different from the actual mission scenario. However, the results show the general performance of the system for lateral motion, which is around 0.2m position error drift, 5cm/s velocity error, 0.1deg inertial attitude error, 0.1m altitude error and 20deg TR attitude error for most of the runs. The only value that can be compared to the mission requirements is the inertial attitude error, which has a requirement of 0.03deg. This fails the requirement, which is likely due to the poor gyro performance and STR image quality.
10 runs were done using the asteroid landing trajectory using the asteroid wall in TRON. The main differences between the HIL setup and the mission scenario are that Earth gravity applies in HIL and the acceleration dynamics are completely different. Therefore, the asteroid mission performance requirements can be compared to the HIL results, but these differences should be considered when verifying the requirements. Figure 11 - Figure 15 show the results. The trajectory starts at a time of 5sec. The errors for each run are plotted individually (thin lines), along with the 3-sigma RMS of the errors (thick, dashed blue lines) and the corresponding performance requirement (thick, dashed red lines) if applicable. Not counting transients and outlying points, this meets the velocity error requirements but not the altitude and TR attitude requirements. This can be improved by choosing a different trajectory, improving the lighting in TRON for better feature recognition and tracking more features in the filter. More detailed and final results are presented in [30].

Figure 11: Attitude error for all asteroid trajectory HIL runs.

Figure 12: Velocity error for all asteroid trajectory HIL runs.

Figure 13: Absolute position error for all asteroid trajectory HIL runs.

Figure 14: Feature plane distance error for all asteroid trajectory HIL runs.

Figure 15: Feature plane attitude error for all asteroid trajectory HIL runs.

IX. References
1. SINPLEX document, "Description of Work," 2011-09-06.
2. SINPLEX document, "Mission Requirements Compilation," 2012-04-20.
3. S. Theil and H. Krüger, "Definition Referenzmissionen," Internal Report AT-RYNR-TN-004, DLR, November 2011.
4. S. Theil and H. Krüger, "Analyse Missionen," Internal Report AT-RYNR-TN-002, DLR, 2010.
5. J. L. Davis et al., "Advances in POST2 End-to-End Descent and Landing Simulation for the ALHAT Project," No. AIAA 2008-6938, Honolulu, Hawaii, August 2008.
6. C. D. Epp and T. B. Smith, "Autonomous Precision Landing and Hazard Detection and Avoidance Technology (ALHAT)," 2007 IEEE Aerospace Conference, March 2007.
7. A. G. Santo et al., "The MESSENGER mission to Mercury: spacecraft and mission design," Planetary and Space Science, Vol. 49, 2001, pp. 1481-1500.
8. B. Houdou and the ESA NEXT Lunar Lander Team, "NEXT Lunar Lander with In-Situ Science and Mobility: Phase A Mission Study, Mission Requirements Document," Internal Report NEXT-LL-MRDESA(HME)-0001, ESA, October 2008.
9. D. Escorial et al., "CDF Study Report: Marco Polo," CDF Study Report CDF-72(A), ESA, May 2008.
10. J. Kawaguchi, A. Fujiwara, and T. Uesugi, "Hayabusa: Its technology and science accomplishment summary and Hayabusa-2," Acta Astronautica, Vol. 62, 2008, pp. 639-647.
11. Astrium FOSTERNAV Team, "FOSTERNAV: Mission Requirements Compilation Document," Internal Report GNC_T.TCN.756274.ASTR, EADS Astrium, June 2011.
12. "HARVD GMV Team Solution," presentation at ESTEC, GMV, January 2010.
13. "HARVD: Final Presentation," internal report, EADS Astrium, February 2010.
14. J. A. Christian, A. M. Verges, and R. D. Braun, "Statistical Reconstruction of Mars Entry, Descent, and Landing Trajectories and Atmospheric Profiles," AIAA SPACE 2007 Conference and Exposition, No. AIAA 2007-6192, Long Beach, California, AIAA, September 2007.
15. M. Samaan, T. Pollock, and J. Junkins, "Predictive Centroiding for Star Trackers with the Effect of Image Smear," Journal of the Astronautical Sciences, Vol. 50, 2002, pp. 113-123.
16. D. Mortari, M. Samaan, and J. Junkins, "Lost-in-space pyramid algorithm for robust star pattern recognition," The Journal of Navigation, Vol. 51, No. 3, 2004.
17. M. Samaan, D. Mortari, and J. L. Junkins, "Recursive Mode Star Identification Algorithms," IEEE Transactions on Aerospace and Electronic Systems, Vol. 41, October 2005, pp. 1246-1254.
18. P. Davenport, "A Vector Approach to the Algebra of Rotations with Applications," Technical Report TN D-4696, NASA, August 1968.
19. S. R. Steffes et al., "SINPLEX: a Small Integrated Navigation System for Planetary Exploration," 36th Annual AAS Guidance and Control Conference, Breckenridge, Colorado, AAS, February 2013. AAS 13-043.
20. S. R. Steffes et al., "Target Relative Navigation Results from Hardware-in-the-Loop Tests Using the SINPLEX Navigation System," 37th Annual AAS Guidance and Control Conference, Breckenridge, Colorado, AAS, February 2014. AAS 14-402.
21. S. R. Steffes, "Real-Time Navigation Algorithm for the SHEFEX2 Hybrid Navigation System Experiment," Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minnesota, AIAA, August 2012. AIAA-2012-4990, 10.2514/6.2012-4990.
22. H. Durrant-Whyte and T. Bailey, "Simultaneous Localization and Mapping: Part I," IEEE Robotics & Automation Magazine, June 2006, pp. 99-108.
23. B. Maass et al., "An Edge-Free, Scale-, Pose- and Illumination-Invariant Approach to Crater Detection for Spacecraft Navigation," 7th International Symposium on Image and Signal Processing and Analysis (ISPA 2011), Dubrovnik, Croatia, September 2011.
24. C. S. Spigai M. and V. Simard-Bilodeau, "An Image Segmentation-Based Crater Detection and Identification Algorithm for Planetary Navigation," Intelligent Autonomous Vehicles, Lecce, Italy, September 2010.
25. J. Shi and C. Tomasi, "Good Features to Track," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 1994.
26. H. Krüger and S. Theil, "TRON - hardware-in-the-loop test facility for lunar descent and landing optical navigation," 18th IFAC Symposium on Automatic Control in Aerospace, 2010.
27. E. Laan et al., "SINPLEX: A Small Integrated Navigation System for Planetary Exploration," 64th International Astronautical Congress, Beijing, China, September 2013.
28. D. Heise, S. Steffes, and S. Theil, "Filter Design for Small Integrated Navigator for Planetary Exploration," 61. Deutscher Luft- und Raumfahrtkongress, Berlin, Germany, September 2012.
29. S. Conticello et al., "Development and test results of SINPLEX, a compact navigator for planetary exploration," 4S Symposium, 26-30 May 2014, Porto Petro, Majorca, Spain. Accepted for publication.
30. S. Steffes et al., "Target Relative Navigation Performance Results from SINPLEX: a Miniaturized Navigation System," GNC 2014: 9th International ESA Conference on Guidance, Navigation and Control Systems, June 2014, Porto, Portugal. Accepted for publication.

Potential Impact:
The SINPLEX project has an impact in several ways. First, it uses micro technologies and functional integration to reduce the mass (by a factor of 3 or more) and size of the navigation subsystem. This would allow more payload mass (e.g. more instruments), thus enabling a higher scientific return. SINPLEX technology would also enable small vehicles to be guided and actively controlled, allowing them to land closer to scientifically interesting spots.
SINPLEX’s navigation system is based on current micro technologies. They are used in today’s state-of-the-art consumer products and military applications where a high maturity level has already been reached. Tightly integrating the sensors into one navigation system allows the weaknesses of single sensors to be compensated by the strengths of others. This provides high robustness and flexibility in harsh and unexpected environments.
SINPLEX’s significantly reduced mass would increase the available delta-v, leading to reduced travel times. Shorter travel times would improve component reliability since the spacecraft would be exposed to the harsh space environment for a shorter time.
The SINPLEX navigation subsystem is part of the GNC subsystem for planetary exploration vehicles. It will autonomously provide a full navigation solution needed for approach, descent, landing, rendezvous and capture. The combination of miniaturization and integration techniques is unique: the integrated approach is new and complementary to past and current developments, which use standard (non-miniaturized) components. Autonomous navigation is a key enabling technology for future planetary exploration missions. SINPLEX’s navigation subsystem will provide a robust and precise navigation solution which will contribute to the success of future crewed, cargo and robotic missions (MSR, Lunar Lander, and Marco Polo).
In order to increase impact and achieve an optimal exploitation of project results the following exploitation activities have been done:
• Presentation of project results in technical conferences.
• Presentation of the project results and their potential exploitation to ESA and national agencies.
• Demonstration of the breadboard in the test environment.
• Publication in EC brochures.
• Maintenance of a public website.
Following the CDR and the first year review, discussions were started with ESA on how to include the SINPLEX technology in ESA programmes. Since the SINPLEX technology can also be used for approach and rendezvous with space debris objects, a first idea for an ESA programme with a focus on Clean Space was discussed with ESA, national agencies and other potential partners.
With the new Horizon 2020 programme, a new opportunity arises to continue the development of SINPLEX within an EU project. The programme of the space call 2014 includes the development of technologies to access Near Earth Objects (NEOs), and SINPLEX fits this activity (PROTEC-2) very well. Since development of technologies up to TRL 5 and 6 is expected and SINPLEX already reaches TRL 4 at the end of the project, the Horizon 2020 space call offers an optimal opportunity for further maturation of the SINPLEX technologies and the development of an Engineering Model.
Furthermore, each partner of the SINPLEX project has individual exploitation plans:
TNO:
TNO developed, for the first time, a system with multiple AOCS sensors in a single housing. The assembly and testing of the star tracker and navigation camera provide a proof of concept (TRL 4). Such a solution did not exist before. Several space missions have been identified as potential candidates for flying such a unit. Based on these missions, the requirements for SINPLEX were derived from the following mission scenarios: lunar descent and landing, asteroid descent and landing, Mars descent and landing, and container rendezvous and capture. The versatility of the unit is illustrated by the fact that it is currently being considered for use on a yet-to-be-defined Space Debris Removal mission.

AAC:
There is a high potential that the IMU can be further developed into a product that can be offered as an off-the-shelf solution. The floating point matrix arithmetic accelerator designed as FPGA IP will be incorporated in future ÅAC products.

Cosine:

SystematIC:
SystematIC Design is a design house for analog and mixed-mode electronics and ASICs. The knowledge gained in this project can be advantageously applied in new projects. The SINPLEX project was more oriented towards digital design, while most of SystematIC's projects focus on analog signal processing. The design of complex (16-layer) PCBs including complex FPGAs can now be exploited for new projects and advertised as one of the competences of SystematIC Design.

DLR:
Lessons learned during the development process for this navigation system architecture, as well as results of the test campaign, will be fed back into future developments in the same sector or in navigation systems for other space or terrestrial applications. The simulation developed to test algorithms and performance will be exploited for the development of future applications. Part of DLR's work is the development of image processing algorithms to be deployed on FPGAs. This heritage can also be exploited for future navigation applications for spacecraft or terrestrial vehicles. Within SINPLEX the test facilities had to be adapted to cover the SINPLEX scenarios. With this increased envelope the test facilities can be offered to a broader range of customers for testing. DLR will use the software and adapted FPGA hardware description for the star tracker in its SHEFEX-3 mission. The use of SINPLEX hardware components (e.g. detector board, FPGA board, etc.) has yet to be analysed and decided.

List of Websites:
http://www.sinplex.eu