Planetary Lander Technology Publications

Publications authored or co-authored by members of the Space Technology Centre related to planetary lander technology are listed below:

  • S. Parkes, M. Dunstan, I. Martin, P. Mendham, S. Mancuso
    57th International Astronautical Congress
    October 2006

    ESA is planning a series of robotic missions to Mars, the safe delivery of which will be greatly assisted by landing systems capable of autonomous navigation and hazard detection and avoidance. Vision-based navigation is a promising technique which is currently being developed by ESA. The testing of vision-based navigation systems can benefit from computer-based planet surface simulations representative of the target planetary body.

  • S. Parkes
    Workshop on Advanced Space Vehicle Control (2006)
    September 2006
  • Steve Parkes, Martin Dunstan, Iain Martin
    SpaceOps 2006
    June 2006

    Future European planetary missions will include rovers to explore the surface of Mars. Mission operators will need to be trained to operate the rovers and to cope with various problems that could occur during the mission. Operator training can be done using full-scale mock-ups of the rover, Martian terrain and communications system once a complete rover is available. Training for fault recovery, however, is difficult on a real system, as physically introducing a fault is often impractical. A complementary method of operator training is to use a simulation of the rover and the planetary surface over which it is moving; it is then relatively easy to introduce faults into the simulation.

  • S. Parkes, M. Dunstan, I. Martin, S. Mancuso
    DASIA (Data Systems in Aerospace) 2005
    May 2005

    PANGU (Planet and Asteroid Natural Scene Generation Utility) is a software tool for supporting the development of autonomously guided planetary landers. It is designed to support work on vision-based lander navigation, vision and LIDAR based hazard detection and multi-sensor data fusion for planetary lander navigation and piloting. It may also be used for planetary rover navigation and path planning activities and for rover operator training. PANGU can synthesise realistic planetary surfaces representative of the Moon, Mercury, Mars, asteroids or comets. It can simulate cameras and LIDAR instruments viewing the surface.

  • M. Dunstan, S. Parkes, S. Mancuso
    DASIA (Data Systems in Aerospace) 2005
    May 2005

    This paper describes an image processing chip developed by the University of Dundee to support vision-based navigation of a planetary lander. The chip detects features in an image and tracks them from one image frame to the next. It manages the list of feature points and their status. Information about tracked feature points is sent to a Guidance and Navigation Control (GNC) computer for integration with information from the Inertial Measurement Unit. The connection to the GNC computer is via two SpaceWire links. A third SpaceWire link is used to control the camera providing images to the image processing chip.
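    As a purely illustrative software sketch of the detect-and-track idea described above (the paper implements this in a hardware chip; the detector and matcher below are hypothetical stand-ins, not the chip's actual algorithms), feature tracking can be expressed as gradient-based feature detection followed by nearest-neighbour matching between consecutive frames:

    ```python
    # Illustrative sketch only: detect high-contrast points in a frame, then
    # match each one to the nearest detection in the next frame. In a lander
    # GNC system, the surviving matches would be reported to the navigation
    # computer for fusion with Inertial Measurement Unit data.

    def detect_features(img, threshold):
        """Return (row, col) of pixels whose squared gradient magnitude
        exceeds the given threshold (a crude corner/edge detector)."""
        feats = []
        for r in range(1, len(img) - 1):
            for c in range(1, len(img[0]) - 1):
                gx = img[r][c + 1] - img[r][c - 1]   # horizontal gradient
                gy = img[r + 1][c] - img[r - 1][c]   # vertical gradient
                if gx * gx + gy * gy > threshold:
                    feats.append((r, c))
        return feats

    def track(prev_feats, next_feats, max_dist=2):
        """Greedily pair each previous feature with the nearest new
        detection within max_dist pixels; unmatched features are dropped."""
        matches = {}
        for p in prev_feats:
            best, best_d = None, max_dist ** 2 + 1
            for q in next_feats:
                d = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                if d < best_d:
                    best, best_d = q, d
            if best is not None:
                matches[p] = best
        return matches

    # Usage: a single bright spot drifting one pixel to the right is
    # detected in both frames and tracked across them.
    img1 = [[0] * 5 for _ in range(5)]; img1[2][2] = 10
    img2 = [[0] * 5 for _ in range(5)]; img2[2][3] = 10
    matches = track(detect_features(img1, 50), detect_features(img2, 50))
    ```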

  • V. Silva, S. Parkes
    6th International Conference on Dynamics and Control of Systems and Structure in Space (2004)
    July 2004
  • V. Silva, S. Parkes
    DASIA (Data Systems in Aerospace) 2004
    June 2004

    Recent planetary lander missions to Mars, such as the UK’s Beagle 2 and NASA’s Spirit and Opportunity, have highlighted the need for further research on vision-based navigation and hazard detection and avoidance systems for autonomous planetary landers in order to achieve safe, soft, and precise landings. Landing on the Moon, on Mars or other planetary bodies, close to a predetermined target landing spot, in an area of rough terrain, is a difficult and risky task. Accurate navigation relative to the planetary surface is necessary, together with the detection of possible hazards like boulders or steep slopes. This is being made possible by developments in vision-based guidance techniques and on-board processing technology.

  • Iain Martin, Steve Parkes, Martin Dunstan
    DASIA (Data Systems in Aerospace) 2004
    June 2004

    ESA is developing autonomous planetary guidance and lander systems for exploratory spacecraft. Ongoing research work at the University of Dundee is aiding this effort through the development of simulated planetary objects to facilitate the development and testing of autonomous navigation, guidance and landing systems. Realistic asteroids have been simulated in three dimensions using fractal techniques and crater models. An innovative use of Poisson Faulting has been developed to create irregularly shaped objects with a controllably rough surface. Simple craters are mapped onto the model to create a highly realistic, simulated asteroid.
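    As an illustrative sketch of the general fault-formation idea (a flat 2-D analogue, not the paper's spherical Poisson-faulting implementation; the function and parameter names are hypothetical), the technique repeatedly picks a random dividing line and displaces the heights on the two sides in opposite directions, accumulating a controllably rough surface:

    ```python
    import random

    def fault_terrain(size, iterations, delta=1.0, seed=0):
        """Fault-formation terrain sketch: repeatedly choose a random line
        a*x + b*y = c across the grid and raise the heightfield on one side
        of it while lowering the other side by the same amount."""
        rng = random.Random(seed)
        h = [[0.0] * size for _ in range(size)]
        for _ in range(iterations):
            a = rng.uniform(-1.0, 1.0)
            b = rng.uniform(-1.0, 1.0)
            c = rng.uniform(-size, size)
            for y in range(size):
                for x in range(size):
                    h[y][x] += delta if a * x + b * y > c else -delta
        return h

    # Usage: more iterations (with shrinking delta, in refined variants)
    # produce progressively rougher, more fractal-looking terrain.
    heights = fault_terrain(size=8, iterations=50)
    ```

    On a spherical body, as in the paper, the dividing line becomes a random plane through the object, with vertices on either side displaced inwards or outwards.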

  • S. Parkes, I. Martin, M. Dunstan
    Eighth International Conference on Space Operations (2004)
    May 2004

    Planetary landers have, in the past, relied on physical means to protect the payload from the shock of impact on the surface. These landers, starting their descent from orbit with their initial position only known to a few kilometres, were not required to land at a particular landing spot, but only to land safely. Today, much more knowledge, obtained from earlier landings and high-resolution orbiting instruments, is available about the surfaces of some planets than was available when previous landers were designed. Missions are becoming more demanding in terms of the accuracy of landing and significant effort is now focused on the design of surface relative navigation systems.

  • S. Parkes, M. Dunstan, D. Matthews, I. Martin, V. Silva
    DASIA (Data Systems in Aerospace) 2003
    June 2003

    The European Space Agency (ESA) is preparing for unmanned and/or manned exploration of several planets including Mercury and Mars. To support this endeavour several key technologies have to be developed. Among these necessary technologies are navigation sensors that can support landing and sample-return capsule rendezvous operations. A vision-based navigation sensor is currently being developed by ESA which will enable surface relative navigation. The Planet and Asteroid Natural-scene Generation Utility (PANGU) is a system for generating simulated surfaces of planets and for producing images of those surfaces from specified camera positions and orientations. PANGU was implemented specifically to support the development of vision-based navigation sensors and is now being used to validate associated image feature tracking algorithms.

  • V. Silva, S. Parkes
    DASIA (Data Systems in Aerospace) 2003
    June 2003

    The definition of guidelines for the design of an optimal GNC sensor architecture for spacecraft is a recurrent problem. Since the beginning of the Space Age, several generations of GNC sensors have been developed, flight tested and successfully used in space missions. With the advent of new technologies and materials, GNC sensor performance has improved significantly over recent decades and new sensors have emerged. Concepts are evolving from ground-based to autonomous GNC systems. Hazard avoidance systems for planetary landers are replacing hazard tolerance systems. Both Lidar-based and vision-based GNC systems are currently being developed to support autonomous landings. The optimum mix of sensors and the best way to fuse sensor data is an important area of research for future planetary lander space missions.

  • S. Parkes, I. Martin, M. Dunstan, S. Mills
    DASIA (Data Systems in Aerospace) 2002
    May 2002

    The ESA Bepi Colombo mission to Mercury is due for launch in 2009 and is expected to include a lander. Computer vision based navigation is being considered for the lander to enable it to land softly, close to a pre-designated target landing spot, avoiding any small craters, boulders or other obstacles not visible in an orbital survey. The development and testing of a vision-based lander guidance system requires high-resolution images of the planet’s surface which are not available for Mercury.

  • S. Parkes, V. Silva
    DASIA (Data Systems in Aerospace) 2002
    May 2002

    Computer vision technology is being developed by ESA to support the navigation and autonomous landing of spacecraft on a planet’s surface. A computer vision system is used to navigate the lander relative to the planet’s surface and to detect any obstacles close to the target landing site. The computer vision system will be supported by other Guidance and Navigation Control (GNC) sensors. This paper provides a review of GNC sensors for possible use on a planetary lander in support of a primary vision-based sensor. It also describes how those sensors will be simulated.

  • S. Parkes, I. Martin, S. Strandmoe
    DASIA (Data Systems In Aerospace) 2001
    May 2001

    The ESA Bepi Colombo mission to Mercury is due for launch in 2009.  There are three components planned for the mission: a planetary orbiter, a magnetospheric satellite and a surface element.  Computer vision based guidance is being considered for the surface element to enable it to land close to a pre-designated target landing spot, avoiding any small craters, boulders or other obstacles not visible in an orbital survey.  The development and testing of a vision-based lander guidance system requires high-resolution images of the planet’s surface which are not available. The Planetary and Asteroid Natural scene Generation Utility (PANGU) is a tool for generating realistic simulated planetary surfaces and for producing the images of those simulated surfaces. This paper describes PANGU.

  • S. Parkes, I. Martin
    ICEUM (International Conference on Exploration and Utilization of the Moon) 2000
    July 2000

    Exploration and utilisation of the Moon is likely to be undertaken with extensive support from robotic systems. These robotic systems must be placed on the lunar surface close to or at the location in which they are to operate. To land a spacecraft safely at a required landing spot requires an autonomous piloting capability using computer vision to support navigation and obstacle avoidance. The landing is mission critical and any autonomous piloting system will require extensive testing and validation before it can be used in a mission. The Space Systems Research group at the University of Dundee recognised the need for a planetary surface simulation system to support the development of the mission critical computer vision techniques. Subsequent research led to the LunarSim system which may be used to produce simulated lunar surfaces complete with craters and other surface features. Images can then be taken of this simulated surface from any position and orientation above the surface.
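    A minimal sketch of a LunarSim-style cratered heightfield (the parabolic bowl-and-rim profile and the rough power-law size distribution below are assumptions for illustration; LunarSim's actual crater model may differ):

    ```python
    import math
    import random

    def add_crater(h, cx, cy, radius, depth):
        """Stamp a simple crater onto heightfield h: a parabolic bowl inside
        the radius, surrounded by a low raised rim (illustrative profile)."""
        for y in range(len(h)):
            for x in range(len(h[0])):
                d = math.hypot(x - cx, y - cy) / radius   # normalised distance
                if d < 1.0:
                    h[y][x] -= depth * (1.0 - d * d)          # excavated bowl
                elif d < 1.5:
                    h[y][x] += 0.2 * depth * (1.5 - d) * 2.0  # raised rim
        return h

    def cratered_surface(size, n_craters, seed=0):
        """Scatter craters whose radii roughly follow a power-law
        size distribution: many small craters, few large ones."""
        rng = random.Random(seed)
        h = [[0.0] * size for _ in range(size)]
        for _ in range(n_craters):
            r = 2.0 / math.sqrt(rng.uniform(0.05, 1.0))
            add_crater(h, rng.uniform(0, size), rng.uniform(0, size), r, 0.5 * r)
        return h

    # Usage: a small synthetic patch of cratered terrain.
    surface = cratered_surface(size=16, n_craters=5)
    ```

    A renderer can then image such a surface from any simulated camera position and orientation, as the abstract describes.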

  • Steve Parkes, Iain Martin
    5th International Conference on Virtual Systems and Multimedia (1999)
    September 1999

    A Virtual Reality simulation of a planetary surface can be used to support the testing and development of vision-based guidance systems for unmanned space-probes intended to land on other planets or bodies in the solar system. These space-probes will operate autonomously during landing and will use vision-based algorithms for navigation and guidance down to a safe landing spot. Feasible vision-based guidance systems have been developed but must be extensively tested on suitable simulations of planetary surfaces. The University of Dundee has developed a virtual reality simulation of the lunar surface to support exhaustive testing of vision-based guidance software.

  • S. Parkes, I. Martin
    Information Visualization IV 1999
    July 1999

    Virtual reality techniques are being used to support the development of unmanned space-probes intended to land on other planets in the solar system.  These planetary landers will operate autonomously during landing and will use vision for navigation and guidance down to a safe landing spot.  Suitable vision techniques have been developed but must be extensively tested on realistic test surfaces. A system (LunarSim) for producing realistic simulations of heavily cratered planetary surfaces has been developed to support exhaustive testing of vision guidance software.

  • S. Parkes, I. Martin, I. Milne
    DASIA (Data Systems in Aerospace) 1999
    May 1999

    Landing on the Moon, close to a predetermined target landing spot, in an area of rough terrain, is a difficult task. Accurate navigation relative to the lunar surface is necessary, together with the detection of possible hazards like boulders or steep slopes. The ESA 3D Planetary Modelling study demonstrated the feasibility of using vision for the guidance of planetary landers. Vision guidance algorithms were tested using a robotic frame to move a camera above a 2 m × 1 m physical mock-up of the lunar surface. The use of physical models of a planetary surface for the development and testing of the vision techniques suffers from several problems, including the difficulty and cost of building and using such models.