From these digital panoramas, the AI will be able to correlate known ridges and boulders with those visible in images taken by crewed or robotic lunar missions, providing accurate location information for any given region of the Moon’s surface.
Dr Yew explained: “Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks.
“While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet. This accuracy opens the door to a broad range of mission concepts for future exploration.”
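The matching idea Dr Yew describes — compare the horizon you can see against horizons simulated from terrain data — can be illustrated with a toy sketch. Everything here is an illustrative assumption (a tiny synthetic elevation grid, a coarse brute-force search), not NASA's actual system:

```python
import math
import numpy as np

def horizon_profile(dem, x0, y0, n_az=24, max_r=15):
    """Maximum elevation angle visible in each azimuth bin from cell (x0, y0)."""
    h0 = float(dem[y0, x0])
    angles = [-math.inf] * n_az
    for i in range(n_az):
        az = 2 * math.pi * i / n_az
        ca, sa = math.cos(az), math.sin(az)
        for r in range(1, max_r):
            x, y = int(round(x0 + r * ca)), int(round(y0 + r * sa))
            if not (0 <= x < dem.shape[1] and 0 <= y < dem.shape[0]):
                break  # ray left the map
            angles[i] = max(angles[i], math.atan2(float(dem[y, x]) - h0, r))
    return np.array(angles)

def locate(dem, observed):
    """Brute-force search for the cell whose simulated horizon best matches."""
    best, best_err = None, math.inf
    for y in range(5, dem.shape[0] - 5):
        for x in range(5, dem.shape[1] - 5):
            err = float(np.mean((horizon_profile(dem, x, y) - observed) ** 2))
            if err < best_err:
                best, best_err = (x, y), err
    return best

rng = np.random.default_rng(0)
dem = rng.normal(0.0, 1.0, (30, 30)).cumsum(axis=0).cumsum(axis=1)  # smooth toy terrain
true_pos = (16, 10)                          # (x, y) the explorer is really at
observed = horizon_profile(dem, *true_pos)   # stand-in for a real panorama
print(locate(dem, observed))                 # recovers true_pos
```

A real system would match panoramas from camera images against LOLA-derived elevation maps rather than a synthetic grid, and would use a far more efficient search than this exhaustive scan.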
In future, the same concept might also be applied to other planets — helping explorers on Mars navigate using the Red Planet’s terrain, for example, or even enabling terrestrial navigation in areas where GPS signals are blocked or spotty.
For efficiency’s sake, Dr Yew explained, practical applications of the concept might see only the necessary local subsets of terrain and elevation data downloaded to a navigation handset for a given mission.
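That "local subset" idea amounts to cropping a global elevation grid to the bounding box a mission actually needs. A minimal sketch, assuming a simple latitude/longitude grid layout for illustration (not the actual LOLA product format):

```python
import numpy as np

# Hypothetical global elevation grid: 0.1-degree cells, row 0 at lat +90,
# column 0 at lon -180. The layout and resolution are illustrative assumptions.
CELLS_PER_DEG = 10
global_dem = np.zeros((180 * CELLS_PER_DEG, 360 * CELLS_PER_DEG), dtype=np.float32)

def mission_subset(dem, lat_min, lat_max, lon_min, lon_max):
    """Crop the global grid to the bounding box a given mission needs."""
    row_top = int((90 - lat_max) * CELLS_PER_DEG)
    row_bot = int((90 - lat_min) * CELLS_PER_DEG)
    col_left = int((lon_min + 180) * CELLS_PER_DEG)
    col_right = int((lon_max + 180) * CELLS_PER_DEG)
    return dem[row_top:row_bot, col_left:col_right]

# e.g. a 2-degree by 2-degree patch near the lunar south pole
tile = mission_subset(global_dem, -86.0, -84.0, 10.0, 12.0)
print(tile.shape)  # (20, 20)
```

Only `tile`, a small fraction of the global grid, would need to be stored on the handset.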
According to research published by Goddard Space Flight Center planetary scientist Dr Erwan Mazarico, a lunar explorer can see no more than around 180 miles in any direction from a given spot on the Moon’s surface.
Article source: https://www.express.co.uk/news/science/1712180/nasa-ai-artificial-intelligence-moon-navigation-system-lunar-horizon-lola-giant