Celebrating the 50th Anniversary of Apollo 11
By Rachel Berkowitz, Ph.D., July 18, 2019
i2k Connect salutes our AI colleagues at NASA and many academic centers who contributed to the success of the U.S. space mission, notably the lunar landing 50 years ago. Intelligence had to be built into every aspect of the computing—most visibly in the robotics but also in all the computers entrusted with decision-making.
On 20 July 1969, astronaut Neil Armstrong took the famous small step for a man and giant leap for mankind. He had travelled from Florida to the moon’s orbit on the Apollo 11 spacecraft, launched on the Saturn V rocket four days earlier. Toward the end of the 12th lunar orbit, Apollo 11 split into two spacecraft: the lunar module Eagle, piloted by Armstrong and Edwin “Buzz” Aldrin; and the command module Columbia, piloted by Michael Collins. For the last 150 meters above the lunar surface, Armstrong and Aldrin manually maneuvered the Eagle to touch down gently on the smooth and level Sea of Tranquility.
The digital Apollo Guidance Computer (AGC) provided computation and electronic interfaces for guidance, navigation, and control for both the Columbia and Eagle. Developed in the early 1960s for the Apollo program by the MIT Instrumentation Laboratory, the AGC was one of the first integrated circuit-based computers. To enter a command, a user typed a two-digit “Verb” describing the type of action and another two-digit “Noun” specifying the data affected by the verb.
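The Verb/Noun pattern can be pictured as a tiny dispatch table. The sketch below is a hypothetical, highly simplified model of that command style, not the AGC's actual tables; the noun label and function names are illustrative assumptions.

```python
# A minimal, hypothetical sketch of the DSKY's Verb/Noun command style.
# The two-digit verb names the action; the two-digit noun names the data
# it acts on. These pairings are illustrative, not the real AGC tables.

NOUNS = {
    "68": "landing radar delta-altitude",  # assumed label for Noun 68
}

def monitor(noun_code):
    """Verb 16: monitor (continuously display) the selected noun."""
    return f"monitoring {NOUNS[noun_code]}"

VERBS = {
    "16": monitor,
}

def execute(verb_code, noun_code):
    """Dispatch a verb onto a noun, DSKY-style."""
    return VERBS[verb_code](noun_code)

# Verb 16 Noun 68, as Aldrin keyed in during the descent:
print(execute("16", "68"))
```

Separating "what to do" (verb) from "what to do it to" (noun) let a small keypad and display drive a large command space, a design choice that echoes in modern command-dispatch interfaces.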
During the Eagle’s descent, the landing radar provided information regarding the module’s velocity and distance from the lunar surface. Despite a “reasonability check” performed by the software, radar data could not be incorporated into the computer’s position and velocity calculations without approval from the crew and ground. To check the spacecraft’s position, Aldrin gave the computer the command (Verb 16 Noun 68) to calculate and display the difference between altitude sensed by the landing radar and the computed altitude. Although the number displayed on the screen fell within expected error, alarm code 1202 appeared on the computer screen and sent a warning signal to Mission Control in Houston. Alarm 1202 meant that Aldrin’s command had overloaded the computer, telling it to do more work than it had time for.
Fortunately, Mission Control knew that a 1202 alarm indicated a peripheral hardware bug documented by Apollo 5 engineers. The Apollo team had decided it was safer to fly with the tested hardware rather than with a new but untested system. An uncorrected problem in the rendezvous radar interface made it look like that antenna's position was rapidly wobbling. Those phantom movements stole 13% of the computer's duty cycle. Within half a minute of the alarm, Mission Control guidance officer Steve Bales issued a "Go" command to the Eagle. Aldrin told the computer to continue navigation based on measurements from the landing radar.
Several 1202 errors later (and a similar 1201 error), the Eagle had landed. Software designed by J. Halcombe “Hal” Laning ultimately saved the day. The software automatically recovered in the face of errors because it was designed to prioritize critical guidance and control tasks. Without it, the landing would have been aborted for lack of a stable guidance computer.
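The recovery behavior described above can be sketched as a toy priority-scheduled executive: when the job table overflows, everything non-critical is flushed and only the highest-priority guidance work survives. This is a hypothetical illustration in the spirit of Laning's design, not the actual AGC executive; the class, job names, and table size are all assumptions.

```python
import heapq

class Executive:
    """Toy priority scheduler with restart-on-overload (illustrative only)."""

    def __init__(self, max_jobs=7):
        # The real AGC had a small fixed job table; 7 here is arbitrary.
        self.max_jobs = max_jobs
        self.queue = []  # min-heap of (priority, name); 0 = most critical

    def schedule(self, priority, name):
        if len(self.queue) >= self.max_jobs:
            self.restart()  # analogous to raising a 1201/1202 alarm
        heapq.heappush(self.queue, (priority, name))

    def restart(self):
        # Flush the table but re-establish only critical guidance jobs,
        # letting lower-priority work (e.g. display updates) be
        # re-requested later instead of blocking the landing.
        self.queue = [job for job in self.queue if job[0] == 0]
        heapq.heapify(self.queue)

    def run_next(self):
        return heapq.heappop(self.queue)[1] if self.queue else None

# Overloading a 3-slot table drops the display jobs but keeps guidance:
ex = Executive(max_jobs=3)
ex.schedule(0, "guidance")
ex.schedule(5, "display")
ex.schedule(5, "radar-monitor")
ex.schedule(5, "extra")          # overflows: restart keeps guidance only
print(ex.run_next())             # guidance still runs first
```

The key idea, as the paragraph above notes, is that recovery is designed in: an overload sheds load by priority rather than halting, so the critical control loop keeps running.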
Although the Apollo computer had no more power than the first home computers of the late 1970s, it fulfilled the ground-breaking task of landing humans safely on the moon. Expert knowledge built the system; but the system’s success depended on humans who were aware of its strengths and limitations.
For further reading: https://www.doneyles.com/LM/Tales.html
About Rachel Berkowitz – Rachel is a science journalist and editor (Physics Today; APS Physics; freelance) with an academic background in geophysics. Rachel holds a Ph.D. in Geophysics from the University of Cambridge.
About i2k Connect – An Artificial Intelligence software and service company transforming business. Our Artificial Intelligence technology automatically identifies and tags documents with unique, accurate, and consistent metadata. Our deep data extraction uses computer vision, natural language processing, and statistical machine learning to understand the information in documents and deliver it to the people who need it to quickly make decisions. Our AI Platform and domain expertise are focused on vertical industries including oil and gas, utilities, financial services, and healthcare.