The Lambda Papers: When LISP Got Turned Into A Microprocessor

The physical layout of the SCHEME-78 LISP-based microprocessor by Steele and Sussman. (Source: ACM, Vol 23, Issue 11, 1980)

During the AI research boom of the 1970s, the LISP language – from LISt Processor – saw a major surge in use and development, including the creation of many new dialects. One of these dialects was Scheme, developed by [Guy L. Steele] and [Gerald Jay Sussman], who wrote a number of articles published by the Massachusetts Institute of Technology (MIT) AI Lab as part of its AI Memos. This subset, called the Lambda Papers, covers both men's ideas about lambda calculus and its application to LISP, culminating in the 1980 paper on the design of a LISP-based microprocessor.

Scheme is notable here because it influenced the development of what would be standardized in 1994 as Common Lisp, which is what can be called 'modern Lisp'. The idea of creating dedicated LISP machines, driven by the processing requirements of AI systems, was not a new one. The mismatch between the S-expressions of LISP and the way assembly code used the CPUs of the era led to the development of CPUs with dedicated hardware support for LISP.

The design described by [Steele] and [Sussman] in their 1980 paper, published in the Communications of the ACM, uses an instruction set architecture (ISA) that closely matches the LISP language. As described, it is effectively a hardware-based LISP interpreter, implemented as a VLSI chip called the SCHEME-78. Moving as much as possible into hardware brings a large performance improvement, somewhat like how today's AI boom is built around dedicated vector processors that excel at inference, unlike generic CPUs.
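To make that S-expression-versus-CPU mismatch concrete, it helps to see what LISP code looks like to a machine. The snippet below is a generic Scheme sketch, not code from the Lambda Papers: a program is itself an S-expression, a linked structure of cons cells, and the same CAR/CDR primitives that traverse data also traverse code.

;; A Scheme definition is itself an S-expression: a tree of cons cells.
(define (factorial n)
  (if (= n 0)
      1
      (* n (factorial (- n 1)))))

;; Quoting the body gives us the code as data; the same two primitives
;; that walk any list also walk the program:
(car '(* n (factorial (- n 1))))   ; => *
(cdr '(* n (factorial (- n 1))))   ; => (n (factorial (- n 1)))

On a conventional register-machine ISA of the era, every one of those traversals compiles down to loads, masks, and pointer chasing; the appeal of a LISP machine was making cons cells and their type tags first-class citizens of the instruction set.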

During the 1980s, LISP machines began to integrate more and more hardware features, with the Symbolics and LMI systems featuring heavily. Later, these systems were also marketed towards non-AI uses like 3D modelling and computer graphics. However, as funding for AI research dried up and commodity hardware began to outpace specialized processors, these systems vanished.

Top image: Symbolics 3620 and LMI Lambda Lisp machines (Credit: Jason Riedy)

Word Processing: Heavy Metal Style

If you want to print, say, a book, you will probably type it into a word processor. Someone else will take your file and produce pages on a printer, where your words directly modulate a laser beam or some other mechanism that puts them on paper. But for a long time, printing meant creating a physical representation of what you wanted to print that could stamp an imprint on a piece of paper.

The process of carving something out of wood or some other material to stamp out printing is very old. But the revolution came when the Chinese and, later, the Europeans realized it would be more flexible to make individual symbols from which you could assemble texts. Moveable type. The ability to mass-produce books and other written material had a huge influence on society.

But there is one problem. A book might have hundreds of pages, and each page has hundreds of letters. Someone has to find the right letters, put them together in the right order, and bind them together in a printing press’ chase so it can produce the page in question. Then you have to take it apart again to make more pages. Well, if you have enough type, you might not have to take it apart right away, but eventually you will.

Continue reading “Word Processing: Heavy Metal Style”

Site Of Secret 1950s Cold War Iceworm Project Rediscovered

The overall theme of the early Cold War was subterfuge, with scientific missions often providing excellent cover for placing missiles right on the USSR's doorstep. NASA recently rediscovered Camp Century while testing an airplane-based synthetic aperture radar instrument (UAVSAR) over Greenland. Established on the surface in 1959 as a polar research site, and one that produced good science from ice core samples and the like, the camp hid the secretive Project Iceworm beneath its benign surface.

By 1967 the base had to be abandoned due to the shifting ice sheet, which would eventually bury the site under more than 30 meters of ice. Before that, the scientists tested the PM-2A small modular reactor, which not only provided 2 MW of electrical power and heat to the base, but was itself subjected to various experiments. Alongside this public face, Project Iceworm sought to set up a network of mobile launch sites for nuclear-armed Minuteman missiles beneath the ice sheet, capable of surviving a first-strike scenario by the USSR. A lack of Danish permission, among other complications, led to the project eventually being abandoned.

It was this base that popped up during the NASA scan of the ice bed. Although it was thought that the crushed remains would be safely entombed, it’s estimated that by the year 2100 global warming will have led to the site being exposed again, including the thousands of liters of diesel and tons of hazardous waste that were left behind back in 1967. The positive news here is probably that with this SAR instrument we can keep much better tabs on the condition of the site as the ice cap continues to grind it into a fine paste.


Top image: Camp Century in happier times. (Source: US Army, Wikimedia)

The Great Northeast Blackout Of 1965

At 5:20 PM on November 9, 1965, the Tuesday rush hour was in full swing outside the studios of WABC in Manhattan's Upper West Side. The drive-time DJ was Big Dan Ingram, who had just dropped the needle on Jonathan King's "Everyone's Gone to the Moon." To Dan's trained ear, something was wrong with the sound, as if the turntable speed was drifting: sometimes running at the usual speed, sometimes running slow. But being a pro, he carried on with his show, injecting practiced patter between ad reads and Top 40 songs, cracking a few jokes about the sound quality along the way.

Within a few minutes, with the studio cart machines now suffering a similar fate and the lights in the studio flickering, it became obvious that something was wrong. Big Dan and the rest of New York City were about to learn that they were on the tail end of a cascading wave of power outages that started minutes before at Niagara Falls before sweeping south and east. The warbling turntable and cartridge machines were just a leading indicator of what was to come, their synchronous motors keeping time with the ever-widening gyrations in power line frequency as grid operators scattered across six states and one Canadian province fought to keep the lights on.
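Why does a record player warble with the grid? A synchronous motor's shaft speed is locked to the line frequency: N = 120 · f / P rpm, where f is the line frequency in hertz and P is the motor's pole count. As a back-of-the-envelope example (the actual frequencies that evening aren't recorded here), a two-pole motor on a nominal 60 Hz supply turns at 3600 rpm; if a sagging grid drops to 57 Hz, that same motor slows to 3420 rpm, a 5% slowdown that a trained ear like Ingram's hears immediately as flat, dragging pitch.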

The operators would fail, of course, with the result being 30 million people over 80,000 square miles (207,000 km²) plunged into darkness. The Great Northeast Blackout of 1965 was underway, and when it wrapped up a mere thirteen hours later, it left plenty of lessons about how to engineer a safe and reliable grid, lessons that still echo through the power engineering community 60 years later.

Continue reading “The Great Northeast Blackout Of 1965”

Lost Techniques: Bond-out CPUs And In-Circuit Emulation

These days, we take it for granted that you can connect a cheap piece of hardware to a microcontroller and have an amazing debugging experience. Stop the program. Examine memory and registers. You can see, and usually change, anything. There are only a handful of ways this is done on modern CPUs, and they vary only in detail. But this wasn't always the case. Getting that kind of view into an actual running system used to be an expensive proposition.

Today, you typically have some serial interface, often JTAG, and enough hardware in the IC to communicate with a host computer to reveal and change internal state, set breakpoints, and the rest. But that wasn’t always easy. In the bad old days, transistors were large and die were small. You couldn’t afford to add little debugging pins to each processor you produced.

This led to some very interesting workarounds. Of course, you could always run simulators on a larger computer. But those might not run in real time, and they almost certainly didn't have all the external things you wanted to connect to, unless you simulated those as well. Continue reading "Lost Techniques: Bond-out CPUs And In-Circuit Emulation"

The Hottest Spark Plugs Were Actually Radioactive

In the middle of the 20th century, the atom was all the rage. Radiation was the shiny new solution to everything, while remaining poorly understood by the general public and by a good number of those working with it.

Against this backdrop, Firestone Tire and Rubber Company decided to sprinkle some radioactive magic into spark plugs. There was some science behind the silliness, but it turns out there are a number of good reasons we’re not using nuke plugs under the hood of cars to this day.

Continue reading “The Hottest Spark Plugs Were Actually Radioactive”

Spy Tech: The NRO And Apollo 11

When you think of “secret” agencies, you probably think of the CIA, the NSA, the KGB, or MI-5. But the real secret agencies are the ones you hardly ever hear of. One of those is the National Reconnaissance Office (NRO). Formed in 1960, the agency was totally secret until the early 1970s.

If you have heard of the NRO, you probably know they manage spy satellites and other resources that get shared among intelligence agencies. But did you know they played a major, but secret, part in the Apollo 11 recovery? Don’t forget, it was 1969, and the general public didn’t know anything about the shadowy agency.

Secret Hawaii

Captain Hank Brandli was an Air Force meteorologist assigned to the NRO in Hawaii. His job was to support the Air Force’s “Star Catchers.” That was the Air Force group tasked with catching film buckets dropped from the super-secret Corona spy satellites. The satellites had to drop film only when there was good weather.

Spoiler alert: They made it back fine.

In the 1960s, civilian weather forecasting was not as good as it is now. But Brandli had access to data from the NRO's Defense Meteorological Satellite Program (DMSP), then known simply as "417". The satellite data let him forecast the weather over the drop zones accurately five days out, much better than any civilian meteorologist of the day could manage.

When Apollo 11 headed home, Captain Brandli ran the numbers and found that on July 24th there would be a major tropical storm over the drop zone, located at 10.6° north by 172.5° west, about halfway between Howland Island and Johnston Atoll. The storm was likely to be a "screaming eagle", rising to 50,000 feet over the ocean.

In the movies, of course, spaceships are tough and can land in bad weather. In real life, the high winds could rip the parachutes from the capsule, and the impact would probably have killed the crew.

Continue reading “Spy Tech: The NRO And Apollo 11”