[Featured image: a photo of an RPi and a note about the first lecture.]

The Latest Projects From Cornell’s ECE 4760/5730

ECE 4760/5730 is the Digital Systems Design Using Microcontrollers course at Cornell University taught by [Hunter Adams]. The list of projects for spring this year includes forty write-ups. If you haven’t got time to read the whole lot, you can pick a random project between 1 and 40 with shuf -i 1-40 -n 1 and let the cards fall where they may. Or, if you’re made of time, you could spend a few days watching the full playlist of 119 projects, embedded below.

We won’t pick favorites from this semester’s list of projects, but having skimmed through the forty reports we can tell you that the creativity and acumen of the students really shine through. If the name [Hunter Adams] looks familiar, that might be because we’ve featured his work here on Hackaday before. Earlier this year we saw his Love Letter To Embedded Systems.

While on the subject, [Hunter] also wanted us to know that he has updated his lectures, which are here: Raspberry Pi Pico Lectures 2025. In particular, they have expanded to include a bunch of Pico W content (making Bluetooth servers, connecting to WiFi, UDP communication, etc.), some fun lower-level stuff (the RP2040 boot sequence, how to write a bootloader), and some interesting algorithms (FFTs, physics modeling, etc.).

Continue reading “The Latest Projects From Cornell’s ECE 4760/5730”

Spatial Audio In A Hat

Students from the ECE 4760 class at Cornell have been working on a spatial audio system built into a hat. The project, from [Anishka Raina], [Arnav Shah], and [Yoon Kang], enables the wearer to get a sense of the direction and proximity of objects in the immediate vicinity with the aid of audio feedback.

The heart of the build is a Raspberry Pi Pico. It’s paired with a TF-Luna LiDAR sensor that measures the range to objects around the wearer. The sensor is mounted on a hat, so the wearer can pan it from side to side to scan the immediate area for obstacles. Head tracking wasn’t implemented in the project, so instead the wearer turns a potentiometer to tell the microcontroller which direction they are facing as they scan. The Pi Pico then takes the LiDAR scan data, determines the range and bearing of any nearby objects, and generates a stereo audio signal that indicates to the wearer how close those objects are and their relative direction, using a spatial audio technique called interaural time difference (ITD).
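
For the curious, the ITD trick itself is pleasantly simple: a sound off to one side reaches the nearer ear a fraction of a millisecond before the farther one, and your brain reads that tiny delay as direction. Here’s a rough sketch of how a bearing angle might be turned into a per-channel sample delay. This is our own illustration, not the students’ code, and the sample rate, ear spacing, and function names are all assumptions:

```c
#include <math.h>
#include <stdint.h>

#define SAMPLE_RATE_HZ 40000.0f   /* assumed output sample rate, not from the write-up */
#define EAR_SPACING_M  0.20f      /* rough distance between a listener's ears */
#define SOUND_SPEED_MS 343.0f     /* speed of sound in air, m/s */

/* Turn a bearing (radians, 0 = straight ahead, positive = to the right)
 * into the number of samples to delay the far-ear channel. */
static int itd_delay_samples(float bearing_rad) {
    float itd_s = (EAR_SPACING_M * sinf(bearing_rad)) / SOUND_SPEED_MS;
    return (int)roundf(fabsf(itd_s) * SAMPLE_RATE_HZ);
}

/* Produce one stereo sample pair from a mono tone buffer: the ear on the
 * far side of the source reads from an older position in the buffer. */
static void render_stereo(const int16_t *tone, int len, int idx,
                          float bearing_rad, int16_t *left, int16_t *right) {
    int d = itd_delay_samples(bearing_rad) % len;   /* keep the delay in range */
    int delayed = (idx - d + len) % len;
    if (bearing_rad >= 0.0f) {       /* source to the right: left ear hears it later */
        *right = tone[idx];
        *left  = tone[delayed];
    } else {                         /* source to the left: right ear hears it later */
        *left  = tone[idx];
        *right = tone[delayed];
    }
}
```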

It’s a neat build that provides some physical sensory augmentation via the human auditory system. We’ve featured similar projects before, too.

Continue reading “Spatial Audio In A Hat”

Keyboard Hero: A Barebones Alternative To The Guitar Version

Guitar Hero was all the rage for a few years, before the entire world apparently got sick of it overnight. Some diehards still remember the charms of rhythm games, though. Among them you might count [Joseph Valenti] and [Daniel Rodriguez], who built a Keyboard Hero game for their ECE 4760 class at Cornell.

Keyboard Hero differs from Guitar Hero in one fundamental way. Rather than having the player tackle a preset series of “notes,” the game procedurally generates the buttons to press from incoming audio input. It only works with simple single-instrument piano music, but it does indeed work. A Raspberry Pi Pico is charged with analyzing incoming audio and assigning the proper notes. Another Pi Pico generates the VGA video output with the game graphics, which is kept in sync with the audio pumped out from the first Pico so the user can play the notes in time with the music. Rather than a guitar controller, Keyboard Hero relies on five plastic buttons assembled on a piece of wood. It works.
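
The write-up has the real details, but the procedural part boils down to mapping whatever pitch the first Pico detects onto one of the five buttons. Something along these lines would do it; the pitch-detection step and this particular mapping are our own assumptions, not the students’ code:

```c
#include <math.h>

/* Fold a detected fundamental frequency onto one of the five physical
 * buttons. How the pitch is detected (FFT peak, zero crossings, etc.) and
 * this exact folding scheme are illustrative guesses. */
static int frequency_to_button(float freq_hz) {
    if (freq_hz <= 0.0f) return -1;                  /* silence: no note to assign */
    /* Semitone index in MIDI convention (A4 = 440 Hz = note 69) */
    int semitone = (int)lroundf(12.0f * log2f(freq_hz / 440.0f)) + 69;
    return ((semitone % 5) + 5) % 5;                 /* button 0..4 */
}
```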

It’s obviously not as refined as the game that inspired it, but the procedural generation of “notes” reminds us of old-school rhythm game Audiosurf. Video after the break.

Continue reading “Keyboard Hero: A Barebones Alternative To The Guitar Version”

Sand Drawing Table Inspired By Sisyphus

In Greek mythology, Sisyphus was doomed by the gods to roll a boulder uphill for all eternity, only for it to roll back down every time. Inspired by this, [Aidan], [Jorge], and [Henry] decided to build a sand-drawing table that endlessly traces out beautiful patterns (or at least, for as long as power is applied). You can watch it go in the video below.

The project was undertaken as part of the trio’s work for the ECE4760 class at Cornell. A Raspberry Pi Pico runs the show, using TMC2209 drivers to command a pair of NEMA17 stepper motors to drag a magnet around beneath the sand. The build is based around a polar coordinate system, with one stepper motor rotating an arm under the table, and another panning the magnet back and forth along its length. This setup is well-suited to the round sand pit on top of the table, made with a laser-cut wooden ring affixed to a thick base plate.
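
If you’re wondering how a polar machine like this turns a pattern into motor moves, the gist is converting each (r, θ) waypoint into step counts for the two axes. A back-of-the-envelope sketch, with made-up gearing constants and no attempt to model any mechanical coupling between the axes, might look like this:

```c
#include <math.h>

#define PI_F              3.14159265f
#define STEPS_PER_REV     200.0f   /* typical NEMA17 full steps per revolution */
#define MICROSTEPS        16.0f    /* assumed TMC2209 microstepping setting */
#define MM_PER_RADIAL_REV 8.0f     /* assumed radial travel per motor revolution */

/* Convert a polar target (theta in radians, r in millimetres) into absolute
 * step positions for the rotation and radial motors. A real table may also
 * need to compensate for coupling between the two axes; this sketch
 * treats them as independent. */
static void polar_to_steps(float theta_rad, float r_mm,
                           long *theta_steps, long *r_steps) {
    float steps_per_rad = (STEPS_PER_REV * MICROSTEPS) / (2.0f * PI_F);
    float steps_per_mm  = (STEPS_PER_REV * MICROSTEPS) / MM_PER_RADIAL_REV;
    *theta_steps = lroundf(theta_rad * steps_per_rad);
    *r_steps     = lroundf(r_mm * steps_per_mm);
}
```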

The trio does a great job explaining the hardware and software decisions made, as well as showing off how everything works in great detail. If you desire to build a sand table of your own, you would do well to start here. Or, you could explore some of the many other sand table projects we’ve featured over the years.

Continue reading “Sand Drawing Table Inspired By Sisyphus”

Audio Localization Gear Built On The Cheap

Most humans with two ears have a pretty good sense of directional hearing. However, you can build equipment to localize audio sources, too. That’s precisely what [Sam], [Ezra], and [Ari] did for their final project for the ECE 4760 class at Cornell this past spring. It’s an audio localizer!

The project is a real-time audio localizer built on a Raspberry Pi Pico. The Pico is hooked up to three MEMS microphones, which are continuously sampled at 50 kHz thanks to the Pico’s nifty DMA features. Data from each microphone is streamed into a rolling buffer, and when a peak is detected, the software on the Pico runs correlations between channels to determine the difference in the signal’s arrival time at each microphone. Based on this, it’s possible to estimate the location of the sound source relative to the three microphones.
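
The core of the trick is estimating the time difference of arrival (TDOA) between pairs of microphones. A stripped-down illustration of that step, with an assumed search window and none of the team’s optimizations, could look something like this:

```c
#include <stdint.h>

#define FS_HZ   50000    /* sample rate mentioned in the write-up */
#define MAX_LAG 32       /* assumed +/- search window, in samples */

/* Slide one channel against the other and return the lag (in samples) with
 * the largest cross-correlation. A positive result means channel b heard
 * the sound later than channel a. */
static int estimate_tdoa(const int16_t *a, const int16_t *b, int n) {
    int best_lag = 0;
    int64_t best_score = INT64_MIN;
    for (int lag = -MAX_LAG; lag <= MAX_LAG; lag++) {
        int64_t score = 0;
        for (int i = 0; i < n; i++) {
            int j = i + lag;
            if (j < 0 || j >= n) continue;          /* skip samples outside the buffer */
            score += (int64_t)a[i] * (int64_t)b[j];
        }
        if (score > best_score) {
            best_score = score;
            best_lag   = lag;
        }
    }
    return best_lag;     /* time difference in seconds = best_lag / FS_HZ */
}
```

Divide the winning lag by the 50 kHz sample rate and you have a time difference; comparing the pairwise differences across the three microphones is what lets the software narrow the source down to a bearing.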

The team goes into great detail on the project’s development, and does a grand job of explaining the mathematics and digital signal processing involved in this feat. Particularly nice is the heatmap output from the device, which gives a clear visual indication of how the sound is being localized with the three microphones.

We’ve seen similar work before, too, like this project built to track down fireworks launches. Video after the break.

Continue reading “Audio Localization Gear Built On The Cheap”

Meet Cucumber, The Robot Dog

Robots can look like all sorts of things, but they’re often more fun if you make them look like some kind of charming animal. That’s precisely what [Ananya], [Laurence] and [Shao] did when they built Cucumber the Robot Dog for their final project in the ECE 4760 class.

Cucumber is controllable over WiFi, which was simple enough to implement since it’s based around the Raspberry Pi Pico W. With its custom 3D-printed dog-like body, it’s able to move around on four wheels driven by DC gear motors, and it can flex its limbs thanks to servos in its various joints. Ultrasonic sensors let it follow someone with some autonomy, while it can also be driven around manually if so desired. To give it more animal qualities, it can also be posed, or commanded to bark, howl, or growl, with commands issued remotely via a web interface.
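
The “follow someone” behaviour is conceptually simple: keep the measured ultrasonic distance near a target value. As a hand-wavy illustration (the sensor and motor function names below are placeholders, not the project’s actual API), the control loop might boil down to something like this:

```c
/* Placeholder prototypes standing in for the project's real sensor and
 * motor routines; the names and units here are assumptions. */
extern float read_ultrasonic_cm(void);
extern void  drive_forward(void);
extern void  drive_backward(void);
extern void  drive_stop(void);

#define FOLLOW_TARGET_CM   40.0f   /* assumed desired following distance */
#define FOLLOW_DEADBAND_CM 10.0f   /* assumed tolerance band around the target */

/* One pass of a basic "follow me" loop: keep the measured distance near
 * the target, and do nothing while it sits inside the deadband. */
void follow_step(void) {
    float dist_cm = read_ultrasonic_cm();
    if (dist_cm > FOLLOW_TARGET_CM + FOLLOW_DEADBAND_CM) {
        drive_forward();            /* person is pulling away: catch up */
    } else if (dist_cm < FOLLOW_TARGET_CM - FOLLOW_DEADBAND_CM) {
        drive_backward();           /* too close: back off */
    } else {
        drive_stop();               /* comfortable distance: hold position */
    }
}
```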

The level of sophistication is roughly on par with the robot dogs that were so popular in the early 2000s. One suspects it could be pretty decent at playing soccer, too, with the right hands behind the controls. Video after the break.

Continue reading “Meet Cucumber, The Robot Dog”

Tamagotchi Torture Chamber Is Equal Parts Nostalgia And Sadism

Coming in hot from Cornell University, students [Amanda Huang], [Caroline Hohner], and [Rhea Goswami] bring a project that is guaranteed to tickle the funny bone of anyone in the under-40 set, and sadists of all ages: the Tamagotchi Torture Chamber.

[Image: the Tamagotchi Torture Chamber displaying a tombstone. Caption: “He’s dead, Jim.”]

In case you somehow missed it, Bandai’s Tamagotchi is a genre-defining digital pet that was the fad toy at the turn of the millennium, and it has had periodic revivals since. Like the original, this build has three pushbuttons that let you feed, play with, and clean your digital pet. These affect the basic stats of happiness, health, food, and weight in ways that will be familiar to anyone who played with the original Tamagotchi. Just as with the original, mistreatment or neglect causes the Tamagotchi to “die” and display a tombstone on the TFT display.
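
For flavour, here’s a toy sketch of the kind of stat bookkeeping involved; the field names and numbers are our own guesses rather than anything from the students’ code:

```c
/* Toy stat bookkeeping: button presses nudge the pet's stats, stats decay
 * each tick, and letting food or health hit zero is fatal. */
typedef struct {
    int happiness, health, food, weight;
    int alive;
} pet_t;

static void press_feed(pet_t *p)  { p->food += 20; p->weight += 5; }
static void press_play(pet_t *p)  { p->happiness += 15; p->food -= 5; p->weight -= 2; }
static void press_clean(pet_t *p) { p->health += 10; }

/* Called once per game tick: neglect eventually kills the pet. */
static void game_tick(pet_t *p) {
    p->food      -= 1;
    p->happiness -= 1;
    if (p->food <= 0 || p->health <= 0) {
        p->alive = 0;               /* time to draw the tombstone */
    }
}
```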

Where the “Torture Chamber” part comes in is the accelerometer and the soft physics simulation: the soft physics gets an entire core of the Pi Pico at the heart of this build dedicated to it, while the other core handles all inputs, the display, and game logic. That’s what lets you bounce the digital pet off the walls of its digital home with an adorable squish (and a drop in its health stat) by tilting the unit. You can check that out in the demo video below.
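
For the curious, a soft-body step of this flavour can be surprisingly compact: each point on the pet’s outline gets pulled back toward its rest position by a spring, accelerated by the tilt vector read from the accelerometer, and bounced off the walls. The constants and structure below are purely illustrative, not the students’ implementation:

```c
#define DT        0.016f   /* assumed ~60 Hz physics tick */
#define STIFFNESS 40.0f    /* assumed spring constant pulling nodes home */
#define DAMPING   0.98f    /* velocity damping applied every tick */
#define BOUNCE    -0.6f    /* fraction of velocity kept (and reversed) on impact */

typedef struct {
    float x, y;    /* node position in screen pixels */
    float vx, vy;  /* node velocity */
} node_t;

/* Advance one outline node: spring it back toward its rest position, add
 * the accelerometer's tilt vector as gravity, and bounce it off the walls
 * of the pet's home (a w-by-h pixel box). */
static void step_node(node_t *n, float rest_x, float rest_y,
                      float accel_x, float accel_y, float w, float h) {
    float ax = STIFFNESS * (rest_x - n->x) + accel_x;
    float ay = STIFFNESS * (rest_y - n->y) + accel_y;
    n->vx = (n->vx + ax * DT) * DAMPING;
    n->vy = (n->vy + ay * DT) * DAMPING;
    n->x += n->vx * DT;
    n->y += n->vy * DT;
    if (n->x < 0.0f) { n->x = 0.0f; n->vx *= BOUNCE; }
    if (n->x > w)    { n->x = w;    n->vx *= BOUNCE; }
    if (n->y < 0.0f) { n->y = 0.0f; n->vy *= BOUNCE; }
    if (n->y > h)    { n->y = h;    n->vy *= BOUNCE; }
}
```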

Is it overkill for a kids’ toy to have a full soft-body simulation, rather than just a squish-bounce animation? Probably, but for an ECE project, it lets the students show off their chops… and possibly work out some frustrations.

We won’t judge. We will point you to other Tamagotchi-inspired projects, though, like this adorable fitness buddy or this depressingly realistic human version.

If you’ve got an innovative way to torture video game characters, or a project less likely to get you on Skynet’s hitlist, don’t forget to send in a tip!

Continue reading “Tamagotchi Torture Chamber Is Equal Parts Nostalgia And Sadism”