ECE516 Lab2 S.W.I.M Sequential Wave Imprinting Machine

by HOssias



Today we will examine a fundamental question about cameras: how do they measure? It is well known that the photosensors in a camera are light meters arranged in an array, producing data that is converted into the amount of light reflected off an object. However, we want to understand the physics of the camera down to the pixel. We can explore this using a computer-controlled LED strip that moves along a path and outputs various patterns of colored light from each individual LED. This device is called a S.W.I.M and was invented in the 1970s by Steve Mann. In this experiment, we will investigate what a S.W.I.M, or Sequential Wave Imprinting Machine, is and how to build one. Our end goal is to develop a system that can write messages into a long-exposure photograph, or even display them to the naked eye.


Using code from the GitLab repository below, we will learn how to build this system.

https://gitlab.com/HarrisonOssias/ece516-lab02-s.w.i.m

Supplies

System 1

  1. 1x Electronics Breadboard 1x6”
  2. 1x ESP32 Microcontroller (w/ Bluetooth & Wifi)
  3. 1x Adafruit NeoPixel Digital RGB LED Strip
  4. 3x jumper wires
  5. 1x breadboard
  6. 1x 12” wood stick
  7. Personal computer

System 2 (Rotational S.W.I.M)

  1. 1x Electronics Breadboard 1x6”
  2. 1x ESP32 Microcontroller (w/ Bluetooth & Wifi)
  3. 1x Adafruit NeoPixel Digital RGB LED Strip
  4. 3x jumper wires
  5. 1x breadboard
  6. 1x 12” wood stick
  7. 1x Stepper Motor (or servo)
  8. 1x Stepper Motor Driver ( if using stepper)
  9. Personal computer

Design & Prototype

Prototyping.jpg

Prototyping Phase


First, we want to make sure our theory is correct. Using a cheaper LED strip, test the ESP32 pinout and flash performance to make sure the microcontroller will be operational.

Manual S.W.I.M

S.W.I.M ABAB.jpg
S.W.I.M ABC.jpg
Manual swim.jpg

Hardware

  1. Connect the ESP32 to a breadboard
  2. Connect 5V on the ESP32 to the 5V pin on the LED strip
  3. Connect the GND pin on the LED strip to GND on the ESP32
  4. Connect the middle data pin on the LED strip to GPIO27 on the ESP32 (see the test sketch after this list)
  5. Tape the LED strip to a rigid structure, such as a 12" wooden stick or PVC pipe
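
Before moving on to the software, it is worth confirming the wiring with a quick test. The sketch below is only a minimal bring-up example, assuming the Adafruit NeoPixel Arduino library and a 30-LED strip (adjust LED_COUNT to match your hardware); it walks a single red pixel down the strip on GPIO27.

// Minimal bring-up sketch for the NeoPixel strip on GPIO27.
// LED_COUNT is an assumption -- set it to the length of your strip.
#include <Adafruit_NeoPixel.h>

#define LED_PIN   27
#define LED_COUNT 30

Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();            // initialize the NeoPixel driver
  strip.setBrightness(50);  // keep current draw low while testing
  strip.show();             // start with all LEDs off
}

void loop() {
  // Walk a single red pixel down the strip to confirm data reaches every LED.
  for (int i = 0; i < LED_COUNT; i++) {
    strip.clear();
    strip.setPixelColor(i, strip.Color(255, 0, 0));
    strip.show();
    delay(50);
  }
}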


Software

This PyQt6-based GUI application allows users to generate and visualize text, waveforms, and FFT-based images for an LED matrix controlled by an ESP32. The ESP32 HTTP endpoint processes the LED matrix frames and iterates through them. The app supports real-time status updates, color customization, function-based waveform generation, and audio recording for frequency visualization.
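
The exact payload format is defined by the GUI and firmware in the repository; purely as an illustration, the frames could be encoded as nested JSON arrays of RGB triples, one inner triple per LED and one outer list per frame (this schema is an assumption, not the repository's documented format):

// Hypothetical example of a two-frame payload for a three-LED strip: the outer
// array holds frames, and each frame holds one [R, G, B] triple per LED.
const char* examplePayload =
  "[[[255,0,0],[0,255,0],[0,0,255]],"   // frame 1
  " [[0,0,255],[255,0,0],[0,255,0]]]";  // frame 2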

Rotational S.W.I.M

S.W.I.M Rotational.jpg


  1. Connect the ESP32 and LED strip the same way as in the Manual S.W.I.M step
  2. Connect the servo signal pin to GPIO26, its 5V lead to 5V on the ESP32, and its GND lead to GND on the ESP32 (see the sweep test after this list)
  3. Mount the setup on a rigid structure such as a stick
  4. Drill a small hole in the center of the stick
  5. Hot glue the servo arm to the stick
  6. Route the cables so they can handle 180-degree turns
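
To verify the servo wiring before gluing anything down, a short sweep test helps. The sketch below assumes the ESP32Servo Arduino library; the actual firmware may drive the servo differently.

// Servo sweep test on GPIO26 (ESP32Servo library assumed).
#include <ESP32Servo.h>

Servo swimServo;

void setup() {
  swimServo.attach(26);  // servo signal pin wired to GPIO26
}

void loop() {
  // Sweep 0 -> 180 -> 0 degrees to check range of motion and cable routing.
  for (int angle = 0; angle <= 180; angle += 5) {
    swimServo.write(angle);
    delay(50);
  }
  for (int angle = 180; angle >= 0; angle -= 5) {
    swimServo.write(angle);
    delay(50);
  }
}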


Software

Building upon the principles of angular resolution and LED sampling, we now introduce a servo motor to physically rotate the LED array. Instead of mapping the image to a fixed circular display, the servo dynamically rotates the LED strip while we project sequential frames onto it.


Each LED still follows the calculated position based on:

x = C_x + r·cos(θ)
y = C_y + r·sin(θ)


However, instead of updating pixel values in a static matrix, the servo moves the LED strip to different angles while displaying corresponding frame data. This allows us to create rotational persistence of vision (POV) displays, effectively displaying a complete 2D image using a single ring of LEDs.


The rotational speed of the servo determines the refresh rate, ensuring smooth image reproduction. By synchronizing the servo angle with the frame updates, we achieve a continuous and fluid animation, bringing dynamic LED-based image projection to life.
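
In our build, the frame generation is handled by the PyQt application in the repository; the fragment below is only a sketch of the mapping described above, with the image size and LED spacing chosen arbitrarily for illustration.

// Illustrative mapping from (servo angle, LED index) to source-image pixel
// coordinates, following x = C_x + r*cos(theta), y = C_y + r*sin(theta).
// IMG_W, IMG_H, and LED_PITCH are arbitrary example values.
#include <math.h>

const int   IMG_W     = 128;
const int   IMG_H     = 128;
const float CENTER_X  = IMG_W / 2.0f;
const float CENTER_Y  = IMG_H / 2.0f;
const float LED_PITCH = 2.0f;   // image pixels per LED along the strip

void ledToImageXY(int ledIndex, float angleDeg, int* px, int* py) {
  float theta = angleDeg * (float)M_PI / 180.0f;  // degrees -> radians
  float r     = ledIndex * LED_PITCH;             // radial distance of this LED
  *px = (int)roundf(CENTER_X + r * cosf(theta));
  *py = (int)roundf(CENTER_Y + r * sinf(theta));
}

For each servo angle, a frame is built by sampling the source image at these coordinates for every LED index, so the rotating strip sweeps out the full picture over one pass.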

Hardware Explanation & Testing

The hardware for both systems is centered around the ESP32 microcontroller. In this lab we used a WROOM-32 module, but any similar device will suffice. The first model is a manual linear S.W.I.M device. We will use this system to demonstrate the basic principles of a S.W.I.M and explain how the photosensor measures. Afterwards, we will build on this theory to create a rotational S.W.I.M device using a basic H14S servo.

System 1 – Manual S.W.I.M

The hardware setup for the linear S.W.I.M is basic. The LED strip is powered by 5V and controlled using a GPIO pin (refer to figure 2). Using the Adafruit NeoPixel library, we control the LEDs using nested lists containing RGB information (color). The amount of data given to the strip is calculated as follows:

data per frame = (number of LEDs) × 3 bytes, since each LED needs one byte each for red, green, and blue; the total payload is this multiplied by the number of frames.

The firmware running on the ESP32 is written in C and completes three main tasks. First, it starts an HTTP server and opens an asynchronous channel for incoming requests on the local network. The microcontroller then waits for a POST request on the "/update" route. The body of this request is a nested list of RGB values for each LED in each frame of the animation. Finally, the ESP module iterates through the sub-arrays and displays the information at a set interval. For ease of use, we chose to mount this device on a piece of ½" PVC pipe for support. Slowly moving this linear S.W.I.M in one direction across the image plane of a camera shooting a long exposure (shutter open for longer) turns the sequence of LED pulses into a message within a single frame, while still objects appear undisturbed.
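
The repository's firmware is the reference implementation; the condensed sketch below only illustrates the flow described above, assuming the standard ESP32 WebServer and ArduinoJson libraries, the nested-array payload format illustrated earlier, and placeholder Wi-Fi credentials, LED count, and frame interval.

// Simplified manual S.W.I.M firmware flow: join Wi-Fi, accept a POST on
// /update containing nested [[R,G,B], ...] frames, then loop over the frames.
#include <WiFi.h>
#include <WebServer.h>
#include <ArduinoJson.h>
#include <Adafruit_NeoPixel.h>

#define LED_PIN   27
#define LED_COUNT 30   // assumption: number of LEDs on the strip
#define FRAME_MS  40   // assumption: display interval per frame

const char* WIFI_SSID = "your-ssid";       // placeholder credentials
const char* WIFI_PASS = "your-password";

WebServer server(80);
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);
DynamicJsonDocument frames(16384);         // most recently uploaded frame list

// POST /update: the body is the nested list of RGB values described above.
void handleUpdate() {
  DeserializationError err = deserializeJson(frames, server.arg("plain"));
  server.send(err ? 400 : 200, "text/plain", err ? "bad json" : "ok");
}

void setup() {
  strip.begin();
  strip.show();                            // start with the strip dark
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  server.on("/update", HTTP_POST, handleUpdate);
  server.begin();
}

void loop() {
  server.handleClient();                   // pick up new uploads between passes
  for (JsonArray frame : frames.as<JsonArray>()) {
    int i = 0;
    for (JsonArray rgb : frame) {          // one [R, G, B] triple per LED
      strip.setPixelColor(i++, strip.Color(rgb[0].as<int>(),
                                           rgb[1].as<int>(),
                                           rgb[2].as<int>()));
    }
    strip.show();
    delay(FRAME_MS);                       // hold each frame for a set interval
  }
}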


System 2 – Rotational S.W.I.M

The firmware running on the ESP32 is almost identical to that of System 1, with additional steps to control the rotational movement and maintain positional accuracy. Here are the key differences between the systems.

1. Servo-Controlled Rotation

• A servo motor is integrated to control the rotational motion of the LED strip.

• The firmware sends PWM signals to adjust the servo's angle incrementally, ensuring precise frame positioning.

2. Polar Coordinate Mapping for Rotation

• Unlike the linear S.W.I.M., where the LED strip moves in a straight path, the rotational version uses a polar coordinate system to determine LED positions.

• Given an angular step θ and a fixed radius r, each LED's position follows: x = r·cos(θ), y = r·sin(θ)

• The system calculates the required angular displacement to ensure smooth light painting.

3. Structural Stability via Silicon Mounting

• The LED strip is mounted on a rigid silicon base to maintain structural integrity.

When the camera's shutter opens, we start the rotation; when the servo reaches 180° of rotation, we expose the entire scene with a flash and close the shutter.
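
As an illustration of this synchronization, the sketch below steps the servo through 180 degrees while displaying one frame per angle, then pulses a flash-trigger output. The FLASH_PIN GPIO, the showFrameForAngle() placeholder, and all timing values are assumptions made for the example, not part of the original build.

// Simplified rotational playback: advance the servo in small steps, show the
// frame for each angle, then raise a (hypothetical) flash-trigger pin at 180°.
#include <ESP32Servo.h>
#include <Adafruit_NeoPixel.h>

#define LED_PIN   27
#define LED_COUNT 30
#define SERVO_PIN 26
#define FLASH_PIN 25     // assumption: GPIO wired to a flash trigger
#define STEP_DEG   2     // assumption: angular step per frame
#define STEP_MS   15     // assumption: settle time per step

Servo swimServo;
Adafruit_NeoPixel strip(LED_COUNT, LED_PIN, NEO_GRB + NEO_KHZ800);

// Stands in for looking up the frame generated by the host software for this
// angle; here it just lights one LED whose index tracks the angle.
void showFrameForAngle(int angleDeg) {
  strip.clear();
  strip.setPixelColor((angleDeg / STEP_DEG) % LED_COUNT, strip.Color(0, 0, 255));
  strip.show();
}

void setup() {
  pinMode(FLASH_PIN, OUTPUT);
  digitalWrite(FLASH_PIN, LOW);
  strip.begin();
  strip.show();
  swimServo.attach(SERVO_PIN);
  swimServo.write(0);              // start at 0 degrees; open the shutter here
  delay(500);

  for (int angle = 0; angle <= 180; angle += STEP_DEG) {
    swimServo.write(angle);        // rotate the strip to the next angle
    showFrameForAngle(angle);      // display the matching frame
    delay(STEP_MS);
  }

  // At 180 degrees, pulse the flash so the rest of the scene is exposed
  // before the shutter closes.
  digitalWrite(FLASH_PIN, HIGH);
  delay(50);
  digitalWrite(FLASH_PIN, LOW);
}

void loop() {}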

Conclusion

This experiment has successfully shown that S.W.I.M devices can be built from basic components and simple logic, and that their images can be captured with long-exposure photography. To start, System 1 was completed by building a simple linear S.W.I.M using an ESP32 microcontroller, a NeoPixel LED strip, and some jumper wires. System 2 added a servo and a set of formulas to map pixels to their polar coordinates. Both systems were controlled over HTTP using custom software built with PyQt. This software was responsible for data input, pattern creation, and sending data to the physical S.W.I.M device. To conclude, both systems worked as intended and produced clear images. The code for this project can be found in the Instructable or the GitLab link on the title page.