Augmented Reality Eyeglass With Thermal Vision: Build Your Own Low-cost Raspberry Pi EyeTap

by SteveMann in Workshop > 3D Printing

eyetap_lasers.gif
outs.jpg

(The above metavision photograph accurately records the sightfield of the EyeTap Digital Eye Glass).

Build your own low-cost open-source EyeTap eyeglass: the OpenEyeTap project.

Recognize faces, overlay augmented reality content, and more using the visible-light camera, and optionally add a thermal camera as well. Then you can walk around your house and see where the insulation is failing, see where heat's leaking out, see how the furnace airflow is moving through the ductwork and into the house, and see which pipes are going to freeze and burst first; you can even go to a public place and see who's packing heat.

The EyeTap was invented by Steve Mann, and made better by a team of really smart superstar students.

First let's acknowledge the great students that made this work what it is.

Left-to-right: Alex Papanicolaou, Bryan Leung, Cindy Park, Francisco Cendana; Jackson Banbury; Ken Yang; Hao Lu; Sen Yang (link to full resolution portraits). Not pictured: Audrey Hu; Sarang Nerkar; Jack Xie; Kyle Simmons.

The EyeTap has the following functions:

  • Live streaming (lifeglogging);
  • VMP (Visual Memory Prosthetic);
  • PSD (Personal Safety Device), like an automobile "dashcam" or a department store's surveillance camera;
  • Thermal vision: see in complete darkness and find suspects... see who has a concealed weapon;
  • Machine learning to sense emotions (e.g. is the person hiding the gun angry);
  • Many more functions will be added shortly;
  • We hope to build a community of users who can also contribute to the OpenEyeTap project.

Historical notes: The EyeTap wearable computing project dates back to the 1970s and early 1980s and was brought to MIT to found the MIT wearable computing project in the early 1990s (http://wearcam.org/nn.htm).

Here my team of students and I present an open-source EyeTap that you can easily make at home using a 3D printer.

There have been a number of "copycats" making commercial variations of the device, but with significant design flaws (not to mention the lack of an open ethos that would allow the community to correct those design flaws).

There are 3 fundamental principles that an augmented reality glass needs to uphold:

  • Space: the visual content needs to be able to be spatially aligned. This is done by satisfying the collinearity criterion;
  • Time: the visual content needs to be able to be temporally aligned; feedback delayed is feedback denied;
  • Tonality: the visual content needs to be tonally aligned (photoquantigraphic alignment). This is what led to the invention of HDR as a way of helping people see. [Chris Davies, "Quantigraphic camera provides HDR eyesight from Father of AR", SlashGear, 12 Sep. 2012]

The 3 Fundamental Principles of AR: Why the Market Has Failed to Deliver!

EyeTap_principle_IEEE_spectrum8x5.png
img1522crop.jpg

(Above picture: the eye itself is the camera. The Pi camera is mounted to the nosebridge pointing toward the starboard side (my right, or your left, as you face me). You can see the reflection of the camera in the diverter, so it looks like I have a glass eye. The reflected virtual camera is exactly inside the eye, lined up perfectly with the center of the iris of the eye.)

The EyeTap is based on a need to satisfy these 3 principles.

For example, the camera should capture PoE ("Point of Eye") images.

That's why it kind of looks like the wearer has a glass eye when you look at someone wearing the EyeTap. What you're seeing is a reflection of the camera in their eye. That's why people used to call this the "eye glass" or the "glass eye" or just "glass" for short, back in the 1980s and 1990s.

So when aligning everything, we try to make sure these criteria are followed.
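The tonality criterion can be made concrete with a toy example. The sketch below assumes a simple power-law camera response (real EyeTap calibration uses comparametric methods, so the response model and the 4x exposure ratio here are illustrative assumptions): two differently exposed frames are lifted into linear photoquantity space, the longer exposure is normalized by its exposure ratio, and the two estimates are averaged, which is the essence of photoquantigraphic (HDR) alignment.

```python
import numpy as np

# Toy photoquantigraphic (tonal) alignment of two exposures.
# ASSUMPTION: a power-law camera response f(q) = q**(1/gamma); real EyeTap
# calibration is comparametric, so treat this as a sketch only.
GAMMA = 2.2

def to_photoquantity(pixels, gamma=GAMMA):
    """Invert the camera response: image values in [0,1] -> linear light q."""
    return np.asarray(pixels, dtype=float) ** gamma

def fuse_exposures(short_exp, long_exp, ratio, gamma=GAMMA):
    """Fuse two frames where long_exp received `ratio` times more light.
    Both frames are mapped to photoquantities, the long frame is divided
    by the exposure ratio, and the two estimates of q are averaged."""
    q_short = to_photoquantity(short_exp, gamma)
    q_long = to_photoquantity(long_exp, gamma) / ratio
    return 0.5 * (q_short + q_long)

# Example: the same scene photoquantity seen through a 4x exposure ratio.
q_true = np.array([0.01, 0.10, 0.20])
short = q_true ** (1 / GAMMA)         # rendered short exposure
long_ = (q_true * 4) ** (1 / GAMMA)   # rendered long exposure (4x the light)
q_est = fuse_exposures(short, long_, ratio=4)
# q_est recovers q_true, since both exposures agree after normalization
```

In a real system the exposures disagree wherever one frame is clipped or noisy, which is exactly where a weighted fusion extends the visible dynamic range.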

List of Components

PiTapcc.jpg
PiTap_CAD_v2_nan.png

3D printed components:

  • Main frame x 1
  • Display holder assembly x 1
  • Nose piece x 1
  • Computer housing x 1
  • Optics holders x 1
  • Sensor housing x 1 or more

Off-the-shelf components:

The following components can be purchased individually from their manufacturers' websites, or as a bundle from OpenEyeTap.com or other suppliers:

  • Micro display x 1;
  • Beamsplitter ("one-way" or "two-way") mirror x 1 (from which to cut the diverter optics below);
  • Raspberry Pi Zero W x 1 (link);
  • Raspberry Pi Spy Camera x 1 (link);
  • Camera cable conversion board x 1;
  • 28 gauge wire x 1 (link);
  • M2 screws (various lengths) (link).

Laser Cut components:

  • Diverter, beam splitter optics x 1. DXF file can be downloaded using the above link, or you can also purchase pre-cut optics from OpenEyeTap.com

3D Print and Assemble the EyeTap Design

CombinedProcess
ThermalEyeTap_Fusion360_screengrab.png
20180119_135909.jpg
20180119_135854.jpg
BryonEyetap_at_MaRS.jpg

If you like our design as it is, you can simply use the STL models provided in this section, and then assemble the components according to the 3D model (see link below).

At least one of the cameras should line up with the eye so that when you look at yourself in the mirror (or when someone else looks at you) you can see your "glass eye" (the center of projection of the lens of the camera should match exactly with the center of the iris of your eye). This is the EyeTap camera to which other cameras can be coordinate-transformed so that they all operate in EyeTap coordinates.
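The coordinate transform mentioned above can be sketched as follows. For distant scenes (or a calibrated planar target), two rigidly mounted cameras are related by a 3x3 homography; the matrix below is a made-up example, not a real calibration, and the function name is our own.

```python
import numpy as np

# Sketch: re-mapping a secondary camera's pixels into EyeTap coordinates.
# ASSUMPTION: the two views are related by a planar homography H; the
# numbers in H below are illustrative, not from an actual calibration.
def to_eyetap_coords(points, H):
    """Apply homography H to Nx2 pixel coordinates (with perspective divide)."""
    pts = np.asarray(points, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones]) @ H.T   # lift to homogeneous, transform
    return homog[:, :2] / homog[:, 2:3]    # divide out the projective scale

# Illustrative homography: a small scale and shift between the two cameras.
H = np.array([[1.02, 0.00, -12.0],
              [0.00, 1.02,  -8.0],
              [0.00, 0.00,   1.0]])
mapped = to_eyetap_coords([[100.0, 50.0]], H)
# mapped is [[90.0, 43.0]]: (1.02*100 - 12, 1.02*50 - 8)
```

Once every auxiliary camera (thermal included) is mapped this way, all overlays can be composited in a single EyeTap coordinate frame.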

The basic existing 3D print design should make this easy.

Also, if you want to make some changes to the design, this 3D model will also be useful for that purpose: http://a360.co/2CSxaum

The Code for Thermal Camera ... Lifeglogging

The OpenEyeTap project includes thermal camera code for Raspberry Pi.

We are a growing community also developing other code for things like lifeglogging, a wearable face recognizer, the Visual Memory Prosthetic (VMP), etc.

The Livestream module for Open Eyetap enables the streaming of video from the camera attached to the Eyetap to the internet, triggered when the button is pressed.

Open Eyetap Livestream makes use of the FFmpeg video converter to take an input video stream from the camera, obtained using the PiCamera module for Python, and convert it into a stream compatible with a number of popular live-streaming sites, such as YouTube, Facebook, and Twitch. The camera is a standard Pi Camera, connected to a Pi or Pi Zero through the standard camera port. Once connected to a Wi-Fi network with internet access, Open Eyetap Livestream can seamlessly stream video to the live-streaming site of the user's choice.

Technically, Open Eyetap Livestream uses a video source - either Raspivid or a Python app using PiCamera - that is piped to FFmpeg, which performs the conversions necessary for live streaming. FFmpeg is used instead of avconv due to difficulties experienced when using avconv for live streaming to websites. The demonstration case uses a Python script as a wrapper for the video source, obtaining video from the Pi Camera and allowing us to trigger the stream on demand by pressing the button attached to the Raspberry Pi.
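The pipeline above can be sketched in a few lines of Python. This is not the project's exact code: the function names, resolution, frame rate, and 720p settings are our own assumptions, and the RTMP URL must be replaced with the ingest URL and stream key from your streaming site. The Pi camera already produces H.264, so FFmpeg only repackages the stream (no re-encoding).

```python
import subprocess

# Sketch of the Livestream pipeline: the Pi Camera's H.264 output is piped
# into FFmpeg, which wraps it in FLV for an RTMP ingest server (YouTube,
# Facebook, Twitch). Names and parameters here are illustrative assumptions.
def ffmpeg_stream_cmd(rtmp_url, fps=25):
    """Build the FFmpeg command that reads raw H.264 from stdin and
    republishes it as an FLV stream to an RTMP ingest URL."""
    return [
        "ffmpeg",
        "-f", "h264",     # raw H.264 elementary stream on stdin
        "-r", str(fps),   # input frame rate
        "-i", "-",        # read from stdin (the camera pipe)
        "-c:v", "copy",   # no re-encode: the Pi already encodes H.264
        "-f", "flv",      # container expected by RTMP servers
        rtmp_url,
    ]

def stream(rtmp_url, seconds=30):
    """Pipe Pi Camera video into FFmpeg (needs picamera + ffmpeg on the Pi)."""
    import picamera  # available on Raspberry Pi OS
    ffmpeg = subprocess.Popen(ffmpeg_stream_cmd(rtmp_url),
                              stdin=subprocess.PIPE)
    with picamera.PiCamera(resolution=(1280, 720), framerate=25) as cam:
        cam.start_recording(ffmpeg.stdin, format="h264")
        cam.wait_recording(seconds)   # stream for the requested duration
        cam.stop_recording()
    ffmpeg.stdin.close()
    ffmpeg.wait()
```

The push-button trigger would simply call `stream(...)` from a GPIO edge-detect callback.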

Other Applications

upsam_thermocam-output_1511762367.jpg
deleteme.gif
deleteme2.gif
thermocam video 1514687324
thermocam video 1514686078
ece516 lec1thermo

(A concealed weapon hidden under a t-shirt is clearly visible in the infrared: the long strip of flat metal, invisible to regular vision, doesn't emit heat to the same degree that the human body does. Hot meals are visible, and we can see the spectrum of thermal variations at the buffet counter...)

Another useful variation is the thermal EyeTap. Use a "hot mirror" for the diverter. A hot mirror reflects heat and transmits visible light.

In this variation, heat is reflected off the front of the diverter into an infrared thermal camera, and the rays of heat are resynthesized into rays of visible light.
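The "resynthesis" step amounts to mapping each thermal reading to a visible color on the micro display. Below is a minimal sketch under stated assumptions: a low-resolution thermal frame (for example the 24x32 temperature array a sensor like the MLX90640 produces) and a simple blue-to-red palette; the actual OpenEyeTap thermal code and its colormap may differ.

```python
import numpy as np

# Sketch: "resynthesizing rays of heat into rays of visible light" --
# map an array of temperatures (deg C) to false-colour RGB for the display.
# ASSUMPTIONS: a low-res thermal frame (e.g. a 24x32 MLX90640 readout) and
# a simple blue->red ramp, not the project's actual palette.
def thermal_to_rgb(temps, t_min=20.0, t_max=40.0):
    """Normalize temperatures to [0,1] and map cold -> blue, hot -> red."""
    t = np.clip((np.asarray(temps, float) - t_min) / (t_max - t_min), 0, 1)
    rgb = np.empty(t.shape + (3,))
    rgb[..., 0] = t          # red channel grows with temperature
    rgb[..., 1] = 0.0
    rgb[..., 2] = 1.0 - t    # blue channel fades as it warms
    return (rgb * 255).astype(np.uint8)

frame = np.array([[20.0, 30.0],
                  [37.0, 40.0]])   # e.g. body heat around 37 C
img = thermal_to_rgb(frame)
# img[0, 0] is pure blue (coldest pixel); img[1, 1] is pure red (hottest)
```

On the glass, each rendered frame would be upsampled to the display resolution and composited into EyeTap coordinates before being shown to the eye.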

The above examples show:

  • Seeing concealed weapons;
  • Selecting foods from a buffet;
  • Supervising kitchen staff;
  • Selecting a heater from a store that sells heaters (seeing which heater is best);
  • Plumbing repairs: seeing where the pipes are hot and cold, seeing hot and cold water, seeing where pipes might be close to freezing and bursting, etc.;
  • Classroom demonstration of hydraulophone (e.g. consider also visualization of pipe leaks) and Dyson heater.

Although handheld cameras exist for this, wearing the camera is much better. The "WearCam" concept leaves both hands free to fix the plumbing while seeing everything clearly as you work.

Have Fun and Share Your Work With Others...

P1050264crop.jpg
ARinAction_SteveMann1200x.gif

The most important thing is to have fun and share your work with others.

Add some brain-sensing headwear, or maybe a SWIM (Sequential Wave Imprinting Machine) as per previous Instructables.

Help us build a better future of HuMachine Learning and HI (Humanistic Intelligence).