Embaudy



[Video: DESN 586 Final Video]

This project was created as a course assignment at California State University, Long Beach, taught by Dr. Behnaz Farahi: DESN 586 Human Experience and Embodied Interactions Studio.


By Divya Dhavala, Gabe Gutierrez, Afsana Shanta

E-Mood is an interactive design experience that helps you understand your emotions through body language. Using TouchDesigner, we created a playful space for our audience to explore the connection between movement and emotion. Explore how you communicate nonverbally and gain the confidence to express yourself authentically.


Context

The 2018 National College Health Assessment reports that in the past year, 63% of college students surveyed felt overwhelming anxiety, 42% felt so depressed that it was difficult to function, 62% felt very lonely, and 12% seriously considered suicide.

Project Statement

We want to create an interactive public installation that raises awareness about the importance of understanding your emotions and how they can be expressed through the body.

Target Audience

Undergraduate students

Goal

Bring awareness to emotional well-being

Benefit

Research has shown that understanding your emotions and how your body expresses them can help build greater self-awareness, emotional processing, and emotional intelligence.

Value

Building self-awareness, effective emotional processing, and greater emotional intelligence develops foundational life skills that students will carry with them later in life.


Literature Review


Case Studies

MOOD METER

Large-Scale and Long-Term Smile Monitoring System

Mood Meter is a computer vision system designed to measure and visualize the emotional footprint of a community based on smiles. It was deployed on a college campus during a 10-week festival to encourage, recognize, and count smiles at 4 key locations. The Mood Meter project used cameras connected to laptops to capture and process images of human faces. It relied on computer vision technology, specifically OpenCV and the SHORE framework, to analyze facial expressions and predict smile-intensity scores. The system projected live-feed images with overlaid graphics to encourage participation and displayed aggregated smile estimations on a public website.
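The original SHORE-based pipeline isn't public in this write-up, but the core idea can be sketched with OpenCV's stock Haar cascades. Everything below (webcam index, cascade choice, detection thresholds, the crude per-frame tally) is an illustrative assumption, not the Mood Meter team's actual code:

```python
# Minimal smile-counting sketch in the spirit of Mood Meter, using
# OpenCV's bundled Haar cascades instead of the SHORE framework.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

cap = cv2.VideoCapture(0)   # default webcam (assumption)
smile_frames = 0            # crude tally of frames containing a smile

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        # A high minNeighbors keeps only confident smile detections.
        smiles = smile_cascade.detectMultiScale(roi, 1.7, 22)
        color = (0, 255, 0) if len(smiles) else (0, 0, 255)
        if len(smiles):
            smile_frames += 1
        cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
    cv2.imshow("Mood Meter sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

print(f"frames with at least one smile: {smile_frames}")
cap.release()
cv2.destroyAllWindows()
```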


AURA

Audiovisual installation translates emotions into beams of light

This immersive audiovisual installation reinterprets people's emotions as pulsing light compositions; its main intention was to explore light as a medium. Visitors sat on the floor listening to a musical composition that triggered an emotional response. The artist explores how seeing emotions outwardly affects our understanding of ourselves and others, making the invisible aspects of us visible. The piece used biosensors for brainwaves, heart-rate variability, and galvanic skin response.


THE EMOTION LIGHT

An interactive biofeedback sculpture

The Emotion Light is an interactive artwork from 2009 that changes color based on the holder's arousal level, measured through a galvanic skin response (GSR) sensor and a heart-rate monitor. The readings are mapped to the color and pulse speed of the light, accompanied by a relaxing soundscape.
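As a rough illustration of that mapping (not the artist's actual firmware), a normalized arousal reading can drive both hue and pulse rate. The value ranges and the blue-to-red color scheme here are our assumptions:

```python
# Illustrative sketch of the Emotion Light's core mapping: a normalized
# arousal reading (0 = calm, 1 = aroused) drives color and pulse speed.
# Sensor I/O is stubbed out; a real build would read a GSR sensor here.
import colorsys

def arousal_to_light(arousal: float) -> tuple[tuple[int, int, int], float]:
    """Map arousal in [0, 1] to an RGB color and a pulse rate in Hz."""
    arousal = max(0.0, min(1.0, arousal))
    hue = (1.0 - arousal) * 0.66        # blue (calm) -> red (aroused)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    pulse_hz = 0.5 + 2.5 * arousal      # slow glow -> fast pulse
    return (int(r * 255), int(g * 255), int(b * 255)), pulse_hz

print(arousal_to_light(0.1))   # calm:  bluish, ~0.75 Hz
print(arousal_to_light(0.9))   # tense: reddish, ~2.75 Hz
```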

Supplies


Development Tools

TouchDesigner

  • TouchDesigner is a visual programming environment aimed at the creation of multimedia applications. 

MediaPipe

  • A GPU-accelerated, self-contained MediaPipe plugin for TouchDesigner that runs on Mac and PC with no installation.

Figma

  • Prototyping & Testing


Supporting Tutorials Links

  • https://github.com/torinmb/mediapipe-touchdesigner/releases/tag/v0.4.2
  • https://www.youtube.com/watch?v=Cx4Ellaj6kk
  • https://www.youtube.com/watch?v=NnrWjQ_zO-s&t=1193s

Understanding Our Model

[Figures: TouchDesigner network flow diagram and node screenshots]

Using the Image Segmentation feature in MediaPipe, we took the live video input and used the resulting segmentation mask to control how the particles are mapped in TouchDesigner's Particle GPU generator.
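The project wires this up visually with the MediaPipe plugin inside TouchDesigner, so no scripting is required there, but the segmentation step can be reproduced in standalone Python to see what the mask looks like. This is a minimal sketch using MediaPipe's selfie-segmentation solution; the 0.5 threshold and the webcam index are assumptions:

```python
# Standalone sketch of the segmentation step. In the actual project,
# the MediaPipe TouchDesigner plugin produces an equivalent mask TOP
# that drives where the Particle GPU generator emits particles.
import cv2
import mediapipe as mp

seg = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    mask = seg.process(rgb).segmentation_mask   # float mask, person ~1.0
    # Threshold into a binary emission region (0.5 is an assumption).
    emit_region = (mask > 0.5).astype("uint8") * 255
    cv2.imshow("emission mask", emit_region)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```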

Image Segmentation + MediaPipe


Using the Face Tracking and Hand Tracking features in MediaPipe, we used these values to manipulate the color and force in the Particle GPU generator.
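Inside TouchDesigner the plugin exposes these tracking values directly as CHOP channels. The standalone sketch below shows the equivalent idea in Python: reading one hand landmark and turning it into hypothetical force and hue control values. The landmark choice and the mappings are our illustrative assumptions, not the project's exact parameter wiring:

```python
# Sketch: derive control values from MediaPipe hand tracking, e.g. the
# index fingertip position driving particle force and color.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; x/y are normalized to [0, 1].
        tip = results.multi_hand_landmarks[0].landmark[8]
        force = 1.0 - tip.y    # raise hand -> stronger particle force
        hue = tip.x            # move hand sideways -> shift color
        print(f"force={force:.2f} hue={hue:.2f}")
    cv2.imshow("hand tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```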


Visualization


We then output the mapped particles to a blob tracker to add additional visual elements.
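TouchDesigner's Blob Track TOP handles this natively. As a rough standalone analogue, OpenCV's SimpleBlobDetector can find the same kind of bright clusters in a rendered particle frame; the file names and detector thresholds here are illustrative assumptions:

```python
# Standalone analogue of the blob-tracking step: detect bright blobs
# in a rendered particle image and draw them for inspection.
import cv2
import numpy as np

params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea = 50        # ignore single stray particles
params.filterByColor = True
params.blobColor = 255     # track bright blobs on a dark render

detector = cv2.SimpleBlobDetector_create(params)

# 'particles.png' stands in for one frame of the particle render.
frame = cv2.imread("particles.png", cv2.IMREAD_GRAYSCALE)
keypoints = detector.detect(frame)

# Each keypoint's center and size could drive an extra visual element.
annotated = cv2.drawKeypoints(
    frame, keypoints, np.array([]), (0, 0, 255),
    cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
cv2.imwrite("blobs.png", annotated)
```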


Adding a Background Element

We used Figma to create a PNG overlay, which is added to the current interaction as a background element.
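In TouchDesigner this layering is typically done with a Composite TOP. The sketch below shows the equivalent alpha-over blend in plain Python for a Figma-exported PNG; the file names are placeholders, and the overlay is assumed to include an alpha channel:

```python
# Sketch of compositing a Figma-exported PNG (with alpha) over a frame
# of the interaction, equivalent to an "over" blend in a Composite TOP.
import cv2
import numpy as np

overlay = cv2.imread("background_overlay.png", cv2.IMREAD_UNCHANGED)  # BGRA
frame = cv2.imread("interaction_frame.png")                           # BGR

overlay = cv2.resize(overlay, (frame.shape[1], frame.shape[0]))
alpha = overlay[:, :, 3:4].astype(np.float32) / 255.0  # per-pixel alpha

# Standard "over" blend: overlay where alpha is high, frame elsewhere.
composite = (overlay[:, :, :3] * alpha + frame * (1 - alpha)).astype(np.uint8)
cv2.imwrite("composited.png", composite)
```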


Export for Performance