Movie Immersion Using Haptics

by esteban_rivas in Circuits > Arduino


4D.jpg
ps5.png

This is a simple wearable device that provides haptic vibrations to simulate certain sound effects that occur during movies or TV shows, such as gunshots, explosions, impacts, and heartbeats. The device reads data from the video's subtitles to find any effects and renders them at the given time, so that they are synchronized with the actual moment they happen in the footage.


Problem Statement

Movies, TV shows and other types of broadcast media have become very important for the entertainment and culture of today's society. Every year, cinematographers attempt to make new films with state-of-the-art technology and tools such as visual effects and refined sound mixing to create a memorable viewing experience.

One tool that is often used during the screening of such media is subtitles. These provide the viewer with the dialogue of a scene. This is useful, for example, for people who want to watch something in its original language when they do not speak it, or for those who simply cannot follow fast dialogue.

In particular, subtitles are important for people who are deaf or hard of hearing, who otherwise would not be able to follow the plot. Needless to say, allowing this population to enjoy media in this capacity is very important. Moreover, there exist subtitles that contain not only dialogue but also a text version of the audible components of a video. These are called SDH subtitles (subtitles for the d/Deaf and hard of hearing), and they include detailed audio information such as sound effects, music, and speaker names.

Sadly, even with these aids, the experience for this population cannot be the same as the one they would have if they could hear properly. However, one way to improve the immersion is with the use of haptics. Instead of just reading "gunshot" in a subtitle, for example, vibrotactile actuators can generate vibrations at frequencies that resemble those of an actual gunshot. In addition, while this mainly benefits people who are deaf or hard of hearing, the same principle can also improve the experience of regular viewers.


State of the Art

Attempts to improve the viewer's experience are not uncommon nowadays. In fact, many attractions in famous theme parks use special lighting effects, vibrating seats, water sprayers, artificial smells, and other tricks to make the user feel more immersed in the experience. One example is the Shrek 4-D ride at Universal Studios [1]. Such movie theaters also exist outside of theme parks, for instance the 4DX experience from blue Cinema [2]. Lastly, one of the most famous applications of haptics is gaming, which companies like Sony have continued to perfect with their PlayStation controllers [3].

In research, there are a few works that aim to enhance the movie experience through haptic technology. The work of Mazzoni and Bryan-Kinns [4] aims to create a wearable device that uses vibrotactile actuators to enhance the mood of a film score. This means that a different kind of vibration would be used to enrich the emotion that a certain part of a film's soundtrack is trying to convey. Furthermore, this study also emphasizes the relevance of such an application for people who are deaf or hard of hearing. A similar wearable device was made in the work of Lemmens et al. [5], where a jacket containing 64 actuators was used to render feelings of certain characters such as anger, anxiety, and happiness. The idea of this jacket was also used by Dijk et al. [6] to create a more user-friendly interface, in this case a blanket, to render similar effects.

Other works attempt to add user interaction to the scene itself, such as that of O'Modhrain and Oakley [7], who introduce haptic cues during a scene from a cartoon through a haptic actuator that resembles a remote control. The user can then choose to interact with the scene, but without changing its final outcome, similar to some video games.

More complicated setups also exist, such as the one presented by Danieau et al. [8]. They designed a special seat with haptic actuators used to render camera effects such as zoom, vertigo, or instability, which are different and more complicated effects compared to what is usually found in the literature.

Most of these systems require the manual detection of the effects to be rendered. Lee and Choi [9] developed software that automatically detects physical impacts in games and movies by analyzing image sequences and/or audio signals.

Lastly, now outside the context of broadcast media, Tsetserukou et al. [10] presented several wearables for different body parts that enhance emotionally immersive experiences during real-time messaging. These experiences range from heartbeats and shivering to stomach butterflies and the feeling of hugging.

Similarly to [9], this project focuses on simple impact sounds as well as other types of effects, to show the full range of possibilities that haptics offer. The use of subtitles as a way of detecting the right effect and its timing is an alternative not fully explored in the literature. Lastly, the device made here is a simple wearable that demonstrates the functionality; more sophisticated yet user-friendly interfaces, such as the blanket in [6], could show more potential for future work.

Supplies

20240607_130741.jpg

Controller Side

  • Arduino Micro
  • Breadboard and jumper wires
  • USB A to USB micro cable
  • Laptop with Arduino IDE and Python installed
  • Potentiometer
  • Haptic Actuator TacHammer Drake HF (High Frequency)
  • Motor Controller Adafruit DRV2605L
  • Multiplexer Adafruit TCA9548A

Wearable Side

  • Elastic arm sleeve
  • Double sided tape
  • Rubber band
  • Wrist tape

Circuit

Circuit_NoBluetooth.png
Circuit_Bluetooth.png
Circuit.jpeg

The full circuit can be found in the Fritzing file appended below. Note: the TacHammer is represented in the circuit by a Vibration Motor ROB-08449, as it has the same input connections. In this project, the circuit without Bluetooth is used, meaning that the Arduino needs to be directly connected to a computer.

However, there is an option to use Bluetooth, in which case two more components are needed: a Bluetooth HC-05 module and a 9-12 V battery. The circuit diagram for this option is in the same Fritzing file as the other one.


The basic functions of the different components found in the circuit are:

  • Arduino Micro: runs the program that controls all the inputs and outputs for the haptic actuator.
  • Motor Controller: receives commands from the Arduino Micro and sends the corresponding signals to the TacHammer actuator.
  • Multiplexer: allows the Arduino Micro to address more than one motor controller, so that multiple TacHammer actuators can run in parallel.
  • Potentiometer: in this application, it is used to change the amplitude of the signal sent to the TacHammers.
  • HC-05 Module: enables Bluetooth communication between the laptop and the Arduino Micro.
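
As a minimal sketch of how this routing could look in code (assuming, for illustration, the Adafruit_DRV2605 library, the TCA9548A at its default I2C address 0x70, and two TacHammers on channels 0 and 1; the actual project code may differ):

#include <Wire.h>
#include <Adafruit_DRV2605.h>

#define TCA_ADDR 0x70  // default I2C address of the TCA9548A
Adafruit_DRV2605 drv;  // one driver object, re-targeted channel by channel

// Open one of the 8 multiplexer channels so the DRV2605L behind it
// becomes visible on the I2C bus.
void tcaSelect(uint8_t channel) {
  Wire.beginTransmission(TCA_ADDR);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  for (uint8_t ch = 0; ch < 2; ch++) {  // assumed: TacHammers on channels 0 and 1
    tcaSelect(ch);
    drv.begin();
    drv.setMode(DRV2605_MODE_REALTIME);  // amplitude is set directly by the sketch
  }
}

void loop() {}  // effects are rendered elsewhere; see the Code step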

Downloads

Determine the Effects

interstellar.jpg
matrix reloaded.jpg
daredevil.jpg
the shining.jpg
avengers.jpg

In order to showcase the potential of haptics for this application, several sound effects had to be selected. The effects had to be common but different enough to showcase the range of possibilities. To accomplish this, clips from several movies and TV shows were screened until sufficient effects were found.

The effects found and the corresponding sources are:

  • Ticking - Interstellar (2014): in a scene from this movie, there is an audible ticking present for several minutes, which would not be noticeable to someone who is deaf or hard of hearing. The frequency of this ticking has an interesting meaning in the movie: on the fictional planet Miller, each 1.25-second tick is equivalent to one full day on Earth. Being aware of this ticking puts into perspective how important time is during this scene.


  • Gunshots - The Matrix Reloaded (2003): this is a simple but common example of using haptics for immersion. In this scene, the main character Neo is shot at with several machine guns by the villains. Part of this scene is in slow motion, which can also be rendered with haptics.


  • Heartbeat - Daredevil (2015): during this scene, the main character Matt Murdock can tell whether someone is telling the truth by listening to their heartbeat. The heartbeat is audible throughout the scene, a nice touch that would be lost if there were only a subtitle that said [Heartbeat].


  • Impact - The Shining (1980): here Jack Nicholson smashes the door with an axe in the iconic "Here's Johnny" scene. Using haptics here would amplify the tension the scene already offers.


  • Lightning - The Avengers (2012): here Thor hits Iron Man with lightning. The charging and discharging can be rendered separately with different vibrations.


Other possible effects:

  • General rumbling for explosions, earthquakes, structures collapsing, roars, etc.
  • Footsteps.
  • Countdowns.
  • Even more kinds of impact.

Code

arduino logo.png
python logo.png

The Arduino IDE Code


This code first sets up the communication with the multiplexer and the actuator drivers, as well as the potentiometer and Bluetooth. To create haptic effects, certain base functions are used, such as pulse, pause, and vibrate.
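
As an illustration of what pulse and pause could look like internally, here is a minimal sketch assuming the DRV2605L runs in real-time playback (RTP) mode through a global Adafruit_DRV2605 object named drv (the signatures match the heartbeat snippet below; the actual project code may differ):

// pulse(intensity, ms): drive the actuator at a relative intensity (0.0-1.0)
// for the given number of milliseconds, then stop.
void pulse(float intensity, int ms) {
  drv.setRealtimeValue((uint8_t)(intensity * 127));  // 7-bit RTP amplitude
  delay(ms);
  drv.setRealtimeValue(0);  // stop driving the actuator
}

// pause(ms): keep the actuator idle between pulses.
void pause(int ms) {
  delay(ms);
}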

By combining these base functions with specific frequencies and delays, several haptic effects can be created, including the ones mentioned in Step 2. An example of how this is done can be seen in the code snippet below, which is the function for the heartbeat effect. It keeps rendering the two beats of a heartbeat for as long as the program receives the input that the effect should be simulated.

while (start && effect == "heartbeat") {
  pulse(0.3, 45);   // stronger first beat, 45 ms
  pause(112);
  pulse(0.3, 3);    // short second beat, 3 ms
  pause(900);       // rest until the next heartbeat
}

Lastly, even though it was not integrated in this snippet, within each effect function the amplitude of the signal can be multiplied by a scaling factor between 0 (min) and 1 (max), indicating the level of intensity required. This scaling factor depends on the value of the potentiometer.
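
A sketch of how that scaling could be read, assuming the potentiometer wiper is wired to analog pin A0 (the function name is illustrative):

// Map the 10-bit ADC reading (0-1023) to a 0.0-1.0 intensity scale.
float intensityScale() {
  return analogRead(A0) / 1023.0;
}

// Inside an effect function, the base amplitude would then be scaled, e.g.:
// pulse(0.3 * intensityScale(), 45);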


The Python Code

A Python program is used to send a signal to the Arduino indicating which effect to render and at what time. To accomplish this, the code processes the text of a subtitle file to detect any instance of a sound effect and its timestamp.

This is how the code sends a command to the Arduino:

def send_command(command):
  """Send a command to the Arduino."""
  print(f"Sending command: {command}")
  ser.write(f"{command}\n".encode())
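
The snippet above relies on a serial connection (ser), and the processing loop below calls a timestamp parser (timestamp_to_seconds); neither is shown in full. A minimal sketch of both, with the port name and baud rate as assumptions:

import time
import serial  # pyserial

# Open the serial connection to the Arduino (the port name is machine-specific).
ser = serial.Serial('COM3', 9600, timeout=1)

def timestamp_to_seconds(timestamp):
    """Convert an SRT timestamp such as '00:01:18,384' to seconds."""
    hours, minutes, rest = timestamp.split(':')
    seconds, milliseconds = rest.split(',')
    return int(hours) * 3600 + int(minutes) * 60 + int(seconds) + int(milliseconds) / 1000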


This is how the program finds a haptic subtitle and processes it before sending the command to the Arduino:

# Process each pair of lines (timestamp and effect)
for i in range(0, len(lines), 2):
    timestamp_line = lines[i].strip()
    effect_line = lines[i + 1].strip().lower().strip('[]')

    start_timestamp, end_timestamp = timestamp_line.split(' --> ')
    start_seconds = timestamp_to_seconds(start_timestamp)
    end_seconds = timestamp_to_seconds(end_timestamp)

    # Calculate target times based on the current time
    target_start_time = current_time + start_seconds
    target_end_time = current_time + end_seconds

    # Sleep until the start time
    time.sleep(max(0, target_start_time - time.time()))

    # Send start command
    send_command(f"START {effect_line}")

    # Sleep until the end time
    time.sleep(max(0, target_end_time - time.time()))

    # Send stop command
    send_command(f"STOP {effect_line}")

The corresponding movie clip is played automatically as soon as the Python program starts, which also starts a timer upon execution. The code then reads the subtitle file line by line. Once an effect is found, by looking for a line of the form [String], the program waits until the timestamp matches the timer and then sends the effect name as a string to the Arduino. The Arduino reads the input string and compares it to the effects in its library. If there is a match, the function containing the effect is called, sending the corresponding signals to the multiplexer.
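
For illustration, detecting a caption of the form [String] could be done with a short helper like this (a hypothetical function, not taken from the project code):

import re

def is_effect_line(line):
    """Return True for SDH effect captions such as '[Heartbeat]'."""
    return re.fullmatch(r'\[[^\]]+\]', line.strip()) is not None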

Make Wearable

wearable.jpeg

The sleeve is worn on the arm and the breadboard circuit is attached to it using the double-sided tape. The breadboard is oriented so that the TacHammer sits over the wrist.

The wrist tape is wrapped around the wrist to isolate it from the heat the TacHammer generates. However, the tape layer should not be so thick that it absorbs all the vibrations, as this would inhibit the haptic sensation. Lastly, the rubber band is used to fix the position of the TacHammer over the wrist.

Create Video and Subtitles

Haptic Interfaces Project: Enhanced movie sound effect

A video was created to compile and showcase all the previously mentioned scenes. It was edited with Microsoft's Clipchamp, which has an option to automatically create subtitles using AI. Due to the limitations of this project, the sound effects were added to the subtitle file manually by looking at the specific time they appear and their approximate duration. Another limitation is that it is not possible to rewind or fast-forward the video, as the timestamps would no longer match the timer in the code.

Each new entry in the subtitle file has the format (entry number, start --> end time, caption):


18
00:01:18,384 --> 00:01:31,054
[Heartbeat]


For future work, a complete library of all possible effects is needed. Furthermore, the way SDH files are currently created should be modified to include or match the effects from the library. Using AI for the generation of such files is worth considering, as the manual implementation is time-consuming and possibly expensive.

The recording only shows the TacHammer placed on a table instead of the complete wearable. The reason is that the vibration is not visually or audibly noticeable when the wearable is worn.

The video and the subtitle file are both appended.

Downloads

References

[1] “Shrek 4-D at Universal Studios Florida,” Orlando Informer. Accessed: May 30, 2024. [Online]. Available: https://orlandoinformer.com/universal/shrek-4-d/

[2] “4DX - cinema experience for all senses,” blue Cinema. Accessed: May 30, 2024. [Online]. Available: https://bluecinema.ch/en/4dx-en/

[3] “DualSense wireless controller | The innovative new controller for PS5,” PlayStation. Accessed: May 30, 2024. [Online]. Available: https://www.playstation.com/en-fi/accessories/dualsense-wireless-controller/

[4] A. Mazzoni and N. Bryan-Kinns, “Moody: Haptic Sensations to Enhance Mood in Film Music,” in Proceedings of the 2016 ACM Conference Companion Publication on Designing Interactive Systems, in DIS ’16 Companion. New York, NY, USA: Association for Computing Machinery, Jun. 2016, pp. 21–24. doi: 10.1145/2908805.2908811.

[5] P. Lemmens, F. Crompvoets, D. Brokken, J. van den Eerenbeemd, and G.-J. de Vries, “A body-conforming tactile jacket to enrich movie viewing,” in World Haptics 2009 - Third Joint EuroHaptics conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Mar. 2009, pp. 7–12. doi: 10.1109/WHC.2009.4810832.

[6] E. O. Dijk, A. Weffers-Albu, and T. de Zeeuw, “A tactile actuation blanket to intensify movie experiences with personalised tactile effects,” Eindhoven, The Netherlands: Philips Research Eindhoven, p. 2. [Online]. Available: https://www.eskodijk.nl/doc/intetain09.pdf

[7] S. O’Modhrain and I. Oakley, “Touch TV: Adding Feeling to Broadcast Media,” Sugar House Lane, Bellevue, Dublin, Ireland, Apr. 2004, p. 2. doi: 10.1109/HAPTIC.2004.1287211.

[8] F. Danieau, J. Fleureau, P. Guillotel, N. Mollet, M. Christie, and A. Lécuyer, “Toward Haptic Cinematography: Enhancing Movie Experiences with Camera-Based Haptic Effects,” IEEE MultiMedia, vol. 21, no. 2, pp. 11–21, Apr. 2014, doi: 10.1109/MMUL.2013.64.

[9] J. Lee and S. Choi, “Real-time perception-level translation from audio signals to vibrotactile effects,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, in CHI ’13. New York, NY, USA: Association for Computing Machinery, Apr. 2013, pp. 2567–2576. doi: 10.1145/2470654.2481354.

[10] D. Tsetserukou, A. Neviarouskaya, H. Prendinger, N. Kawakami, and S. Tachi, “Affective haptics in emotional communication,” in 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops, Sep. 2009, pp. 1–6. doi: 10.1109/ACII.2009.5349516.