Haptic Feedback System for Needle Alignment in Augmented Reality-guided PCNL Procedure

by aureliedecaluwe in Circuits > Arduino



Schermafbeelding 2025-05-26 004108.png

Kidney stone disease is a growing global health concern, affecting up to 20% of the population, with recurrence rates reaching 40% within 15 years. For stones larger than 2 cm, the gold-standard treatment is Percutaneous Nephrolithotomy (PCNL), a minimally invasive surgical procedure involving a precise renal puncture to create a tract for stone removal. Despite a high success rate of up to 95%, PCNL remains technically challenging, particularly the initial puncture phase, which is widely recognized as the most critical determinant of procedural success [1][2].

This initial step is often performed semi-blind, relying solely on 2D imaging modalities such as ultrasound (US) or fluoroscopy (FS) to guide the needle through complex 3D renal anatomy. As a result, it imposes a steep learning curve and high cognitive demands on surgeons, increasing the risk of complications such as hemorrhage or injury to surrounding organs including the colon, pleura, spleen or liver [3].

To address these challenges, recent years have seen increasing exploration of image-guided navigation systems, ranging from robotic assistance and laser targeting to 3D CT-based planning, electromagnetic tracking (EMT) and augmented reality (AR) systems. These technologies aim to enhance accuracy and reduce operator dependency, however many remain experimental or confined to high-resource settings [4][5].

In my Master’s thesis, I investigated how AR could serve as a more accessible and intuitive tool to support the renal puncture phase of PCNL. Previous studies have shown that AR can project ideal trajectories onto the body surface [6], yet they typically rely on FS for needle verification and lack real-time tracking capabilities. Optical tracking, while promising, is limited by its dependence on a clear line of sight, an impractical constraint in real-world surgical environments, especially when tools are bent or partially obstructed [7][8]. EMT offers a compelling alternative, providing real-time, line-of-sight-independent tracking of surgical instruments. Despite its potential, EMT has seen limited application in PCNL [9][10]. To explore this, I developed a novel AR-guided puncture system integrating EM needle tracking via the Aurora EMT system. A Unity-based application was deployed on the Microsoft HoloLens 2, overlaying a 3D anatomical model and ideal needle trajectory directly onto the patient's body. Real-time needle tracking allowed the virtual representation to reflect the physical needle’s movement, enabling intuitive, FS-free navigation.

A feasibility study was conducted to compare this AR-guided system against conventional US-guided puncture. A custom silicone phantom was created, containing a 3D-printed kidney with an electronic button embedded in the target calyx. Successful needle contact triggered the button, completing the task and allowing for objective evaluation of puncture time, operation duration and number of attempts. Three levels of AR guidance were tested: (1) color cues (green for aligned, red for near-miss), (2) audio feedback, and (3) augmented with real-time angle deviation data and directional arrows.

The results indicated a clear performance improvement across all metrics using AR guidance compared to traditional methods. However, distinctions between the three feedback layers were less pronounced. While layered cues appeared to support user performance, their effectiveness may be limited by cognitive overload or suboptimal design. Some participants reported that excessive visual stimuli interfered with concentration during the puncture task. To address this limitation, we introduced a novel form of non-visual guidance: a wearable haptic feedback wristband. This device incorporated a Drake vibration motor, driven by an Arduino-based controller, to provide tactile cues based on needle alignment. As angular deviation increased, the vibration intensified, as alignment improved, the vibration decreased, ultimately ceasing upon correct needle orientation. This intuitive feedback mechanism reduced reliance on visual input, allowing users to make adjustments instinctively in response to tactile signals. Much like a conditioned reflex, users were motivated to "silence" the vibration by correcting their hand position.

The remainder of this post focuses specifically on the development, implementation and evaluation of this haptic feedback wristband prototype as a complementary guidance modality for AR-assisted PCNL.

Supplies

rrr.jpg

Since this project builds on my Master's thesis work, this section provides a detailed bill of materials needed to replicate the haptic feedback wristband. This post focuses exclusively on the wristband itself: how it was designed, built and integrated. For completeness, a list of all hardware and software components used in the full experimental setup has been included as well. These are not required to build the wristband alone, but are necessary to fully test it in the context of the larger AR-guided PCNL system.


Materials for the wristband prototype:

  1. Arduino Micro (ATMega32U4-based)
  2. DRV2605L haptic driver (e.g., Adafruit breakout board)
  3. Drake actuator
  4. Mini breadboard (for prototyping)
  5. Jumper wires
  6. PCB board (for final version)
  7. Velcro wristband or adjustable strap
  8. Hairband
  9. Basic sewing kit (to attach motor to strap)
  10. Soldering iron and solder
  11. USB cable (for powering and programming the Arduino)

Software used for the wristband prototype:

  1. Unity
  2. Visual Studio
  3. Arduino IDE

Components required for the entire project (for testing purposes):

  1. Microsoft HoloLens 2
  2. Aurora EMT system (NDI)
  3. Needle with EM sensor
  4. Custom phantom model

Methods

Arduino_prototypeCircuit.jpg
solo.jpg
zer.png
bro.png
band3.jpg

The following section outlines the complete process of creating the haptic feedback wristband, along with how communication between the different software platforms, primarily Unity (running on the HoloLens) and the Arduino, was established. Below is an overview of the development workflow, which will be explained in more detail step by step:


Step A) Create the Arduino circuit on a breadboard

Step B) Upload a test script to validate circuit functionality

Step C) Establish serial communication between Unity and Arduino

Step D) Test the setup within the Unity Game environment

Step E) Build and deploy the Unity app to HoloLens

Step F) Transfer the breadboard design to a compact PCB

Step G) Assemble the wearable wristband

Step H) Conduct user testing


Step A) Create the Arduino circuit on a breadboard

To prototype the circuit efficiently, it was first assembled on a breadboard. This setup allows for easy modifications and troubleshooting before committing to a soldered PCB. The components include:

  1. Arduino Micro (ATMega32U4)
  2. DRV2605L haptic driver
  3. Drake vibration motor actuator

The DRV2605L is connected to the Arduino using the I2C protocol, which requires two lines: SDA (data) and SCL (clock). On the Arduino Micro, the hardware I2C peripheral used by the Wire library is mapped to digital pins D2 (SDA) and D3 (SCL) [11], which suited the board layout and the pins available. Power (+5V) and GND were connected from the Arduino to the haptic driver and actuator. Jumper wires and a mini breadboard were used for modular connections during initial testing.


Step B) Upload a test script to validate circuit functionality

Before integrating with Unity, a basic Arduino script was uploaded to verify that the actuator responded correctly to commands from the DRV2605L driver. The test code initialized the motor and triggered a short vibration when powered on. If the motor vibrates correctly, this confirms that all wiring is functional and the driver is properly set up. At this stage, no communication with Unity is involved (attachment test_code_arduino).


Step C) Establish serial communication between Unity and Arduino via Python interface

After confirming hardware functionality, the next step involved establishing communication between Unity (running on the HoloLens) and the Arduino microcontroller in order to drive the haptic feedback motor. This was achieved using a modular communication pipeline consisting of three integrated components:

1) Unity (running on HoloLens) calculates the needle’s angular deviation from the ideal puncture path.

2) A Python interface script runs on a host PC, receiving angle values from Unity over UDP and forwarding them via serial USB.

3) The Arduino receives the angle string through the serial port and triggers the appropriate vibration pattern via the DRV2605L haptic driver.

This layered setup provides a clean division of responsibilities: Unity handles real-time spatial computation and network messaging, the Python script bridges the wireless and wired protocols, and the Arduino translates numerical data into physical feedback.


Angle calculation and UDP transmission in Unity:

Inside Unity, the angle of deviation is computed by comparing the orientation of the virtual needle (registered to the physical needle using electromagnetic tracking) with a predefined ideal path representing the optimal PCNL trajectory. This is done using Unity’s Vector3.Angle function, which outputs the angular difference in degrees.
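Under the hood, Vector3.Angle is the arccosine of the normalized dot product between the two direction vectors [12]. A minimal Python sketch of the same computation (illustrative only; the actual project uses Unity's built-in function):

```python
import math

def angle_between(a, b):
    """Angle in degrees between two 3D vectors (what Vector3.Angle returns)."""
    dot = sum(x * y for x, y in zip(a, b))
    mag_a = math.sqrt(sum(x * x for x in a))
    mag_b = math.sqrt(sum(x * x for x in b))
    # Clamp the cosine to [-1, 1] to guard against floating-point drift.
    cos_theta = max(-1.0, min(1.0, dot / (mag_a * mag_b)))
    return math.degrees(math.acos(cos_theta))
```

For example, the needle direction versus the planned trajectory: perpendicular vectors give 90°, parallel vectors give 0°.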

The resulting angle is then:

  1. Clamped between 0° and 33° to match the operational range of the haptic system.
  2. Formatted as a string with one decimal of precision (e.g., "12.5").
  3. Sent to the host PC using UDP to a predefined IP address and port (typically port 8888).

To avoid redundant communication, Unity only transmits a new angle when the value changes significantly (threshold: 0.05°) and throttles the update rate to once every 0.2 seconds. This ensures responsive yet efficient communication that doesn't overwhelm the serial buffer on the Arduino. This entire process is handled by the Unity script AngleCalculator.cs. In addition to sending the angle, Unity provides visual feedback in the user interface by updating an angle display and directional indicators (arrows) that guide the user toward proper alignment. This information is displayed live during the procedure and hidden when not needed.
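The clamp-format-threshold logic handled by AngleCalculator.cs can be sketched in a few lines. The Python version below mirrors that logic (the function name and parameter defaults are illustrative, not taken from the actual script; the 0.2 s throttle would be a separate timer in Unity's update loop):

```python
def prepare_angle_message(angle_deg, last_sent, min_delta=0.05, max_angle=33.0):
    """Clamp the deviation angle and decide whether it should be (re)sent.

    Returns the formatted string to transmit over UDP, or None if the
    change relative to the last transmitted value is below the threshold.
    """
    # Clamp to the operational range of the haptic system (0-33 degrees).
    clamped = max(0.0, min(max_angle, angle_deg))
    # Skip sending if the value has not changed significantly.
    if last_sent is not None and abs(clamped - last_sent) < min_delta:
        return None
    # One decimal of precision, matching the "12.5"-style message format.
    return f"{clamped:.1f}"
```

A value of 40° would thus be clamped and sent as "33.0", while a 0.02° wiggle around the last sent value produces no message at all.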


UDP to Serial Translation in Python:

The host PC runs a Python script (attachment PythonCode) that serves as the intermediary between Unity and the Arduino. This script listens for UDP messages from Unity, decodes them and relays them to the Arduino over a USB serial connection.

Upon receiving a UDP packet:

  1. The Python script strips whitespace and newline characters.
  2. It appends a newline character (\n) to the message so the Arduino can detect the end of the data.
  3. It writes the result to the serial port (e.g., COM6) at a baud rate of 9600.
  4. A short delay (50 ms) is inserted after each transmission to prevent flooding the Arduino’s input buffer.
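The core of the bridge is only a few lines. The sketch below follows the four steps above (the port names match those mentioned in the text, but `run_bridge` itself is hardware-dependent and shown for illustration only; it requires the pyserial package):

```python
import socket

def frame_message(payload: bytes) -> bytes:
    """Strip surrounding whitespace/newlines, then append a single '\n'
    terminator so the Arduino can detect the end of the message."""
    return payload.strip() + b"\n"

def run_bridge(udp_port=8888, serial_port="COM6", baud=9600):
    """Forward UDP angle strings from Unity to the Arduino's serial port."""
    import time
    import serial  # pyserial; only needed when actually bridging to hardware

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", udp_port))
    with serial.Serial(serial_port, baud) as ser:
        while True:
            data, _addr = sock.recvfrom(64)
            print("received:", data)         # real-time console monitoring
            ser.write(frame_message(data))
            time.sleep(0.05)                  # avoid flooding the input buffer
```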

This interface makes it easy to monitor incoming messages in real time through the console, which is helpful for both debugging and logging purposes. Because Unity communicates over UDP and not serial, the Python script acts as the critical bridge between the network layer and the hardware layer, without introducing noticeable latency.


Angle parsing and haptic feedback in Arduino:

On the Arduino side, a script (attachment Arduino_haptic) continuously monitors the serial port for incoming data. Characters are read one at a time until a newline character is detected, at which point the message is parsed as a float angle value. The Arduino sketch then interprets this angle and maps it to one of several predefined vibration levels using conditional thresholds. Each level corresponds to a vibration waveform stored in the DRV2605L haptic driver. The Arduino selects the correct waveform and initiates playback by triggering the driver's go() function. The driver is set to internal trigger mode, allowing precise control over vibration patterns through I2C commands. The motor is activated by pulsing the INT pin on the driver, which is connected to digital pin 13 (D13) on the Arduino Micro.

The overall effect is a proportional tactile feedback system: the more the needle deviates from the target path, the stronger and more frequent the vibration becomes. When alignment is restored and the deviation angle drops below the 3° threshold, vibration stops entirely, providing a clear and intuitive signal to the user.
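The Arduino-side threshold mapping can be illustrated in Python as well. In the sketch below, only the 3° stop threshold and the 33° ceiling come from the text; the number of discrete levels and the even spacing between thresholds are assumptions for illustration:

```python
def vibration_level(angle_deg, stop_below=3.0, max_angle=33.0, levels=5):
    """Map an angular deviation to a discrete vibration level.

    Level 0 means no vibration (needle aligned); higher levels select
    progressively stronger waveforms on the DRV2605L.
    """
    if angle_deg < stop_below:
        return 0  # aligned: silence the motor entirely
    step = (max_angle - stop_below) / levels
    level = int((min(angle_deg, max_angle) - stop_below) / step) + 1
    return min(level, levels)
```

A deviation of 2° silences the motor, while anything at or above 33° saturates at the strongest level, matching the proportional feedback described above.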


Step D) Test the setup within the Unity Game environment

Before deploying to the HoloLens, the system was first tested directly in the Unity Game view (Play mode) on a PC. This intermediate step serves two key purposes: it verifies that Unity generates and transmits angle data correctly in real time, and it confirms that the Python interface and Arduino firmware receive and respond to that data as expected. Unity debug messages confirmed the angles were being calculated and sent, while the Arduino's onboard LED and serial output (when tested without Unity) verified that the motor vibrated with the appropriate strength and frequency for each received angle.

Within Unity, debug logs are used to track the calculated angle, the clamped value sent to the Arduino, and confirmation that the serial port is active. Testing was done with a wide range of angles to observe how the vibration intensity changed accordingly, verifying both responsiveness and accuracy. Additionally, any UI elements in Unity (e.g., text displays for angles or on-screen alignment indicators) could be observed during this step to ensure they reflected the correct system state. This phase is especially valuable for debugging in a controlled environment before moving on to the next step and deploying the application on the HoloLens.


Step E) Build and deploy the Unity app to HoloLens

Once the Unity application and Arduino communication were fully verified in the Game view, the next step was to build the application for the Microsoft HoloLens 2. Unity projects targeting HoloLens must be configured to build for the Universal Windows Platform (UWP). After selecting UWP as the build target, the project was exported to a Visual Studio solution. Using Visual Studio, the solution was compiled and deployed to the HoloLens via a USB-C cable.

At runtime, the HoloLens runs the Unity application independently, sending angle data to the Python interface script running on a nearby PC. The Arduino remains physically connected via USB to the same PC, and the Python script continues to act as the bridge, forwarding all UDP messages from Unity to the Arduino’s serial port.

This hybrid deployment setup allows:

  1. Unity to run natively on the HoloLens with full spatial tracking and UI rendering.
  2. Real-time haptic feedback to be delivered through a PC-connected Arduino.
  3. Debugging and communication monitoring via the Python console on the PC.

The separation between Unity and the Arduino ensures flexibility during testing and makes the system scalable for future integration into untethered or more distributed AR setups.


Step F) Transfer the breadboard design to a compact PCB

Once the circuit was fully validated on the breadboard, the design was converted into a compact, robust version suitable for wearable use. All components were transferred onto a custom-cut PCB, specifically trimmed to fit comfortably onto the wristband. The layout of the components on the PCB was carefully planned to maintain a flat profile and minimize overlapping wires. The Arduino Micro, DRV2605L haptic driver and Drake actuator were positioned in such a way that connections remained short, organized and unobstructed. Each component was soldered directly onto the board using a fine-tip soldering iron for precision. To promote even heat transfer and ensure strong electrical connections, Lotpaste "E" soldering paste was applied at all solder joints. The paste helped the solder flow evenly during soldering, reducing the risk of damaging sensitive components, and improved the consistency and durability of the connections.

After all components were soldered, wires were routed and anchored securely, and any excess leads or material extending from the board were carefully trimmed to prevent snagging or short circuits. The finished PCB was then thoroughly cleaned using ethanol and a soft brush to remove any residual flux, paste or debris that could interfere with performance or cause corrosion over time. Finally, a round of functional testing was performed on the completed PCB circuit to confirm that communication with Unity was still intact and that the vibration motor responded correctly to a range of test angles. This compact version now served as the final hardware assembly to be integrated into the wearable wristband.


Step G) Assemble the wearable wristband

With the electronics finalized and the PCB circuit completed, the next step was integrating the components into a soft Velcro-based wristband. The PCB was first sewn directly onto the strap, carefully positioned so that the USB connector faces outward (to the right) when worn on the right wrist, keeping the port easily accessible during testing and use. The Drake actuator was intentionally placed at the end of the PCB so that it could be embedded directly under the band. To protect the user's skin from heat generated by the motor during vibration, a thin elastic headband was added beneath the Velcro strap as an insulating layer, preventing direct contact between the actuator and the wrist. To improve skin contact and distribute the vibration over a wider area, two small 3D-printed structures were added beside the motor. Sewing was used for all structural integration, keeping the wristband sturdy yet flexible.

The motor itself was inserted inside this headband, which was folded over and then sewn shut to keep the motor securely enclosed while maximizing comfort and safety. Once the actuator was secured within the headband, the entire headband was then sewn onto the underside of the Velcro wristband, unifying the structure into one wearable band. However, to allow for adjustability across different arm sizes, the back portion of the band was left partially open, preserving flexibility while maintaining the device's structural integrity. This handcrafted approach ensured that the wristband was secure, wearable, skin-safe, and compatible with repeated testing across multiple users.


Step H) Conduct user testing

The completed wristband was then tested with users in the full experimental PCNL setup. Testing procedures and user feedback will be discussed in the next section.

Results & Discussion

r.jpg
nano.jpg
uno.jpg

To evaluate the feasibility and usability of the haptic wristband prototype, a group of 16 users who previously participated in the main AR-guided PCNL study (from the Master's thesis project) were invited to test this new system. Each user was asked to perform a full puncture task while wearing the wristband. For each trial, the following three performance metrics were recorded:

  1. Puncture time
  2. Total operation time
  3. Number of puncture attempts

The graph showcases the average operation time, as this metric revealed the most noticeable differences across the various guidance methods and allowed for a clear performance comparison. As shown, the haptic wristband resulted in the shortest average operation time at 16.4 seconds. This was followed by sound cues (18.32 s), visual cues (18.6 s), and color cues (23.68 s). In contrast, the traditional US-guided approach resulted in a significantly higher operation time of 97.32 seconds. These results suggest that haptic guidance may offer a more efficient and intuitive experience during the puncture task.

Although the participant group for the wristband condition was limited to 16 users, the data for all other guidance methods was based on a much larger group of 47 participants. This difference in sample size means that no statistically grounded conclusions can yet be drawn from the wristband results. To enable a fair and validated comparison, a larger number of puncture attempts would need to be recorded using the haptic system. However, for the sake of illustrating the early findings, the results were still visualized in the graph to give a first impression of how the system performs. Even with this smaller sample size, the consistently fast operation times suggest that the system is both feasible and intuitively usable. Participants were able to rely on the vibration feedback to guide their needle alignment without the need for additional visual or auditory input. Informal feedback further indicated that the system felt natural and did not distract from the primary task. While further testing is necessary to establish the effectiveness of the wristband at scale, the results obtained so far are promising. They highlight the potential for haptic feedback to serve as a simple, low-distraction guidance tool in clinical environments.


Troubleshooting & Development Challenges

Throughout the project, several technical challenges emerged that required major revisions to both hardware and software. The initial goal was to develop a fully wireless wristband using an Arduino Nano ESP32, powered by a portable 9V battery. This setup worked initially, but shortly after unplugging the USB-C cable, the motor failed to respond. Despite repeated efforts, including rewiring, using alternative components, flashing different code versions, and troubleshooting connections, the issue persisted. A similar problem occurred when switching to an Arduino Uno Rev2, where the motor also became unresponsive and failed to vibrate. Each Arduino model required its own libraries and communication scripts, which had to be rewritten and tested separately, increasing development complexity. After extensive debugging and testing, the decision was made to revert to the Arduino Micro, which offered stable USB-powered operation, though without wireless capability.

Even with the Micro, one unexpected issue occurred where the motor suddenly stopped vibrating during operation. The Drake actuator was replaced twice, and the wiring was checked, but the problem remained. Both the Arduino and the DRV2605L haptic driver board had active indicator lights, leading us to suspect the green screw terminal block as the point of failure. After carefully desoldering the component and replacing it with a new terminal, the system resumed normal function, confirming that the fault was due to a hardware connector rather than the motor or logic boards.


Conclusion & Future Work

This project presents a promising first step toward integrating haptic feedback into AR-guided PCNL procedures. While the prototype was successfully tested with a small group of 16 participants, more data is needed before drawing firm conclusions or comparing it directly with other guidance layers tested previously on 47 users. Expanding the number of tests will be essential to validate its effectiveness and generalizability.

As explained in the previous section, the original goal was to develop a fully wireless wristband using the Arduino Nano ESP32 powered by a portable 9V battery. Although this setup functioned briefly, it failed under real testing conditions. Future work should investigate the root cause of this failure in greater depth, and explore alternative wireless microcontrollers that can handle continuous vibration output reliably. Achieving a truly portable system would significantly improve usability and integration in clinical environments.

Despite technical hurdles, the prototype has demonstrated clear potential. Users were able to rely on the vibration cues for intuitive needle alignment, with no reports of distraction or overload. Preliminary testing showed operation times comparable to, or even faster than, other AR-based feedback methods. This suggests that haptic guidance could serve as a simple, low-cognitive-load alternative, particularly valuable in procedures like PCNL where surgeons already face significant cognitive workloads.

The integration of haptic feedback into medical interfaces continues to grow, offering new ways to enhance spatial awareness and precision. This prototype contributes to that evolution by demonstrating a functional, low-cost wristband that can serve as a foundation for more sophisticated and wireless solutions in the future.

References

[1] B. Doré, “Facteurs de risques et prise en charge des complications de la néphrolithotomie percutanée,” Jun. 2006. doi: 10.1016/j.anuro.2006.01.006.

[2] M. Akand et al., “Feasibility of a novel technique using 3-dimensional modeling and augmented reality for access during percutaneous nephrolithotomy in two different ex-vivo models,” Int Urol Nephrol, vol. 51, no. 1, pp. 17–25, Jan. 2019, doi: 10.1007/s11255-018-2037-0.

[3] R. Kachkoul, G. B. Touimi, G. El Mouhri, R. El Habbani, M. Mohim, and A. Lahrichi, “Urolithiasis: History, epidemiology, aetiologic factors and management,” 2023.

[4] D. R. Webb, “Percutaneous Renal Surgery A Practical Clinical Handbook.”

[5] E. Checcucci et al., “3D mixed reality holograms for preoperative surgical planning of nephron-sparing surgery: evaluation of surgeons’ perception,” Minerva Urology and Nephrology, vol. 73, no. 3, pp. 367–375, Jun. 2021, doi: 10.23736/S2724-6051.19.03610-5.

[6] F. Porpiglia et al., “Percutaneous Kidney Puncture with Three-dimensional Mixed-reality Hologram Guidance: From Preoperative Planning to Intraoperative Navigation,” Eur Urol, vol. 81, no. 6, pp. 588–597, Jun. 2022, doi: 10.1016/j.eururo.2021.10.023.

[7] C. Xu et al., “Conventional ultrasonography enabled with augmented reality needle guidance for percutaneous kidney access: An innovative methodologies randomized controlled trial,” International Journal of Surgery, Jan. 2024, doi: 10.1097/js9.0000000000002033.

[8] Y. Gao, C. Qin, B. Tao, J. Hu, Y. Wu, and X. Chen, “An electromagnetic tracking implantation navigation system in dentistry with virtual calibration,” International Journal of Medical Robotics and Computer Assisted Surgery, vol. 17, no. 2, Apr. 2021, doi: 10.1002/rcs.2215.

[9] J. Rassweiler, M. C. Rassweiler, and J. Klein, “New technology in ureteroscopy and percutaneous nephrolithotomy,” Jan. 01, 2016, Lippincott Williams and Wilkins. doi: 10.1097/MOU.0000000000000240.

[10] I. M. Spenkelink, X. Zhu, J. J. Fütterer, and J. F. Langenhuijsen, “Feasibility of stereotactic optical navigation for needle positioning in percutaneous nephrolithotomy,” World J Urol, vol. 42, no. 1, Dec. 2024, doi: 10.1007/s00345-024-04870-0.

[11] Arduino, "Wire library," Arduino Documentation, [Online]. Available: https://docs.arduino.cc/learn/communication/wire/. [Accessed: 26-May-2025].

[12] Unity Technologies, "Vector3.Angle," Unity Scripting API, [Online]. Available: https://docs.unity3d.com/ScriptReference/Vector3.Angle.html. [Accessed: 26-May-2025].

[13] Arduino, "Serial - Communication," Arduino Documentation, [Online]. Available: https://docs.arduino.cc/language-reference/en/functions/communication/serial/. [Accessed: 26-May-2025].


Video

This video presents the background, the development process and the fundamental working principles of our portable haptic feedback wristband. Enjoy!