Haptic Feedback System for Needle Alignment in Augmented Reality-guided PCNL Procedure

by aureliedecaluwe in Circuits > Arduino


Kidney stone disease is a growing global health concern, affecting up to 20% of the population, with recurrence rates reaching 40% within 15 years. For stones larger than 2 cm, the gold-standard treatment is Percutaneous Nephrolithotomy (PCNL), a minimally invasive surgical procedure involving a precise renal puncture to create a tract for stone removal. Despite a high success rate of up to 95%, PCNL remains technically challenging, particularly the initial puncture phase, which is widely recognized as the most critical determinant of procedural success [1][2].

This initial step is often performed semi-blind, relying solely on 2D imaging modalities such as ultrasound (US) or fluoroscopy (FS) to guide the needle through complex 3D renal anatomy. As a result, it imposes a steep learning curve and high cognitive demands on surgeons, increasing the risk of complications such as hemorrhage or injury to surrounding organs including the colon, pleura, spleen or liver [3].

To address these challenges, recent years have seen increasing exploration of image-guided navigation systems, ranging from robotic assistance and laser targeting to 3D CT-based planning, electromagnetic tracking (EMT) and augmented reality (AR) systems. These technologies aim to enhance accuracy and reduce operator dependency; however, many remain experimental or confined to high-resource settings [4][5].

In my Master’s thesis, I investigated how AR could serve as a more accessible and intuitive tool to support the renal puncture phase of PCNL. Previous studies have shown that AR can project ideal trajectories onto the body surface [6], yet they typically rely on FS for needle verification and lack real-time tracking capabilities. Optical tracking, while promising, is limited by its dependence on a clear line of sight, an impractical constraint in real-world surgical environments, especially when tools are bent or partially obstructed [7][8]. EMT offers a compelling alternative, providing real-time, line-of-sight-independent tracking of surgical instruments. Despite its potential, EMT has seen limited application in PCNL [9][10]. To explore this, I developed a novel AR-guided puncture system integrating EM needle tracking via the Aurora EMT system. A Unity-based application was deployed on the Microsoft HoloLens 2, overlaying a 3D anatomical model and ideal needle trajectory directly onto the patient's body. Real-time needle tracking allowed the virtual representation to reflect the physical needle’s movement, enabling intuitive, FS-free navigation.

A feasibility study was conducted to compare this AR-guided system against conventional US-guided puncture. A custom silicone phantom was created, containing a 3D-printed kidney with an electronic button embedded in the target calyx. Successful needle contact triggered the button, completing the task and allowing objective evaluation of puncture time, operation duration and number of attempts. Three levels of AR guidance were tested: (1) color cues (green for aligned, red for near-miss), (2) the addition of audio feedback, and (3) further augmentation with real-time angle-deviation values and directional arrows.

The results indicated a clear performance improvement across all metrics with AR guidance compared to the traditional method. However, distinctions between the three feedback layers were less pronounced. While layered cues appeared to support user performance, their effectiveness may have been limited by cognitive overload or suboptimal design; some participants reported that excessive visual stimuli interfered with concentration during the puncture task. To address this limitation, we introduced a novel form of non-visual guidance: a wearable haptic feedback wristband. This device incorporated a Drake vibration motor, driven by an Arduino-based controller, to provide tactile cues based on needle alignment. As angular deviation increased, the vibration intensified; as alignment improved, it weakened, ceasing entirely once the needle was correctly oriented. This intuitive feedback mechanism reduced reliance on visual input, allowing users to make adjustments instinctively in response to tactile signals. Much like a conditioned reflex, users were motivated to "silence" the vibration by correcting their hand position.

The remainder of this post focuses specifically on the development, implementation and evaluation of this haptic feedback wristband prototype as a complementary guidance modality for AR-assisted PCNL.

Supplies


Since this project builds on my Master's thesis work, this section provides a detailed bill of materials needed to replicate the haptic feedback wristband. This post focuses exclusively on the wristband itself: how it was designed, built and integrated. For completeness, a list of all hardware and software components used in the full experimental setup is included as well. These are not required to build the wristband alone, but are necessary to fully test it in the context of the larger AR-guided PCNL system.


Materials for the wristband prototype:

  1. Arduino Micro (ATMega32U4-based)
  2. DRV2605L haptic driver (e.g., Adafruit breakout board)
  3. Drake actuator
  4. Mini breadboard (for prototyping)
  5. Jumper wires
  6. PCB board (for final version)
  7. Velcro wristband or adjustable strap
  8. Hairband
  9. Basic sewing kit (to attach motor to strap)
  10. Soldering iron and solder
  11. USB cable (for powering and programming the Arduino)

Software used for the wristband prototype:

  1. Unity
  2. Visual Studio
  3. Arduino IDE

Components required for the entire project (for testing purposes):

  1. Microsoft HoloLens 2
  2. Aurora EMT system (NDI)
  3. Needle with EM sensor
  4. Custom phantom model

Methods

[Images: Arduino prototype circuit and wristband assembly photos]

The following section outlines the complete process of creating the haptic feedback wristband, along with how communication was established between the different software platforms, primarily Unity (running on the HoloLens) and the Arduino. Below is an overview of the development workflow, with each step explained in more detail afterwards:


Step A) Create the Arduino circuit on a breadboard

Step B) Upload a test script to validate circuit functionality

Step C) Establish serial communication between Unity and Arduino

Step D) Test the setup within the Unity Game environment

Step E) Build and deploy the Unity app to HoloLens

Step F) Transfer the breadboard design to a compact PCB

Step G) Assemble the wearable wristband

Step H) Conduct user testing


Step A) Create the Arduino circuit on a breadboard

To prototype the circuit efficiently, it was first assembled on a breadboard. This setup allows for easy modifications and troubleshooting before committing to a soldered PCB. The components include:

  1. Arduino Micro (ATMega32U4)
  2. DRV2605L haptic driver
  3. Drake vibration motor actuator

The DRV2605L is connected to the Arduino over the I2C protocol, which requires two lines: SDA (data) and SCL (clock). On the Arduino Micro, the hardware I2C pins are D2 (SDA) and D3 (SCL), which is where the standard Wire library expects them [11], so the driver's SDA and SCL pins were wired to digital pins 2 and 3. Power (+5V) and GND were connected from the Arduino to the haptic driver and actuator, and jumper wires on a mini breadboard kept the connections modular during initial testing.
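Before moving on, it can help to confirm that the driver actually answers on the I2C bus. The sketch below is a generic I2C scanner (not part of the thesis attachments); with the wiring described above, the DRV2605L breakout should report its default address, 0x5A.

```cpp
// Generic I2C scanner: prints the address of every device that acknowledges.
// With the wiring above, the DRV2605L should show up at 0x5A.
#include <Wire.h>

void setup() {
  Wire.begin();
  Serial.begin(9600);
  while (!Serial);                 // wait for the Micro's USB serial port
}

void loop() {
  for (byte addr = 1; addr < 127; addr++) {
    Wire.beginTransmission(addr);
    if (Wire.endTransmission() == 0) {
      Serial.print("Device found at 0x");
      Serial.println(addr, HEX);
    }
  }
  delay(5000);                     // rescan every 5 seconds
}
```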


Step B) Upload a test script to validate circuit functionality

Before integrating with Unity, a basic Arduino script was uploaded to verify that the actuator responded correctly to commands from the DRV2605L driver. The test code initialized the motor and triggered a short vibration when powered on. If the motor vibrates correctly, this confirms that all wiring is functional and the driver is properly set up. At this stage, no communication with Unity is involved (attachment test_code_arduino).
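For reference, a minimal test along these lines can be written with the Adafruit DRV2605 library; the attached test_code_arduino file is the version actually used, and the effect number below is just an example.

```cpp
// Minimal DRV2605L check: plays a short buzz every two seconds if the driver is
// wired correctly. Requires the Adafruit DRV2605 library (Arduino Library Manager).
#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;

void setup() {
  Serial.begin(9600);
  if (!drv.begin()) {                 // initialises I2C and checks the driver responds
    Serial.println("DRV2605L not found - check SDA/SCL wiring");
    while (1);
  }
  drv.selectLibrary(1);               // built-in effect library
  // If the actuator is an LRA rather than an ERM motor, call drv.useLRA()
  // and drv.selectLibrary(6) instead.
  drv.setMode(DRV2605_MODE_INTTRIG);  // playback starts when go() is called
}

void loop() {
  drv.setWaveform(0, 47);             // effect 47: a strong buzz
  drv.setWaveform(1, 0);              // end of waveform sequence
  drv.go();
  delay(2000);
}
```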


Step C) Establish serial communication between Unity and Arduino

After confirming hardware functionality, communication between Unity and Arduino was established using serial communication over USB. This connection enables Unity to continuously send the deviation angle between the end of the virtual needle and the ideal puncture path to Arduino, which then translates this value into the appropriate haptic feedback pattern on the wristband.

In Unity, the angle is calculated by comparing the orientation of the virtual needle (overlaid on the physical needle via EM tracking) with the predefined ideal puncture path that represents the optimal trajectory for PCNL access. Specifically, the Vector3.Angle function is used to measure the angular difference between the upward direction of the needle and the direction of the ideal path [12]. If the needle is tilted away from the path, the angle increases; this angle reflects the degree of deviation that the user must correct during the procedure. Once calculated, the angle is clamped within a defined range, because the deviation is only visualized while the needle is inside the proximity cone surrounding the ideal puncture path. This cone represents the correction zone; any angle beyond it is treated as a large deviation. To keep the vibration from increasing indefinitely, all values above 25° are treated as 25°, which corresponds to the maximum feedback intensity. The value is then converted to a string and transmitted through the serial port using the SerialPort.Write() function: if the needle deviates by 12.5°, for example, Unity sends the string "12.5\n" to the Arduino via USB. The update frequency is set to every 0.05 seconds, ensuring smooth and responsive feedback without overwhelming the serial buffer.
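As an illustration of that Unity-side logic, the simplified C# sketch below computes the deviation angle, clamps it to 25°, and writes it to the serial port every 0.05 s. The object names, COM port and baud rate are placeholders rather than the exact contents of the attached Unity_angleCalculator.cs, and the project's API compatibility level must expose System.IO.Ports.

```csharp
// Simplified sketch of the Unity-side angle sender (names are illustrative).
using System.Globalization;
using System.IO.Ports;
using UnityEngine;

public class AngleSender : MonoBehaviour
{
    public Transform needle;       // virtual needle, driven by EM tracking
    public Transform idealPath;    // predefined ideal puncture trajectory

    SerialPort port = new SerialPort("COM3", 9600);   // port name is setup-specific
    float timer;

    void Start()
    {
        port.Open();
    }

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < 0.05f) return;                    // send at ~20 Hz to avoid flooding the buffer
        timer = 0f;

        // Angular deviation between the needle's up axis and the ideal path direction
        float angle = Vector3.Angle(needle.up, idealPath.forward);
        angle = Mathf.Clamp(angle, 0f, 25f);          // 25° = maximum feedback intensity

        // e.g. "12.5\n"
        port.Write(angle.ToString("F1", CultureInfo.InvariantCulture) + "\n");
    }

    void OnApplicationQuit()
    {
        if (port.IsOpen) port.Close();
    }
}
```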

On the Arduino side, this string is received character by character using the Serial.read() function. The Arduino reads characters until it encounters a newline character (\n), at which point it converts the received string into a float value. This value represents the angle of deviation and is processed immediately. The Arduino code first takes the absolute value of the angle, so that negative and positive deviations are treated equally, and then clamps it to a maximum of 25° to stay within the predefined operational range of the vibration system. The clamped angle is then mapped to specific waveform IDs and vibration intervals using conditional thresholds: a small deviation (e.g., under 3°) results in no vibration, while larger deviations produce stronger and more frequent vibrations. The motor is controlled through the DRV2605L haptic driver, which receives the waveform ID over I2C and plays the corresponding vibration profile stored in its internal effect library. The driver is configured in internal trigger mode, meaning the Arduino selects the desired waveform and starts playback with a go() command over I2C; in this build, the driver's IN/TRIG pin is also wired to digital pin 13 (D13) on the Arduino. This setup removes the need for PWM control and allows precise vibration patterns to be triggered directly over I2C.
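The Arduino-side logic can be summarized with the condensed sketch below. The 3° dead zone and 25° cap match the description above, but the intermediate thresholds, waveform IDs and pause timing are illustrative; the attached hapticfeedback_Ard-Unity.ino contains the values actually used.

```cpp
// Condensed sketch of the Arduino-side firmware. The 3° dead zone and 25° cap
// follow the text above; other thresholds, effect IDs and timings are examples.
#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;
String line = "";

void playFeedback(float angle) {
  if (angle < 3.0) return;                 // well aligned: no vibration
  uint8_t effect;
  if (angle < 10.0)      effect = 1;       // light click
  else if (angle < 18.0) effect = 10;      // medium double click
  else                   effect = 47;      // strong buzz
  drv.setWaveform(0, effect);              // single-slot waveform sequence
  drv.setWaveform(1, 0);                   // end of sequence
  drv.go();                                // start playback (internal trigger mode)
  delay(map((long)angle, 3, 25, 400, 100)); // larger deviation -> shorter pause -> more frequent pulses
}

void setup() {
  Serial.begin(9600);                      // must match the baud rate used in Unity
  drv.begin();
  drv.selectLibrary(1);
  drv.setMode(DRV2605_MODE_INTTRIG);
}

void loop() {
  while (Serial.available() > 0) {
    char c = Serial.read();
    if (c == '\n') {                       // complete value received, e.g. "12.5"
      float angle = fabs(line.toFloat());  // treat positive and negative deviation alike
      if (angle > 25.0) angle = 25.0;      // clamp to the operational range
      playFeedback(angle);
      line = "";
    } else {
      line += c;
    }
  }
}
```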

This setup allows the wristband to provide real-time tactile feedback that intuitively guides the user: the greater the deviation from the ideal path, the stronger the vibration. As the user adjusts the needle and the angle decreases, the vibration weakens, eventually stopping entirely when perfect alignment is achieved.

It is important to note that the Arduino Serial Monitor cannot be open while Unity is communicating with the Arduino. The serial connection can only be used by one program at a time. If the Serial Monitor in the Arduino IDE is open, Unity will not be able to access the port, and the connection will fail. Therefore, for testing, debugging and deployment, ensure the serial port is exclusively controlled by the Unity application [13].

The communication scripts used in this setup are included in the attachments: Unity_angleCalculator.cs (Unity-side C# script) and hapticfeedback_Ard-Unity.ino (Arduino-side firmware). These scripts handle all aspects of the angle computation, transmission, reception and haptic control.


Step D) Test the setup within the Unity Game environment

Before deploying to the HoloLens, the system was first tested in the Unity Game view (Play mode) on a PC. This intermediate step serves two purposes: it verifies that the angle calculations behave correctly in real time, and it confirms that the serial communication between Unity and the Arduino is stable before moving on to the more involved HoloLens build process. Within Unity, debug logs track the calculated angle, the clamped value sent to the Arduino, and whether the serial port is open. On the Arduino side, the Serial Monitor (only usable when Unity is not running) can be opened temporarily to check that values arrive and are interpreted as expected, while the onboard LED and the motor's response confirm that vibration strength and frequency follow the received angle. Testing covered a wide range of angles to verify both responsiveness and accuracy, and any UI elements in Unity (e.g., on-screen angle read-outs or alignment indicators) were checked to make sure they reflected the correct system state. Debugging in this controlled environment is far easier than doing so after the application has been deployed to the HoloLens.


Step E) Build and deploy the Unity app to HoloLens

Once the Unity application and the Arduino communication were fully verified in the Game view, the project was built for the Microsoft HoloLens 2. Unity projects targeting the HoloLens must be built for the Universal Windows Platform (UWP); after selecting UWP as the build target, the project was exported as a Visual Studio solution, which was then compiled and deployed to the HoloLens over a USB-C cable using Visual Studio.

During runtime, the Arduino remains physically connected to the host PC, not to the HoloLens itself. The application running on the HoloLens exchanges angle data with the PC over the Wi-Fi-connected app session, while the actual serial communication with the Arduino is handled from the Unity side on the PC, over the same serial port used during testing. This hybrid setup makes it possible to test the AR guidance system on the HoloLens while the Arduino stays tethered to a computer.


Step F) Transfer the breadboard design to a compact PCB

Once the circuit was fully validated on the breadboard, the design was converted into a compact, robust version suitable for wearable use. All components were transferred onto a custom-cut PCB, trimmed specifically to fit comfortably on the wristband. The layout of the components on the PCB was carefully planned to maintain a flat profile and minimize overlapping wires: the Arduino Micro, DRV2605L haptic driver and Drake actuator were positioned so that connections remained short, organized and unobstructed. Each component was soldered directly onto the board with a fine-tip soldering iron for precision. Lotpaste "E" soldering paste (flux) was applied at all joints to help the solder flow and wet evenly; this kept heating times short, reducing the risk of damaging sensitive components, and improved the consistency and durability of the connections.

After all components were soldered, the wires were routed and anchored securely, and any excess leads or material extending from the board were carefully trimmed to prevent snagging or short circuits. The finished PCB was then thoroughly cleaned with ethanol and a soft brush to remove any residual flux, paste or debris that could interfere with performance or cause corrosion over time. Finally, a round of functional testing was performed on the completed PCB circuit to confirm that communication with Unity was still intact and that the vibration motor responded correctly to a range of test angles. This compact version served as the final hardware assembly to be integrated into the wearable wristband.


Step G) Assemble the wearable wristband

With the electronics finalized and the PCB circuit completed, the components were integrated into a soft, Velcro-based wristband. The PCB was sewn directly onto the strap, positioned so that the USB connector faces outward (to the right) when the band is worn on the right wrist, keeping the port accessible during testing and use. The Drake actuator was deliberately placed at the end of the PCB so that it could be embedded directly under the band. To protect the wearer's skin from heat generated by the motor during vibration, a thin elastic headband was added beneath the Velcro strap as an insulating layer, preventing direct contact between the actuator and the wrist.

The motor itself was inserted into this headband, which was folded over and sewn shut to keep it securely enclosed while maximizing comfort and safety. Two small 3D-printed structures were added beside the motor to improve skin contact and spread the vibration across a wider area. The headband was then sewn onto the underside of the Velcro wristband, unifying the structure into a single wearable band; the back portion was left partially open so the band can be adjusted to different arm sizes without losing structural integrity. Sewing was used for all structural connections, keeping the wristband sturdy yet flexible, skin-safe, and suitable for repeated testing across multiple users.


Step H) Conduct user testing

The completed wristband was then tested with users in the full experimental PCNL setup. Testing procedures and user feedback will be discussed in the next section.

Results & Discussion

[Images: average operation time graph; Arduino Nano ESP32 and Arduino Uno boards]

To evaluate the feasibility and usability of the haptic wristband prototype, 16 users who had previously participated in the main AR-guided PCNL study (from the Master's thesis project) were invited to test the new system. Each user performed a full puncture task while wearing the wristband, and for each trial the following three performance metrics were recorded:

  1. Puncture time
  2. Total operation time
  3. Number of puncture attempts

The graph showcases the average operation time, as this metric revealed the most noticeable differences across the various guidance methods and allowed for a clear performance comparison. As shown, the haptic wristband resulted in the shortest average operation time at 16.4 seconds. This was followed by sound cues (18.32 s), visual cues (18.6 s), and color cues (23.68 s). In contrast, the traditional US-guided approach resulted in a significantly higher operation time of 97.32 seconds. These results suggest that haptic guidance may offer a more efficient and intuitive experience during the puncture task.

Although the participant group for the wristband condition was limited to 16 users, the data for all other guidance methods was based on a much larger group of 47 participants. This difference in sample size means that no statistically grounded conclusions can yet be drawn from the wristband results. To enable a fair and validated comparison, a larger number of puncture attempts would need to be recorded using the haptic system. However, for the sake of illustrating the early findings, the results were still visualized in the graph to give a first impression of how the system performs. Even with this smaller sample size, the consistently fast operation times suggest that the system is both feasible and intuitively usable. Participants were able to rely on the vibration feedback to guide their needle alignment without the need for additional visual or auditory input. Informal feedback further indicated that the system felt natural and did not distract from the primary task. While further testing is necessary to establish the effectiveness of the wristband at scale, the results obtained so far are promising. They highlight the potential for haptic feedback to serve as a simple, low-distraction guidance tool in clinical environments.


Troubleshooting & Development Challenges

Throughout the project, several technical challenges emerged that required major revisions to both hardware and software. The initial goal was to develop a fully wireless wristband using an Arduino Nano ESP32 powered by a portable 9V battery. This setup worked at first, but shortly after the USB-C cable was unplugged, the motor stopped responding. Despite repeated efforts, including rewiring, using alternative components, flashing different code versions and troubleshooting connections, the issue persisted. A similar problem occurred when switching to an Arduino Uno Rev2, where the motor also became unresponsive and failed to vibrate. Each Arduino model required its own libraries and communication scripts, which had to be rewritten and tested separately, increasing development complexity. After extensive debugging and testing, the decision was made to revert to the Arduino Micro, which offered stable USB-powered operation, though without wireless capability.

Even with the Micro, one unexpected issue occurred where the motor suddenly stopped vibrating during operation. The Drake actuator was replaced twice, and the wiring was checked, but the problem remained. Both the Arduino and the DRV2605L haptic driver board had active indicator lights, leading us to suspect the green screw terminal block as the point of failure. After carefully desoldering the component and replacing it with a new terminal, the system resumed normal function, confirming that the fault was due to a hardware connector rather than the motor or logic boards.


Conclusion & Future Work

This project presents a promising first step toward integrating haptic feedback into AR-guided PCNL procedures. While the prototype was successfully tested with a small group of 16 participants, more data is needed before drawing firm conclusions or comparing it directly with other guidance layers tested previously on 47 users. Expanding the number of tests will be essential to validate its effectiveness and generalizability.

As explained in the previous section, the original goal was to develop a fully wireless wristband using the Arduino Nano ESP32 powered by a portable 9V battery. Although this setup functioned briefly, it failed under real testing conditions. Future work should investigate the root cause of this failure in greater depth and explore alternative wireless microcontrollers that can handle continuous vibration output reliably. Achieving a truly portable system would significantly improve usability and integration in clinical environments.

Despite technical hurdles, the prototype has demonstrated clear potential. Users were able to rely on the vibration cues for intuitive needle alignment, with no reports of distraction or overload. Preliminary testing showed operation times comparable to, or even faster than, other AR-based feedback methods. This suggests that haptic guidance could serve as a simple, low-cognitive-load alternative, particularly valuable in procedures like PCNL where surgeons already face significant cognitive workloads.

The integration of haptic feedback into medical interfaces continues to grow, offering new ways to enhance spatial awareness and precision. This prototype contributes to that evolution by demonstrating a functional, low-cost wristband that can serve as a foundation for more sophisticated and wireless solutions in the future.

References

[1] B. Doré, “Facteurs de risques et prise en charge des complications de la néphrolithotomie percutanée,” Jun. 2006. doi: 10.1016/j.anuro.2006.01.006.

[2] M. Akand et al., “Feasibility of a novel technique using 3-dimensional modeling and augmented reality for access during percutaneous nephrolithotomy in two different ex-vivo models,” Int Urol Nephrol, vol. 51, no. 1, pp. 17–25, Jan. 2019, doi: 10.1007/s11255-018-2037-0.

[3] R. Kachkoul, G. B. Touimi, G. El Mouhri, R. El Habbani, M. Mohim, and A. Lahrichi, “Urolithiasis: History, epidemiology, aetiologic factors and management,” 2023.

[4] D. R. Webb, “Percutaneous Renal Surgery A Practical Clinical Handbook.”

[5] E. Checcucci et al., “3D mixed reality holograms for preoperative surgical planning of nephron-sparing surgery: evaluation of surgeons’ perception,” Minerva Urology and Nephrology, vol. 73, no. 3, pp. 367–375, Jun. 2021, doi: 10.23736/S2724-6051.19.03610-5.

[6] F. Porpiglia et al., “Percutaneous Kidney Puncture with Three-dimensional Mixed-reality Hologram Guidance: From Preoperative Planning to Intraoperative Navigation,” Eur Urol, vol. 81, no. 6, pp. 588–597, Jun. 2022, doi: 10.1016/j.eururo.2021.10.023.

[7] C. Xu et al., “Conventional ultrasonography enabled with augmented reality needle guidance for percutaneous kidney access: An innovative methodologies randomized controlled trial,” International Journal of Surgery, Jan. 2024, doi: 10.1097/js9.0000000000002033.

[8] Y. Gao, C. Qin, B. Tao, J. Hu, Y. Wu, and X. Chen, “An electromagnetic tracking implantation navigation system in dentistry with virtual calibration,” International Journal of Medical Robotics and Computer Assisted Surgery, vol. 17, no. 2, Apr. 2021, doi: 10.1002/rcs.2215.

[9] J. Rassweiler, M. C. Rassweiler, and J. Klein, “New technology in ureteroscopy and percutaneous nephrolithotomy,” Jan. 01, 2016, Lippincott Williams and Wilkins. doi: 10.1097/MOU.0000000000000240.

[10] I. M. Spenkelink, X. Zhu, J. J. Fütterer, and J. F. Langenhuijsen, “Feasibility of stereotactic optical navigation for needle positioning in percutaneous nephrolithotomy,” World J Urol, vol. 42, no. 1, Dec. 2024, doi: 10.1007/s00345-024-04870-0.

[11] Arduino, "Wire library," Arduino Documentation, [Online]. Available: https://docs.arduino.cc/learn/communication/wire/. [Accessed: 26-May-2025].

[12] Unity Technologies, "Vector3.Angle," Unity Scripting API, [Online]. Available: https://docs.unity3d.com/ScriptReference/Vector3.Angle.html. [Accessed: 26-May-2025].

[13] Arduino, "Serial - Communication," Arduino Documentation, [Online]. Available: https://docs.arduino.cc/language-reference/en/functions/communication/serial/. [Accessed: 26-May-2025].


Video

This video presents the background, the development process and the fundamental working principles of our portable haptic feedback wristband. Enjoy!