Haptic Feedback Motion Correction Rehabilitation Device

by dfougias in Circuits > Arduino


Haptic Feedback Motion Correction Rehabilitation Device

1000_F_581491994_QbiB8r9l9k2371pPbBZNuaYPVZKK69li.jpg
IMG_0383.jpeg

Anterior cruciate ligament (ACL) injuries are among the most common knee injuries, particularly in athletes and physically active individuals. More than 200,000 people experience ACL injuries in the US annually, with over half of them needing surgical reconstruction [1]. ACL injuries often impair knee functionality and require long rehabilitation processes. Regaining symmetric (equal for both legs) range of motion (ROM) is critical to the rehabilitation of the knee [2]. A key exercise in early-stage ACL rehabilitation is the heel slide, designed to restore range of motion while minimising joint strain [2]. Such exercises are performed multiple times a day and follow progression-based programs with gradual increases in difficulty [1,2]. The need for frequent monitoring, combined with exercises that must be done several times daily, highlights the need for home-based rehabilitation.

Traditional home-based approaches include written schedules (pre-determined progression programs) and video guides [3]. The lack of feedback and of patient-specific, adaptable progression programs leads to poor compliance (i.e. patients not following the exact plan) and reduced quality of life [3]. More recently, tracking via phone cameras has been used and has improved clinical indicators, but without a significant perceived increase in quality of life [3]. Patients take photos of their knee at flexion and extension, the ROM is automatically calculated, and the system indicates when they should return to a doctor [3]. Such a system monitors the patient's progression but gives no real-time feedback on the correctness of the exercise, essentially referring the patient back to the doctor when progress stagnates. Computer vision is an attractive alternative, using phone cameras and markerless tracking to capture motion in 3D [4]. While better than no live tracking at all, such systems are not widely used in biomechanics because of their inaccuracy in 3D motion tracking [4]. Going a step further, wireless Inertial Measurement Units (IMUs) are commonly used to track joint movements and allow accurate real-time tracking, but most algorithms used to quantify an "optimal" path have not been validated for patient populations (i.e. they do not generalise accurately) [5]. In gait analysis, IMUs combined with clustering algorithms can classify the recovery stage of patients [6].

This project introduces a smart rehabilitation system that uses IMUs and haptic feedback to guide patients during heel slide exercises. The system ensures that the knee follows a pre-defined motion by tracking joint orientation in real time and alerting users via vibrations when deviations exceed an allowable range. The immediate haptic feedback allows the patient to correct motions in real time while minimising knee strain. Haptic feedback has shown promise in various medical applications for enhancing proprioceptive awareness and promoting motor learning, making it particularly suited for rehabilitation tasks where small adjustments in motion are necessary [7]. While other systems have employed visual or auditory feedback, haptic feedback is less obtrusive and more easily interpreted during physical activity [7].

For heel slides specifically, the femorotibial angle is used to judge the correctness of the motion. Increased tibial external rotation is associated with reduced knee flexion and higher loads on certain parts of the knee joint [8]. The ideal angle varies from person to person, but studies have shown that patients with lower femorotibial angles (around 4 degrees) are less likely to experience knee pain [9].

This project aims to improve the current state of at-home physical therapy via the use of low-cost sensors, real-time data processing and personalised haptic feedback. Furthermore, all motion data is logged and can be reviewed by clinicians and algorithms remotely. The long-term goal is to enhance the accessibility and effectiveness of physical therapy, potentially extending this system to other joints and exercises.

Supplies

With modified circuitry, the project can be achieved using different types of multiplexers, actuators and sensors. The extended materials list used in this version of the project is attached below, including the exact component names, prices and sources. To avoid soldering and the use of breadboards, emphasis is placed on 4-pin (Stemma QT) connectors.

Arduino Micro: The microcontroller that reads sensor data and controls feedback.

Arduino USB to micro-USB data-link cable: Connects Arduino to the computer.

2 MPU-6050 IMUs: These will capture the orientation and motion of the thigh and shank.

2 LFi Drake Titan Haptics motors and DRV2605L drivers: These provide haptic feedback based on detected motion errors.

Stemma - PCA9548A Multiplexer: Used to switch I²C channels between the IMUs and the haptic motor drivers; has 4-pin connection ports.

4 Stemma 200mm 4-pin connector cables: Allow for direct connection between multiplexer, IMUs, and DRV modules.

Wires, Breadboard: For interconnections and proper signal control.

Adafruit Push Button: Used to reset trials.

Old Sock and Sewing Material: To place components on the knee.

Arduino Connections

IMG_0379.jpeg

The first step in wiring the circuit is to connect the computer to the Arduino using a data-link USB to micro-USB cable. The Arduino now has power and can be connected to further components. To support multiple IMUs and haptic motors, 4-wire Stemma QT (I²C) connectors were utilised alongside a multiplexer. Using female-to-male jumper wires, the following connections were made from the Arduino to the multiplexer: 5V to VIN, GND to GND, digital pin 2 to SDA, and digital pin 3 to SCL. Subsequently, four 4-wire connections are made to the multiplexer to connect the haptic motors and the IMUs. The IMUs can be directly connected, as illustrated in the image, while the haptic motors require extra wiring, as explained in Step 2.
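
To make the channel switching concrete, the following is a minimal sketch of how a PCA9548A channel can be selected over I²C. The default multiplexer address 0x70 and the example channel numbers are assumptions for illustration; the attached script may assign channels differently.

#include <Wire.h>

const uint8_t MUX_ADDR = 0x70;   // default I2C address of the PCA9548A

// Enable exactly one of the eight multiplexer channels (0-7).
void selectMuxChannel(uint8_t channel) {
  if (channel > 7) return;
  Wire.beginTransmission(MUX_ADDR);
  Wire.write(1 << channel);      // one bit per channel
  Wire.endTransmission();
}

void setup() {
  Wire.begin();                  // hardware I2C: SDA on pin 2, SCL on pin 3 of the Micro
  Serial.begin(115200);
}

void loop() {
  selectMuxChannel(0);           // e.g. talk to the first IMU
  // ... read that IMU here ...
  selectMuxChannel(2);           // e.g. talk to one of the DRV2605L drivers
  // ... command that driver here ...
  delay(200);
}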

Haptic Motor Connections

IMG_0380.jpeg

To enable precise and convenient control of the haptic motors, DRV modules are employed. These modules are connected to the two unused 4-wire connections from the previous step, which are routed through the multiplexer. To connect the haptic motors to the DRV modules without soldering, female jumper wires are used. Each jumper wire is connected to the + and – terminals of the DRV module. The free ends of the jumper wires are then twisted together with the corresponding wires from the haptic motors. To ensure safety and prevent short circuits, electrical tape is applied to insulate the exposed connections. The setup is pictured above.
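
As a rough sketch of how one of these drivers can be commanded from the Arduino, the snippet below uses the Adafruit DRV2605 library to play a built-in effect on a motor sitting behind the multiplexer. The multiplexer channel (2) and the effect number are assumptions for illustration, not values from the attached code.

#include <Wire.h>
#include <Adafruit_DRV2605.h>

Adafruit_DRV2605 drv;
const uint8_t MUX_ADDR = 0x70;

void selectMuxChannel(uint8_t channel) {
  Wire.beginTransmission(MUX_ADDR);
  Wire.write(1 << channel);
  Wire.endTransmission();
}

void setup() {
  Wire.begin();
  selectMuxChannel(2);        // route I2C traffic to this DRV2605L (assumed channel)
  drv.begin();
  drv.selectLibrary(1);       // one of the built-in effect libraries
  drv.setMode(DRV2605_MODE_INTTRIG);
}

void loop() {
  selectMuxChannel(2);
  drv.setWaveform(0, 47);     // effect 47: a strong buzz
  drv.setWaveform(1, 0);      // end of the waveform sequence
  drv.go();                   // play it
  delay(1000);
}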

Button Connection

schematic.png
IMG_0385.JPG

Once the IMUs and the haptic motors are connected to the circuit, a push button is added to allow for more control of the circuit. A typical switch connection for Arduinos is pictured in the figure on the left, and the figure on the right shows the setup used here. The push button has four connections, as it is both a switch and an LED. The LED has an internal resistor and is therefore connected directly to 5V and ground. For the switch contacts, 5V passes through the switch to a 10 kΩ pull-down resistor to ground, with a branch to pin 7 of the Arduino carrying the switch signal, similar to the figure on the left.
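
With this wiring, pin 7 reads LOW at rest and HIGH while the button is pressed. A minimal sketch of how the button can be read, assuming the external pull-down described above, is shown below; the "RESET" message is just an illustrative marker for the Python script.

const int BUTTON_PIN = 7;
bool lastState = LOW;

void setup() {
  pinMode(BUTTON_PIN, INPUT);   // external 10 kΩ pull-down, so plain INPUT
  Serial.begin(115200);
}

void loop() {
  bool state = digitalRead(BUTTON_PIN);
  if (state == HIGH && lastState == LOW) {
    Serial.println("RESET");    // rising edge: flag a trial reset over serial
  }
  lastState = state;
  delay(20);                    // crude debounce
}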

Sensor Placement

IMG_0386.jpeg
IMG_0387.jpeg

Correct sensor placement is important when evaluating the performance of the system. An old sock with a hole cut in one side becomes a knee wearable. The multiplexer, DRVs and IMUs can be sewn onto the sock as seen in the picture above. The position of the IMU is particularly important, as it should point along the tibia, as pictured in the figure on the left. The haptic motors can be held in place by the sock (between sock and leg), as seen in the figure on the right. Additionally, the sock can be adjusted so that the Y angle reads 0 at rest, or the Y angle can be zeroed in the code itself, to ensure that the resting position sits between the thresholds.
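
The software zeroing can be done by averaging the Y angle while the leg is at rest and subtracting that offset from later readings. The sketch below illustrates the idea; readYAngle() is a hypothetical placeholder for the filtered angle computed in the attached script.

float yOffset = 0.0f;

float readYAngle() {
  // Placeholder: in the real script this would return the filtered IMU Y angle.
  return 0.0f;
}

void calibrateRestPosition() {
  float sum = 0.0f;
  const int samples = 100;
  for (int i = 0; i < samples; i++) {   // average ~1 s of readings at rest
    sum += readYAngle();
    delay(10);
  }
  yOffset = sum / samples;              // the resting angle becomes the new zero
}

float zeroedYAngle() {
  return readYAngle() - yOffset;        // angle relative to the rest position
}

void setup() {
  calibrateRestPosition();              // run once while the leg is extended and at rest
}

void loop() {}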

Coding

Screenshot 2025-05-28 074813.jpg
Figure_1.png

Two sets of code are used to extract and visualise the data. A C++ script running on the Arduino reads the sensor data, analyses it, and activates the actuators accordingly. Meanwhile, a custom Python script communicates with the microcontroller over a serial connection and receives the real-time motion data for evaluation and visualisation. Both scripts are attached, and the code is explained as follows:

Arduino C++ script:

The setup begins by initialising the I²C bus and configuring all connected devices. The DRV modules are initially placed on standby for start-up, and each IMU is initialised and calibrated to eliminate gyroscopic bias. During the main loop, the system cycles through the multiplexer channels to read from the connected sensors and send commands to the actuators. A complementary filter is applied to the gyroscope and accelerometer data to estimate angular orientation over time; Atoir et al. show that complementary filters can effectively track hand motions [10], which are similar to the motion tracked here. Orientation angles are logged every 200 milliseconds via the Serial monitor.
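
The core of the complementary filter is a single update step that blends the integrated gyroscope rate with the accelerometer-derived angle. The function below is a sketch of that step; the weight of 0.98 is a typical value, not necessarily the one used in the attached script.

float angle = 0.0f;          // fused orientation estimate, in degrees
const float ALPHA = 0.98f;   // trust the gyro short-term, the accelerometer long-term

// gyroRate: angular velocity from the gyroscope (deg/s)
// accelAngle: angle derived from the accelerometer (deg)
// dt: time since the last update (s)
float updateAngle(float gyroRate, float accelAngle, float dt) {
  // Integrate the gyro over dt, then pull the result gently toward the
  // accelerometer angle to cancel gyroscopic drift.
  angle = ALPHA * (angle + gyroRate * dt) + (1.0f - ALPHA) * accelAngle;
  return angle;
}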

The threshold angle for corrective feedback can be changed depending on the performance of the user. The X and Y angles of the IMU are used to track the patient's motion: X monitors the angle of the femur relative to the ground, while Y measures the rotation of the femur. When the Y values deviate beyond the given thresholds, the system determines the direction of the deviation and triggers the haptic motor on that side to vibrate. The intensity of the vibration is scaled according to how far the knee deviates from the intended posture, effectively delivering a directional cue to "push" the knee back toward the desired angle.
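
The decision logic can be summarised as: no vibration inside the allowed band, and a drive level proportional to the overshoot otherwise. The sketch below illustrates this; the threshold value, the 30-degree full-intensity point and the driveMotor() helper are assumptions for illustration (in practice the level would be sent to the DRV2605L on the appropriate multiplexer channel, e.g. via setRealtimeValue()).

const float THRESHOLD_DEG = 12.0f;   // allowable Y deviation, adjusted per session
const float MAX_DEV_DEG   = 30.0f;   // deviation mapped to full vibration intensity

// Hypothetical helper: selects the motor's multiplexer channel and sends the
// drive level (0-127) to its DRV2605L.
void driveMotor(bool leftSide, uint8_t level) {
  // ... select channel, then e.g. drv.setRealtimeValue(level) ...
}

void applyFeedback(float yAngle) {
  float overshoot = fabs(yAngle) - THRESHOLD_DEG;
  if (overshoot <= 0) {               // inside the allowed band: both motors off
    driveMotor(true, 0);
    driveMotor(false, 0);
    return;
  }
  // Scale the overshoot beyond the threshold to a 0-127 drive level.
  uint8_t level = (uint8_t)constrain(
      overshoot / (MAX_DEV_DEG - THRESHOLD_DEG) * 127.0f, 0.0f, 127.0f);
  bool leftSide = (yAngle < 0);       // cue on the side the knee drifts toward
  driveMotor(leftSide, level);        // vibrate that side...
  driveMotor(!leftSide, 0);           // ...and keep the other side off
}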

Python Script:

To evaluate and visualise the motion data collected from the IMUs, the Python script receives the real-time data over the serial connection, including the x and y orientation angles from both IMUs. Based on the x angle, the script detects when an attempt starts, and the push button can be used to reset between attempts.

Results

download (23).png

To evaluate the repeatability of sensor placement and potential movement-induced displacement, angle values were recorded before and after each exercise, starting from a standardised position (leg extended, toes pointing upward). The gyro indicated minimal drift, suggesting the sensor maintained its orientation throughout the motion.

Participants found the real-time haptic feedback intuitive and helpful. Most reported that the vibration clearly communicated the need for a compensatory adjustment and its direction. The integration of haptic cues allowed users to focus on the movement itself rather than relying on visual feedback from a screen. By dynamically adjusting the vibration thresholds and intensities, the feedback could be personalised to suit each participant’s progression.

One participant, Michael, demonstrated clear learning across sessions. His performance was assessed over three 5-trial sessions, and his average deviation from the target Y = 0° was evaluated over time and within specified angle bands. As his control improved, the vibration thresholds were tightened from 12 degrees in session one to 10 degrees in session two and 8 degrees in session three, to accommodate and challenge his enhanced performance. This progression is visualised in the table above.

Discussion

The results from this study show that the proposed IMU and haptic feedback system can provide valuable real-time corrective feedback during heel slide exercises. Even with a sock as the wearable, the system showed high repeatability, with minimal drift observed before and after exercises across repeated trials (±2°). Additionally, the haptic feedback mechanism was found to be intuitive and impactful; participants highlighted that they were able to focus on the movement much better than when using visual cues alone. Michael specifically showed significant improvement across three 5-trial sessions, reducing his average deviation from the target path as seen in the results, which indicates that the system can facilitate motor learning. Furthermore, Michael's thresholds were adapted between sessions to account for his rapid improvement. The gradual adjustment of vibration thresholds to match his improving control highlights the system's potential for personalisation in rehabilitation, adapting to a patient's progress and offering dynamic difficulty levels. One interesting observation from Michael's performance was that, as the threshold was lowered, he improved in the regions where he was close to the threshold (high inclination angles) but showed little progress at the lower angles. This suggests that thresholds could be varied based on the inclination angle rather than set globally, for better overall performance.

Despite promising results, there are several limitations to this study. Firstly, the IMUs were placed under controlled conditions, with consistent alignment ensured by the experimenters. In real-world settings, particularly when patients apply the system themselves, misalignment could degrade accuracy and repeatability. Secondly, the study involved a limited number of participants, with no statistical validation of effectiveness (i.e. future studies with larger and more diverse samples are necessary). Additionally, no long-term retention data were gathered; it is not clear whether Michael's performance would be retained once the haptic feedback is removed. This direction can be explored through A/B testing (comparing performance with vibration feedback enabled versus disabled in follow-up trials).

The current project is a prototype demonstrating a proof of concept. The use of jumper wires, breadboards and a computer in close proximity is not viable for commercial at-home use. A transition towards a PCB-based design and wireless components would significantly improve the ergonomics through miniaturisation and the removal of external wiring. In terms of system intelligence, the thresholds are currently adapted intuitively, but the potential for data analytics and adaptive models is clear. Logging exercise data over time could allow physicians to track progress, detect regressions, and tailor rehabilitation plans, while adaptive threshold-tuning algorithms could automate the vibration thresholds according to the user's performance. A more ambitious extension could involve predictive feedback, where vibrations occur before deviations happen, based on the patient's historical data, enabling proactive rather than reactive guidance. From a signal processing standpoint, while the complementary filter used here offers simplicity and real-time performance, it may lack robustness under dynamic conditions. Kalman filters or extended Kalman filters could provide better sensor fusion for dynamic motion, especially when fusing accelerometer, gyroscope, and magnetometer data. As the system expands to handle more joints or faster, multi-planar motion, implementing a Kalman filter could improve orientation estimation fidelity.

Overall, this study presents a promising step toward intelligent, wearable, and user-friendly rehabilitation aids. Future work will focus on improving system usability, increasing sample size, testing a broader range of rehabilitation exercises, and advancing the software and hardware for full clinical deployment.

Conclusion

This project demonstrates the potential of IMUs and haptic feedback for at-home rehabilitation devices. The participants in this study highlighted that the use of haptic feedback allowed them to focus on the motion, and improvements were observed between trials. Systems like this one are critical for exercises that need to be done multiple times a day, such as heel slides, and cover a current gap in care, as patients cannot consult physiotherapists multiple times a day. The ultimate goal is to bridge the gap between clinical supervision and at-home recovery. While promising, the current prototype is limited by the consistency of sensor placement, by participant testing that was limited and lacked statistical significance, and by the lack of long-term performance evaluation. To further assess the viability and ergonomics of such systems, clinical trials are necessary, along with better hardware integration (wireless operation, a PCB, etc.). With further development, such systems show strong potential to enhance rehabilitation through personalised, engaging, and data-driven feedback.

References

[1] J. T. Cavanaugh and M. Powers, ‘ACL Rehabilitation Progression: Where Are We Now?’, Curr. Rev. Musculoskelet. Med., vol. 10, no. 3, pp. 289–296, Sep. 2017, doi: 10.1007/s12178-017-9426-3.

[2] A. Biggs, W. L. Jenkins, S. E. Urch, and K. D. Shelbourne, ‘Rehabilitation for Patients Following ACL Reconstruction: A Knee Symmetry Model’, North Am. J. Sports Phys. Ther. NAJSPT, vol. 4, no. 1, pp. 2–12, Feb. 2009.

[3] Y. Guo, D. Li, Y. Wu, X. Sun, X. Sun, and Y. Yang, ‘Mobile health-based home rehabilitation education improving early outcomes after anterior cruciate ligament reconstruction: A randomized controlled clinical trial’, Front. Public Health, vol. 10, Jan. 2023, doi: 10.3389/fpubh.2022.1042167.

[4] T. Hellsten, J. Karlsson, M. Shamsuzzaman, and G. Pulkkis, ‘The Potential of Computer Vision-Based Marker-Less Human Motion Analysis for Rehabilitation’, Rehabil. Process Outcome, vol. 10, p. 11795727211022330, Jan. 2021, doi: 10.1177/11795727211022330.

[5] S. Krishnakumar, B.-J. F. van Beijnum, C. T. M. Baten, P. H. Veltink, and J. H. Buurke, ‘Estimation of Kinetics Using IMUs to Monitor and Aid in Clinical Decision-Making during ACL Rehabilitation: A Systematic Review’, Sensors, vol. 24, no. 7, Art. no. 7, Jan. 2024, doi: 10.3390/s24072163.

[6] S. M. N. Arosha Senanayake, O. A. Malik, Pg. M. Iskandar, and D. Zaheer, ‘A knowledge-based intelligent framework for anterior cruciate ligament rehabilitation monitoring’, Appl. Soft Comput., vol. 20, pp. 127–141, Jul. 2014, doi: 10.1016/j.asoc.2013.11.010.

[7] N. Jafari, K. D. Adams, and M. Tavakoli, ‘Haptics to improve task performance in people with disabilities: A review of previous studies and a guide to future research with children with disabilities’, J. Rehabil. Assist. Technol. Eng., vol. 3, p. 2055668316668147, Oct. 2016, doi: 10.1177/2055668316668147.

[8] C. Huang et al., ‘The association between tibial torsion, knee flexion excursion and foot progression during gait in people with knee osteoarthritis: a cross-sectional study’, BMC Sports Sci. Med. Rehabil., vol. 15, p. 110, Sep. 2023, doi: 10.1186/s13102-023-00726-z.

[9] G. B. Salsich and W. H. Perman, ‘Tibiofemoral and patellofemoral mechanics are altered at small knee flexion angles in people with patellofemoral pain’, J. Sci. Med. Sport, vol. 16, no. 1, pp. 13–17, Jan. 2013, doi: 10.1016/j.jsams.2012.04.003.

[10] F. Z. A. Atoir, A. G. Putrada, and R. R. Pahlevi, ‘An Evaluation of Complementary Filter Method in Increasing the Performance of Motion Tracking Gloves for Virtual Reality Games’, Kinet. Game Technol. Inf. Syst. Comput. Netw. Comput. Electron. Control, May 2021, doi: 10.22219/kinetik.v6i2.1234.

Video

A short reel, summarising why the device is used and demonstrating its use, can be found at: Haptic Interface.