Group for Interventional Robotic and Imaging Systems (IRIS)
Soft Robotics
Intelligent Shape Decoding of a Soft Optical Waveguide Sensor
Optical waveguides create interesting opportunities in soft sensing and electronic skins due to their potential for high flexibility, fast response, and compactness. The loss or change of light intensity inside a waveguide can be measured and converted into useful sensing feedback such as strain or shape. We utilize simple light-emitting diodes (LEDs) and photodetectors (PDs), combined with an intelligent shape-decoding framework, to enable 3D shape sensing of a self-contained flexible substrate. Multiphysics FEA is leveraged to optimize the LED/PD layout and to enrich the ground-truth data from sparse to dense points for model training. The mapping from light intensities to the overall sensor shape is achieved with an autoregression-based model that accounts for temporal continuity and spatial locality. The sensing framework was evaluated on an A5-sized flexible sensor prototype and a fish-shaped prototype, where sensing accuracy (RMSE = 0.27 mm) and repeatability (Δ light intensity < 0.31% over 1000 cycles) were tested underwater.
[Video: Intelligent Shape Decoding of a Soft Optical Waveguide Sensor]
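Below is a minimal Python sketch of the autoregressive decoding idea, assuming 16 PD channels and a 100-node surface mesh (illustrative values, not the published implementation): the previous shape estimate is fed back with the current intensities so temporal continuity is enforced, and the placeholder data stands in for the FEA-enriched ground truth.

```python
# Illustrative sketch only: an autoregressive intensity-to-shape regressor.
# Dimensions, data, and the regressor choice are assumptions, not the paper's.
import numpy as np
from sklearn.neural_network import MLPRegressor

n_pd, n_nodes = 16, 100                     # assumed: 16 PD channels, 100 mesh nodes
rng = np.random.default_rng(0)

# Placeholder training sequence; the real framework uses FEA-enriched ground truth.
T = 500
intensities = rng.random((T, n_pd))
shapes = rng.random((T, 3 * n_nodes))       # flattened (x, y, z) node coordinates

# Autoregressive feature vector: [current PD intensities, previous shape].
X = np.hstack([intensities[1:], shapes[:-1]])
y = shapes[1:]
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=100).fit(X, y)

# At run time, roll the previous prediction forward for temporal continuity.
prev_shape = shapes[0]
for t in range(1, 10):
    x_t = np.hstack([intensities[t], prev_shape])[None, :]
    prev_shape = model.predict(x_t)[0]
```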
A Tensegrity Joint for Low-Inertia, Compact, and Compliant Soft Manipulators
Tensegrity, whose structural integrity is maintained by tension, involves no static or sliding friction among its rigid components. However, this mechanical stability is very susceptible to actuation errors, and it requires complex kinematics modeling and a sophisticated control model with sensing feedback. Herein, a low-inertia tensegrity joint that is covered/protected by a fiber Bragg grating (FBG)-embedded silicone sheath is proposed, with the aim of reinforcing joint motion stability and enabling self-contained sensing feedback. A learning-based closed-loop controller is also designed and trained with proper joint configurations selected by a two-step sampling method, such that both the kinematics and static equilibria of these configurations are well satisfied. Experiments demonstrate that the joint can follow 2D paths accurately, and that its stiffness can be varied against external/impulsive disturbances, allowing the joint to provide both compliant interaction with humans and controllable motions for manipulation tasks.
[Video: A Tensegrity Joint for Low-Inertia, Compact, and Compliant Soft Manipulators]
Learning-Based Visual-Strain Fusion for Eye-in-Hand Continuum Robot Pose Estimation and Control
We fuse visual information with the sparse strain data collected from a single-core fiber inscribed with fiber Bragg gratings (FBGs) to facilitate continuum robot pose estimation. An improved extreme learning machine algorithm with selective training-data updates is implemented to establish and refine the FBG-empowered (F-emp) pose estimator online. Integrating F-emp pose estimation improves sensing robustness by reducing the number of times visual tracking is lost under moving visual obstacles and varying lighting; in particular, it resolves pose-estimation failures under full occlusion of the tracked features or complete darkness. Utilizing the fused pose feedback, a hybrid controller incorporating kinematics and data-driven algorithms is proposed to accomplish fast convergence with high accuracy. The online-learning error compensator improves target-tracking performance with a 52.3%–90.1% error reduction compared with constant-curvature model-based control, without requiring fine model-parameter tuning or prior data acquisition.
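To make the online-refinement idea concrete, here is a generic OS-ELM-style sketch under assumed dimensions (not the paper's improved algorithm or its selective-update rule): a fixed random hidden layer plus recursive least-squares updates of the output weights, so new FBG-strain/pose pairs can refine the estimator on the fly.

```python
# Generic online extreme learning machine sketch (assumptions, not the paper's code).
import numpy as np

class OnlineELM:
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_in, n_hidden))   # fixed random input weights
        self.b = rng.normal(size=n_hidden)
        self.beta = np.zeros((n_hidden, n_out))      # trainable output weights
        self.P = np.eye(n_hidden) * 1e3              # inverse covariance for RLS updates

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def update(self, X, Y):
        """Incrementally refine output weights with a new batch (X, Y)."""
        H = self._hidden(X)
        S = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        K = self.P @ H.T @ S
        self.beta += K @ (Y - H @ self.beta)
        self.P -= K @ H @ self.P

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Usage with placeholder data: 8 assumed FBG strain channels -> 6-DoF pose.
elm = OnlineELM(n_in=8, n_hidden=100, n_out=6)
rng = np.random.default_rng(1)
strain, pose = rng.random((50, 8)), rng.random((50, 6))
elm.update(strain, pose)     # a selective-update rule would filter which samples enter here
print(elm.predict(strain[:1]))
```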
Large-scale Surface Shape Sensing with Learning-based Computational Mechanics
In a step toward robots with proprioception, we propose a flexible sensor framework that incorporates a novel hybrid modeling strategy, taking advantage of both computational mechanics and machine learning. We implement the framework on a large, thin, and flexible sensor that transforms sparsely distributed strains into a continuous surface shape. Finite element (FE) analysis is utilized to determine the sensor design parameters, while an FE model is built to enrich the morphological data used in supervised training, enabling continuous surface reconstruction. A mapping between the local strain data and the enriched surface data is then trained using ensemble learning. This hybrid approach enables real-time, robust, and high-order surface shape reconstruction, which had not previously been demonstrated on a sensor of this scale (A4 paper size).
[Video: Large-scale surface shape sensing with learning-based computational mechanics]
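As a rough illustration of the strain-to-shape mapping (assumed sensor layout and placeholder data standing in for the FE-enriched training set, not the published model), a multi-output ensemble regressor can map sparse strain channels to a dense surface height field:

```python
# Illustrative sketch: ensemble regression from sparse strains to a dense surface.
# Channel counts, grid size, and the random placeholder data are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

n_strain, grid = 20, (20, 30)                      # assumed: 20 strain channels, 20x30 grid
rng = np.random.default_rng(0)

# Placeholder for FE-simulated training pairs (sparse strains -> full surface field).
strains = rng.random((400, n_strain))
surfaces = rng.random((400, grid[0] * grid[1]))    # flattened height field

model = RandomForestRegressor(n_estimators=30, n_jobs=-1)
model.fit(strains, surfaces)

# Reconstruct a continuous surface from one new strain reading.
surface = model.predict(strains[:1]).reshape(grid)
```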
Interfacing Soft and Hard: A Spring Reinforced Actuator
Muscular hydrostats have long been a source of inspiration for soft robotic designs. With their inherent compliance, they excel in unpredictable environments and can gently manipulate objects with ease. However, their performance falls short where high force or a fast dynamic response is needed. In this project, we propose a novel spring reinforced actuator (SRA) that explores the intermediate state between muscular hydrostats and endoskeletal mechanisms. The result is a dramatic enhancement of the robot's dynamic performance, unprecedented in similar soft robots, while retaining compliant omni-directional bending. Analytical modelling of the flexible backbone was developed and experimentally validated. This is also the first attempt to perform detailed finite element analysis (FEA) to investigate the stress-strain behavior of the constraining braided bellow tube; the braided interweaving threads are modeled explicitly, wherein complex thread-to-thread contact occurs. Experimental evaluation of the SRAs was performed for actuation force, stiffness, and dynamic response. We showcase the enhanced actuator performance in several applications, such as locomotion and heavy object manipulation.
[Video: Interfacing Soft and Hard: A Spring Reinforced Actuator.]
[Video: Finite element analysis of braided tube compressed into a bellow sheath]
Real-time Surface Shape Sensing for Soft and Flexible Structures
We present a new soft and flexible sensor that can reconstruct its surface shape in real time. A single-core optical fiber with fiber Bragg gratings (FBGs) detects sparse local strains at high bandwidth using wavelength-division multiplexing (WDM). The fiber is embedded into an elastomeric substrate so that the substrate's global surface morphology can be reconstructed. Finite element analysis (FEA) was used to determine the design parameters and to validate the unique mapping from sparse strain measurements to the continuum shape of the sensor. To simplify fabrication and error compensation without precise/prior knowledge of the FBG locations in the sensor, machine learning-based modelling was applied, enabling real-time, robust, and reliable shape reconstruction. The sensor is demonstrated to outperform various electronics-based sensors, which require sophisticated electrode wiring and noise reduction.
[Video: Real-time Surface Shape Sensing for Soft and Flexible Structures.]
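For reference, the conversion from Bragg wavelength shift to local strain that precedes shape reconstruction follows the standard FBG relation. The sketch below uses a typical photo-elastic coefficient for silica fiber and ignores temperature cross-sensitivity; the paper's own calibration constants are not reproduced here.

```python
# Standard FBG wavelength-shift-to-strain relation (typical silica values; sketch only).
def fbg_strain(wavelength_nm, base_wavelength_nm, p_e=0.22):
    """Approximate axial strain from a Bragg wavelength shift.

    epsilon = (lambda - lambda_0) / (lambda_0 * (1 - p_e)),
    where p_e is the effective photo-elastic coefficient (~0.22 for silica fiber).
    Temperature cross-sensitivity is ignored in this sketch.
    """
    return (wavelength_nm - base_wavelength_nm) / (base_wavelength_nm * (1.0 - p_e))

# Example: a 0.1 nm shift on a 1550 nm grating is roughly 83 microstrain.
print(fbg_strain(1550.1, 1550.0) * 1e6)
```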
Vision-based Online Learning Kinematic Control for Soft Robots
Soft robots, owing to their elastomeric materials, ensure safe interaction with their surroundings. This compliance inevitably imposes a trade-off against precise motion control, for which conventional model-based methods have been proposed to approximate the robot kinematics. However, many parameters regarding robot deformation and external disturbance are highly nonlinear and difficult, if not impossible, to obtain. Sensors self-contained in the robot are therefore required to compensate for modelling uncertainties and external disturbances; a camera (eye) integrated at the robot end-effector (hand) is a common setting. We are investigating an eye-in-hand visual servo that incorporates a learning-based controller to accomplish more precise robotic tasks. Our work is the first to demonstrate vision-based path following for a hyper-elastic robot under heavy variable loading (up to 105% of the robot weight). The enhanced accuracy and adaptability can bring new opportunities to minimally invasive surgical applications, such as soft robotic endoscopy and laparoscopy, which can take advantage of existing camera feedback.
[Video: Vision-based Online Learning Kinematic Control for Soft Robots.]
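The published controller uses local Gaussian process regression (see the publication list); the sketch below instead illustrates the general model-free idea with a simpler Broyden-style online update of the image Jacobian. Dimensions, gains, and the linear toy plant are assumptions for demonstration only.

```python
# Generic stand-in for model-free, vision-based kinematic control (not the paper's method):
# estimate the actuation-to-image-feature mapping online and invert it to servo to a target.
import numpy as np

n_act, n_feat = 3, 2                   # assumed: 3 actuators, 2 tracked image features
J = np.eye(n_feat, n_act)              # rough initial image Jacobian guess

def broyden_update(J, dq, ds, lam=0.5):
    """Rank-one update of the Jacobian from observed actuation/feature changes."""
    denom = dq @ dq
    if denom > 1e-9:
        J = J + lam * np.outer(ds - J @ dq, dq) / denom
    return J

def control_step(J, feature_error, gain=0.3):
    """Least-squares step that drives the feature error toward zero."""
    return -gain * np.linalg.pinv(J) @ feature_error

# Toy usage: servo simulated features s = A q toward a target.
A = np.array([[1.0, 0.2, -0.3], [0.1, 0.8, 0.4]])
q, target = np.zeros(n_act), np.array([0.5, -0.2])
s = A @ q
for _ in range(50):
    dq = control_step(J, s - target)
    s_new = A @ (q + dq)
    J = broyden_update(J, dq, s_new - s)
    q, s = q + dq, s_new
print(np.round(s - target, 4))         # residual error should approach zero
```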
Soft Robot Visual Servoing Enhanced with Sparse Strain Measurement
In the feature/object tracking of eye-in-hand visual servoing, 2D motion estimation relying only on image-plane feedback is easily affected by occlusion, blurring, or poor lighting; for the commonly used template matching method, tracking performance depends greatly on image quality. Fiber Bragg gratings (FBGs), a type of high-frequency flexible strain sensor, can serve as an assistive sensing modality for soft robot control. We propose a method to enhance motion estimation in soft robotic visual servoing by fusing the results from template matching and FBG feedback. Sparse strain measurements from the single-core FBG fiber can be trained to act as an independent motion sensor and combined with image processing to improve tracking accuracy. Path-following performance is validated in a simulated laparoscopic scene and a LEGO-constructed scene, demonstrating significant improvement in feature tracking, even under external forces. The enhanced estimation method increased tracking accuracy by 82.3% when capturing the dark and feature-deficient liver surface, and it maintained stable performance under simulated endoscopic irrigation.
[Video: Soft Robot Visual Servoing Enhanced with Sparse Strain Measurement.]
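A minimal sketch of the fusion idea follows (an illustrative weighting rule, not the published one): blend the vision-based and FBG-based displacement estimates according to the template-matching confidence, so the strain channel takes over when the image degrades.

```python
# Illustrative confidence-weighted fusion of vision- and FBG-derived motion estimates.
import numpy as np

def fuse_motion(d_vision, d_fbg, match_score, score_threshold=0.6):
    """Return a fused 2D displacement.

    match_score: normalized template-matching correlation in [0, 1];
    below the threshold the vision estimate is treated as unreliable.
    """
    w = np.clip((match_score - score_threshold) / (1.0 - score_threshold), 0.0, 1.0)
    return w * np.asarray(d_vision) + (1.0 - w) * np.asarray(d_fbg)

# Example: poor image quality (score 0.3) -> rely mostly on the FBG estimate.
print(fuse_motion([5.0, 1.0], [4.2, 0.8], match_score=0.3))
```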
Online Learning Control for Effective Endoscopic Navigation
Bio-inspired robotic structures composed of soft actuation units have attracted increasing research interest. Taking advantage of their inherent compliance, soft robots can ensure safe interaction with external environments, provided that precise and effective manipulation can be achieved. However, previous model-based control approaches often require simplified geometric assumptions about the soft manipulator, which can be very inaccurate in the presence of unmodeled external interaction forces. We are investigating model-free control methods that do not require prior knowledge of the robot's structural parameters. A generic control framework based on a nonparametric online learning technique is developed, in which the inverse model is acquired directly. As a result, a soft continuum robot can precisely follow a 3D trajectory, even under dynamic external disturbances (e.g., pushing). Such enhanced control accuracy and adaptability would facilitate effective manipulation in complex and changing environments, such as endoscopic navigation. Furthermore, finite element analysis (FEA) using advanced element formulations is adopted to study the dynamic response under different materials, actuation methods, and external loadings. Based on the FEA realization, not only can the geometric and structural design parameters be optimized, but the simulation data can also be used to initialize the model-free control policy, eliminating the need for random exploration in the robot's workspace.
[Video: Online learning control for effective endoscopic navigation.]
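The sketch below conveys the nonparametric, model-free flavor with a locally weighted inverse mapping learned online from observed actuation/displacement pairs. It is a generic illustration under toy dimensions and a toy plant, not the exact algorithm reported in the paper.

```python
# Generic sketch of a nonparametric, model-free inverse mapping learned online.
import numpy as np

class LocalInverseModel:
    def __init__(self, bandwidth=0.05):
        self.dx, self.dq, self.h = [], [], bandwidth

    def add(self, dx, dq):
        """Record one observed pair: tip displacement dx caused by actuation change dq."""
        self.dx.append(np.asarray(dx)); self.dq.append(np.asarray(dq))

    def query(self, dx_desired):
        """Actuation increment predicted to produce the desired tip displacement."""
        X, Q = np.asarray(self.dx), np.asarray(self.dq)
        w = np.exp(-np.sum((X - dx_desired) ** 2, axis=1) / (2 * self.h ** 2))
        Ws = np.sqrt(w)[:, None]
        # Locally weighted least squares of a linear map dq = dx @ M around the query.
        M = np.linalg.lstsq(Ws * X, Ws * Q, rcond=None)[0]
        return np.asarray(dx_desired) @ M

# Usage with placeholder data (3D tip motion -> 3 actuation channels).
rng = np.random.default_rng(0)
model = LocalInverseModel()
toy_plant = np.array([[1.0, 0.1, 0.0], [0.0, 0.9, 0.2], [0.1, 0.0, 1.1]])
for _ in range(200):
    dq = rng.normal(scale=0.1, size=3)
    model.add(dq @ toy_plant, dq)          # observe the displacement this actuation caused
print(model.query([0.05, 0.0, 0.0]))
```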
PUBLICATIONS
[1] T.L.T. Lun, K. Wang, J.D.L. Ho, K.H. Lee, K.Y. Sze, K.W. Kwok, "Real-time Surface Shape Sensing for Soft and Flexible Structures using Fiber Bragg Gratings," IEEE Robotics and Automation Letters (RA-L), 4(2):1454-1461, 2019.
[2] G. Fang, X. Wang, K. Wang, K.H. Lee, J.D.L. Ho, H.C. Fu, D.K.C. Fu, K.W. Kwok, "Vision-based Online Learning Kinematic Control for Soft Robots using Local Gaussian Process Regression," IEEE Robotics and Automation Letters (RA-L), 4(2):1194-1201, 2019.
[3] K.H. Lee, M.C.W. Leong, M.C.K. Chow, H.C. Fu, D.K.C. Fu, W. Luk, K.Y. Sze, C.K. Yeung, K.W. Kwok, "FEM-based Soft Robotic Control Framework for Intracavitary Navigation," IEEE International Conference on Real-time Computing and Robotics (RCAR), pp. 11-16, 2017.
[4] K.H. Lee, D.K. Fu, M.C. Leong, M. Chow, H.C. Fu, K. Althoefer, K.Y. Sze, C.K. Yeung, K.W. Kwok, "Nonparametric Online Learning Control for Soft Continuum Robot: An Enabling Technique for Effective Endoscopic Navigation," Soft Robotics, 4(4):324-337, 2017.
[5] H.C. Fu, J.D.L. Ho, K.H. Lee, Y.C. Hu, K.W. Au, K.J. Cho, K.Y. Sze, K.W. Kwok, "Interfacing Soft and Hard: A Spring Reinforced Actuator," Soft Robotics, 7(1):44-58, 2020.
[6] K. Wang, C.H. Mak, J.D.L. Ho, Z.Y. Liu, K.Y. Sze, K.K.Y. Wong, K. Althoefer, Y.H. Liu, T. Fukuda, K.W. Kwok, "Large-scale Surface Shape Sensing with Learning-based Computational Mechanics," Advanced Intelligent Systems, 2100089, 2021.
[7] X. Wang, J. Dai, H.S. Tong, K. Wang, G. Fang, X. Xie, Y.H. Liu, K.W.S. Au, K.W. Kwok, "Learning-based Visual-Strain Fusion for Eye-in-hand Continuum Robot Pose Estimation and Control," IEEE Transactions on Robotics (TRO), 39(3):2448-2467, 2023.
[8] Y. Hao, X. Wang, X. Song, Y. Li, H.C.H. Fu, A.P.W. Lee, K.M.C. Cheung, J. Lam, K.W. Kwok, "A Tensegrity Joint for Low-inertia, Compact and Compliant Soft Manipulators," Advanced Intelligent Systems (AISY), 2300079, 2023.
[9] C.H. Mak, Y. Li, K. Wang, M. Wu, J.D.L. Ho, Q. Dou, K.Y. Sze, K. Althoefer, K.W. Kwok, "Intelligent Shape Decoding of a Soft Optical Waveguide Sensor," Advanced Intelligent Systems (AISY), 2300082, 2023.
AWARDS
1. Best Conference Paper Award in the IEEE International Conference on Real-time Computing and Robotics (RCAR) 2017.
Authors and title: K.H. Lee, M.C.W. Leong, M.C.K. Chow, H.C. Fu, W. Luk, K.Y. Sze, C.K. Yeung and K.W. Kwok, "FEM-based Soft Robotic Control Framework for Intracavitary Navigation."
PATENTS
1. US Provisional Patent: A tensegrity joint with variable stiffness and precise motion control, App No. 63/512,911 (Filed on July 11, 2023.)
2. PCT Patent: Endoscopic Systems, Devices, and Methods for Performing In Vivo Procedures, App No. PCT/CN2016/070906 (Filed on Jan 14, 2016.)
3. PRC Patent: Endoscopic Systems, Devices, and Methods for Performing In Vivo Procedures, App No. 201610147810.9 (Filed on March 15, 2016.)
4. US Provisional Patent: Endoscopic Systems, Devices, and Methods for Performing In Vivo Procedures, App No. US 14/985,587 (Filed on Dec 31, 2015.)