Introduction
When engaging with objects in the surrounding environment, our sense of touch provides both kinesthetic and cutaneous haptic information.[1] Currently, these types of feedback are diminished or lost entirely in bilateral teleoperation systems. For example, a surgeon remotely commands a slave robot to manipulate an object via a local master device; in this case, there is no physical contact between the surgeon's hands and the target objects.[2] The lack of haptic information during teleoperation can therefore be deleterious, lowering perceptive abilities and leading to excessive contact forces that can damage the objects.[3] These shortcomings make it challenging to carry out delicate activities, such as suturing[4] and tissue palpation/characterization,[5] in teleoperation settings such as robot-assisted minimally invasive surgery (MIS).[6] In endoscopic submucosal dissection (ESD), tissue dissection under an endoscope requires delicate skills, so a force sensor and a haptic system are highly desirable to improve the safety and accuracy of the procedure.[7]
Force feedback in surgical teleoperation is valuable but often limited by stability concerns arising from time delays and inaccuracies in robot modeling and force calculation.[8] Sensory substitution approaches have therefore been employed to replace kinesthetic and cutaneous force input with visual, aural, or vibration cues.[9] These techniques maintain stability while conveying critical force information. However, visual and aural feedback can overwhelm other senses in some surgical scenarios. Vibration feedback, while used to replicate the sense of touch, struggles to convey direction effectively due to bulky and rigid actuators. In addition, continuously wearing vibrating devices while repeatedly touching objects promotes desensitization and discomfort after lengthy usage.[10] Alternative kinds of cutaneous feedback, including pressure and skin stretch, have recently gained attention and may offer an effective solution for naturally conveying force magnitude and direction during force-sensitive teleoperated activities.[11] Skin stretch is a form of haptic feedback that closely mimics the shear force experienced during our daily interactions with objects. To deliver this haptic experience, tactors embedded in skin-stretch devices are used; by driving the tactors in various directions, skin-stretch feedback can also transmit directional information.[12] Since natural interactions contain both kinesthetic and cutaneous elements, a feedback methodology that replicates both components is necessary to represent natural interactions.[13] A recent study[14] has shown that combined cutaneous and kinesthetic feedback delivered by haptic devices considerably enhanced performance in all considered tasks.
Conventional fingertip haptic devices, which localize stimulation on the most touch-sensitive area of human skin by providing multi-degree-of-freedom (DoF) skin deformation and forces at the millimeter scale, still face fundamental challenges.[15] They mostly employ traditional rigid robotic mechanisms, transmission elements, and electric motors, resulting in relatively bulky and complex construction and high cost.[16] For example, Prattichizzo et al.[17] created a 3-DoF fingertip device using three DC motors on the nail and a mobile platform for contact force generation. Malvezzi et al.[18] introduced a haptic device with a static body and a mobile platform linked by three legs, replicating contacts on various fingertip surfaces. Chinello et al.[13] described a modular wearable interface featuring a 3-DoF fingertip device and a 1-DoF finger kinesthetic exoskeleton actuated by small servo motors, limiting finger motion with 3D-printed parts. Recently, soft robotic technology for haptic feedback has been implemented to provide better interaction with the skin surface.[19] Different soft actuator arrays, also known as haptic displays or artificial skins, have been presented based on fluidic vibrotactile actuators,[20] shape memory polymers,[21] and dielectric elastomer actuators (DEAs).[22] The purpose of these arrays is to increase the number of actuators, or haptic pixels, across the skin surface in order to maximize the amount of tactile information that can be conveyed. However, each haptic pixel can only provide one-DoF pressure stimulation. More capable soft haptic devices have recently been developed that can produce skin deformations with up to 3-DoF tactor movements.[23] For example, Zhakypov et al.[24] presented a fully 3D printed, soft, monolithic fingertip haptic device.
Two origami fingertip haptic devices that could deliver normal, shear, and torsional haptic cues to the fingertip have also been demonstrated.[25] However, despite their innovative design, these prototypes are bulky and heavy, making them difficult to wear as a garment or clothing accessory. Additionally, they demand labor-intensive hand assembly procedures such as molding and gluing many pieces. The development of high-DoF mechanisms, scalability, and customizability still poses significant challenges in the realm of soft robotics. Therefore, high-DoF, scalable, lightweight, hand-worn haptic devices are necessary.
In a teleoperated system, a force acting on the slave robotic arm is measured and communicated back to the master side to provide a haptic display to the operator. This can be done by measuring the forces acting on the slave robot directly, using a force sensor mounted on the end effector, or by estimating the force from the dynamic model of the surgical system. Strain gauges, capacitive sensors, piezoelectric sensors, and optical sensors are the most common force-sensing methods in MIS.[26] There have been many attempts to develop soft force sensors comprising soft silicone elastomers, conductive inks, and dielectric structures.[27] Recent progress has led to fluid-pressure-based soft sensors, most of which focus on pneumatic pressure.[28] These sensors work by deforming fluid chambers to alter internal fluid pressure, which is monitored using commercial fluid pressure sensors.
To better capture the force interaction, 3D force signals are highly desirable. A few research groups have attempted to construct 3D or multi-axial force sensors using this fluidic transduction approach, which benefits from low cost and ease of implementation. Choi et al.[28f] produced a three-axis force sensor using three pneumatic chambers embedded in silicone layers; when an external force is applied, these chambers flex and alter the internal pressure. The sensor is 40 mm in diameter and 10 mm thick but lacks scalability due to its reliance on large commercial pressure sensors. He et al.[29] proposed a silicone air-chamber-based sensor shaped like a hollow ball. It could adjust its sensitivity, an advantage over other sensors, but integrating it into small-scale applications such as surgical instruments remains challenging. Furthermore, we found only a few hydraulic fluidic force sensors in the literature,[30] none of which took the form of 3D force sensors. While air provides some compliance to soft systems, its compressibility also contributes to non-linearities.[31] Hydraulic fluids, such as water or oil, are incompressible and hence yield a significantly more linear response and are more energy efficient, making them more responsive.[31–32] Hydraulic systems are more reactive to load fluctuations and respond 50 times faster than pneumatic systems, a significant benefit in sensor applications.[31,32b,33] Other non-fluidic 3D force sensors, such as magnetic Hall-effect sensors,[34] liquid metal sensors,[35] and piezoelectric sensors,[36] have been created. Although they can be tailored to provide a 3D force sensor, they require electrical wires to transmit signals from the sensing site to an external readout,[37] which demands a complex set-up and signal processing. In addition, they lack the tunable sensitivity needed to work with different external force ranges, which our proposed technology provides.
To address the above shortcomings, this paper proposes a teleoperated endoscopic surgical system with a wearable haptic display and a master-slave architecture (Figure 1a), in which a user wearing a fabric haptic glove remotely manipulates a slave robotic arm incorporating a soft 3D force sensor. On the slave side, a soft robotic surgical arm comprises a 3-DoF flexible robotic arm for tissue dissection. The interaction force F = (Fx, Fy, Fz) is directly measured using an onboard 3D force sensor made from hydraulic soft filament sensors (SFSs). To reproduce the 3D force feedback F for the operator, the haptic module takes the form of a fabric glove consisting of a 3-DoF fingertip cutaneous device, which generates corresponding forces Fc = (Fx,c, Fy,c, Fz,c), and a 1-DoF finger kinesthetic module (tension force, Fkinesthetic); these can be used either independently as two different devices or together as one device to convey both the 3D force information and the kinesthetic interaction to the user's finger. The central tactor of the cutaneous haptic device generates tactile sensations that correspond to the directional forces from the tool-object interaction via an appropriate mapping, while the kinesthetic module is controlled by a feedforward controller to follow finger flexion or to generate kinesthetic haptic feedback with a determined sequencing (Figure 1a). The 3D force sensor is fabricated straightforwardly, with no molding required (only 3D printing and gluing/adhesion), and can precisely detect external forces from the change in hydraulic pressures using a neural network. To evaluate the effectiveness of this system, we conduct experiments involving basic force tests, motion tests, and human-wearing evaluation tests.
[IMAGE OMITTED. SEE PDF]
Results and Discussion
Overall Working Principle
The endoscopic surgical system with a wearable haptic display is designed with a master-slave architecture where a user wearing a soft fabric glove remotely manipulates a slave robotic arm. The flexion of the fingers will induce a change in the internal hydraulic pressure of the three SFSs, which are used as input signals to control the soft robotic arm (see Figure 1). The haptic components integrated into the fabric glove on the master side comprise a 3-DoF fingertip cutaneous device (Figure 2a) and a 1-DoF finger kinesthetic module (Figure 2c), which can be either used together as a single device or separately as two different devices. A soft robotic surgical arm at the slave side consists of a 3-DoF flexible robotic arm that can perform tissue dissection for endoscopic surgery and a 3-axis force sensor that can monitor the tool-tissue interaction.
[IMAGE OMITTED. SEE PDF]
The 3-DoF fingertip cutaneous device is composed of four soft microtubule artificial muscles (SMAMs) for inducing omnidirectional shear forces in the x-y plane (Fx,c, Fy,c) and a soft actuator for normal z-axis forces (Fz,c), resulting in 3D force feedback Fc. The developed cutaneous haptic display is integrated into a flexible fabric glove to enhance the interaction between the device and the skin; it is worn like clothing and adapts to different finger sizes. The 1-DoF finger kinesthetic module provides a kinesthetic force, Fkinesthetic, that renders the stiffness and shape of the contact object to the index finger. This feedback is provided by an SMAM grounded on the back of the hand through a thermoplastic polyurethane (TPU) gauntlet, which enhances the overall user kinesthetic experience and stabilizes the filament sensor signal. To sense finger flexion (e.g., of the index finger), a hydraulic soft filament sensor (SFS) routed along the kinesthetic module (SMAM) is used. The hydraulic pressure signal from the SFS is converted to the SFS elongation, which is then used to control the SMAM length. The control scheme (see next section) ensures that: (1) the kinesthetic section does not impede the motion of the index finger when it is inactive, and (2) it induces a desired kinesthetic force with an appropriate mapping once it is activated.
The slave robotic surgical arm is actuated via three fabric bellow actuators (FBAs) arranged in parallel at the vertices of an equilateral triangle to enable omnidirectional motions, while the 3D force sensor, which employs SFSs with tunable sensitivity, is mounted at the end of the robotic arm to sense the tool-tissue interaction once the arm approaches a tissue. Using a neural network (NN) model, the soft 3D sensor can precisely detect interaction forces F = (Fx, Fy, Fz) from the change in hydraulic pressures, and the 3-DoF fingertip cutaneous device immediately reproduces these sensed 3D forces at the index fingertip through a tactor via a mapping. In addition, each FBA is controlled by a motor housing[38] based on the signal changes in the three SFSs on the index, middle, and ring fingers, which change the position and direction of the slave robotic arm (Figure 1a). A typical operating process of the proposed system is shown in Video S1 (Supporting Information).
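The glove-to-arm mapping just described can be sketched in a few lines. The linear pressure-to-strain model, the gauge-factor form, and all numeric values below are illustrative assumptions, not the paper's exact relations:

```python
import numpy as np

def sfs_pressure_to_elongation(p, p0, gauge_factor):
    """Convert an SFS pressure reading to a normalized elongation (strain).

    The fiber rests at pressure p0; axial stretch lowers the internal
    hydraulic pressure, so strain is taken proportional to the relative
    pressure drop scaled by the fiber's gauge factor (linear model assumed).
    """
    return (p0 - p) / (p0 * gauge_factor)

def finger_to_fba_lengths(pressures, p0s, gfs, l_rest, l_stroke):
    """Map the three finger-mounted SFS readings (index, middle, ring)
    to target lengths of the three fabric bellow actuators (FBAs).

    Each finger's strain drives its corresponding bellow over its available
    stroke; the one-to-one linear mapping is an illustrative assumption.
    """
    strains = np.array([sfs_pressure_to_elongation(p, p0, gf)
                        for p, p0, gf in zip(pressures, p0s, gfs)])
    strains = np.clip(strains, 0.0, 1.0)   # keep commands inside the stroke
    return l_rest + strains * l_stroke
```

For example, with a resting pressure of 100 kPa and gauge factor 0.5, a drop to 90 kPa on the index-finger SFS commands the corresponding FBA to 20% of its stroke while the other two bellows stay at rest length.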
Design and Fabrication
Wearable Haptic Device
The 3-DoF fingertip cutaneous device is composed of four SMAMs (Ø1.2 mm, 6 mm in length), a z-axis soft actuator, a tactor (Ø3 mm), and a housing frame (Ø18 mm, 4 mm in thickness). A detailed fabrication process for the SMAMs can be found in our recent work.[39] Each SMAM was elongated at the start of assembly to 50% of its maximum strain to maintain a balanced force. Both the tactor and the housing frame were manufactured on an Ultimaker 2+ 3D printer (Ultimaker B.V., Netherlands). The four SMAMs link to the tactor, which can slide over the skin's surface to provide x-y plane skin-stretch sensations. The four SMAMs, organized in a cross shape with the tactor as a four-way connector, are regulated by remote miniature syringes positioned away from the working area to control the relative position of the tactor with respect to the housing frame. Individual SMAMs are set to pressurized, depressurized, or initial states in coordination to produce motion and force in the desired directions. The z-axis soft actuator is fabricated from Ecoflex 30 and water-soluble paper to form a circular channel, as shown in Figure 2b,e. The soft actuator is then attached to the backside of the housing frame using adhesive glue (UHU GmbH & Co. KG) to produce normal z-axis forces. It is actuated by hydraulic pressure and can move the platform away from and toward the user's finger skin to mimic contact with arbitrarily oriented surfaces. To enhance the interaction between the device and the wearer, the device is integrated into a flexible fabric glove that allows it to be worn like clothing.
Previous research has shown that a skin displacement of <0.5 mm is sufficient to reproduce the skin stretch sensation.[40] Our device offers a maximum movement of ±3 mm, ±3 mm, and ±4 mm in the x, y, and z directions, respectively, which is far above this minimum sensation threshold.
The 1-DoF finger kinesthetic module consists of an SMAM (Ø2 mm, 83 mm in length) grounded on the back of the hand through a TPU gauntlet for generating kinesthetic force and a soft sensor (Ø1.5 mm, 80 mm in length) for sensing the flexion of the index finger. The glove was fabricated by sewing 5 mm sections of spring coil at each joint and gluing heat-shrink tubes along the index finger such that both the SMAM and the sensor could be threaded through without slipping or buckling off the joints during operation, the coils and tubes acting as routing sheaths. The ends of the kinesthetic module were then attached to the 3D-printed frame and the TPU gauntlet to fix it in place, which enhances the overall user kinesthetic experience and stabilizes the filament sensor signal. Note that the SMAM was initially elongated by 15% and the soft sensor was pre-stretched by 20% before installation into the glove for better control and sensitivity. The maximum movement and force of the SMAM are 45 mm and 7.5 N, respectively, which are sufficient to provide adequate finger motion and kinesthetic force.
The new haptic display, which conveys both cutaneous and kinesthetic feedback, is one of the most compact and lightweight devices yet presented.[1] The 3-DoF fingertip module alone weighs 1.12 g with dimensions of 18 × 22 × 4 mm, excluding the 2 m PTFE transmission tube. The design is inspired by our previous work[39b] but offers advantages such as a smaller size, active z-axis actuation, and no bulky outer sheath. The complete kinesthetic and cutaneous device weighs 9.5 g with dimensions of 22 × 100 × 15 mm. Moreover, it can be adjusted to different finger sizes thanks to the soft fabric glove.
3D Force Sensor
The central component of the 3D force sensor is the soft filament sensor (SFS); three SFS units form an equilateral triangular pyramid within the sensor housing. The structure and operation of the SFS fibers are shown in Figure 3a. The SFS consists of an outer spring coil with a hollow channel and an inner silicone elastomeric tube placed within it. Using water from a hydraulic pressure source (e.g., a syringe), the inner tube is first pressurized to attain P0. This pressurized condition (P0) is the fiber's resting state; the inner silicone tube's walls are under hydraulic pressure throughout operation. Because water is incompressible, the water volume within the tube must remain constant while the sensor fiber is stretched. To account for the fiber's lengthening under axial stretch, the tube's inner radius must decrease; this thickens the tube's walls and reduces their stress. This stretch-induced stress reduction directly decreases the water pressure to a smaller value, Pi < P0. A commercial pressure sensor is used to detect the water pressure within the sensor.
[IMAGE OMITTED. SEE PDF]
The essential novelty of this sensor fiber is its ability to tune its sensitivity to strain or applied force, allowing it to adapt to diverse applications. Changing the initial resting pressure, as explained in ref. [41], changes the rate of stress reduction within the tube's walls for a given strain. As a result, we can tune the rate of the pressure decrease in the water in response to that strain; in other words, we can change the starting water pressure of each sensor fiber to control its gauge factor (GF).
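The constant-volume reasoning behind this tunability can be written out. A minimal sketch follows, with the pressure-based gauge factor defined by analogy to the resistive case as an assumption (the full constitutive model is in ref. [41]):

```latex
% Incompressible water column: the internal volume is conserved under an
% axial strain \varepsilon, so the inner radius must shrink:
\pi r_{i,0}^{2} L_{0} \;=\; \pi\, r_{i}^{2}(\varepsilon)\, L_{0}\,(1+\varepsilon)
\quad\Rightarrow\quad
r_{i}(\varepsilon) \;=\; \frac{r_{i,0}}{\sqrt{1+\varepsilon}}
% A pressure-based gauge factor, defined analogously to the resistive one,
% then depends on the resting pressure P_0, which is what makes the
% sensitivity tunable:
\mathrm{GF} \;=\; \frac{(P_{0}-P_{i})/P_{0}}{\varepsilon}
```

Raising P0 changes the pressure drop (P0 − Pi) obtained for a given strain, which is the tuning knob described above.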
We used three of these SFS fibers in an equilateral triangle pyramid configuration to transduce 3D force, which was supported by a PLA housing that was 3D printed. These three SFS fibers were connected to the housing through holes, as seen in Figure 3b, with a large hole at the base of the housing serving as both a cover for all three fibers and their fluid transmission tubes. The sensing surface of the 3D force sensor was created by placing a 3D-printed PLA plunger on the exposed portions of the three sensors. The sensors below will flex when a force is applied to the plunger, as shown in Figure 3c. Each SFS fiber experiences some axial strain as a result of this deformation, which causes a proportional decrease in the hydraulic pressure signals coming from each fiber. The x, y, and z components of the applied force F are then decoded from the three SFS fiber signals using a mathematical method. The three Equations (1–3) used to transform each sensor signal into 3D force are provided below, where N = 1, 2, 3 is the sensor number (Fiber 1, Fiber 2, Fiber 3, Figure 3b), and θ is the acute angle formed by each sensor fiber and the sensing surface plane, as seen in Figure 3b,c.
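One consistent reading of this decomposition (Equations (1)–(3) themselves are not reproduced in this excerpt) is to treat each fiber as reporting an axial force along a known direction and to sum the three vectors. The fiber azimuths of 0°, 120°, and 240° are an assumption matching the equilateral-pyramid layout:

```python
import numpy as np

def decode_force(f_axial, theta):
    """Recover (Fx, Fy, Fz) from the three fiber axial forces.

    f_axial: axial force along each SFS fiber, inferred upstream from its
             hydraulic pressure drop (calibration assumed done).
    theta:   acute angle between each fiber and the sensing-surface plane,
             in radians (the angle shown in Figure 3b,c).
    """
    phi = np.deg2rad([0.0, 120.0, 240.0])          # assumed fiber azimuths
    # unit direction of each fiber: in-plane component cos(theta),
    # out-of-plane component sin(theta)
    u = np.stack([np.cos(theta) * np.cos(phi),
                  np.cos(theta) * np.sin(phi),
                  np.sin(theta) * np.ones(3)], axis=0)
    return u @ np.asarray(f_axial)                 # vector sum of the fibers
```

A purely normal push loads all three fibers equally, so the in-plane components cancel and only Fz remains, as expected from the symmetric layout.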
Our recent works provide a complete manufacturing method for the SFS fibers.[39a,c] While this structure was used as a soft actuator in our earlier work, here we employ it as a soft sensor, where the operating principle is entirely different. In brief, ASAHI Intecc Co., Ltd. (Japan) provided the outer spring coil, while Saint-Gobain S.A. (Courbevoie, France) provided the inner silicone tube. The housing and plunger were 3D printed in PLA (Ultimaker PLA) on an Ultimaker 2+ 3D printer (Ultimaker B.V., Netherlands). The three SFS fibers were then routed up into their anchoring holes in the housing's rim, and each fiber was attached in these anchoring holes using adhesive (UHU GmbH & Co. KG). One end of each sensor was thus secured to the housing, while the transmission-tube end remained free in the central housing hole. The SFS fibers were then inflated to the desired pressure, and their other ends were secured with glue in the central housing hole. The sensors were thereby mounted inside the enclosure, leaving the three transmission tubes free. Once the plunger had been positioned on top of the fibers and aligned inside the housing, the space between the plunger and the housing was filled with Sil-Poxy (Smooth-On, Inc.). The plunger was fixed in its resting position and, thanks to the elastic Sil-Poxy, returns to this position after being manipulated. The elastic modulus of this gap-filling silicone governs the sensitivity of the plunger's movement in reaction to force, so the choice of filled silicone has a significant impact on the sensor's sensitivity; a softer silicone, such as Ecoflex 00-30 (Smooth-On, Inc.), would make the sensor more sensitive. Since it is outside the scope of this study, we leave the examination of sensor performance with various silicone elastomers for future work.
3D Soft Robotic Arm
To produce omnidirectional movements at the tip of the 3-DoF soft robotic arm, three soft, wrinkled FBAs were connected in parallel at the vertices of an equilateral triangle (Figure 4a). Hydraulic pressures inside the FBAs, supplied from external linear hydraulic units through fluid transmission tubes (micro-sized PTFE tubes, Cole-Parmer, USA), control the arm's bending action. Energizing a single bellow bends the arm in one direction, while activating two bellows bends the arm in the plane between them. Similarly, a linear translational motion along the long axis is accomplished when all three bellows are concurrently subjected to a suitable combination of three hydraulic pressure values.[38,42] Unlike most existing surgical robots, which require external translation motion, the proposed structure allows the soft robotic arm to extend its length without moving the flexible body. When the bellows are activated, the soft robotic arm can perform a variety of motions, as shown in Figure 4b and Video S4 (Supporting Information). The bending angle of the arm toward the left, right, forward, and backward directions can exceed 120°. Figure 4c demonstrates control of the soft robotic arm's motion through the finger-tracking glove with SFSs integrated at the index, middle, and ring fingers. Because each SFS accounts for the length of each corresponding FBA, the position and direction of the soft robotic arm are operated under the mappings shown.[43] The initial pressure within each sensor fiber was chosen so that maximal joint flexion produced no flat-spotting in the pressure output. Any further rise in initial pressure would improve the sensor's sensitivity, meaning more bending movement for the same finger movement. To reduce the overall size of the device, the fluid transmission tubes for the 3D force sensor and the electrical wire for electrosurgery are routed along a central channel of the robotic arm.
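As a sketch of how the three bellow lengths determine the arm pose, the standard constant-curvature model for three parallel actuators (Webster & Jones style) can be used. Treating the FBAs as ideal actuators at 0°/120°/240° around the arm axis is an assumption of this sketch:

```python
import math

def arm_configuration(l1, l2, l3, d):
    """Constant-curvature kinematics for a three-bellow parallel arm.

    l1..l3: current bellow lengths, with the bellows assumed at azimuths
            0/120/240 degrees at radial distance d from the arm axis.
    Returns (arc_length, curvature, bending_plane_angle).
    """
    s = (l1 + l2 + l3) / 3.0                            # mean length = arc length
    g = l1*l1 + l2*l2 + l3*l3 - l1*l2 - l2*l3 - l1*l3   # length-difference term
    kappa = 2.0 * math.sqrt(g) / (d * (l1 + l2 + l3))   # curvature
    phi = math.atan2(math.sqrt(3.0) * (l3 - l2),        # bending-plane direction
                     l2 + l3 - 2.0 * l1)
    return s, kappa, phi
```

Equal lengths give zero curvature (pure extension, matching the translational mode described above), while shortening one bellow bends the arm toward it.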
The robotic arm fabricated in this study has an outer diameter of 10 mm, a length of 19 mm, and a weight of ≈50 g. It may be inserted into the gastrointestinal tract (e.g., the stomach or colon) through a natural orifice or delivered minimally invasively through a minor skin incision.
[IMAGE OMITTED. SEE PDF]
Calibration of Measured Force in the Haptic Device
The experimental results in each DoF and the means and standard deviations for the maximum blocked-force test are presented in Figure 5. We observed similar force magnitudes for the x and y axes, greater than ±1.5 N. The data showed that the shear forces along the diagonals were larger than those along the principal directions of the tactor because two adjacent SMAMs produce a higher force than a single SMAM. The shear forces in the four main directions were slightly dissimilar because of inherent inconsistencies in the hand-fabrication process and measurement errors. The maximum z-axis force was considerably larger at ≈3.62 N, while the SMAM in the kinesthetic module could generate roughly 7.12 N. The force output of the device is suitable for haptic stimulation, particularly because humans are more sensitive to stimulation in the shear (x and y) directions than in the normal (z) direction.
[IMAGE OMITTED. SEE PDF]
3D Force Sensor Neural Network Calibration
We evaluated the proposed 3D force sensor architecture as follows. The 3D force sensor was constructed as previously described; the parameters of the sensor fibers used in this experiment are listed in Table 1. Here we used a smaller filament sensor because of the required size of the 3D force sensor, but the sensor's performance and tunability characteristics were not affected. Using adhesive glue, the sensing surface of a 6-axis ATI Nano17 force sensor (Calibration SI-25-0.25, ATI Industrial Automation, USA) was attached to the sensing surface of the 3D force sensor for characterization and calibration (Figure 6). With the 3D force sensor clamped in place, the ATI was pushed into it in various directions by hand. Since the direction of the applied force should not matter, this was done for simplicity instead of constructing a sophisticated rig.
Table 1 Specifications for SFS fiber components in the 3D force sensor architecture.
Component | Specifications |
Inner tube | OD 0.610 mm, ID 0.305 mm, length 15 mm, silicone rubber |
Outer coil | OD 0.800 mm, ID 0.600 mm, length 15 mm, stainless steel |
[IMAGE OMITTED. SEE PDF]
Given the potential of the 3D force sensor described above, the challenge is to use an intelligent calibration technique that converts SFS pressure data directly into force data rather than relying on Equations (1)–(3), which may introduce errors due to sensor nonlinearity. We employed a neural network (NN) model for this purpose. Analyzing the sensor data this way removes the need for manual processing, which can produce unreliable results. The work required to construct mathematical relationships is lessened, potentially reducing the significant errors that arise when calibrating manually.[44] Training a NN to interpret the data also accounts for the idiosyncrasies of each sensor's contact dynamics.
To estimate the calibrated relationship between the reference force data from the ATI force sensor and the pressure data from our proposed force sensor, the NN technique (Figure 6b) needs a sufficiently large amount of data. Datasets with pressure data (p1, p2, and p3) as inputs and time-varying force values in the x, y, and z axes as outputs were therefore used to train the NN model. Using the experimental set-up depicted in Figure 6a, the datasets were collected by randomly applying force to both our proposed sensor and the ATI force sensor at a sampling rate of 100 Hz. The training force range, which suits our intended applications, is −0.5 to 2 N on the x-axis, −1 to 4 N on the y-axis, and −1 to 11 N on the z-axis, with a frequency range of 0.1 to 10 Hz. A total of 1000 datasets were gathered and randomly divided into training, validation, and testing sets (70/15/15). The network parameters were trained using a neural network with two hidden layers and backpropagation with Bayesian regularization. The dynamic time-series function in MATLAB with 10 delay samples, running on an Intel Xeon E-2136 @ 3.30 GHz, was used to train the network. When the trained NN is applied to the real setup, the current pressures in the SFSs are continually fed into the trained model, whose output is the force data on the three axes. Because this is an offline learning approach and soft force sensors may vary over time, a new dataset will be required to retrain the model in the future. According to the results shown in Figure 6c–e and Video S3 (Supporting Information), the measured errors for the x, y, and z axes relative to the ATI Nano17 force sensor were 1.0%, 1.3%, and 0.94%, respectively. It is clear that for soft force sensors, where nonlinearity is common, intelligent calibration using a NN beats manual calibration using Equations (1)–(3). Furthermore, there are ways to increase the reliability of the sensor.
Training a neural network with more separately collected datasets, applying better NN models, and improving the consistency of the sensor to match its contact dynamics before training are examples of ways to improve the performance of the sensor. These methods can help to ensure that the sensor produces accurate and consistent results.[45]
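The calibration idea can be sketched with a small two-hidden-layer network trained on synthetic stand-in data. The real model is a MATLAB dynamic time-series network with 10 delay taps and Bayesian regularization, so this static numpy version, with made-up weights and data, is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the logged data: 3 SFS pressures -> 3-axis force.
P = rng.uniform(-1.0, 1.0, size=(1000, 3))           # normalized pressures
true_W = np.array([[0.8, -0.2, 0.1],
                   [0.1,  0.9, -0.3],
                   [-0.4, 0.2, 1.1]])
F = np.tanh(P) @ true_W                               # mildly nonlinear target

# Two-hidden-layer MLP trained by full-batch gradient descent.
W1, b1 = rng.normal(0, 0.3, (3, 16)), np.zeros(16)
W2, b2 = rng.normal(0, 0.3, (16, 16)), np.zeros(16)
W3, b3 = rng.normal(0, 0.3, (16, 3)), np.zeros(3)

lr = 0.1
for _ in range(5000):
    H1 = np.tanh(P @ W1 + b1)
    H2 = np.tanh(H1 @ W2 + b2)
    err = H2 @ W3 + b3 - F                            # prediction error
    # backpropagate the mean-squared-error gradient through both layers
    g3 = err / len(P)
    g2 = (g3 @ W3.T) * (1 - H2**2)
    g1 = (g2 @ W2.T) * (1 - H1**2)
    W3 -= lr * H2.T @ g3; b3 -= lr * g3.sum(0)
    W2 -= lr * H1.T @ g2; b2 -= lr * g2.sum(0)
    W1 -= lr * P.T @ g1;  b1 -= lr * g1.sum(0)

# calibration error on the training data
H1 = np.tanh(P @ W1 + b1)
H2 = np.tanh(H1 @ W2 + b2)
rmse = float(np.sqrt(np.mean((H2 @ W3 + b3 - F) ** 2)))
```

In deployment the same forward pass would map each new pressure triplet to a force estimate; the paper's delay taps additionally feed the recent pressure history into the input, which this static sketch omits.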
Mapping Force Data to Actuator Commands
To connect our 3-DoF fingertip cutaneous device to the 3D force sensor, we apply a mapping (3D force mapping), as shown in Figure 7, to convert the interaction force at the tip of the robotic arm, obtained under the neural network calibration, to the desired position of the tactor, which pushes onto the skin, as shown in Equations (4) and (5). In addition, a second mapping (normal force mapping) transforms the normal force into the kinesthetic haptic part, as shown in Equation (6). Upon force interaction at the tip of the robotic arm, the motor housing corresponding to each SMAM is given position commands based on these mappings and the hysteresis model.[39c] To prevent the tactor from being commanded beyond its useful workspace, the commanded tactor movement is scaled down by a factor related to the magnitude of the measured interaction forces. For the x and y axes, the relative factor is 0.4, whereas for the z-axis it is 0.6. We established these values through informal pilot testing to preserve user comfort and to ensure that most interactions resulted in tactor displacements within the workspace of the skin-stretch device. These scaling factors remained the same for every participant throughout the study. A second-order Butterworth low-pass filter with a cutoff frequency of 5 Hz was used to filter the desired tactor position, preventing transient tactor vibrations caused by small fluctuations in the 3D force sensor signal.
[IMAGE OMITTED. SEE PDF]
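A minimal sketch of the mapping and filtering pipeline described above. The purely linear force-to-position map, the workspace limits taken from the device specifications, and the biquad realization of the Butterworth filter are assumptions of this sketch; Equations (4) and (5) define the actual mapping:

```python
import math

SCALE = (0.4, 0.4, 0.6)        # x/y and z scaling factors from the paper
LIMIT = (3.0, 3.0, 4.0)        # tactor workspace in mm (from the device specs)

def force_to_tactor(force):
    """Scale a measured 3D force (N) into a tactor displacement command (mm),
    clamped to the device workspace."""
    return [max(-lim, min(lim, s * f))
            for f, s, lim in zip(force, SCALE, LIMIT)]

class Butter2LowPass:
    """Second-order Butterworth low-pass as a direct-form biquad
    (Q = 1/sqrt(2)), stepped once per 100 Hz sample."""
    def __init__(self, fc, fs):
        w0 = 2.0 * math.pi * fc / fs
        alpha = math.sin(w0) / math.sqrt(2.0)
        a0 = 1.0 + alpha
        c = math.cos(w0)
        self.b = [(1 - c) / (2 * a0), (1 - c) / a0, (1 - c) / (2 * a0)]
        self.a = [-2.0 * c / a0, (1.0 - alpha) / a0]
        self.x = [0.0, 0.0]    # past inputs
        self.y = [0.0, 0.0]    # past outputs

    def step(self, x):
        y = (self.b[0] * x + self.b[1] * self.x[0] + self.b[2] * self.x[1]
             - self.a[0] * self.y[0] - self.a[1] * self.y[1])
        self.x = [x, self.x[0]]
        self.y = [y, self.y[0]]
        return y
```

Each sensed force sample would be mapped with `force_to_tactor` and then smoothed per axis by a `Butter2LowPass(5.0, 100.0)` instance before being sent to the SMAM motor commands.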
Because the relationship between the input motion and contraction force in the SMAMs attached to the tactor is linear and has little hysteresis,[39c] the position of the tactor can represent the desired force on it; that is, controlling the tactor position is equivalent to controlling the force applied to it. Therefore, we introduce a kinematic model that relates the position of the tactor T (x, y, z), with diameter ∅d, to the change of each SMAM at the current state, Δl1, Δl2, Δl3, Δl4, within the housing frame R, as shown in Figure 2d. The actual length of each SMAM is denoted li, i = 1, 2, 3, 4.
Fx, Fy, and Fz are the magnitudes of the applied force at the tip of the 3D force sensor in the x, y, and z axes, respectively.
k is the input motion of the SMAM in the kinesthetic haptic part. This factor is scaled up or down depending on the stability of the whole system and on the pilot testing of each user before the official user study (Section 4.2), because some users have weaker finger muscles and smaller hands than others. During the preliminary pilot testing session, participants were asked to manipulate the robotic arm with the kinesthetic component active, incorporating the scaling factor described in Equation (6). If a user successfully identified and attuned to the kinesthetic feedback, maintaining a sense of confidence, the factor remained unchanged. If the perceived kinesthetic feedback was excessively strong or weak, we adjusted the factor accordingly. These adjustments were capped at a maximum of 10% of the value given by Equation (6), a limit intentionally set to ensure a consistent and equitable testing experience for all users, fostering fairness and reliability across the study.
Position Control for the Kinesthetic Haptic Module
Because the SFS fiber can sense bending motion of the fingers for health monitoring, rehabilitation, and virtual reality applications,[41] we integrated one SFS fiber into the haptic glove to sense flexion of the index finger. An SMAM in the kinesthetic haptic module then follows the finger so that minimal force is applied to the finger when the 3D force sensor has not contacted any object. This tracking is important for the transparency required of the haptic system. The physical prototype and the sensor's output pressure signal are shown in Figure 8. The sensor fiber is stretched when the finger flexes because of the cumulative bending angles of the joints, which decreases the fiber pressure. The variation in length (or strain x+∆x) of the SFS is consequently used to calculate the flexion angle. Based on the tracking data from the SFS, the SMAM extends accordingly under a feedforward controller built from the direct inversion of the hysteresis model presented in Equations (7) and (8), allowing free movement of the user's finger without any kinesthetic force when the robotic arm is not in contact with an object or the kinesthetic function is deactivated (Figure 8a,b,d). Once the z-axis force exceeds zero while the kinesthetic function is active, the SMAM is driven by a DC motor according to Equation (6), generating kinesthetic haptic feedback (Figure 8b). A typical operating process of the kinesthetic haptic module is shown in Video S2 (Supporting Information).
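The switching logic above (follow the finger when there is no contact, render force when there is) can be sketched as a simple control-loop decision. All calibration constants below, and the pressure-to-angle and force-to-length gains, are invented placeholders; Equation (6) is not reproduced in the text, so the kinesthetic mapping is a hypothetical linear stand-in.

```python
# Assumed calibration constants (NOT the paper's values):
P0 = 101.3          # SFS pressure with the finger extended (kPa)
K_ANGLE = 1.8       # degrees of flexion per kPa of pressure drop
K_LENGTH = 0.05     # mm of SMAM extension per degree of flexion

def flexion_angle(sfs_pressure):
    """Flexion estimate: bending stretches the fiber and lowers its
    internal pressure, so the drop from P0 maps to an angle."""
    return max(0.0, K_ANGLE * (P0 - sfs_pressure))

def render_kinesthetic(fz, k=0.5):
    """Placeholder for the normal-force mapping of Equation (6)."""
    return k * fz

def smam_command(sfs_pressure, fz, kinesthetic_on):
    """One tick of the kinesthetic module's decision logic."""
    if kinesthetic_on and fz > 0.0:
        # Contact detected on the z axis: render kinesthetic feedback.
        return render_kinesthetic(fz)
    # Free motion: track the finger so the module applies ~zero force
    # (the transparency requirement).
    return K_LENGTH * flexion_angle(sfs_pressure)
```

In the real device the output of this decision would be fed through the inverse-hysteresis feedforward controller before reaching the motor.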
[IMAGE OMITTED. SEE PDF]
We define the SMAM length output as ΦS(x, t) = xout(t) and the displacement input in the motor housing as x(t) = xin(t). A fourth-degree polynomial is incorporated into the hysteresis state-space model to smooth the reverse curve at the transition point of the hysteresis loop. The new model is expressed by:
The dimensionless parameters A, υ, ρ, and n in Equations (7) and (8) control the shape and size of the hysteresis loops. The coefficients αx1, αx2, and αz represent the ratios of the output length ΦS(x, t) to the input length x(t) and the internal state z(t). By minimizing the mean square error (MSE) between the model output and the measured experimental data using a Genetic Algorithm (GA), the seven parameters are identified and optimized. Feedforward compensation is then applied via a direct inversion of the hysteresis model to approximately eliminate the hysteresis loop. A diagram of the feedforward controller for the position control of the SMAM and position tracking results for different desired trajectories are shown in Figure 8e–g. The results in Figure 8f,g reveal a larger tracking error between the desired trajectory xd(t) and the measured output xout(t) when the feedforward controller was not implemented. Quantitatively, tracking a single-frequency reference (0.3 Hz) gave MSEno = 3.1681 mm² without compensation and MSEcom = 0.1153 mm² with compensation. For the non-periodic reference (combination of 0.3 Hz and Hz), MSEno = 3.3754 mm² without compensation and MSEcom = 0.1522 mm² with compensation. It is also worth noting that the smaller tracking error achieved with feedforward compensation requires higher accuracy of the identified model parameters for the hysteresis loop.
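The identification step can be illustrated with a generic Bouc-Wen-type hysteresis model fitted by minimizing the MSE. Equations (7) and (8) are not reproduced in the text, so the state equation and output polynomial below are standard textbook forms, not the authors' exact model, and SciPy's differential evolution is used as a stand-in for the paper's GA; all parameter values are invented.

```python
import numpy as np
from scipy.optimize import differential_evolution

def bouc_wen(x, A, beta, gamma, n, a1, a2, az):
    """Generic Bouc-Wen-type hysteresis: internal state z driven by the
    input increments, output a polynomial-plus-state combination."""
    z, out = 0.0, np.empty_like(x)
    for i in range(len(x)):
        dx = 0.0 if i == 0 else x[i] - x[i - 1]
        dz = A * dx - beta * abs(dx) * abs(z) ** (n - 1) * z - gamma * dx * abs(z) ** n
        z = float(np.clip(z + dz, -1e3, 1e3))   # guard against divergent samples
        out[i] = a1 * x[i] + a2 * x[i] ** 2 + az * z
    return out

t = np.arange(0.0, 3.4, 0.02)
x = 5.0 * np.sin(2 * np.pi * 0.3 * t)            # 0.3 Hz reference input (mm)
true_p = (1.0, 0.4, 0.2, 1.5, 0.9, 0.01, 0.3)    # invented "ground truth"
measured = bouc_wen(x, *true_p)                  # synthetic experimental data

def mse(p):
    return float(np.mean((bouc_wen(x, *p) - measured) ** 2))

# Differential evolution as a stand-in for the paper's genetic algorithm:
# seven parameters identified by minimizing the MSE within bounds.
bounds = [(0, 2), (0, 1), (0, 1), (1, 3), (0, 2), (0, 0.1), (0, 1)]
res = differential_evolution(mse, bounds, seed=0, maxiter=40, tol=1e-10)
```

Inverting the fitted model then yields the feedforward compensator that cancels most of the loop, as reflected in the MSE reduction reported above.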
User Study Results
To demonstrate the effectiveness of our teleoperated system, we also conducted a user study with human subjects (N = 15) under the UNSW Ethics approval number HC210451 (see Experimental Section).
Experiment 1: Direction Discrimination
According to the results of the experiment, all participants distinguished the direction of the cutaneous stimuli with a commendable level of accuracy across all eight directions: East (E), West (W), North (N), South (S), Northeast (NE), Southeast (SE), Southwest (SW), and Northwest (NW). The confusion matrices shown in Figure 9a are normalized by the total number of stimulus presentations; the numerical value in each cell indicates the occurrence of a particular answer (horizontal axis) in response to a specific stimulus (vertical axis). The correct ratio achieved for each stimulus appears in the diagonal cells of the confusion matrix. All directions were distinguished with a comparable degree of accuracy (max 87.0%, min 81.1%), with the majority of incorrect responses concentrated in the directions neighboring the stimulus. Performance was lowest for the NW stimulus (81.1%). This outcome may be attributed to heterogeneity in the device's force generation: the force measurements, which used a force sensor to measure the device's interaction forces with the fingerpad, revealed that forces exerted in the NW direction were weaker than those in other directions. For the same reason, the success rate of discrimination along the diagonals was lower than along the principal directions.
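The normalization used for Figure 9a can be sketched as a row-wise normalization of raw response counts, with the diagonal giving the per-direction correct ratio. The count matrix below is a toy example, not the study's data.

```python
import numpy as np

DIRS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def per_direction_accuracy(counts):
    """Row-normalize a confusion matrix of raw response counts
    (rows: stimulus direction, columns: response) and return the
    diagonal, i.e. the correct ratio for each stimulus."""
    counts = np.asarray(counts, dtype=float)
    norm = counts / counts.sum(axis=1, keepdims=True)
    return dict(zip(DIRS, norm.diagonal()))

# Toy counts (NOT the study's data): per row, 8 correct responses and
# one stray response in each of the 7 other directions.
toy = np.full((8, 8), 1.0) + 7.0 * np.eye(8)
acc = per_direction_accuracy(toy)
```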
[IMAGE OMITTED. SEE PDF]
Experiment 2: Robot-Tissue Force Interaction
We analyzed the confidence ratings provided by the subjects at the end of each feedback condition to evaluate their performance under each condition. All subjects were able to complete the task. Figure 9d shows the average ratings for the four feedback conditions, no haptic feedback (N), cutaneous feedback only (C), kinesthetic feedback only (K), and combined cutaneous and kinesthetic feedback (CK), received for identifying the tissue force interaction on a scale of 0 to 10. The collected data passed the D'Agostino-Pearson normality test. A repeated-measures ANOVA with Greenhouse-Geisser correction (ε = 0.5508) revealed a statistically significant difference between the means of the four feedback conditions (F(1.652, 23.13) = 86.49, p < 0.0001). Post-hoc analysis with Tukey correction showed statistically significant differences between conditions CK and K (p < 0.001) and between CK and C (p = 0.0023).
Conditions CK, C, and K were rated 9.40, 7.97, and 7.30 out of 10 in perceived efficacy, respectively. The results suggest that combining cutaneous and kinesthetic feedback significantly improved performance over using only cutaneous feedback, only kinesthetic feedback, or no haptic feedback, consistent with prior findings in the literature.[13] Fourteen subjects judged condition CK the most effective feedback condition and only one chose condition C. Interestingly, nine subjects preferred condition C as the second most effective feedback and five chose condition K, while one subject rated conditions C and K as equally effective. Furthermore, we did not find a significant difference between the cutaneous-only condition C and the kinesthetic-only condition K, as the Tukey-corrected post-hoc comparison between them yielded p = 0.513. We believe the two haptic feedback approaches in our proposed design can be used either together as a single device or separately as two different devices.
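The Greenhouse-Geisser epsilon used in the ANOVA above can be computed from the double-centered covariance of the repeated measures, ε = tr(S̃)² / ((k−1) tr(S̃²)), which lies between 1/(k−1) and 1. The ratings matrix below is randomly generated illustrative data (means loosely based on the reported condition averages, noise invented), not the study's actual responses.

```python
import numpy as np

def gg_epsilon(data):
    """Greenhouse-Geisser epsilon for a subjects x conditions matrix:
    eps = tr(Sc)^2 / ((k - 1) * tr(Sc @ Sc)), where Sc is the
    double-centered sample covariance of the k conditions."""
    data = np.asarray(data, dtype=float)
    k = data.shape[1]
    S = np.cov(data, rowvar=False)
    C = np.eye(k) - np.ones((k, k)) / k        # centering matrix
    Sc = C @ S @ C                             # double-centered covariance
    return float(np.trace(Sc) ** 2 / ((k - 1) * np.trace(Sc @ Sc)))

# Illustrative data: 15 subjects x 4 conditions (N, K, C, CK), invented noise.
rng = np.random.default_rng(1)
ratings = rng.normal(loc=[2.27, 7.30, 7.97, 9.40], scale=1.0, size=(15, 4))
eps = gg_epsilon(ratings)
```

Epsilon values well below 1, such as the 0.5508 reported above, indicate a violation of sphericity, which is why the ANOVA degrees of freedom are deflated.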
In addition, we analyzed the ratings given by the subjects after using the device for four questions; the results are shown in Figure 9e. The first question ("How comfortable was the device to wear?") was rated 8.20 out of 10 and the second ("How easy was the device to put on and take off?") 8.50 out of 10, while the third ("How much did the device impede your natural movements?") was rated 1.57 out of 10, indicating that the device was comfortably wearable and felt similar to an ordinary glove. Finally, all users were asked whether the haptic feedback affected their performance; the rating of 9.03 out of 10 implies that adding haptic feedback substantially improved robot-tissue force interaction across all considered metrics.
Application of 3D Force Sensor in an Ex Vivo Test
In endoscopic submucosal dissection (ESD), the surgical knife cuts tissue laterally, so a bending force is applied to the monopolar surgical knife. We therefore integrated a monopolar knife into the tip of the 3D force sensor on the robotic arm and quantified the bending force imparted to the knife when the arm makes contact with an object; the direction of the applied force can thus be determined by our force sensor. Figure 10a,b and Video S5 (Supporting Information) demonstrate the use of the proposed 3D force sensor in an ex vivo test on chicken breast, measuring forces during tissue dissection and palpation. The force sensor detected force interactions greater than 0.01 N, making it sensitive enough for surgical applications.[46]
[IMAGE OMITTED. SEE PDF]
Conclusion
In this work, an endoscopic surgical system was developed with a soft wearable haptic display, a 3D soft force sensor, and a master-slave architecture. A user wearing the soft fabric glove remotely manipulates a 3-DoF robotic arm carrying the 3D force sensor using feedback signals from three SFSs on their fingers. The haptic glove has a 3-DoF cutaneous device and a 1-DoF kinesthetic module to convey cutaneous force and kinesthetic information to the user's finger, respectively. While tactor motion on the fingerpad recreates tactile sensations corresponding to forces detected by the 3D force sensor, the kinesthetic module is driven by a feedforward controller either to follow finger flexion or to generate kinesthetic feedback in a determined sequence. The 3D force sensor is fabricated with additive manufacturing methods, namely 3D printing and molding, and detects external forces from changes in hydraulic pressure using a neural network. Two human-subject experiments with 15 participants showed that the proposed system significantly improved performance when providing both cutaneous and kinesthetic feedback, and the haptic glove was rated as highly wearable, comfortable, and effective.
We also identified several limitations for practical applications, including the robustness of the NN model and the consistency of the 3D force sensor over long working periods due to the nature of the soft material. Training the NN with more independently collected datasets, applying superior NN models, and improving sensor reliability with new materials matched to its contact dynamics are possible ways to improve sensor accuracy and consistency in future work. We also observed that the force magnitude generated by the SMAMs in the cutaneous device varies slightly between actuators due to inherent inconsistencies in the manual fabrication process, contributing to the performance differences observed in the direction discrimination test. An intelligent calibration method or a soft strain gauge embedded in the SMAMs should therefore be considered in the near future. In addition, more prototypes should be built to test reproducibility and average performance in continued development, ensuring that the haptic device is reliable and effective. Future work should extend the experimental evaluation of the haptic glove to real surgical tasks with a larger number of participants to assess its impact on the operation of the 3D force sensor and the haptic display.
Experimental Section
Force Capabilities of the Haptic Device
Experiments were performed to measure the blocked forces of the haptic device in all three degrees of freedom while applying different input pressures, in order to assess the device's force capacity. A 3D-printed rigid adapter linked a Nano17 force sensor (ATI Industrial Automation, Apex, USA) to the actuators; the adapter fitted tightly onto the tactor and transmitted forces to the force sensor. Syringes were used to vary the input pressure and measure the tactor-generated forces. The input pressure was provided as periodic sine waves with amplitudes within an acceptable pressure range (1–200 psi). Each experiment was repeated five times.
User Study
To test the effectiveness of the proposed haptic device in rendering both cutaneous and kinesthetic haptic feedback, two human-subject experiments were carried out. The first experiment evaluated the discrimination of different fingerpad stretch directions, while the second assessed haptic feedback provided in a robot-tissue force interaction task. Fifteen participants (9 males and 6 females, age range 25–37, 4 left-handed and 11 right-handed) took part in the experiments. None of them had previous experience with haptic interfaces. The experimental setup comprised the complete wearable haptic glove, shown in Figure 1a. All participants provided informed consent and had no known vision or skin conditions. The Human Research Ethics Advisory Panel (HREAP) at UNSW approved the research protocol under project number HC210451.
Experiment 1: Direction Discrimination: The first perceptual experiment focused on the discrimination of skin stretch direction, assessing participants' capacity to detect different directions of tangential skin deformation at the fingertip, as delivered by the developed haptic device. The experiment followed a forced-choice paradigm, with stimuli consisting of fingerpad deformation in eight distinct directions along the tangential plane, the four principal directions plus the four diagonals, under a fixed normal force (1 N). Subjects sat in a chair in front of a workstation with a computer screen while wearing the haptic device on their right index finger. For all stimuli, the tactor's radial displacement with respect to the starting point was 1.5 mm. Each trial presented the stimulus in two phases: first, the tactor moved 1.5 mm in the reference tangential direction; the tactor then returned to its center position after 2 s. After a preliminary presentation of the full set of stimuli, each blindfolded participant received a randomly ordered series of stimuli. After each stimulus, participants verbally selected one of eight options to indicate the perceived tangential direction. Each individual received a series of 80 stimuli.
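The eight stimulus targets, each a 1.5 mm radial tactor displacement, can be enumerated as points on a circle. The axis convention (x pointing East, y pointing North on the fingerpad plane) is an assumption for illustration.

```python
import math

RADIUS_MM = 1.5    # radial tactor displacement used for every stimulus

# Eight tangential directions: the four principal directions plus diagonals.
ANGLES_DEG = {"E": 0, "NE": 45, "N": 90, "NW": 135,
              "W": 180, "SW": 225, "S": 270, "SE": 315}

def tactor_target(direction):
    """(x, y) tactor displacement in mm for a named stimulus direction,
    assuming x points East and y points North on the fingerpad plane."""
    a = math.radians(ANGLES_DEG[direction])
    return (RADIUS_MM * math.cos(a), RADIUS_MM * math.sin(a))
```

Randomly shuffling repeated copies of the eight keys yields the 80-stimulus sequence presented to each participant.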
Experiment 2: Robot-Tissue Force Interaction: To further evaluate the effectiveness and viability of the developed system, a robot-tissue force interaction task was carried out with the fifteen human subjects. Subjects sat in a chair next to the robotic arm with the embedded 3D force sensor so that they could always see the operative environment while wearing the developed haptic glove on their right hand. The task started when the robotic arm touched the tissue for the first time and ended when the subject told the experimenter that they clearly felt the force interaction. Each participant performed ten random trials for each of the following four feedback conditions:
- Condition N: no haptic feedback,
- Condition C: cutaneous feedback only provided by the 3-DoF cutaneous device,
- Condition K: kinesthetic feedback only provided by the 1-DoF finger kinesthetic module,
- Condition CK: cutaneous and kinesthetic feedback provided by the complete device.
The cutaneous and kinesthetic modules were inactive in Condition N. The kinesthetic module was programmed to apply no force to the subject's finger while the tactor remained in constant contact with the subject's fingertip. The cutaneous module supplied cutaneous stimuli in Condition C to reproduce the contact with the tissue. Again, no force was applied to the subject's finger via the kinesthetic module in this condition. The kinesthetic module provided kinesthetic stimuli in Condition K while no force was applied to the subject's fingerpad via the cutaneous module. In Condition CK, both the cutaneous and kinesthetic modules delivered cutaneous and kinesthetic stimuli, respectively, to produce contact with the tissue according to the proposed mapping. At the end of each condition, each participant was asked to rate their confidence in identifying the tissue force interaction on a scale of 0 to 10. A score of 0 indicated “very difficult” or “extremely unconfident,” while a score of 10 indicated “extremely easy” or “extremely confident.”
After using the device, responses (10-point Likert scale) were provided on a scale of 0 to 10, with 0 representing "extremely low" and 10 representing "very high," for each of the questions below:
- Question 1: How comfortable was the device to wear?
- Question 2: How easy was the device to put on and take off?
- Question 3: How much did the device impede your natural movements?
- Question 4: Do you feel the force feedback affected your performance at all? (scale from greatly hindered performance to greatly improved performance)
Acknowledgements
The authors acknowledge support from the UNSW Scientia Fellowship Grant (PS46197) and Google Research Scholar Award (PS66605). Mai Thanh Thai, Trung Thien Hoang, and Chi Cong Nguyen would like to thank the Science and Technology Scholarship Program for Overseas Study for Master's and PhD degrees, VinUniversity, Vingroup, Vietnam.
Conflict of Interest
The authors declare no conflict of interest.
Data Availability Statement
The data that support the findings of this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.
C. Pacchierotti, S. Sinclair, M. Solazzi, A. Frisoli, V. Hayward, D. Prattichizzo, IEEE Trans. Haptics 2017, 10, 580.
N. Enayati, E. De Momi, G. Ferrigno, IEEE Rev. Biomed. Eng. 2016, 9, 49.
M. T. Thai, P. T. Phan, T. T. Hoang, S. Wong, N. H. Lovell, T. N. Do, Adv. Intell. Syst. 2020, 2, [eLocator: 1900138].
P. T. Phan, T. T. Hoang, M. T. Thai, H. Low, J. Davies, N. H. Lovell, T. N. Do, Sci. Rep. 2021, 11, [eLocator: 22420].
T. T. Hoang, P. T. Phan, M. T. Thai, J. Davies, C. C. Nguyen, H.‐P. Phan, N. H. Lovell, T. N. Do, Adv. Intell. Syst. 2022, 4, [eLocator: 2200282].
a) C. Pacchierotti, D. Prattichizzo, K. J. Kuchenbecker, IEEE Trans. Biomed. Eng. 2015, 63, 278;
b) C. C. Nguyen, S. Wong, M. T. Thai, T. T. Hoang, P. T. Phan, J. Davies, L. Wu, D. Tsai, H. P. Phan, N. H. Lovell, T. N. Do, Adv. Sens. Res. 2023, 2, [eLocator: 2200036].
M. Turkseven, S. De, C. D. Jackson, M. S. Sawhney, IEEE Trans. Haptics 2022, 15, 603.
Z. F. Quek, W. R. Provancher, A. M. Okamura, IEEE Trans. Haptics 2018, 12, 102.
F. Chinello, C. Pacchierotti, J. Bimbo, N. G. Tsagarakis, D. Prattichizzo, IEEE Robot. Autom. Lett. 2017, 3, 524.
S. B. Schorr, Z. F. Quek, I. Nisky, W. R. Provancher, A. M. Okamura, IEEE Trans. Hum. Mach. Syst. 2015, 45, 714.
Y. H. Jung, J.‐Y. Yoo, A. Vázquez‐Guardado, J.‐H. Kim, J.‐T. Kim, H. Luan, M. Park, J. Lim, H.‐S. Shin, C.‐J. Su, Nat. Electron. 2022, 5, 374.
F. H. Giraud, S. Joshi, J. Paik, IEEE Trans. Haptics 2021, 15, 131.
F. Chinello, M. Malvezzi, D. Prattichizzo, C. Pacchierotti, IEEE Trans. Ind. Electron. 2019, 67, 706.
C. Pacchierotti, L. Meli, F. Chinello, M. Malvezzi, D. Prattichizzo, Int. J. Robot. Res. 2015, 34, 1773.
S. B. Schorr, A. M. Okamura, IEEE Trans. Haptics 2017, 10, 418.
a) F. Chinello, M. Malvezzi, C. Pacchierotti, D. Prattichizzo, presented at 2015 IEEE Int. Conf. on Advanced Intelligent Mechatronics (AIM), Busan, Korea (South), July 2015;
b) E. M. Young, K. J. Kuchenbecker, IEEE Trans. Haptics 2019, 12, 295.
D. Prattichizzo, F. Chinello, C. Pacchierotti, M. Malvezzi, IEEE Trans. Haptics 2013, 6, 506.
M. Malvezzi, F. Chinello, D. Prattichizzo, C. Pacchierotti, IEEE Trans. Haptics 2021, 14, 266.
H. A. Sonar, A. P. Gerratt, S. P. Lacour, J. Paik, Soft Rob. 2020, 7, 22.
H. A. Sonar, J. Paik, Front. Robot. AI 2016, 2, 38.
N. Besse, S. Rosset, J. J. Zarate, H. Shea, Adv. Mater. Technol. 2017, 2, [eLocator: 1700102].
H. Zhao, A. M. Hussain, A. Israr, D. M. Vogt, M. Duduta, D. R. Clarke, R. J. Wood, Soft Rob. 2020, 7, 451.
K. T. Yoshida, C. M. Nunez, S. R. Williams, A. M. Okamura, M. Luo, presented at 2019 IEEE World Haptics Conference (WHC), Sola city, Japan, July 2019.
Z. Zhakypov, A. M. Okamura, presented at 2022 IEEE 5th Int. Conf. on Soft Robotics (RoboSoft), Edinburgh, UK, April 2022.
a) S. R. Williams, J. M. Suchoski, Z. Chua, A. M. Okamura, IEEE Robot. Autom. Lett. 2022, 7, 3310;
b) C. E. Winston, Z. Zhakypov, M. Cutkosky, A. M. Okamura,
A. Trejos, R. Patel, M. Naish, Proc. Inst. Mech. Eng., Part C: J. Mech. Eng. Sci. 2010, 224, 1435.
a) M. Cianchetti, C. Laschi, A. Menciassi, P. Dario, Nat. Rev. Mater. 2018, 3, 143;
b) S. Peng, S. Wu, Y. Yu, Z. Sha, G. Li, T. T. Hoang, M. T. Thai, T. N. Do, D. Chu, C. H. Wang, J. Mater. Chem. A 2021, 9, [eLocator: 26788].
a) C. Tawk, G. Alici, Adv. Intell. Syst. 2021, 3, [eLocator: 2000223];
b) H. Yang, Y. Chen, Y. Sun, L. Hao, Sens. Actuators, A 2017, 266, 318;
c) C. Tawk, M. in het Panhuis, G. M. Spinks, G. Alici, Adv. Intell. Syst. 2019, 1, [eLocator: 1900002];
d) C. Tawk, E. Sariyildiz, H. Zhou, M. in het Panhuis, G. M. Spinks, G. Alici, presented at 2020 3rd IEEE Int. Conf. on Soft Robotics (RoboSoft), New Haven, CT, USA 2020;
e) K. Kong, M. Tomizuka, IEEE/ASME Trans. Mechatron. 2009, 14, 358;
f) H. Choi, P.‐G. Jung, K. Jung, K. Kong, presented at 2017 IEEE Int. Conf. on Robotics and Automation (ICRA), Marina Bay Sands, Singapore May—June 2017;
g) D. Gong, R. He, J. Yu, G. Zuo, Sensors 2017, 17, 2592;
h) R. Slyper, J. Hodgins, presented at 2012 IEEE RO‐MAN: The 21st IEEE Int. Symposium on Robot and Human Interactive Communication 2012;
i) M. Vázquez, E. Brockmeyer, R. Desai, C. Harrison, S. E. Hudson, presented at Proceedings of the 33rd Annual ACM Conf. on Human Factors in Computing Systems, Seoul Republic of Korea, April 2015.
L. He, N. Herzig, T. Nanayakkara, P. Maiolino, Soft Rob. 2022, 9, 1062.
a) M. M. Sadeghi, K. Dowling, R. L. Peterson, K. Najafi, presented at 2013 IEEE 26th Int. Conf. on Micro Electro Mechanical Systems (MEMS), Taipei, Taiwan, Jan 2013;
b) T. Sasaki, Y. Fujiwara, K. Tachikawa, K. Terabayashi, K. Dohda, Int. J. Automat. Technol. 2020, 14, 625.
M. A. Meller, M. Bryant, E. Garcia, J. Intell. Mater. Syst. Struct. 2014, 25, 2276.
a) M. De Volder, D. Reynaerts, J. Micromech. Microeng. 2010, 20, [eLocator: 043001];
b) M. Focchi, E. Guglielmino, C. Semini, A. Parmiggiani, N. Tsagarakis, B. Vanderborght, D. G. Caldwell, presented at 2010 IEEE/RSJ Int. conf. on intelligent robots and systems, Taipei, Taiwan, Oct 2010;
c) P. T. Phan, T. T. Hoang, M. T. Thai, H. Low, N. H. Lovell, T. N. Do, Soft Rob. 2022, 9, 820;
d) P. T. Phan, M. T. Thai, T. T. Hoang, J. Davies, C. C. Nguyen, H.‐P. Phan, N. H. Lovell, T. N. Do, Sci. Rep. 2022, 12, [eLocator: 11067].
a) D. G. Caldwell, G. A. Medrano‐Cerda, M. Goodwin, IEEE Control Syst. Mag. 1995, 15, 40;
b) R. Tiwari, M. A. Meller, K. B. Wajcs, C. Moses, I. Reveles, E. Garcia, J. Intell. Mater. Syst. Struct. 2012, 23, 301.
D. S. Chathuranga, Z. Wang, Y. Noh, T. Nanayakkara, S. Hirai, presented at 2016 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Daejeon, Korea (South), Oct 2016.
D. M. Vogt, Y. L. Park, R. J. Wood, IEEE Sens. J. 2013, 13, 4056.
P. Yu, W. Liu, C. Gu, X. Cheng, X. Fu, Sensors 2016, 16, 819.
S. Mousavi, M. T. Thai, M. Amjadi, D. Howard, S. Peng, T. N. Do, C. H. Wang, J. Mater. Chem. A 2022, 10, [eLocator: 13673].
M. T. Thai, P. T. Phan, H. A. Tran, C. C. Nguyen, T. T. Hoang, J. Davies, J. Rnjak‐Kovacina, H. P. Phan, N. H. Lovell, T. N. Do, Adv. Sci. 2023, 10, [eLocator: 2205656].
a) P. T. Phan, M. T. Thai, T. T. Hoang, N. H. Lovell, T. N. Do, IEEE Access 2020, 8, [eLocator: 226637];
b) M. T. Thai, T. T. Hoang, P. T. Phan, N. H. Lovell, T. N. Do, IEEE Access 2020, 8, [eLocator: 157878];
c) M. T. Thai, P. T. Phan, T. T. Hoang, H. Low, N. H. Lovell, T. N. Do, IEEE Robot. Autom. Lett. 2021, 6, 5089.
a) A. L. Guinan, N. C. Hornbaker, M. N. Montandon, A. J. Doxon, W. R. Provancher, presented at 2013 World Haptics Conference (WHC), Delft, Netherlands, July 2013;
b) M. R. Motamedi, D. Florant, V. Duchaine, Front. Robot. AI 2017, 4, 1.
J. Davies, M. T. Thai, T. T. Hoang, C. C. Nguyen, P. T. Phan, H. P. Phan, N. H. Lovell, T. N. Do, Adv. Mater. Technol. 2023, 8, [eLocator: 2201453].
a) T. A. Truong, T. K. Nguyen, X. Huang, A. Ashok, S. Yadav, Y. Park, M. T. Thai, N. K. Nguyen, H. Fallahi, S. Peng, Adv. Funct. Mater. 2023, 33, [eLocator: 2211781];
b) J. Davies, M. T. Thai, T. T. Hoang, C. C. Nguyen, P. T. Phan, K. Zhu, D. B. N. Tran, H. M. La, Q. P. Ha, N. H. Lovell, T. N. Do, In 2023 IEEE International Conference on Robotics and Automation (ICRA), London, UK, 2023, pp. 581–587.
R. J. Webster III, B. A. Jones, Int. J. Robot. Res. 2010, 29, 1661.
H. Su, C. Yang, H. Mdeihly, A. Rizzo, G. Ferrigno, E. De Momi, IEEE Access 2019, 7, [eLocator: 122041].
K. Chin, T. Hellebrekers, C. Majidi, Adv. Intel. Syst. 2020, 2, [eLocator: 1900171].
A. L. Trejos, S. Jayaraman, R. V. Patel, M. D. Naish, C. M. Schlachta, Surg. Endosc. 2011, 25, 186.
© 2024. This work is published under http://creativecommons.org/licenses/by/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Abstract
Haptic (touch) feedback is important for effective interaction with the surroundings. In teleoperated surgical systems, its absence reduces perception, making delicate tasks difficult and resulting in unexpected damage to surrounding tissues. To enhance safety, a force-feedback system and haptic device are needed. However, existing haptic prototypes rely on rigid, bulky, and heavy components, making effective communication with the skin problematic. This paper presents a teleoperated endoscopic surgical system with an integrated 3D force sensor and real-time haptic feedback. A surgical robotic arm is remotely controlled by a soft haptic glove incorporating a 3-axis cutaneous device and a finger kinesthetic module. The 3D force sensor is constructed from hydraulic filament soft sensors that exhibit pressure changes under strain. To enable precise motion, the haptic glove is operated by a feedforward controller and a master-slave architecture. Experiments with human subjects (n = 15) show that cutaneous and kinesthetic feedback significantly improves the user's performance (9.4 out of 10) compared to no haptic feedback (2.27 out of 10). Finally, subjects rank the new system as highly wearable, comfortable, and effective, which is expected to bridge a gap in the surgical field and support the future development of advanced teleoperated systems.
Affiliations
1 College of Engineering and Computer Science, VinUniversity, Hanoi, Vietnam
2 Graduate School of Biomedical Engineering, Faculty of Engineering, UNSW Sydney, Sydney, NSW, Australia
3 School of Mechanical & Mining Engineering, The University of Queensland, St Lucia, QLD, Australia
4 School of Mechanical and Manufacturing Engineering, Faculty of Engineering, UNSW Sydney, Sydney, NSW, Australia
5 Tyree Institute of Health Engineering, UNSW Sydney, Sydney, NSW, Australia