1. Introduction
Biomimetic robots imitate living organisms’ appearance, shape, or behaviour and are designed to utilise biological principles to replicate natural behaviour and solve complex problems. A snake robot is an example of a biomimetic robot characterised by its high level of redundancy and numerous degrees of freedom. The robot’s movement is produced by changes in its internal shape, similar to the motion of natural snakes. The robot’s joints, which link its segments, are defined by a series of angles that describe its configuration [1].
Virtual Reality (VR) has an important role in Industry 4.0, the fourth industrial revolution, and virtual manufacturing offers new ways to apply VR in robotics. Integrating VR with robotics has several potential uses [2], including enabling the planning of robot trajectories through trial and error in VR rather than using existing industrial systems; training operators; assisting surgeons in robotic surgery procedures; developing safe procedures for human-robot collaboration; and creating an integrated environment for control design that encompasses models (digital twins), control algorithms, and VR. VR simulation provides a more immersive and engaging way to study simulation results. In the virtual environment, abstract properties of the motion, such as angles or velocity vectors, can be displayed. The VR application also allows the user to change their perspective, stop or rewind the simulation, and manipulate simulation parameters within the VR environment.
VR provides a cost-effective approach to controlling virtual robots in a simulated environment and facilitates the training of operators. VR is also a valuable tool for testing complex simulations involving human–robot collaboration. In [3], the authors developed a virtual environment for validating mathematical models of robots. In addition, multiple users can be tracked in VR while performing assembly tasks, thereby aiding in designing effective workplace layouts for humans and robots in the industrial setting [4].
Robot-assisted surgery is a promising field that employs VR simulators to train doctors in various clinical procedures, including those using da Vinci robots. The immersive perspective of VR visualisation provides medical students with an unprecedented understanding of anatomy, enabling the exploration of organs at both the micro and macro scales. Moreover, immersive, dynamic models of physiological and pathological processes could result in an experience of “immersive medicine” [5].
Programming by Demonstration is a technique for teaching robotic systems new behaviours from a demonstration by a human operator. To reduce programming complexity, robots can be taught new movements in VR environments [6]. A novel methodology based on heuristic beam search, described in [7,8], has been implemented in VR. This algorithm plans collision-free paths for n degree-of-freedom robots, with human involvement in defining the free space or collision-free volume and selecting the start and goal configurations.
To ensure safe and efficient collaboration between humans and robots, planning the workspace layout and programming the industrial robotic work cell are essential tasks, and VR can aid in achieving them. Ref. [4] highlights the use of VR in planning the workspace layout and programming the robotic cell, which can improve the safety of human–robot collaboration. Furthermore, VR-enhanced Computer-Aided Design (CAD) software provides an effective way to create and visualise an appropriate layout for the robotic cell.
VR has been demonstrated to enhance learning for high school students by simplifying and simulating complex concepts across various fields. Virtual Reality experiments have also been found to improve student understanding in science courses and to increase their interest in learning through immersive experiences [9,10,11,12]. Virtual Reality laboratories provide a safe environment for students to perform experiments without the risk associated with real-world materials or hazardous situations. In mathematical subjects, geometric models can be taught effectively using VR, as students can easily visualise complex geometry models to improve the quality of their education.
One further application of Virtual Reality is testing and implementing control algorithms for various systems. In [13], a solution is proposed for decentralised formation tracking of groups of mobile robots using consensus and Model Predictive Control. The solution is designed to ensure collision avoidance and communication topology, with experimental results verified in a VR environment. In [14], an illusion-based VR interaction technique is proposed where the virtual hand of a user is moved during a reach to guide their physical hand to a specific location. The proposed method is validated for developing a control approach to achieve redirection and desired point and path tracking. Additionally, VR simulations are used to test and train autonomous transition control methods for drones, which assist farm workers in scouting to improve the efficiency of vineyard management [15] and which operate in smart cities [16].
VR is a technology that allows users to visualise and interact with three-dimensional environments, including the behaviour of objects. It is used to simulate and study complex environments for entertainment and research purposes. VR interfaces are used for visualisation, interaction with robotics, planning, usability, and infrastructure for both experts and non-experts [17]. However, the use of VR with biomimetic robotics has not been extensively researched, and there is particularly limited research on using snake robots in VR. The motion of haptic snakes has, however, been investigated for multiple feedback methods, such as tapping and gesture feedback [18].
A comprehensive review of the literature from 2015 to 2019 covering augmented reality applications in robotics, including medical robotics, is presented in [19]. In [20], the authors present the AR/VR challenges for manufacturing systems in Industry 4.0, concluding that the adoption of immersive AR/VR technologies in manufacturing industries has persistently increased. Promising research directions include improving the flexibility of multi-user VR environments and developing methodologies for simultaneous interfaces and concurrent interactions among collaborators in a VR environment. Another promising area is a mixed reality framework for human–robot collaboration [21].
The dynamical model of the snake robot and its kinematics are described in several articles [22,23]. These works show that when the snake robot moves on the ground, its friction or drag force coefficients are larger in the sideways direction than in the longitudinal direction of the link. However, they lack a visual implementation. Previous articles on the subject analyse the angular velocities of the joint angles and the linear velocity of the robot's head or mass centre [24,25,26,27,28]. In these articles, snake robots use a Line-of-Sight (LOS) guidance control law to exponentially stabilise the desired straight-line path under a given condition on the look-ahead distance parameter. In contrast, our article describes a Point-of-Sight (PoS) control law. Moreover, it presents a new method for enhancing kinematic studies by analysing the velocity of every segment in its normal and tangential directions. Simulation results include plots generated in MATLAB and screenshots taken in a VR environment.
In this paper, we present a new approach for evaluating control systems in snake robots. Currently, VR and Augmented Reality (AR) are employed to visualise the motion of machines and multi-body systems, utilising virtual entities and presenting information in standard graphs and numerical data. In particular, AR offers additional information often imperceptible in the physical world. In this study, we introduce supplementary layers of information, such as vectors (e.g., velocity, friction, torque) and colours, to facilitate a more comprehensive understanding of how parameter changes impact the control system's quality in snake robots. We employ a synesthetic design approach to enable insightful evaluations of the algorithms used in robotics. This paper proposes a new method for presenting engineering simulation results in VR. Specifically, it investigates the movement of a snake robot on a flat surface and scrutinises how various model parameters affect its motion. The study utilises MATLAB to compute the robot's dynamic model and control algorithms while the Unity engine generates the virtual environment and animations.
This article is structured into five sections. Section 1 introduces the use of VR technology in engineering and advances in biomimetic snake robots. Section 2 discusses the dynamic model of the snake robot, the control algorithm used for the robot to reach the destination position, and the implementation details of the applications created in MATLAB and Unity. The simulation results are presented in Section 3. The obtained results are discussed in Section 4. Finally, Section 5 provides the conclusions and future work.
2. Materials and Methods
2.1. Snake Robot Model
The model of the snake robot depicted in the article is based on widely used equations of snake robot motion on a flat horizontal surface. A detailed description of the model can be found in [22,29]. This dynamical model can be derived based on the torque equilibrium equation:
$$M_{\theta}\ddot{\boldsymbol{\theta}} + W\dot{\boldsymbol{\theta}}^{2} - l\,S_{\theta}K\,\mathbf{f}_{x} + l\,C_{\theta}K\,\mathbf{f}_{y} = D^{T}\mathbf{u} \qquad (1)$$
where $S_{\theta} = \operatorname{diag}(\sin\theta_{1}, \dots, \sin\theta_{n})$ and $C_{\theta} = \operatorname{diag}(\cos\theta_{1}, \dots, \cos\theta_{n})$ are square matrices with trigonometric functions of the link angles on the diagonal and zeros in the remaining elements, $\boldsymbol{\theta} = [\theta_{1}, \dots, \theta_{n}]^{T}$ collects the link angles in a global coordinate system, and the inertia matrix $M_{\theta}$, the matrix $W$ of centripetal terms, and the constant connection matrices $K$ and $D$ are detailed in [22,29]. To define the robot's configuration, a distinction is made between link angles and joint angles, indicated in Figure 1. A link angle $\theta_{i}$ is defined as the angle between link $i$ and the global x-axis, while a joint angle $\phi_{i} = \theta_{i} - \theta_{i+1}$ is the difference between the link angles of two neighbouring links. The vector $\mathbf{u}$ contains the controllable parameters: the actuator torques exerted on successive links. The vectors $\mathbf{f}_{x}$ and $\mathbf{f}_{y}$ represent the components of the friction forces on the links in the global x- and y-directions. This model assumes viscous friction forces acting on the mass centres of the links. The friction force is defined in a local coordinate system attached to each of the robot's segments, as shown in Figure 1, and is proportional to the segment's linear velocity and to the coefficients $c_{t}$ and $c_{n}$:
$$\mathbf{f}_{i} = -\begin{bmatrix} c_{t} & 0 \\ 0 & c_{n} \end{bmatrix} \mathbf{v}_{i} \qquad (2)$$
where $\mathbf{v}_{i}$ is the velocity of the mass centre of segment $i$ expressed in its local coordinate system. This anisotropic friction force enables the snake robot's movement: the friction coefficient $c_{t}$ in the link's longitudinal (tangential) direction is lower than the coefficient $c_{n}$ in the normal direction, and this difference allows the links to slide forward.
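To make the friction model concrete, the following MATLAB sketch evaluates Equation (2) for all links; the function and variable names (frictionForces, theta, vx, vy, ct, cn) are our own illustration and not taken from the authors' code.

```matlab
% Anisotropic viscous friction, Equation (2), evaluated for every link.
% theta  : n-by-1 link angles
% vx, vy : n-by-1 global velocities of the link mass centres
% ct, cn : tangential and normal friction coefficients (ct < cn)
function [fx, fy] = frictionForces(theta, vx, vy, ct, cn)
    n  = numel(theta);
    fx = zeros(n, 1);
    fy = zeros(n, 1);
    for i = 1:n
        R = [cos(theta(i)), -sin(theta(i));   % local-to-global rotation
             sin(theta(i)),  cos(theta(i))];
        vLocal  = R.' * [vx(i); vy(i)];       % velocity in the link frame
        fLocal  = -[ct, 0; 0, cn] * vLocal;   % viscous, anisotropic friction
        fGlobal = R * fLocal;                 % back to the global frame
        fx(i) = fGlobal(1);
        fy(i) = fGlobal(2);
    end
end
```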
2.2. Path Following Controller
A simple approach for path-following is the Line-of-Sight (LoS) method [30], which involves moving towards a series of pre-defined reference points. This method requires the robot to follow a straight line between its current and target positions. Once the robot reaches the desired accuracy for the current target position, it moves on to the next reference point in the sequence.
The gait pattern most commonly used in snake robot control systems is lateral undulation, described in [31]. The dynamical analysis of various snake robot motion patterns can be found in [32], and the performance of different motion strategies is discussed in [33]. For the lateral undulation pattern, each joint angle $\phi_{i}$ of the robot, where $i \in \{1, \dots, n-1\}$, is controlled using the following equation:
$$\phi_{i}^{*}(t) = \alpha \sin\bigl(\omega t + (i - 1)\delta\bigr) + \phi_{o} \qquad (3)$$
where $\alpha$ and $\omega$ are the amplitude and angular frequency, respectively, of the sinusoidal joint motion, $\delta$ determines the phase shift between the joints, and $\phi_{o}$ is a joint offset, which we assume to be identical for all joints. The joint offset controls the direction of the locomotion and allows the robot to reach the destination point. It is defined as the difference between the heading angle and the heading reference angle, scaled by the gain $k_{\theta}$:
$$\phi_{o} = k_{\theta}\,(\bar{\theta} - \bar{\theta}_{\mathrm{ref}}) \qquad (4)$$
The controller gain $k_{\theta}$ influences the control system's efficiency. The heading angle $\bar{\theta}$ of the snake robot is defined as the average of the link angles:
$$\bar{\theta} = \frac{1}{n} \sum_{i=1}^{n} \theta_{i} \qquad (5)$$
The heading reference angle $\bar{\theta}_{\mathrm{ref}}$ designates the direction to the reference position as follows:
$$\bar{\theta}_{\mathrm{ref}} = \operatorname{atan2}(\Delta y, \Delta x) \qquad (6)$$
where $\Delta x$ and $\Delta y$ are the distances between the robot's head and the destination point along the x- and y-axes of the inertial coordinate system. The joint torque is determined according to the control law
$$u_{i} = k_{p}\,(\phi_{i}^{*} - \phi_{i}) - k_{d}\,\dot{\phi}_{i} \qquad (7)$$
The system performance depends on the controller gains $k_{p}$ and $k_{d}$.
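A minimal MATLAB sketch of the complete controller, Equations (3)–(7), may look as follows; the parameter structure p and all variable names are illustrative assumptions rather than the authors' implementation.

```matlab
% Lateral-undulation path-following controller, Equations (3)-(7).
% theta     : n-by-1 link angles
% phi, dphi : (n-1)-by-1 measured joint angles and their rates
% head      : [x; y] of the robot's head;  target : [x_ref; y_ref]
% p         : struct with fields alpha, omega, delta, kTheta, kp, kd
function u = jointTorques(t, theta, phi, dphi, head, target, p)
    n        = numel(theta);
    thetaBar = mean(theta);                          % heading angle, Eq. (5)
    thetaRef = atan2(target(2) - head(2), ...
                     target(1) - head(1));           % reference heading, Eq. (6)
    phi0     = p.kTheta * (thetaBar - thetaRef);     % joint offset, Eq. (4)
    i        = (1:n-1).';
    phiStar  = p.alpha * sin(p.omega*t + (i-1)*p.delta) + phi0;  % Eq. (3)
    u        = p.kp * (phiStar - phi) - p.kd * dphi; % PD control law, Eq. (7)
end
```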
2.3. MATLAB Simulations
The equations of motion for the snake robot were coded in MATLAB and solved numerically with one of MATLAB's built-in ODE solvers (e.g., ode45).
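As an illustration, the integration can be set up as below; snakeDynamics is a hypothetical wrapper around Equations (1) and (2), and the state layout, tolerances, and parameter struct are assumptions.

```matlab
% Hypothetical simulation setup for the snake robot model.
n     = 10;                            % number of links (Section 3)
tspan = [0, 25];                       % 25 s horizon, as used in the results
x0    = zeros(2*n + 4, 1);             % assumed state: [theta; dtheta; p; dp]
opts  = odeset('RelTol', 1e-6, 'AbsTol', 1e-8);
% params : hypothetical struct of model and controller parameters
[t, x] = ode45(@(t, x) snakeDynamics(t, x, params), tspan, x0, opts);
```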
Two coordinates define the destination position in the inertial coordinate system. A series of points can also be defined; once the robot reaches one of them, it moves on to the next in the sequence, as sketched below. If no subsequent target point is defined, the robot continues to move forward using the last calculated value of the joint offset.
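The waypoint-switching logic just described can be sketched in MATLAB as follows (the function name and the tolerance argument are illustrative):

```matlab
% Advance to the next reference point once the current one is reached.
% head      : [x; y] of the robot's head
% waypoints : m-by-2 list of destination positions
% idx       : index of the active waypoint;  tol : required accuracy
function [target, idx] = updateTarget(head, waypoints, idx, tol)
    target = waypoints(idx, :).';
    if norm(head - target) < tol && idx < size(waypoints, 1)
        idx    = idx + 1;                 % move on to the next point
        target = waypoints(idx, :).';
    end
    % When the last waypoint has been reached, the target is kept, so the
    % robot continues forward with the last calculated joint offset.
end
```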
The friction coefficients in the normal and tangential directions are the primary parameters determining the snake robot's behaviour. A distinct anisotropy of the friction force is necessary for the robot to move forward. The simulation program lets the user define each viscous friction coefficient within a bounded range.
To direct the robot towards a target orientation, we manipulate the joint offset defined in Equation (4). The control algorithm's effectiveness is determined by the controller gain $k_{\theta}$, which can vary between 0 (where path following does not influence the robot's dynamics) and 3.
The simulation generates several plots, such as the robot's trajectory, joint angles, and control signals, as well as graphs depicting the progression of the heading angle and the reference heading angle. Additionally, each segment's resultant position and orientation at discrete simulation intervals are saved to a text file.
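The logging step can be sketched with a plain fprintf loop; the column layout below (time, then the x-position, y-position, and orientation of each segment) is an assumption made for illustration.

```matlab
% Write per-sample segment poses to a text file for the visualisation.
% t : k-by-1 time stamps;  X, Y, TH : k-by-n segment positions/orientations
fid = fopen('snake_poses.txt', 'w');
for k = 1:numel(t)
    fprintf(fid, '%.4f', t(k));
    for i = 1:size(X, 2)
        fprintf(fid, ' %.4f %.4f %.4f', X(k, i), Y(k, i), TH(k, i));
    end
    fprintf(fid, '\n');
end
fclose(fid);
```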
The MATLAB implementation is available on GitHub.
2.4. Visualisation of the Snake Robot Motion
2.4.1. Simulink 3D Animation
Initially, the robot's movement was visualised using the Simulink 3D Animation Toolbox, a MATLAB library that utilises the Virtual Reality Modeling Language (VRML). The segment geometry was imported from an STL file created in SOLIDWORKS. MATLAB calculated the segment positions and orientations, which were saved in a txt file to create the animation. Figure 2 shows that this visualisation provided a deeper understanding of the snake robot's motion from various perspectives. However, manipulating the camera in the program was challenging, and the graphics quality of this solution was relatively poor. Thus, a different visualisation approach was employed for the snake robot.
2.4.2. Three-Dimensional Simulations in Unity
The latest version of the 3D visualisation for the snake robot was developed using Unity, a popular game engine known for its advanced graphical animations. Besides creating three-dimensional and two-dimensional games, Unity is also used in several industries, including film, automotive, architecture, engineering, and construction. Its intuitive editor has drag-and-drop functions and scripting abilities based on the widely used C# language. The Unity engine offers a comprehensive training platform with numerous tutorials, examples, and specialised training paths, making it an ideal choice for this application.
To improve VR software development, a useful tool is the Software Development Kit (SDK), which offers a collection of pre-built and configured interactions for VR projects. The XR Interaction Toolkit, which is one of the most popular free SDKs, is used in our project. Other widely used solutions include the Oculus Interaction SDK and the Windows Mixed Reality Toolkit. The SDK provides script libraries to implement interactions in VR projects, such as grabbing objects, interacting with the user interface, recognising hand gestures, locomotion systems, and physics interactions.
The implemented program allows running simulations on the Oculus Quest 2 headset.
The visualisation program utilises the CAD-designed geometry of the snake robot segments. Using data generated by MATLAB, the program interpolates the position and orientation of the segments in consecutive moments. The Unity scene, as shown in Figure 3, features an immersive environment, lighting, and a camera that tracks the user’s head movement. Additionally, a Graphical User Interface (GUI) allows the user to interact with certain elements of the scene. The user can move around in either continuous or teleportation mode. The GUI contains grabbable and interactable parts, such as the reference position or buttons.
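The pose interpolation performed by the visualisation is conceptually simple; sketched in MATLAB (the actual implementation is a C# script in Unity), it amounts to a linear blend between two logged samples:

```matlab
% Linear interpolation of segment poses between logged samples k and k+1.
% tq lies in [t(k), t(k+1)];  X, Y, TH are the logged k-by-n pose arrays.
a   = (tq - t(k)) / (t(k+1) - t(k));               % blend factor in [0, 1]
xq  = (1 - a) * X(k, :)  + a * X(k+1, :);
yq  = (1 - a) * Y(k, :)  + a * Y(k+1, :);
% interpolate angles through their wrapped difference to avoid 2*pi jumps
dTH = mod(TH(k+1, :) - TH(k, :) + pi, 2*pi) - pi;
thq = TH(k, :) + a * dTH;
```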
There is also an optional grid featuring white marks placed at regular intervals and a red mark that indicates the origin of the Cartesian coordinate system.
One of the main benefits of visualising the snake robot’s movement in VR is the user’s ability to observe it from any distance or perspective. This means the user can change the camera position or move around the virtual scene. To switch to a different pre-defined camera position, the user must press one of the controller buttons, X/Y/A/B, as shown in Figure 4.
There are five sets of camera positions available, each offering a unique perspective:
No button pressed—default stationary camera at an eye-level position and horizontal orientation. This camera observes the GUI in front of the user, and the snake robot is seen from standing height, as if it were moving on the floor.
The B button pressed—the camera faces downwards, perpendicular to the floor. It moves with the robot's centre and rotates to track the robot's heading.
The A button pressed—this camera simulates a camera attached to the robot's head. It moves and rotates along with the first segment of the robot.
The Y button pressed—the camera is stationary and is facing downwards, perpendicular to the floor.
The X button pressed—the camera is in a low position behind the target position. It changes position when the robot reaches subsequent targets.
The user can manipulate the Oculus Quest 2 controllers to control the snake robot’s simulation. The user can increase or decrease the simulation’s speed, pause, or rewind it. The trigger button shown in Figure 5 can be used to stop or rewind the simulation, with the amount of pressure applied to the button determining the magnitude of the rewind. The user can decelerate the simulation by pressing a push button, and the deceleration rate depends on the force applied. Further, haptic feedback (controller vibrations) may occur when reaching a designated position.
The Unity game engine provides a way to display an interactable Graphical User Interface (GUI) in VR. The GUI consists of elements such as sliders, drop-down lists, and push/toggle buttons, and the user interacts with them by casting a ray from the controllers, as shown in Figure 6. In our program, the GUI is stationary and located in front of the default camera position. It enables the user to restart the simulation and change the data source. There are two simulation modes available. In the first “off-line” mode, an application reads previously calculated data from a text file generated by MATLAB. In this mode, the GUI displays a drop-down list with all files uploaded to the application.
In the second, "real-time" mode, the robot's position and orientation are calculated while the visualisation runs in Unity. In this case, the simulation in MATLAB and the VR visualisation run in parallel, and Unity and MATLAB communicate via TCP sockets to exchange information about simulation parameters and trajectories. In "real-time" mode, users can assign the robot's destination point and change simulation parameters in the GUI, such as the friction coefficients in the normal and tangential directions and the path-following gain. The GUI provides sliders to change parameter values within a given range. The application mode can be switched by toggling the "data from file/data from MATLAB" checkbox in the GUI.
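On the MATLAB side, such an exchange can be realised with the built-in tcpclient interface; the port number and the comma-separated message format below are illustrative assumptions, not the application's exact protocol.

```matlab
% MATLAB side of the "real-time" mode (illustrative sketch).
conn = tcpclient("127.0.0.1", 55001);    % assume Unity listens on this port
% receive parameters / destination point set in the VR GUI, e.g. a line
% of the form "ct,cn,kTheta,xRef,yRef"
msg  = readline(conn);
vals = str2double(split(msg, ","));
% ... advance the MATLAB simulation one step with the received values ...
% send the resulting segment poses back to Unity as one text line
writeline(conn, strjoin(string([X(k, :), Y(k, :), TH(k, :)]), ","));
```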
The VR GUI also contains checkboxes that allow users to turn the visualisation of selected physical properties of the robot's motion, such as the segments' velocities and the guidance angles, on and off. A segment's velocity is depicted as a vector attached to the segment's centre and is calculated as the change in the segment's position divided by the data-sampling interval. The user can select which segment's velocity should be displayed with a GUI slider.
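In MATLAB terms, this finite-difference velocity is simply (sel denotes the segment chosen on the slider; all names are illustrative):

```matlab
% Segment velocity as the position change divided by the sampling time.
dt    = t(k+1) - t(k);
vx    = (X(k+1, sel) - X(k, sel)) / dt;
vy    = (Y(k+1, sel) - Y(k, sel)) / dt;
speed = hypot(vx, vy);          % magnitude of the displayed velocity vector
```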
There is a panel shown in Figure 7 attached to the user’s left hand. It shows the current values of selected parameters:
time,
heading angle,
reference heading angle,
position of the snake robot’s head in global coordinates.
Plots generated in MATLAB display:
position of robot head and remaining segments,
heading angle versus reference heading angle,
joint angles,
link angles,
tangential and normal components of segment velocities,
absolute and global components of the snake robot’s head velocity.
The application also provides visualisation of the guidance angles. The heading reference angle $\bar{\theta}_{\mathrm{ref}}$ is displayed as a red line connecting the destination point and the robot's head, measured against the horizontal line. Meanwhile, the heading angle $\bar{\theta}$ is represented by a yellow line.
The VR environment includes a red cylinder representing the location of the destination point that the robot is following. In "off-line" mode, the destination position and the reaching time are saved in a file. In "real-time" mode, the user marks the desired position on the floor with the controller rays and clicks the trigger button to confirm the new position. The application sends the reference point to MATLAB via the TCP connection, and the control algorithm calculates the target trajectory. The calculated positions of the robot's segments are sent back to Unity, and the robot's configuration is displayed in VR. If the user assigns a new target position, the algorithm restarts, and the snake robot returns to the origin. This allows the user to observe the algorithm's performance and compare results for different parameters.
3. Results
For the kinematic analysis of the snake robot, described by the mathematical model in Equation (1), we have carried out numerical studies for the following varying parameters:
friction coefficients in normal and tangential directions;
parameters of the lateral undulation gait pattern, Equation (3): the amplitude $\alpha$, the angular frequency $\omega$, and the phase shift $\delta$;
controller gains: $k_{p}$ and $k_{d}$, described in the control law, Equation (7), and $k_{\theta}$, described in the joint offset, Equation (4).
For all analyses, we have assumed the same geometrical parameters of the robot:
number of robot segments: $n = 10$;
the segment mass $m$;
the segment length $2l$;
the moment of inertia $J$.
3.1. Friction Coefficients
The anisotropy of the friction coefficients was introduced in Equation (2), where $c_{t}$ is the friction coefficient in the tangential direction and $c_{n}$ in the normal direction. During the studies, we analysed four cases:
$c_{t} \ll c_{n}$ (strong anisotropy);
$c_{t} < c_{n}$ (weaker anisotropy);
$c_{t} = c_{n}$ (isotropic friction);
$c_{t} > c_{n}$ (inverted anisotropy).
All simulations in this subsection were performed for the fixed gait-pattern parameters
(8)
and the controller gains
(9)
3.1.1. Robot’s Configurations for Different Friction Coefficients
Figure 8, Figure 9, Figure 10 and Figure 11 display different sets of parameters and their effects on the robot’s configurations. Each figure contains three screenshots taken from the VR application, with one plot per figure.
The MATLAB-generated plot displays the positions of the robot's head (thick blue line) and its remaining segments (dotted lines) on the x- and y-axes. The robot's configurations at $t = 5$ s, $t = 10$ s, and $t = 24$ s are marked on the plot with pentagrams, circles, and triangles, respectively. Moreover, the bottom plot shows the position of the robot's head along the x-axis, while the left plot shows its position along the y-axis.
Screenshots were taken in VR at the 5th, 10th, and 24th seconds of the simulation. The camera used for these shots was fixed and faced downwards, perpendicular to the floor, corresponding to the Y-button view shown in Figure 4. A grid representing the coordinate system was displayed in the virtual scene.
Figure 10 indicates that the robot requires anisotropy of the friction coefficients to move. When $c_{t} = c_{n}$, the robot seems to slide on the surface without any forward motion. However, as shown in Figure 8 and Figure 9, when $c_{t} < c_{n}$, the robot slides forward, and the displacement is greater when there is a larger difference between the coefficients. On the other hand, as depicted in Figure 11, when $c_{t} > c_{n}$, the snake moves backwards, and the head's trajectory is a polygonal chain rather than a sinusoidal curve.
3.1.2. Analysis of Linear Velocities of the Robot's Segments
Figure 12 displays the velocity of the snake robot's head. The plot contains three lines: the yellow line represents the absolute velocity value, the red line corresponds to the global y-component of the velocity, and the blue line shows the x-component. Markers on the plot indicate the velocity values at three points in time, $t = 5$ s, $t = 10$ s, and $t = 24$ s.
The oscillations of the velocity along the y-axis of the global coordinate system look similar in the first three simulations, with comparable amplitudes and periods, whereas the amplitude of oscillation is considerably lower in the fourth simulation. The velocity along the x-axis of the global coordinate system has a mean value of zero in the third simulation, resulting in no forward movement of the robot. In the first and second simulations, the mean value of the x-axis velocity is positive, causing the robot to move forward; the resultant velocity is significantly higher in the first case. Finally, in the fourth simulation, the x-axis velocity oscillates around a negative value, and the robot moves backwards.
Figure 13, Figure 14, Figure 15 and Figure 16 depict the velocities of each robot segment, with the yellow line indicating the velocity magnitude. The blue line represents the tangential component of velocity, which is aligned with the x-axis of the local coordinate system of each segment, and the red line represents the normal component of velocity, which is aligned with the y-axis of the local coordinate system of each segment.
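The decomposition used in these figures is a projection of each segment's global velocity onto its link axes; in MATLAB terms (element-wise over the n segments, with illustrative names):

```matlab
% Tangential/normal decomposition of segment velocities.
% theta : n-by-1 link angles;  vx, vy : n-by-1 global segment velocities
vt   = vx .* cos(theta) + vy .* sin(theta);   % along the link (blue lines)
vn   = -vx .* sin(theta) + vy .* cos(theta);  % perpendicular to it (red lines)
vAbs = hypot(vx, vy);                         % absolute value (yellow lines)
```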
Figure 13, Figure 14, Figure 15 and Figure 16 provide insights into how the robot's segments move: whether they slide longitudinally, along the segment's length, or laterally, perpendicular to the segment. In the first two simulations (Figure 13 and Figure 14), segments 1, 2, 9, and 10 move primarily tangentially, while segments 4, 5, and 6 move predominantly in the normal direction. In the third and fourth simulations, the normal velocity direction dominates for all segments.
The simulation results are also presented in Figure 17, Figure 18, Figure 19 and Figure 20. In each, panel (a) displays the absolute velocity value of each segment (top plot) and the ratio of the tangential velocity to the absolute velocity value (bottom plot). The VR screenshots illustrate the robot's configuration, with arrows representing the velocity vectors of each segment at various simulation time points.
The results obtained support the previous findings. As seen in Figure 17 and Figure 18, the arrows are primarily oriented tangentially for segments 1, 2, 9, and 10 and perpendicularly for segments 4, 5, and 6. In contrast, the arrows in Figure 19 and Figure 20 are predominantly normal for all segments.
3.2. Parameters of Gait Pattern
The motion of the snake robot is characterised by the lateral undulation pattern defined by Equation (3) and determined by the amplitude $\alpha$, the angular frequency $\omega$, and the phase shift $\delta$. The following subsections examine the impact of each of these parameters on the robot's motion. In all simulations, the friction coefficients $c_{t}$ and $c_{n}$ and the controller gains $k_{p}$ and $k_{d}$ were kept at the same fixed values.
3.2.1. Amplitude
We conducted a series of simulations with a fixed angular frequency $\omega$, a fixed phase shift $\delta$, and a sequence of four increasing amplitudes $\alpha$. Figure 21, Figure 22, Figure 23 and Figure 24 display the robot trajectories, which indicate that increasing the amplitude results in a more pronounced bending of the robot's body and a significantly greater distance travelled by the robot, as also observed in the VR screenshots.
Figure 25 depicts plots of the link angles (top-left), joint angles (top-right), link angle angular velocities (bottom-left), and joint angle angular velocities (bottom-right) for each amplitude. The thick line represents the mean angle value. Larger amplitudes $\alpha$ correspond to larger link and joint angles, and the derivatives of the angles also increase with larger amplitudes.
Figure 26 shows the velocity of the robot's head. The absolute value of the velocity is marked by the yellow line, the component along the x-axis of the global coordinate system by the blue line, and the component along the y-axis by the red line. The robot's velocity increased with larger $\alpha$; however, there is no significant difference between the two largest amplitudes.
Figure 27 shows the torques applied to the robot’s joints. They increased linearly with rising amplitude.
The simulation results are gathered in Table 1. Column $d$ indicates the distance travelled by the robot's head after 25 s of simulation; $\tau_{\max}$ is the maximum torque applied to the robot's joints; $V$ is the velocity of the robot's head in the 25th second of the simulation; $\theta_{\max}$ and $\phi_{\max}$ are the maximum values of the link and joint angles; finally, $\dot{\theta}_{\max}$ and $\dot{\phi}_{\max}$ are the maximum values of the angle derivatives. While calculating columns 5 and 7–10, we considered only the system's steady state, after the 200th simulation sample.
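These metrics can be computed directly from the logged signals; a minimal sketch, assuming the torque and angle histories are stored row-wise per sample under illustrative names, with the steady state starting at sample 200:

```matlab
% Illustrative computation of the metrics gathered in Tables 1-4.
d      = hypot(X(end, 1) - X(1, 1), Y(end, 1) - Y(1, 1));  % head displacement
ss     = 200:size(U, 1);                       % steady-state sample range
tauMax = max(abs(U(ss, :)), [], 'all');        % peak joint torque
V      = hypot(vxHead(end), vyHead(end));      % final head speed
thMax  = max(abs(TH(ss, :)),  [], 'all');      % maximum link angle
phMax  = max(abs(PHI(ss, :)), [], 'all');      % maximum joint angle
```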
3.2.2. Angular Frequency
The next series of simulations investigated the influence of the angular frequency on the snake's motion. The simulations were performed for a fixed amplitude $\alpha$, a fixed phase shift $\delta$, and a sequence of four increasing frequencies $\omega$. Figure 23, Figure 28, Figure 29 and Figure 30 show the change in the robot's shape: for smaller $\omega$, the robot's body is more curved and bent. Figure 31 shows the changes in the robot's angles and their derivatives. For larger frequencies, the values of the angles decrease while their derivatives increase; nevertheless, the difference in the derivatives is not as significant as for the different amplitudes shown in Figure 25. There is no significant influence of the frequency on the velocity of the snake's head, as shown in Figure 32. An increase in angular frequency strongly influences the values of the torques in the joints, as shown in Figure 33. The simulation results for the different frequencies are gathered in Table 2.
3.2.3. Phase Shift
Analogous experiments were performed for a fixed amplitude $\alpha$, a fixed angular frequency $\omega$, and a sequence of four different phase shifts $\delta$. As we can see from Figure 29, Figure 34, Figure 35 and Figure 36, the robot's shape stays the same, but the movement direction changes. The values of the joint angles, the derivatives of the joint angles, and the torques remain the same for all phase-shift values (see Figure 37, Figure 38 and Figure 39, and Table 3).
3.3. Controller Gains
This section presents the results of simulations for different controller gains $k_{p}$ and $k_{d}$. The simulation results are collected in Table 4. All simulations were performed for fixed friction coefficients and fixed gait-pattern parameters $\alpha$, $\omega$, and $\delta$.
The distance travelled by the robot increases with the gain $k_{p}$, whereas the relation between the distance and $k_{d}$ is not so obvious. For some combinations of the gains, the robot behaves chaotically. The required torque appears to increase with rising $k_{p}$ and to decrease with $k_{d}$. There is no unambiguous trend between the gain coefficients and the joint and link angles.
3.4. Path Following Coefficient
The last parameter considered in our studies was the path-following gain $k_{\theta}$, which allows the robot to follow a destination point. For these simulations, we fixed the friction coefficients, the gait-pattern parameters, and the controller gains $k_{p}$ and $k_{d}$, placed the destination point at fixed coordinates, and performed the simulations for six values of $k_{\theta}$. Larger gains increase the distance travelled by the robot and improve the tracking of the reference heading angle, as shown in Figure 40 and Figure 41. However, the accompanying significant increase in the control signals, shown in Figure 42, and in the joint angles, shown in Figure 43, can violate design constraints. Values of $k_{\theta}$ above one cause unacceptable torque values and can destroy the robot's mechanism. The optimal solution is to assign a path-following coefficient larger than 0.5, to ensure path following, but smaller than 1.
4. Discussion
Our research aimed to introduce a novel method of studying robotic multi-body systems utilising Virtual Reality technology. By integrating VR visualisation in Unity and simulation in MATLAB, we could conduct a series of real-time, interactive simulations of the motion of a snake robot. We could adjust the robot’s parameters, designate destination points using VR controllers and GUI, and observe an animation illustrating the robot’s behaviour. The VR environment allowed us to follow the motion from multiple perspectives, pause or rewind the simulation, analyse the velocities and heading angles of different segments, and compare the animations with the plots produced in MATLAB.
Our research generated a collection of experiments demonstrating how various parameters affect the robot's motion. The necessary condition for the movement of a snake robot is the anisotropy of the frictional force; the conducted studies show that the robot moves faster with a greater difference between the friction coefficients in the tangential and normal directions. Increasing the amplitude of the gait pattern causes the robot to move faster, but it may lead to a violation of joint constraints. The angular frequency mainly affects the joint torques, while the phase shift determines the robot's direction of motion. Choosing optimal controller gains $k_{p}$ and $k_{d}$ is a difficult task that requires further investigation for different scenarios. According to our studies, the optimal value of the path-following coefficient $k_{\theta}$ is between 0.5 and 1.
In the research, we utilised the snake robot as a representative example of a multi-body dynamical system featuring non-linearity. Our study demonstrates that even minor alterations to the parameters or controller gains can result in significant variations in the robot’s performance. Identifying the optimal set of parameters to maximise the robot’s performance under diverse environmental conditions and destination positions is challenging. Hence, our research emphasises the implementation of advanced control algorithms, such as Model Predictive Control or Sliding Mode Control, to operate the snake robot. We will conduct VR-supported studies to test and evaluate the efficacy of these control algorithms.
5. Conclusions
Virtual Reality is a cutting-edge technology that allows users to experience and interact with virtual environments. It has applications in entertainment, education, and socialisation, and it has immense potential to revolutionise research in the engineering field. A pioneering application of VR to visualising snake robot motion is presented in this article; this application could significantly influence the development of control algorithms for robotic systems.
At the moment, we are developing sophisticated control algorithms for the snake robot. These will enhance the VR application by visualising abstract control properties and extending its functionality.
To this end, the integration of VR, Unity, and MATLAB can be utilised to conduct realistic simulations and to design digital twins of mechanical systems. Several potential directions for future research have been identified based on the findings of this study. First, this integration will enable a synesthetic approach to enhancing the control system design methodology; it would be worthwhile to explore the best ways of presenting information to improve the engineering understanding of control systems. Building on the current research, novel visual evaluation methods for mechanical control systems based on augmented reality can also be developed.
In the specific case of evaluating snake robot controllers, it is possible to use VR to visualise additional physical parameters of the robot during motion, such as friction forces, joint torques, and link angle errors. Furthermore, ongoing investigations evaluate the use of Model Predictive Control (MPC) for the snake robot in various control scenarios involving obstacles or joint failures; ghost robots can be displayed to indicate the predicted positions of the robot.
Conceptualisation, A.S.-M. and A.O.; methodology, A.S.-M.; software, A.S.-M. and A.H.; validation, A.S.-M., A.H., K.S. and J.M.; formal analysis, A.O.; investigation, A.S.-M. and A.H.; resources, A.S.-M., A.O. and K.S.; data curation, A.S.-M. and J.M.; writing—original draft preparation, A.S.-M.; writing—review and editing, A.S.-M., A.H., K.S. and J.M; visualisation, A.S.-M.; supervision, A.O. and J.M.; project administration, A.S.-M. and J.M.; funding acquisition, A.S.-M. and K.S. All authors have read and agreed to the published version of the manuscript.
The authors declare no conflict of interest.
Figure 1. Assignment of (a) joint and link angles; (b) friction force in the normal and tangential directions.
Figure 5. Oculus Quest 2 controllers with buttons used to control the simulation speed.
Figure 6. Graphical User Interface and controllers ray allowing interaction with GUI.
Figure 7. Screen from VR application showing a panel with simulation parameters and plots generated in MATLAB.
Figure 8. Trajectories of the robot segments for the first friction case ($c_{t} \ll c_{n}$) (a) and robot configurations at $t = 5$ s (b), $t = 10$ s (c), and $t = 24$ s (d).
Figure 9. Trajectories of the robot segments for the second friction case ($c_{t} < c_{n}$) (a) and robot configurations at $t = 5$ s (b), $t = 10$ s (c), and $t = 24$ s (d).
Figure 10. Trajectories of the robot segments for the third friction case ($c_{t} = c_{n}$) (a) and robot configurations at $t = 5$ s (b), $t = 10$ s (c), and $t = 24$ s (d).
Figure 11. Trajectories of the robot segments for the fourth friction case ($c_{t} > c_{n}$) (a) and robot configurations at $t = 5$ s (b), $t = 10$ s (c), and $t = 24$ s (d).
Figure 12. Velocity of the snake robot's head for the four friction cases (a–d).
Figure 13. Velocities of the snake robot's segments for the first friction case (panels (a) and (b)).
Figure 14. Velocities of the snake robot's segments for the second friction case (panels (a) and (b)).
Figure 15. Velocities of the snake robot's segments for the third friction case (panels (a) and (b)).
Figure 16. Velocities of the snake robot's segments for the fourth friction case (panels (a) and (b)).
Figure 17. Absolute value and tangential component of the segment velocities for the first friction case (a) and velocity vectors at three time instants (b–d).
Figure 18. Absolute value and tangential component of the segment velocities for the second friction case (a) and velocity vectors at three time instants (b–d).
Figure 19. Absolute value and tangential component of the segment velocities for the third friction case (a) and velocity vectors at three time instants (b–d).
Figure 20. Absolute value and tangential component of the segment velocities for the fourth friction case (a) and velocity vectors at three time instants (b–d).
Figure 21. Trajectories of the robot segments for the first amplitude value (a) and robot configurations at three time instants (b–d).
Figure 22. Trajectories of the robot segments for the second amplitude value (a) and robot configurations at three time instants (b–d).
Figure 23. Trajectories of the robot segments for the third amplitude value (a) and robot configurations at three time instants (b–d).
Figure 24. Trajectories of the robot segments for the fourth amplitude value (a) and robot configurations at three time instants (b–d).
Figure 25. Values of the link angles, joint angles, link angle derivatives, and joint angle derivatives for the four amplitude values (a–d).
Figure 26. Velocity of the snake robot's head for the four amplitude values (a–d).
Figure 27. Torque in the robot's joints for the four amplitude values (a–d).
Figure 28. Trajectories of the robot segments for the second angular frequency (a) and robot configurations at three time instants (b–d).
Figure 29. Trajectories of the robot segments for the third angular frequency (a) and robot configurations at three time instants (b–d).
Figure 30. Trajectories of the robot segments for the fourth angular frequency (a) and robot configurations at three time instants (b–d).
Figure 31. Values of the link angles, joint angles, link angle derivatives, and joint angle derivatives for the four angular frequencies (a–d).
Figure 32. Velocity of the snake robot's head for the four angular frequencies (a–d).
Figure 33. Torque in the robot's joints for the four angular frequencies (a–d).
Figure 34. Trajectories of the robot segments for the second phase shift (a) and robot configurations at three time instants (b–d).
Figure 35. Trajectories of the robot segments for the third phase shift (a) and robot configurations at three time instants (b–d).
Figure 36. Trajectories of the robot segments for the fourth phase shift (a) and robot configurations at three time instants (b–d).
Figure 37. Values of the link angles, joint angles, link angle derivatives, and joint angle derivatives for the four phase shifts (a–d).
Figure 38. Velocity of the snake robot's head for the four phase shifts (a–d).
Figure 39. Torque in the robot's joints for the four phase shifts (a–d).
Figure 40. Trajectories of the robot segments for the six values of the path-following gain $k_{\theta}$ (a–f).
Figure 41. Heading angle and reference heading angle for the six values of $k_{\theta}$ (a–f).
Figure 42. Torque in the robot's joints for the six values of $k_{\theta}$ (a–f).
Figure 43. Values of the link angles, joint angles, link angle derivatives, and joint angle derivatives for the six values of $k_{\theta}$ (a–f).
Table 1. Simulation results for different amplitudes $\alpha$. Columns: $\alpha$, the travelled distance $d$, the maximum joint torque $\tau_{\max}$, the final head velocity $V$, the maximum link and joint angles $\theta_{\max}$ and $\phi_{\max}$, and their maximum derivatives $\dot{\theta}_{\max}$ and $\dot{\phi}_{\max}$. [Table values omitted.]
Table 2. Simulation results for different angular frequencies $\omega$. Columns: $\omega$, the travelled distance $d$, the maximum joint torque $\tau_{\max}$, the final head velocity $V$, the maximum link and joint angles $\theta_{\max}$ and $\phi_{\max}$, and their maximum derivatives $\dot{\theta}_{\max}$ and $\dot{\phi}_{\max}$. [Table values omitted.]
Table 3. Simulation results for different phase shifts $\delta$. Columns: $\delta$, the travelled distance $d$, the maximum joint torque $\tau_{\max}$, the final head velocity $V$, the maximum link and joint angles $\theta_{\max}$ and $\phi_{\max}$, and their maximum derivatives $\dot{\theta}_{\max}$ and $\dot{\phi}_{\max}$. [Table values omitted.]
Table 4. Simulation results for different controller gains $k_{p}$ and $k_{d}$, varied over combinations of the values 1, 3, and 5. Columns: $k_{p}$, $k_{d}$, the travelled distance $d$, the maximum joint torque $\tau_{\max}$, the final head velocity $V$, the maximum link and joint angles $\theta_{\max}$ and $\phi_{\max}$, and their maximum derivatives. [Table values omitted.]
References
1. Hirose, S. Biologically Inspired Robots: Snake-Like Locomotors and Manipulators; Oxford University Press: Oxford, UK, 1993.
2. Yun, H.; Jun, M.B. Immersive and interactive cyber-physical system (I2CPS) and virtual reality interface for human involved robotic manufacturing. J. Manuf. Syst.; 2022; 62, pp. 234-248. [DOI: https://dx.doi.org/10.1016/j.jmsy.2021.11.018]
3. Pérez, L.; Diez, E.; Usamentiaga, R.; García, D.F. Industrial robot control and operator training using virtual reality interfaces. Comput. Ind.; 2019; 109, pp. 114-120. [DOI: https://dx.doi.org/10.1016/j.compind.2019.05.001]
4. Yap, H.J.; Taha, Z.; Md Dawal, S.Z.; Chang, S.W. Virtual reality based support system for layout planning and programming of an industrial robotic work cell. PLoS ONE; 2014; 9, e109692. [DOI: https://dx.doi.org/10.1371/journal.pone.0109692]
5. Moglia, A.; Ferrari, V.; Morelli, L.; Ferrari, M.; Mosca, F.; Cuschieri, A. A systematic review of virtual reality simulators for robot-assisted surgery. Eur. Urol.; 2016; 69, pp. 1065-1080. [DOI: https://dx.doi.org/10.1016/j.eururo.2015.09.021] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/26433570]
6. Costa, G.d.M.; Petry, M.R.; Moreira, A.P. Augmented Reality for Human–Robot Collaboration and Cooperation in Industrial Applications: A Systematic Literature Review. Sensors; 2022; 22, 2725. [DOI: https://dx.doi.org/10.3390/s22072725] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/35408339]
7. Chong, J.W.S.; Ong, S.; Nee, A.Y.; Youcef-Toumi, K. Robot programming using augmented reality: An interactive method for planning collision-free paths. Robot.-Comput.-Integr. Manuf.; 2009; 25, pp. 689-701. [DOI: https://dx.doi.org/10.1016/j.rcim.2008.05.002]
8. Pan, Z.; Polden, J.; Larkin, N.; Van Duin, S.; Norrish, J. Recent progress on programming methods for industrial robots. Robot.-Comput.-Integr. Manuf.; 2012; 28, pp. 87-94. [DOI: https://dx.doi.org/10.1016/j.rcim.2011.08.004]
9. Santos Garduño, H.A.; Esparza Martínez, M.I.; Portuguez Castro, M. Impact of virtual reality on student motivation in a High School Science Course. Appl. Sci.; 2021; 11, 9516. [DOI: https://dx.doi.org/10.3390/app11209516]
10. Kuhail, M.A.; ElSayary, A.; Farooq, S.; Alghamdi, A. Exploring Immersive Learning Experiences: A Survey. Informatics; 2022; 9, 75.
11. Monita, F.; Ikhsan, J. Development Virtual Reality IPA (VR-IPA) learning media for science learning. J. Phys. Conf. Ser.; 2020; 1440, 012103.
12. Kavanagh, S.; Luxton-Reilly, A.; Wuensche, B.; Plimmer, B. A systematic review of virtual reality in education. Themes Sci. Technol. Educ.; 2017; 10, pp. 85-119.
13. de Barros Correia, F.L.; Moreno, U.F. Decentralized formation tracking for groups of mobile robots with consensus and mpc. IFAC-PapersOnLine; 2015; 48, pp. 274-279. [DOI: https://dx.doi.org/10.1016/j.ifacol.2015.12.045]
14. Gonzalez, E.J.; Chase, E.D.; Kotipalli, P.; Follmer, S. A Model Predictive Control Approach for Reach Redirection in Virtual Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems; New Orleans, LA, USA, 29 April–5 May 2022; pp. 1-15.
15. Griffiths, H.; Shen, H.; Li, N.; Rojas, S.; Perkins, N.; Liu, M. Vineyard management in virtual reality: Autonomous control of a transformable drone. Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II. SPIE; Anaheim, CA, USA, 10–11 April 2017; Volume 10218, pp. 95-102.
16. Manju, P.; Pooja, D.; Dutt, V. Drones in smart cities. AI and IoT-Based Intelligent Automation in Robotics; Dubey, A.K.; Kumar, A.; Kumar, S.R.; Gayathri, N.; Das, P. Scrivener Publishing LLC: Beverly, MA, USA, 2021; pp. 205-228.
17. Wonsick, M.; Padir, T. A systematic review of virtual reality interfaces for controlling and interacting with robots. Appl. Sci.; 2020; 10, 9051. [DOI: https://dx.doi.org/10.3390/app10249051]
18. Al-Sada, M.; Jiang, K.; Ranade, S.; Kalkattawi, M.; Nakajima, T. HapticSnakes: Multi-haptic feedback wearable robots for immersive virtual reality. Virtual Real.; 2020; 24, pp. 191-209. [DOI: https://dx.doi.org/10.1007/s10055-019-00404-x]
19. Makhataeva, Z.; Varol, H.A. Augmented Reality for Robotics: A Review. Robotics; 2020; 9, 21. [DOI: https://dx.doi.org/10.3390/robotics9020021]
20. Eswaran, M.; Bahubalendruni, M.R. Challenges and opportunities on AR/VR technologies for manufacturing systems in the context of industry 4.0: A state of the art review. J. Manuf. Syst.; 2022; 65, pp. 260-278. [DOI: https://dx.doi.org/10.1016/j.jmsy.2022.09.016]
21. Sonawani, S.; Amor, H. When And Where Are You Going? A Mixed-Reality Framework for Human Robot Collaboration. Proceedings of the 5th International Workshop on Virtual, Augmented, and Mixed Reality for HRI; Online, 7 March 2022.
22. Liljebäck, P.; Pettersen, K.Y.; Stavdahl, Ø.; Gravdahl, J.T. Snake Robots: Modelling, Mechatronics, and Control; Springer: Berlin/Heidelberg, Germany, 2013.
23. Pettersen, K.Y. Snake robots. Annu. Rev. Control; 2017; 44, pp. 19-44. [DOI: https://dx.doi.org/10.1016/j.arcontrol.2017.09.006]
24. Enner, F.; Rollinson, D.; Choset, H. Simplified motion modeling for snake robots. Proceedings of the 2012 IEEE International Conference on Robotics and Automation; St Paul, MN, USA, 14–18 May 2012; pp. 4216-4221.
25. Baysal, Y.A.; Altas, I.H. Modelling and simulation of a wheel-less snake robot. Proceedings of the 2020 7th International Conference on Electrical and Electronics Engineering (ICEEE); Bandung, Indonesia, 23–24 September 2020; pp. 285-289.
26. Xiu, Y.; Deng, H.; Li, D.; Zhang, M.; Law, R.; Huang, Y.; Wu, E.Q.; Xu, X. Finite-Time Sideslip Differentiator-Based LOS Guidance for Robust Path Following of Snake Robots. IEEE/CAA J. Autom. Sin.; 2023; 10, pp. 239-253. [DOI: https://dx.doi.org/10.1109/JAS.2022.106052]
27. Baysal, Y.A.; Altas, I.H. Optimally efficient locomotion of snake robot. Proceedings of the 2020 International Conference on Innovations in Intelligent SysTems and Applications (INISTA); Novi Sad, Serbia, 24–26 August 2020; pp. 1-6.
28. Nonhoff, M.; Köhler, P.N.; Kohl, A.M.; Pettersen, K.Y.; Allgöwer, F. Economic model predictive control for snake robot locomotion. Proceedings of the 2019 IEEE 58th Conference on Decision and Control (CDC); Nice, France, 11 December 2019; pp. 8329-8334.
29. Saito, M.; Fukaya, M.; Iwasaki, T. Modeling, analysis, and synthesis of serpentine locomotion with a multilink robotic snake. IEEE Control Syst. Mag.; 2002; 22, pp. 64-81.
30. Kelasidi, E.; Liljebäck, P.; Pettersen, K.Y.; Gravdahl, J.T. Integral line-of-sight guidance for path following control of underwater snake robots: Theory and experiments. IEEE Trans. Robot.; 2017; 33, pp. 610-628. [DOI: https://dx.doi.org/10.1109/TRO.2017.2651119]
31. Liljebäck, P.; Pettersen, K.Y.; Stavdahl, Ø.; Gravdahl, J.T. Lateral undulation of snake robots: A simplified model and fundamental properties. Robotica; 2013; 31, pp. 1005-1036. [DOI: https://dx.doi.org/10.1017/S0263574713000295]
32. Ariizumi, R.; Matsuno, F. Dynamic analysis of three snake robot gaits. IEEE Trans. Robot.; 2017; 33, pp. 1075-1087. [DOI: https://dx.doi.org/10.1109/TRO.2017.2704581]
33. Branyan, C.; Menğüç, Y. Soft snake robots: Investigating the effects of gait parameters on locomotion in complex terrains. Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); Madrid, Spain, 1–5 October 2018; pp. 1-9.
34. Sibilska-Mroziewicz, A.; Możaryn, J.; Hameed, A.; Fernández, M.M.; Ordys, A. Framework for simulation-based control design evaluation for a snake robot as an example of a multibody robotic system. Multibody Syst. Dyn.; 2022; 55, pp. 375-397. [DOI: https://dx.doi.org/10.1007/s11044-022-09830-3]
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
In this article, we present a novel approach to performing engineering simulation in an interactive environment. A synesthetic design approach is employed, which enables the user to gather information about the system’s behaviour more holistically, at the same time as facilitating interaction with the simulated system. The system considered in this work is a snake robot moving on a flat surface. The dynamic simulation of the robot’s movement is realised in dedicated engineering software, whereas this software exchanges information with the 3D visualisation software and a Virtual Reality (VR) headset. Several simulation scenarios have been presented, comparing the proposed method with standard ways for visualising the robot’s motion, such as 2D plots and 3D animations on a computer screen. This illustrates how, in the engineering context, this more immersive experience, allowing the viewer to observe the simulation results and modify the simulation parameters within the VR environment, can facilitate the analysis and design of systems.
1 Institute of Micromechanics and Photonics, Department of Mechatronics, Warsaw University of Technology, 02-525 Warsaw, Poland
2 Institute of Automatic Control and Robotics, Department of Mechatronics, Warsaw University of Technology, 02-525 Warsaw, Poland
3 Air Force Institute of Technology, 01-494 Warsaw, Poland