1. Introduction
Mobility impairments can arise from diseases such as stroke and multiple sclerosis [1] and from muscle weakness caused by aging [2]. In extreme cases, lower limb amputations also pose challenges to ambulation [3]. Over the years, using a wheelchair has become a common practice among people with walking impairments [4]. Modern-day wheelchairs offer sophisticated features that reduce the functional limitations of their users in most service tasks [5]. However, the underlying factors that improve user engagement when using wheelchairs for activities of daily living (ADL) are still to be explored [6, 7]. With the growing popularity of adding peripherals such as wheelchair trays and robotic arms to wheelchairs, focusing on user preferences and prior experiences is crucial for enhancing user engagement [8, 9, 10, 11, 12, 13]. Such peripherals are also expected to enhance the self-reliance of wheelchair users and to make object handling accessible to a broad population with different motor functionalities [5].
Object manipulation for wheelchair users comprises three main components: (1) navigating toward the object, (2) picking the object, and (3) placing the object. Each component can have one or more subcomponents depending on the complexity of the controlling algorithm. In the literature, researchers often consider only one or two of these components when designing controllers for wheelchairs with wheelchair-mounted robotic arms (WMRAs) [14, 15, 16, 17, 18, 19]. However, these algorithms may not be accessible to individuals who lack certain motor functionalities such as hand movements. Hence, researchers have incorporated additional modalities into object handling algorithms [12, 20, 21]. Furthermore, combining all three of the above-mentioned components into a fully autonomous object handling algorithm for WMRAs presents its own challenge: a recent study revealed that wheelchair users prefer to remain interactive during the object handling process rather than rely on a fully autonomous system [22]. It is therefore apparent that an object manipulation algorithm should balance automation of all three components against user interaction during the process in order to improve user engagement.
In this paper, a semiautonomous object manipulation algorithm is proposed, comprising navigating toward objects, picking them up, and arranging them on a wheelchair tray. A human study was conducted to gather information on user preferences, and incorporating intelligence based on these preferences into the object manipulation algorithm is expected to enhance user engagement. Additionally, electromyography (EMG) signals from the left and right sternocleidomastoid muscles and the biceps muscles of both arms allow wheelchair users to inject their intentions into certain decision-making steps, making it a human-in-the-loop (HITL) system. Enhancements to the previously proposed EMG-based wheelchair navigation algorithm [23] laid the foundation for the proposed algorithm. Finally, the entire system was modeled in the CoppeliaSim simulation environment [24] to evaluate the effectiveness of the proposed system.
2. Preliminary Human Study on Object Placement
To gather data on object placement locations and design a model that reflects user preferences, a human study was conducted. This experiment consisted of three studies. The first two studies focused on situation-based object placement, examining how wheelchair users arrange objects on a wheelchair tray based on their handedness in different scenarios. The third study aimed to gather information on how individuals place objects when given in pairs. The underlying hypothesis of the third study was that participants would consider the individual attributes of each object when arranging them.
2.1. Participant Information
Twenty healthy participants (mean age, 29 years) took part in the study, including both right-handed and left-handed individuals.
2.2. Experimental Setup
The experimental setup consisted of a powered electric wheelchair (Jazzy Air, Power Mobility), a personal computer, a wheelchair tray measuring 53 cm along one edge, and an overhead camera used to record the final object arrangements (see Figure 1).
[figure(s) omitted; refer to PDF]
2.3. Situation-Based Object Placement: Study 1 and Study 2
In this experiment, participants were instructed to sit in the wheelchair and arrange the provided objects on the wheelchair tray according to their preferences. Each object used in the experiment was marked with a red marker for identification. Study 1 simulated a working/studying scenario, while Study 2 simulated a dining scenario. Participants were prompted to envision these situations while arranging the objects on the tray. Depending on the scenario, participants received different sets of objects. Study 1 included the pen, tablet, mobile phone, cup, and book, while Study 2 included the spoon and plate instead of the pen, tablet, and book used in Study 1. Each of the 20 participants completed the activity 10 times for each scenario. A 5-min break was allocated between each trial of the respective study.
2.4. Object Attribute-Driven Object Placement: Study 3
The third study aimed to investigate how wheelchair users arrange objects on the wheelchair tray based on the attributes of each object. Each object used in Studies 1 and 2 possessed unique attributes distinguishing it from the others. For instance, a mobile phone is an electronic device typically kept away from surfaces prone to liquid spills. To gain insight into how wheelchair users would arrange these objects while considering their unique attributes, pairs of objects were presented to 10 participants who had already taken part in Studies 1 and 2 (five right-handed and five left-handed individuals; aged 28 ± 2 years, mean ± SD).
2.5. Data Analysis
After the objects were placed on the wheelchair tray, an image of the arrangement was captured using an overhead camera (see Figure 1). This image was then analyzed using the MATLAB image processing toolbox to determine the locations of the red markers relative to the boundaries of the wheelchair tray (see Figure 2). Images from each trial of Studies 1 and 2 were analyzed to identify the individual object locations in relation to the tray boundaries. Subsequently, the K-means clustering algorithm was applied to identify and label clusters in the recorded data. Before this analysis, the preference dataset was carefully preprocessed to filter out outliers.
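To make the pipeline concrete, the sketch below reproduces the two analysis steps (red-marker localization and clustering). The paper used the MATLAB image processing toolbox; this Python version with OpenCV and scikit-learn is an illustrative reconstruction only, and the color thresholds, blob-size cutoff, file names, and number of clusters are all assumptions.

```python
# Sketch of the marker-localization and clustering pipeline described above.
# The paper used MATLAB; this OpenCV/scikit-learn version is a reconstruction,
# and all thresholds and file names are assumptions.
import cv2
import numpy as np
from sklearn.cluster import KMeans

def locate_red_markers(image_path):
    """Return (x, y) centroids of red markers in the image, in pixels."""
    img = cv2.imread(image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so two ranges are combined (assumed values).
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    # Skip label 0 (background) and drop tiny blobs that are just noise.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] > 50]

# Pool one object's marker positions across trials (assuming one marker per
# image here), then cluster; e.g., one cluster per handedness group.
positions = np.array([locate_red_markers(f"trial_{i}.png")[0] for i in range(10)])
kmeans = KMeans(n_clusters=2, n_init=10).fit(positions)
print("Cluster centroids used as placement targets:", kmeans.cluster_centers_)
```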
[figure(s) omitted; refer to PDF]
The primary objective of the human study was to collect essential insights into user preferences for arranging objects and how user handedness influenced these preferences, with the goal of implementing an intelligent object manipulation algorithm for the WMRA. Centroids of the identified clusters were computed to facilitate the implementation of the algorithm for arranging objects in both working and dining environments. Furthermore, images from Study 3 were analyzed to calculate the mean distance between each object pair. These mean distances were subsequently utilized to establish the distances between objects when arranged in pairs. A summary of the human study is depicted in Figure 3.
[figure(s) omitted; refer to PDF]
3. Implementation in Simulation Environment
The intelligent object manipulation algorithm was implemented in the CoppeliaSim simulation environment. To achieve this, the findings from the human study were incorporated into the proposed algorithm along with the authors' previously introduced EMG-based navigation and intention detection algorithm [23]; the terminology of that earlier algorithm is therefore used throughout this paper. The simulation was implemented for healthy individuals capable of activating the right and left sternocleidomastoid muscles and biceps.
3.1. Overview of the Simulation
A working 3D model of the wheelchair was imported into the CoppeliaSim simulation environment after converting it to a Unified Robotic Description Format (URDF) model and manually defining the joint axes to match the original wheelchair motions. Other components, such as the Kinova Jaco2 6-degree-of-freedom manipulator (the WMRA), tables, and cups, were imported from the CoppeliaSim built-in models. The dimensions of these components were adjusted to match the items used in the human study. The "rigid body dynamic properties" of the imported model were set and balanced according to their real-life values. To generate the simulation with dynamic properties, the "Bullet 2.7" engine was used, and each component was specified as either static or dynamic.
The simulated setup also includes a camera mounted on the gripper, a proximity sensor (sonar), and a wheelchair tray. The WMRA is mounted on the arm stand extension of the wheelchair, with the wheelchair tray attached to the same extension. The wheelchair tray has the same 53 cm dimension as the tray used in the human study (see Figure 4).
[figure(s) omitted; refer to PDF]
After creating the working dynamic model of the wheelchair in the CoppeliaSim environment, CoppeliaSim and MATLAB were interfaced through a “Remote Application Programming Interface (Remote API).” This interface allowed the high-level controller to send and receive control commands to and from the CoppeliaSim simulation (see Figure 5).
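As a sketch of the same round trip, the snippet below uses the Python bindings of CoppeliaSim's legacy remote API (the paper used the MATLAB bindings of the same API). The scene object names 'LeftMotor' and 'WheelchairBase' and the port 19999 are assumptions for illustration.

```python
# Minimal round trip through CoppeliaSim's legacy remote API, mirroring the
# MATLAB interface described above. Object names and port are assumptions.
import sim  # sim.py, shipped with CoppeliaSim's legacy remote API bindings

client_id = sim.simxStart('127.0.0.1', 19999, True, True, 5000, 5)
if client_id == -1:
    raise RuntimeError('Could not connect to CoppeliaSim')

# Get a handle to a scene object and send it a velocity command.
ret, motor = sim.simxGetObjectHandle(client_id, 'LeftMotor',
                                     sim.simx_opmode_blocking)
sim.simxSetJointTargetVelocity(client_id, motor, 0.5, sim.simx_opmode_oneshot)

# Read back the wheelchair pose as feedback for the high-level controller.
ret, base = sim.simxGetObjectHandle(client_id, 'WheelchairBase',
                                    sim.simx_opmode_blocking)
ret, position = sim.simxGetObjectPosition(client_id, base, -1,
                                          sim.simx_opmode_blocking)
print('Wheelchair position:', position)

sim.simxFinish(client_id)
```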
[figure(s) omitted; refer to PDF]
3.2. Navigating toward Objects
The wheelchair built inside the CoppeliaSim simulation environment has two active wheels and four passive wheels, just like the original electric wheelchair. The two active wheels operate under the assumption that they are connected through a differential drive.
As described by Abayasiri et al. [23], users can navigate to the desired location within the simulation environment while avoiding obstacles by using specific muscle activations. Forward and backward movements of the wheelchair are controlled by flexing the right and left arms to activate the biceps muscles. Turning the wheelchair to the right and left is achieved by rotating the head to activate the left and right sternocleidomastoid muscles. Additionally, users receive visual feedback of the wheelchair movements by observing different viewports of the simulation environment. A sample navigation conducted in the simulation environment is shown in Figure 6.
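A minimal sketch of this muscle-to-motion mapping is given below, assuming simple threshold detection on four normalized EMG channels. The pairing of right biceps with forward motion (and left with backward) follows the sentence order above, and the threshold and speed values are illustrative, not the calibrated values of [23].

```python
# Illustrative mapping from four EMG channel activations to differential-
# drive wheel speeds. Thresholds and speeds are assumed values.
SPEED = 0.5      # base wheel speed (rad/s), assumed
THRESHOLD = 0.3  # normalized activation threshold, assumed

def emg_to_wheel_speeds(right_biceps, left_biceps, left_scm, right_scm):
    """Return (left_wheel, right_wheel) target velocities."""
    active = lambda level: level > THRESHOLD
    if active(right_biceps) and active(left_biceps):  # ambiguous -> stop
        return 0.0, 0.0
    if active(right_biceps):                          # right arm flexion -> forward
        return SPEED, SPEED
    if active(left_biceps):                           # left arm flexion -> backward
        return -SPEED, -SPEED
    if active(left_scm):                              # head rotated right -> turn right
        return SPEED, -SPEED
    if active(right_scm):                             # head rotated left -> turn left
        return -SPEED, SPEED
    return 0.0, 0.0                                   # no activation -> hold
```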
[figure(s) omitted; refer to PDF]
3.3. Picking the Objects
After successfully navigating to the object, the user must perform a task switcher to stop the wheelchair from moving and shift to the next task. Once the intention to use the manipulator is detected, the object-picking process begins [23].
3.3.1. Object Searching and Detection
For object manipulation, an eye-in-hand camera configuration is employed: the camera is mounted on the gripper to provide vision feedback (see Figure 4). To search for the object, the robot gripper moves from point Pint (X, Y, Z) to point Pfinal (Xf, Yf, Zf) in a straight line with respect to the base frame of the WMRA (see Figure 7). Since the line from Pint to Pfinal is parallel to the X-axis of the manipulator's base frame, Pfinal can be calculated using Equation (1). The gripper pose at Pint (X, Y, Z) is obtained from the homogeneous transformation of the WMRA with respect to its base. The MATLAB Robotics Toolbox was utilized for the calculations related to the manipulator movements and poses [25].
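Equation (1) is not reproduced in this version of the text. Since the search line is parallel to the base-frame X-axis, a plausible reconstruction is the following, where d is an assumed symbol for the length of the search stroke:

```latex
% Plausible reconstruction of Equation (1): motion parallel to the X axis
% changes only the X coordinate; d (assumed symbol) is the search-stroke length.
P_{\mathrm{final}} = P_{\mathrm{int}} + \begin{bmatrix} d & 0 & 0 \end{bmatrix}^{\top}
\qquad \Longrightarrow \qquad
(X_f,\; Y_f,\; Z_f) = (X + d,\; Y,\; Z)
```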
[figure(s) omitted; refer to PDF]
The gripper camera is utilized to detect the red marker placed on the object, as described in the previous human study. Once the binary large object (BLOB) representing the red marker is detected, the movement of the gripper stops.
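The search-and-stop behavior can be sketched as a simple loop: step the gripper along the Pint–Pfinal line, grab a camera frame, and halt as soon as a red BLOB appears. In the sketch below, `get_gripper_frame()`, `stop_gripper()`, and `step_gripper_x()` are hypothetical helpers standing in for the CoppeliaSim camera and manipulator calls, and the color thresholds are assumptions.

```python
# Sketch of the search-and-stop loop: advance along the P_int -> P_final line
# and halt on red-BLOB detection. The three gripper helpers are hypothetical.
import cv2

def red_blob_present(frame_bgr, min_area=50):
    """True if a sufficiently large red BLOB is visible (thresholds assumed)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    return cv2.countNonZero(mask) > min_area

def search_for_object(step=0.01, max_steps=100):
    """Advance along the search line until the marker is seen or the line ends."""
    for _ in range(max_steps):
        frame = get_gripper_frame()   # hypothetical camera read
        if red_blob_present(frame):
            stop_gripper()            # hypothetical halt command
            return True
        step_gripper_x(step)          # hypothetical 1 cm step along X
    return False                      # reached P_final without a detection
```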
3.3.2. Reaching toward the Object
After briefly stopping upon BLOB detection, the gripper begins to move toward the object along a straight path generated using the MATLAB Robotics Toolbox. The expected final pose of the gripper after it starts moving toward the object is calculated using Equation (2).
Here, Dexp represents the expected distance from the stopping point to the object. Dexp is set to 20 cm to prevent the manipulator from approaching a singularity near its maximum reach. When the gripper is in close proximity to the object, its movement is halted once the gripper's sonar reads a distance of 3 cm.
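Equation (2) is likewise not reproduced here. A plausible reconstruction consistent with the description is a target displaced by Dexp along the gripper's approach axis, where the unit vector symbol is an assumption:

```latex
% Plausible reconstruction of Equation (2): the expected grasp target lies
% D_exp = 20 cm ahead of the stopping point along the approach axis \hat{a}
% (\hat{a} is an assumed symbol); motion is halted early once the sonar
% reads 3 cm.
P_{\mathrm{obj}} = P_{\mathrm{stop}} + D_{\mathrm{exp}}\,\hat{a},
\qquad D_{\mathrm{exp}} = 0.20\ \mathrm{m}
```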
3.3.3. Grasping the Object
Once the gripper movement is stopped after the sonar reading, gripper rotation is enabled. This feature is added to the system to compensate for misalignments between the orientation of the gripper and the required pose for grasping. To achieve this, activation of the left biceps brachii enables clockwise rotation of the gripper, while activation of the right biceps brachii enables counterclockwise rotation. Users must intend to flex their respective elbows to activate the relevant biceps brachii.
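A compact sketch of this rotation adjustment is given below, assuming the same threshold detection as before; the angular step size and sign convention are illustrative assumptions.

```python
# Sketch of the EMG-driven gripper-orientation adjustment: left biceps
# rotates clockwise, right biceps counterclockwise. Step size, threshold,
# and sign convention are assumed values.
STEP_DEG = 2.0   # rotation increment per detection cycle, assumed
THRESHOLD = 0.3  # normalized activation threshold, assumed

def adjust_gripper_angle(angle_deg, left_biceps, right_biceps):
    """Return the updated gripper roll angle in degrees."""
    if left_biceps > THRESHOLD and right_biceps <= THRESHOLD:
        return angle_deg - STEP_DEG   # clockwise (sign convention assumed)
    if right_biceps > THRESHOLD and left_biceps <= THRESHOLD:
        return angle_deg + STEP_DEG   # counterclockwise
    return angle_deg                  # no (or ambiguous) activation: hold
```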
When the required gripper orientation is achieved, users must perform a task switcher to finalize the gripper adjustment and enable the grasping motion.
3.4. Placing the Object
Object placement is a complex task, as it requires considering both the placement locations and potential collisions between objects. To address spatial ambiguities and avoid collisions, a sequential placement method is employed in this algorithm. The study considers three object placement scenarios: two situation-based placements (dining and working environments) and placing objects in pairs.
In the dining environment, objects are placed in a predefined sequence beginning with the plate; the working environment follows its own predefined sequence.
For object placement in pairs, the first object placed by the manipulator is positioned at a predefined location on the tray; the second object is then placed at the mean pair distance obtained from Study 3 (see Figure 9).
[figure(s) omitted; refer to PDF]
The first object is placed without considering object attributes or user preference. Once an object is placed, the user activates the right biceps until the object to be placed next is shown in the status bar. Objects are stored in a fixed sequence of seven, beginning with the book and covering the pen, tablet, mobile phone, cup, spoon, and plate.
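This sequential selection can be sketched as a simple cursor over the seven-object list; only the set of seven objects and the first element are given in this version of the text, so the order after the book in the list below is an assumption.

```python
# Sketch of the sequential object selection: each right-biceps activation
# advances a cursor through the stored sequence and reports the object for
# the status bar. The order after 'book' is an assumption.
OBJECT_SEQUENCE = ["book", "pen", "tablet", "phone", "cup", "spoon", "plate"]

class ObjectSelector:
    def __init__(self):
        self.index = 0  # start at the first stored object ('book')

    def on_right_biceps_activation(self):
        """Advance to the next object and return it for the status bar."""
        self.index = (self.index + 1) % len(OBJECT_SEQUENCE)
        return OBJECT_SEQUENCE[self.index]

    def current(self):
        return OBJECT_SEQUENCE[self.index]
```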
[figure(s) omitted; refer to PDF]
3.4.1. Intermediate Position
Here, the intermediate position refers to the location the robot manipulator reaches before placing the object (see Figure 11). This position is directly above the center of the wheelchair tray, and the manipulator’s pose at this position is predefined. This measure is used to avoid collisions between objects during the placement process. Consequently, the object-picking and placing task includes an intermediate step where the manipulator moves to this position before placing the object.
[figure(s) omitted; refer to PDF]
3.4.2. Retraction of Robot Arm
Retraction of the manipulator to its neutral position completes the object placement task for one object. This retraction also initiates the wheelchair’s movement to the location of the next object. If all the objects for the intended task have been placed on the wheelchair tray, the system will return to its default initial resting stage.
Figure 12 illustrates all the steps involved in the proposed object manipulation algorithm and how they are interconnected. As shown in the figure, a few additional steps were added to complete the algorithm. Although the system takes the user's handedness into account, this is not depicted in Figure 12 because the experimenter presets the system according to the user's handedness. A reset command (Resetter) is included as a safety mechanism: if the user performs the Resetter at any point during the tasks, the system pauses. The Resetter is performed by executing a task switcher and a right rotation of the head simultaneously. If it is activated in the middle of a manipulator task, the manipulator stops and holds its current configuration.
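The Resetter can be sketched as a check for the simultaneous occurrence of a task switcher and a right head rotation (which activates the left sternocleidomastoid). How the task switcher itself is detected follows [23] and is abstracted behind a boolean flag here; the threshold is an assumed value.

```python
# Sketch of the Resetter safety check: the system pauses when a task switcher
# and a right head rotation (left sternocleidomastoid activation) occur
# together. Task-switcher detection follows [23] and is abstracted here.
THRESHOLD = 0.3  # normalized activation threshold, assumed

def resetter_triggered(task_switcher_detected, left_scm_level):
    """True when the pause gesture is performed."""
    return task_switcher_detected and left_scm_level > THRESHOLD

def control_step(state, task_switcher_detected, left_scm_level):
    """Freeze the system in place when the Resetter fires."""
    if resetter_triggered(task_switcher_detected, left_scm_level):
        state["paused"] = True  # manipulator holds its current configuration
    return state
```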
[figure(s) omitted; refer to PDF]
4. Results and Discussion
4.1. Human Study
Figure 13 depicts the identified clusters based on the data points recorded for object arrangements in Study 1 and Study 2, along with the centroids of the respective clusters. These clusters reveal common orientations for both left-handed and right-handed users in the studying/working environment (Figure 13(a)).
[figure(s) omitted; refer to PDF]
Both left-handed and right-handed users placed the pen on the side of their dominant hand. There was a noticeable tendency to place the book in the middle area of the tray. Additionally, most participants placed the cup on the left side of the tray. Right-handed users, however, positioned the cup closer to themselves, whereas left-handed users placed it toward the far left corner. This difference likely stems from left-handed users’ concerns about accidentally hitting the cup with their writing hand during studying or working. Since right-handed users are less affected by this concern, they placed the cup closer to themselves.
The placement of the mobile phone was not influenced by handedness; both left-handed and right-handed users placed the phone at a moderate distance from the bottom edge of the tray toward the right side. The placement distance of the tablet from the bottom edge of the tray was nearly identical for both types of users, though handedness did affect the tablet’s position. Participants tended to place the tablet further from themselves compared to other objects, likely due to the higher usage of the other four objects in a studying/working environment.
In Study 2, the arrangements of left-handed and right-handed users were almost mirror images of each other, with the exception of cup placement (Figure 13(b)). This mirror-image effect was due to the handedness of the users. Both groups placed the plate in the middle, the spoon adjacent to the plate on the side of their dominant hand, and the mobile phone on the side of their nondominant hand. The cup was placed on the left side in both cases. Users tended to keep the phone on the nondominant side so it could be used while eating, since the dominant hand was occupied with the spoon. Additionally, users positioned the cup as far as possible from the phone to minimize the risk of liquid spills damaging it.
These studies revealed that participants tend to arrange objects in a unique orientation based on the attributes of the objects, their past experiences, and their handedness. The handedness of the user significantly influenced object placement, demonstrating a clear pattern in how different users manage their workspace and dining area.
Figure 9 illustrates the variations in the mean distances between object pairs in the aforementioned configurations. The analysis reveals that five pairs—book–cup, tablet–cup, plate–cup, plate–phone, and phone–cup—exhibit almost identical average distances, approximately 20 cm, which are the highest among all the average values. This consistent separation indicates that participants prioritized the safety of electronic devices, consciously placing them far from potential liquid spills or food contamination from cups or plates.
Conversely, the phone–tablet pair shows a significantly lower average distance of 15.24 cm, indicating that participants perceived these two objects as having similar attributes and thus placed them closer together. This proximity suggests a user preference to keep objects with similar functions and usage patterns nearby.
Other pairs displayed comparatively moderate mean distances, indicating that these objects do not possess strongly contrasting or allied attributes, leading participants to place them at moderate distances from each other. This behavior suggests that the participants’ past experiences and perceptions of the objects’ attributes significantly influenced their placement decisions.
In summary, this study confirms that participants relied on their knowledge of object attributes when arranging items, demonstrating a clear pattern of placing electronic devices away from potential hazards while grouping similar items closer together.
4.2. Implementation
The implementation in the simulation environment was tested with four healthy participants who had initially participated in the human study. Three of these participants were right-handed, while the other was left-handed. Due to the limited object placement area and the considerable differences between the simulation environment and the actual scenario, only the orientations after object arrangement (object arrangement pattern) were compared for the dining and working environments. For the task of placing objects in pairs, the distance between the centers of the two objects was measured using Kinovea software [26]. Screen captures of the orthographic view of the wheelchair tray were taken on each occasion. All four users were able to perform the tasks after familiarizing themselves with the proposed intelligent object manipulation algorithm.
Figures 14, 15, 16, and 17 depict the results obtained from the simulation environment for situation-based object placements. All of the arrangements preserved the orientations indicated by the object clusters. Moreover, even within the same scenario, the orientations of the individual objects were not identical across trials. The orientation for picking/grasping an object is adjusted manually by the user according to their preference after the manipulator is stopped by the sonar reading; hence, the point at which the gripper grasps the object can vary with the user's inclination at that particular moment. Because of this variability in the grasping stage, the placed objects ended up with different orientations in different trials. However, this does not affect the final pose of the manipulator when placing the object, as that pose is independent of the grasping pose. Therefore, the placement location does not change; it changes only if the handedness of the user changes.
[figure(s) omitted; refer to PDF]
The object arrangements for one right-handed participant (age 31 years) and one left-handed participant (age 29 years) are depicted. The other participants arranged the objects in the same orientations, but the individual locations of the objects differed slightly (maximum deviation from the depicted result: 1.50 cm for the right-handed users and 1.12 cm for the left-handed user). Figure 18 depicts the object placements done in pairs in the simulation environment by the aforementioned right-handed user. These pairs were placed under the constraints given in Figure 9. In contrast to the situation-based placements, where the orientation of the arrangement is paramount, the individual object locations are critical for placement in pairs; in particular, the location of the second object defines the distance between the two objects. The maximum error, recorded for the plate–cup pair, was 2 cm. This is acceptable considering that placement in the simulation is not as smooth as in real life, and simulation software lags accumulate into these errors.
[figure(s) omitted; refer to PDF]
4.3. Limitations
In all of the above-mentioned experiments, flat objects such as books, phones, and tablets could not be picked up by the manipulator in the simulation if they lay flat on the table, as the fingers could not go under the objects to grasp them. To address this, the objects were supported with a wedge to keep them vertical or nearly vertical, allowing the manipulator to grab them easily. However, this requires that the items be positioned that way by someone else; in real-life applications, several types of such holders are commercially available. Additionally, it is assumed that there is only one object with a red marker at the grasping location and that the red marker faces the Pint–Pfinal line traversed by the gripper camera. The current algorithm is designed to match the capabilities of the Jazzy Air powered electric wheelchair (Power Mobility), which can adjust its seat height; the wheelchair tray is therefore adjusted to a height where it aligns with the table on which the object to be picked is located, and this step has to be completed before the onset of the object manipulation algorithm. Another limitation of the proposed algorithm is that the user has to follow a sequential order when picking and placing objects; placing objects outside of the sequence is not allowed.
An example of the detailed process flow for object manipulation in working mode is depicted in Figure 19. It is apparent that the proposed system is not fully automatic, as it requires user input through EMG signals to carry out the entire process. Hence, the system can be considered a human-in-the-loop (HITL) system where user feedback determines the key aspects of the system’s operation. Additionally, the current algorithm is limited to only the seven objects mentioned in the sections above.
[figure(s) omitted; refer to PDF]
5. Conclusion
This paper introduces an intelligent object manipulation algorithm for wheelchairs equipped with WMRAs. A human study was conducted to collect data on user preferences and handedness, revealing that user preferences are influenced by the user’s handedness and the attributes of the objects. The study used the K-means clustering algorithm to identify clusters in the obtained data, highlighting that wheelchair users tend to group objects with similar attributes together while keeping objects with contrasting attributes apart.
The proposed object manipulation algorithm was implemented within the CoppeliaSim simulation environment, incorporating both an EMG-based controller and insights from the human study. This combination allows the system to leverage user EMG signals and past experiences to perform tasks on the wheelchair, providing users with more flexible control inputs and reducing misalignments during object grasping. Implementation was tested with four healthy participants across various scenarios, including dining, working, and placing objects in pairs. Results showed that object placements in dining and working scenarios aligned with the orientations obtained from the human study. For the scenario where objects were placed in pairs, a maximum error of 2 cm was reported for the object pair of plate and phone.
While the proposed algorithm was developed based on data from multiple participants, further improvements could be made to provide personalized intelligent object manipulation by conducting individualized human studies. Additionally, the algorithm can be used by anyone who retains control over head rotation and the biceps muscles, thereby enhancing accessibility to a wider population without the need for additional modalities.
Acknowledgments
The authors thank all the participants for their contribution toward the success of this research. The authors also acknowledge the support of Eng. Sachi Edirisinghe. This work was supported by the Senate Research Council (SRC), University of Moratuwa, Sri Lanka (grant no. SRC/LT/2020/12) and the National Research Council (NRC) Sri Lanka (grant no. NRC 17-069).
[1] D. Cattaneo, E. Gervasoni, E. Pupillo, E. Bianchi, I. Aprile, I. Imbimbo, R. Russo, A. Cruciani, J. Jonsdottir, M. Agostini, E. Beghi, "Mobility disorders in stroke, Parkinson disease, and multiple sclerosis: a multicenter cross-sectional study," American Journal of Physical Medicine & Rehabilitation, vol. 99, no. 1, pp. 41-47, 2020, DOI: 10.1097/PHM.0000000000001272.
[2] E. van der Kruk, A. K. Silverman, P. Reilly, A. M. Bull, "Compensation due to age-related decline in sit-to-stand and sit-to-walk," Journal of Biomechanics, vol. 122, 2021, DOI: 10.1016/j.jbiomech.2021.110411.
[3] W. Lovegreen, D. P. Murphy, P. M. Stevens, Y. I. Seo, J. B. Webster, "Lower limb amputation and gait," Braddom’s Physical Medicine and Rehabilitation, pp. 174-208, 2021.
[4] D. Cattaneo, I. Lamers, R. Bertoni, P. Feys, J. Jonsdottir, "Participation restriction in people with multiple sclerosis: prevalence and correlations with cognitive, walking, balance, and upper limb impairments," Archives of Physical Medicine and Rehabilitation, vol. 98, no. 7, pp. 1308-1315, 2017, DOI: 10.1016/j.apmr.2017.02.015.
[5] S. K. Sahoo, B. B. Choudhury, "Wheelchair accessibility: bridging the gap to equality and inclusion," Decision Making Advances, vol. 1, no. 1, pp. 63-85, 2023, DOI: 10.31181/dma120623p.
[6] B. Zhang, H. Eberle, C. Holloway, T. Carlson, "Design requirements and challenges for intelligent power wheelchair use in crowds: learning from expert wheelchair users," Age, vol. 46, no. 12, 2021.
[7] E. McSweeney, R. J. Gowran, "Wheelchair service provision education and training in low and lower middle income countries: a scoping review," Disability and Rehabilitation: Assistive Technology, vol. 14, no. 1, pp. 33-45, 2019, DOI: 10.1080/17483107.2017.1392621.
[8] B. Zhang, G. Barbareschi, R. R. Herrera, T. Carlson, C. Holloway, "Understanding interactions for smart wheelchair navigation in crowds."
[9] M. E. Hudson, A. Zambone, J. Brickhouse, "Teaching early numeracy skills using single switch voice-output devices to students with severe multiple disabilities," Journal of Developmental and Physical Disabilities, vol. 28, no. 1, pp. 153-175, 2016, DOI: 10.1007/s10882-015-9451-3.
[10] A. Betlej, "Designing robots for elderly from the perspective of potential end-users: a sociological approach," International Journal of Environmental Research and Public Health, vol. 19, no. 6, 2022, DOI: 10.3390/ijerph19063630.
[11] W. Tao, J. Xu, T. Liu, "Electric-powered wheelchair with stair-climbing ability," International Journal of Advanced Robotic Systems, vol. 14, no. 4, 2017, DOI: 10.1177/1729881417721436.
[12] M. S. H. Sunny, M. I. I. Zarif, I. Rulik, J. Sanjuan, M. H. Rahman, S. I. Ahamed, I. Wang, K. Schultz, B. Brahmi, "Eye-gaze control of a wheelchair-mounted 6DoF assistive robot for activities of daily living," Journal of NeuroEngineering and Rehabilitation, vol. 18, no. 1, 2021, DOI: 10.1186/s12984-021-00969-2.
[13] I. Rulik, M. S. H. Sunny, J. D. S. De Caro, M. I. I. Zarif, B. Brahmi, S. I. Ahamed, K. Schultz, I. Wang, T. Leheng, J. P. Longxiang, M. H. Rahman, "Control of a wheelchair-mounted 6DoF assistive robot with chin and finger joysticks," Frontiers in Robotics and AI, vol. 9, 2022, DOI: 10.3389/frobt.2022.885610.
[14] M. Zhong, Y. Zhang, X. Yang, Y. Yao, J. Guo, Y. Wang, Y. Liu, "Assistive grasping based on laser-point detection with application to wheelchair-mounted robotic arms," Sensors, vol. 19, no. 2, 2019, DOI: 10.3390/s19020303.
[15] P. Karuppiah, H. Metalia, K. George, "Automation of a wheelchair mounted robotic arm using computer vision interface."
[16] C.-S. Chung, B. Styler, E. Wang, D. Ding, "Robotic assistance in action: examining control methods for long-term owners of wheelchair-mounted robotic arms."
[17] A. Palla, G. Meoni, L. Fanucci, A. Frigerio, "Position based visual servoing control of a wheelchair mounted robotic arm using parallel tracking and mapping of task objects," ICST Transactions on Ambient Systems, vol. 4, no. 13, 2017, DOI: 10.4108/eai.17-5-2017.152545.
[18] M. Ababneh, H. Sha’ban, D. AlShalabe, D. Khader, H. Mahameed, M. AlQudimat, "Gesture controlled mobile robotic arm for elderly and wheelchair people assistance using Kinect sensor," pp. 636-641.
[19] N. Shafii, P. Farias, I. Sousa, H. Sobreira, L. P. Reis, A. P. Moreira, "Autonomous interactive object manipulation and navigation capabilities for an intelligent wheelchair," pp. 473-485.
[20] Q. Huang, Y. Chen, Z. Zhang, S. He, R. Zhang, J. Liu, Y. Zhang, M. Shao, Y. Li, "An EOG-based wheelchair robotic arm system for assisting patients with severe spinal cord injuries," Journal of Neural Engineering, vol. 16, no. 2, 2019, DOI: 10.1088/1741-2552/aafc88.
[21] Á. A. Pálsdóttir, S. Dosen, M. Mohammadi, L. A. N. Struijk, "Remote tongue based control of a wheelchair mounted assistive robotic arm – a proof of concept study," pp. 1300-1304.
[22] B. K. Styler, W. Deng, R. Simmons, H. Admoni, R. Cooper, D. Ding, "Exploring control authority preferences in robotic arm assistance for power wheelchair users," Actuators, vol. 13, no. 3, 2024, DOI: 10.3390/act13030104.
[23] R. A. M. Abayasiri, A. G. B. P. Jayasekara, R. A. R. C. Gopura, K. Kiguchi, "EMG-based controller for a wheelchair with robotic manipulator," pp. 125-130.
[24] S. Rooban, S. D. Suraj, S. B. Vali, N. Dhanush, "CoppeliaSim: adaptable modular robot and its different locomotions simulation framework."
[25] P. Corke, Robotics and Control: Fundamental Algorithms in MATLAB®, vol. 141, 2021.
[26] A. Puig-Diví, C. Escalona-Marfil, J. M. Padullés-Riu, A. Busquets, X. Padullés-Chando, D. Marcos-Ruiz, C. Balsalobre-Fernández, "Validity and reliability of the Kinovea program in obtaining angles and distances using coordinates in 4 perspectives," PLOS ONE, vol. 14, no. 6, 2019, DOI: 10.1371/journal.pone.0216448.
Copyright © 2024 R. A. M. Abayasiri et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. https://creativecommons.org/licenses/by/4.0/
Abstract
Over the years, intelligent wheelchairs have been developed to fulfill the needs of wheelchair users by gathering information from the environment to make autonomous decisions. This paper proposes an intelligent object manipulation algorithm that leverages users’ intentions, past experiences, and preferences. To gather the necessary information for object placement, a human study involving 20 healthy participants was conducted. The study included two situation-based object placement scenarios and object placement when objects were given in pairs. Building on the authors’ previously developed electromyography-based wheelchair navigation algorithm, a semiautonomous object manipulation algorithm was developed. This new algorithm was tested in the CoppeliaSim simulation environment by replicating the tasks performed in the human study. The object arrangements in the simulation were then compared to the results from the human study. The findings indicate that the implemented object manipulation algorithm successfully replicated the outcomes observed in the human study. The application of this controller is expected to significantly enhance accessibility for individuals with impaired motor functionalities in addition to their ambulatory impairments. This development represents a significant step forward in creating user-friendly and adaptive assistive technologies for wheelchair users.
1 Intelligent Service Robotics Group, Department of Electrical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka
2 Bionics Laboratory, Department of Mechanical Engineering, University of Moratuwa, Moratuwa 10400, Sri Lanka
3 Department of Mechanical Engineering, Faculty of Engineering, Kyushu University, Fukuoka 819-0395, Japan