1. Introduction
Vision is undoubtedly one of the most important senses for humans, as we receive roughly 83% of the information about our environment through it. Unfortunately, according to a study carried out by the World Health Organization (WHO) in 2011 [1], 285 million people were visually impaired, of whom 39 million were completely blind and 246 million had low vision. In Pakistan alone, about 2 million people suffer from blindness and visual impairment [2]. Vision loss, or sightlessness, is the lack of visual perception due to physiological or neurological factors, with a visual acuity of only 1-2/10 with both eyes open and a visual field of 30 degrees or less [3]. These people face many challenges, such as performing simple daily routine activities, and hence remain dependent on others [4–6]. This is a loss not only for the individual but also for the country, as they are unable to play their role in the growth of the national economy [7]. For a long time, using a simple "white cane" (introduced by James Biggs of Bristol in 1921 [8]) was the only way for a visually impaired person to move around and signal their condition. The US Congress proclaimed October 15 "White Cane Safety Day" to show love, respect, and sympathy toward visually impaired people.
In the 1960s, research on assistive technologies associated with data transmission [9, 10], navigation, and orientation aids, intended to give accurate help to visually handicapped persons, was initiated [11–14]. For example, one early approach to detecting obstacles in the path of an impaired person was the vector field histogram. The technique solved, to some extent, indoor navigational tasks; however, it was less helpful for outdoor activities [14]. The developed system was therefore considered overcomplicated and less accurate, and it needed a mobile terminal to send and receive information [15]. Later, a well-known microcontroller-based system, the NavBelt and GuideCane, used a series of devices (placed on a belt) to detect obstacles [16]. Afterward, Yuan presented a device for range sensing and environment discovery, intended to assist a visually impaired person; the device performs active triangulation and tracks environmental features using Kalman filtering [17, 18]. Another proposed solution uses a camera and a transmitter to send real-time video to a server, where another person monitors it and provides assistance. This system also proved inaccurate, especially in outdoor applications, and could not help the person much [19]. Bolgiano and Meeks [20] proposed a laser-cane visual aid; later systems combined GPS, GSM, and ultrasonic sensors to detect hurdles and to obtain information about the user's location [21].
Nowadays, different kinds of canes have been introduced, e.g., the smart cane [22] and the laser cane [23]; however, these tools have many constraints, such as the unnecessarily long length of the cane and limitations in recognizing obstacles, which make it difficult for the user to access public places. More recently, other solutions have been proposed, such as GPS devices for landmark identification (near-infrared (IR) light or radio frequencies), ultrasonic obstacle detectors (sonar, UltraCane, Miniguide, Palm-sonar, Ultra-Body-Guard, and the iSONIC cane), and optical devices (the laser long cane) [24–29]. However, these devices have proved less useful in crowded and outdoor environments, mainly because of multiple reflections. Alternatively, several techniques have been developed to improve the quality of life of visually impaired people by introducing smart devices with built-in signal processing and sensing technology. These are referred to as electronic travel aid (ETA) devices, which help the blind maneuver freely in an environment that changes dynamically in real time. According to the literature, ETAs are broadly classified into two major types: sonar-input systems (laser, infrared, or ultrasonic signals) [30, 31] and camera-input systems (consisting principally of a mini-CCD camera and a fuzzy system) [32–35]. Systems such as the Bat "K" Sonar, smart cane, Smart Vision, and GuideCane normally use ultrasonic or laser sensors to detect obstacles in front of the person by transmitting a wave and receiving its reflection [36–39]; in response to a detected obstacle, they produce either an audio or a vibration alert to warn the person. Systems like the vOICe [40], SoundView [32], SVETA [41], and CASBLIP [42] use a single camera or stereo cameras mounted on a wearable device to capture pictures. These pictures are resized and processed to generate corresponding speech, audio, musical sounds, or vibrations; in such systems, the frequency of the warning sound is related to the orientation of the pixels. Some advanced systems integrate a Global Positioning System (GPS) receiver with the main system, which is helpful for determining the user's current location and nearby landmarks. Some solutions are already available on the market, such as UltraCane [43], iSONIC [44], and Teletact [45, 46]. These products assist visually impaired users by aggregating the data collected through sensors and then conveying recommendations through vibration or sound messages. ETAs warn the user through audio and/or tactile signals once an obstacle is estimated to be within range and recommend that the user avoid it.
Blind aid and security systems have already been addressed by many other solutions. However, none of them completely meets the needs of visually impaired persons. In summary, the main contributions of the proposed work are as follows:
(i) We provide an accurate and usable proof-of-concept electronic stick to make the user’s life easier
(ii) Our approach uses ultrasonic sensing technology to detect hurdles and generates an audio alarm/vibration to get the user's attention
(iii) Our prototype uses an LDR (Light-Dependent Resistor, or photoresistor) to detect darkness around the user, generates a darkness alert, and lights up the stick in parallel
(iv) Additionally, our system detects water and smoke. It uses GSM and GPS to determine the user's location, and in case of an emergency, the user can press an emergency button to automatically generate and send an SMS to the designated person
The rest of this paper is structured as follows. Section 2 presents the concept of the indoor guide system for partial vision loss, with a detailed description of the design and development of the electronic apparatus used in the proposed electronic guidance cane, the outdoor object-detection and direction-indication system, and the experimental results for a lab-scale prototype. Section 3 concludes the paper.
2. Materials and Methods
Presently, visually impaired people normally use a white cane as a tool to guide them when they move or walk for their daily routine work. Figure 1 shows the smart electric cane prototype developed here, which serves as an electric guidance cane for visually impaired persons and is more efficient and helpful than the conventional cane. The prototype illustrates the different parts of the proposed system: ultrasonic sensors are mounted at the top and at the bottom right and left for obstacle avoidance. A vibrator is fixed at the top of the stick; it vibrates when an obstacle is encountered, alerting the visually impaired person and allowing them to change their path. At the bottom of the cane, an IR sensor is used for pit and staircase detection, so the user can be guided toward the direction with fewer obstacles. To avoid stepping into water, a water sensor is attached near the bottom of the cane, and for fire protection, a flame sensor is also placed near the bottom. A light sensor is useful at night: it alerts people in the surrounding area that a visually impaired person is walking, so that they leave space and the person can walk easily.
[figure omitted; refer to PDF]
For further discussion and simplicity, Figure 2 illustrates the overall working mechanism and the features included in the proposed electric guidance cane concept. As shown in the figure, the Arduino microcontroller is the heart of the proposed system, as all the individual units are controlled by and interfaced with it. The ultrasonic sensor and the GPS unit act as inputs to the microcontroller, providing obstacle data and user location data, respectively, while the user walks with this embedded system. The LCD, GSM unit, and buzzer unit are essentially the outputs of the microcontroller. The buzzer unit gives voice alerts when the user encounters an obstacle within a specified distance. In case of an emergency or any failure, the GSM unit sends a message to the user's loved ones. The IR sensor and the LDR serve as obstacle-avoidance sensors, acting as a heat-sensitive sensor and a light-dependent resistance, respectively. The LDR and IR sensor first sense the intensity value of an object in front of the user and send it to the microcontroller; the remaining sensors, including the water sensor, work in the same way. After receiving the data, the microcontroller converts it into discrete values in the range 0–1023 and checks whether the received value is above a threshold level (a limit set for the visually impaired person from the range of discrete values, 0–1023). If the value is above the threshold, it is considered that there is no object in front of the electric guidance cane and the buzzer alarm remains off; if the received value is below the threshold, the microcontroller considers it an object or hurdle in front of the guidance cane. During outdoor activities, if the IR and LDR values are high and an object is detected in front of the visually impaired person, the vibrator is turned on; if the ultrasonic sensor value is high and identifies an object, a beep is generated, a voice from the speaker tells the user to move right or left, and the light on the stick also shines. For practical deployment of the product, the structure of the proposed architecture is convenient and efficient enough to cover all aspects of use: it shows every module, from input acquisition until the service is delivered to the user, and it describes how the modules are interlinked and work together to achieve the required goal.
[figure omitted; refer to PDF]
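The threshold comparison described above can be summarized in a minimal Arduino-style sketch. This is an illustrative sketch only: the paper does not publish its firmware, so the pin assignments and the threshold value of 500 below are placeholders, not the authors' exact values.

```cpp
// Minimal sketch of the threshold logic: read an analog obstacle sensor,
// compare the 0-1023 reading with a threshold, and drive the buzzer/vibrator.
const int SENSOR_PIN   = A3;   // e.g., IR or LDR sensor output (assumed pin)
const int BUZZER_PIN   = 8;    // piezo buzzer (assumed pin)
const int VIBRATOR_PIN = 9;    // vibration motor driver (assumed pin)
const int THRESHOLD    = 500;  // illustrative limit from the 0-1023 range

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  pinMode(VIBRATOR_PIN, OUTPUT);
}

void loop() {
  int value = analogRead(SENSOR_PIN);   // discrete value in 0-1023
  if (value < THRESHOLD) {              // below threshold: obstacle assumed
    digitalWrite(BUZZER_PIN, HIGH);
    digitalWrite(VIBRATOR_PIN, HIGH);
  } else {                              // above threshold: path clear
    digitalWrite(BUZZER_PIN, LOW);
    digitalWrite(VIBRATOR_PIN, LOW);
  }
  delay(100);                           // simple polling interval
}
```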
Table 2 lists the working voltage, measurement range, required I/O pins, operating current, and dimensions of the modules used.
2.1.3. GPS Module
In the 1960s, the U.S. military began developing satellite-based navigation, which became the Global Positioning System (GPS). The tracking sensitivity of the GPS module used here is -165 dBm. The working principle of GPS is that the receiver connects to a constellation of 24 satellites, each orbiting the Earth roughly once every 12 hours, which provide very useful information about time, velocity, and position on Earth. The position of any object can be identified by measuring its distance from several satellites. The distance from a satellite is

d = c × Δt,

where c is the speed of light and Δt is the travel time of the transmitted signal. GPS needs three satellites for a 2D position and four satellites to accurately locate an object on Earth in three dimensions. When the GPS module is turned on, it first downloads the orbital information of all satellites and saves it in its memory. The receiver then calculates the distance between a satellite and itself by multiplying the velocity of the transmitted signal, which is the speed of light, by the measured travel time.
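In the prototype's firmware, the position itself comes from the GPS module as parsed latitude/longitude. The sketch below shows one common way to read it on an Arduino Mega; the TinyGPS++ library and the Serial1 wiring are our assumptions for illustration, since the paper does not state which library it uses.

```cpp
// Hedged sketch: read latitude/longitude from a GPS module wired to Serial1
// of an Arduino Mega, using the widely available TinyGPS++ library.
#include <TinyGPS++.h>

TinyGPSPlus gps;

void setup() {
  Serial.begin(9600);    // debug console
  Serial1.begin(9600);   // GPS module (assumed default baud rate)
}

void loop() {
  while (Serial1.available() > 0) {
    gps.encode(Serial1.read());          // feed NMEA characters to the parser
  }
  if (gps.location.isUpdated()) {
    Serial.print("Lat: ");  Serial.println(gps.location.lat(), 6);
    Serial.print("Lng: ");  Serial.println(gps.location.lng(), 6);
  }
}
```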
2.1.4. GSM Module
The cellular concept was developed in the 1970s at Bell Laboratories; GSM itself was later standardized for digital mobile communication. This device is of great importance in communication, transmitting and receiving data between mobiles across the world, and it has made communication easy since its introduction. It works at different frequency bands, and the band used depends on the application. GSM handles multiple accesses at a time, which is why today's communication systems are better than those of past years. It operates between 850 MHz–900 MHz and 1800 MHz–1900 MHz, and data rates from 64 kbps to 120 Mbps can be carried by the system. In older communication systems, overlapping of one signal with another was common; to avoid this major source of error, engineers introduced time division multiple access (TDMA). In this method, specific time slots are assigned to every user, which solves many of the overlapping problems, although the carrier frequency remains the same for all users. The GSM module is used because of its better spectrum efficiency, international roaming, support for new facilities, and real-time clock with alarm management; phone calls and the short message service (SMS) are secured using encryption, and its standardized security strategies make it among the most secure telecommunications currently accessible. This module can be used in several applications: GPRS-mode remote data logging, transaction terminals, weather stations, security applications, and supply chain management.
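In practice, a SIM800L-class GSM module is driven with standard AT commands. The sketch below is a minimal illustration of sending a text message; the Serial2 wiring, baud rate, and phone number are placeholders, not values taken from the paper.

```cpp
// Hedged sketch: send an SMS through a SIM800L module over Serial2 of an
// Arduino Mega using standard AT commands.
void sendAlertSms(const char* phoneNumber, const char* text) {
  Serial2.println("AT+CMGF=1");          // select SMS text mode
  delay(500);
  Serial2.print("AT+CMGS=\"");
  Serial2.print(phoneNumber);
  Serial2.println("\"");
  delay(500);
  Serial2.print(text);                   // message body
  Serial2.write(26);                     // Ctrl+Z terminates and sends the SMS
  delay(2000);
}

void setup() {
  Serial2.begin(9600);                   // SIM800L (assumed baud rate)
  sendAlertSms("+920000000000", "Emergency: user needs assistance.");  // placeholder number
}

void loop() {}
```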
2.1.5. Voltage Regulator
In this prototype, an LM7805 voltage regulator is used. Voltage regulators regulate the supply voltage to the voltage required by the circuit, as shown in Figure 4; they protect the circuit whenever the voltage or current exceeds the circuit's limits and provide a constant voltage at the output terminal. The LM7805 is a member of the 78xx family of linear voltage regulator ICs, where "xx" indicates the fixed output voltage level (05 corresponds to 5 V). Its rated input voltage ranges from 7 V to 35 V, and it delivers a regulated 5 V output.
[figure omitted; refer to PDF]
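As a quick illustration (ours, not a figure from the paper) of why the input voltage matters, the power a linear regulator such as the LM7805 must dissipate equals the voltage drop across it times the load current, assuming, for example, a 9 V battery input and a 100 mA load:

```latex
P_{\text{diss}} = (V_{\text{in}} - V_{\text{out}})\, I_{\text{load}}
\quad\Rightarrow\quad (9\,\text{V} - 5\,\text{V}) \times 0.1\,\text{A} = 0.4\,\text{W}
```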
In this scenario, a photocell (LDR) sensor, LEDs, an LCD screen, resistors (220 ohms), a micro-SD socket (SD card), ultrasonic sensors (HC-SR04), a speaker (SPKR 1), a large rotary potentiometer (R3), a round pushbutton (S1), a vibration motor (ROB 08-449), an Adafruit Ultimate GPS, a GSM module (SIM800L), a water level sensor, a flame sensor, and an Arduino Mega (2560 Rev3) were used. One leg of the LDR is attached to the microcontroller analog pin A3 and the other leg to a 5 V pin; a resistor likewise connects the sensor leg to the GND port of the microcontroller. The threshold value for the sonar sensor (HC-SR04) was set to 5 out of the discrete values (0–1023) for deciding whether an obstacle is present. The VCC pins of the right and left sonar sensors are connected to the speaker and potentiometer supply, and their trig and echo pins are connected to Arduino pins D34 and D36, along with GND. The front sensor (HC-SR04) VCC is connected in the same way as the left and right sensors, and its trig and echo pins are connected to Arduino pins D13 and D14, along with GND. Analog pins A1 and A0 of the Arduino Mega are connected to the LDR (R1 side) and the water sensor port, with the other terminals to 3V3 and GND; the battery's negative terminal is tied to the common ground rail. Furthermore, pins D52 (SCK) and D50 (MISO) of the Arduino are connected to the GSM port pins SIM_RXD and SIM_TXD, as shown in Figure 5. Arduino pins D28, D30, and D32 are connected to the LED positive side, and D26 to the flame sensor port (KY-026). The other notable connections are TX1 (D18) to the GPS RX pin, D16 to the SD card pin (U RSV), and D2–D5 to the LCD pins DB7, DB6, DB5, and DB4. Finally, Arduino pins D8 and D12, together with push-button leg 1, are connected to the LCD cathode side.
In the last part of the circuit design, the GND (negative) terminals of all the obstacle-avoidance sensors were connected to the GND port, and all VCC (input voltage) pins to the Arduino 5 V pin. Initially, the sonar HC-SR04 detection distance is set from 2 cm to 1 m (by default); if a hurdle is detected, the vibrator vibrates, a beep is generated through the speaker, and a voice instruction (turn left, turn right, or go forward) is heard by the blind person.
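A minimal HC-SR04 ranging sketch of this behavior is shown below. The trig/echo pins follow the front-sensor wiring mentioned above (D13/D14), but which pin is trig and which is echo is our assumption, as is the exact alert action.

```cpp
// Illustrative HC-SR04 ranging: emit a 10 us trigger pulse, time the echo,
// and convert the round-trip time to a one-way distance in centimetres.
const int TRIG_PIN = 13;
const int ECHO_PIN = 14;

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);                                 // 10 us trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long durationUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // round-trip echo time
  return durationUs * 0.0343f / 2.0f;                           // speed of sound ~343 m/s
}

void loop() {
  float d = readDistanceCm();
  if (d > 2.0f && d < 100.0f) {      // default detection window: 2 cm to 1 m
    // obstacle within range: trigger the vibrator, beep, and voice prompt here
  }
  Serial.println(d);
  delay(100);
}
```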
2.2.1. Indoor Prototype of the Electric Guidance Cane
Concerning indoor cane usage patterns, a user sweeps the cane in front of the center of the body while walking, so that any open window, door, split-unit air conditioner, or household object does not hurt them. If the floor surface is smooth enough, they may slide the electric guidance cane along the ground rather than using the left-right sweeping technique. Navigating a large open area without personal or infrastructure help remains quite difficult. On stairs, they use the pencil-grip technique: they follow the railing of the steps or the wall with one hand, and when coming down, they use the guidance cane to feel and judge the depth and width of the stairs with the help of the sound alarm and by sweeping the cane. On escalators in a mall, the bumpy starting strip lets them recognize the first step; to get off at the end, they either sense the steps leveling out at the top or place the guidance cane on the next step so that they know when it touches floor level. One subject said that he stands on his heels when approaching the end so that he can position himself safely on escalators. For lift (elevator) use, they use the guidance cane to know when the lift has arrived if it does not ring; they then use the cane to gauge the height of the lift floor relative to the building floor, step in, and face the center of the lift.
(1) Results and Discussion. Figure 6 shows the flow of the electric guidance cane at every step of the blind user's walk. It also shows the operation of the sensors and actuators and the control performed by the Arduino.
[figure omitted; refer to PDF]
The flowchart of the obstacle detector using the ultrasonic sensing element is shown in Figure 6, which has two parts: the first part deals with obstacle detection, and the second part deals with distance measurement and alerting the user according to the distance of the nearest obstacle, to avoid collision. Depending on the distance of the obstacle from the person, four zones are formed: a far zone, a near zone, a close zone, and a closest (alert) zone. If the detected object is at 2 meters or more, it falls in the far (safe) zone; if it is found at about 1 meter or more, it falls in the near zone; if it is found at less than 1 meter (around 100 cm), it falls in the close zone; and if it is detected closer still, it falls in the closest (alert) zone. A voice instruction, together with vibration and a buzzer alert, is delivered to the user in each zone to warn him/her and to let people around the visually handicapped person assist. Regarding usage, a disabled person can control the cane very easily, which is our main point: to give priority to partially disabled and elderly users. When a disabled person wants to use it, he/she just needs to press the button; then all features and functions are activated, including the microcontroller and the sensors with GPS location. The user then moves on the road, indoors or outdoors, and the cane guides them about their path and obstacles via voice (turn right, turn left, etc.) and sends the location to the person's guardian; it can also be used to track the disabled person in case of emergency.
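The zone-based alerting can be expressed as a small classification step, as in the sketch below. The far/near/close thresholds follow the distances given in the text; the cut-off for the closest zone and the exact alert actions are our assumptions, since the paper does not specify them.

```cpp
// Hedged sketch of the zone classification and alert mapping described above.
enum Zone { FAR_ZONE, NEAR_ZONE, CLOSE_ZONE, CLOSEST_ZONE };

Zone classify(float distanceCm) {
  if (distanceCm >= 200) return FAR_ZONE;      // safe: 2 m or more
  if (distanceCm >= 100) return NEAR_ZONE;     // around 1 m or more
  if (distanceCm >= 50)  return CLOSE_ZONE;    // inside 1 m (assumed cut-off)
  return CLOSEST_ZONE;                         // immediate obstacle
}

void alertUser(Zone z) {
  switch (z) {
    case FAR_ZONE:     /* no alert */                              break;
    case NEAR_ZONE:    /* short beep */                            break;
    case CLOSE_ZONE:   /* beep plus vibration */                   break;
    case CLOSEST_ZONE: /* continuous beep, vibration, voice cue */ break;
  }
}
```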
Figure 7 shows the final demonstration of the proposed smart-aid electronic stick detecting obstacles and object movement indoors using the Arduino. Figure 7(a) represents the lab-scale prototype of the electronic stick. Figure 7(b) shows forward movement of the base: the sensed object intensity value of the sonar sensor was less than the threshold value and no obstacle was detected by any of the obstacle-avoidance sensors, so no beeping alarm was turned on. The formal design of the proposed model and methodology can be seen in Figures 7(c) and 7(d); only obstacles in the indoor area are detected, and the remaining modules, including the voice alarm, vibrator, and beep, maintain their state and start working under this condition. For example, in Figure 7(c) the sonar sensor detects the initial hurdle and provides voice assistance to handle it; when the sonar sensor observes an intensity value above the threshold, it suggests how to handle the obstacle. Moreover, when the object moves toward the first obstacle, the system indicates a right turn, and that condition becomes true. Similarly, when the object moves toward the second obstacle, the system indicates a left turn (Figure 7(d)). The demonstrated results support the efficiency of the idea and give immediate validation of the RF module, vibrator, speaker, sonar sensor (HC-SR04), and the proposed model. Applications of this type can be implemented and installed in indoor areas, parking areas, hotels, schools, shopping malls, college grounds, residential sectors, hospitals, and homes to detect objects and give a clear path to the blind person.
[figures omitted; refer to PDF]
2.2.2. Outdoor Prototype of the Smart Stick for Visually Impaired Users
As per our vision, the central idea of this work is to create an innovation for visually impaired persons, as an embedded system, so that location accuracy and direction results can be improved. As presented in Figure 8, in outdoor use all of the subjects use the left-right tapping technique of the cane on a flat surface path. The reason is that all of them use a cane with a plain tip rather than a roller ball that can be swiped along the surface. In an outdoor setting, when a visually impaired person walks with an angular deviation, one of three speech prompts is issued: "turn right," "turn left," or "forward," depending on the sensor detections and the pedestrian's adherence to the path.
[figure omitted; refer to PDF]
(1) Results and Discussion. The functionality of the ultrasonic sensor remains the same as shown in Figure 6; Figure 8 shows the additional outdoor features. Here, the combination of GPS and GSM modem technologies can offer additional aid to visually impaired persons.
Initially, whenever there is an emergency, the blind individual has to press the trigger button, which activates the GPS and GSM. The GPS identifies the location of the visually handicapped person in real time and passes it to the GSM module in the form of coordinates; an alert message containing the precise location of the person is then sent to the receiver. For additional aid, ultrasonic detectors with voice output are also used to find obstacles, together with an active torch (light) sensor. Therefore, this stick cannot be misused by anyone except the approved users.

Figure 9 shows the design results of the outdoor prototype of the electric guidance cane for users with a vision loss disability, using the Arduino Mega microcontroller. Figure 9(a) represents a visually impaired person outside the city, where the location can be traced and the user can follow the path decided at the start of the journey. In Figures 9(b) and 9(c), the person turns right and left on the surface: motion was detected by the first IR obstacle-avoidance sensor, the object then moved forward past the second avoidance sensor, the third obstacle-avoidance sensor detected its motion, and a voice alarm from the speaker helped the person choose the better side. Meanwhile, when fire was detected by the flame sensor placed on the cane, the vibrator automatically operated to alert the visually disabled person; the water sensor likewise senses water on the road or the depth of the road, if any, as seen in Figures 9(b) and 9(c).

Table 3 contrasts the functionality of older systems with the newly proposed embedded electric guidance cane for users with vision loss. In the older systems, a single microcontroller was typically used with a single distance or infrared sensor to guide the user in only one prescribed direction; others used only an obstacle sensor or a flame sensor, and some engineers used GPS only for location. We conclude that the overall solution to this problem is to minimize cost and increase the usability of the device while adding further features. In this study, the authors have designed a low-cost system with several productive functions. First, it detects obstacles in multiple directions and informs the user through voice and beep alerts, also detecting flame and ground depth. Second, the system detects the presence and movement of water. Third, the system sends and receives the real-time location. Finally, if visually impaired users lose their way, they can alert nearby persons by pressing a button on the cane, indicating that they have lost their path, and a message containing the longitude and latitude is also sent to the guardian, as discussed for Figures 10(a) and 10(b), with the real-time location of that person.
[figures omitted; refer to PDF]
Table 3
Comparison between old systems and the proposed system.
Functionality | Old systems | Proposed system |
Based on an Arduino microcontroller | Yes | Yes |
Operates with the sonar and infrared modules | Yes | Yes |
Flame and water sensing plus hurdle detection, generating vibration and sound to steer the person toward a direction with no obstacle | No | Yes |
Locates the blind person and alerts the guardian (longitude and latitude) | No | Yes |
2.3. User Path Finding
We designed the proposed embedded system and tested its functionality and results with a partially visually impaired person holding it, as featured in Figure 10. The alarm SMSs transmitted from the system to the mobile phones of the disabled person's family and guardian are shown in Figures 10(a) and 10(b), for generating an alert and requesting help, respectively. The SMSs with the device's current location and the tracking path and status requested by the guardians are shown in Figures 10(c) and 10(d), respectively. Finally, a web link is generated, and the current location of the electric guidance cane in Google Maps and the available tracking device are shown in Figures 11(a) and 11(b), respectively.
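The sketch below ties the pieces together as one hedged illustration of this emergency flow: on a button press, the latest GPS fix is turned into a Google Maps link and texted to the guardian. The button pin, baud rates, guardian number, and the use of TinyGPS++ are placeholders and assumptions for illustration, not the authors' published firmware.

```cpp
// Hedged end-to-end sketch: emergency button -> GPS fix -> SMS with a maps link.
#include <TinyGPS++.h>

const int  EMERGENCY_BUTTON_PIN = 7;              // push button to ground (assumed pin)
const char GUARDIAN_NUMBER[]    = "+920000000000"; // placeholder number
TinyGPSPlus gps;

void sendSms(const char* number, const String& text) {
  Serial2.println("AT+CMGF=1");                   // SMS text mode
  delay(500);
  Serial2.print("AT+CMGS=\""); Serial2.print(number); Serial2.println("\"");
  delay(500);
  Serial2.print(text);
  Serial2.write(26);                              // Ctrl+Z sends the message
  delay(2000);
}

void setup() {
  pinMode(EMERGENCY_BUTTON_PIN, INPUT_PULLUP);
  Serial1.begin(9600);                            // GPS module
  Serial2.begin(9600);                            // SIM800L GSM module
}

void loop() {
  while (Serial1.available() > 0) gps.encode(Serial1.read());

  if (digitalRead(EMERGENCY_BUTTON_PIN) == LOW && gps.location.isValid()) {
    String msg = "Emergency! Location: https://maps.google.com/?q=";
    msg += String(gps.location.lat(), 6);
    msg += ",";
    msg += String(gps.location.lng(), 6);
    sendSms(GUARDIAN_NUMBER, msg);
    delay(10000);                                 // crude debounce / resend guard
  }
}
```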
[figures omitted; refer to PDF]
In Pakistan, visually impaired persons (aged between 17 and 60 years) who used the prototype, as shown in Figures 8 and 9, reported clear benefits from the ability to raise an alarm and get connected with their family and guardians. Moreover, data from multiple users of this guidance cane were analyzed, and the users reported that the guidance system improved their lifestyle by giving them additional confidence and making life easier.
We used the microcontroller-based (Arduino) electric guidance cane with GPS and GSM wireless connectivity, in which the visually impaired person can automatically control the directions based on sensors that provide data in the form of directions, buzzer sounds, and voice alerts, to avoid any hurdles that come along the path. In some cases, we required wireless connectivity to make the prototype more scalable, to find the current location of a new or existing visually impaired user, and to give online access to the information managed by the microcontroller boards through a user-friendly interface; a message is then sent to the designated guardian. In this regard, the current longitude and latitude can be accessed from the Internet, and Google Maps provides easy-to-use access to the location of the visually impaired person. Meanwhile, to store the history of previous locations and secure the acquired information of the prototype for further study and analysis, different searching and data-storage algorithms could be used in our proposed designs; in fact, the microcontroller is easy to integrate with several sensors. Moreover, the reported systems are presented as lab-scale prototypes; the proposed embedded systems can similarly be scaled to real facilities by applying other technologies. For wireless connectivity, GSM and GPS are used, as shown in Figure 5. The sensors can be replaced with different types, or a combination of several types, for better results; for example, one can choose a long-range infrared sensor with a maximum range of 1.5 m or a combination of laser diodes and photoresistors.
The feedback from visually impaired persons who tested the proposed embedded system in a practical experiment at a university was positive. Based on the results and outputs, this kind of system would aid their navigation. The study also collected usability suggestions for further development of the system and for real-life use.
3. Conclusions
In this paper, we designed and developed an electric guidance cane for the safety, protection, and convenience of visually impaired persons. It guides the user using voice and vibration alerts; it is therefore useful both for people who are visually impaired and for those who are also hearing impaired. It is used for navigation and hurdle detection with the help of three ultrasonic sensors installed in three different directions, i.e., front, right, and left, and at the foot of the stick. Using GSM and GPS, the user can send their location to the concerned person; we have developed a smart algorithm with a compact hardware design in a small box that comprises the GPS module, GSM module, Arduino Mega, a water sensor at the bottom, and a flame sensor. The battery and other components are inside the stick body. This is an efficient design through which visually impaired people can go anywhere without any problem and easily avoid hitting any hurdle in their path. We installed a GSM module for locating blind people anywhere on Earth: a specific mobile number is saved in the code, and whenever a specific message is sent from this number to the blind stick's GSM number, the location of the blind person is sent back to that number. We used the Arduino to keep the hardware compact and to avoid making the stick heavy, which also increases the efficiency of the system; the stick is easy for a blind person to carry. In the future, we will replace the rechargeable battery with solar charging and road-friction energy harvesting, so that battery life increases, and we will interlink the system with an AI camera able to detect obstacles.
Authors’ Contributions
AK, MAA, and MAJ devised the methodology and acquired funding. MSS, MMAK, and AU carried out the formal analysis and data curation. AK and MSS wrote the original draft, reviewed the writing, and edited the manuscript. AK and AU proofread the manuscript before its final submission. AK, MAA, and MAJ contributed equally to this work.
Acknowledgments
This work was supported by the Guangzhou Government Project under grant no. 62104301.
[1] R. P. A. Bourne, S. R. Flaxman, T. Braithwaite, M. V. Cicinelli, A. Das, Vision Loss Expert Group, "Magnitude, temporal trends, and projections of the global prevalence of blindness and distance and near vision impairment: a systematic review and meta-analysis," Lancet Global Health, vol. 5 no. 9, pp. e888-e897, DOI: 10.1016/S2214-109X(17)30293-0, 2017.
[2] T. R. Fricke, N. Tahhan, S. Resnikoff, E. Papas, A. Burnett, M. H. Suit, T. Naduvilath, K. Naidoo, "Global prevalence of presbyopia and vision impairment from uncorrected presbyopia: systematic review, meta-analysis, and modelling," Ophthalmology, vol. 125 no. 10, pp. 1492-1499, DOI: 10.1016/j.ophtha.2018.04.013, 2018.
[3] M. Varghese, S. S. Manohar, K. Rodrigues, V. Kodkani, S. Pendse, "The smart guide cane: an enhanced walking cane for assisting the visually challenged," 2015 International Conference on Technologies for Sustainable Development (ICTSD),DOI: 10.1109/ICTSD.2015.7095907, .
[4] A. Shaha, S. Rewari, S. Gunasekharan, "SWSVIP-smart walking stick for the visually impaired people using low latency communication," 2018 International Conference on Smart City and Emerging Technology (ICSCET),DOI: 10.1109/ICSCET.2018.8537385, .
[5] M. P. Agrawal, A. R. Gupta, "Smart stick for the blind and visually impaired people," 2018 Second international conference on inventive communication and computational Technologies (ICICCT), pp. 542-545, DOI: 10.1109/ICICCT.2018.8473344, .
[6] A. S. Al-Fahoum, H. B. Al-Hmoud, A. A. Al-Fraihat, "A smart infrared microcontroller-based blind guidance system," Active and Passive Electronic Components, vol. 2013,DOI: 10.1155/2013/726480, 2013.
[7] T. Miura, Y. Ebihara, M. Sakajiri, T. Ifukube, "Utilization of auditory perceptions of sounds and silent objects for orientation and mobility by visually-impaired people," 2011 IEEE International Conference on Systems, Man, and Cybernetics, pp. 1080-1087, DOI: 10.1109/ICSMC.2011.6083818, .
[8] H. Marion, J. Michael, Assistive Technology for Visually Impaired and Blind People,DOI: 10.1007/978-1-84628-867-8, 2010.
[9] V. Tiponut, D. Ianchis, Z. Haraszy, I. Bogdanov, "Work directions and new results in electronic travel aids for blind and visually impaired people," WSEAS Transactions on Systems, vol. 9, pp. 1086-1097, 2010.
[10] V. Tiponut, S. Popescu, I. Bogdanov, C. Caleanu, "Obstacles detection system for visually impaired guidance," 12th WSEAS International Conference on SYSTEMS, .
[11] I. Ulrich, J. Borenstein, "Local obstacle avoidance with look-ahead verification," Proceedings 2000 ICRA. Millennium Conference. IEEE International Conference on Robotics and Automation. Symposia Proceedings (Cat. No.00CH37065), vol. 3, pp. 2505-2511, DOI: 10.1109/ROBOT.2000.846405, .
[12] Y. Seki, T. Sato, "A training system of orientation and mobility for blind people using acoustic virtual reality," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19 no. 1, pp. 95-104, 2011.
[13] E. S. Narayanan, D. D. Gokul, B. P. Nithin, P. Vidhyasagar, "IoT based smart walking cane for typhlotic with voice assistance," 2016 Online International Conference on Green Engineering and Technologies (IC-GET),DOI: 10.1109/GET.2016.7916687, .
[14] D. Yuan, R. Manduchi, "A tool for range sensing and environment discovery for the blind," 2004 Conference on computer vision and pattern recognition workshop, pp. 39-39, DOI: 10.1109/CVPR.2004.292, .
[15] D. Yuan, R. Manduchi, "Dynamic environment exploration using a virtual white cane," 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), vol. 1, pp. 243-249, DOI: 10.1109/CVPR.2005.136, .
[16] P. Baranski, M. Polanczyk, P. Strumillo, "A remote guidance system for the blind," The 12th IEEE International Conference on e-Health Networking, Applications and Services, pp. 386-390, DOI: 10.1109/HEALTH.2010.5556539, .
[17] T. Miura, K. Ueda, S. Ino, T. Muraoka, T. Ifukube, "Object's width and distance distinguished by the blind using auditory sense while they are walking," Journal of the Acoustical Society of America, vol. 123 no. 5,DOI: 10.1121/1.2935714, 2008.
[18] R. Boldu, D. J. C. Matthies, H. Zhang, S. Nanayakkara, "AiSee: an assistive wearable device to support visually impaired grocery shoppers," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 4,DOI: 10.1145/3432196, 2020.
[19] A. A. Tahat, "A wireless ranging system for the blind long-cane utilizing a smart-phone," 2009 10th International Conference on Telecommunications, pp. 111-117, .
[20] D. Bolgiano, E. Meeks, "A laser cane for the blind," IEEE Journal of Quantum Electronics, vol. 3 no. 6, pp. 268-268, DOI: 10.1109/JQE.1967.1074528, 1967.
[21] S. Shoval, I. Ulrich, J. Borenstein, "Robotics-based obstacle-avoidance systems for the blind and visually impaired - Navbelt and the guidecane," IEEE Robotics and Automation Magazine, vol. 10 no. 1,DOI: 10.1109/MRA.2003.1191706, 2003.
[22] L. Shangguan, Z. Yang, A. X. Liu, Z. Zhou, Y. Liu, "STPP: spatial-temporal phase profiling-based method for relative RFID tag localization," IEEE/ACM Transactions on Networking, vol. 25 no. 1, pp. 596-609, DOI: 10.1109/TNET.2016.2590996, 2017.
[23] A. Raina, D. Bansal, "Poster: smart-phones as active sensing platform for road safety solutions," Proceedings of the 14th Annual International Conference on Mobile Systems, Applications, and Services Companion, pp. 74-74, .
[24] A. Zvironas, M. Gudauskis, D. Plikynas, "Indoor electronic traveling aids for visually impaired: systemic review," 2019 International conference on computational science and computational intelligence (CSCI), pp. 936-942, DOI: 10.1109/CSCI49370.2019.00178, .
[25] N. A. Giudice, G. E. Legge, "Blind navigation and the role of technology," Engineering Handbook of Smart Technology for Aging, Disability, and Independence, pp. 479-500, .
[26] F. Xiao, Q. Miao, X. Xie, L. Sun, R. Wang, "Indoor anti-collision alarm system based on wearable Internet of things for smart healthcare," IEEE Communications Magazine, vol. 56 no. 4, pp. 53-59, DOI: 10.1109/MCOM.2018.1700706, 2018.
[27] M. F. Saaid, I. Ismail, M. Z. H. Noor, "Radio frequency identification walking stick (RFIWS): a device for the blind," 2009 5th International Colloquium on Signal Processing & Its Applications, pp. 250-253, DOI: 10.1109/CSPA.2009.5069227, .
[28] B. Andò, S. Baglio, V. Marletta, A. Valastro, "A haptic solution to assist visually impaired in mobility tasks," IEEE Transactions on Human-Machine Systems, vol. 45 no. 5, pp. 641-646, DOI: 10.1109/THMS.2015.2419256, 2015.
[29] S. S. Reddy, G. Sathyaprabha, P. V. V. P. Rao, "Voice based guidance and location indications system for the blind using GSM," GPS and Optical Device Indicator, vol. 3, pp. 822-832, 2014.
[30] E. Milios, B. Kapralos, A. Kopinska, S. Stergiopoulos, "Sonification of range information for 3-D space perception," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 11 no. 4, pp. 416-421, DOI: 10.1109/TNSRE.2003.819645, 2003.
[31] X. Xing, R. Zhou, L. Yang, "The current status of development of pedestrian autonomous navigation technology," 2019 26th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS),DOI: 10.23919/ICINS.2019.8769378, .
[32] P. B. L. Meijer, "An experimental system for auditory image representations," IEEE Transactions on Biomedical Engineering, vol. 39 no. 2, pp. 112-121, DOI: 10.1109/10.121642, 1992.
[33] M. Olfati, W. Yuan, A. Khan, S. H. Nasseri, "A new approach to solve fuzzy data envelopment analysis model based on uncertainty," IEEE Access, vol. 8, pp. 167300-167307, DOI: 10.1109/ACCESS.2020.3022158, 2020.
[34] M. Ahmad, A. Khan, A. Khan, M. Mazzara, S. Distefano, A. Sohaib, O. Nibouche, "Spatial prior fuzziness pool-based interactive classification of hyperspectral images," Remote Sensing, vol. 11 no. 9, article 1136,DOI: 10.3390/rs11091136, 2019.
[35] R. Chen, Z. Tian, H. Liu, F. Zhao, S. Zhang, H. Liu, "Construction of a voice driven life assistant system for visually impaired people," 2018 International Conference on Artificial Intelligence and Big Data (ICAIBD), pp. 87-92, DOI: 10.1109/ICAIBD.2018.8396172, .
[36] Bay Advanced Technologies, "BAT “K” sonar: ultrasonic sensing device for the blind," 2012. http://www.batforblind.co.nz
[37] A. Garg, M. Balakrishnan, K. Paul, R. Paul, D. Mehra, V. Singh, D. Manocha, "Cane mounted knee-above obstacle detection and warning system for the visually impaired," Proceedings of the 11th International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, pp. 143-151, .
[38] D. T. Hartong, F. F. Jorritsma, J. J. Neve, B. J. Melis-Dankers, A. C. Kooijman, "Improved mobility and independence of night-blind people using night-vision goggles," Investigative Ophthalmology & Visual Science, vol. 45 no. 6, pp. 1725-1731, 2004.
[39] A. Ullah, M. Shaheen, A. Khan, M. Khan, K. Iqbal, "Evaluation of topology-dependent growth rate equations of three-dimensional grains using realistic microstructure simulations," Materials Research Express, vol. 6 no. 2, article 026523,DOI: 10.1088/2053-1591/aaecc3, 2018.
[40] I. Ulrich, I. Borenstein, "Correspondence the guide cane — applying mobile robot technologies to assist the visually impaired," IEEE Transactions on Systems Man and Cybernetics-Part A Systems and Humans, vol. 31 no. 2, pp. 131-136, DOI: 10.1109/3468.911370, 2001.
[41] M. Nie, J. Ren, Z. Li, J. Niu, Y. Qiu, Y. Zhu, S. Tong, "SoundView: an auditory guidance system based on environment understanding for the visually impaired people," Proceedings of the 31st Annual International Conference of the IEEE Engineering in Medicine and Biology Society: Engineering the Future of Biomedicine (EMBC’09), pp. 7240-7243, .
[42] K. W. Lin, T. K. Lau, C. M. Cheuk, Y. Liu, "A wearable stereo vision system for visually impaired," 2012 IEEE International Conference on Mechatronics and Automation, pp. 1423-1428, DOI: 10.1109/ICMA.2012.6284345, .
[43] L. Dunai, D. Ismael, L. Lengua, I. Tortajada, S. Brusola, "Obstacle detectors for visually impaired people," 2014 International Conference on Optimization of Electrical and Electronic Equipment (OPTIM),DOI: 10.1109/OPTIM.2014.6850903, .
[44] M. A. Rahman, M. S. Sadi, M. M. Islam, P. Saha, "Design and development of navigation guide for visually impaired people," 2019 IEEE international conference on biomedical engineering, computer and information Technology for Health (BECITHCON), pp. 89-92, DOI: 10.1109/BECITHCON48839.2019.9063201, .
[45] A. Riazi, F. Riazi, R. Yoosfi, F. Bahmeei, "Outdoor difficulties experienced by a group of visually impaired Iranian people," Journal of current ophthalmology, vol. 28 no. 2, pp. 85-90, 2016.
[46] A. Kumar, R. Patra, M. Manjunatha, J. Mukhopadhyay, A. K. Majumdar, "An electronic travel aid for navigation of visually impaired persons," 2011 Third International Conference on Communication Systems and Networks (COMSNETS 2011), .
Copyright © 2021 Asad Khan et al. This work is licensed under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Abstract
Vision is, no doubt, one of the most important and precious gifts to humans; however, a fraction of people are visually impaired and cannot see properly. These visually impaired people face many challenges in their lives, such as performing routine activities like shopping and walking. Additionally, they need to travel to known and unknown places for different necessities and hence require an attendant. Most of the time, affording an attendant is neither easy nor inexpensive, especially when almost 2.5% of the population of Pakistan is visually impaired. There are some existing ways of helping these physically impaired people, for example, devices with a navigation system and speech output; however, these are either less accurate, costly, or heavy, and none of them has shown perfect results in both indoor and outdoor activities. The problems become even more severe when the users are partially deaf as well. In this paper, we present a proof of concept of an embedded prototype that not only navigates but also detects hurdles and gives alerts (using a speech alarm output and/or vibration for the partially deaf) along the way. The designed embedded system includes a cane, a microcontroller, a Global System for Mobile Communication (GSM) module, a Global Positioning System (GPS) module, an Arduino, a speech output module and speaker, a Light-Dependent Resistor (LDR), and ultrasonic sensors for hurdle detection with voice and vibrational feedback. Using our developed system, physically impaired people can reach their destination safely and independently.
1 School of Computer Science and Cyber Engineering, Guangzhou University, Guangzhou 510006, China
2 School of Information Engineering, Chang’an University, Xi’an 710064, China
3 Department of Computer Science, FAST-National University of Computer and Emerging Sciences, Faisalabad, Pakistan
4 Department of Mathematical Sciences, Karakoram International University (KIU), Gilgit-Baltistan, Pakistan
5 Department of Examinations, GC University Faisalabad, Pakistan; School of Computer Science and Technology, Chongqing University, Chongqing 400044, China