1 Introduction
In recent years, the 'Internet of things' (IoT) has developed rapidly, and several kinds of wearable IoT (wIoT) devices have appeared on the market, such as the Apple™ Watch and Google™ Glass. Among the functions desired of wIoT devices, motion detection is particularly important. Analysis of motion data supports applications across many fields, and making this data accessible remotely and wirelessly is a further benefit that takes full advantage of the IoT framework. Body motion data is also useful in remote supervision applications. These include the supervision of elderly residents at nursing homes, where caregivers cannot physically monitor all residents at all times. Remote access to body motion allows caregivers to form a picture of a resident's daily activities and to make decisions based on a comparison of the activities detected with the activities expected. In a similar application, motion data can be used to alert a remote caregiver when a resident has fallen or is at risk of falling. Furthermore, motion data can allow users themselves to know when they are showing a tendency toward falling, helping to prevent the fall in the first place.
Our project comprises a wearable sensor system that uses the IoT framework and connects to a user's smartphone to capture body motion. The body motion data is collected from static electric fields (EFs) through capacitive coupling on the device. The system uses the data gathered from these fields to accurately detect and classify different types of motion, and it aims to operate at ultra-low power (LP), in the range of nanowatts. In [1], Cohn et al. discuss a low-powered sensor that operates via the user's static EF. Their system detected simple and complex body motions and provided a non-intrusive wearable sensor, but it did not include a low-powered, fast, wireless form of data transfer such as Bluetooth. Owing to this difference, our system has the advantage of wireless data transfer for remote access to user data. In addition, the system developed in [1] includes a wake-up stage that increases power consumption.
Our approach to collecting body motion data uses static EFs. These fields are useful because their magnitude does not change over distance or time from the original value, which allows us to measure the change in the field as the user moves through it and provide accurate feedback. It also allows us to recognise smaller movements, since the fields move with the user. This type of body motion data can serve a variety of purposes related to tracking human movement, including sleep and posture analysis, health and fitness tracking, remote supervision of elderly or disabled persons, and fall detection and prevention. In this work, human movement is categorised into separate motions, and a decision about which activity is being performed is wirelessly transmitted to a smartphone application for remote monitoring and real-time data collection.
The objective of this paper is to present a multisensory system for human body motion sensing using body area sensors (BASs) and static EFs. For this paper, we implemented an embedded sensor system with an LP Bluetooth communication module to collect body single node voltage using a smartphone. Our proposed system will be useful for monitoring the elderly and has utility in identifying simple activities among child physical rehabilitation patients, and for human behaviour analysis research.
1.1 Major contributions
In this paper, we propose to use smartphones as the primary platform for developing an embedded system for detecting daily activities using body motion; these devices naturally combine the detection and communication components. Our major contributions are as follows:
- developed a multisensory embedded IoT system for daily activity detection;
- proposed a smartphone-based motion detection system using a static EF with a wearable BAS;
- designed, developed, and implemented a self-assistance system which analyses upper and lower body motions using body single node voltage; and
- implemented a mobile system for remote supervision of users, which can be used to differentiate current activities from normal, expected activities.
The rest of this paper is organised as follows: in Sections 2 and 3, we describe the background and relevant related work. In Section 4, we discuss the process of designing our system. In this section, we also explain the circuit design of our system. In Section 5, we discuss the data collection process. Sections 6 and 7 are the results and evaluation of our smartphone-based prototype system. In Section 8, we conclude this paper with possible directions for future research.
2 Background and motivation
Wearable motion sensors have been used in high-impact applications such as simple and complex activity recognition, health and wellness sensing, and elderly care. Without prior experience, it is hard to appreciate the effect of an injury caused by abnormal body motion; a single bad incident can severely incapacitate or even kill an individual. wIoT solutions can help in the diagnosis and analysis of users with neuromotor disorders.
In this work, we investigated static EF sensing and built a circuit that uses static EFs to detect body motion. The advantages of differential static-field sensors over accelerometers are LP consumption and fast response to motion; better sensitivity is also achieved owing to the low noise level of capacitive detection. Using this self-built sensor, we collected preliminary data to analyse the differences between separate simple motions. The scope of applications for motion detection is wide, and the benefits of this type of data are important and impactful. The applications described above represent a large societal benefit in terms of quality of life and personal health and safety. The impact of this technology extends to those who use the data remotely as well, since it decreases their menial workload and allows them to use their time more efficiently and effectively.
In [1], Cohn et al. introduced an ultra-low-power (LP) approach to passively sense body motion based on static EFs. The total power consumption for their system was 6.6 μW. Besides the LP consumption, their system passively relies on the existing static EF between the human body and the environment, as shown in Fig. 1, and measures the voltage difference between the human body and the environment through capacitive coupling via a capacitor (C_S) connected between the body and the local ground.
[IMAGE OMITTED. SEE PDF]
The sensed voltage (V_S) is the difference between the voltage on the body (V_B) and the voltage on the local ground plane (V_R). The equation can be expressed as V_S = V_B − V_R, and the circuit diagram is shown in Fig. 2.
[IMAGE OMITTED. SEE PDF]
For the hardware, the system included a gain stage, buffer, low-pass filter (LPF), and a wake-up portion. Fig. 3 shows the block diagram of the hardware of [1], which was implemented using ultra-LP, off-the-shelf components. Also, to filter out the high amplitude 60 Hz signal on the body, they applied a third-order Butterworth LPF with a corner frequency of 10 Hz using an active filter.
[IMAGE OMITTED. SEE PDF]
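For reference, a third-order Butterworth low-pass response with a 10 Hz corner frequency, as used in [1], has the standard normalised transfer function given below; the particular active-filter topology and component values chosen in [1] are not reproduced here.

$$H(s) = \frac{1}{\left(\frac{s}{\omega_c}+1\right)\left(\left(\frac{s}{\omega_c}\right)^2+\frac{s}{\omega_c}+1\right)}, \qquad \omega_c = 2\pi \times 10\ \text{Hz}$$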
This system [1] served as great inspiration for portions of our work. In developing our system further, we faced four challenges. The first challenge was how to build an integrated cyber-human system using a microcontroller. Cyber-human systems advance the scientific understanding of computing and communication systems together with a theoretical and practical understanding of the behavioural, social, and design sciences, to better design and develop diverse kinds of systems. They also seek to improve our fundamental understanding of how, and by which processes, interactive systems should be designed to achieve human–computer symbiosis and computer-mediated human communication, collaboration, and competition [2]. We made the general assumption that the end user has access to a smartphone to receive and process data. The second challenge was how to develop a low-powered sensor using discrete components. Our approach was to use a passive LPF, a pull-down resistor, and a capacitor to lower power consumption. Initial attempts also integrated an op-amp to amplify the body's signal [3]; after testing, we decided not to use the op-amp when monitoring simple body motions, which cut the power consumption of our design roughly in half. The third challenge was how to collect and transmit data wirelessly in real time. Our solution was to use a Bluetooth module to send data from the Arduino microcontroller to the user's smartphone. Several alternatives were possible, such as Wi-Fi communication; a similar commercial product, the Fitbit Flex, uses a combination of Bluetooth communication between the device and the phone and adaptive network topology (ANT) wireless networking to sync with compatible health applications [4]. Considering power consumption, however, Bluetooth proved the better approach and best fit our system's constraints. The final challenge was to detect complex motions accurately. We approached this with both hardware and software: on the hardware side, an op-amp was introduced to amplify the signal so that the output is large enough for detection [3]; on the software side, we designed and developed an algorithm to accurately detect motions.
3 Related work
Human body motion detection using static EFs has been the subject of many studies over the past decade. Most previous approaches to motion detection use accelerometers attached to the subject to gather data. Existing dynamic models are not subject-specific, because their dynamic parameters are fitted to general models; they therefore have very limited accuracy in predicting body motion for a specific individual.
Most motion sensors, such as [5], are inertial measurement units (IMUs). An IMU sensor is usually a combination of an accelerometer, a gyroscope, and a magnetometer. Compared to our system, an IMU consumes more power and can only measure the movement of a single node. In our system, which uses static EFs, the device is attached to a single node on the human body, yet the movement of the whole body is measured.
In [6], Trost et al. used a wrist-worn sensor and a sensor on the hip to detect seven physical activities. They showed the potential of the wrist position for activity recognition using logistic regression as a classifier; however, they assessed the two positions separately and did not combine the two sensors. Similarly, a wrist-worn accelerometer was used in [7] to recognise eight activities, including working on a computer. In [8], Ramos-Garcia and Hoover detect the act of eating using a hidden Markov model with a wrist-worn accelerometer and a gyroscope. They recognise eating by dividing the activity into sub-activities (resting, eating, drinking, using utensils, and others) and report an accuracy of 84.3%. A feasibility study on smoking detection using a wrist-worn accelerometer was carried out in [9], where Scholl and Van reported a user-specific accuracy of 70%; that work uses only an accelerometer at the wrist position. Parate et al. [10] use an accelerometer, a gyroscope, and a magnetometer at the wrist position to recognise smoking puffs, although smoking is only differentiated as a whole from all other activities.
To address the drawbacks of the aforementioned research, we propose a smartphone-based body motion detection system. Our system is designed to directly address some of the drawbacks of the existing systems and yield good activity prediction results. We illustrate the difference between our system and the other related works in Table 1.
Table 1 Comparison of existing work based on different features
Approach | Uses IoT system | Communication capability | Uses body node voltage | Uses embedded sensor | LP system |
Cohn [1] | no | no | yes | yes | yes |
Konrad [5] | yes | yes | no | no | no |
Ming [11] | yes | yes | yes | yes | no |
Bennett [4] | yes | no | no | no | no |
WIH [12] | no | yes | no | yes | no |
Edward [13] | no | no | no | yes | no |
Shyamal [14] | yes | yes | no | yes | no |
Maurizio [15] | no | no | no | yes | no |
MSF [16] | no | no | no | yes | no |
Trost [6] | no | no | no | yes | no |
Silva [7] | no | no | no | yes | no |
Ramos [8] | no | no | no | yes | yes |
Scholl [9] | no | no | no | yes | no |
Parate [10] | yes | no | no | yes | no |
Our approach | yes | yes | yes | yes | yes |
4 Circuits and system
The strength of our proposed system lies in using existing wireless communication to provide an LP solution that gives users maximum freedom of movement during physical activity. In addition, we use small, lightweight, user-friendly devices such as a smartphone and a wrist band. The progression of our system can be described in two stages: an early prototype using a commercial sensor and the final design using our own sensor.
4.1 Early prototype using a commercial sensor
For our initial prototype, we utilised a commercial sensor, which included an accelerometer, a gyroscope, a magnetometer, and an embedded algorithm to detect and determine the motion. The circuit is composed of two parts. The first part was an Arduino microcontroller that connects to a BNO055 sensor, as shown in Fig. 4. The BNO055 sent data using the I2C communication protocol, meaning only two outputs from the Arduino, a clock line, and a data line, were necessary.
[IMAGE OMITTED. SEE PDF]
The second part of the architecture was the Arduino connecting to the HC-05 [17] Bluetooth module, as shown in Fig. 5. The Bluetooth module communicated with the Arduino using serial data transfer, again using only two microcontroller outputs.
[IMAGE OMITTED. SEE PDF]
The description of the proposed system architecture shown in Fig. 6 is as follows: a single hand wearing the glove with integrated BNO055 detects body motion. The BNO055 collects the position data of the hand.
[IMAGE OMITTED. SEE PDF]
The Arduino calculates the first derivative of the position data from the BNO055 sensor. Next, the HC-05 module transmits the data from the Arduino and sends it to an app on the phone via Bluetooth, where the app displays the graphs of body motion data and an activity decision. The initial prototype using a commercial sensor is shown in Fig. 7.
[IMAGE OMITTED. SEE PDF]
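A minimal Arduino sketch illustrating this data path is given below. It is a sketch under stated assumptions rather than the exact firmware of our prototype: it assumes the Adafruit BNO055 library, a SoftwareSerial link to the HC-05 on pins 10 and 11, an approximately 50 Hz sample period, a simple backward difference to approximate the first derivative, and the Euler orientation vector standing in for the position-like signal.

#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>
#include <utility/imumaths.h>
#include <SoftwareSerial.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);          // BNO055 on the I2C bus (SCL/SDA)
SoftwareSerial bt(10, 11);                          // RX, TX wired to the HC-05 module (assumed pins)

imu::Vector<3> prevPos;                             // previous sample for the derivative
unsigned long prevTime = 0;

void setup() {
  bt.begin(9600);                                   // HC-05 default baud rate
  bno.begin();                                      // start the BNO055 sensor
}

void loop() {
  // Read the current orientation estimate from the BNO055
  imu::Vector<3> pos = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  unsigned long now = millis();

  if (prevTime > 0) {
    float dt = (now - prevTime) / 1000.0;           // elapsed time in seconds
    // Backward-difference approximation of the first derivative on each axis
    float dx = (pos.x() - prevPos.x()) / dt;
    float dy = (pos.y() - prevPos.y()) / dt;
    float dz = (pos.z() - prevPos.z()) / dt;
    // Send one sample per line over Bluetooth to the smartphone app
    bt.print(dx); bt.print(','); bt.print(dy); bt.print(','); bt.println(dz);
  }
  prevPos = pos;
  prevTime = now;
  delay(20);                                        // ~50 Hz sample period (illustrative choice)
}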
4.2 Final design using our developed sensor
For our final design, as shown in Fig. 8, we focused on building our own sensor to integrate into the system. Our sensor uses static EFs and measures the voltage difference between the human body and the environment.
[IMAGE OMITTED. SEE PDF]
We initially tried different materials and components to serve as the body contact, including conductive fabric and a conductive wrist strap; after testing, the wrist strap performed best. To obtain the raw data, we connect the body contact node to one side of a capacitor, ground the other side, and then measure the voltage across the capacitor.
To reduce the effects of ambient noise, we implemented an LPF to filter out the noise affecting the body signal, such as 60 Hz hum from the power mains. Through testing, we found that a cut-off frequency of 30.8 Hz filtered out the noise while still allowing accurate detection of simple body motions such as walking, running, and jumping. To stabilise the value when the subject was at rest, a pull-down resistor was introduced into the circuit. It was connected in parallel with the LPF and pulled the voltage down so that the signal level was effectively reduced to ground when the user is at rest.
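For a first-order passive RC low-pass stage, the cut-off frequency follows the standard relation below. The actual resistor and capacitor values of our circuit are not listed in this paper, so the component pair shown is only a hypothetical example that yields approximately the 30.8 Hz cut-off used:

$$f_c = \frac{1}{2\pi RC}, \qquad \text{e.g. } R = 10\ \text{k}\Omega,\ C = 517\ \text{nF} \;\Rightarrow\; f_c \approx 30.8\ \text{Hz}$$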
In the next step, to detect complex motions such as hand waving, bending, and typing, we implemented an amplifier in the circuit, because the signal magnitude is too low for accurate detection. An LP op-amp, the MCP6041 [3], was added to our circuit after the LPF and pull-down resistor. We also introduced a voltage divider to reduce the battery voltage (7.4 V) to 0.6 V, which was used as the rail-to-rail voltage for the op-amp. The 7.4 V supply voltage was selected because it fits within the recommended voltage specification of our Arduino Nano (between 7 and 12 V).
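The divider ratio needed to derive 0.6 V from the 7.4 V battery follows from the standard voltage-divider relation; the resistor values shown are hypothetical examples that give roughly the required ratio, not the exact components in our design:

$$V_{out} = V_{in}\,\frac{R_2}{R_1+R_2}, \qquad \frac{0.6}{7.4} \approx 0.081, \quad \text{e.g. } R_1 = 113\ \text{k}\Omega,\ R_2 = 10\ \text{k}\Omega \;\Rightarrow\; V_{out} \approx 0.60\ \text{V}$$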
The output of the circuit was connected to an Arduino analogue pin, from which the Arduino could read the data and send it to the smartphone through Bluetooth. We chose an Arduino as our microprocessor because it is simple, easy to use, and has a built-in analogue-to-digital converter [18].
We transmit the data to a smartphone in real time through the Bluetooth communication module. After analogue-to-digital conversion on the Arduino, we send each data point to the user's smartphone over Bluetooth, followed by a delimiter used by our app. Once the application was installed on the smartphone and the phone was paired with the Bluetooth module, the app received the data and used the delimiter, in our case the pound symbol (#), to separate the individual data points and display them in real time. Screenshots of real-time body sensor data collection are shown in Fig. 9, and the user interface for real-time activity detection is shown in Fig. 10.
[IMAGE OMITTED. SEE PDF]
[IMAGE OMITTED. SEE PDF]
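The sketch below illustrates how the sensor output can be sampled on an analogue pin and streamed to the phone with the '#' delimiter. The pin numbers, baud rate, and SoftwareSerial wiring to the HC-05 are assumptions made for illustration rather than our exact firmware; the ~200 Hz rate matches the sampling rate noted in Section 5.

#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);              // RX, TX wired to the HC-05 Bluetooth module (assumed pins)
const int sensorPin = A0;               // sensor output into the Arduino ADC

void setup() {
  bt.begin(9600);                       // HC-05 default baud rate
}

void loop() {
  int raw = analogRead(sensorPin);      // 10-bit ADC reading of the body-node voltage
  bt.print(raw);                        // send the sample ...
  bt.print('#');                        // ... followed by the '#' delimiter parsed by the app
  delay(5);                             // ~200 Hz sampling (illustrative)
}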
Our system is divided into two separate circuit modules. The first circuit was used to detect simple motions such as walking, jumping, and running, as shown in Fig. 11; it uses an LPF, a pull-down resistor, and a capacitor, and its power consumption is around 7.2 μW. The second circuit is used to detect complex activities, for which we increase the sensor signal strength by adding an LP amplifier stage, as shown in Fig. 12. We initially tried to detect both simple and complex motions with the same circuit (circuit 1), but the signal strength for complex motions such as typing is very low. The data flow diagram for the system is shown in Fig. 13.
[IMAGE OMITTED. SEE PDF]
[IMAGE OMITTED. SEE PDF]
[IMAGE OMITTED. SEE PDF]
It is difficult to distinguish between finger and hand movements in complex activity detection. This makes sense because small movements of the fingers and hands do not have a large effect on the overall capacitance between the body and its environment. Therefore, we focus on detecting simple motions with our current hardware implementation; complex activity detection could still be improved with a more sensitive sensor. The developed sensor compares favourably with commercially available IMU sensors and built-in smartphone sensors, whose dynamic range is low.
5 Data collection
After completing our system, we began testing it. To test the accuracy of the system, we recruited 26 participants of both genders, a variety of age groups, and a range of heights (see Table 2 for statistics). We established a baseline walk period for each of the walking traces by manually finding the walk-start (t_start) and walk-end (t_end) events.
Table 2 Statistics about subjects participating in our data collection
Gender | Age, years | Height, cm |
f: 4 | 20–29: 16 | 140–159: 7 |
m: 22 | 30–34: 6 | 160–169: 4 |
— | 35–39: 4 | 170–179: 12 |
— | — | 180–190: 3 |
Data was collected for different simulated scenarios: for simple motion, while the subject was walking, running, jumping, and bending; for complex motion, while the subject simulated typing, washing dishes, eating, and hand waving. We collected data for the different motion events in different environments, with 200 samples per event.
We started data collection with the subject at rest. This gave us a baseline and, if there were differences in values, let us identify and correct whatever was causing them. We then proceeded with walking, running, jumping, and bending. For each subject and each motion, we gathered a minimum of three data sets, allowing us to see whether the data was consistent and, if not, how much it varied; this would play a critical role in our later analysis. The average values for each test subject are shown below, with each average expressed in millivolts.
During our tests, we tracked simple motions performed by the user. What distinguishes these motions is the pattern they produce: in simple motion, a subject performs the same process repeatedly, as in walking, running, and jumping, all of which fall under the simple motion category. Using this knowledge of the activity pattern, the user's activity can be observed, tracked, and predicted.
As shown in Table 3, the sensor values vary between test subjects, but the relative range of signal variation is similar. In addition, each signal waveform maintains a consistent and clearly distinguishable pattern.
Table 3 Average sensor signal values (millivolt) for four test subjects
Test subject | Resting | Walking | Running | Jumping | Bending |
S1 | 0 | 10.20 | 11.87 | 15.53 | 1.51 |
S2 | 0 | 6.98 | 8.54 | 13.48 | 2.10 |
S3 | 0 | 6.15 | 8.11 | 12.35 | 3.52 |
S4 | 0 | 5.42 | 12.30 | 15.33 | 0.84 |
During our tests, we noted significant changes in the data for each subject, which also varied with the time and day. On closer analysis, the most significant factors were the subject's clothing and footwear. Because the footwear establishes the ground connection to the earth, which completes the circuit, a bad connection changed the data significantly; we saw jumps in values of around ±20 depending on the person's footwear, with the most consistent data coming from normal gym shoes.
Clothing also played a role, as certain types of clothing build up more charge on the body, causing the change in the field to be greater than it would be without that charge. When performing the final tests, we took this into consideration by having subjects wear gym shoes and non-cotton clothing, as the latter typically built up the most static. This allowed us to get consistent data for each test subject that we could use in our analysis. To evaluate the effect of different clothing, we also tested our system with several other commonly used materials: athletic clothes (made of cotton, flax, wool, ramie, or silk), denim (98% cotton, 2% elastane), and roma (74% polyester, 27% rayon, 3% spandex). We observed that the combination of athletic clothes and tennis shoes gives accurate motion detection results; tennis shoes with firm rubber soles keep our circuit best grounded. To improve the grounding and to work with all types of clothing, we will include a small local ground plane on the sensor board. Both the body and the local ground plane will then be capacitively coupled to the earth ground, which will improve the dynamic sensor range.
Motion detection using a built-in smartphone sensor such as the accelerometer can also be affected by clothing, since clothing influences the accelerometer signals through the placement of the smartphone during data collection; the data on the accelerometer's x, y, and z axes vary with the position of the phone. The most common approach to managing movement noise in wearable sensors is prevention: attaching the phone to the body using elastic bands, straps, adhesive, or skin-tight clothing. We investigated the impact of different clothing (fabric, denim, roma) on sensor signal quality and found a significant effect, both when measuring the signals and when modelling the effect of phone orientation.
The graphs shown in Fig. 14 are raw sampled data from the Arduino, with the x-axis corresponding to the individual data points (sampled at ∼200 Hz) and the y-axis representing the digital signal magnitude (∼0.48 mV per increment).
[IMAGE OMITTED. SEE PDF]
6 Results and analysis
To evaluate our proposed system, we developed a prototype application and investigated its performance. We evaluated the prototype with extensive experiments. In this section, we describe how the data was analysed, and performance was measured.
Initially, we used the commercially available multi-function BNO055 motion sensor and were able to successfully distinguish between three forms of simple motion: walking, running, and jumping. In addition, all of these actions could be differentiated from the user being at rest. All decisions were made by a custom algorithm developed by our team, which relied on the accelerometer and gyroscope data collected from the BNO055 sensor. For each sensor type, the magnitude of the data was found by taking the square root of the sum of the squares of all three axes. This yielded a cumulative magnitude for both the accelerometer and the gyroscope; these magnitudes were then summed to create a more complete motion index. The rate of change of this motion index was calculated over the previous eight data points, and the slope of the motion index was compared with threshold values to decide the type of motion.
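A sketch of this decision rule is shown below. It follows the description above (vector magnitudes of the accelerometer and gyroscope summed into a motion index, slope over the previous eight samples, threshold comparison), but the threshold constants and buffer handling are illustrative placeholders rather than the exact values tuned for our prototype.

#include <math.h>

enum Motion { REST, WALK, RUN, JUMP };

const int   WINDOW = 8;        // slope is taken over the previous eight samples
const float WALK_T = 0.5f;     // hypothetical thresholds; the real values were tuned experimentally
const float RUN_T  = 1.5f;
const float JUMP_T = 3.0f;

float history[WINDOW];
long  count = 0;

// Combine accelerometer and gyroscope readings into a single motion index
float motionIndex(float ax, float ay, float az, float gx, float gy, float gz) {
  float accMag  = sqrt(ax * ax + ay * ay + az * az);
  float gyroMag = sqrt(gx * gx + gy * gy + gz * gz);
  return accMag + gyroMag;
}

// Classify by comparing the slope of the motion index over the window with thresholds
Motion classify(float index) {
  history[count % WINDOW] = index;
  count++;
  if (count < WINDOW) return REST;                 // not enough history yet
  float oldest = history[count % WINDOW];          // sample from roughly WINDOW points ago
  float slope  = fabs(index - oldest) / WINDOW;    // average change per sample
  if (slope > JUMP_T) return JUMP;
  if (slope > RUN_T)  return RUN;
  if (slope > WALK_T) return WALK;
  return REST;
}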
The motion decision was determined on the Arduino side of the architecture, and a coded decision corresponding to the proper motion was sent as part of each Bluetooth transmission to the Android application. Raw sensor values were also sent to the Android handheld, allowing the user to see real-time graphs of accelerometer, magnetometer, and gyroscope data, along with the final decision of the motion.
Our improved system can distinguish between running, walking, jumping, bending over, and being at rest. Each of these activities produces a distinct and corresponding voltage pattern. As the voltage is sampled through the Arduino and plotted in real time on the smartphone, the signal patterns for each activity are easily distinguishable: each motion has a unique frequency and maximum magnitude and presents itself as a characteristic pattern of peaks or spikes, so we used both properties in our detection algorithm. Similarly, the patterns gathered for more complex activities such as waving and typing were distinguishable, although their magnitude was too small to be measured accurately and consistently in our tests without integrating the op-amp amplifier circuit [3].
To integrate detection into our system, pattern detection algorithms needed to be run on the data. Early attempts at detection were performed offline in MATLAB™, using sampled data collected by the Arduino. These attempts combined principal component analysis (PCA) and the k-nearest neighbour (KNN) algorithm through a toolbox implementation for MATLAB™ [19]. PCA is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. KNN is a non-parametric method used for classification and regression; in both cases, the input consists of the k closest training examples in the feature space, and the output depends on whether KNN is used for classification or regression.
These techniques required strict standardisation of the data set to detect a pattern accurately. Some of the equations used in this process are shown below and are critical to the performance of the algorithm. First, the data is standardised in an application-specific way; we tried several approaches in an attempt to keep the data as close as possible to its original form. Next, the covariance matrix and its eigenvectors are required to calculate the principal component (PC) score for each point [20]. The equations for the correlation and covariance matrices and the definition of eigenvalues and eigenvectors are given below:
[IMAGE OMITTED. SEE PDF]
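For reference, with the standardised observations arranged in an n-row matrix X (zero-mean columns, and unit-variance columns when the correlation form is used), the standard definitions are:

$$C = \frac{1}{n-1}\, X^{\top} X, \qquad C\,v_i = \lambda_i\, v_i$$

The eigenvectors v_i, ordered by decreasing eigenvalue λ_i, define the principal components, and the PC score of each observation is its projection onto these eigenvectors, X v_i.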
Fig. 15 shows the issues that were encountered. The KNN plot shows two distinct data sets recognised by the programme, but when the actual classification was performed, the classifier consistently placed both motions in the same class. Owing to this, we decided to develop our own algorithm based on the most discriminative properties of the data, namely the frequency and magnitude of the sampled signals. Prior works use different methods and algorithms to capture the user's daily activities [22–25].
7 Activity identification algorithm
In our prototype system, which used a commercial sensor, we developed a mathematical model to predict which activity was being performed. This model uses values from the accelerometer and gyroscope on the commercial sensor to calculate an overall magnitude; the slope measured from the resulting signal-magnitude graph then determines which motion is being performed. This was an effective approach to detection, although it was slow and not always accurate, with a detection rate of about 70%, leaving room to improve detection accuracy. The magnitude equation used for our values and the equation used to calculate the slope are given below:
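Based on the description in Section 6, these take the standard forms below, where (a_x, a_y, a_z) and (g_x, g_y, g_z) are the accelerometer and gyroscope axes and M_i is the motion index at sample i; this is a reconstruction from that description rather than the original typeset equations, with the eight-sample window following Section 6:

$$M = \sqrt{a_x^2 + a_y^2 + a_z^2} + \sqrt{g_x^2 + g_y^2 + g_z^2}, \qquad \text{slope}_i = \frac{M_i - M_{i-8}}{8}$$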
After several experiments and tests with the KNN-based approach, we designed and developed our own algorithm to obtain accurate classification. This algorithm is based on a score built from the distinctive attributes of each movement, namely the mean and standard deviation of the sampled data. We then calculate the difference between adjacent data points to model the first-order derivative, and from this derivative data set we find the maximum value and the number of zero crossings. The score is calculated by combining these four properties in a way that leads to a unique range of values for each activity. Equations (7) and (8) give the standard deviation and mean of the sensor signal:
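The referenced equations are the standard definitions for N samples x_1, …, x_N:

$$\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \bar{x}\right)^2} \qquad (7)$$

$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i \qquad (8)$$

A sketch of the corresponding feature extraction is given below; how the four properties are weighted into the final score is not specified above, so the simple sum at the end is only a placeholder assumption.

#include <cmath>
#include <cstddef>
#include <vector>

// Per-window features described above: mean, standard deviation,
// maximum of the first-order difference, and zero crossings of that difference.
struct Features { double mean = 0, stddev = 0, maxDiff = 0; int zeroCrossings = 0; };

Features extract(const std::vector<double>& x) {
  Features f;
  const std::size_t n = x.size();
  if (n < 2) return f;
  for (double v : x) f.mean += v;
  f.mean /= n;
  for (double v : x) f.stddev += (v - f.mean) * (v - f.mean);
  f.stddev = std::sqrt(f.stddev / n);                      // population form, as in eqn (7)
  double prevDiff = 0;
  for (std::size_t i = 1; i < n; ++i) {
    double d = x[i] - x[i - 1];                            // first-order difference (derivative model)
    if (d > f.maxDiff) f.maxDiff = d;                      // maximum of the derivative data set
    if (i > 1 && ((d > 0) != (prevDiff > 0))) f.zeroCrossings++;   // derivative changes sign
    prevDiff = d;
  }
  return f;
}

// Placeholder score: the weighting actually used in our system is not given in the text.
double score(const Features& f) {
  return f.mean + f.stddev + f.maxDiff + f.zeroCrossings;
}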
Table 4 Confusion matrix for simple motion classification (rows: actual class; columns: predicted class)
Actual \ Predicted | Resting | Walking | Running | Jumping | Bending |
resting | 3 | 0 | 0 | 0 | 0 |
walking | 0 | 3 | 0 | 0 | 0 |
running | 0 | 1 | 2 | 0 | 0 |
jumping | 0 | 1 | 0 | 2 | 0 |
bending | 0 | 0 | 0 | 0 | 3 |
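As a consistency check on Table 4, the diagonal entries sum to 13 correct classifications out of 15 trials (three per activity), i.e. (3 + 3 + 2 + 2 + 3)/15 ≈ 86.7%, in line with the ∼87% overall accuracy reported for the system; the two misclassifications both involve confusion with walking.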
We believe that a calibration mode could reduce these incorrect classifications and improve accuracy. In the smartphone implementation, a new user could go through all possible activities to establish a baseline data set and determine their specific threshold values.
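A minimal sketch of such a calibration pass is shown below, assuming per-activity scores like those described earlier in this section; recording a few labelled repetitions per activity and placing decision thresholds midway between neighbouring mean scores is one plausible scheme, not the procedure implemented in our current app.

#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical calibration: given the mean score recorded for each activity during a guided
// session, sort the activities by score and place a decision threshold midway between
// each pair of neighbouring activities.
std::vector<double> calibrateThresholds(std::vector<double> meanScores) {
  std::sort(meanScores.begin(), meanScores.end());
  std::vector<double> thresholds;
  for (std::size_t i = 1; i < meanScores.size(); ++i)
    thresholds.push_back(0.5 * (meanScores[i - 1] + meanScores[i]));   // class boundary
  return thresholds;
}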
For complex motion detection (such as typing, waving a hand, eating, and dishwashing), we used the same features described above from the sensed signals. We integrated an amplifier into our system during testing to amplify the signals from the body node; we believe the small currents generated by the body during complex motion are the reason the raw body signals could not be amplified accurately. Owing to these amplification issues, our team focused most of its effort on distinguishing between running, walking, jumping, and bending over. We are still working on complex motion for better detection accuracy; currently, the accuracy of complex motion detection using our system is not high. We are in the process of collecting more complex motion data for different age groups in a laboratory environment, using both our system and the built-in smartphone accelerometer. We observed that complex activity detection using our system had better accuracy than detection using the smartphone accelerometer, whose dynamic range is not adequate to accurately detect complex motion.
8 Conclusion
In this paper, we have developed an integrated cyber-human system for motion detection. We used our own sensor to validate the proposed approach and to detect simple and complex activities in users in real time. Results from different data sets are also presented to show that this approach provides a high degree of classification accuracy in distinguishing between resting, walking, running, jumping, and bending patterns. The system may also find applications in behaviour detection for people with various disabilities.
To test the long-term stability and feasibility of our approach, we plan to test our system with people who suffer from chronic health problems once we obtain Institutional Review Board approval. We also plan to measure different spatiotemporal parameters of the user during daily activities. In addition, the system could be used in smart home monitoring with future wireless technology, and we plan to miniaturise the circuit onto a printed circuit board to make the system more user-friendly.
9 Acknowledgments
We thank the anonymous reviewers for their valuable comments, which helped us to improve this paper. We also thank Ms. Tina Carico and Jeff Peterson from the Department of Electrical and Computer Engineering at Miami University for their help and support.
Cohn, G., Gupta, S., Lee, T. et al.: ‘An ultra‐low‐power human body motion sensor using static electric field sensing’. Proc. 2012 ACM Conf. Ubiquitous Computing (Ubicomp), Pittsburgh, PA, USA, September 2012, pp. 99–102
‘Research articles’, Cyber‐Human Systems (CHS), National Science Foundation (NSF)
Dexsilicium.com: ‘Microchip MCP6041 datasheet’, 2013. Available at https://www.dexsilicium.com/Microchip_MCP6041.pdf, accessed June 2019
Bennett, B.: ‘Fitbit Flex review’. Available at https://www.cnet.com/reviews/fitbit‐flex‐review/, accessed June 2019
Konrad, L., Bor‐rong, C., Geoffrey, W.C. et al.: ‘Mercury: a wearable sensor network platform for high‐fidelity motion analysis’. Proc. ACM SenSys'09, Berkeley, CA, USA, November 2009, pp. 183–196
Trost, S.G., Zheng, Y., Wong, W.K.: ‘Machine learning for activity recognition: hip versus wrist data’, Physiol. Meas., 2014, 35, (11), pp. 2183–2189
Da‐Silva, F.G., Galeazzo, E.: ‘Accelerometer based intelligent system for human movement recognition’. Proc. Fifth IEEE Int. Workshop on Advances in Sensors and Interfaces (IWASI), Bari, Italy, June 2013, pp. 20–24
Ramos‐Garcia, R.I., Hoover, A.W.: ‘A study of temporal action sequencing during consumption of a meal’. Proc. ACM Int. Conf. Bioinformatics, Computational Biology and Biomedical Informatics, Washington, D.C., USA, September 2013, p. 68
Scholl, P.M., Van, K.: ‘A feasibility study of wrist‐worn accelerometer based detection of smoking habits’. Proc. Sixth IEEE Int. Conf. Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS), Palermo, Italy, July 2012, pp. 886–891
Parate, A., Chiu, M.C., Chadowitz, C. et al.: ‘Risq: recognizing smoking gestures with inertial sensors on a wristband’. Proc. 12th Annual Int. Conf. Mobile Systems, Applications, and Services, Bretton Woods, NH, USA, June 2014, pp. 149–161
Ming‐Zher, P., Nicholas, C.S., Rosalind, W.P.: ‘A wearable sensor for unobtrusive, long‐term assessment of electrodermal activity’, IEEE Trans. Biomed. Eng., 2010, 57, (5), pp. 1243–1252
‘Withings (WIH) Pulse Ox activity, sleep, and heart-rate (SpO2) tracker’. Available at https://www.amazon.com/Withings‐Pulse‐O2‐Activity‐Sleep‐and‐Heart‐Rate‐SPO2‐Tracker‐for‐iOS‐and‐Android/dp/B00JQ6YA6O, accessed June 2019
Edward, S.S., George, F., James, H. et al.: ‘Monitoring of posture allocations and activities by a shoe‐based wearable sensor’, IEEE Trans. Biomed. Eng., 2011, 58, (4), pp. 983–990
Shyamal, P., Konrad, L., Richard, H. et al.: ‘Analysis of feature space for monitoring persons with Parkinson's disease with application to a wireless wearable sensor system’, Comput. Biol. Med., 2017, 89, (c), pp. 379–388
Maurizio, G., Matteo, L., Dan, B. et al.: ‘Empatica E3 – a wearable wireless multi‐sensor device for real‐time computerized biofeedback and data acquisition’, Empatica Inc., Cambridge, MA, USA and Milan, Italy, and MIT, Cambridge, MA, USA, January 2015
‘Misfit Shine fitness (MSF) + sleep monitor’, Misfit, Burlingame, CA. Available at https://misfit.com/misfit‐shine‐2, accessed June 2019
Electronicaestudio.com.: ‘HC‐05‐bluetooth to serial port module’, 2018. Available at https://www.electronicaestudio.com/wp‐content/uploads/2018/09/BT811d.pdf, accessed June 2019
Arduino.cc: ‘Arduino reference’, 2018. Available at https://www.arduino.cc/, accessed June 2019
Wold, S., Esbensen, K., Geladi, P.: ‘Principal component analysis’, Chemometr. Intell. Lab. Syst., 1987, 2, (1–3), pp. 37–52
Duin, R.P.W., Juszczak, P., Paclik, P. et al.: ‘A MATLAB toolbox for pattern recognition’, PRTools, 2000, 3, pp. 109–111
Fukunaga, K., Narendra, P.M.: ‘A branch and bound algorithm for computing k‐nearest neighbors’, IEEE Trans. Comput., 1975, 100, (7), pp. 750–753
Guiry, J.J., van de Ven, P., Nelson, J.: ‘Multi‐sensor fusion for enhanced contextual awareness of everyday activities with ubiquitous devices’, Sensors (Basel), 2014, 14, pp. 5687–5701
Robert‐Andrei, V., Ciprian, D., Lidia, B. et al.: ‘Human physical activity recognition using smartphone sensors’, Sensors, 2019, 19, (3), p. 458
Shoaib, M., Bosch, S., Incel, O.D. et al.: ‘Fusion of smartphone motion sensors for physical activity recognition’, Sensors, 2014, 14, pp. 10146–10176
Abukhary, N., Mustafah, Y.: ‘Real‐time human activity recognition’. IOP Conf. Ser. Mater. Sci. Eng., 2017, 260, pp. 12–17
Abstract
Recently, the commercial market has seen an increase in the availability of smart wearable Internet of things (IoT) devices (wearables), including items such as smart shoes, smart watches, wrist bands, and pendants. Many of these devices are part of human-in-the-loop cyber-physical systems. In this research, the authors have designed and developed an embedded sensory IoT system with a low-power Bluetooth communication module to collect body single node voltage using a smartphone. Their approach to sensing the user's movement builds on prior work in electric field sensing. Experimentation and verification have been conducted on a group of test subjects with different test scenarios, including remaining at rest, walking, jumping, running, hand waving, eating, and bending over. The authors designed and developed their own sensor to detect body motion data and then used their algorithm to analyse the collected data. This study introduces the use of signal processing techniques for sensor data analytics to detect human body motion. The system can detect activity with a high degree of accuracy (∼87%).