Abstract: Air quality in many poultry buildings is less than desirable, yet measuring the concentrations of airborne pollutants in livestock buildings is generally difficult. To address this, the development of an autonomous robot that could continuously collect key environmental data in livestock buildings was initiated. This article presents one part of that larger study: a preliminary laboratory test evaluating the navigation precision of the robot under development on different ground surface conditions and with different localisation algorithms based on internal sensors. Each wheel of the robot was driven by an independent DC motor, with an odometer fitted to each of the four motors, and an inertial measurement unit (IMU) was rigidly fixed to the robot vehicle platform. The research focused on using these internal sensors to calculate the robot position (x, y, θ) in three different ways. The first method relied only on odometer dead reckoning (ODR), the second combined odometer and gyroscope data dead reckoning (OGDR), and the third was based on a Kalman filter data fusion algorithm (KFDF). A series of tests was completed to generate the robot's trajectory and analyse the localisation accuracy. These tests were conducted on different types of surfaces and path profiles. The results showed that the ODR calculation of the robot position is inaccurate because of cumulative errors and a large deviation in the heading angle estimate. Using the gyroscope data from the IMU sensor improved the accuracy of the heading angle estimate, and the KFDF calculation produced a better heading angle estimate than either the ODR or the OGDR calculation. The ground type was also found to influence the localisation errors.
Keywords: autonomous robot, air quality, navigation, Kalman filter data fusion, livestock building, robot localization
(Ellipses ( ... ) denote formulas omitted in the source.)
1 Introduction
The air quality inside livestock buildings is being taken increasingly seriously by researchers. Previous research has demonstrated that sub-optimal air quality not only influences the productivity of farm animals and the health and well-being of livestock and workers[1,2], but also affects the healthy and sustainable development of the pig industry[3,4]. With the development of electronic technology, detection technology, and information and communication technology (ICT), a variety of air quality measurement systems have been studied. The earliest livestock environment monitoring systems were based on micro-chip computers; such a system collected temperature, humidity, CO2 concentration and NH3 concentration data inside animal buildings[5] and regulated these variables through heaters and fans. Commercial technologies with better anti-interference capability, such as industrial personal computers (IPC), programmable logic controllers (PLC) and field buses, were subsequently used for animal environment monitoring systems that collected environmental data and drove the ventilation system[6-8]. These traditional distributed systems suffer from complex wiring, which readily leads to poor contacts, maintenance difficulties and high cost. With the rapid development of wireless technologies, wireless transmission offers the obvious advantages of greatly reduced and simplified wiring and low wiring cost, so wireless systems are replacing wired systems for measuring the environmental data of animal buildings[9-11]. Because the distribution of indoor environmental parameters is inhomogeneous, it is difficult to choose data collection points and to deploy the wireless sensors, yet the environmental conditions in livestock buildings need to be monitored frequently, flexibly and freely. A portable, low-cost instrument was recently developed in Australia[12-16]; it can be fixed at any indoor point to measure environmental data, but the system is not mobile. Thus, a project with the long-term aim of developing an autonomous unmanned survey vehicle, or Livestock Building Guard (LBG), was initiated.
Agricultural robots are being applied in a variety of areas to execute tasks that are tedious, repetitive, dirty, hazardous and dangerous. Some of the well-known applications of livestock robotics are: (1) automated milking systems (AMS), which significantly reduce the amount of labour involved in milking[17]; (2) virtual fencing, developed for controlling the movements of free-range cattle[18,19]; and (3) cleaning robots, designed to improve the hygiene levels of piggery buildings[20]. The LBG is a livestock robot intended for continuous real-time measurement of environmental factors within livestock buildings, thereby improving poultry welfare and productivity.
Self-localisation is a very important task for autonomous robots in real-world environments[21], and accurately determining the position and orientation of a mobile robot is the basis of accurate navigation[22]. Relative localisation and absolute localisation are two different kinds of localisation, depending on the sensors used[23-26]. Autonomous mobile robot navigation sensors can be divided into two categories: internal sensors, which measure the location and heading angle of the mobile robot, and external sensors, which observe the environment[27-30]. Relative localisation methods usually use internal sensors fixed on board the autonomous vehicle, such as odometers, gyroscopes, accelerometers and compasses. This research focused on a robot envisaged for use inside livestock buildings. As the wheeled LBG moves continuously, knowing its exact location is necessary both for navigation and for referencing the measured parameters. The floor of a livestock building is usually hard, but scattered feed, litter and other material can make the surface softer. The specific aim of this article is to present the results of one part of the larger project: a preliminary laboratory test evaluating the navigation accuracy of the robot under development on different ground surface conditions and with different localisation algorithms based on internal sensors.
2 Materials and methods
2.1 Robotic vehicle
The robot was constructed from an off-the-shelf vehicle chassis and electronic components. The chassis had four motors (i.e. each wheel was driven by its own motor), and each DC (direct current) motor was independently controlled by a PWM (pulse width modulation) digital motor driver (Sabertooth 2×10R/C, Dimension Engineering, Illinois, USA). The motor speeds were monitored with four US Digital encoders (QME-01, National Instruments, Austin, Texas, USA). The robot was controlled by a PC-compatible embedded computer (2nd generation Intel Sandy Bridge), and a low-cost 9-DOF MEMS IMU sensor (SEN-1072, Analog Devices, Norwood, Massachusetts, USA) was used to measure the vehicle's attitude information.
The control system drove the wheels via motors connected to the wheel shafts and turned the vehicle by changing the speed of the motors on the left and right sides (i.e. a differential steering system).
The movement control parameters of the system were the angular velocities of the four wheels. When the wheels on the two sides ran at the same speed but in opposite directions, the vehicle was able to turn sharply. The system structure and the robot are shown in Figure 1.
The measurement sensors (temperature, relative humidity and dust sensors) (MA-DFR0066, Little Bird Electronics, Sydney, NSW, Australia) were fixed to the vehicle platform and connected to the embedded computer indirectly through an Arduino microcontroller (Arduino, Ivrea, Italy).
2.2 Dead-reckoning
Dead reckoning (DR) is a method of mobile robot localisation that uses a simple mathematical procedure to determine the present location of a robot[31]. Dead reckoning using odometers and an inertial measurement unit (IMU) is a simple way of achieving robot localisation.
As the robot moves on the ground, the odometers produce a pulse sequence that is input to the on-board computer. The number of pulses in each time interval is converted into the distance travelled using Equation (1):
$$\Delta D = \frac{\Delta M \cdot C}{G \cdot N} \qquad (1)$$
where, ΔD is the distance moved by each side; ΔM is the change in the odometer count from this step to the next; C is the circumference of the wheel; G is the gear ratio; N is the number of odometer marks per wheel revolution.
Assuming that the robot vehicle is symmetrical, the two left wheels and the two right wheels have the same velocity. When the left wheel speed equals the right wheel speed, the vehicle moves in a straight line; when the two speeds differ, the vehicle turns and the turning angle depends on the difference between the two speeds. Assuming that the change in position over each time interval is very small, the heading change can be approximated by:
$$\Delta \theta = \frac{\Delta D_R - \Delta D_L}{W} \qquad (2)$$
where, W is the distance between the centre lines of the robot vehicle's left and right wheels; ΔD_L and ΔD_R are the distances travelled by the left and right sides of the vehicle in the time interval Δt. The position (x_e, y_e, θ_e) of the robot in the indoor environment can then be calculated cumulatively, as shown in Equation (3):
$$\begin{cases} x_{e,k+1} = x_{e,k} + \Delta D_k \cos\theta_{e,k} \\ y_{e,k+1} = y_{e,k} + \Delta D_k \sin\theta_{e,k} \\ \theta_{e,k+1} = \theta_{e,k} + \Delta\theta_k \end{cases}, \qquad \Delta D_k = \tfrac{1}{2}\left(\Delta D_L + \Delta D_R\right) \qquad (3)$$
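As an illustration of how Equations (1)-(3) could be applied in the on-board software, a minimal Python sketch of the odometer dead-reckoning update is given below. The wheel circumference, gear ratio, encoder resolution and track width are assumed placeholder values, not the parameters of the actual robot.

```python
import math

# Assumed placeholder parameters (not the actual robot's specifications)
C = 0.20    # wheel circumference, m
G = 30.0    # gear ratio
N = 500     # odometer marks per wheel revolution
W = 0.25    # distance between left and right wheel centre lines, m

def counts_to_distance(delta_m):
    """Equation (1): convert an odometer count increment into distance travelled."""
    return delta_m * C / (G * N)

def dead_reckoning_step(x, y, theta, delta_m_left, delta_m_right):
    """Equations (2)-(3): update the pose (x, y, theta) over one time interval."""
    d_left = counts_to_distance(delta_m_left)
    d_right = counts_to_distance(delta_m_right)
    d_centre = 0.5 * (d_left + d_right)        # distance moved by the vehicle centre
    d_theta = (d_right - d_left) / W           # Equation (2): heading change
    x += d_centre * math.cos(theta)            # Equation (3): cumulative position update
    y += d_centre * math.sin(theta)
    theta += d_theta
    return x, y, theta

# Usage: accumulate the pose over a short sequence of left/right count increments
pose = (0.0, 0.0, 0.0)
for dm_l, dm_r in [(1500, 1500), (1450, 1550), (1550, 1450)]:
    pose = dead_reckoning_step(*pose, dm_l, dm_r)
print(pose)
```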
2.3 Localisation based on Kalman filter data fusion
The accuracy of a gyroscope-derived heading generally deteriorates with time[32]. However, methods such as the Kalman filter can be employed to reduce the errors caused by the random bias drift of the gyroscope. The Kalman filter is an optimal recursive data processing algorithm and a technique for state and parameter estimation; navigation is one of its most useful applications[26], and it is the most widely used data fusion tool. Dead-reckoning localisation can degrade in accuracy over long periods of operation. The Kalman filter calculation combines the odometer heading increment and the gyroscope heading angle information to derive the optimal heading angle estimate, leading to more accurate robot position estimation.
Assuming the z-axis gyroscope outputs the z-axis angular velocity ω_gk, this value is the robot vehicle's angular velocity in the ground (x-y) plane; once integrated, it gives the heading angle φ_k of the robot vehicle.
The gyroscope heading angle is calculated with Equation (4):
$$\varphi_k = \varphi_{k-1} + \omega_{gk}\,\Delta t \qquad (4)$$
Then the heading angle calculation equation, which combines the odometer heading increment and the gyroscope heading angle, is:
... 5
According to the vehicle's kinematic equations:
... 6
... 7
From the Kalman filter state equation, $X_{k+1} = A_k X_k + B_k + w_k$, the robot's Kalman filter state equation can be derived, as shown below:
... 8
The Kalman filter measurement equation has the form $Z_k = H X_k + v_k$; the robot's Kalman filter measurement equation is given below:
... 9
In Equations (8) and (9), w_k and v_k are the process noise and measurement noise of the robot system, respectively; they are zero-mean Gaussian white noise with covariance matrices Q and R, respectively.
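The robot-specific state and measurement models of Equations (8) and (9) are not reproduced here, but the idea of the fusion can be sketched with a simplified scalar Kalman filter in which the odometer heading increment drives the prediction step and the once-integrated gyroscope heading (Equation (4)) serves as the measurement. The noise covariances Q and R below are arbitrary placeholders, not the values identified for the robot.

```python
def kalman_heading_fusion(odo_increments, gyro_rates, dt, Q=1e-4, R=1e-2):
    """Fuse odometer heading increments with the gyro-integrated heading (simplified sketch)."""
    theta_est = 0.0       # fused heading estimate, rad
    p = 1.0               # estimate variance
    gyro_heading = 0.0    # Equation (4): once-integrated z-axis gyroscope rate
    fused = []
    for d_theta_odo, omega_g in zip(odo_increments, gyro_rates):
        # Prediction: propagate the heading with the odometer increment, inflate the variance
        theta_pred = theta_est + d_theta_odo
        p_pred = p + Q
        # Measurement: gyroscope heading obtained by integrating the angular rate
        gyro_heading += omega_g * dt
        # Update: standard Kalman gain and correction towards the gyroscope heading
        k = p_pred / (p_pred + R)
        theta_est = theta_pred + k * (gyro_heading - theta_pred)
        p = (1.0 - k) * p_pred
        fused.append(theta_est)
    return fused

# Usage with synthetic data: a steady turn of about 0.1 rad per 0.1 s step
odo = [0.09, 0.11, 0.10, 0.12]    # odometer heading increments, rad
gyro = [1.0, 1.0, 1.0, 1.0]       # z-axis gyroscope rate, rad/s
print(kalman_heading_fusion(odo, gyro, dt=0.1))
```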
2.4 Experimental setup
A number of experiments were conducted to assess the accuracy of the dead-reckoning methods for robot localisation. After a predetermined path was laid out on the ground, the robot vehicle was initialised at the origin of the established path profile. The centre of gravity of the robot was aligned with the point of origin, and while the robot was driven from the start to the end point it was stopped periodically. At each stop, a small marker was placed on the ground to indicate the location of the robot via a pointer attached to the robot.
After the robot had moved along the whole path profile, the distances between these markers and the coordinates of the originally planned path were measured with a tape measure. These measurements described the difference between the actual and pre-planned trajectories. While the robot was in motion, the odometer and IMU sensor data were also logged on the robot's on-board computer and were recovered after each test. The same tests were repeated on different types of surfaces: carpet, concrete and sawdust. The experimental setup on the carpet surface is shown in Figure 2.
In order to accurately quantify the errors inherent in the odometer-based positions, a number of tests were conducted to establish the expected errors during both straight-line and rotational movements.
Localisation accuracy was calculated using (1) odometer data only, (2) combined odometer and gyroscope data, and (3) a Kalman filter data fusion algorithm. The 'odometers only' dead-reckoning (ODR) method calculated the position of the robot using only the odometer data. The 'combined odometer and gyroscope dead-reckoning' (OGDR) method calculated the position parameters using odometer and gyroscope data, with the heading angle taken as the once-integrated rate-gyroscope output. The Kalman filter data fusion (KFDF) method used the recursive Kalman filter to derive the optimal heading angle estimate and then calculated the position by fusing the gyroscope and odometer data.
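The paper does not specify the metric used to compare the three estimates against the tape-measured markers; one straightforward option, shown below as a hedged sketch with placeholder coordinates, is the Euclidean distance between each estimated position and the corresponding ground-truth marker.

```python
import math

def point_errors(estimated, measured):
    """Euclidean error between corresponding estimated and measured (x, y) points, in metres."""
    return [math.hypot(xe - xm, ye - ym)
            for (xe, ye), (xm, ym) in zip(estimated, measured)]

# Placeholder coordinates (m): ground-truth markers and an ODR estimate of the same points
ground_truth = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
odr_path = [(0.0, 0.0), (1.05, 0.08), (0.90, 1.12)]
errors = point_errors(odr_path, ground_truth)
print(f"mean ODR error: {sum(errors) / len(errors):.3f} m")
```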
2.5 Offset compensation and error correction
The robot's movement/trajectory was described with respect to its centre of gravity. However, the movement of the robot could not be recorded directly at the centre of gravity; instead, a small laser pointer was rigidly attached to the back of the robot. When the robot moved on the ground along a pre-planned path, the pointer position was marked and measured. As this measurement was not taken at the centre of gravity of the robot, the true position of the robot was calculated using a compensation offset. The offset calculation is illustrated in Figure 3. From the geometry it follows that:
... 10
...11
where, a and b are the offsets determined by the distance between the centre of gravity and the measurement pointer; θ is the heading angle of the robot vehicle at position P.
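A minimal Python sketch of this compensation is given below: the marked pointer position is shifted by the fixed body-frame offsets a and b, rotated through the heading angle θ, to recover the centre-of-gravity position. The sign convention here is an assumption; the exact geometry is defined by Figure 3 and Equations (10)-(11).

```python
import math

def compensate_offset(x_pointer, y_pointer, theta, a, b):
    """Recover the centre-of-gravity position from the marked pointer position.

    a, b: body-frame offsets between pointer and centre of gravity, m (assumed convention)
    theta: robot heading angle at position P, rad
    """
    x_cg = x_pointer + a * math.cos(theta) - b * math.sin(theta)
    y_cg = y_pointer + a * math.sin(theta) + b * math.cos(theta)
    return x_cg, y_cg

# Usage: pointer mark at (1.00, 0.50) m, heading 90 degrees, offsets a = 0.15 m, b = 0.02 m
print(compensate_offset(1.00, 0.50, math.radians(90), 0.15, 0.02))
```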
It was recognised that localisation errors would accumulate over time during dead-reckoning calculations based on odometer data only. In order to correct these errors, a number of small tests were conducted to establish the expected errors as the robot vehicle moved. The robot was driven in a straight line, the corresponding odometer data were recovered, and the distance travelled by each wheel was calculated. The percentage ratio of the calculated distance to the actual distance was taken as the linear error at each point, and the average percentage error (APE) of each wheel and its standard deviation were then calculated. The purpose of the rotation test was to quantify the rotation error: four tests were conducted for left turning and four for right turning, and the average percentage error of each wheel's rotation and its standard deviation were calculated. Once the errors associated with both linear travel and rotation were established, the offset compensation value was also calculated.
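As a concrete illustration of the calibration arithmetic just described (with made-up distances, not the measured data of Table 1), the average percentage error and its standard deviation over a set of repeated runs could be computed as follows.

```python
from statistics import mean, stdev

def ape_and_sd(calculated, actual):
    """Average percentage error (APE) and its standard deviation over repeated runs."""
    errors = [abs(c - a) / a * 100.0 for c, a in zip(calculated, actual)]
    return mean(errors), stdev(errors)

# Illustrative values only: odometer-derived vs tape-measured straight-line distances, mm
calculated_mm = [2140, 2155, 2120, 2160]
actual_mm = [2000, 2000, 2000, 2000]
print(ape_and_sd(calculated_mm, actual_mm))   # APE of roughly 7%, SD below 1%
```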
3 Results and discussion
3.1 Linear error tests
In order to establish the accuracy of the odometers in a straight line, a series of eight linear motion tests was performed. The linear error test results for the three ground types are shown in Table 1.
According to the two-way analysis of variance (ANOVA), the difference in APE between ground surfaces was highly significant (p=0.002<0.01), and the difference in APE between wheels was significant (p=0.002<0.05). There was no significant difference in SD between ground surfaces (p=0.12>0.05) or between wheels (p=0.789>0.05).
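For reference, a two-way ANOVA of this kind (surface type and wheel as factors, APE as the response) could be run as in the hedged sketch below; it uses statsmodels and a small placeholder data frame rather than the measured values in Table 1.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Placeholder APE values (%) for three surfaces and two of the wheels, for illustration only
data = pd.DataFrame({
    "surface": ["carpet", "carpet", "concrete", "concrete", "sawdust", "sawdust"],
    "wheel": ["front_left", "back_left", "front_left", "back_left", "front_left", "back_left"],
    "ape": [7.1, 6.2, 6.8, 5.9, 9.5, 8.7],
})

# Additive two-way model (no interaction term), as in the analysis described above
model = smf.ols("ape ~ C(surface) + C(wheel)", data=data).fit()
print(anova_lm(model, typ=2))   # F statistics and p-values for the surface and wheel effects
```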
It is evident from the results presented in Table 1 that, when the robot vehicle was driven along a straight line, all the average percentage errors were between 5% and 10%, with standard deviations below 1%. On the carpet surface, the front wheels' APEs were larger than those of the two back wheels, and the APEs of the two right wheels were smaller than those of the corresponding left wheels. On the concrete surface, as on carpet, the front wheels' APEs were larger than those of the two back wheels.
There are three reasons for this. First, the robot's mass was not evenly distributed: because of the positions of the computer mountings, batteries, etc., most of the mass was in the back third of the machine. Measurements showed that the centre of mass was 120 mm from the back of a machine 270 mm long overall, so the two back wheels supported 55.56% of the robot's weight and the back end was more stable than the front end while the robot moved. Second, subjective measurement errors were associated with the measured distances. Third, the robot was driven manually, and the drive signals of the two left wheels were not necessarily always the same as those of the two right wheels, so the travel distances of the two sides probably differed.
On the sawdust surface, the APEs of the two front wheels were smaller than those of the two back wheels, but both the APEs and the SDs were the largest of all the test measurements, markedly larger than those measured on the carpet and concrete surfaces.
In addition to the reasons described for the carpet and concrete surfaces above, a major cause of this behaviour was the character of the ground itself. Because the sawdust surface was loosely packed, the wheels piled sawdust under the vehicle, and when the robot moved on the loose surface the wheels would spin freely. The resistance from the sawdust acting on the front wheels was greater than that on the back wheels, so the odometer data were not accurate and the errors were larger.
To reduce the linear errors and ensure accuracy, several aspects should be addressed: (1) make the vehicle mass distribution as uniform as possible when designing and building the robot; (2) choose experimental surfaces of suitable firmness and flatness; (3) adjust the chassis so that all four wheels contact the surface evenly; (4) reduce subjective measurement error and improve manual measurement accuracy; and (5) control the vehicle with an autonomous program to reduce the differences between the left and right drive motor signals that occur under manual control.
3.2 Rotation error tests
A similar series of tests was conducted with the robot undergoing rotation. The actual rotation was 180° and four tests were conducted for left turns and four for right turns. The rotation error test results on the three ground types are shown in Table 2.
According to the two-way ANOVA, there was no significant difference in APE (p=0.326>0.05) or SD (p=0.906>0.05) between ground surfaces, and no significant difference between wheels in APE (p=0.197>0.05) or SD (p=0.992>0.05).
From Table 2, the rotation-test APEs and SDs on the carpet and concrete surfaces are quite different. The right- and left-turning errors also differ considerably, with the right-turning APEs larger than the left-turning APEs. Overall, the turning errors were consistently large, and the high SDs indicate that the error measurements were not always reliable.
There are potentially several reasons for these results. First, because the robotic vehicle depended on skid steering, the rotation error may change with the friction provided by different surfaces. Second, the odometer output pulses may register a larger rotation than actually occurred during the skid-steering process. Third, if the ground surface was rough, the four wheels would not move in harmony (even if the left and right wheels were aligned) and the wheels in contact with the terrain would experience inconsistent traction forces, so the odometer pulses would register larger movement than the actual movement.
The rotational error tests were not conducted on the sawdust surface (Table 2) because the sawdust was so loose that the robotic vehicle was unable to execute the turns properly. The wheels would pile sawdust under the machine, lifting it and causing the wheels to spin freely, and only by applying extra power was it possible to extricate the robot. This problem made the turning odometer data too inaccurate to be of any use.
Potential ways of reducing the rotation errors and ensuring higher localisation accuracy would be: (1) to use the robot only on firm surfaces, which would reduce the slip error of skid-steering rotation (but would also reduce the usability of the robot in livestock buildings); (2) to align the left and right wheel axes and ensure that all four wheels contact the floor uniformly; and (3) to ensure an even roughness of the floor.
3.3 Path tests
Four different path tests were designed. Three of them were loop paths, conducted on the carpet, concrete and sawdust surfaces, respectively; the fourth was a zigzag path on the concrete surface. The odometer data were recovered from the robot computer after each test and the trajectory was calculated in Matlab 7.0 (MathWorks, Massachusetts, USA). After the error and offset compensation were incorporated, the corrections yielded a more accurate plot. The ODR calculated path, the measured ground-truth positions and the KFDF fusion calculated path are presented in Figure 4 to illustrate the accuracy.
Figures 4a and 4b show the differences between the ODR, OGDR and KFDF calculated paths and the ground-truth path on the carpet surface. Initially, the first straight segment of the ground-truth path largely overlaps the ODR calculated path, but at the first corner the heading angle deviates and the subsequent segments no longer overlap; after the second corner the heading angle bias is even larger. The gyroscope-based (OGDR) path (Figure 4a) was much closer to the actual path on the carpet surface, although after two turns the heading angle also showed a larger deviation. The fusion (KFDF) path (Figure 4b) was closer to the measured ground path, especially where the heading angle was close to 90°. The main reason for these differences is the accuracy of the heading angle calculation: the heading angle calculated from the odometer data has a larger bias than the heading angle calculated from the gyroscope data (OGDR and KFDF), so the turning angle is larger in the ODR calculated path.
A similar picture emerged on the concrete surface (Figures 4c and 4d). While the ODR calculated path on concrete was marginally more accurate than the ODR calculated path on carpet, there were still significant inaccuracies associated with the heading angle calculation. Evidently, the harder concrete surface allowed the robot vehicle to turn sharply around the corners of the predetermined path by providing better grip for the wheels and thus reducing slippage. Interestingly, on concrete (Figure 4c) the OGDR calculated path had a higher error than the ODR calculated path (Figure 4d). One possible explanation is that systems relying on gyroscope data produce larger errors when robots move on surfaces that cause excess vibration. In this study the gyroscope was rigidly fixed to the robot vehicle; when a skid-steered vehicle moves on a surface with a sufficiently large coefficient of friction, it produces larger vibrations and the gyroscope output fluctuates. When this fluctuating error is integrated, a substantial heading angle error can result. However, further studies are needed to confirm the validity of this hypothesis.
The fusion (KFDF) calculated path on the concrete surface (Figure 4d) showed that the heading angle error was better corrected, with each turning angle close to 90°. On sawdust (Figures 4e and 4f), the localisation error calculated by ODR was even larger than the errors in the previous two trials. Although the use of gyroscope data (OGDR, KFDF) markedly reduced the localisation errors, the loose sawdust surface made robot localisation practically unusable with any of the methods (ODR, OGDR or KFDF). When the robot followed the zigzag path (Figure 4g) slowly on the concrete surface, the localisation error calculated by ODR (Figure 4h) accumulated after two turns, whereas the KFDF calculated path showed a more accurate heading angle estimate. Overall, these experiments demonstrated that the KFDF method can reduce the heading angle estimation error.
4 Conclusions and suggestions
Over the course of numerous tests, it was apparent that the internal sensors could not, on their own, calculate the robot's position with practical accuracy; this was especially noticeable when only odometers were used. However, after combining the odometers with the rate gyroscope, a more accurate localisation estimate was possible, and the Kalman filter algorithm further improved the heading angle estimate. In all cases, localisation accuracy was heavily influenced by the terrain composition.
4.1 Odometers only
Due to the inherent constraints of the system and the odometers used, the results demonstrated that odometer-based dead reckoning suffers from both systematic and non-systematic errors, resulting in low localisation accuracy when used without supporting sensors. The research team assumed that the robot vehicle was symmetric and that its geometric centre coincided with its centre of gravity. In addition, it was assumed that the four wheels had the same fixed size and turning rate, which was evidently not the case during this study; these systematic errors were therefore neglected.
After analysing the rotation test data, it was found that the average percentage errors were not large, but the standard deviations were significant. Therefore, using the odometer data alone to calculate the heading angle could produce inaccurate or inconsistent results. On the firmer surfaces (concrete and carpet), the odometer-based dead reckoning was more accurate than on the loose surface (Figures 4b, 4d, 4f and 4h); localisation accuracy was thus closely related to surface cohesion. Nevertheless, using the odometer calculations alone, the heading angle estimate had very large errors on all surfaces.
4.2 Rate gyroscope and odometers
It was obvious from the results that using the gyroscope data to calculate the heading angle was more accurate than relying only on the odometer data. However, the results also showed that this method may be prone to additional errors caused by vibration on certain surfaces, such as concrete. In general, using the gyroscope data to calculate the heading angle created a smaller error than using the heading angle calculated from the odometer readings alone (Figures 4a and 4e). However, on concrete (Figure 4c) the gyroscope-based localisation created a larger error, possibly because of vibration or gyroscope drift.
4.3 Data fusion
The results (Figures 4b, 4d, 4f and 4h) demonstrated that an improved heading estimate was achieved during the path tests after the Kalman filter data fusion (KFDF), which combined the rate-gyroscope heading angle and the odometer heading angle information, was applied. The KFDF method improved the accuracy of the calculated heading angle, as it corrected the heading angle deviation so that the estimate was closer to the actual direction of the robot body (Figures 4b, 4d and 4h). In particular, in the case of larger gyroscope drift (Figure 4d), the calculated heading angle tracked the true heading angle well, improving localisation accuracy.
4.4 Effect of ground type on slip steer and localization
Different surfaces produce different non-systematic errors, and loose surfaces tend to produce greater non-systematic errors than hard surfaces. Using the gyroscope data to calculate the heading angle (OGDR, KFDF) is a better solution than using odometer data only when the vehicle moves in a smooth, stable fashion. Even on the same surface (Figures 4d and 4h), different tests resulted in different localisation accuracy; this may be related to the speed of the mobile robot. A lower, constant speed (Figure 4h) can reduce steering slippage, so the accumulated odometer error may also be reduced, resulting in better navigational accuracy.
4.5 Suggestions
The final goal is a robot that can move independently in the shed and collect data. First, further development of the LBG should focus on localisation and navigation accuracy by adding navigation sensors and improving the navigation algorithms. Second, the robot still needs further development before on-farm experiments can take place; for example, it should be fitted with appropriate protective measures, such as special soft 'bumper' bars, to protect all components from dust and to protect the birds from the robot.
Mobile robot localisation accuracy might be improved by (1) exploring relative localisation methods other than dead reckoning, (2) employing more complete multi-sensor data fusion, (3) adding sensors (such as a laser range finder) to correct the heading error, and (4) potentially employing machine vision.
Acknowledgements
We would like to acknowledge the assistance of staff at the University of Southern Queensland and the National Centre for Engineering in Agriculture (NCEA), and the funding support of the Science and Technology Project of Guangdong Province (2014A020208107) and the International Agriculture Aviation Pesticide Spraying Technology Joint Laboratory project (2015B05050100).
Citation: Qi H X, Banhazi T M, Zhang Z G, Low T, Brookshaw I J. Preliminary laboratory test on navigation accuracy of an autonomous robot for measuring air quality in livestock buildings. Int J Agric & Biol Eng, 2016; 9(2): 29-39.
[References]
[1] Banhazi T M, Seedorf J, Rutley D L, Pitchford W S. Identification of risk factors for sub-optimal housing conditions in Australian piggeries: Part 1. Study justification and design. Journal of Agricultural Safety and Health, 2008; 14: 5-20.
[2] Banhazi T M, Seedorf J, Laffrique M, Rutley D L. Identification of the risk factors for high airborne particle concentrations in broiler buildings using statistical modelling. Biosystems Engineering, 2008; 101: 100-110.
[3] Banhazi T M, Rutley D L, Pitchford W S. Validation and fine-tuning of a predictive model for air quality in livestock buildings. Biosystems Engineering, 2010; 105: 395-401.
[4] Qi H X, Zhang T M, Luo X W, Banhazi T. Advances and prospects of environment monitoring techniques in modern piggery. Acta Ecologae Animalis Domastici, 2015; 36(4): 1-14.
[5] Li L F. Study on monitoring and controlling system for the environment of delivery sow house in northern cold region of China. PhD dissertation. Inner Mongolia Agricultural University, 2011; 4. 114p. (in Chinese)
[6] Wu W H. A study on piggery environmental monitoring and control system based on internet of things. Master thesis. Zhejiang University, 2014; 3. 69p. (in Chinese)
[7] Zhang Y F. Monitoring system of pig-on-site environment based on CAN bus. Master thesis. Jiangsu University, 2009; 4. 75p. (in Chinese)
[8] Dai C X. Measurement and control system of the piggery environmental factors based on field bus. Master thesis. Jiangsu University, 2007; 12. 79p. (in Chinese)
[9] Zhu W X, Dai C Y, Huang P. Environmental control system based on IOT for nursery pig house. Transactions of the CSAE, 2012; 28(11): 177-182. (in Chinese with English abstract)
[10] Liang W J, Cao J, Fan Y, Zhu K F, Wang Z F, Dai Q W. Environment monitoring system for swine house based on wireless sensors network. Jiangsu J. of Agr. Sci, 2013; 29(6): 1415-1420.
[11] Wang N, Zhang N Q, Wang M H. Wireless sensors in agriculture and food industry-Recent development and future perspective. Computers and Electronics in Agriculture, 2006; 50: 1-14.
[12] Banhazi T M. User friendly air quality monitoring system. Applied Engineering in Agriculture, 2009; 25(2): 281-290.
[13] Banhazi T M, Lehr H, Black J L, Crabtree H, Schofield P, Tscharke M, et al. Precision Livestock Farming: An international review of scientific and commercial aspects. Int J Agric & Biol Eng, 2012; 5(3): 1-9.
[14] Banhazi T M, Babinszky L, Halas V, Tscharke M. Precision livestock farming: Precision feeding technologies and sustainable livestock production. Int J Agric & Biol Eng, 2012; 5(4): 54-61.
[15] Ni J Q, Heber A J, Darr M J, Lim T T, Diehl C A, Bogan B W. Air quality monitoring and on-site computer system for livestock and poultry environment studies. Transactions of the ASABE, 2009; 52: 937-947.
[16] Banhazi T M, Currie E, Reed S, Lee I B, Aarnink A J A. Controlling the concentrations of airborne pollutants in piggery buildings, in Sustainable animal production: the challenges and potential developments for professional farming. Aland A, Madec F, Ed. Wageningen Academic Publishers, the Netherlands, 2009; pp.285-311.
[17] Van der Vorst Y, de Koning K. Automatic milking systems and milk quality in three European countries. The First North American Conference on Robotic Milking, Lelystad, Netherlands, 2002; pp.V1-V13.
[18] Butler Z, Corkey P, Rusx D. From robots to animals: virtual fences for controlling cattle. The International Journal for Robotics Research, 2006; 25: 485-508.
[19] Christina U. The evolution of virtual fences: A review. Computers and Electronics in Agriculture, 2011; 75: 10-22.
[20] Braithwaite I, Blanke M, Zhang G Q, Carstensen J M. Design of a vision-based sensor for autonomous pig house cleaning. EURASIP Journal on Applied Signal Processing, 2005; 13: 2005-2017.
[21] Ip Y L, Rad A B, Wong Y K, Liu Y, Ren X M. A localization algorithm for autonomous mobile robots via a fuzzy tuned extended Kalman filter. Advanced Robotics, 2010; 24: 179-206.
[22] Myung H, Lee H K, Choi K W, Bang S. Mobile robot localization with gyroscope and constrained Kalman filter. International Journal of Control, Automation and Systems, 2010; 8(3): 667-676.
[23] Jetto L, Longhi S, Vitali D. Localization of a wheeled mobile robot by sensor data fusion based on a fuzzy logic adapted Kalman filter. Control Engineering Practice, 1999; 7: 763-771. doi: 10.1016/S0967-0661(99)00028-3.
[24] Bayar G, Bergerman M, Koku A B, Konukseven E I. Localization and control of an autonomous orchard vehicle. Computers and Electronics in Agriculture, 2015; 115: 118-128.
[25] Yin X, Noboru N. Development and evaluation of a general-purpose electric off-road robot based on agricultural navigation. Int J Agric & Biol Eng, 2014; 7(5): 14-21.
[26] Welch G, Bishop G. An Introduction to the Kalman Filter. Department of Computer Science. University of North Carolina, 2004; TR 29-41.
[27] Cao Q X, Zhang L. Wheeled autonomous mobile robot. Shanghai Jiaotong University Press, 2012.
[28] Hague T, Marchant J A, Tillett N D. Ground based sensing systems for autonomous agricultural vehicles. Computers and Electronics in Agriculture, 2000; 25: 11-28.
[29] Hyun D, Yang H S, Yuk G H, Park H S. A dead reckoning sensor system and a tracking algorithm for mobile robots. Mechatronics, IEEE International Conference, 2009; pp.1-6.
[30] Han S, Zhang Q, Ni B, Reid J F. A guidance directrix approach to vision-based vehicle guidance systems. Computers and Electronics in Agriculture, 2004; 43: 179-195.
[31] Conner D C. Sensor fusion, navigation, and control of autonomous vehicles. Virginia Polytechnic Institute and State University, 2005.
[32] Lee T, Shin J Y, Cho D D. Position estimation for mobile robot using in-plane 3-axis IMU and active beacon. IEEE international symposium on industrial electronics (ISIE2009), Seoul Olympic Parktel, Seoul, Korea, 2009, July 5-8.
Qi Haixia1,2,4, Thomas M. Banhazi2*, Zhang Zhigang1,4, Tobias Low2,3, Iain J. Brookshaw3
(1. College of Engineering, South China Agricultural University, Guangzhou 510641, China; 2. National Centre for Engineering in Agriculture (NCEA), University of Southern Queensland (USQ), West Street, Toowoomba QLD, 4350, Australia; 3. Faculty of Engineering and Surveying, University of Southern Queensland, West Street, Toowoomba QLD 4350, Australia; 4. International Agriculture Aviation Pesticide Spraying Technology Joint Laboratory, Engineering Research Center for Agricultural Aviation Application, South China Agricultural University, Guangzhou 510641, China)
Received date: 2014-04-08 Accepted date: 2015-11-24
Biographies: Qi Haixia, PhD, research interests: livestock environment measurement, navigation of intelligent agricultural equipment and precision agriculture, Email: [email protected]; Zhang Zhigang, PhD, research interest: precision agriculture, Email: [email protected]; T. Low, PhD, research interest: agricultural robotics, Email: [email protected]; Iain J. Brookshaw, PhD, research interest: digital image processing, Email: [email protected].
*Corresponding author: Thomas M. Banhazi, PhD, Associate Professor, Principal Scientist, National Centre for Engineering in Agriculture; Faculty of Engineering and Surveying, University of Southern Queensland; West Street, Toowoomba, Queensland 4350 Australia. Tel: +61(0)746311191, Fax: +61(0)746311870, Email: [email protected].