In control engineering, an "observer" is the software component that estimates the state of the system we are trying to control. On my UAV, the observer computes the 3 Euler angles and the 3D position: 6 variables in total. This is the part of the software that requires the most computing power and the most lines of code. This page describes all the parts that make up the observer on board the UAV.

1.1) Sampling rate:

All the computing (observer and controller) runs 20 times per second, and there is no need to do it more often. But I set the sampling rate of the IMU readings (gyroscopes and accelerometers) to 120 times per second. The goal of this "oversampling" is to filter the information in software. This is very important, because the information from these sensors is quite noisy.

The information delivered by the gyroscopes is the angular rate, which is used in the control-loop algorithm (the derivative term of the PID). The information therefore has to be very clean, so it is filtered in software. The filter is a 1st-order low-pass filter at 20 Hz.
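As an illustration (the flight code itself is not shown on this page), here is a minimal Python sketch of such a filter, assuming the usual discrete RC form; the cutoff and sample rate come from the text, everything else is a generic choice:

```python
import math

F_SAMPLE = 120.0   # gyro sampling rate (Hz), from the text
F_CUTOFF = 20.0    # low-pass cutoff (Hz), from the text

# Smoothing coefficient: alpha = dt / (RC + dt), with RC = 1 / (2*pi*fc)
DT = 1.0 / F_SAMPLE
RC = 1.0 / (2.0 * math.pi * F_CUTOFF)
ALPHA = DT / (RC + DT)

def low_pass(prev_output, new_sample):
    """One filter step: move a fraction ALPHA toward the new sample."""
    return prev_output + ALPHA * (new_sample - prev_output)
```

Calling `low_pass` on every 120 Hz gyro sample smooths out the high-frequency noise while keeping the angular-rate signal usable for the PID's derivative term.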

The information delivered by the accelerometers is NOT used in the control loop. The PID only uses speed and position, not acceleration. Therefore, the filter is not necessary. The only computation that is done 120 times per second is the integration of the acceleration to obtain the speed. This is just a simplified integral, which will be used to compute the 3D displacements in the 20 Hz computing task. The integration is only an addition in the software, and consumes very little CPU.
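The "integration is only an addition" idea can be sketched like this (names are hypothetical, one axis shown):

```python
DT = 1.0 / 120.0  # accelerometer sampling period (s), from the text

class SpeedIntegrator:
    """Accumulate speed from raw acceleration samples, one addition
    per 120 Hz tick, as described above."""
    def __init__(self):
        self.speed = 0.0  # accumulated speed along one axis (m/s)

    def update(self, acceleration):
        """Add one acceleration sample (m/s^2) scaled by the period."""
        self.speed += acceleration * DT
        return self.speed
```

The 20 Hz observer task then reads (and corrects) this accumulated speed instead of re-processing all 6 raw samples per cycle.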

1.2) 3D orientation with Gyroscopes:

This computation consists in mathematically integrating the angular rates given by the gyroscopes. This is not easy, because it has to be done in 3D: a single rotation can change the 3 Euler angles at the same time. I use a simplified version of the DCM (Direction Cosine Matrix) algorithm, with simplified formulas, to avoid consuming high CPU resources. In my UAV application, the helicopter never tilts more than 34°, so the simplified formulas are valid.
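The actual firmware uses a simplified DCM; as an illustration of the same idea, here is the equivalent Euler-rate kinematic integration in Python (not the author's exact formulas). It shows how one body rotation mixes into all three Euler angles, and it stays valid while the pitch is far from ±90°:

```python
import math

DT = 1.0 / 20.0  # assumed here: orientation updated in the 20 Hz task

def integrate_euler(roll, pitch, yaw, p, q, r, dt=DT):
    """One step of Euler-angle integration from body angular rates
    p, q, r (rad/s). The tan/cos coupling terms are why a pure 1D
    integration per angle would be wrong in 3D."""
    roll_rate  = p + math.tan(pitch) * (q * math.sin(roll) + r * math.cos(roll))
    pitch_rate = q * math.cos(roll) - r * math.sin(roll)
    yaw_rate   = (q * math.sin(roll) + r * math.cos(roll)) / math.cos(pitch)
    return (roll + roll_rate * dt,
            pitch + pitch_rate * dt,
            yaw + yaw_rate * dt)
```

Below ~34° of tilt the coupling terms stay small, which is what makes the simplified (cheaper) formulas acceptable on this UAV.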

The result of this operation gives the following angles:

Yaw angle (orientation)

Roll angle (tilt on one side left or right)

Pitch angle (tilt front or rear)

For the pitch and roll angles, I use a high-pass filter, because the UAV is flat on average (no wind in an indoor environment). This compensates the offset error of the gyros.
When the servos are strongly solicited (servo command too high), and during high pitch or roll rates, this high-pass filter is temporarily deactivated. This filter is necessary, because these 2 angles are the only information (among the 3 angles and 3 coordinates) that cannot be recalibrated with the help of other sensors (sonars, Wii camera or compass).
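A minimal sketch of this mechanism, assuming a simple "leak toward zero" high-pass and hypothetical thresholds for when to disable it:

```python
LEAK = 0.001  # fraction of the angle leaked per step (hypothetical gain)

def high_pass_step(angle, rate, servo_cmd, dt=0.05,
                   rate_limit=1.0, servo_limit=0.8):
    """Integrate the gyro rate, then slowly pull the angle toward zero
    (the high-pass part, removing gyro offset drift), except during
    strong maneuvers. rate_limit and servo_limit are hypothetical."""
    angle += rate * dt
    maneuvering = abs(rate) > rate_limit or abs(servo_cmd) > servo_limit
    if not maneuvering:
        angle -= LEAK * angle  # "flat on average" assumption does the work
    return angle
```

Disabling the leak during maneuvers avoids fighting a real, intentional tilt; re-enabling it afterwards slowly erases the accumulated offset error.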

1.3) 3D orientation - Compass:

I had previously tried to use a 2D compass in my UAV, but I never managed to exploit this sensor, even though theory says it is feasible. It is far simpler with a 3D sensor, which measures the 3 components (x, y, z) of the magnetic field.

This part of the software estimates only the yaw angle. A yaw angle is estimated by the compass, but this angle can be noisy. Therefore, the yaw angle computed from the compass is only considered as a correction: the final yaw angle estimation converges slowly toward the compass yaw angle. The gyroscopes determine the high-frequency components of the yaw angle, whereas the compass determines the low-frequency components.
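This gyro/compass split is a classic complementary filter; a minimal sketch (the blend gain is hypothetical):

```python
K_COMPASS = 0.02  # hypothetical blend gain: slow convergence to the compass

def fuse_yaw(yaw_estimate, gyro_rate, compass_yaw, dt=0.05):
    """Gyro supplies the high-frequency part (integration of the rate);
    the compass pulls the estimate slowly toward its low-frequency,
    drift-free reading."""
    yaw_estimate += gyro_rate * dt                            # fast: gyro
    yaw_estimate += K_COMPASS * (compass_yaw - yaw_estimate)  # slow: compass
    return yaw_estimate
```

A small K_COMPASS means compass noise barely shows in the output, while gyro integration drift is still cancelled over a few seconds.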

I will now explain how the yaw angle is estimated from the 3D compass measurements. I use the 3D magnetic field vector measured by the compass, together with the pitch and roll angles, which were already determined in the previous chapter. The first step is to apply to the measured compass vector an inverse rotation corresponding to the roll and pitch angles. The goal is to obtain the 3D vector that a virtual compass lying flat (not tilted) would measure. In fact, I only have to compute 2 coordinates of this vector (x-y: longitudinal and lateral).

Now, with this vector, it is very easy to compute the yaw angle. It is now similar to a 2D compass that would always be flat.
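The de-tilt plus heading computation can be sketched as follows; the trigonometric form is the standard tilt-compensation formula, but the sign conventions are one common choice and may need flipping for a particular sensor mounting:

```python
import math

def compass_yaw(mx, my, mz, roll, pitch):
    """De-tilt the body-frame magnetic vector (mx, my, mz) using roll
    and pitch, then take the heading of the resulting horizontal
    components. Only the two horizontal components are needed."""
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh)  # heading of a virtual "flat" 2D compass
```

Note that only roll and pitch enter the correction: the unknown being solved for is precisely the yaw, so it must not appear in the de-rotation.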

1.4) 3D position - Accelerometers :

This is one of the most complex parts, and it does not work well at the moment. This is partly due to the poor quality of the accelerometers that I use. Theoretically, it is quite simple: we have already computed the 3 angles, and we now have to compute the 3D position, or more precisely the 3D displacements.

First, I take the 3D acceleration vector measured by the 3 accelerometers. I apply a 3D rotation corresponding to the 3 Euler angles already known: yaw, roll, and pitch. The result is the raw acceleration vector in the absolute frame of reference. Then, I subtract the gravity vector (vertical and negative). The result would ideally be the kinematic acceleration vector: the derivative of the speed vector.
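A sketch of this step, assuming the common ZYX Euler convention (the author's exact convention is not stated); at rest and level, an accelerometer measures +G upward, so the result should be zero:

```python
import math

G = 9.81  # gravity (m/s^2)

def kinematic_acceleration(ax, ay, az, roll, pitch, yaw):
    """Rotate the body-frame accelerometer vector into the world frame
    using R = Rz(yaw) * Ry(pitch) * Rx(roll), then remove gravity."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    wx = cy*cp*ax + (cy*sp*sr - sy*cr)*ay + (cy*sp*cr + sy*sr)*az
    wy = sy*cp*ax + (sy*sp*sr + cy*cr)*ay + (sy*sp*cr - cy*sr)*az
    wz = -sp*ax + cp*sr*ay + cp*cr*az
    return (wx, wy, wz - G)  # subtract the gravity measured along +z
```

Whatever remains after the gravity subtraction is, ideally, pure kinematic acceleration; in practice it also contains the offset errors discussed next.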

Offset compensation:
Unfortunately, this vector has a big error. The offset vector is quite large, and changes over time, so I have to correct it in real time. This is quite complex. I compute an estimate of the gravity vector in the UAV frame of reference, using only the roll and pitch angles. Then, for each sensor and each direction (x, y, z), I consider that the temporal mean of [ measured_acceleration - gravity_projected_along_this_direction ] should be zero. With this, I can compute the offset error, which is slowly compensated. Unfortunately, this part of the software doesn't work well; it is very hard to tune. This might be due to the poor performance of the cheap accelerometers that I use.
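A one-axis sketch of this slow mean-tracking idea (the adaptation gain is hypothetical, and is exactly the hard-to-tune parameter mentioned above):

```python
K_OFFSET = 0.001  # hypothetical: very slow offset adaptation per sample

class OffsetEstimator:
    """Track the slowly-varying accelerometer offset on one axis:
    on average, measured acceleration minus projected gravity should
    be zero, so any long-term mean is treated as offset."""
    def __init__(self):
        self.offset = 0.0

    def correct(self, measured, gravity_along_axis):
        error = measured - gravity_along_axis - self.offset
        self.offset += K_OFFSET * error   # slow mean tracking
        return error                      # offset-compensated signal
```

The gain trade-off is delicate: too fast and real maneuvers get absorbed into the "offset"; too slow and the drift is never caught.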

3D speed and position:
The 3D speed is computed as the integration of the 3D acceleration. The 3D position is computed as the integration of the 3D speed.
Unfortunately, the speed and position information drift very rapidly. The smallest measurement error is highly amplified, because it passes through a double integral: in less than 1 second, the error can be huge. The other sensors (sonars and Wii camera) are essential for compensating this error in real time.
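To give a feel for the amplification: a constant acceleration error `b` becomes a position error of roughly `0.5 * b * t^2` after the double integral, so it grows quadratically. A small numeric demonstration (illustrative only, not flight code):

```python
DT = 1.0 / 20.0  # 20 Hz observer rate

def integrate_position(bias, seconds):
    """Double-integrate a constant acceleration error and return the
    resulting position error (~ 0.5 * bias * t^2)."""
    speed, position = 0.0, 0.0
    for _ in range(round(seconds / DT)):
        speed += bias * DT
        position += speed * DT
    return position
```

With a 0.1 m/s² bias, the error is ~5 cm after 1 s but ~5 m after 10 s, which is why external sensors must correct it continuously.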

1.5) 3D position – Sonars :

The UAV has 4 sonars mounted perpendicular to each other. Unfortunately, because of a strange limitation of the "Embedded Master" board (it is impossible to open 2 simultaneous I2C connections), I cannot measure all directions at the same time: each direction can only be measured every 100 ms, 2 directions at a time. Be careful: I cannot trigger 2 sonars that are parallel (the left and right sonars) at the same time, as they would interfere with each other.

The aim is to measure the distance between the UAV and a known plane: a floor or a wall. All these planes are known: the geometry of the room is integrated into the program in advance. A measure is selected only if the sonar is perpendicular to the plane, with a tolerance of 18 degrees. The sonar that measures altitude is always selected, because the pitch and roll angles almost never exceed 20 degrees.

If a measure is selected, then I compute the projected distance between the plane (wall, floor) and the UAV, with the help of the 3D orientation already estimated. This measure is then used to correct the 3D position estimation in only 1 direction: the direction closest to the sonar direction. But there are other criteria for this measure to be selected: the estimated speed and position have to be credible.
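The selection-plus-projection step can be sketched as follows, assuming the beam tilt relative to the plane normal is already derived from the estimated orientation:

```python
import math

MAX_TILT = math.radians(18)  # selection tolerance from the text

def sonar_distance(raw_range, tilt):
    """Project a raw sonar range onto the axis perpendicular to the
    reference plane; reject the echo when the beam is tilted more
    than the 18-degree tolerance."""
    if abs(tilt) > MAX_TILT:
        return None               # measurement not selected
    return raw_range * math.cos(tilt)
```

The cosine projection converts the slant range along the tilted beam into the true perpendicular distance to the wall or floor.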

Readjustment :
2 kinds of variables are readjusted :

The position estimation

The speed estimation

The speed estimation is readjusted with the help of the "estimated sonar speed". This sonar speed is computed between consecutive (100 ms apart) sonar measurements from the same sensor, against the same reference plane.

Finally, I do not readjust the speed or position estimation 100% with the sonar measurement. The measures from the sonars can be noisy, and the measurement step is quite coarse. Furthermore, a small obstacle (a door handle, for example) can induce a temporarily wrong measure. Therefore, I only readjust 50% for the position, and 30% for the speed.
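The partial readjustment is a one-liner per variable; the 50%/30% factors come from the text, the names are illustrative:

```python
K_POS, K_SPEED = 0.50, 0.30  # blend factors from the text

def readjust(position, speed, sonar_position, sonar_speed):
    """Pull the estimates part-way toward the sonar-derived values,
    so one noisy or obstructed echo cannot yank the whole state."""
    position += K_POS * (sonar_position - position)
    speed += K_SPEED * (sonar_speed - speed)
    return position, speed
```

Repeated measurements against the same plane still converge fully, just over a few correction cycles instead of one.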

Once again, the principle is similar to the orientation estimation. We have 2 kinds of sensors for the determination of the same information (the 3D position):

The high frequency component of the 3D position is computed with the accelerometers

The low frequency component of the 3D position is computed with the sonars (and the Wii camera, see below).

1.6) 3D position - the Wii Remote camera:

The Wii Remote camera is a great sensor. Unlike the sonars, it measures 2 axes (X, Y) simultaneously. The camera is only used 20 times per second in my UAV, but it can measure at over 100 Hz!

The camera gives (via I2C) the 2D coordinates, inside the picture, of an IR spot lying on the ground. With this information, together with the altitude estimation and the 3D orientation estimation, I can "easily" compute the 2D position (X and Y, not Z-altitude).
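One possible form of this computation, assuming a downward-looking camera, a simple pinhole model with the Wii camera's 1024x768 pixel grid, and a small-angle tilt correction; the field-of-view value and all names here are assumptions, not the author's code:

```python
import math

# Hypothetical pinhole model: 1024x768 pixels, ~33 deg horizontal FOV.
FOCAL_PX = 512 / math.tan(math.radians(33) / 2)

def spot_to_position(px, py, altitude, roll, pitch, spot_x, spot_y):
    """Convert a spot's pixel coordinates into the UAV's (X, Y) ground
    position, correcting the view angles with pitch and roll."""
    # Angle of the spot in the camera frame, corrected by the UAV tilt:
    angle_x = math.atan2(px - 512, FOCAL_PX) - pitch
    angle_y = math.atan2(py - 384, FOCAL_PX) + roll
    # Ground offset of the spot relative to the UAV:
    dx = altitude * math.tan(angle_x)
    dy = altitude * math.tan(angle_y)
    # UAV position = known spot position minus that offset
    return spot_x - dx, spot_y - dy
```

The altitude scaling is why the sonar altitude estimate is needed before the camera can contribute X and Y.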

The positions of all the spots are known in advance, exactly like the walls for the sonars. A new IR spot is only selected as a good reference if the 3D position that it induces is credible (not far from the current estimated position). Unfortunately, this part doesn't work well for the moment.

The Wii camera is capable of measuring up to 4 spots simultaneously. The algorithm described above is computed for each spot measured by the camera.

Then, I readjust the 2D position estimation (X, Y) in exactly the same way as for the sonars (50% position, 30% speed). Unfortunately, the speed information is not well measured by this camera, especially at high altitudes (over 2 meters). The speed information is essential for the PID control loop!

1.7) Landing detection :

Finally, I compute a probability that the UAV is lying flat on the floor. If the throttle order is low, and if the altitude estimation is low, then I know that the UAV is lying on the floor. This is a good source of information! During such situations, I make the following adjustments:

Rapid offset compensation of ALL accelerometers and gyroscopes. This is the perfect situation to do this.

Make the pitch and roll angles rapidly converge to zero
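These two on-ground adjustments can be sketched as follows; the thresholds and the fast convergence gain are hypothetical:

```python
K_GROUND = 0.2  # fast convergence gain while on the ground (hypothetical)

def on_ground(throttle, altitude, throttle_max=0.1, altitude_max=0.05):
    """Landing test: low throttle order and low altitude estimate
    mean the UAV is lying flat on the floor."""
    return throttle < throttle_max and altitude < altitude_max

def ground_update(roll, pitch, gyro_offset, gyro_raw):
    """While on the ground: recalibrate the gyro offset quickly (the
    true rate is zero, so any reading is pure offset) and pull roll
    and pitch rapidly toward zero (the UAV is known to be flat)."""
    gyro_offset += K_GROUND * (gyro_raw - gyro_offset)
    roll -= K_GROUND * roll
    pitch -= K_GROUND * pitch
    return roll, pitch, gyro_offset
```

In flight the same corrections must be extremely slow (or disabled); on the ground the true state is known, so aggressive gains are safe.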