System Architecture

As mentioned earlier, the main challenge of the project is to write embedded software, preferably from scratch. However, as the project has to be completed in limited time (7 weeks), a sample skeleton program will be provided that implements the basic ES functionality of communicating commands and sensor readings over the PC link. This program effectively serves as a guide to the library functions operating the various hardware components of the Quadrupel drone. Note that the skeleton program consists of two parts: one for the PC side (e.g., reading the joystick and sending/receiving characters over the RS232 link), and one for the drone side (e.g., reading the sensor values and writing the motor values). The sample program and additional information regarding hardware components and driver protocols are available on the project site. Note that the sample program is provided purely for reference; a team is entirely free to adopt as much or as little of it as deemed necessary. That is, the team is completely free in developing its software: the system architecture, component definitions, interface definitions, etc. are specified, implemented, tested, and documented by the team.
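To make the command/telemetry traffic concrete, the following is a minimal sketch of what a PC-to-drone frame might look like. The skeleton program defines its own protocol; the start byte, field names, and field widths below are illustrative assumptions only.

```c
#include <stdint.h>

/* Hypothetical frame layout for the PC <-> drone RS232 link; the start
 * byte, field names, and sizes are assumptions, not the skeleton's API. */
#define FRAME_START 0x7E

typedef struct {
    uint8_t start;   /* fixed start byte for (re)synchronisation     */
    uint8_t mode;    /* requested mode: safe, panic, manual, ...     */
    int8_t  roll;    /* joystick roll axis, scaled to -127..127      */
    int8_t  pitch;   /* joystick pitch axis                          */
    int8_t  yaw;     /* joystick yaw axis                            */
    uint8_t lift;    /* throttle, 0..255                             */
    uint8_t crc;     /* checksum over the preceding bytes            */
} pc_command_t;

/* Simple additive checksum; a CRC-8 would be a more robust choice. */
static uint8_t frame_checksum(const uint8_t *buf, int len)
{
    uint8_t sum = 0;
    for (int i = 0; i < len; i++)
        sum += buf[i];
    return sum;
}
```

A fixed start byte lets the receiver resynchronise after a dropped character, which matters once the link moves to the lossier Bluetooth transport later in the project.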

System Development and Testing

An important issue in the project is the order of system development and testing. Typically, the system is developed from joystick to drone (the actuator path) and back (the sensor path), after which the yaw, roll, and pitch controllers are implemented in sequence. All but the last stage are performed in tethered mode. Each phase represents a project milestone, which directly translates into team credit. Note: each phase must start with an architectural design that presents an overview of the approach taken. The design must be approved by the TA before the team is cleared to start the actual implementation.

Actuator Path (Safe, Panic, and Manual mode)

First, the joystick software is developed (use non-blocking event mode) and tested by visually inspecting the effect of the various stick axes. The next steps are the development of the RS232 software on the PC, and of a basic controller that maps the RS232 commands to the drone hardware interfaces (sensors and actuators). At this time, the safe mode functionality is also tested. Once the design is fully tested, the system can be demonstrated using the actual drone.
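As a starting point, the sketch below reads the standard Linux joystick device in non-blocking mode, so the control loop never stalls waiting for stick input. The device path and the printed axis/button mapping are assumptions that may differ per machine.

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <errno.h>
#include <linux/joystick.h>

/* Minimal non-blocking joystick poll; /dev/input/js0 is an assumption
 * (it may be /dev/input/js1 etc. on your PC). */
int main(void)
{
    int fd = open("/dev/input/js0", O_RDONLY | O_NONBLOCK);
    if (fd < 0) { perror("open"); return 1; }

    struct js_event e;
    for (;;) {
        ssize_t n = read(fd, &e, sizeof(e));
        if (n == sizeof(e)) {
            if (e.type & JS_EVENT_AXIS)
                printf("axis %d -> %d\n", e.number, e.value);
            else if (e.type & JS_EVENT_BUTTON)
                printf("button %d -> %d\n", e.number, e.value);
        } else if (n < 0 && errno != EAGAIN) {
            perror("read");   /* real error, not just "no event yet" */
            break;
        }
        /* EAGAIN: no event pending; continue with the rest of the loop */
    }
    close(fd);
    return 0;
}
```

Because the read returns immediately when no event is pending, the same loop can interleave joystick polling with the RS232 traffic to the drone.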

Sensor Path (Calibration mode)

The next step is to implement the sensor path, for now using the on-board motion processor that provides angular information. Calibration is an important step: (zeroed) sensor data should be recorded (logged) and presented on the PC under simulated flying conditions (i.e., applying RPM to the motors and gently rotating the drone in 3D).
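A common way to zero the sensor readings is to average a batch of samples while the drone is at rest and subtract that mean from all subsequent readings. The sketch below illustrates this for a single channel; the sample count and the `read_raw_yaw` accessor are hypothetical.

```c
#include <stdint.h>

#define CAL_SAMPLES 1024   /* number of samples to average (assumption) */

static int16_t yaw_offset = 0;

/* Average CAL_SAMPLES raw yaw-rate readings taken at rest and store the
 * mean as the zero offset; read_raw_yaw is a hypothetical accessor. */
void calibrate_yaw(int16_t (*read_raw_yaw)(void))
{
    int32_t sum = 0;
    for (int i = 0; i < CAL_SAMPLES; i++)
        sum += read_raw_yaw();
    yaw_offset = (int16_t)(sum / CAL_SAMPLES);
}

/* All later readings are reported relative to the calibrated zero. */
int16_t read_zeroed_yaw(int16_t (*read_raw_yaw)(void))
{
    return read_raw_yaw() - yaw_offset;
}
```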

Yaw Control

The next step is to introduce yaw feedback control. The controller is modified such that the yaw signal is interpreted as a setpoint, which is compared to the yaw rate measurement r returned by the sensor path; the controller adjusts ae1 to ae4 whenever a difference between yaw and r is found (using a P algorithm). Testing includes verifying that the controller parameter (P) can be controlled from the keyboard, after which the drone is “connected” to the ES. Correct controller operation is verified by gently yawing the drone and inspecting yaw, r, and the controller-generated signals ae1 to ae4 on the PC screen, without actually sending these values to the drone’s motors. Only when this test has completed successfully are the signals ae1 to ae4 connected to the drone motors and the system cleared for the yaw control test on the running drone.
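In essence, the P algorithm amounts to a few lines of code. The sketch below follows the names used in the text (setpoint, r, ae1 to ae4); the mixing signs and gain value are assumptions that depend on the drone’s motor layout.

```c
#include <stdint.h>

int16_t P = 10;   /* yaw gain, adjustable from the keyboard (assumption) */

/* Proportional yaw control: turn the error between the yaw setpoint and
 * the measured yaw rate r into a differential motor adjustment. */
void yaw_control(int16_t yaw_setpoint, int16_t r, int16_t ae[4])
{
    int32_t err = yaw_setpoint - r;   /* setpoint vs. measured yaw rate */
    int32_t u   = P * err;            /* P action                       */

    /* Yaw torque: speed one motor pair up, the counter-rotating pair
     * down; real code must also clamp ae[i] to the valid motor range. */
    ae[0] += u;  ae[2] += u;
    ae[1] -= u;  ae[3] -= u;
}
```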

Full Control

The next step is to introduce roll and pitch feedback control, where the controller is extended in conformance with Roll/Pitch control mode, using cascaded P controllers for both angles. Again, the first test is conducted with ae1 to ae4 disconnected in order to protect the system. When the system is cleared for the final demonstration, the test is re-conducted with ae1 to ae4 connected.
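The cascade structure can be illustrated as follows for the roll axis (pitch is analogous): an outer loop turns the angle error into a rate setpoint, which an inner loop then tracks. The gains P1 and P2 and the variable names are assumptions for the sketch.

```c
#include <stdint.h>

int16_t P1 = 8;   /* outer (angle) gain, assumption */
int16_t P2 = 4;   /* inner (rate) gain, assumption  */

/* Cascaded P control for roll: phi is the measured roll angle, p the
 * measured roll rate; the return value is the differential motor
 * correction to be mixed into ae1..ae4 (and clamped). */
int32_t roll_control(int16_t phi_setpoint, int16_t phi, int16_t p)
{
    int32_t rate_sp = (int32_t)P1 * (phi_setpoint - phi); /* outer loop */
    return P2 * (rate_sp - p);                            /* inner loop */
}
```

The cascade keeps each loop simple: the outer loop only has to be fast enough to steer the angle, while the inner rate loop does the quick stabilization work.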

Raw Sensor Readings

This step involves performing the sensor fusion on the main controller, instead of relying on the sensing module’s motion processor. Experience has taught that the Kalman filter presented in class [1] looks deceptively simple, yet the details of implementing it with hand-written fixed-point arithmetic are far from trivial. Therefore, the effectiveness of the digital filters must be extensively tested. Log files of raw and filtered sensor data need to be demonstrated to the TAs, preferably as time plots (matplotlib in Python, or gnuplot). Before flying may be attempted, the TAs will thoroughly exercise the combined filtering and control of your drone, to safeguard against major incidents resulting from subtle bugs.
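Most of the subtle bugs hide in the fixed-point arithmetic itself. The helpers below show a minimal Q16.16 representation; the choice of Q format is an assumption and should be matched to your sensor ranges and filter constants.

```c
#include <stdint.h>

/* Minimal Q16.16 fixed-point helpers; the Q format is an assumption. */
typedef int32_t fix_t;
#define FIX_SHIFT 16
#define FIX_ONE   ((fix_t)1 << FIX_SHIFT)

static inline fix_t fix_from_int(int32_t x) { return x << FIX_SHIFT; }

/* A 64-bit intermediate avoids the overflow that silently corrupts a
 * 32-bit multiply -- a classic source of "subtle bugs" in these filters. */
static inline fix_t fix_mul(fix_t a, fix_t b)
{
    return (fix_t)(((int64_t)a * b) >> FIX_SHIFT);
}

static inline fix_t fix_div(fix_t a, fix_t b)
{
    return (fix_t)(((int64_t)a << FIX_SHIFT) / b);
}
```

Logging both the raw and filtered values of every filter state (not just the outputs) makes the required time plots far more useful when hunting down scaling and overflow errors.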

Height Control

The implementation of this feature requires coding yet another control loop, which stabilizes the drone’s height automatically, freeing the pilot from this tedious task. It may only be attempted once full control has been successfully demonstrated to the TAs.
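Structurally the height loop resembles the earlier P loops, acting on the lift channel instead of the attitude channels. The sketch below is hypothetical: the height estimate source, gain, and clamping range are assumptions.

```c
#include <stdint.h>

int16_t P_height = 5;   /* height gain, assumption */

/* Proportional height hold: adjust the pilot's base lift with a
 * correction proportional to the height error, then clamp to the
 * valid motor command range (range is an assumption). */
int16_t height_control(int16_t height_sp, int16_t height, int16_t base_lift)
{
    int32_t lift = base_lift + (int32_t)P_height * (height_sp - height);

    if (lift < 0)    lift = 0;
    if (lift > 1000) lift = 1000;
    return (int16_t)lift;
}
```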

Wireless Control

As with height control, wireless control may only be attempted once full control has been successfully demonstrated. In wireless mode, the PC link is operated over Bluetooth. The particular challenge of this version is the low bandwidth of the link compared to the tethered version, which permits only very limited setpoint communication to the ES. This implies that the ES stabilization of the drone must be strong, as pilot intervention capabilities are very limited.
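One possible low-bandwidth strategy is to pack a setpoint update into a handful of bytes and transmit only when it actually changes. The encoding below is an illustrative assumption, not part of the skeleton.

```c
#include <stdint.h>
#include <string.h>
#include <stdbool.h>

/* Compact setpoint update: 4 bytes instead of a full command frame.
 * Field widths are illustrative assumptions. */
typedef struct {
    int8_t  roll;   /* -127..127 */
    int8_t  pitch;
    int8_t  yaw;
    uint8_t lift;   /* 0..255    */
} packed_setpoint_t;

/* Transmit only on change to save further bandwidth; the struct has no
 * padding (four single-byte members), so memcmp is safe here. */
bool setpoint_changed(const packed_setpoint_t *prev,
                      const packed_setpoint_t *cur)
{
    return memcmp(prev, cur, sizeof(*cur)) != 0;
}
```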

Methodology

The approach towards development and testing must be professional rather than student-like: instead of an iterative try-compile-test loop, each component must be developed and tested in isolation, preferably outside of lab hours, so as to maximize the probability that during lab integration (where time is extremely limited) the system works. This especially applies to tests that involve the drone, as drone availability is limited to lab hours. The situation where there is only limited access to the embedding environment is akin to reality, where the vast majority of development activities have to be performed under simulation conditions, without access to the actual system (e.g., a wafer scanner or vehicle), because it either is not yet available, there are only a few units, and/or testing time is simply too expensive.