Abstract: |
When flying light sport aeroplanes, pilots must use both hands and both feet simultaneously, since controlling such aircraft requires multi-dimensional input. One hand moves the stick for ailerons and elevator, the feet operate the rudder pedals, and the other hand alternates between primary instrument adjustments, throttle setting, flap control, and secondary equipment such as communication devices. From the moment the engine is started, pilots are fully occupied with these manual actions, and on top of that they are required by air law to log the relevant stages of the flight operation with accurate times [1].
In sport flying, records are commonly kept by handwriting, which in principle must be done even during especially critical phases of the flight, e.g., while the aircraft is lifting off the runway. Another demanding training manoeuvre is the "touch and go", in which a landing approach is flown until the wheels touch the ground and the aircraft is then immediately given full power for another take-off. In this action sequence, pilots rarely have a good opportunity to note the touchdown time; usually there is not even time to glance at a clock display.
Modern consumer technology offers, in the form of smartphones, devices that can feasibly trace flight movements with sufficient accuracy, since they are equipped with a broad range of suitable sensors. In the research project described here, a smartphone app is under ongoing development that automatically records the required flight log information, so that pilots are relieved of this distracting duty and can concentrate fully on flying their aircraft.
Several technologies and processing concepts had to be combined in this smartphone app so that the logging system works fully automatically with minimal or even no input from the pilot. Flight activity and its stages are traced and detected via GPS and audio signal sensing; specifically developed filter mechanisms are applied as a pre-processing stage before signal analysis. The final decision logic is based on pattern recognition and geofencing.
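The geofencing part of such a decision logic can be sketched as a point-in-circle test of GPS fixes against known parking positions. This is a minimal illustration, not the app's actual implementation; the hangar coordinates, fence radii, and aircraft registrations below are invented for the example.

```python
import math

# Hypothetical hangar parking positions: registration -> (lat, lon, radius in metres).
# Values are illustrative only, not taken from the described project.
HANGAR_FENCES = {
    "D-EABC": (48.1100, 11.2500, 15.0),
    "D-EXYZ": (48.1102, 11.2504, 15.0),
}

EARTH_RADIUS_M = 6_371_000.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def identify_plane(lat, lon):
    """Return the registration whose hangar fence contains the fix, else None."""
    for reg, (flat, flon, radius) in HANGAR_FENCES.items():
        if haversine_m(lat, lon, flat, flon) <= radius:
            return reg
    return None
```

A fix inside a fence identifies the parked aircraft (and, symmetrically, a return into the fence can mark the end of the mission); a fix outside every fence yields `None`.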
The control logic frame of the app is realised as a finite state machine, which needs, e.g., seven states for the operational part up to the first take-off. A set of different, complementary key techniques is used to detect state transitions. At present, the aircraft type and the end of the mission are recognised by geofencing on the hangar parking positions. Aircraft identification is essential for knowing the proper take-off and landing speeds; only with such parameters set correctly can take-off and landing times be determined accurately and complex actions like "touch and go" be distinguished.
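A finite state machine of this kind can be expressed as a transition table keyed by (state, event). The abstract only states that seven states cover the phase up to the first take-off; the state names and detector events below are assumptions made for illustration.

```python
from enum import Enum, auto

class FlightState(Enum):
    # Illustrative seven-state set up to the first take-off;
    # the actual state names of the app are not published here.
    IDLE = auto()          # aircraft parked, app armed
    ENGINE_START = auto()  # engine noise detected by audio sensing
    TAXI = auto()          # GPS ground speed above a taxi threshold
    HOLDING = auto()       # stationary at the runway holding point
    TAKEOFF_ROLL = auto()  # accelerating along the runway
    ROTATION = auto()      # rotation speed reached (plane-type dependent)
    AIRBORNE = auto()      # climb detected, take-off time logged

# Allowed transitions; events would come from the GPS/audio detectors.
TRANSITIONS = {
    (FlightState.IDLE, "engine_on"): FlightState.ENGINE_START,
    (FlightState.ENGINE_START, "moving"): FlightState.TAXI,
    (FlightState.TAXI, "stopped"): FlightState.HOLDING,
    (FlightState.HOLDING, "accelerating"): FlightState.TAKEOFF_ROLL,
    (FlightState.TAKEOFF_ROLL, "rotation_speed"): FlightState.ROTATION,
    (FlightState.ROTATION, "climb_detected"): FlightState.AIRBORNE,
}

def step(state, event):
    """Advance the machine; events not valid in the current state are ignored."""
    return TRANSITIONS.get((state, event), state)
```

Ignoring invalid (state, event) pairs makes the machine robust against spurious detector output, which matters when transitions are inferred from noisy GPS and audio signals.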
Although the logging app has been in use for a long time and has proven its reliability in hundreds of take-offs and landings (there were only a few isolated deviations from logs taken manually when air traffic controllers were available as a ground service), the functional concepts shall be further enhanced. In future, pattern recognition on the engine and propeller sound shall support identifying the particular aircraft, and the smartphone's gravity and acceleration sensors shall be exploited in addition to GPS input to improve recognition reliability in more complex flight manoeuvres.
[1] EASA, "FCL.050 Recording of flight time," in: Annex I - Part-FCL, V1, pp. 29-39, June 2016.