{{Short description|French research and development project (2013–)}}
{{Use dmy dates|date=October 2020}}
{{Infobox robot
| name = Air-Cobot
| logo = Logo Air-Cobot.png
| logosize =
| image =
| imsize =
| alt =
| derived_from =
| replaced_by =
| website = {{URL|aircobot.akka.eu/}}
}}


'''Air-Cobot''' (''A''ircraft ''I''nspection enhanced by sma''R''t & ''C''ollaborative r''OBOT'') is a French [[research and development]] project of a wheeled [[cobot|collaborative mobile robot]] able to inspect aircraft during maintenance operations. Led by [[Akka Technologies]], this multi-partner project involves research laboratories and industry. Research around this prototype was developed in three domains: [[Autonomous aircraft|autonomous navigation]], human-robot collaboration and [[nondestructive testing]].


Air-Cobot is presented as the first wheeled robot able to perform visual inspections of aircraft. Inspection robots using other types of sensors had been considered before, such as the European project Robair. Since the launch of the project, other solutions based on [[image processing]] have been developed, such as [[EasyJet]]'s use of a [[Unmanned aerial vehicle|drone]], the swarm of drones from the [[Toulouse]] company [[Donecle]] and the Aircam project of the [[aerospace manufacturer]] [[Airbus]].
== Diffusion ==
On 23 October 2014, a [[patent]] was filed by [[Airbus Group]].<ref>[http://worldwide.espacenet.com/publicationDetails/biblio?CC=WO&NR=2015059241&KC=&locale=en_EP&FT=E Collaborative robot for visually inspecting an aircraft]</ref> From 2014 to 2016, the robot was presented at five [[exhibition]]s, including one on [[robotic]]s and two on [[aeronautic]]s. The research developed in the project was presented at nine [[conference]]s. Twelve scientific articles were published: nine in [[conference proceeding]]s and three in [[Scientific journal|journal]]s. Some of the publications centre on navigation and/or inspection by Air-Cobot, while the rest focus on specific [[numerical method]]s or [[hardware]] solutions related to the issues of the project.


Since the beginning of the project in 2013, the Air-Cobot robot has been dedicated to inspecting the lower parts of an aircraft. A prospective continuation of the project is to couple it with a drone to inspect the upper parts. In October 2016, [[Airbus|Airbus Group]] launched its research project on the hangar of the future in Singapore; the robots from the Air-Cobot and Aircam projects are included in it.

While located at the [[Laboratoire d'analyse et d'architecture des systèmes]] during development, the demonstrator was regularly presented by the researchers and engineers working on the project to visitors (external researchers, industrial partners, students) and sometimes to the general public, for instance during the 2015 Fête de la science.


=== Exhibitions ===
* 2014: TechnoDay Aerospace Valley
* 2014: Robotic Show [[Innorobo]]
* 2014: Salon des Partenaires de l'Industrie du Grand-Sud (SIANE)
* 2015: [[Paris Air Show]]<ref>{{in lang|fr}} [http://www.industrie-techno.com/bourget-2015-les-dix-rendez-vous-technos-a-ne-pas-louper.38838 Bourget 2015 : les dix rendez-vous technos à ne pas louper]</ref>
* 2016: [[Singapore Airshow]]<ref>[http://apex.aero/2016/02/24/singapore-airshow-2016-trends-emerging-technologies Singapore Airshow 2016 Trends: Emerging Technologies Take Off]</ref>


=== Journal articles ===
* {{Cite journal |first1=Igor |last1=Jovancevic |first2=Stanislas |last2=Larnier |first3=Jean-José |last3=Orteu |first4=Thierry |last4=Sentenac |title=Automated exterior inspection of an aircraft with a pan-tilt-zoom camera mounted on a mobile robot |journal=Journal of Electronic Imaging |volume=24 |number=6 |date=November 2015 |url=http://spie.org/Publications/Journal/10.1117/1.JEI.24.6.061110 }}
* {{Cite journal |first1=Jorge Othón |last1=Esparza-Jiménez |first2=Michel |last2=Devy |first3=José Luis |last3=Gordillo |title=EKF-based SLAM fusing heterogeneous landmarks |journal=Sensors |volume=16 |number=4 |year=2016 |url=http://www.mdpi.com/1424-8220/16/4/489/pdf}}
* {{Cite journal |first1=Daniel Törtei |last1=Tertei |first2=Jonathan |last2=Piat |first3=Michel |last3=Devy |title=FPGA design of EKF block accelerator for 3D visual SLAM |journal=Computers and Electrical Engineering |year=2016}}

== Project description ==

=== Objectives ===
Launched in January 2013,<ref name=lcitf1>{{in lang|fr}} {{Cite web |author=Xavier Martinage |url=http://lci.tf1.fr/economie/entreprise/air-cobot-le-robot-dont-dependra-votre-securite-8622912.html |title=Air-Cobot : le robot dont dépendra votre sécurité |website=lci.tf1.fr |publisher=[[La Chaîne Info]] |date=17 June 2015 |access-date=12 July 2016 |archive-url=https://web.archive.org/web/20160103234912/http://lci.tf1.fr/economie/entreprise/air-cobot-le-robot-dont-dependra-votre-securite-8622912.html |archive-date=3 January 2016 |url-status=dead}}</ref> the project is part of the Interministerial Fund program of [[Aerospace Valley]], a [[business cluster]] in southwestern France.<ref name=competitivite>{{in lang|fr}} {{Cite web |url=http://competitivite.gouv.fr/projets-en-cours-fui-investissements-d-avenir/fiche-projet-r-d-aide-355/air-cobot-253.html?cHash=03b3756b6aaf38b6899b6b169842a060 |title=Air-Cobot : un nouveau mode d'inspection visuelle des avions |website=competitivite.gouv.fr |publisher=Les pôles de compétitivité |access-date=12 July 2016 |archive-url=https://web.archive.org/web/20161011010313/http://competitivite.gouv.fr/projets-en-cours-fui-investissements-d-avenir/fiche-projet-r-d-aide-355/air-cobot-253.html?cHash=03b3756b6aaf38b6899b6b169842a060 |archive-date=11 October 2016 |url-status=dead }}</ref> With a budget of over one million [[euro]]s,<ref name=AirCosmos>{{in lang|fr}} {{Cite journal |author=Olivier Constant |url=http://www.pressreader.com/france/air-cosmos/20150911/282003261203383 |title=Le projet Air-Cobot suit son cours |journal=Air et Cosmos |number=2487 |date=11 September 2015 |access-date=12 July 2016}}</ref> Air-Cobot aims to develop an innovative [[cobot|collaborative]] mobile [[robot]], [[Autonomous aircraft|autonomous]] in its movements and able to perform the inspection of an aircraft with [[nondestructive testing]] [[sensor]]s during preflight or during [[Maintenance, repair, and operations|maintenance operations]] in a [[hangar]].<ref name=competitivite/><ref name=rapport20132014>{{in lang|fr}} {{Cite web |url=http://www.aerospace-valley.com/sites/default/files/documents/ra_14_exe_bd.pdf |title=Rapport d'activité 2013–2014 de l'Aerospace Valley |website=aerospace-valley.com |publisher=[[Aerospace Valley]] |access-date=12 July 2016 |archive-date=24 September 2016 |archive-url=https://web.archive.org/web/20160924195404/http://www.aerospace-valley.com/sites/default/files/documents/ra_14_exe_bd.pdf |url-status=dead }}</ref> Testing has been performed at the premises of [[Airbus]] and [[Air France Industries]].<ref name=AirCobotNews>{{in lang|fr}} {{Cite web |url=https://aircobot.akka.eu/?q=page/news |title=News du projet Air-Cobot |website=aircobot.akka.eu |publisher=Akka Technologies |access-date=12 July 2016 |archive-url=https://web.archive.org/web/20160710100116/https://aircobot.akka.eu/?q=page%2Fnews |archive-date=10 July 2016 |url-status=dead}}</ref>


=== Partners ===
[[Image:Air France Airbus A320-214; F-GKXC@MIA;17.10.2011 626bf (6446642243).jpg|thumb|Air-Cobot has been tested on [[Airbus A320]]s in the premises of [[Airbus]] and [[Air France Industries]].<ref name=AirCobotNews/>]]
The project leader is Akka Technologies. There are two academic partners; Akka Technologies and four other companies make up the five industrial partners.<ref name=capital>{{in lang|fr}} {{Cite journal |url=http://www.capital.fr/bourse/communiques/akka-technologies-akka-technologies-coordonne-le-projet-air-cobot-un-robot-autonome-d-inspection-visuelle-des-avions.-945346 |title=AKKA Technologies coordonne le projet Air-COBOT, un robot autonome d'inspection visuelle des avions |journal=[[Capital (magazine)|Capital]] |date=1 July 2014 |access-date=14 July 2016 |url-status=dead |archive-url=https://web.archive.org/web/20160625054008/http://www.capital.fr/bourse/communiques/akka-technologies-akka-technologies-coordonne-le-projet-air-cobot-un-robot-autonome-d-inspection-visuelle-des-avions.-945346 |archive-date=25 June 2016}}</ref>

;Academic partners
* Armines and [[Institut Clément Ader]] of the [[École des mines d'Albi-Carmaux]] are in charge of [[nondestructive testing]].<ref name=capital/><ref name=PlaneteRobot/>
* [[Laboratoire d'analyse et d'architecture des systèmes]] (LAAS-CNRS) with the Robotics, Action and Perception (RAP) team handles the [[Autonomous aircraft|autonomous navigation]].<ref name=capital/><ref name=PlaneteRobot/><ref>{{in lang|fr}} {{Cite web |url=https://www.laas.fr/public/fr/contrats-rap |title=Contrats RAP |publisher=[[Laboratoire d'analyse et d'architecture des systèmes]] |access-date=17 July 2016 |archive-date=14 September 2015 |archive-url=https://web.archive.org/web/20150914163208/https://www.laas.fr/public/fr/contrats-rap |url-status=dead }}</ref>

;Industrial partners
* Akka Technologies, particularly the center for [[research and development]] Akka Research Toulouse, leads the project and brings skills in [[image analysis]], navigation and aircraft maintenance.<ref name=AirCosmos/><ref name=capital/><ref name=PlaneteRobot/><ref>{{in lang|fr}} {{Cite journal |url=http://www.leparisien.fr/economie/emploi/top-employeur/akka-technologies-une-marque-employeur-orientee-sur-l-innovation-15-02-2016-5546759.php#xtref=https%3A%2F%2Fwww.google.fr |title=Akka Technologies : une marque employeur orientée sur l'innovation |journal=[[Le Parisien]]|date=15 February 2016 |access-date=17 July 2016}}</ref>
* [[Airbus]] Innovations is the initiator of the project, providing [[CAD model]]s of the [[Airbus A320]] and developing operating scenarios.<ref name=AirCosmos/><ref name=capital/><ref name=PlaneteRobot/>
* 2MoRO Solutions, a company based in the French Basque Country, is in charge of the maintenance information system.<ref name=capital/><ref name=PlaneteRobot/>
* M3 System, a Toulouse-based company, takes care of the outdoor localization solution based on the [[Global Positioning System]] (GPS).<ref name=capital/><ref name=PlaneteRobot/><ref name=Jupiter>{{Cite web |url=http://jupiter-egnss-its.eu/events-and-activities/company-showcase/m3-systems/ |title=M3 Systems Flagship Solution |publisher=M3 Systems |access-date=17 July 2016 |archive-url=https://web.archive.org/web/20160806221018/http://jupiter-egnss-its.eu/events-and-activities/company-showcase/m3-systems/ |archive-date=6 August 2016 |url-status=dead}}</ref>
* Sterela, based in the south of Toulouse, provides the 4MOB mobile platform.<ref name=capital/><ref name=PlaneteRobot/><ref name=4MOB>{{in lang|fr}} {{Cite web |url=http://www.sterela.fr/documents/50/4mob_fr.pdf |title=4MOB, plateforme intelligente autonome |publisher=Sterela Solutions |access-date=17 July 2016 |archive-url=https://web.archive.org/web/20160809221312/http://www.sterela.fr/documents/50/4mob_fr.pdf |archive-date=9 August 2016 |url-status=dead }}</ref>

=== Project finance ===
Project finance is provided by [[Bpifrance (company)|Bpifrance]], the [[Aquitaine Regional Council]], the Pyrénées-Atlantiques Departmental Council, the Midi-Pyrénées Regional Council and the [[European Union]].<ref>{{in lang|fr}} {{Cite web |url=https://aircobot.akka.eu/?q=page/financeurs |title=Financeurs |website=aircobot.akka.eu |publisher=Akka Technologies |access-date=15 July 2016 |archive-url=https://web.archive.org/web/20160804211759/https://aircobot.akka.eu/?q=page%2Ffinanceurs |archive-date=4 August 2016 |url-status=dead}}</ref>

=== Expected benefits ===
Aircraft are inspected during maintenance operations, either outdoors at an airport between flights or in a hangar for longer inspections. These inspections are conducted mainly by human operators, visually and sometimes using tools to assess defects.<ref group=A name=VillemotRFIA2016/> The project aims to improve the inspections of aircraft and their traceability. A database dedicated to each aircraft type, containing images and three-dimensional scans, is to be updated after each maintenance check. This makes it possible, for example, to assess the propagation of a crack.<ref name=rapport20132014/><ref name=Guillermard>{{in lang|fr}} {{Cite journal |author=Véronique Guillermard |url=http://www.lefigaro.fr/societes/2015/06/18/20005-20150618ARTFIG00009-aircobot-controle-les-avions-avant-le-decollage.php |title=Aircobot contrôle les avions avant le décollage |journal=[[Le Figaro]]|date=18 May 2015 |access-date=14 July 2016}}</ref>

A human operator's eyes fatigue over time, while an automatic solution ensures the reliability and repeatability of inspections. Reducing inspection time is a major objective for aircraft manufacturers and airlines: faster maintenance operations optimize the availability of aircraft and reduce maintenance operating costs.<ref name=rapport20132014/><ref name=Guillermard/>

== Robot equipment ==
[[Image:AFI 03 2016 Air-Cobot in hangar.png|thumb|right|Air-Cobot in a hangar of [[Air France Industries]].<ref group=A name=VillemotRFIA2016/>]]
All the electronic equipment is carried by the 4MOB mobile platform manufactured by Sterela. The off-road platform, equipped with four-wheel drive, can move at a speed of 2 metres per second ({{convert|7.2|km/h|2}}).<ref name=4MOB/> Its [[lithium-ion battery]] allows an operating time of eight hours. Obstacle-detection bumpers at the front and rear stop the platform if they are compressed.<ref name=4MOB/>

The cobot weighs {{convert|230|kg|0}}. It has two computers, one running [[Linux]] for the [[Autonomous aircraft|autonomous navigation]] module and the other running [[Microsoft Windows|Windows]] for the [[nondestructive testing]] module. The robot is equipped with several sensors. The [[pan-tilt-zoom camera]] manufactured by Axis Communications and the Eva [[3D scanner]] manufactured by [[Artec 3D]] are dedicated to inspection. The sensors for navigation are an [[inertial measurement unit]]; two benches, each equipped with two PointGrey cameras; two Hokuyo laser range finders; and a GPS unit developed by M3 Systems that allows for [[geofencing]] tasks in outdoor environments.<ref name=AirCosmos/><ref name=PlaneteRobot/>

== Autonomous navigation ==
The autonomous navigation of the Air-Cobot robot proceeds in two phases. The first, navigation in the airport or the factory, brings the robot close to the aircraft. The second, navigation around the aircraft, positions the robot at control points referenced in the aircraft's virtual model. In addition, the robot must insert itself in a dynamic environment where humans and vehicles are moving; to address this, it has an obstacle avoidance module. Many navigation algorithms run on the robot simultaneously under real-time constraints, and research has been conducted on optimizing their computing time.{{Citation needed|date=July 2021}}

=== Navigation in the airport or the factory ===
In an outdoor environment, the robot is able to go to the inspection site by localizing through [[Global Positioning System]] (GPS) data. The GPS device developed by M3 Systems allows [[geofencing]]. At the airport, the robot operates in dedicated navigation corridors respecting speed limits. Alerts are sent to the operator if the robot enters a prohibited area or exceeds a given speed.<ref name=Jupiter/><ref group=A name=DonadioREV2017/>
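The corridor and speed rules described above can be illustrated with a short sketch. This is not the M3 Systems implementation; the ray-casting polygon test, the 2 m/s limit and the alert strings are illustrative assumptions.

```python
# Illustrative geofencing check: is the robot inside its navigation corridor,
# and is it within the speed limit? Corridor is a polygon of (x, y) vertices.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: True if (x, y) lies inside the polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray cast to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def geofence_alerts(position, speed, corridor, speed_limit=2.0):
    """Return the list of alerts to send to the operator (may be empty)."""
    alerts = []
    if not point_in_polygon(position[0], position[1], corridor):
        alerts.append("outside corridor")
    if speed > speed_limit:
        alerts.append("overspeed")
    return alerts
```

For a rectangular corridor `[(0, 0), (10, 0), (10, 2), (0, 2)]`, a robot at `(5, 1)` moving at 1.5 m/s raises no alert, while one at `(5, 3)` moving at 2.5 m/s raises both.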

Another algorithm based on [[computer vision]] provides lane marking detection in [[Real-time computing|real-time]]. When visible, painted lanes on the ground can provide complementary data to the positioning system for safer trajectories.<ref group=A>{{harvnb|Bauda|Bazot|Larnier|2017|loc=ECMSM}}</ref> In an indoor environment, or an outdoor environment where GPS information is not available, the cobot can be switched to a follower mode and move behind the human operator, following him or her to the aircraft to be inspected.<ref name=VideoAir-Cobot/><ref group=A name=DonadioREV2017/>

=== Navigation around the aircraft ===
To perform the inspection, the robot has to navigate around the aircraft and reach checkpoints referenced in the aircraft's virtual model. Since the position of the aircraft in the airport or factory is not known precisely, the cobot needs to detect the aircraft in order to estimate its own position and orientation relative to it. To do this, the robot can locate itself either with the data from its laser range finders,<ref group=A name=FrejavilleRFIA2016/> or with image data from its cameras.<ref group=A name=VillemotRFIA2016/><ref group=A name=JovancevicICPRAM2016>{{harvnb|Jovancevic|Viana|Orteu|Sentenac|2016|loc=ICPRAM}}</ref>

Near the aircraft, a three-dimensional point cloud is acquired by changing the orientation of the laser scanning sensors fixed on pan-tilt units. After filtering the data to remove the ground and insufficiently large point clusters, a registration technique with the model of the aircraft is used to estimate the static orientation of the robot. The robot then moves and keeps this estimate up to date by considering its wheel odometry, its inertial unit and visual odometry.<ref group=A name=FrejavilleRFIA2016/>
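The registration step can be sketched as follows. This is a minimal, illustrative variant: the ground filter is reduced to a height threshold, and the alignment is the single-step Kabsch/SVD solution used inside ICP-style registration, assuming point correspondences are already known (the project's actual pipeline is not public).

```python
import numpy as np

def remove_ground(points, z_min=0.2):
    """Drop points near the floor (simple height filter, z in metres)."""
    return points[points[:, 2] > z_min]

def best_fit_transform(scan, model):
    """Kabsch/SVD best-fit rotation R and translation t mapping scan points
    onto corresponding model points: model ~= R @ scan + t."""
    cs, cm = scan.mean(axis=0), model.mean(axis=0)
    H = (scan - cs).T @ (model - cm)          # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cm - R @ cs
    return R, t
```

In a full ICP loop, correspondences would be re-estimated by nearest-neighbour search and this transform recomputed until convergence.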

{{gallery
|title=Matching clouds
|height=110 |width=180
|align=center
|File:RFIA2016 Air-Cobot Outside 3D Laser Scan.png
|3D laser acquisition in an outside environment.<ref group=A name=FrejavilleRFIA2016/>
|File:RFIA2016 Air-Cobot Matching Outside 3D Laser Scan With Aircraft Model.png
|Matching of the outside data with the aircraft model.<ref group=A name=FrejavilleRFIA2016/>
|File:RFIA2016 Air-Cobot Inside 3D Laser Scan.png
|3D laser acquisition in an inside environment.<ref group=A name=FrejavilleRFIA2016/>
|File:RFIA2016 Air-Cobot Matching Inside 3D Laser Scan With Aircraft Model.png
|Matching of the inside data with the aircraft model.<ref group=A name=FrejavilleRFIA2016/>
}}

[[Image:Airbus A320 (Air France) (5796695561).jpg|thumb|right|Air-Cobot can estimate its position relative to an aircraft by using visual landmarks on the fuselage.<ref group=A name=JovancevicICPRAM2016/>]]

Laser data are also used horizontally, in two dimensions. An algorithm provides a real-time estimate of the robot's position when enough elements of the landing gears and engines are visible. A confidence index is calculated based on the number of items collected by the lasers; if sufficient confidence is achieved, the position is updated. This mode is particularly used when the robot moves beneath the aircraft.<ref group=A name=FrejavilleRFIA2016/>
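Such a confidence-gated update can be sketched minimally as follows; the ratio-based index and the 0.7 threshold are illustrative assumptions, not the published formula.

```python
def position_confidence(n_detected, n_expected):
    """Confidence index: fraction of the expected landmark elements
    (landing gears, engines) actually seen in the 2D laser scan."""
    if n_expected == 0:
        return 0.0
    return min(n_detected / n_expected, 1.0)

def fuse_position(current, laser_estimate, n_detected, n_expected, threshold=0.7):
    """Replace the dead-reckoning position by the laser-based estimate
    only when the confidence index is high enough."""
    if position_confidence(n_detected, n_expected) >= threshold:
        return laser_estimate
    return current
```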

For visual localization, the robot estimates its position relative to the aircraft using visual elements (doors, windows, tires, static ports etc.) of the aircraft. As the robot moves, these visual elements are extracted from a three-dimensional virtual model of the aircraft and projected into the image plane of the cameras. The projected shapes are used for [[pattern recognition]] to detect those visual elements.<ref group=A name=JovancevicICPRAM2016/> The other detection method used is based on the extraction of features with a [[Speeded Up Robust Features]] (SURF) approach. A pairing is performed between images of each element to be detected and the actual scene observed.<ref group=A name=VillemotRFIA2016/>
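The projection of model elements into the image plane follows the standard pinhole camera model; the intrinsic matrix and pose used below are illustrative placeholders, not the project's calibration.

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project 3D model points (aircraft frame) into the image plane of a
    pinhole camera with intrinsics K and pose (R, t): x_cam = R @ X + t."""
    cam = (R @ points_3d.T).T + t          # aircraft frame -> camera frame
    cam = cam[cam[:, 2] > 0]               # keep only points in front of the camera
    uv = (K @ cam.T).T                     # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]          # perspective divide -> pixel coordinates

# Illustrative intrinsics: 800 px focal length, principal point (320, 240).
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])
```

The resulting 2D outlines are what the pattern-recognition stage then searches for in the camera image.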

By detecting and tracking visual landmarks, the robot can, in addition to estimating its position relative to the aircraft, perform [[visual servoing]].<ref group=A name=MarcusICINCO2014/> Research in vision is also conducted on [[simultaneous localization and mapping]] (SLAM).<ref group=A>{{harvnb|Esparza-Jiménez|Devy|Gordillo|2014|loc=FUSION}}</ref><ref group=A>{{harvnb |Esparza-Jiménez|Devy|Gordillo|2016|loc=Sensors}}</ref> A fusion of the information from the two acquisition methods, laser and vision, is being considered, as is an artificial intelligence arbitrating between the various location estimates.<ref group=A name=FrejavilleRFIA2016>{{harvnb|Frejaville|Larnier|Vetault|2016|loc=RFIA}}</ref><ref group=A name=VillemotRFIA2016>{{harvnb|Villemot|Larnier|Vetault|2016|loc=RFIA}}</ref>

=== Obstacle avoidance ===
In both navigation modes, Air-Cobot can also detect, track, identify and avoid obstacles in its way. The laser data from the laser range finders and the visual data from the cameras can both be used for the detection, tracking and identification of obstacles. Detection and tracking work better on the two-dimensional laser data, while identification is easier in the camera images; the two methods are complementary. Information from the laser data can, for example, be used to delimit work areas in the image.<ref group=A name=MarcusICINCO2014/><ref group=A>{{harvnb|Lakrouf|Larnier|Devy|Achour|2017|loc=ICMRE}}</ref><ref group=A name=Leca2019/>

The robot has several possible responses when it encounters an obstacle, depending on its environment (navigation corridor, tarmac area with few obstacles, cluttered indoor environment etc.). It can stop and wait for a gap in traffic, avoid the obstacle by following a spiral-based trajectory, or perform [[path planning]].<ref group=A name=MarcusICINCO2014>{{harvnb|Futterlieb|Cadenat|Sentenac|2014|loc=ICINCO}}</ref><ref group=A name=Leca2019>{{harvnb|Leca|Cadenat|Sentenac|Durand-Petiteville|Gouaisbaut|Le Flécher|2019|loc=ECC}}</ref>
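A spiral-based avoidance manoeuvre can be sketched as waypoints that wind around the obstacle while converging onto a safety circle. The geometry below is one hedged reading of the idea for illustration, not the published controller (which is a closed-loop sensor-based law).

```python
import math

def spiral_avoidance_path(obstacle, start, safe_radius, steps=20):
    """Waypoints that sweep half a turn around `obstacle` (x, y), starting
    from `start` and converging onto a circle of radius `safe_radius`."""
    ox, oy = obstacle
    r0 = math.hypot(start[0] - ox, start[1] - oy)      # initial distance
    theta0 = math.atan2(start[1] - oy, start[0] - ox)  # initial bearing
    path = []
    for k in range(1, steps + 1):
        theta = theta0 + k * (math.pi / steps)                   # angular sweep
        r = safe_radius + (r0 - safe_radius) * (1 - k / steps)   # radial decay
        path.append((ox + r * math.cos(theta), oy + r * math.sin(theta)))
    return path
```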

=== Computing time optimization ===
Given the number of navigation algorithms running simultaneously to provide all the information in real time, research has been conducted to improve the computation time of some [[numerical method]]s using [[field-programmable gate array]]s (FPGAs).<ref group=A name=TerteiReconfig2014>{{harvnb|Tertei|Piat|Devy|2014|loc=ReConFig}}</ref><ref group=A name=AlhamwiICVS2015>{{harvnb|Alhamwi|Vandeportaele|Piat|2015|loc=ICVS}}</ref><ref group=A name=TerteiCEE2016>{{harvnb|Tertei|Piat|Devy|2016|loc=CEE}}</ref> The research focused on visual perception. The first part addressed [[simultaneous localization and mapping]] with an [[extended Kalman filter]], which estimates the state of a dynamic system from a series of noisy or incomplete measurements.<ref group=A name=TerteiReconfig2014/><ref group=A name=TerteiCEE2016/> The second addressed the localization and detection of obstacles.<ref group=A name=AlhamwiICVS2015/>
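The Kalman filter's predict-update cycle, the numerical core that such accelerators speed up, can be written compactly. The sketch below shows the linear form for clarity; the extended filter replaces F and H by Jacobians of the motion and measurement models.

```python
import numpy as np

def ekf_step(x, P, F, Q, z, H, R):
    """One Kalman predict + update on state x with covariance P, given
    transition F, process noise Q, measurement z, model H and noise R."""
    # Predict: propagate the state and its uncertainty.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: correct with the measurement.
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The matrix products and the inversion of S dominate the cost as the SLAM state grows, which is why they are natural candidates for an FPGA block.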

== Non-destructive testing ==
[[Image:Kiefer Lufthansa Airbus A320 D-AIZQ (13315338774).jpg|thumb|upright|left|Air-Cobot can inspect the blades of a [[turbofan]] engine.<ref group=A name=JovancevicJEI2015/>]]

=== Image analysis ===
After positioning itself to perform a visual inspection, the robot makes an acquisition with a [[pan-tilt-zoom camera]]. Several steps take place: pointing the camera, detecting the element to be inspected, re-pointing and zooming the camera if needed, image acquisition and inspection. Image analysis is used to determine whether doors are open or closed, whether protections are present on certain equipment, and to assess the state of [[turbofan]] blades or the wear of [[landing gear]] tires.<ref group=A name=JovancevicJEI2015>{{harvnb|Jovancevic|Larnier|Orteu|Sentenac|2015|loc=JEI}}</ref><ref group=A>{{harvnb|Jovancevic|Orteu|Sentenac|Gilblas|2015a|loc=QCAV}}</ref><ref group=A>{{harvnb|Jovancevic|Orteu|Sentenac|Gilblas|2015b|loc=CMOI}}</ref><ref group=A name=JovancevicMECO2016>{{harvnb|Jovancevic|Arafat|Orteu|Sentenac|2016|loc=MECO}}</ref>

The detection uses [[pattern recognition]] of regular shapes (rectangles, circles, ellipses). For more complex shapes, the 3D model of the element to be inspected can be projected into the image plane. The evaluation is based on indices such as the uniformity of segmented regions, the convexity of their shapes, or the periodicity of the intensity of the image pixels.<ref group=A name=JovancevicJEI2015/>
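Two of the indices mentioned, convexity and uniformity, can be sketched as simple scores. The exact formulas used in the project are not public, so the definitions below are illustrative stand-ins.

```python
def convexity(region_area, hull_area):
    """Convexity index in (0, 1]: the segmented region's area divided by the
    area of its convex hull (1.0 for a perfectly convex region)."""
    return region_area / hull_area if hull_area else 0.0

def uniformity(pixels):
    """Intensity uniformity in (0, 1]: 1.0 for a region of constant intensity,
    decreasing as the intensity variance grows."""
    if not pixels:
        return 0.0
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return 1.0 / (1.0 + var)
```

A door detected as a dented, non-uniform rectangle would thus score low on both indices and be flagged for the operator.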

The [[feature extraction]] using [[speeded up robust features]] (SURF) can also perform the inspection of elements having two possible states, such as pitot probes or [[static port]]s being covered or not covered. A pairing is performed between images of the element to be inspected in its different states and the one present in the scene. For these simple items, an analysis during navigation is possible and preferable, since it saves time.<ref group=A name=VillemotRFIA2016/><ref group=A>{{harvnb|Leiva|Villemot|Dangoumeau|Bauda|2017|loc=ECMSM}}</ref>
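Such a two-state check can be sketched as counting good feature matches against a reference image of each state and picking the state that matches best. Lowe's ratio test below is the standard match filter; the state labels and scores are illustrative assumptions.

```python
def good_matches(dist_pairs, ratio=0.75):
    """Lowe's ratio test: count matches whose best descriptor distance is
    clearly smaller than the second-best (d1 < ratio * d2)."""
    return sum(1 for d1, d2 in dist_pairs if d1 < ratio * d2)

def classify_state(scores):
    """Pick the reference state (e.g. 'covered' / 'uncovered') whose stored
    image gathered the most good matches against the live view."""
    return max(scores, key=scores.get)
```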

{{Clear}}

=== Point cloud analysis ===
After the robot positions itself for a scanning inspection, a pantograph raises the [[3D scanner]] towards the fuselage. A pan-tilt unit moves the scanning device to acquire the hull. By comparing the acquired data to the three-dimensional model of the aircraft, algorithms are able to diagnose faults in the fuselage structure and provide information on their shape, size and depth.<ref name=SciencesEtAvenir>{{in lang|fr}} {{Cite journal |author=Pascal NGuyen |title=Des robots vérifient l'avion au sol |journal=Sciences et Avenir |number=814 |date=December 2014 |url=http://www.sciencesetavenir.fr/high-tech/20141205.OBS7120/des-robots-verifient-l-avion-au-sol.html |access-date=17 July 2016 |url-status=dead |archive-url=https://web.archive.org/web/20160808132509/http://www.sciencesetavenir.fr/high-tech/20141205.OBS7120/des-robots-verifient-l-avion-au-sol.html |archive-date=8 August 2016}}</ref><ref group=A>{{harvnb|Jovancevic|Pham|Orteu|Gilblas|2017|loc=I2M}}</ref><ref group=A>{{harvnb|Bauda|Grenwelge|Larnier|2018|loc=ETRSS}}</ref>
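Comparing the acquired cloud to the reference model can be sketched as flagging scan points that deviate from the model surface beyond a tolerance; a dent or bump then appears as a cluster of flagged points. The brute-force nearest-neighbour search and the 5 mm tolerance are illustrative simplifications (a real pipeline would use a spatial index and a calibrated threshold).

```python
import math

def nearest_distance(p, model):
    """Distance from scan point p to its closest point in the model cloud."""
    return min(math.dist(p, m) for m in model)

def detect_defects(scan, model, tol=0.005):
    """Return the scan points lying farther than `tol` metres from the
    reference hull model; these are the candidate fault locations."""
    return [p for p in scan if nearest_distance(p, model) > tol]
```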

By moving the pan-tilt units of the laser range finders, it is also possible to obtain a point cloud in three dimensions. Technical readjustment between the model of the aircraft and the scene point cloud is already used in navigation to estimate the static placement of the robot. It is planned to make targeted acquisitions, simpler in terms of movement, to verify the absence of chocks in front of the landing gear wheels, or the proper closing of engine cowling [[Latch (hardware)|latches]].<ref group=A name=FrejavilleRFIA2016/>

== Human-robot collaboration ==
As the project name suggests, the mobile robot is a cobot, a collaborative robot. During the navigation and inspection phases, a human operator accompanies the robot and can take control if necessary, add inspection tasks, note a defect that is not in the robot's list of checks, or validate the results. In the case of pre-flight inspections, the diagnosis of the [[walk-around]] is sent to the pilot, who decides whether or not to take off.<ref name=PlaneteRobot>{{in lang|fr}} {{Cite journal |journal=Planète Robots |title=Air-Cobot, le robot qui s'assure que vous ferez un bon vol ! |date=March–April 2016 |number=38 |pages=32–33 |url=https://issuu.com/planeterobots/docs/planete_robots_038-17p}}</ref><ref name=VideoAir-Cobot/><ref group=A name=DonadioMCG2016>{{harvnb|Donadio|Frejaville|Larnier|Vetault|2016|loc=MCG}}</ref>

== Other robotic inspection solutions ==

[[Image:Aircraft maintenance dashQ400.JPG|thumb|upright|left|[[Unmanned aerial vehicle|Drones]] can inspect the upper parts of an aircraft, such as the tail, and simplify maintenance checks.]]

=== European project Robair ===
The inspection robot of the European project Robair, funded from 2001 to 2003, is designed to move over the wings and [[fuselage]] of an aircraft to inspect rows of rivets. To move, the robot uses a flexible network of pneumatic [[suction cup]]s that adapt to the surface. It can inspect the lines of rivets using [[ultrasonic wave]]s, [[eddy current]]s and [[thermographic]] techniques, detecting loose rivets and cracks.<ref>{{in lang|fr}} {{Cite web |url=http://cordis.europa.eu/result/rcn/85695_fr.html |title=Robair, Inspection robotisée des aéronefs |publisher=[[European Commission]] |access-date=16 July 2016 |archive-date=11 October 2016 |archive-url=https://web.archive.org/web/20161011005707/http://cordis.europa.eu/result/rcn/85695_fr.html |url-status=dead }}</ref><ref>{{Cite web |url=http://www1.lsbu.ac.uk/esbe/mrndt/robair.shtml |title=Robair |publisher=[[London South Bank University]] |access-date=16 July 2016}}</ref><ref>{{Cite journal |first1=Jianzhong |last1=Shang |first2=Tariq |last2=Sattar |first3=Shuwo |last3=Chen |first4=Bryan |last4=Bridge |title=Design of a climbing robot for inspecting aircraft wings and fuselage |journal=Industrial Robot |volume=34 |number=6 |pages=495–502 |year=2007 |doi=10.1108/01439910710832093|url=http://researchopen.lsbu.ac.uk/2795/1/62.%20CLAWAR%202006-Design%20of%20a%20climbing%20robot.pdf }}</ref>

=== EasyJet drone ===
The airline [[EasyJet]] is interested in inspecting aircraft with drones and performed a first inspection in 2015. Equipped with laser sensors and a high-resolution camera, the drone performs an autonomous flight around the aeroplane, generates a three-dimensional image of the aircraft and transmits it to a technician. The operator can then navigate in this representation and zoom in to display a high-resolution picture of parts of the aircraft, and must visually diagnose the presence or absence of defects. This approach avoids the use of platforms to observe the upper parts of the aeroplane.<ref>{{in lang|fr}} {{Cite web |author=|url=https://humanoides.fr/2015/06/easy-jet-commence-a-utiliser-des-drones-pour-linspection-de-ses-avions/ |title=Easy Jet commence à utiliser des drones pour l'inspection de ses avions |website=humanoides.fr|date=8 June 2015 |access-date=16 July 2016 |archive-url=https://web.archive.org/web/20151012005810/http://humanoides.fr/2015/06/easy-jet-commence-a-utiliser-des-drones-pour-linspection-de-ses-avions/ |archive-date=12 October 2015 |url-status=dead }}</ref>

=== Donecle drone ===
[[Image:AFI 05 2017 Donecle drone 003.jpg|thumb|upright|right|[[Donecle]]'s autonomous [[Unmanned aerial vehicle|drone]] inspecting an aircraft.]]

Founded in 2015, [[Donecle]], a Toulouse start-up company, has also launched a drone-based approach, initially specialized in the detection of [[lightning strike]]s on aeroplanes.<ref>{{in lang|fr}} {{Cite journal |url=http://objectifnews.latribune.fr/innovation/start-up/2015-08-28/aeronautique-la-startup-donecle-invente-le-drone-anti-foudre.html |title=Aéronautique : la startup Donecle invente le drone anti-foudre |author=Florine Galéron |date=28 May 2015 |journal=Objectif News, la Tribune |access-date=16 July 2016}}</ref><ref name=Donecle>{{in lang|fr}} {{Cite journal |author=Arnaud Devillard |url=http://www.sciencesetavenir.fr/high-tech/20160420.OBS8889/des-drones-pour-inspecter-des-avions.html |title=Des drones pour inspecter des avions |journal=Sciences et Avenir |date=20 April 2016 |access-date=16 July 2016 |url-status=dead |archive-url=https://web.archive.org/web/20160808133513/http://www.sciencesetavenir.fr/high-tech/20160420.OBS8889/des-drones-pour-inspecter-des-avions.html |archive-date=8 August 2016}}</ref> Such an inspection, performed by five people equipped with harnesses and platforms, usually takes about eight hours. The immobilization of the aircraft and the staff is costly for the airlines, estimated at $10,000 per hour. The solution proposed by the start-up takes twenty minutes.<ref name=Donecle/>

Donecle uses a swarm of drones equipped with laser sensors and micro-cameras. The algorithms for automatic detection of defects, trained on an existing image database with [[machine learning]] software, are able to identify various elements: texture irregularities, [[pitot probe]]s, rivets, openings, text, defects, [[corrosion]] and oil stains. A damage report is sent to the operator's touch pad with each area of interest and its proposed classification, together with a [[probability]] percentage. After reviewing the images, the verdict is pronounced by a qualified inspector.<ref name=Donecle/>
{{Clear}}

== Project continuation ==
In 2015, in an [[interview]] given to the French weekly magazine ''[[Air & Cosmos]]'', Jean-Charles Marcos, [[chief executive officer]] (CEO) of Akka Research, explained that, once developed and marketed, Air-Cobot should cost between 100,000 and 200,000 euros. It could meet civilian needs in [[nondestructive testing]] as well as military ones.<ref name=AirCosmos/> A possible continuation of the project could be the use of the robot on aircraft larger than the [[Airbus A320]]. The CEO also revealed that Akka Technologies planned to work on a duo of robots for inspection: the same mobile platform for the lower parts, and a [[Unmanned aerial vehicle|drone]] for the upper parts. If funding were allocated, this second phase would take place during the period 2017–2020.<ref name=AirCosmos/>

At the [[Singapore Airshow]] in February 2016, Airbus Group presented Air-Cobot and its use in its vision of the hangar of the future.<ref name=VideoHangarFuture/> The same month, the [[Singapore government]] enlisted Airbus Group to help local [[maintenance, repair, and operations]] providers stay competitive against cheaper neighbouring countries such as [[Indonesia]], [[Thailand]] and the [[Philippines]]. To improve [[productivity]], Airbus Group launched, in October 2016, a [[testbed]] hangar where [[new technologies]] can be tested. Upon entering the hangar, cameras study the aircraft to detect damage. Mobile robots, such as the one of the Air-Cobot project, and drones, such as the one of the Aircam project, carry out more detailed inspections.<ref>{{Cite web |title=Pimp my Hangar: Excelling in MRO |url=http://www.airbusgroup.com/int/en/news-media/corporate-magazine/Forum-89/Hangar-of-the-future.html |website=airbusgroup.com |publisher=[[Airbus]] |access-date=21 December 2016 |url-status=dead |archive-url=https://web.archive.org/web/20161221162747/http://www.airbusgroup.com/int/en/news-media/corporate-magazine/Forum-89/Hangar-of-the-future.html |archive-date=21 December 2016}}</ref>

During the 14th International Conference on Remote Engineering and Virtual Instrumentation in March 2017, Akka Research Toulouse, one of the centers for [[research and development]] of Akka Technologies, presented its vision of the [[airport]] of the future.<ref group=A name=DonadioREV2017/> In addition to Air-Cobot, a previous step in this research axis is Co-Friend, an intelligent [[video surveillance]] system to monitor and improve airport operations.<ref group=A name=DonadioREV2017/><ref>{{in lang|fr}} {{Cite journal |author=Éric Parisot |url=https://www.usine-digitale.fr/article/co-friend-le-systeme-d-analyse-d-images-qui-reduit-les-temps-d-immobilisation-des-avions.N199908 |title=Co-Friend, le système d'analyse d'images qui réduit les temps d'immobilisation des avions |journal=Usine Digitale |date=21 June 2013 |access-date=24 February 2018}}</ref> Future research will focus on the management of these operations, [[autonomous vehicles]], [[non-destructive testing]] and [[human-machine interaction]]s to increase efficiency and security at airports.<ref group=A name=DonadioREV2017>{{harvnb|Donadio|Frejaville|Larnier|Vetault|2017|loc=REV}}</ref> From August 2017, the robot has visited [[Aeroscopia]], an aeronautics museum in [[Blagnac]], once a month. The researchers of the project take advantage of the collection to test the robot and acquire data on other aircraft models such as the [[Airbus A400M]], [[Airbus A300]] and [[Sud-Aviation SE 210 Caravelle]].<ref name=Aeroscopia>{{in lang|fr}} {{Cite web |url=http://www.musee-aeroscopia.fr/fr/actualites/le-mus%C3%A9e-accueille-le-projet-air-cobot |title=Le Musée accueille le projet AIR-COBOT |website=musee-aeroscopia.fr |editor=Aeroscopia |date=August 2017 |access-date=24 February 2018 |archive-date=14 October 2017 |archive-url=https://web.archive.org/web/20171014133559/http://www.musee-aeroscopia.fr/fr/actualites/le-mus%C3%A9e-accueille-le-projet-air-cobot |url-status=dead }}</ref>

== Communications ==
[[Image:AFI 03 2016 Air-Cobot under an Airbus A320.png|thumb|Air-Cobot under the belly of an [[Airbus A320]] in a hangar.<ref group=A name=FrejavilleRFIA2016/>]]
On 23 October 2014, a patent was filed by [[Airbus]].<ref name=espacenet>{{cite web |url=http://worldwide.espacenet.com/publicationDetails/biblio?CC=WO&NR=2015059241&KC=&locale=en_EP&FT=E |title=Espacenet – Bibliographic data – Collaborative robot for visually inspecting an aircraft |publisher=worldwide.espacenet.com |access-date=1 June 2016}}</ref> From 2014 to 2016, the robot was presented at five exhibitions, including [[Paris Air Show]] 2015,<ref name=lcitf1/><ref>{{in lang|fr}} {{Cite journal |author1=Juliette Raynal |author2=Jean-François Prevéraud |url=http://www.industrie-techno.com/bourget-2015-les-dix-rendez-vous-technos-a-ne-pas-louper.38838 |title=Bourget 2015 : les dix rendez-vous technos à ne pas louper |journal=Industrie et Technologies |date=15 June 2015 |access-date=16 July 2016 |archive-date=5 July 2016 |archive-url=https://web.archive.org/web/20160705192903/http://www.industrie-techno.com/bourget-2015-les-dix-rendez-vous-technos-a-ne-pas-louper.38838 |url-status=dead }}</ref><ref>{{in lang|fr}} {{Cite web |url=http://www.mauricericci.com/akka-technologies-au-salon-du-bourget/ |title=Akka Technologies au Salon du Bourget |publisher=Maurice Ricci |date=21 June 2015 |access-date=16 July 2015 |url-status=dead |archive-url=https://web.archive.org/web/20160404125535/http://www.mauricericci.com/akka-technologies-au-salon-du-bourget/ |archive-date=4 April 2016}}</ref> and [[Singapore Airshow]] 2016.<ref name=VideoHangarFuture/><ref name="apex">{{cite web|url=http://apex.aero/2016/02/24/singapore-airshow-2016-trends-emerging-technologies |title=Singapore Airshow 2016 Trends: Emerging Technologies Take Off – APEX &#124; Airline Passenger Experience|publisher=apex.aero|access-date=1 June 2016}}</ref> The research developed in the project was presented at eighteen conferences, and twenty-one scientific articles were published: seventeen in [[conference proceeding]]s and four in journals.<ref>{{Cite web |language=fr |url=https://aircobot.akka.eu/?q=page/communications |title=Communications du projet Air-Cobot |website=aircobot.akka.eu |publisher=[[Akka Technologies]] |access-date=14 July 2016 |archive-url=https://web.archive.org/web/20160811205744/https://aircobot.akka.eu/?q=page%2Fcommunications |archive-date=11 August 2016 |url-status=dead}}</ref> Some of the publications are centered on navigation and/or inspection by Air-Cobot, while the rest focus on specific [[numerical method]]s or [[Electronic hardware|hardware]] solutions related to the issues of the project. During the 2016 international conference {{Lang|en|Machine Control and Guidance}} (MCG), the prize for the best final application was awarded to the authors of the publication ''{{Lang|en|Human-robot collaboration to perform aircraft inspection in working environment}}''.<ref>{{cite web |language=en |url=https://mcg2016.irstea.fr/wp-content/uploads/2017/05/MCG2016_bestmcg2016finalapplicationpaper.pdf |title=Best MCG2016 Final Application Award |website=mcg2016.irstea.fr |publisher=Machine Control and Guidance |date=October 2016 |access-date=22 February 2020}}</ref>

On 17 April 2015, Airbus Group distributed a project presentation video, made by the communication agency Clipatize, on its YouTube channel.<ref name=VideoAir-Cobot>{{YouTube|8VwkQFIo7fc|Air-Cobot}}</ref><ref>{{Cite web |url=http://www.clipatize.com/case-study-folder/airbus_aircobot_3d_video_case_study/ |title=AirCobot – Introducing Smart Robots for Aircraft Inspections |website=clipatize.com |publisher=Clipatize |access-date=15 August 2016 |archive-url=https://web.archive.org/web/20160806085835/http://www.clipatize.com/case-study-folder/airbus_aircobot_3d_video_case_study/ |archive-date=6 August 2016 |url-status=dead }}</ref> On 25 September 2015, Toulouse Métropole broadcast a promotional video on its YouTube channel, presenting the Toulouse metropolis as an attractive ecosystem able to build the future and highlighting its international visibility. The Air-Cobot demonstrator was chosen to illustrate the robotics research of the metropolis.<ref>{{in lang|fr}} {{YouTube|ff9RC6foz7Q|Toulouse métropole, construire le futur}}</ref> Located at the [[Laboratoire d'analyse et d'architecture des systèmes]] during development, the robot was regularly demonstrated by the researchers and engineers working on the project to visitors (external researchers, industrial partners, or students); it was also demonstrated to the general public during the 2015 Fête de la science.<ref name=FeteDeLaScience>{{Cite conference |language=fr |title=Air-Cobot, le robot d'assistance aux inspections des aéronefs |conference=Programme de la fête de la science |date=2015 |url=https://www.laas.fr/public/sites/www.laas.fr.public/files/reserved/comm/pdf/LivretFDS2015_light.pdf |access-date=17 July 2016}}</ref> On 17 February 2016, Airbus Group broadcast a YouTube video presenting its vision of the hangar of the future, in which it plans to use Air-Cobot.<ref name=VideoHangarFuture>{{YouTube|0aCGptV5sAw|Innovations in Singapore: the Hangar of the Future}}</ref>

== See also ==
{{Commons category|Air-Cobot}}
* [[Aircraft maintenance checks]]
* [[Aerospace Valley]]
* [[Donecle]]

== Notes and references ==

=== Research publications of the project ===
{{Reflist|group=A}}

==== Proceedings ====
* {{Cite journal |first1=Marcus |last1=Futterlieb |first2=Viviane |last2=Cadenat |first3=Thierry |last3=Sentenac |title=A navigational framework combining Visual Servoing and spiral obstacle avoidance techniques |journal=Informatics in Control, Automation and Robotics (ICINCO), 2014 11th International Conference on, Vienna |pages=57–64 |year=2014 |url=https://hal.archives-ouvertes.fr/hal-01354855/document}}
* {{Cite journal |first1=Jorge Othón |last1=Esparza-Jiménez |first2=Michel |last2=Devy |first3=José Luis |last3=Gordillo |title=EKF-based SLAM fusing heterogeneous landmarks |journal=17th International Conference on Information Fusion (FUSION) |year=2014 |pages=1–8 |url=https://hal.archives-ouvertes.fr/hal-01354861/document}}
* {{Cite journal |first1=Daniel Törtei |last1=Tertei |first2=Jonathan |last2=Piat |first3=Michel |last3=Devy |title=FPGA design and implementation of a matrix multiplier based accelerator for 3D EKF SLAM |journal=International Conference on ReConFigurable Computing and FPGAs (ReConFig14) |year=2014 |pages=1–6 |url=https://hal.archives-ouvertes.fr/hal-01354873/document}}
* {{Cite journal |first1=Igor |last1=Jovancevic |first2=Jean-José |last2=Orteu |first3=Thierry |last3=Sentenac |first4=Rémi |last4=Gilblas |editor1-first=Fabrice |editor1-last=Meriaudeau |editor2-first=Olivier |editor2-last=Aubreton |title=Automated visual inspection of an airplane exterior |journal=Proceedings of SPIE |volume=9534 |pages=95340Y |date=April 2015a |url=https://hal.archives-ouvertes.fr/hal-01351735/document |doi=10.1117/12.2182811 |series=Twelfth International Conference on Quality Control by Artificial Vision 2015|bibcode=2015SPIE.9534E..0YJ |s2cid=29158717 }}
* {{in lang|fr}} {{Cite journal |first1=Igor |last1=Jovancevic |first2=Jean-José |last2=Orteu |first3=Thierry |last3=Sentenac |first4=Rémi |last4=Gilblas |title=Inspection d'un aéronef à partir d'un système multi-capteurs porté par un robot mobile |journal=Actes du 14ème Colloque Méthodes et Techniques Optiques pour l'Industrie |date=November 2015b |url=https://hal.archives-ouvertes.fr/hal-01350898/document}}
* {{Cite journal |first1=Ali |last1=Alhamwi |first2=Bertrand |last2=Vandeportaele |first3=Jonathan |last3=Piat |title=Real Time Vision System for Obstacle Detection and Localization on FPGA |journal=Computer Vision Systems – 10th International Conference, ICVS 2015 |pages=80–90 |year=2015 |url=https://hal.archives-ouvertes.fr/hal-01355008/document}}
* {{Cite conference |first1=Igor |last1=Jovancevic |first2=Ilisio |last2=Viana |first3=Jean-José |last3=Orteu |first4=Thierry |last4=Sentenac |first5=Stanislas |last5=Larnier |title=Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods |chapter=Matching CAD Model and Image Features for Robot Navigation and Inspection of an Aircraft |journal=International Conference on Pattern Recognition Applications and Methods |pages=359–366 |date=February 2016 |doi=10.5220/0005756303590366 |isbn=978-989-758-173-1 |chapter-url=https://hal.archives-ouvertes.fr/hal-01353317/document|url=https://hal.archives-ouvertes.fr/hal-01353317/file/article_ICPRAM2016.pdf }}
* {{Cite journal |first1=Igor |last1=Jovancevic |first2=Al |last2=Arafat |first3=Jean-José |last3=Orteu |first4=Thierry |last4=Sentenac |title=Airplane tire inspection by image processing techniques |journal=5th Mediterranean Conference on Embedded Computing |year=2016 |url=https://hal.archives-ouvertes.fr/hal-01351750/document}}
* {{in lang|fr}} {{Cite journal |first1=Jérémy |last1=Frejaville |first2=Stanislas |last2=Larnier |first3=Stéphane |last3=Vetault |title=Localisation à partir de données laser d'un robot naviguant autour d'un avion |journal=Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle |year=2016 |url=https://hal.archives-ouvertes.fr/hal-01333650/document}}
* {{in lang|fr}} {{Cite journal |first1=Tanguy |last1=Villemot |first2=Stanislas |last2=Larnier |first3=Stéphane |last3=Vetault |title=Détection d'amers visuels pour la navigation d'un robot autonome autour d'un avion et son inspection |journal=Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle |year=2016 |url=https://hal.archives-ouvertes.fr/hal-01333651/document}}
* {{Cite journal |first1=Frédéric |last1=Donadio |first2=Jérémy |last2=Frejaville |first3=Stanislas |last3=Larnier |first4=Stéphane |last4=Vetault |title=Human-robot collaboration to perform aircraft inspection in working environment |journal=Proceedings of 5th International Conference on Machine Control and Guidance |year=2016 |url=https://mcg2016.irstea.fr/wp-content/uploads/2017/05/MCG2016_paper_42.pdf}}
* {{Cite book |first1=Mustapha |last1=Lakrouf |first2=Stanislas |last2=Larnier |first3=Michel |last3=Devy |first4=Nouara |last4=Achour |title=Proceedings of the 3rd International Conference on Mechatronics and Robotics Engineering |chapter=Moving Obstacles Detection and Camera Pointing for Mobile Robot Applications |chapter-url=https://hal.archives-ouvertes.fr/hal-01579420/document |year=2017|pages=57–62 |doi=10.1145/3068796.3068816 |isbn=9781450352802 |s2cid=2361994 }}
* {{Cite journal |first1=Frédéric |last1=Donadio |first2=Jérémy |last2=Frejaville |first3=Stanislas |last3=Larnier |first4=Stéphane |last4=Vetault |title=Artificial intelligence and collaborative robot to improve airport operations |journal=Proceedings of 14th International Conference on Remote Engineering and Virtual Instrumentation |year=2017}}
* {{Cite book |first1=Marie-Anne |last1=Bauda |first2=Cécile |last2=Bazot |first3=Stanislas |last3=Larnier |title=2017 IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics (ECMSM) |chapter=Real-time ground marking analysis for safe trajectories of autonomous mobile robots |pages=1–6 |year=2017|doi=10.1109/ECMSM.2017.7945887 |isbn=978-1-5090-5582-1 |s2cid=25210956 }}
* {{Cite book |first1=Javier Ramirez |last1=Leiva |first2=Tanguy |last2=Villemot |first3=Guillaume |last3=Dangoumeau |first4=Marie-Anne |last4=Bauda |first5=Stanislas |last5=Larnier |title=2017 IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics (ECMSM) |chapter=Automatic visual detection and verification of exterior aircraft elements |pages=1–5 |year=2017|doi=10.1109/ECMSM.2017.7945885 |isbn=978-1-5090-5582-1 |s2cid=9052556 }}
* {{Cite journal |first1=Marie-Anne |last1=Bauda |first2=Alex |last2=Grenwelge |first3=Stanislas |last3=Larnier |title=3D scanner positioning for aircraft surface inspection |journal=Proceedings of European Congress Embedded Real Time Software and Systems |year=2018 |url=https://www.erts2018.org/uploads/program/ERTS_2018_paper_97.pdf |access-date=24 February 2018 |archive-date=8 February 2018 |archive-url=https://web.archive.org/web/20180208064314/https://www.erts2018.org/uploads/program/ERTS_2018_paper_97.pdf |url-status=dead }}
* {{Cite journal |first1=Dimitri |last1=Leca |first2=Viviane |last2=Cadenat |first3=Thierry |last3=Sentenac |first4=Adrien |last4=Durand-Petiteville |first5=Frédéric |last5=Gouaisbaut |first6=Emile |last6=Le Flécher |title=Sensor-based Obstacles Avoidance Using Spiral Controllers for an Aircraft Maintenance Inspection Robot |journal=Proceedings of European Control Conference |pages=2083–2089 |year=2019 |url=https://www.researchgate.net/publication/335198917}}

==== Journal articles ====
* {{Cite journal |first1=Igor |last1=Jovancevic |first2=Stanislas |last2=Larnier |first3=Jean-José |last3=Orteu |first4=Thierry |last4=Sentenac |title=Automated exterior inspection of an aircraft with a pan-tilt-zoom camera mounted on a mobile robot |journal=Journal of Electronic Imaging |volume=24 |issue=6 |pages=061110 |date=November 2015 |url=https://hal.archives-ouvertes.fr/hal-01351008/document |doi=10.1117/1.JEI.24.6.061110|bibcode=2015JEI....24f1110J |s2cid=29167101 }}
* {{Cite journal |first1=Jorge Othón |last1=Esparza-Jiménez |first2=Michel |last2=Devy |first3=José Luis |last3=Gordillo |title=EKF-based SLAM fusing heterogeneous landmarks |journal=Sensors |volume=16 |pages=489 |number=4 |year=2016 |url=https://hal.archives-ouvertes.fr/hal-01354880/document| doi = 10.3390/s16040489 |pmid=27070602 |pmc=4851003 |doi-access=free }}
* {{Cite journal |first1=Daniel Törtei |last1=Tertei |first2=Jonathan |last2=Piat |first3=Michel |last3=Devy |title=FPGA design of EKF block accelerator for 3D visual SLAM |journal=Computers and Electrical Engineering |year=2016 |url=https://hal.archives-ouvertes.fr/hal-01354883/document}}
* {{Cite journal |language=fr |first1=Igor |last1=Jovancevic |first2=Huy-Hieu |last2=Pham |first3=Jean-José |last3=Orteu |first4=Rémi |last4=Gilblas |first5=Jacques |last5=Harvent |first6=Xavier |last6=Maurice |first7=Ludovic |last7=Brèthes |title=Détection et caractérisation de défauts de surface par analyse des nuages de points 3D fournis par un scanner |journal=Instrumentation, Mesure, Métrologie, Lavoisier |year=2017 |volume=16 |pages=261–282 |url=https://hal.archives-ouvertes.fr/hal-01660998/document}}

==== PhD thesis reports ====
* {{cite book |last=Jovancevic |first=Igor |date=2016 |title=Exterior inspection of an aircraft using a Pan-Tilt-Zoom camera and a 3D scanner moved by a mobile robot: 2D image processing and 3D point cloud analysis |url=https://tel.archives-ouvertes.fr/tel-01687831/document |publisher=École nationale supérieure des mines d'Albi-Carmaux}}
* {{cite book |last=Futterlieb |first=Marcus |date=2017 |title=Vision based navigation in a dynamic environment |url=https://hal.laas.fr/tel-01624233/document |publisher=[[Université Paul Sabatier]]}}

=== Other references ===
{{Reflist|30em}}


== External links ==
<!-- Per [[WP:ELMINOFFICIAL]], choose one official website only -->
* {{Official website|https://aircobot.akka.eu/}}
* [https://www.akka-technologies.com/en/innovation/projects/aircobot Air-Cobot] {{Webarchive|url=https://web.archive.org/web/20160701111942/https://www.akka-technologies.com/en/innovation/projects/aircobot |date=1 July 2016 }}
* [https://www.akka-technologies.com/ Akka Technologies]

{{Good article}}


[[Category:Industrial robots]]


Since the beginning of the project in 2013, the Air-Cobot robot has been dedicated to inspecting the lower parts of an aircraft. A continuation of the project envisages coupling it with a drone to inspect an aircraft's upper parts. In October 2016, Airbus Group launched its research project on the hangar of the future in Singapore; the robots from the Air-Cobot and Aircam projects are included in it.

== Project description ==

=== Objectives ===

Launched in January 2013,[1] the project is part of the Interministerial Fund program of Aerospace Valley, a business cluster in southwestern France.[2] With a budget of over one million euros,[3] Air-Cobot aims to develop an innovative collaborative mobile robot, autonomous in its movements and able to perform the inspection of an aircraft with nondestructive testing sensors during preflight or during maintenance operations in a hangar.[2][4] Testing has been performed at the premises of Airbus and Air France Industries.[5]

=== Partners ===
Air-Cobot has been tested on Airbus A320s in the premises of Airbus and Air France Industries.[5]

The project leader is Akka Technologies. There are two academic partners; Akka Technologies and four other companies make up the five industrial partners.[6]

==== Academic partners ====

==== Industrial partners ====

=== Project finance ===

Project finance is provided by the Banque publique d'investissement, the Aquitaine Regional Council, the Pyrénées-Atlantiques Departmental Council, the Midi-Pyrénées Regional Council and the European Union.[12]

=== Expected benefits ===

Aircraft are inspected during maintenance operations either outdoors on an airport between flights, or in a hangar for longer-duration inspections. These inspections are conducted mainly by human operators, visually and sometimes using tools to assess defects.[A 1] The project aims to improve inspections of aircraft and their traceability. A database dedicated to each aircraft type, containing images and three-dimensional scans, is to be updated after each maintenance. This makes it possible, for example, to assess the propagation of a crack.[4][13]

The human operator's eyes fatigue over time while an automatic solution ensures reliability and repeatability of inspections. The decrease in time taken for inspections is a major objective for aircraft manufacturers and airlines. If maintenance operations are faster, this will optimize the availability of aircraft and reduce maintenance operating costs.[4][13]

== Robot equipment ==
Air-Cobot in a hangar of Air France Industries.[A 1]

All electronic equipment is carried by the 4MOB mobile platform manufactured by Sterela. The off-road platform, equipped with four-wheel drive, can move at a speed of 2 metres per second (7.2 kilometres per hour; 4.5 mph).[11] Its lithium-ion battery allows an operating time of eight hours. Two obstacle-detection bumpers, located at the front and at the rear, stop the platform if they are compressed.[11]

The cobot weighs 230 kilograms (507 lb). It has two computers, one running Linux for the autonomous navigation module and the other Windows for the non-destructive testing module. The robot is equipped with several sensors. The pan-tilt-zoom camera manufactured by Axis Communications and Eva 3D scanner manufactured by Artec 3D are dedicated to inspection. The sensors for navigation are an inertial measurement unit; two benches, each equipped with two PointGrey cameras; two Hokuyo laser range finders; and a GPS unit developed by M3 Systems that allows for geofencing tasks in outdoor environments.[3][7]

== Autonomous navigation ==

The autonomous navigation of the Air-Cobot robot has two phases. The first, navigation in the airport or the factory, allows the robot to move close to the aircraft. The second, navigation around the aircraft, allows the robot to position itself at control points referenced in the aircraft virtual model. In addition, the robot must insert itself in a dynamic environment where humans and vehicles are moving; to address this problem, it has an obstacle avoidance module. Many navigation algorithms run on the robot concurrently under real-time constraints, and research was conducted on optimizing their computing time.[citation needed]

=== Navigation in the airport or the factory ===

In an outdoor environment, the robot is able to go to the inspection site by localizing through Global Positioning System (GPS) data. The GPS device developed by M3 Systems allows geofencing. At the airport, the robot operates in dedicated navigation corridors respecting speed limits. Alerts are sent to the operator if the robot enters a prohibited area or exceeds a given speed.[10][A 2]

Another algorithm based on computer vision provides, in real time, lane-marking detection. When visible, painted lanes on the ground provide complementary data to the positioning system for safer trajectories.[A 3] In an indoor environment, or an outdoor environment where GPS information is not available, the cobot can be switched to a follower mode in which it moves behind the human operator and follows him or her to the aircraft to inspect.[14][A 2]
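The geofencing behaviour described above can be sketched as follows. This is illustrative code, not from the project; the corridor bounds, the speed limit and the alert wording are hypothetical.

```python
# Illustrative sketch of a geofencing check: alert the operator when the
# robot leaves its navigation corridor or exceeds a speed limit.
def geofence_alerts(x, y, speed, corridor=((0.0, 50.0), (0.0, 4.0)), speed_limit=2.0):
    """Return the list of alerts to send to the operator.

    corridor: ((x_min, x_max), (y_min, y_max)) of the navigation corridor, in metres.
    speed_limit: maximum allowed speed in metres per second.
    """
    alerts = []
    (x_min, x_max), (y_min, y_max) = corridor
    if not (x_min <= x <= x_max and y_min <= y <= y_max):
        alerts.append("outside corridor")
    if speed > speed_limit:
        alerts.append("speed limit exceeded")
    return alerts
```

A supervision loop would call such a check on every GPS fix and forward the returned alerts to the operator's interface.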

=== Navigation around the aircraft ===

To perform the inspection, the robot has to navigate around the aircraft and get to the checkpoints called up in the aircraft virtual model. The position of the aircraft in the airport or factory is not known precisely; the cobot needs to detect the aircraft in order to know its position and orientation relative to the aircraft. To do this, the robot is able to locate itself, either with the laser data from its laser range finders,[A 4] or with image data from its cameras.[A 1][A 5]

Near the aircraft, a three-dimensional point cloud is acquired by changing the orientation of the laser scanning sensors fixed on pan-tilt units. After filtering the data to remove the floor and insufficiently large point clusters, a registration technique with the model of the aircraft is used to estimate the static orientation of the robot. The robot then moves and maintains this orientation estimate by considering its wheel odometry, its inertial unit and visual odometry.[A 4]
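The core of such a registration step can be sketched with the Kabsch algorithm, which estimates the rigid transform aligning model points to acquired points. This is a simplified illustration, not the project's method: a full registration technique such as ICP must also find the point correspondences, which are assumed known here.

```python
import numpy as np

def rigid_transform(model_pts, scene_pts):
    """Estimate rotation R and translation t such that R @ model + t ≈ scene.

    model_pts, scene_pts: (n, 3) arrays of corresponding points.
    """
    m_c = model_pts.mean(axis=0)                  # centroids
    s_c = scene_pts.mean(axis=0)
    H = (model_pts - m_c).T @ (scene_pts - s_c)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = s_c - R @ m_c
    return R, t
```

Given the transform, the robot's position and orientation relative to the aircraft model follow directly.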

Air-Cobot can estimate its position relative to an aircraft by using visual landmarks on the fuselage.[A 5]

Laser data are also used horizontally in two dimensions. An algorithm provides a real-time position estimation of the robot when enough elements from the landing gears and engines are visible. A confidence index is calculated based on the number of items collected by lasers. If good data confidence is achieved, the position is updated. This mode is particularly used when the robot moves beneath the aircraft.[A 4]
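The confidence-index logic described above can be sketched as follows. This is illustrative, not project code: the use of a simple detection ratio as the confidence measure and the threshold value are assumptions.

```python
# Sketch of a confidence-gated position update: the 2D laser position
# estimate is trusted only when enough of the expected elements
# (e.g. landing gears, engines) are seen in the scan.
def update_position(current_pos, laser_pos, n_detected, n_expected, threshold=0.6):
    """Return (position, confidence); keep laser_pos only above the threshold."""
    confidence = n_detected / n_expected
    if confidence >= threshold:
        return laser_pos, confidence
    return current_pos, confidence
```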

For visual localization, the robot estimates its position relative to the aircraft using visual elements of the aircraft (doors, windows, tires, static ports etc.). As the robot moves, these visual elements are extracted from a three-dimensional virtual model of the aircraft and projected into the image plane of the cameras. The projected shapes are used for pattern recognition to detect those visual elements.[A 5] The other detection method used is based on the extraction of features with a Speeded Up Robust Features (SURF) approach; a pairing is performed between images of each element to be detected and the current scene.[A 1]
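The pairing step can be sketched as nearest-neighbour descriptor matching with a ratio test (a standard technique for SURF-like features, shown here with plain vectors standing in for real descriptors; the ratio value is an assumption):

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Return index pairs (i, j) of accepted matches from desc_a to desc_b.

    A match is accepted only when the nearest descriptor in desc_b is
    clearly closer than the second nearest (Lowe's ratio test).
    """
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

In practice the descriptors would come from a feature extractor run on the reference image of the element and on the current camera image.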

By detecting and tracking visual landmarks, in addition to estimating its position relative to the aircraft, the robot can perform visual servoing.[A 6] Research in vision is also conducted on simultaneous localization and mapping (SLAM).[A 7][A 8] A fusion of the information from the two acquisition methods, laser and vision, is being considered, as is an artificial intelligence arbitrating between the various localization methods.[A 4][A 1]

=== Obstacle avoidance ===

In both navigation modes, Air-Cobot is also able to detect, track, identify and avoid obstacles that are in its way. The laser data from laser range sensors and visual data from the cameras can be used for detection, monitoring and identification of the obstacles. The detection and monitoring are better in the two-dimensional laser data, while identification is easier in the images from the cameras; the two methods are complementary. Information from laser data can be used to delimit work areas in the image.[A 6][A 9][A 10]

The robot has several possible responses to any obstacles. These will depend on its environment (navigation corridor, tarmac area without many obstacles, cluttered indoor environment etc.) at the time of the encounter with an obstacle. It can stop and wait for a gap in traffic, or avoid an obstacle by using a technique based on a spiral, or perform path planning trajectories.[A 6][A 10]
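The decision logic above can be sketched as a small policy function. The environment names, the moving-obstacle rule and the action labels are illustrative assumptions, not the project's actual rules.

```python
# Illustrative sketch: choose a response to an obstacle from the
# environment type and whether the obstacle is moving.
def obstacle_response(environment, obstacle_moving):
    if environment == "navigation corridor":
        # No room to deviate: wait for a gap in traffic.
        return "stop and wait"
    if environment == "open tarmac":
        # Enough free space for the spiral-based avoidance technique.
        return "spiral avoidance"
    # Cluttered indoor environment: plan a new trajectory around the
    # obstacle, unless it is moving and may clear the way by itself.
    return "stop and wait" if obstacle_moving else "plan new trajectory"
```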

=== Computing time optimization ===

Given the number of navigation algorithms running simultaneously to provide all the information in real time, research has been conducted to improve the computation time of some numerical methods using field-programmable gate arrays.[A 11][A 12][A 13] The research focused on visual perception: the first part addressed simultaneous localization and mapping with an extended Kalman filter, which estimates the state of a dynamic system from a series of noisy or incomplete measurements;[A 11][A 13] the second addressed the localization and detection of obstacles.[A 12]
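The predict/update structure of the Kalman filter mentioned above can be illustrated in its simplest (linear, scalar) form. A real EKF-SLAM state holds the robot pose and landmark positions; this sketch, with assumed noise variances, tracks a single scalar position.

```python
def kalman_step(x, p, u, z, q=0.01, r=0.1):
    """One predict/update cycle of a scalar Kalman filter.

    x, p: state estimate and its variance
    u: motion (odometry) increment, z: position measurement
    q, r: process and measurement noise variances (illustrative values)
    """
    # Predict: apply the motion, inflate the uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Update: correct with the measurement, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

The extended variant linearizes nonlinear motion and measurement models around the current estimate; the matrix products involved are what the FPGA accelerator speeds up.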

== Non-destructive testing ==
Air-Cobot can inspect the blades of a turbofan engine.[A 14]

=== Image analysis ===

After positioning itself to perform a visual inspection, the robot acquires images with its pan-tilt-zoom camera. Several steps take place: pointing the camera, detecting the element to be inspected, repointing and zooming if needed, image acquisition and inspection. Image analysis is used to determine whether doors are open or closed, whether protections are present or absent on certain equipment, the state of the turbofan blades, and the wear of the landing gear tires.[A 14][A 15][A 16][A 17]

The detection uses pattern recognition of regular shapes (rectangles, circles, ellipses). The 3D model of the element to be inspected can be projected in the image plane for more complex shapes. The evaluation is based on indices such as the uniformity of segmented regions, convexity of their forms, or periodicity of the image pixels' intensity.[A 14]
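Two of the evaluation indices named above can be sketched on toy data. These formulas and thresholds are illustrative assumptions, not the project's actual metrics.

```python
def uniformity(pixels, max_std=10.0):
    """Uniformity index in [0, 1] for a segmented region: 1.0 when every
    pixel has the same intensity, falling to 0.0 as the spread grows."""
    n = len(pixels)
    mean = sum(pixels) / n
    std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    return max(0.0, 1.0 - std / max_std)

def is_convex(polygon):
    """True if the polygon (list of (x, y) vertices in order) is convex:
    all turns between consecutive edges have the same sign."""
    signs = []
    n = len(polygon)
    for i in range(n):
        ax, ay = polygon[i]
        bx, by = polygon[(i + 1) % n]
        cx, cy = polygon[(i + 2) % n]
        cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
        if cross != 0:
            signs.append(cross > 0)
    return all(signs) or not any(signs)
```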

The feature extraction using speeded up robust features (SURF) is also able to perform the inspection of certain elements having two possible states, such as pitot probes or static ports being covered or uncovered. A pairing is performed between images of the element to be inspected in its different states and the one present in the scene. For these simple items, analysis during navigation is possible and preferable because it saves time.[A 1][A 18]

Point cloud analysis


After positioning itself to perform a scanning inspection, the robot uses its pantograph to raise the 3D scanner to the level of the fuselage. A pan-tilt unit moves the scanning device to acquire the hull. By comparing the acquired data with the three-dimensional model of the aircraft, algorithms can diagnose any faults in the fuselage structure and provide information on their shape, size and depth.[15][A 19][A 20]
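The comparison step can be sketched as a deviation check. The real system compares the scan to the full 3-D CAD model of the aircraft; here a nominal cylindrical fuselage of known radius stands in for the model, so the geometry and tolerance are illustrative assumptions.

```python
def flag_deviations(points, axis_y, axis_z, radius, tol=0.005):
    """Return points whose radial distance from the fuselage axis
    deviates from the design radius by more than `tol` metres,
    together with the signed depth (negative = dent)."""
    flagged = []
    for (x, y, z) in points:
        r = ((y - axis_y) ** 2 + (z - axis_z) ** 2) ** 0.5
        if abs(r - radius) > tol:
            flagged.append(((x, y, z), r - radius))
    return flagged

# Two points on the nominal 2 m radius pass; a point sitting 2 cm
# below it is reported as a dent.
cloud = [(0.0, 2.0, 0.0), (1.0, 0.0, 2.0), (2.0, 1.98, 0.0)]
defects = flag_deviations(cloud, 0.0, 0.0, 2.0)
```

The signed depth is the kind of per-defect information (shape, size, depth) that the diagnosis reports.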

By moving the pan-tilt units of the laser range finders, it is also possible to obtain a three-dimensional point cloud. Registration between the aircraft model and the scene point cloud is already used in navigation to estimate the robot's static pose. Targeted acquisitions, simpler in terms of movement, are planned to verify the absence of chocks in front of the landing-gear wheels or the proper closing of engine-cowling latches.[A 4]
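In the simplest case, with known one-to-one correspondences, the least-squares rigid translation from model to scene is just the mean offset between paired points. Full registration algorithms such as ICP also estimate the rotation and the correspondences themselves; this minimal sketch omits those parts.

```python
def estimate_translation(model_pts, scene_pts):
    """Least-squares translation aligning model points to scene points,
    assuming known one-to-one correspondences (rotation omitted)."""
    n = len(model_pts)
    return tuple(sum(s[i] - m[i] for m, s in zip(model_pts, scene_pts)) / n
                 for i in range(3))

# A scene that is the model shifted by (0.5, -0.2, 1.0) metres.
model = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
scene = [(x + 0.5, y - 0.2, z + 1.0) for (x, y, z) in model]
t = estimate_translation(model, scene)
```

Once the pose is known, a targeted acquisition only needs to check a small, pre-located region (a wheel chock area, a cowling latch) rather than the whole aircraft.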

Human-robot collaboration


As the project name suggests, the mobile robot is a cobot, a collaborative robot. During navigation and inspection phases, a human operator accompanies the robot and can take control if necessary, add inspection tasks, note a defect that is not on the robot's checklist, or validate the results. In the case of pre-flight inspections, the diagnosis of the walk-around is sent to the pilot, who decides whether or not to take off.[7][14][A 21]

Other robotic inspection solutions

Drones can inspect the upper parts of the aircraft, such as the tail, and simplify maintenance checks.

European project Robair


The inspection robot of the European project Robair, funded from 2001 to 2003, is designed to mount on the wings and fuselage of an aircraft to inspect rows of rivets. To move, the robot uses a flexible network of pneumatic suction cups that adjust to the surface. It can inspect rivet lines with ultrasonic, eddy-current and thermographic techniques, detecting loose rivets and cracks.[16][17][18]

EasyJet drone


The airline EasyJet is interested in inspecting aircraft with drones and made a first inspection in 2015. Equipped with laser sensors and a high-resolution camera, the drone flies autonomously around the aeroplane. It generates a three-dimensional image of the aircraft and transmits it to a technician, who can then navigate this representation and zoom in to display a high-resolution picture of parts of the aircraft, visually diagnosing the presence or absence of defects. This approach avoids the use of platforms to observe the upper parts of the aeroplane.[19]

Donecle drone

Donecle's autonomous drone inspecting an aircraft.

Founded in 2015, Donecle, a Toulouse start-up company, has also launched a drone approach, initially specialized in the detection of lightning strikes on aeroplanes.[20][21] Performed by five people equipped with harnesses and platforms, this inspection usually takes about eight hours. The immobilization of the aircraft and the staff are costly for the airlines, an estimated $10,000 per hour. The solution proposed by the start-up takes twenty minutes.[21]

Donecle uses a swarm of drones equipped with laser sensors and micro-cameras. The algorithms for automatic defect detection, trained on an existing image database with machine-learning software, can identify various elements: texture irregularities, pitot probes, rivets, openings, text, defects, corrosion and oil stains. A damage report is sent to the operator's touch pad, with each area of interest and its proposed classification given a probability percentage. After reviewing the images, a qualified inspector pronounces the verdict.[21]

Project continuation


In 2015, in an interview with the French weekly magazine Air & Cosmos, Jean-Charles Marcos, chief executive officer (CEO) of Akka Research, explained that, once developed and marketed, Air-Cobot should cost between 100,000 and 200,000 euros. It could meet civilian needs in nondestructive testing as well as military ones.[3] A possible continuation of the project could be the use of the robot on aircraft larger than the Airbus A320. The CEO also revealed that Akka Technologies planned to work on a duo of inspection robots: the same mobile platform for the lower parts, and a drone for the upper parts. If funding were allocated, this second phase would take place during the period 2017–2020.[3]

At the Singapore Airshow in February 2016, Airbus Group presented Air-Cobot and its use in its vision of the hangar of the future.[22] The same month, the Singapore government enlisted Airbus Group to help local maintenance, repair and operations providers stay competitive against neighbouring countries such as Indonesia, Thailand and the Philippines, which are cheaper. To improve productivity, Airbus Group launched, in October 2016, a testbed hangar where new technologies can be tested. Upon entering the hangar, cameras study the aircraft to detect damage. Mobile robots, such as that of the Air-Cobot project, and drones, such as that of the Aircam project, carry out more detailed inspections.[23]

During the 14th International Conference on Remote Engineering and Virtual Instrumentation in March 2017, Akka Research Toulouse, one of the research and development centres of Akka Technologies, presented its vision of the airport of the future.[A 2] In addition to Air-Cobot, a previous step in this research axis is Co-Friend, an intelligent video-surveillance system to monitor and improve airport operations.[A 2][24] Future research will focus on the management of these operations, autonomous vehicles, non-destructive testing and human-machine interactions to increase efficiency and security at airports.[A 2] Since August 2017, the robot has visited Aeroscopia, an aeronautics museum in Blagnac, once a month. The researchers of the project take advantage of the collection to test the robot and acquire data on other aircraft models, such as the Airbus A400M, Airbus A300 and Sud-Aviation SE 210 Caravelle.[25]

Communications

Air-Cobot under the belly of an Airbus A320 in a hangar.[A 4]

On 23 October 2014, a patent was filed by Airbus.[26] From 2014 to 2016, the robot was presented at five exhibitions, including the Paris Air Show 2015[1][27][28] and the Singapore Airshow 2016.[22][29] The research developed in the project was presented at eighteen conferences. Twenty-one scientific articles were published: seventeen in conference proceedings and four in journals.[30] Some of the publications centre on navigation and/or inspection by Air-Cobot, while the rest focus on specific numerical methods or hardware solutions related to the issues of the project. During the 2016 international conference Machine Control and Guidance (MCG), the prize for best final application was awarded to the authors of the publication Human-robot collaboration to perform aircraft inspection in working environment.[31]

On 17 April 2015, Airbus Group distributed a project presentation video, made by the communication agency Clipatize, on its YouTube channel.[14][32] On 25 September 2015, Toulouse Métropole broadcast a promotional video on its YouTube channel, presenting the metropolis as an attractive ecosystem able to build the future and highlighting its international visibility; the Air-Cobot demonstrator was chosen to illustrate the robotics research of the metropolis.[33] Located at the Laboratoire d'analyse et d'architecture des systèmes during development, the robot was regularly demonstrated by researchers or engineers working on the project to visitors (external researchers, industrial partners, or students); it was also demonstrated to the general public during the 2015 Fête de la science.[34] On 17 February 2016, Airbus Group broadcast a YouTube video presenting its vision of the hangar of the future, in which it plans to use Air-Cobot.[22]

See also


Notes and references


Research publications of the project


Proceedings


Journal articles


PhD thesis reports


Other references

  1. ^ a b (in French) Xavier Martinage (17 June 2015). "Air-Cobot : le robot dont dépendra votre sécurité". lci.tf1.fr. La Chaîne Info. Archived from the original on 3 January 2016. Retrieved 12 July 2016.
  2. ^ a b (in French) "Air-Cobot : un nouveau mode d'inspection visuelle des avions". competitivite.gouv.fr. Les pôles de compétitivité. Archived from the original on 11 October 2016. Retrieved 12 July 2016.
  3. ^ a b c d e f (in French) Olivier Constant (11 September 2015). "Le projet Air-Cobot suit son cours". Air et Cosmos (2487). Retrieved 12 July 2016.
  4. ^ a b c (in French) "Rapport d'activité 2013–2014 de l'Aerospace Valley" (PDF). aerospace-valley.com. Aerospace Valley. Archived from the original (PDF) on 24 September 2016. Retrieved 12 July 2016.
  5. ^ a b (in French) "News du projet Air-Cobot". aircobot.akka.eu. Akka Technologies. Archived from the original on 10 July 2016. Retrieved 12 July 2016.
  6. ^ a b c d e f g h (in French) "AKKA Technologies coordonne le projet Air-COBOT, un robot autonome d'inspection visuelle des avions". Capital. 1 July 2014. Archived from the original on 25 June 2016. Retrieved 14 July 2016.
  7. ^ a b c d e f g h i (in French) "Air-Cobot, le robot qui s'assure que vous ferez un bon vol !". Planète Robots (38): 32–33. March–April 2016.
  8. ^ (in French) "Contrats RAP". Laboratoire d'analyse et d'architecture des systèmes. Archived from the original on 14 September 2015. Retrieved 17 July 2016.
  9. ^ (in French) "Akka Technologies : une marque employeur orientée sur l'innovation". Le Parisien. 15 February 2016. Retrieved 17 July 2016.
  10. ^ a b "M3 Systems Flagship Solution". M3 Systems. Archived from the original on 6 August 2016. Retrieved 17 July 2016.
  11. ^ a b c (in French) "4MOB, plateforme intelligente autonome" (PDF). Sterela Solutions. Archived from the original (PDF) on 9 August 2016. Retrieved 17 July 2016.
  12. ^ (in French) "Financeurs". aircobot.akka.eu. Akka Technologies. Archived from the original on 4 August 2016. Retrieved 15 July 2016.
  13. ^ a b (in French) Véronique Guillermard (18 May 2015). "Aircobot contrôle les avions avant le décollage". Le Figaro. Retrieved 14 July 2016.
  14. ^ a b c Air-Cobot on YouTube
  15. ^ (in French) Pascal NGuyen (December 2014). "Des robots vérifient l'avion au sol". Sciences et Avenir (814). Archived from the original on 8 August 2016. Retrieved 17 July 2016.
  16. ^ (in French) "Robair, Inspection robotisée des aéronefs". European Commission. Archived from the original on 11 October 2016. Retrieved 16 July 2016.
  17. ^ "Robair". London South Bank University. Retrieved 16 July 2016.
  18. ^ Shang, Jianzhong; Sattar, Tariq; Chen, Shuwo; Bridge, Bryan (2007). "Design of a climbing robot for inspecting aircraft wings and fuselage" (PDF). Industrial Robot. 34 (6): 495–502. doi:10.1108/01439910710832093.
  19. ^ (in French) "Easy Jet commence à utiliser des drones pour l'inspection de ses avions". humanoides.fr. 8 June 2015. Archived from the original on 12 October 2015. Retrieved 16 July 2016.
  20. ^ (in French) Florine Galéron (28 May 2015). "Aéronautique : la startup Donecle invente le drone anti-foudre". Objectif News, la Tribune. Retrieved 16 July 2016.
  21. ^ a b c (in French) Arnaud Devillard (20 April 2016). "Des drones pour inspecter des avions". Sciences et Avenir. Archived from the original on 8 August 2016. Retrieved 16 July 2016.
  22. ^ a b c Innovations in Singapore: the Hangar of the Future on YouTube
  23. ^ "Pimp my Hangar: Excelling in MRO". airbusgroup.com. Airbus. Archived from the original on 21 December 2016. Retrieved 21 December 2016.
  24. ^ (in French) Éric Parisot (21 June 2013). "Co-Friend, le système d'analyse d'images qui réduit les temps d'immobilisation des avions". Usine Digitale. Retrieved 24 February 2018.
  25. ^ (in French) Aeroscopia, ed. (August 2017). "Le Musée accueille le projet AIR-COBOT". musee-aeroscopia.fr. Archived from the original on 14 October 2017. Retrieved 24 February 2018.
  26. ^ "Espacenet – Bibliographic data – Collaborative robot for visually inspecting an aircraft". worldwide.espacenet.com. Retrieved 1 June 2016.
  27. ^ (in French) Juliette Raynal; Jean-François Prevéraud (15 June 2015). "Bourget 2015 : les dix rendez-vous technos à ne pas louper". Industrie et Technologies. Archived from the original on 5 July 2016. Retrieved 16 July 2016.
  28. ^ (in French) "Akka Technologies au Salon du Bourget". Maurice Ricci. 21 June 2015. Archived from the original on 4 April 2016. Retrieved 16 July 2015.
  29. ^ "Singapore Airshow 2016 Trends: Emerging Technologies Take Off – APEX | Airline Passenger Experience". apex.aero. Retrieved 1 June 2016.
  30. ^ "Communications du projet Air-Cobot". aircobot.akka.eu (in French). Akka Technologies. Archived from the original on 11 August 2016. Retrieved 14 July 2016.
  31. ^ "Best MCG2016 Final Application Award" (PDF). mcg2016.irstea.fr. Machine Control and Guidance. October 2016. Retrieved 22 February 2020.
  32. ^ "AirCobot – Introducing Smart Robots for Aircraft Inspections". clipatize.com. Clipatize. Archived from the original on 6 August 2016. Retrieved 15 August 2016.
  33. ^ (in French) Toulouse métropole, construire le futur on YouTube
  34. ^ Air-Cobot, le robot d'assistance aux inspections des aéronefs (PDF). Programme de la fête de la science (in French). 2015. Retrieved 17 July 2016.