Authors
Listed:
- Anton Filatov
(Faculty of Computer Science and Technology, Saint Petersburg Electrotechnical University “LETI”, 197022 Saint Petersburg, Russia)
- Mark Zaslavskiy
(Faculty of Computer Science and Technology, Saint Petersburg Electrotechnical University “LETI”, 197022 Saint Petersburg, Russia)
- Kirill Krinkin
(Faculty of Computer Science and Technology, Saint Petersburg Electrotechnical University “LETI”, 197022 Saint Petersburg, Russia)
Abstract
In the recent decade, the rapid development of drone technologies has made many spatial problems easier to solve, including the 3D reconstruction of large objects. A review of existing solutions has shown that most works lack drone autonomy because of non-scalable mapping techniques. This paper presents a method for centralized multi-drone 3D reconstruction that allows the data-capturing process to be performed autonomously and requires drones equipped only with an RGB camera. The essence of the method is a multiagent approach: the control center distributes the workload evenly and independently among all drones, allowing simultaneous flights without a high risk of collision. The center continuously receives RGB data from the drones, localizes each drone (using visual odometry estimations), and performs rough online mapping of the environment (using image descriptors to estimate the distance to the building). The method relies on a set of user-defined parameters, which allows it to be tuned for task-specific requirements such as the number of drones, the 3D model's level of detail, data-capturing time, and energy consumption. Numerical experiments show that the method's parameters can be estimated from easily obtained characteristics of the drones and the building. Method performance was evaluated in an experiment with a virtual building and emulated drone sensors. The experimental evaluation showed that the precision of the chosen algorithms for online localization and mapping is sufficient to perform simultaneous flights, and that the amount of captured RGB data is sufficient for further reconstruction.
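The abstract's central idea of even, independent workload distribution can be illustrated with a minimal sketch. The paper's actual partitioning scheme is not given here; the function below is a hypothetical example that splits a sequence of building-facade segments into contiguous, disjoint blocks, one per drone, so that flight regions do not overlap (which is what keeps the collision risk low). All names (`split_workload`, `num_segments`) are illustrative, not from the paper.

```python
def split_workload(num_segments: int, num_drones: int) -> list[list[int]]:
    """Assign contiguous, disjoint blocks of facade segments to drones.

    Each drone receives an almost-equal share (sizes differ by at most 1),
    and blocks do not overlap, so drones can fly their regions simultaneously.
    """
    if num_drones < 1 or num_segments < 0:
        raise ValueError("need at least one drone and a non-negative segment count")
    base, extra = divmod(num_segments, num_drones)
    assignments, start = [], 0
    for d in range(num_drones):
        count = base + (1 if d < extra else 0)  # first `extra` drones take one more
        assignments.append(list(range(start, start + count)))
        start += count
    return assignments

# Example: 10 facade segments shared among 3 drones
print(split_workload(10, 3))  # [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

A contiguous-block split (rather than round-robin interleaving) keeps each drone's segments spatially adjacent, which matches the abstract's goal of independent, collision-free flight paths.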
Suggested Citation
Anton Filatov & Mark Zaslavskiy & Kirill Krinkin, 2021.
"Multi-Drone 3D Building Reconstruction Method,"
Mathematics, MDPI, vol. 9(23), pages 1-18, November.
Handle:
RePEc:gam:jmathe:v:9:y:2021:i:23:p:3033-:d:688628