Authors
Listed:
- Tianyang Chen
- Wenwu Tang
- Craig Allan
- Shen-En Chen
Abstract
Three-dimensional (3D) geospatial object detection has become essential for 3D geospatial studies, driven by the explosive growth in 3D data. It is extremely labor- and cost-intensive, however, because it often relies on manual detection. Deep learning has recently been used to automate object detection within a 3D context. Yet addressing spatial dependency in 3D data, and how it might inform deep learning for 3D geospatial object detection, remains a significant challenge. Traditional models focus on spatial properties, often overlooking color and contextual information. Exploiting these nonspatial attributes for 3D geospatial object detection thus becomes crucial. Our study pioneers the explicit incorporation of the spatial autocorrelation of color information into 3D deep learning for object detection. We introduce an innovative framework to estimate spatial autocorrelation, addressing the challenges posed by unstructured 3D data sets. Our experiments suggest that incorporating spatial autocorrelation features enhances the accuracy of 3D deep learning models for geospatial object detection. We further investigate the uncertainty in such contextual information introduced by diverse configurations, exemplified by the number of nearest neighbors. This study advances 3D geospatial object detection by using spatial autocorrelation to inform deep learning algorithms, strengthening the connection between GIScience and artificial intelligence and thus holding implications for diverse GeoAI applications.
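The abstract describes estimating the spatial autocorrelation of a color attribute over the k nearest neighbors of each point in an unstructured point cloud. The paper's exact formulation is not given here; as an illustrative sketch only, the snippet below computes local Moran's I, a standard GIScience measure of local spatial autocorrelation, on a scalar attribute using brute-force k-NN and row-standardized binary weights. The function name, the choice of measure, and the synthetic data are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def local_morans_i(points, values, k=8):
    """Illustrative local Moran's I of a scalar attribute over k-nearest
    neighbors in 3D (NOT the paper's method; a generic GIScience sketch).

    points: (n, 3) array of 3D coordinates.
    values: (n,) scalar attribute, e.g., a grayscale summary of color.
    """
    n = len(points)
    z = values - values.mean()        # deviations from the mean
    m2 = (z ** 2).sum() / n           # second-moment normalizer
    # Brute-force pairwise squared distances (fine for small n).
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)      # a point is not its own neighbor
    nbrs = np.argsort(d2, axis=1)[:, :k]  # indices of k nearest neighbors
    # Row-standardized binary weights: each neighbor contributes 1/k,
    # so the spatial lag is the mean deviation among the neighbors.
    lag = z[nbrs].mean(axis=1)
    return (z / m2) * lag             # one local Moran's I value per point

# Synthetic demo: attribute tied to the x-coordinate, so nearby points
# carry similar values and local autocorrelation should be positive.
rng = np.random.default_rng(0)
pts = rng.random((100, 3))
vals = pts[:, 0] + 0.05 * rng.standard_normal(100)
ii = local_morans_i(pts, vals, k=8)
print(ii.shape, float(ii.mean()) > 0)
```

Positive values flag points whose attribute resembles that of their neighborhood (clusters); negative values flag local outliers. Per-point scores like these are the kind of contextual feature the abstract proposes feeding into a 3D deep learning model.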
Suggested Citation
Tianyang Chen & Wenwu Tang & Craig Allan & Shen-En Chen, 2024.
"Explicit Incorporation of Spatial Autocorrelation in 3D Deep Learning for Geospatial Object Detection,"
Annals of the American Association of Geographers, Taylor & Francis Journals, vol. 114(10), pages 2297-2316, November.
Handle:
RePEc:taf:raagxx:v:114:y:2024:i:10:p:2297-2316
DOI: 10.1080/24694452.2024.2380898