- Title:
- Distance estimation using artificial neural networks: architectures, capabilities and limitations
- Authors:
- Hachaj, Tomasz
- Relations:
- https://bibliotekanauki.pl/articles/51459494.pdf
- Publication date:
- 2023
- Publisher:
- Uniwersytet Komisji Edukacji Narodowej w Krakowie. Instytut Filozofii i Socjologii
- Topics:
-
single-frame depth estimation
encoder-decoder
deep learning
typical errors
limitations
- Description:
- The ability to judge distances using vision is an extremely important skill that greatly facilitates exploration of one’s immediate environment. Most commonly, spatial vision is associated with stereo vision. Although human eyes do act as a stereo vision system, we can perform a simple experiment by covering one eye and then looking at our surroundings: even though we are now observing the world through a single “sensor”, we can still judge which objects are closer and which are further away. Although we can also use a slight change in viewing perspective to improve our sense of distance, this is not necessary: even with one eye, standing still, we are able, through the experience we have gained, to correctly estimate the distances between the objects we see. Likewise, when we look at photographs, although the images are two-dimensional, we are able to estimate the distances portrayed in them. In recent years, many solutions based on machine learning methods and deep neural networks have been developed that can mimic this process. In particular, encoder-decoder architectures are effective at this task, allowing, for example, a robot to estimate depth from a single camera frame. However, these solutions still have limitations, which constitute a challenge for researchers and engineers. This paper discusses the challenges faced by such architectures, based on the author’s experience in developing deep learning-based single-frame depth estimation algorithms.
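To illustrate the encoder-decoder idea the abstract refers to, the following is a minimal sketch (not the author's code, and not from the paper): a convolutional encoder compresses a single RGB frame into a compact feature map, and a decoder upsamples it back to a per-pixel relative depth map. The framework (PyTorch), layer sizes, and the final sigmoid scaling are all illustrative assumptions.

```python
# Minimal single-frame depth estimation sketch (illustrative assumptions only).
import torch
import torch.nn as nn

class TinyDepthNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: strided convolutions downsample the 3-channel image.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: transposed convolutions upsample back to input resolution,
        # producing a single depth channel per pixel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
            nn.Sigmoid(),  # relative depth in [0, 1]; absolute scale from one frame is ambiguous
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    model = TinyDepthNet()
    frame = torch.rand(1, 3, 128, 128)  # one RGB frame
    depth = model(frame)                # shape (1, 1, 128, 128): relative depth map
    print(depth.shape)
```

Such a network is trained on image/depth pairs; the limitations the paper discusses (e.g. scale ambiguity and typical errors) arise precisely because the depth must be inferred from learned experience rather than measured geometry.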
- Source:
-
ARGUMENT: Biannual Philosophical Journal; 2023, 13, 1; 13-28
2083-6635
2084-1043
- Appears in:
- ARGUMENT: Biannual Philosophical Journal
- Content provider:
- Biblioteka Nauki