Search results for the phrase "visual odometry" by criterion: Subject


Displaying 1-7 of 7
Title:
On augmenting the visual SLAM with direct orientation measurement using the 5-point algorithm
Authors:
Schmidt, A.
Kraft, M.
Fularz, M.
Domagała, Z.
Links:
https://bibliotekanauki.pl/articles/384310.pdf
Publication date:
2013
Publisher:
Sieć Badawcza Łukasiewicz - Przemysłowy Instytut Automatyki i Pomiarów
Topics:
visual SLAM
visual odometry
Description:
This paper presents an attempt to merge two paradigms of visual robot navigation: Visual Simultaneous Localization and Mapping (VSLAM) and Visual Odometry (VO). The VSLAM was augmented with a direct, visual measurement of the robot orientation change using the 5-point algorithm. An extended movement model of the robot was proposed and additional measurements were introduced into the SLAM system. The efficiency of the 5-point and 8-point algorithms was compared. The augmented system was compared with a state-of-the-art VSLAM solution, and the proposed modification reduced the tracking error by over 30%. (A minimal illustrative sketch of the 5-point relative-orientation step follows this record.)
Source:
Journal of Automation Mobile Robotics and Intelligent Systems; 2013, 7, 1; 5-10
1897-8649
2080-2145
Appears in:
Journal of Automation Mobile Robotics and Intelligent Systems
Content provider:
Biblioteka Nauki
Article
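The orientation-change measurement described in the abstract above can be illustrated with a short, hedged sketch: a minimal Python example of estimating the inter-frame rotation with the 5-point algorithm via OpenCV. It is not the authors' implementation; the camera intrinsics, image file names and feature detector are placeholder assumptions.

# Hedged sketch: recovering the inter-frame rotation with the 5-point algorithm.
# The camera matrix K and the image files are assumptions, not from the paper.
import cv2
import numpy as np

K = np.array([[525.0, 0.0, 319.5],   # assumed pinhole intrinsics
              [0.0, 525.0, 239.5],
              [0.0, 0.0, 1.0]])

img1 = cv2.imread("frame_prev.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_curr.png", cv2.IMREAD_GRAYSCALE)

# Detect and match point features between consecutive frames.
orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 5-point algorithm inside a RANSAC loop yields the essential matrix.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)

# Decompose E into rotation R and unit translation t; R is the direct
# orientation-change measurement that could be fed to the SLAM filter.
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
print("relative rotation about z (deg):", yaw)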
Title:
The registration system for the evaluation of indoor visual SLAM and odometry algorithms
Authors:
Schmidt, A.
Kraft, M.
Fularz, M.
Domagała, Z.
Links:
https://bibliotekanauki.pl/articles/384854.pdf
Publication date:
2013
Publisher:
Sieć Badawcza Łukasiewicz - Przemysłowy Instytut Automatyki i Pomiarów
Topics:
SLAM
visual odometry
benchmark
Description:
This paper presents a new benchmark data registration system aimed at facilitating the development and evaluation of visual odometry and SLAM algorithms. The WiFiBOT LAB V3 wheeled robot, equipped with three cameras, an XSENS MTi attitude and heading reference system (AHRS) and Hall encoders, can be used to gather data in indoor exploration scenarios. The ground truth trajectory of the robot is obtained using a visual motion tracking system. Additional static cameras simulating a surveillance network, as well as artificial markers augmenting the navigation, are incorporated in the system. The datasets registered with the presented system will be freely available for research purposes. (A minimal trajectory-error evaluation sketch follows this record.)
Source:
Journal of Automation Mobile Robotics and Intelligent Systems; 2013, 7, 2; 46-51
1897-8649
2080-2145
Appears in:
Journal of Automation Mobile Robotics and Intelligent Systems
Content provider:
Biblioteka Nauki
Article
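As a companion to the benchmark record above, here is a minimal, hedged sketch of how an estimated trajectory could be scored against the ground-truth trajectory such a registration system provides, using the common absolute trajectory error (RMSE after rigid alignment). The file names and data layout are assumptions, not part of the described system.

# Hedged sketch (not from the paper): absolute trajectory error (ATE) of an
# estimated trajectory against ground truth. File names are assumptions.
import numpy as np

def align_rigid(est, gt):
    """Least-squares rigid alignment (Kabsch): R @ est_i + t ~= gt_i."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    U, _, Vt = np.linalg.svd((gt - mu_g).T @ (est - mu_e))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    return R, mu_g - R @ mu_e

def ate_rmse(est, gt):
    """Root-mean-square translational error after rigid alignment."""
    R, t = align_rigid(est, gt)
    err = gt - (est @ R.T + t)
    return np.sqrt((err ** 2).sum(axis=1).mean())

# Trajectories as Nx3 arrays of positions sampled at matching timestamps.
gt = np.loadtxt("ground_truth_xyz.txt")   # from the motion tracking system
est = np.loadtxt("estimated_xyz.txt")     # from the evaluated VO/SLAM method
print("ATE RMSE [m]:", ate_rmse(est, gt))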
Title:
On the efficiency of population-based optimization in finding best parameters for RGB-D visual odometry
Authors:
Kostusiak, Aleksander
Skrzypczyński, Piotr
Links:
https://bibliotekanauki.pl/articles/384397.pdf
Publication date:
2019
Publisher:
Sieć Badawcza Łukasiewicz - Przemysłowy Instytut Automatyki i Pomiarów
Topics:
particle swarm optimization (PSO)
evolutionary algorithm
visual odometry
RGB-D
Description:
Visual odometry estimates the transformations between consecutive frames of a video stream in order to recover the camera's trajectory. As this approach does not require building a map of the observed environment, it is fast and simple to implement. In the last decade RGB-D cameras have proliferated in robotics, becoming the sensors of choice for many practical visual odometry systems. Although RGB-D cameras provide readily available depth images, which greatly simplify the computation of frame-to-frame transformations, the number of numerical parameters that have to be set properly in a visual odometry system to obtain an accurate trajectory estimate remains high. Whereas setting them by hand is certainly possible, it is a tedious trial-and-error task. Therefore, in this article we assess two population-based approaches to parameter optimization, long applied in various areas of robotics, as a means of finding the best parameters of a simple RGB-D visual odometry system. The optimization algorithms investigated here are particle swarm optimization and an evolutionary algorithm variant. We focus on the optimization methods themselves, rather than on the visual odometry algorithm, seeking an efficient procedure to find parameters that minimize the estimated trajectory errors. From the experimental results we draw conclusions as to both the efficiency of the optimization methods and the role of particular parameters in the visual odometry system. (A minimal particle swarm optimization sketch follows this record.)
Source:
Journal of Automation Mobile Robotics and Intelligent Systems; 2019, 13, 2; 5-14
1897-8649
2080-2145
Appears in:
Journal of Automation Mobile Robotics and Intelligent Systems
Content provider:
Biblioteka Nauki
Article
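To make the optimization procedure discussed above concrete, the following is a minimal, hedged sketch of plain particle swarm optimization applied to a black-box cost standing in for the trajectory error of a visual odometry run. It is not the authors' code; the cost function, bounds and PSO coefficients are placeholder assumptions.

# Hedged sketch: plain PSO over a black-box cost. In the paper's setting the
# cost would run the RGB-D VO with the given parameters and return the
# trajectory error; here it is a placeholder assumption.
import numpy as np

def vo_trajectory_error(params):
    # Placeholder cost standing in for "error of a VO run with these parameters".
    return float(np.sum((params - 0.3) ** 2))

def pso(cost, dim, n_particles=20, iters=100, bounds=(0.0, 1.0),
        w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))           # positions
    v = np.zeros_like(x)                                   # velocities
    pbest = x.copy()                                       # personal bests
    pbest_cost = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_cost)].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = x[better], costs[better]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g, pbest_cost.min()

best_params, best_err = pso(vo_trajectory_error, dim=5)
print("best parameters:", best_params, "error:", best_err)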
Title:
Experimental verification of a walking robot self-localization system with the Kinect sensor
Authors:
Nowicki, M.
Skrzypczyński, P.
Links:
https://bibliotekanauki.pl/articles/950763.pdf
Publication date:
2013
Publisher:
Sieć Badawcza Łukasiewicz - Przemysłowy Instytut Automatyki i Pomiarów
Topics:
3D perception
salient features
iterative closest points
visual odometry
walking robot
Description:
In this paper we investigate methods for self-localisation of a walking robot with the Kinect 3D active range sensor. The Iterative Closest Point (ICP) algorithm is considered as the basis for computing the robot rotation and translation between two viewpoints. As an alternative, a feature-based method for matching 3D range data is considered, using the Normal Aligned Radial Feature (NARF) descriptors. It is then shown that NARFs can be used to compute a good initial estimate for the ICP algorithm, resulting in convergent estimation of the sensor egomotion. Experimental results are provided. (A minimal feature-initialized ICP sketch follows this record.)
Source:
Journal of Automation Mobile Robotics and Intelligent Systems; 2013, 7, 4; 42-51
1897-8649
2080-2145
Appears in:
Journal of Automation Mobile Robotics and Intelligent Systems
Content provider:
Biblioteka Nauki
Article
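The idea of seeding ICP with a feature-based initial estimate, as described above, can be sketched in a few lines. The hedged example below uses a Kabsch fit over matched feature points (standing in for NARF correspondences) followed by plain point-to-point ICP; the file names and data layout are assumptions, and this is not the paper's implementation.

# Hedged sketch: point-to-point ICP seeded with a rigid transform estimated
# from matched feature points. Input files are placeholder assumptions.
import numpy as np
from scipy.spatial import cKDTree

def rigid_from_correspondences(src, dst):
    """Least-squares R, t with R @ src_i + t ~= dst_i (Kabsch)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((dst - mu_d).T @ (src - mu_s))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    return R, mu_d - R @ mu_s

def icp(src_cloud, dst_cloud, R, t, iters=30):
    """Refine an initial (R, t) by iterating nearest-neighbour matching."""
    tree = cKDTree(dst_cloud)
    for _ in range(iters):
        moved = src_cloud @ R.T + t
        _, idx = tree.query(moved)
        R, t = rigid_from_correspondences(src_cloud, dst_cloud[idx])
    return R, t

# Matched feature points (e.g. from descriptor matching) give the initial guess.
feat_src = np.loadtxt("features_view1.txt")   # Nx3, assumed
feat_dst = np.loadtxt("features_view2.txt")   # Nx3, assumed
R0, t0 = rigid_from_correspondences(feat_src, feat_dst)

# Dense clouds from the two viewpoints are then aligned by ICP.
cloud_src = np.loadtxt("cloud_view1.txt")     # Mx3, assumed
cloud_dst = np.loadtxt("cloud_view2.txt")     # Kx3, assumed
R, t = icp(cloud_src, cloud_dst, R0, t0)
print("estimated rotation:\n", R, "\ntranslation:", t)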
Title:
Implementation of the Robot Ego-Motion Estimation Algorithm in FPGA Circuits
Authors:
Kraft, M.
Links:
https://bibliotekanauki.pl/articles/156593.pdf
Publication date:
2011
Publisher:
Stowarzyszenie Inżynierów i Techników Mechaników Polskich
Topics:
FPGA
visual odometry
vision system
embedded system
Description:
The article describes the FPGA implementation of a system performing ego-motion estimation for a device (e.g. a mobile robot) equipped with a single camera. It was realized in a hybrid hardware-software architecture. The article presents a detailed description of the resulting architecture, the resource usage of the programmable device, and an analysis of the system performance, together with a comparison with an alternative PC-based solution.
The paper presents the implementation of a robot ego-motion estimation algorithm in a single FPGA. The input data for the algorithm are feature correspondences detected in the image sequence registered by a single camera. The implemented system, based on the MicroBlaze microprocessor along with a dedicated hardware coprocessor, performs all stages of the algorithm: computation of the essential matrix using the 8-point algorithm employing singular value decomposition, robust estimation of the correct essential matrix using the RANSAC algorithm, as well as computation of the rotation matrix and the translation vector (up to scale) from the essential matrix [1, 2]. The system was implemented in a Virtex 5 FPGA and is capable of working with a clock speed of 100 MHz. The microprocessor is used to find successive essential matrices using singular value decomposition. The solutions are tested for correctness using the coprocessor with the RANSAC algorithm [3]. The coprocessor employs a reduced, 23-bit floating point number representation to reduce resource usage. Upon successful completion of the essential matrix estimation, rotation and translation are computed. Additional sensors are used to deal with the rotation and translation sign ambiguity. Table 1 presents the summary of resources used for implementation. Figure 1 outlines the system architecture. The results obtained are satisfactory and promising. The availability of an inexpensive, low-power, small-footprint solution for ego-motion estimation is desirable for many applications. (A minimal software sketch of the 8-point pipeline follows this record.)
Source:
Pomiary Automatyka Kontrola; 2011, R. 57, nr 1, 1; 6-8
0032-4140
Appears in:
Pomiary Automatyka Kontrola
Content provider:
Biblioteka Nauki
Article
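The pipeline summarized above (8-point algorithm with SVD, essential-matrix constraint enforcement, decomposition into rotation and translation) can be sketched in software as follows. This is a hedged, plain-Python analogue of the accelerated stages, not the FPGA design; the intrinsics and correspondence files are placeholder assumptions, and the surrounding RANSAC loop is omitted for brevity.

# Hedged sketch: normalized 8-point algorithm with SVD and decomposition of the
# essential matrix. Correspondences and intrinsics K are assumptions.
import numpy as np

def essential_8point(pts1, pts2, K):
    """Estimate the essential matrix from >= 8 pixel correspondences."""
    def normalize(p):
        ph = np.hstack([p, np.ones((len(p), 1))])
        return (np.linalg.inv(K) @ ph.T).T[:, :2]
    x1, x2 = normalize(pts1), normalize(pts2)
    # Build the linear system x2^T E x1 = 0, one row per correspondence.
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1)),
    ])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Enforce the essential-matrix constraint: two equal singular values, one zero.
    U, S, Vt = np.linalg.svd(E)
    return U @ np.diag([1.0, 1.0, 0.0]) @ Vt

def decompose(E):
    """Rotation candidates and translation direction (up to scale) from E."""
    U, _, Vt = np.linalg.svd(E)
    if np.linalg.det(U) < 0: U = -U
    if np.linalg.det(Vt) < 0: Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    return (U @ W @ Vt, U @ W.T @ Vt), U[:, 2]

K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
pts1 = np.loadtxt("corr_prev.txt")   # Nx2 pixel coordinates, assumed
pts2 = np.loadtxt("corr_curr.txt")
E = essential_8point(pts1, pts2, K)
(R1, R2), t = decompose(E)
print("candidate rotations and translation direction computed")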
Title:
Mathematical Model of Errors of Odometry and Georeferencing Channels in Visual Correlation Extreme Navigation
Authors:
Mukhina, Maryna Petrivna
Kharchenko, Volodymyr Petrovych
Links:
https://bibliotekanauki.pl/articles/503832.pdf
Publication date:
2017
Publisher:
Międzynarodowa Wyższa Szkoła Logistyki i Transportu
Topics:
correlation extreme navigation
visual odometry
geo-referencing
feature matching
cartographic errors
dead reckoning errors
Description:
A mathematical model of the errors of a correlation extreme navigation system (CENS) is developed for the odometry and geo-referencing channels. The model is implemented in Simulink and is based on regular and random components of additive noise. The simulation results demonstrate the accumulation of odometry errors and their mitigation by geo-referencing during correction periods. (A minimal error-accumulation sketch follows this record.)
Source:
Logistics and Transport; 2017, 33, 1; 31-36
1734-2015
Appears in:
Logistics and Transport
Content provider:
Biblioteka Nauki
Article
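As a hedged illustration of the kind of error behaviour the model above describes, the following sketch simulates an odometry error built from a regular (bias) and a random (noise) additive component, periodically reset by geo-referencing corrections. The structure and all numeric values are assumptions, not the paper's Simulink model.

# Hedged sketch: dead-reckoning error accumulation with periodic geo-referencing
# corrections. All numeric values are placeholder assumptions.
import numpy as np

rng = np.random.default_rng(1)
steps = 1000
bias = 0.002             # regular component per step [m]
sigma_odo = 0.01         # random component per step [m]
sigma_geo = 0.05         # geo-referencing fix error [m]
correction_period = 200  # steps between geo-referencing corrections

err = np.zeros(steps)
for k in range(1, steps):
    if k % correction_period == 0:
        # Geo-referencing resets the accumulated error to a bounded fix error.
        err[k] = rng.normal(0.0, sigma_geo)
    else:
        # Dead reckoning: errors accumulate step by step.
        err[k] = err[k - 1] + bias + rng.normal(0.0, sigma_odo)

print("max |error| between corrections [m]:", np.abs(err).max())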
Title:
Efficient RGB-D data processing for feature-based self-localization of mobile robots
Authors:
Kraft, M.
Nowicki, M.
Penne, R.
Schmidt, A.
Skrzypczyński, P.
Links:
https://bibliotekanauki.pl/articles/330295.pdf
Publication date:
2016
Publisher:
Uniwersytet Zielonogórski. Oficyna Wydawnicza
Topics:
visual odometry
simultaneous localization
simultaneous mapping
RGB-D
tracking
point features
Description:
The problem of position and orientation estimation for an active vision sensor that moves with respect to the full six degrees of freedom is considered. The proposed approach is based on point features extracted from RGB-D data. This work focuses on efficient point feature extraction algorithms and on methods for the management of a set of features in a single RGB-D data frame. While the fast, RGB-D-based visual odometry system described in this paper builds upon our previous results as to the general architecture, the important novel elements introduced here are aimed at improving the precision and robustness of the motion estimate computed from the matching point features of two RGB-D frames. Moreover, we demonstrate that the visual odometry system can serve as the front-end for a pose-based simultaneous localization and mapping solution. The proposed solutions are tested on publicly available data sets to ensure that the results are scientifically verifiable. The experimental results demonstrate gains due to the improved feature extraction and management mechanisms, whereas the performance of the whole navigation system compares favorably to results known from the literature. (A minimal RGB-D frame-to-frame motion sketch follows this record.)
Source:
International Journal of Applied Mathematics and Computer Science; 2016, 26, 1; 63-79
1641-876X
2083-8492
Appears in:
International Journal of Applied Mathematics and Computer Science
Content provider:
Biblioteka Nauki
Article
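The frame-to-frame motion estimation step described above can be illustrated with a minimal, hedged sketch: ORB point features matched between two RGB frames, back-projected to 3D with the depth image, and passed to a RANSAC PnP solver in OpenCV. This is not the authors' system; the intrinsics, depth scale and file names are placeholder assumptions.

# Hedged sketch: frame-to-frame motion from RGB-D point features via 3D-2D
# RANSAC PnP. Intrinsics, depth scale and file names are assumptions.
import cv2
import numpy as np

fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5     # assumed pinhole intrinsics
depth_scale = 0.001                              # assumed: depth stored in mm
K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)

rgb1 = cv2.imread("rgb_prev.png", cv2.IMREAD_GRAYSCALE)
rgb2 = cv2.imread("rgb_curr.png", cv2.IMREAD_GRAYSCALE)
depth1 = cv2.imread("depth_prev.png", cv2.IMREAD_UNCHANGED).astype(np.float64)

# Detect and match point features between the two RGB frames.
orb = cv2.ORB_create(1000)
kp1, des1 = orb.detectAndCompute(rgb1, None)
kp2, des2 = orb.detectAndCompute(rgb2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

# Back-project the previous-frame keypoints to 3D using the depth image.
obj_pts, img_pts = [], []
for m in matches:
    u, v = kp1[m.queryIdx].pt
    z = depth1[int(round(v)), int(round(u))] * depth_scale
    if z <= 0:
        continue                                  # skip invalid depth readings
    obj_pts.append([(u - cx) * z / fx, (v - cy) * z / fy, z])
    img_pts.append(kp2[m.trainIdx].pt)
obj_pts = np.array(obj_pts, dtype=np.float64)
img_pts = np.array(img_pts, dtype=np.float64)

# Robust 3D-2D motion estimation; rvec/tvec give the frame-to-frame transform
# that a pose-based SLAM back-end could consume as a constraint.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(obj_pts, img_pts, K, None)
R, _ = cv2.Rodrigues(rvec)
print("inliers:", 0 if inliers is None else len(inliers))
print("rotation:\n", R, "\ntranslation:", tvec.ravel())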