- Title:
- Utilizing relevant RGB-D data to help recognize RGB images in the target domain
- Authors:
-
Gao, Depeng
Liu, Jiafeng
Wu, Rui
Cheng, Dansong
Fan, Xiaopeng
Tang, Xianglong
- Links:
- https://bibliotekanauki.pl/articles/329725.pdf
- Publication date:
- 2019
- Publisher:
- Uniwersytet Zielonogórski. Oficyna Wydawnicza
- Topics:
-
object recognition
RGB-D image
transfer learning
privileged information
machine learning
- Description:
- The advent of 3D cameras has made it easy to obtain depth information alongside RGB images, which helps in various computer vision tasks. However, using these RGB-D images to help recognize RGB images captured by conventional cameras poses two challenges: the depth images are missing at the testing stage, and the training and test data are drawn from different distributions because they are captured with different equipment. To address both challenges jointly, we propose an asymmetrical transfer learning framework in which three classifiers are trained on the RGB and depth images in the source domain and on the RGB images in the target domain, under a structural risk minimization criterion and regularization theory. A cross-modality co-regularizer constrains the two source-domain classifiers to be consistent, which increases accuracy. Moreover, an L2,1-norm cross-domain co-regularizer magnifies significant visual features and inhibits insignificant ones in the weight vectors of the two RGB classifiers. Through the cross-modality and cross-domain co-regularizers, knowledge from the RGB-D images in the source domain is thus transferred to the target domain to improve the target classifier. Experimental results show that the proposed method is among the most effective.
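The abstract's objective can be sketched as code. The following is a minimal illustration, not the authors' actual formulation: it assumes three linear classifiers, a squared loss as the empirical-risk term, a squared-difference cross-modality co-regularizer, and an L2,1 norm over the stacked RGB weight vectors as the cross-domain co-regularizer; all function names and trade-off parameters (`lam`, `gamma_m`, `gamma_d`) are hypothetical.

```python
import numpy as np

def l21_norm(W):
    # L2,1 norm: sum of the L2 norms of the rows of W.
    # Rows with large norm correspond to features shared across columns.
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

def objective(w_rgb_s, w_depth_s, w_rgb_t,
              Xs_rgb, Xs_depth, ys, Xt, yt,
              lam=1.0, gamma_m=0.1, gamma_d=0.1):
    # Empirical risk (squared loss) of the three linear classifiers:
    # source RGB, source depth, and target RGB.
    risk = (np.mean((Xs_rgb @ w_rgb_s - ys) ** 2)
            + np.mean((Xs_depth @ w_depth_s - ys) ** 2)
            + np.mean((Xt @ w_rgb_t - yt) ** 2))
    # Standard L2 regularization (structural risk minimization).
    reg = lam * (w_rgb_s @ w_rgb_s + w_depth_s @ w_depth_s
                 + w_rgb_t @ w_rgb_t)
    # Cross-modality co-regularizer: the source RGB and depth
    # classifiers should agree on the source samples.
    cross_mod = gamma_m * np.mean(
        (Xs_rgb @ w_rgb_s - Xs_depth @ w_depth_s) ** 2)
    # Cross-domain L2,1 co-regularizer: stacking the two RGB weight
    # vectors as columns makes the row-wise norms magnify features
    # significant in both domains and inhibit the rest.
    cross_dom = gamma_d * l21_norm(np.stack([w_rgb_s, w_rgb_t], axis=1))
    return risk + reg + cross_mod + cross_dom
```

Minimizing such an objective over the three weight vectors would couple the target RGB classifier to the source-domain RGB-D knowledge, which is the transfer mechanism the abstract describes.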
- Source:
-
International Journal of Applied Mathematics and Computer Science; 2019, 29, 3; 611-621
1641-876X
2083-8492
- Appears in:
- International Journal of Applied Mathematics and Computer Science
- Content provider:
- Biblioteka Nauki