Journal article

Deep Semantic Mapping for Heterogeneous Multimedia Transfer Learning Using Co-Occurrence Data

Abstract

Transfer learning, which focuses on finding a favorable representation for instances of different domains based on auxiliary data, can mitigate the divergence between domains through knowledge transfer. Recently, increasing efforts on transfer learning have employed deep neural networks (DNN) to learn more robust and higher-level feature representations to better tackle cross-media disparities. However, only a few articles consider the correlation and semantic matching between multi-layer heterogeneous domain networks. In this article, we propose a deep semantic mapping model for heterogeneous multimedia transfer learning (DHTL) using co-occurrence data. More specifically, we integrate the DNN with canonical correlation analysis (CCA) to derive a deep correlation subspace as the joint semantic representation for associating data across different domains. In the proposed DHTL, a multi-layer correlation matching network across domains is constructed, in which CCA is combined to bridge each pair of domain-specific hidden layers. To train the network, a joint objective function is defined and the optimization processes are presented. Once the deep semantic representation is obtained, the shared features of the source domain are transferred for task learning in the target domain. Extensive experiments on three multimedia recognition applications demonstrate that the proposed DHTL can effectively find deep semantic representations for heterogeneous domains, and that it is superior to several existing state-of-the-art methods for deep transfer learning.
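The abstract's central mechanism, bridging each pair of domain-specific hidden layers with a CCA term inside a joint objective, can be illustrated with a short sketch. The PyTorch code below is one plausible reading of that design, not the authors' implementation: the layer widths, tanh activations, the weighting factor lam, and the nuclear-norm correlation surrogate (familiar from deep CCA) are all illustrative assumptions, and the names cca_correlation and joint_loss are hypothetical.

import torch
import torch.nn as nn

def cca_correlation(H1, H2, eps=1e-4):
    # Sum of canonical correlations between two batches of hidden
    # activations (rows = co-occurring source/target sample pairs).
    # Computed as the nuclear norm of the whitened cross-covariance
    # T = Sxx^{-1/2} Sxy Syy^{-1/2}; this is a scalar to be maximized.
    H1 = H1 - H1.mean(dim=0, keepdim=True)
    H2 = H2 - H2.mean(dim=0, keepdim=True)
    n = H1.shape[0]
    Sxy = H1.T @ H2 / (n - 1)
    Sxx = H1.T @ H1 / (n - 1) + eps * torch.eye(H1.shape[1])
    Syy = H2.T @ H2 / (n - 1) + eps * torch.eye(H2.shape[1])

    def inv_sqrt(S):
        # Inverse matrix square root via eigendecomposition (S symmetric).
        w, V = torch.linalg.eigh(S)
        return V @ torch.diag(w.clamp_min(eps).rsqrt()) @ V.T

    T = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return torch.linalg.matrix_norm(T, ord='nuc')  # sum of singular values

# Two domain-specific stacks (e.g., image features vs. text features);
# each pair of hidden layers is bridged by a CCA term, as the abstract
# describes. Input and hidden dimensions here are arbitrary.
src = nn.ModuleList([nn.Sequential(nn.Linear(512, 256), nn.Tanh()),
                     nn.Sequential(nn.Linear(256, 128), nn.Tanh())])
tgt = nn.ModuleList([nn.Sequential(nn.Linear(300, 256), nn.Tanh()),
                     nn.Sequential(nn.Linear(256, 128), nn.Tanh())])

def joint_loss(x_src, x_tgt, lam=1.0):
    # Joint objective: maximize layer-wise correlation (minimize its
    # negative) across all bridged hidden-layer pairs.
    loss, h_s, h_t = 0.0, x_src, x_tgt
    for layer_s, layer_t in zip(src, tgt):
        h_s, h_t = layer_s(h_s), layer_t(h_t)
        loss = loss - lam * cca_correlation(h_s, h_t)
    return loss

Minimizing joint_loss over batches of co-occurring source/target pairs drives the paired hidden layers toward a shared correlation subspace, which is the role the abstract assigns to the joint semantic representation; a task-specific loss on the target domain would be added to this objective in the transfer stage.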

Authors

Zhao L; Chen Z; Yang LT; Deen MJ; Wang ZJ

Journal

ACM Transactions on Multimedia Computing, Communications, and Applications, Vol. 15, No. 1s, pp. 1–21

Publisher

Association for Computing Machinery (ACM)

Publication Date

January 31, 2019

DOI

10.1145/3241055

ISSN

1551-6857
