Tsallis Entropy for Deep Transfer Learning and Domain Adaptation
Published in: 4th International Conference on Soft Computing
Publication year: 1400 (Solar Hijri, 2021–2022)
Document type: Conference paper
Language: English
Views: 247
This paper is available as an 11-page PDF.
National scientific document ID: CSCG04_101
Indexing date: 23 Esfand 1400 (Solar Hijri)
Abstract:
In this research, we propose Tsallis entropy for deep transfer learning as an efficient and flexible way to construct a regularized classifier. We address the bias problem of the Convolutional Neural Network (CNN) model through statistical learning for unsupervised domain adaptation. First, Tsallis entropy is applied on the source domain to reduce the loss. Then, cosine similarity with a K-Nearest-Neighbors (KNN) classifier is used to regularize the CNN classifier by alleviating the error discrepancy between them. A non-extensive Tsallis entropy function based on the KNN classifier is designed as a self-regularization term to reduce the learning bias. Moreover, the marginal and conditional distributions are aligned simultaneously by Joint Distribution Adaptation (JDA). Finally, experiments are run on the regularized CNN with both the traditional cross-entropy loss and the proposed Tsallis entropy loss. The results show that regularized deep transfer learning based on Tsallis entropy is effective and robust for domain adaptation problems, and it incurs less loss than state-of-the-art domain adaptation methods when detecting unreliable samples.
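The non-extensive entropy the abstract refers to is the standard Tsallis form S_q(p) = (1 − Σ_i p_i^q) / (q − 1), which recovers Shannon entropy as q → 1. A minimal sketch of that quantity for a predicted class-probability vector is below; this is not the authors' implementation, and the entropic index q is a free parameter here:

```python
import numpy as np

def tsallis_entropy(p, q=2.0):
    """Tsallis entropy S_q(p) = (1 - sum(p_i**q)) / (q - 1) of a
    probability vector p; reduces to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        # Shannon limit (natural log), skipping zero-probability terms
        nz = p[p > 0]
        return float(-np.sum(nz * np.log(nz)))
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))
```

For example, a confident one-hot prediction gives zero entropy, while a uniform two-class prediction with q = 2 gives 0.5; averaging this quantity over source-domain predictions yields an entropy-style regularizer analogous to the one described in the abstract.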
Authors:
Zahra Ramezani
Department of Statistics, University of Mazandaran, Babolsar, Iran
Ahmad Pourdarvish
Department of Statistics, University of Mazandaran, Babolsar, Iran