Towards Leveraging Structure for Neural Predictor in NAS

Publication year: 1401 (2022)
Document type: Journal article
Language: English
Views: 185

The full text of this paper is available as a 12-page PDF.


National scientific document ID: JR_CKE-5-1_005

Indexing date: 29 Shahrivar 1401 (September 20, 2022)

Abstract:

Neural Architecture Search (NAS), which automatically designs a neural architecture for a specific task, has attracted much attention in recent years. Properly defining the search space is a key step in the success of NAS approaches, as it allows us to reduce the time required for evaluation. Thus, a recent strategy for searching a NAS space is to leverage supervised learning models to rank candidate neural models, i.e., surrogate predictive models. The predictive model takes the specification of an architecture (or its feature representation) and predicts the model's probable efficiency ahead of training. A proper representation of a candidate architecture is therefore an important factor in a predictor-based NAS approach. While several works have been devoted to training a good surrogate model, limited research has focused on learning a good representation for these neural models. To address this problem, we investigate how to learn a representation with both structural and non-structural features of a network. In particular, we propose a tree-structured encoding that fully represents both a network's layers and their intra-connections. The encoding is easily extendable to larger or more complex structures. Extensive experiments on two NAS datasets, NasBench-101 and NasBench-201, demonstrate the effectiveness of the proposed method compared with state-of-the-art predictors.
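To make the idea concrete, the following is a minimal sketch (not the paper's actual implementation; all names are illustrative) of how a NasBench-101-style cell, given as an adjacency matrix plus per-node operation labels, could be re-encoded as a tree by rooting the DAG at its output node and recursively expanding each node's predecessors:

```python
# Illustrative sketch only: the real paper's encoding may differ.
# A cell is a DAG: adj[i][j] == 1 means node i feeds node j, and
# ops[i] names the operation computed at node i.

def to_tree(adj, ops, node=None):
    """Encode the DAG rooted at `node` (default: the last node, taken
    to be the output) as a nested tuple (op, [child subtrees])."""
    if node is None:
        node = len(ops) - 1
    preds = [i for i in range(len(ops)) if adj[i][node]]
    return (ops[node], [to_tree(adj, ops, p) for p in preds])

# Toy cell: input -> conv3x3 -> output, plus a skip edge input -> output.
adj = [
    [0, 1, 1],  # input feeds conv3x3 and output
    [0, 0, 1],  # conv3x3 feeds output
    [0, 0, 0],  # output has no outgoing edges
]
ops = ["input", "conv3x3", "output"]

tree = to_tree(adj, ops)
print(tree)  # -> ('output', [('input', []), ('conv3x3', [('input', [])])])
```

Note that unrolling a DAG into a tree duplicates shared subgraphs (the input node appears twice above); whether and how such sharing is handled is one of the design choices such an encoding must address.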

Authors

Saeedeh Eslami

Department of Computer Engineering, Engineering Faculty of Ferdowsi University, Mashhad, Iran...

Reza Monsefi

Department of Computer Engineering, Engineering Faculty of Ferdowsi University, Mashhad, Iran...

Mohammad Akbari

Department of Mathematics and Computer Science, Amirkabir University, Tehran, Iran
