
MemNAS: Memory-Efficient Neural Architecture Search With Grow-Trim Learning




Recent studies on automatic neural architecture search techniques have demonstrated significant performance, competitive with or even better than hand-crafted neural architectures. However, most of the existing search approaches tend to use residual structures and concatenation connections between shallow and deep features. The resulting neural network model is therefore non-trivial for resource-constrained devices to execute, since such a model requires large memory to store network parameters and intermediate feature maps, along with excessive computing complexity. To address this challenge, we propose MemNAS, a novel growing-and-trimming based neural architecture search framework that optimizes not only the performance but also the memory requirement of an inference network. Specifically, in the search process, we consider running memory use, including the network parameters and the essential intermediate feature map memory requirement, as an optimization objective along with performance. Besides, to improve the accuracy of the search, we extract the correlation information among multiple candidate architectures to rank them and then choose the candidates with the desired performance and memory efficiency. On the ImageNet classification task, our MemNAS achieves 75.4% accuracy, 0.7% higher than MobileNetV2 with a 42.1% lower memory requirement. Additional experiments confirm that the proposed MemNAS can perform well across different targets of the trade-off between accuracy and memory consumption.
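The abstract describes ranking candidate architectures by both performance and running memory (parameters plus essential intermediate feature maps). A minimal sketch of such memory-aware candidate selection is shown below; the `Candidate` fields, the scalarized `score` function, and the trade-off weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    accuracy: float       # validation accuracy in [0, 1]
    param_mem_mb: float   # memory to store network parameters (MB)
    feat_mem_mb: float    # peak memory for intermediate feature maps (MB)

def running_memory_mb(c: Candidate) -> float:
    # Running memory = parameters + essential intermediate feature maps.
    return c.param_mem_mb + c.feat_mem_mb

def score(c: Candidate, max_mem_mb: float, lam: float = 0.5) -> float:
    # Hypothetical scalarization of the accuracy/memory trade-off:
    # reward accuracy, penalize normalized memory use; `lam` steers
    # the trade-off target mentioned in the abstract.
    return (1.0 - lam) * c.accuracy - lam * running_memory_mb(c) / max_mem_mb

def select_top_k(candidates: list[Candidate], k: int, lam: float = 0.5) -> list[Candidate]:
    # Rank grown/trimmed candidates and keep the k best under the
    # combined accuracy-and-memory objective.
    max_mem = max(running_memory_mb(c) for c in candidates)
    ranked = sorted(candidates, key=lambda c: score(c, max_mem, lam), reverse=True)
    return ranked[:k]
```

With a large `lam`, a slightly less accurate but much smaller network can outrank a bigger, marginally more accurate one, which is the behavior the search targets.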
