Publication Type

Conference Proceeding Article

Version

Published version

Publication Date

8-2021

Abstract

In this paper, we propose a new neural architecture search (NAS) problem for Symmetric Positive Definite (SPD) manifold networks, aiming to automate the design of SPD neural architectures. To address this problem, we first introduce a geometrically rich and diverse SPD neural architecture search space for efficient SPD cell design. We then model the new NAS problem as a one-shot training process of a single supernet. Based on this supernet modeling, we apply a differentiable NAS algorithm over our relaxed continuous search space to search for SPD neural architectures. Statistical evaluation on drone, action, and emotion recognition tasks shows that our method mostly outperforms state-of-the-art SPD networks and traditional NAS algorithms. Empirical results further show that our algorithm excels at discovering better-performing SPD network designs and yields models that are more than three times lighter than those found by state-of-the-art NAS algorithms.
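The abstract describes a differentiable NAS scheme in which a discrete cell search space is relaxed into a softmax-weighted mixture of candidate operations and trained as a one-shot supernet. The sketch below illustrates that general relaxation in PyTorch under stated assumptions: the candidate operations are generic Euclidean placeholders rather than the paper's SPD manifold operations, and the names MixedOp and CANDIDATE_OPS are hypothetical, not taken from the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Placeholder candidate operations; the paper's SPD-aware, manifold-preserving
# operations are not reproduced here.
CANDIDATE_OPS = [
    lambda dim: nn.Identity(),
    lambda dim: nn.Linear(dim, dim),
    lambda dim: nn.Sequential(nn.Linear(dim, dim), nn.Tanh()),
]

class MixedOp(nn.Module):
    """Continuous relaxation of one edge of the cell: a softmax-weighted
    mixture of candidate operations, trained inside a one-shot supernet."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList(op(dim) for op in CANDIDATE_OPS)
        # Architecture parameters (one logit per candidate operation),
        # learned jointly with the network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: after supernet training, a discrete architecture is recovered by
# keeping the operation with the largest alpha on each edge.
edge = MixedOp(dim=16)
out = edge(torch.randn(4, 16))
print(out.shape)            # torch.Size([4, 16])
print(edge.alpha.argmax())  # index of the currently strongest candidate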

Discipline

OS and Networks | Systems Architecture

Research Areas

Data Science and Engineering

Publication

Proceedings of the 30th International Joint Conference on Artificial Intelligence (IJCAI-21), Montreal, August 19-26, 2021

First Page

3002

Last Page

3009

Publisher

AAAI Press

City or Country

Virtual
