Neural Prompt Search


Yuanhan Zhang1
Kaiyang Zhou1
Ziwei Liu1


S-Lab, Nanyang Technological University







TL;DR

The idea is simple: we view existing parameter-efficient tuning methods, including Adapter, LoRA, and VPT, as prompt modules and propose to search for the optimal configuration via neural architecture search. Our approach is named NOAH (Neural prOmpt seArcH).





Abstract

The size of vision models has grown exponentially over the last few years, especially after the emergence of Vision Transformer. This has motivated the development of parameter-efficient tuning methods, such as learning adapter layers or visual prompt tokens, which allow a tiny portion of model parameters to be trained while the vast majority, obtained from pre-training, remain frozen. However, designing a proper tuning method is non-trivial: one might need to try out a lengthy list of design choices, not to mention that each downstream dataset often requires custom designs. In this paper, we view the existing parameter-efficient tuning methods as "prompt modules" and propose Neural prOmpt seArcH (NOAH), a novel approach that learns, for large vision models, the optimal design of prompt modules through a neural architecture search algorithm, specifically for each downstream dataset. By conducting extensive experiments on over 20 vision datasets, we demonstrate that NOAH (i) is superior to individual prompt modules, (ii) has good few-shot learning ability, and (iii) is domain-generalizable.
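The three prompt modules attach to a Transformer block in different places: an Adapter adds a residual bottleneck MLP after a sublayer, LoRA adds a trainable low-rank update to a frozen weight matrix, and VPT prepends learnable prompt tokens to the token sequence. The NumPy sketch below illustrates these three operations and how they compose on one toy block; all dimensions and function names are illustrative, not the paper's actual search space or implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapter(x, W_down, W_up):
    # Adapter: residual bottleneck MLP (down-project, ReLU, up-project).
    h = np.maximum(x @ W_down, 0.0)
    return x + h @ W_up

def lora(x, W, A, B):
    # LoRA: frozen weight W plus a trainable low-rank update A @ B.
    return x @ (W + A @ B)

def prepend_vpt(x, prompts):
    # VPT: prepend learnable prompt tokens to the input sequence.
    return np.concatenate([prompts, x], axis=0)

# Toy dimensions (hypothetical; NOAH searches over choices like these
# per downstream dataset rather than fixing them by hand).
d, bottleneck, rank, n_prompts, n_tokens = 16, 4, 2, 3, 5

x = rng.standard_normal((n_tokens, d))   # token embeddings
W = rng.standard_normal((d, d))          # frozen pre-trained weight

x = prepend_vpt(x, rng.standard_normal((n_prompts, d)))
x = lora(x, W,
         rng.standard_normal((d, rank)) * 0.01,
         rng.standard_normal((rank, d)) * 0.01)
x = adapter(x,
            rng.standard_normal((d, bottleneck)) * 0.1,
            rng.standard_normal((bottleneck, d)) * 0.1)

print(x.shape)  # (n_prompts + n_tokens, d) -> (8, 16)
```

Only the prompt-module parameters (the adapter projections, the LoRA factors, and the prompt tokens) would be trained; the architecture search then decides, per dataset, which modules to include and at what capacity.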




Paper


Neural Prompt Search

Yuanhan Zhang, Kaiyang Zhou and Ziwei Liu

arXiv, 2022.

@misc{zhang2022NOAH,
    title = {Neural Prompt Search},
    author = {Yuanhan Zhang and Kaiyang Zhou and Ziwei Liu},
    archivePrefix = {arXiv},
    year = {2022},
}



Acknowledgements

TBA.