This work addresses the catastrophic forgetting that arises in one-shot neural architecture search, where sequentially trained sub-architectures overwrite the supernet's shared weights. It formulates supernet training as a constrained optimization problem in which previously trained architectures act as constraints, and proposes a novelty search-based architecture selection method to keep this constraint set diverse, mitigating forgetting and improving the supernet's ranking quality. The approach achieves competitive results on CIFAR-10 and ImageNet.
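To make the selection mechanism concrete, below is a minimal, self-contained Python sketch of novelty-driven selection of a diverse constraint set. Everything here is an illustrative assumption rather than the paper's exact procedure: the edge-wise operation encoding, the Hamming-distance novelty measure, the greedy `select_constraint_set` routine, and the constants `NUM_EDGES`/`NUM_OPS` are all hypothetical stand-ins for whatever search space and distance the method actually uses.

```python
import random

# Hypothetical encoding: an architecture is a tuple of operation choices,
# one per edge of the supernet (sizes below are illustrative only).
NUM_EDGES = 8
NUM_OPS = 5


def sample_architecture(rng):
    """Sample a random sub-architecture from the supernet's search space."""
    return tuple(rng.randrange(NUM_OPS) for _ in range(NUM_EDGES))


def hamming(a, b):
    """Number of edges where two architecture encodings differ."""
    return sum(x != y for x, y in zip(a, b))


def novelty(arch, archive, k=3):
    """Novelty = mean distance to the k nearest architectures in the archive."""
    if not archive:
        return float("inf")
    dists = sorted(hamming(arch, other) for other in archive)
    return sum(dists[:k]) / min(k, len(dists))


def select_constraint_set(candidates, set_size):
    """Greedily build a diverse constraint set: repeatedly take the candidate
    that is most novel with respect to the architectures chosen so far."""
    chosen, pool = [], list(candidates)
    while pool and len(chosen) < set_size:
        best = max(pool, key=lambda a: novelty(a, chosen))
        chosen.append(best)
        pool.remove(best)
    return chosen


if __name__ == "__main__":
    rng = random.Random(0)
    candidates = [sample_architecture(rng) for _ in range(50)]
    constraints = select_constraint_set(candidates, set_size=5)
    # In the constrained-training view, the shared supernet weights would be
    # updated so that the loss on these constraint architectures does not
    # degrade while the currently sampled architecture is being trained.
    for arch in constraints:
        others = [c for c in constraints if c is not arch]
        print(arch, "novelty:", round(novelty(arch, others), 2))
```

The greedy maximization is one simple way to realize "enhance diversity": each selected architecture is far, under the chosen distance, from those already in the set, so the constraints cover distinct regions of the search space rather than near-duplicates.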