In recent years, deep learning has achieved breakthroughs in many fields owing to its powerful representation capability, and the architecture of a neural network is crucial to its final performance. However, designing a high-performance neural network architecture relies heavily on the prior knowledge and experience of researchers, and the large number of architectural hyperparameters makes it difficult to design an optimal architecture by hand. Automated Neural Architecture Search (NAS) has therefore gained significant attention. NAS uses machine learning to automatically search for an optimal network architecture with little human effort, and is an important means of future neural network design. NAS is essentially a search optimization problem: by designing a search space, a search strategy, and a performance evaluation strategy, it can automatically discover a high-quality network architecture. This survey provides a detailed and comprehensive analysis, comparison, and summary of the latest research progress in NAS from these three aspects, namely search space, search strategy, and performance evaluation strategy, which helps readers quickly understand the development of NAS. Future research directions of NAS are also proposed.
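To make the three components concrete, the sketch below shows a minimal NAS loop built from them: a search space of architectural choices, a search strategy (random search, used here purely for illustration), and a performance evaluation strategy. The search space contents, the toy proxy score in `evaluate`, and all identifiers are hypothetical placeholders, not the method of any specific paper; in practice the evaluation step would train each candidate network and return its validation accuracy.

```python
import random

# Search space (hypothetical): each architecture is a choice of depth,
# width, and a per-layer operation.
SEARCH_SPACE = {
    "depth": [4, 8, 12],
    "width": [32, 64, 128],
    "op": ["conv3x3", "conv5x5", "maxpool"],
}

def sample_architecture(rng):
    """Search strategy (random search): draw one point from the space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def evaluate(arch, rng):
    """Performance evaluation strategy (stand-in): a real NAS system would
    train the candidate network and measure validation accuracy; this toy
    score only keeps the sketch self-contained and runnable."""
    score = arch["depth"] * 0.01 + arch["width"] * 0.001
    return score + rng.gauss(0, 0.01)

def search(num_trials=20, seed=0):
    """NAS loop: repeatedly sample from the search space, evaluate each
    candidate, and keep the best-scoring architecture."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = search()
    print(f"best architecture: {arch}, proxy score: {score:.3f}")
```

Swapping the random sampler for reinforcement learning, an evolutionary algorithm, or gradient-based relaxation, and the proxy score for full or accelerated training, recovers the main NAS families surveyed in the paper.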