Text sentiment transfer aims to change the sentiment attribute of a text while preserving its content. Due to the lack of parallel corpora, most existing unsupervised methods for text sentiment transfer construct latent representations of sentiment and content through text reconstruction and classification losses, and then perform sentiment transfer on these representations. However, this weakly supervised training strategy leads to significant performance degradation under the prompt learning paradigm. To address this issue, an unsupervised text sentiment transfer method based on generated prompts was proposed. Firstly, content prompts for the input text were produced by a prompt generator. Secondly, these content prompts were fused with the target sentiment prompts to form the final prompt. Finally, a two-stage training strategy was formulated to provide smooth training gradients for model training, thereby resolving the performance degradation problem. Experimental results on the public sentiment transfer dataset Yelp show that the proposed method significantly outperforms the generation-based method UnpairedRL in text preservation, sentiment transfer score, and BLEU (BiLingual Evaluation Understudy), with improvements of 39.1%, 62.3%, and 14.5%, respectively.
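To make the prompt composition step concrete, the following is a minimal PyTorch sketch of how a prompt generator could produce content-prompt vectors and fuse them with learnable target-sentiment prompts. The encoder choice (a bidirectional GRU), the hidden size, the prompt lengths, and concatenation as the fusion operation are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class PromptComposer(nn.Module):
    """Sketch: generate content prompts from the source text and fuse them
    with target-sentiment prompts to form the final prompt (assumed design)."""

    def __init__(self, hidden_size=768, content_prompt_len=8, sentiment_prompt_len=4):
        super().__init__()
        # Prompt generator: a single-layer bidirectional GRU stands in for
        # whatever encoder the paper actually uses (assumption).
        self.prompt_generator = nn.GRU(hidden_size, hidden_size // 2,
                                       batch_first=True, bidirectional=True)
        self.to_content_prompt = nn.Linear(hidden_size, content_prompt_len * hidden_size)
        # One learnable sentiment prompt per target polarity (negative/positive).
        self.sentiment_prompts = nn.Parameter(
            torch.randn(2, sentiment_prompt_len, hidden_size) * 0.02)
        self.content_prompt_len = content_prompt_len
        self.hidden_size = hidden_size

    def forward(self, token_embeddings, target_sentiment):
        # token_embeddings: (batch, seq_len, hidden); target_sentiment: (batch,) in {0, 1}
        _, h = self.prompt_generator(token_embeddings)               # (2, batch, hidden/2)
        pooled = torch.cat([h[0], h[1]], dim=-1)                     # (batch, hidden)
        content_prompt = self.to_content_prompt(pooled).view(
            -1, self.content_prompt_len, self.hidden_size)           # (batch, Lc, hidden)
        sentiment_prompt = self.sentiment_prompts[target_sentiment]  # (batch, Ls, hidden)
        # Final prompt = content prompt fused (here: concatenated) with sentiment prompt.
        return torch.cat([content_prompt, sentiment_prompt], dim=1)


# Usage: the composed prompt would be prepended to the input embeddings of a
# (typically frozen) language model that generates the transferred sentence.
composer = PromptComposer()
dummy_tokens = torch.randn(2, 16, 768)            # stand-in for LM token embeddings
prompt = composer(dummy_tokens, torch.tensor([1, 0]))
print(prompt.shape)                               # torch.Size([2, 12, 768])
```

The two-stage training strategy mentioned in the abstract is not detailed enough here to sketch; the code above only illustrates the prompt generation and fusion steps under the stated assumptions.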