To address the over-smoothing and noise problems that arise in embedding representations during message passing in graph-convolution-based graph neural network recommendation, a Recommendation model combining Self-features and Contrastive Learning (SfCLRec) was proposed. The model was trained with a two-stage architecture consisting of pre-training and formal training. First, the embedding representations of users and items were pre-trained: node self-features were fused to preserve the feature uniqueness of each node, and a hierarchical contrastive learning task was introduced to mitigate noisy information from higher-order neighboring nodes. Then, in the formal training stage, the adjacency matrix of the collaborative graph was reconstructed according to a scoring mechanism. Finally, predicted ratings were obtained from the final embeddings. Compared with existing graph neural network recommendation models such as LightGCN and Simple Graph Contrastive Learning (SimGCL), SfCLRec achieves better Recall and NDCG (Normalized Discounted Cumulative Gain) on three public datasets, ML-latest-small, Last.FM and Yelp, validating its effectiveness.
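The two-stage pipeline summarized above can be made concrete with a short sketch. The PyTorch code below is a minimal illustration under stated assumptions, not the paper's implementation: the function names (fuse_self_features, hierarchical_cl_loss, rebuild_adjacency), the InfoNCE-style layer-wise contrast, and the inner-product scoring are all illustrative stand-ins for the self-feature fusion, hierarchical contrastive task, and scoring-based graph reconstruction described in the abstract.

```python
# Minimal sketch of the described two-stage flow; all names and the
# InfoNCE-style objective are assumptions, not the paper's exact method.
import torch
import torch.nn.functional as F


def fuse_self_features(gcn_emb: torch.Tensor, self_feat: torch.Tensor,
                       alpha: float = 0.5) -> torch.Tensor:
    """Blend propagated embeddings with the node's own features so that
    message passing does not wash out node-specific information."""
    return alpha * gcn_emb + (1.0 - alpha) * self_feat


def hierarchical_cl_loss(layer_embs: list, tau: float = 0.2) -> torch.Tensor:
    """InfoNCE-style contrast between adjacent propagation layers: each
    node's layer-k embedding is pulled toward its own layer-(k+1) embedding
    and pushed away from other nodes', damping higher-order neighbor noise."""
    loss = torch.tensor(0.0)
    for low, high in zip(layer_embs[:-1], layer_embs[1:]):
        z1 = F.normalize(low, dim=1)
        z2 = F.normalize(high, dim=1)
        logits = z1 @ z2.t() / tau                 # [N, N] similarity matrix
        labels = torch.arange(z1.size(0))          # positives on the diagonal
        loss = loss + F.cross_entropy(logits, labels)
    return loss / (len(layer_embs) - 1)


def rebuild_adjacency(adj: torch.Tensor, user_emb: torch.Tensor,
                      item_emb: torch.Tensor, threshold: float = 0.1) -> torch.Tensor:
    """Re-weight/prune user-item edges with a score computed from the
    pre-trained embeddings; a simple sigmoid inner-product score is assumed."""
    scores = torch.sigmoid(user_emb @ item_emb.t())
    return adj * (scores >= threshold).float()


# --- toy usage ----------------------------------------------------------
n_users, n_items, d = 8, 12, 16
user_emb = torch.randn(n_users, d)
item_emb = torch.randn(n_items, d)
adj = (torch.rand(n_users, n_items) > 0.7).float()   # toy interaction matrix

# Pre-training stage: fuse self-features, then apply the hierarchical CL task.
gcn_out = user_emb + 0.1 * torch.randn_like(user_emb)  # stand-in for a GCN layer output
fused = fuse_self_features(gcn_out, user_emb)
layer_embs = [fused, fused + 0.1 * torch.randn_like(fused)]
pretrain_loss = hierarchical_cl_loss(layer_embs)

# Formal training stage: reconstruct the collaborative graph, then predict scores.
adj_rebuilt = rebuild_adjacency(adj, user_emb, item_emb)
pred_scores = user_emb @ item_emb.t()                  # final rating prediction
print(pretrain_loss.item(), adj_rebuilt.sum().item(), pred_scores.shape)
```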