Collaborative filtering-based Graph Neural Network (GNN) recommendation systems currently suffer from data sparsity and cold-start problems. Many related algorithms introduce external knowledge about items as supplementary information to alleviate these issues, but they ignore the severe imbalance in information utilization caused by directly fusing sparse collaborative signals with redundant supplementary knowledge, as well as the problems of information sharing and propagation among different data sources. Therefore, a Multi-view Knowledge-aware and interactive Distillation Recommendation algorithm (MKDRec) was proposed. First, to tackle data sparsity, the collaborative view was augmented through random dropout, and neighborhood contrastive learning was then applied to the node representations in this view. Second, to address the knowledge redundancy problem, each relation type of edge in the knowledge view was encoded, and the item knowledge view was reconstructed from head entities, tail entities, and their connecting relations so as to fully exploit the information. Finally, an associated view with long-range connections was constructed on the basis of the equivalence relations between items and entities. On these three views, graph node representations were learned with different convolutional aggregation methods to extract multiple types of information about users and items, yielding multiple embedded representations of users and items. In addition, knowledge distillation and fusion of node feature vectors were performed between pairs of views to realize information sharing and propagation. Experimental results on the Book-Crossing, MovieLens-1M, and Last.FM datasets show that, compared with the best results among the baseline methods, MKDRec improves AUC (Area Under Curve) by 2.13%, 1.07%, and 3.44%, respectively, and improves F1-score by 3.56%, 1.14%, and 4.46%, respectively.
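The first step described above, augmenting the collaborative view by random dropout and contrasting node representations across the original and augmented views, can be sketched with a generic InfoNCE-style loss. This is a minimal illustration, not the paper's implementation; the function names, the edge-level dropout, and the NumPy formulation are all assumptions:

```python
import numpy as np

def random_edge_dropout(edges, keep_prob=0.8, rng=None):
    # Randomly drop edges to form an augmented collaborative view
    # (hypothetical helper; MKDRec's exact dropout scheme may differ).
    rng = rng or np.random.default_rng(0)
    mask = rng.random(len(edges)) < keep_prob
    return [e for e, m in zip(edges, mask) if m]

def info_nce_loss(z1, z2, temperature=0.2):
    # Contrastive loss between node embeddings of the original view (z1)
    # and the augmented view (z2): matching rows are positive pairs,
    # all other rows act as negatives (standard InfoNCE form).
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature
    # Numerically stable log-softmax along rows; the loss is the
    # negative log-likelihood of the diagonal (positive) entries.
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))
```

Pulling the positive pairs together while pushing apart other nodes in the same batch is what lets the sparse collaborative view yield a useful training signal without extra labels.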
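The cross-view information sharing via knowledge distillation can likewise be illustrated with a temperature-softened KL divergence between the node feature vectors of a pair of views. This is a generic Hinton-style distillation sketch; MKDRec's exact pairwise-view objective and fusion step are not specified here, and the names are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kd_kl(student, teacher, T=2.0):
    # Soften both views' node features with temperature T and compute
    # KL(teacher || student), scaled by T^2 as in standard distillation.
    # "teacher" and "student" here stand for the two views in a pair.
    p = softmax(teacher / T)
    q = softmax(student / T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)) * T * T)
```

Averaging such pairwise losses over the collaborative, knowledge, and associated views would propagate information learned in one view into the others.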