Most current capsule network methods improve classification accuracy by modifying the iterative routing procedure, while ignoring the computational burden introduced by iterative routing itself. Although some methods train capsule networks with non-iterative routing, their accuracy is unsatisfactory. To address the above problem, a non-iterative routing graph capsule network method for remote sensing scene classification was proposed. Firstly, preliminary features of the input image were extracted by a simple convolutional layer. Then, a global attention module with dual fusion between channels and capsules was presented, which performed channel attention and capsule attention sequentially to generate global coefficients for weighting the high-level capsule features. As a result, the weighted high-level capsule features became more discriminative and highlighted the important capsules, thereby improving classification performance. Meanwhile, an equivariance regularization term that computed the similarity among the input images was introduced to explicitly model the equivariance of the capsule network, thereby further improving the potential performance of the network. Finally, the whole network was trained with a loss function combining margin loss and equivariance loss to obtain a discriminative classification model. Experimental results on multiple benchmark datasets verified the effectiveness and efficiency of the proposed method: it achieved a classification accuracy of 90.38% on the Canadian Institute For Advanced Research-10 (CIFAR-10) dataset, which is 15.74 percentage points higher than that of the Dynamic Routing Capsule network (DRCaps) method, and classification accuracies of 98.21% and 86.96% on the Affine extended National Institute of Standards and Technology (AffNIST) dataset and the Aerial Image Dataset (AID), respectively. These results show that the proposed method can effectively improve the performance of remote sensing scene classification.
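
The abstract does not give the exact formulation of the combined objective, so the following is only a minimal PyTorch sketch under stated assumptions: the margin-loss hyperparameters (m+, m-, λ) follow the common dynamic-routing convention, and the equivariance term is illustrated as a cosine-similarity consistency penalty between the capsule features of an image and a transformed copy of it. The function names, the weight `beta`, and the concrete form of the equivariance term are illustrative assumptions, not the authors' exact definitions.

```python
# Minimal sketch (PyTorch), assuming class-capsule outputs of shape
# (batch, num_classes, capsule_dim). Hyperparameters and the form of the
# equivariance term are assumptions, not the paper's exact formulation.
import torch
import torch.nn.functional as F


def margin_loss(capsule_lengths, labels, m_pos=0.9, m_neg=0.1, lam=0.5):
    """Margin loss over class-capsule lengths of shape (batch, num_classes)."""
    one_hot = F.one_hot(labels, capsule_lengths.size(1)).float()
    pos = one_hot * F.relu(m_pos - capsule_lengths).pow(2)
    neg = lam * (1.0 - one_hot) * F.relu(capsule_lengths - m_neg).pow(2)
    return (pos + neg).sum(dim=1).mean()


def equivariance_loss(caps_a, caps_b):
    """Assumed consistency term: keep the capsule features of an image and
    its transformed copy similar (1 - cosine similarity)."""
    a = caps_a.flatten(1)
    b = caps_b.flatten(1)
    return (1.0 - F.cosine_similarity(a, b, dim=1)).mean()


def total_loss(caps, caps_transformed, labels, beta=0.1):
    """Combined objective: margin loss plus a weighted equivariance term."""
    lengths = caps.norm(dim=-1)  # capsule lengths, shape (batch, num_classes)
    return margin_loss(lengths, labels) + beta * equivariance_loss(
        caps, caps_transformed)
```

In this sketch, `caps_transformed` would be obtained by running the same network on a transformed version of the input batch (for example, an affine transformation), and `beta` is an assumed weighting hyperparameter balancing the two terms.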