Existing Named Entity Recognition (NER) models based on the Bidirectional Long Short-Term Memory (BiLSTM) network struggle to fully capture the global semantics of text and the complex relationships between entities. Therefore, an NER model based on global information fusion and multi-dimensional relation perception was proposed. Firstly, BERT (Bidirectional Encoder Representations from Transformers) was used to obtain vector representations of the input sequence, and a BiLSTM was then applied to further learn its contextual information. Secondly, a global information fusion mechanism composed of a gradient stabilization layer and a feature fusion module was proposed: the former kept gradient propagation stable while updating and optimizing the representation of the input sequence, and the latter integrated the forward and backward representations of the BiLSTM to obtain a more comprehensive feature representation. Thirdly, a multi-dimensional relation perception structure was constructed to learn the correlations between words in different subspaces, so as to capture complex entity relationships in documents. In addition, an adaptive focal loss function was used to dynamically adjust the weights of different entity types, improving the model's recognition of minority entity classes. Finally, the proposed model and 11 baseline models were evaluated on 7 public datasets. The results show that the proposed model achieves higher F1 scores than all comparison models, validating its comprehensive performance.
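As a rough illustration only (the abstract does not give the authors' exact formulation), an adaptive focal loss of the kind described — a focal loss whose per-class weights are adjusted dynamically, here assumed to come from inverse class frequencies so minority entity types are up-weighted — can be sketched in NumPy as follows. The function name, the inverse-frequency weighting scheme, and the normalization are all assumptions made for the sketch:

```python
import numpy as np

def adaptive_focal_loss(probs, labels, class_counts, gamma=2.0):
    """Focal loss with per-class weights derived from inverse class
    frequency, so rare entity types contribute more to the loss.

    probs:        (N, C) predicted class probabilities
    labels:       (N,)   gold class indices
    class_counts: (C,)   observed label frequencies (assumption: the
                         adaptive weights are recomputed from these)
    gamma:        focusing parameter of the standard focal loss
    """
    # Inverse-frequency weights, normalized to mean 1 so the overall
    # loss scale stays comparable as the weights adapt during training.
    freq = class_counts / class_counts.sum()
    alpha = 1.0 / (freq + 1e-12)
    alpha = alpha / alpha.mean()

    # Probability assigned to the gold class of each token.
    p_t = probs[np.arange(len(labels)), labels]

    # The focal term (1 - p_t)^gamma down-weights easy, well-classified
    # tokens; alpha[labels] up-weights minority entity types.
    loss = -alpha[labels] * (1.0 - p_t) ** gamma * np.log(p_t + 1e-12)
    return loss.mean()
```

Under this weighting, a misclassified token of a rare entity type incurs a larger loss than an equally misclassified token of a frequent type, which is the behavior the abstract attributes to the adaptive loss.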