To address the poor performance of existing segmenters on the large number of proprietary terms in Chinese electric power texts, a Chinese Word Segmentation (CWS) method for the electric power domain based on an improved BERT (Bidirectional Encoder Representations from Transformers) was proposed. Firstly, two lexicons covering general words and domain words respectively were built, and a dual-lexicon matching and integration mechanism was designed to inject the word features directly into the BERT model, enabling the model to exploit external knowledge more effectively. Then, the DEEPNORM method was introduced to strengthen the model's feature extraction ability, and the optimal depth of the model was determined by the Bayesian Information Criterion (BIC), which kept the BERT model stable at depths of up to 40 layers. Finally, the classical self-attention layers in the BERT model were replaced with ProbSparse self-attention layers, and the best value of the sampling factor was determined with the Particle Swarm Optimization (PSO) algorithm, reducing model complexity while preserving model performance. Segmentation tests were carried out on a hand-labeled patent text dataset in the electric power domain. Experimental results show that the proposed method achieves an F1 score of 92.87%, which is 14.70, 9.89 and 3.60 percentage points higher than those of the comparison methods, namely the Hidden Markov Model (HMM), the multi-standard word segmentation model METASEG (pre-training model with META learning for Chinese word SEGmentation) and the Lexicon Enhanced BERT (LEBERT) model, respectively, verifying that the proposed method effectively improves the quality of Chinese word segmentation for texts in the electric power domain.
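The dual-lexicon matching step can be illustrated with a minimal sketch: for every character of a sentence, collect the lexicon words (from either the general or the domain lexicon) whose span covers that character, so the matched words can later be fused into the character's BERT representation. The lexicon contents, function name, and maximum word length below are hypothetical examples, not the paper's actual lexicons or fusion mechanism.

```python
def match_lexicon_words(sentence, general_lexicon, domain_lexicon, max_len=5):
    """Toy dual-lexicon matcher: per-character lists of covering words.

    Every substring of the sentence up to max_len characters is checked
    against the union of the two lexicons; each match is recorded for
    every character position it covers.
    """
    lexicon = set(general_lexicon) | set(domain_lexicon)
    matches = [[] for _ in sentence]
    for i in range(len(sentence)):
        for j in range(i + 1, min(i + max_len, len(sentence)) + 1):
            word = sentence[i:j]
            if word in lexicon:
                for k in range(i, j):
                    matches[k].append(word)
    return matches
```

For example, with a domain lexicon containing "变压器" (transformer) and a general lexicon containing "运行" (operation), the sentence "变压器运行" yields "变压器" attached to its first three characters and "运行" to the last two.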
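The role of DEEPNORM in stabilizing deep stacks can be sketched as a modified residual connection: the residual branch is up-weighted by a constant alpha before layer normalization, with alpha growing with the number of layers N (for an encoder-only model, alpha = (2N)^(1/4)). This is a generic DeepNorm-style sketch under that published formula, not the paper's exact implementation.

```python
import numpy as np

def deepnorm_residual(x, sublayer_out, num_layers):
    """One DeepNorm residual connection: LayerNorm(alpha * x + G(x)).

    For an encoder-only model such as BERT, DeepNorm uses
    alpha = (2 * N) ** 0.25, where N is the total number of layers,
    so the residual stream dominates and training stays stable at depth.
    """
    alpha = (2 * num_layers) ** 0.25
    h = alpha * x + sublayer_out
    # Standard layer normalization over the feature dimension.
    mean = h.mean(axis=-1, keepdims=True)
    std = h.std(axis=-1, keepdims=True)
    return (h - mean) / (std + 1e-6)
```

With num_layers = 40, alpha is about 2.99, so the identity path is roughly three times stronger than the sublayer output before normalization.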
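The ProbSparse idea (from the Informer line of work) can be sketched as follows: only the top u = c * ln(L) queries, ranked by a max-minus-mean sparsity score, attend normally, while the remaining "lazy" queries fall back to the mean of the values. The sampling factor c is exactly the quantity the abstract says is tuned by PSO. For clarity this sketch computes the full score matrix instead of sampling keys, so it demonstrates the selection logic but not the speedup; it is an assumed simplification, not the paper's implementation.

```python
import numpy as np

def probsparse_attention(Q, K, V, c=5.0):
    """Simplified ProbSparse self-attention sketch.

    c is the sampling factor: u = c * ln(L) queries with the highest
    sparsity score (max minus mean of their scaled scores) get full
    softmax attention; all other queries output the mean of V.
    """
    L, d = Q.shape
    scores = Q @ K.T / np.sqrt(d)                   # (L, L) scaled scores
    sparsity = scores.max(axis=1) - scores.mean(axis=1)
    u = min(L, max(1, int(c * np.log(L))))          # number of active queries
    top = np.argsort(-sparsity)[:u]                 # most "active" queries
    out = np.tile(V.mean(axis=0), (L, 1))           # lazy queries -> mean(V)
    s = scores[top]
    w = np.exp(s - s.max(axis=1, keepdims=True))    # stable softmax
    w /= w.sum(axis=1, keepdims=True)
    out[top] = w @ V
    return out
```

A smaller c keeps fewer active queries and thus a cheaper model, while a larger c approaches full self-attention; PSO searches this trade-off for the best value.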