In the task of Temporal Knowledge Graph Question Answering (TKGQA), capturing and exploiting the implicit temporal information in questions to support complex reasoning remains a challenge for models. To address this problem, a Graph Attention mechanism-integrated Complex Temporal knowledge graph Reasoning question answering (GACTR) model was proposed. The model was pretrained on a temporal Knowledge Base (KB) in the form of quadruples, and a Graph Attention neTwork (GAT) was introduced to effectively capture the implicit temporal information in the question. The relation representation produced by a Robustly optimized Bidirectional Encoder Representations from Transformers pretraining approach (RoBERTa) encoder was integrated to enhance the temporal relation representation of the question. This representation was then combined with the pretrained Temporal Knowledge Graph (TKG) embedding, and the final prediction was the entity or timestamp with the highest score. On the largest benchmark dataset, CRONQUESTIONS, compared with the baseline model Knowledge Graph Question Answering on CRONQUESTIONS (CRONKGQA), the GACTR model achieved improvements of 34.6 and 13.2 percentage points on complex questions and time-type answers, respectively; compared with the Temporal Question Reasoning (TempoQR) model, the improvements were 8.3 and 2.8 percentage points, respectively.
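To make the pipeline concrete, the following is a minimal sketch of the scoring flow the abstract describes: a GAT layer aggregates a question-relevant subgraph, the pooled graph context is fused with a relation representation (standing in for the RoBERTa output), and the fused question vector is scored against pretrained entity and timestamp embeddings. All module names, dimensions, and the fusion layer are illustrative assumptions, not the authors' exact architecture; the TKG embeddings and RoBERTa encoder are replaced by randomly initialized stand-ins so the example is self-contained.

```python
# Illustrative sketch only: GACTR's exact architecture is not reproduced here.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGATLayer(nn.Module):
    """Single-head graph attention layer in the style of Velickovic et al. (2018)."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, h, adj):
        # h: (N, dim) node features; adj: (N, N) 0/1 adjacency with self-loops
        Wh = self.W(h)
        N = Wh.size(0)
        # attention logits e_ij = LeakyReLU(a([Wh_i || Wh_j]))
        hi = Wh.unsqueeze(1).expand(N, N, -1)
        hj = Wh.unsqueeze(0).expand(N, N, -1)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float('-inf'))
        alpha = torch.softmax(e, dim=-1)       # attention over neighbors
        return F.elu(alpha @ Wh)

class GACTRSketch(nn.Module):
    def __init__(self, n_entities, n_timestamps, dim):
        super().__init__()
        # stand-ins for the pretrained TKG embeddings (entities and timestamps)
        self.ent = nn.Embedding(n_entities, dim)
        self.ts = nn.Embedding(n_timestamps, dim)
        self.gat = SimpleGATLayer(dim)
        self.fuse = nn.Linear(2 * dim, dim)    # hypothetical fusion layer

    def forward(self, node_feats, adj, rel_repr):
        # node_feats: features of a question-relevant subgraph
        # rel_repr: relation representation (placeholder for a RoBERTa encoding)
        g = self.gat(node_feats, adj).mean(dim=0)        # pooled graph context
        q = self.fuse(torch.cat([g, rel_repr], dim=-1))  # fused question vector
        # score every candidate entity and timestamp against the question
        ent_scores = self.ent.weight @ q
        ts_scores = self.ts.weight @ q
        return ent_scores, ts_scores

# Toy usage: the predicted answer is the entity or timestamp with the highest score.
model = GACTRSketch(n_entities=100, n_timestamps=50, dim=64)
nodes = torch.randn(6, 64)      # 6 subgraph nodes
adj = torch.ones(6, 6)          # fully connected toy graph (includes self-loops)
rel = torch.randn(64)           # placeholder for the RoBERTa relation output
ent_s, ts_s = model(nodes, adj, rel)
scores = torch.cat([ent_s, ts_s])
print("predicted answer index:", scores.argmax().item())
```

The key design point mirrored here is that entities and timestamps are scored in a shared space, so a single argmax over the concatenated scores selects either an entity or a timestamp as the answer.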