Electric power resources are essential for the efficient and orderly development of society, and accurate power load forecasting is a key driver of the low-carbon upgrade of power systems. Traditional forecasting methods often struggle to capture long-term dependencies, and extracting complex nonlinear features from the data remains difficult, so these methods cannot meet the accuracy demands of modern power systems. Moreover, current deep learning-based forecasting methods cannot simulate multi-granularity power load data. To address these challenges, this paper presents GPT4PLTS, a Generative Pre-trained Transformer model designed for power data simulation and fine-grained power load forecasting. The model builds on the Transformer architecture, adopting the first six layers of the GPT decoder structure. It uses a multi-head attention mechanism to extract temporal features and a time alignment layer to preserve the ordering of the time-series data, capturing both short-term and long-term dependencies. Extensive experiments on load observations from 2000 enterprises demonstrate that GPT4PLTS achieves high accuracy in both data simulation and forecasting across different time granularities, excelling particularly in short- and medium-term prediction. Future research could focus on optimizing the model structure to enhance the model's generalization ability.
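To make the architecture concrete, the following is a minimal numpy sketch of the kind of six-layer GPT-style decoder stack with causal multi-head self-attention described above. It is an illustration under stated assumptions, not the paper's implementation: the weights are random, the input is a toy embedded load window, and the paper's time alignment layer and embedding details are omitted because the abstract does not specify them.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    mu = x.mean(-1, keepdims=True)
    return (x - mu) / np.sqrt(x.var(-1, keepdims=True) + eps)

def causal_self_attention(x, n_heads, rng):
    """Multi-head self-attention with a causal mask (random weights)."""
    T, d = x.shape
    dh = d // n_heads
    Wq, Wk, Wv, Wo = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(4))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # split into heads: (n_heads, T, dh)
    def heads(m):
        return m.reshape(T, n_heads, dh).transpose(1, 0, 2)
    q, k, v = heads(q), heads(k), heads(v)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(dh)      # (H, T, T)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)     # future positions
    scores = np.where(mask, -1e9, scores)                # causal masking
    out = softmax(scores) @ v                            # (H, T, dh)
    return out.transpose(1, 0, 2).reshape(T, d) @ Wo

def decoder_stack(x, n_layers=6, n_heads=4, seed=0):
    """Six pre-norm decoder blocks: attention + ReLU MLP, with residuals."""
    rng = np.random.default_rng(seed)
    d = x.shape[1]
    for _ in range(n_layers):
        x = x + causal_self_attention(layer_norm(x), n_heads, rng)
        W1 = rng.standard_normal((d, 4 * d)) / np.sqrt(d)
        W2 = rng.standard_normal((4 * d, d)) / np.sqrt(4 * d)
        x = x + np.maximum(layer_norm(x) @ W1, 0) @ W2   # position-wise MLP
    return x

# toy input: a 24-step hourly load window embedded into a 32-dim space
T, d = 24, 32
rng = np.random.default_rng(1)
emb = rng.standard_normal((T, d))
y = decoder_stack(emb)
print(y.shape)  # (24, 32): one hidden state per time step
```

Because of the causal mask, the hidden state at each time step depends only on earlier steps, which is what lets such a decoder both forecast the next load value and generate (simulate) load sequences autoregressively.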