Abstract: EEG-based emotion recognition methods rarely integrate spatial, temporal, and frequency information. To fully exploit the rich information contained in EEG signals, this paper proposes a multi-information-fusion EEG emotion recognition method. The method uses a parallel convolutional neural network (PCNN) model that combines a two-dimensional convolutional neural network (2D-CNN) and a one-dimensional convolutional neural network (1D-CNN) to learn the spatial, temporal, and frequency features of EEG signals and classify human emotional states. The 2D-CNN mines spatial and frequency information across neighboring EEG channels, while the 1D-CNN mines temporal and frequency information of the EEG. Finally, the features extracted by the two parallel CNN branches are fused for emotion recognition. Experimental results for three-class emotion classification on the SEED dataset show that the PCNN, fusing spatial, temporal, and frequency features, achieves an overall classification accuracy of 98.04%, an improvement of 1.97% and 0.60% over the 2D-CNN extracting only spatial-frequency information and the 1D-CNN extracting only temporal-frequency information, respectively. Compared with recent similar work, the method proposed in this paper is superior for EEG emotion classification.
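To make the two-branch fusion concrete, the following is a minimal PyTorch sketch of a PCNN-style model. The abstract does not specify layer counts, kernel sizes, input encodings, or the fusion mechanism, so every hyperparameter below is an assumption: the 2D branch is fed a 5-band map over a hypothetical 9x9 electrode grid, the 1D branch a 62-channel time series (SEED uses 62 electrodes), and fusion is done by simple feature concatenation.

```python
# Illustrative sketch only; not the paper's exact architecture.
import torch
import torch.nn as nn

class PCNN(nn.Module):
    """Parallel CNN fusing spatial-frequency (2D branch) and
    temporal-frequency (1D branch) EEG features for 3-class emotion recognition."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        # 2D branch: input is a (bands x height x width) map, assumed here to be
        # the 62 SEED electrodes arranged on a 9x9 grid, one plane per frequency band.
        self.branch2d = nn.Sequential(
            nn.Conv2d(5, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),        # -> (batch, 64)
        )
        # 1D branch: input is a (channels x time) segment of band-filtered EEG.
        self.branch1d = nn.Sequential(
            nn.Conv1d(62, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),        # -> (batch, 64)
        )
        # Fusion by concatenating the two feature vectors, then a linear classifier.
        self.classifier = nn.Linear(64 + 64, n_classes)

    def forward(self, x2d: torch.Tensor, x1d: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.branch2d(x2d), self.branch1d(x1d)], dim=1)
        return self.classifier(fused)

# Shape check with dummy inputs: 5 bands on a 9x9 grid, and 62 channels
# over 200 time samples, for a batch of 8 segments.
model = PCNN()
logits = model(torch.randn(8, 5, 9, 9), torch.randn(8, 62, 200))
print(logits.shape)  # torch.Size([8, 3])
```

Concatenation followed by a linear layer is only one plausible fusion choice; weighted summation or an additional fully connected fusion layer would fit the same parallel-branch description.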