Abstract: In recent years, knowledge representation learning has played a crucial role in intelligent recommendation, intelligent question answering, and intelligent retrieval, and has attracted widespread attention. Knowledge representation learning aims to encode semantic information as low-dimensional embeddings of entities and relations, so that knowledge can be inferred through mathematical operations on these vectors. Among the many knowledge representation learning models, TransE is considered one of the most promising because its scoring function has few parameters, low computational complexity, and high computational efficiency. However, TransE has limitations when handling complex relations beyond one-to-one. To address this problem and improve the quality of knowledge embeddings, this paper proposes TransREF, an improved knowledge representation model based on the translation model. First, entities and relations are embedded via relation-matrix projection. Second, relation neighborhoods are added on top of the original vectors to enhance the model's learning ability. During training, for entities with high semantic similarity, head and tail entities are replaced probabilistically to generate high-quality negative triples, and a five-point random method is used to select relation-neighborhood nodes. Finally, link prediction experiments are conducted on WN18, a subset of WordNet, and FB15K, a subset of Freebase, and triple classification experiments are conducted on the three public datasets WN11, FB13, and FB15K. The results show that, compared with TransE and TransH, TransREF achieves better performance on the MeanRank, Hits@10, and ACC metrics, demonstrating its effectiveness.
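As background for the translation principle the abstract builds on, the following is a minimal sketch of the standard TransE scoring function (not the paper's TransREF model): a relation is modeled as a translation vector, so for a plausible triple (h, r, t) we expect h + r ≈ t, and the score is the distance ||h + r − t||. The embedding values below are hypothetical, for illustration only.

```python
import numpy as np

def transe_score(h, r, t, norm=1):
    """TransE score: lower means the triple (h, r, t) is more plausible."""
    return np.linalg.norm(h + r - t, ord=norm)

# Toy 3-dimensional embeddings (hypothetical values).
h = np.array([0.1, 0.2, 0.3])
r = np.array([0.4, 0.1, -0.1])
t_true = np.array([0.5, 0.3, 0.2])    # satisfies h + r ≈ t
t_false = np.array([1.0, -1.0, 1.0])  # corrupted tail (negative example)

# A true triple scores (near) zero; a corrupted one scores higher.
assert transe_score(h, r, t_true) < transe_score(h, r, t_false)
```

Because a single translation vector per relation cannot fit one-to-many or many-to-one patterns, models such as TransH and the paper's TransREF project entities before applying the translation.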