Abstract
Knowledge graph embedding aims to learn distributed representations for entities and relations, and has proven effective in many applications. Crossover interactions --- bi-directional effects between entities and relations --- help select related information when predicting a new triple, but have not been formally discussed before.
In this paper, we propose CrossE, a novel knowledge graph embedding method that explicitly simulates crossover interactions. It not only learns one general embedding for each entity and relation, as in most previous methods, but also generates multiple triple-specific embeddings for both entities and relations, named interaction embeddings.
We evaluate the embeddings on the typical link prediction task and find that CrossE achieves state-of-the-art results on complex and more challenging datasets.
Furthermore, we evaluate the embeddings from a new perspective --- giving explanations for predicted triples, which is important for real applications.
In this work, explanations for a triple are regarded as reliable closed paths between the head and tail entities. We show experimentally that, benefiting from interaction embeddings, CrossE is more capable of generating reliable explanations to support its predictions than baseline methods.