
Knowledge Graph Embeddings

Explore the fundamentals of knowledge graph embeddings, including their purpose in capturing both network structure and semantics. Understand key embedding methods such as translation-based, factorization-based, and neural network-based approaches. Learn how scoring functions, loss functions, optimizers, and negative sampling strategies are used to generate meaningful embeddings for machine learning applications.

Embeddings

Knowledge graph embeddings are low-dimensional vectors that capture both the network structure and the semantics of the entities and relationships. This distinguishes them from standard graph embeddings, which focus on preserving the network structure only.

Embeddings give a graph a numerical representation, which is required to use it as input to machine learning methods. There are several ways to generate knowledge graph embeddings, and they can be broadly divided into three categories:

  • Translation-based methods

  • Factorization-based methods

  • Neural network-based methods
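As a concrete example of the first category, translation-based methods such as TransE model a triple (head, relation, tail) as a vector translation: the head embedding plus the relation embedding should land close to the tail embedding. A minimal sketch, assuming randomly initialised (untrained) embeddings and illustrative entity and relation names:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # embedding dimension (illustrative choice)

# Randomly initialised embeddings; in practice these are learned.
entity_emb = {name: rng.normal(size=dim) for name in ["Sarah", "Kate", "London"]}
relation_emb = {name: rng.normal(size=dim) for name in ["friend_of", "lives_in"]}

def transe_score(head: str, relation: str, tail: str) -> float:
    """TransE scoring function: distance between (head + relation) and tail.
    A lower distance means the model considers the triple more plausible."""
    h = entity_emb[head]
    r = relation_emb[relation]
    t = entity_emb[tail]
    return float(np.linalg.norm(h + r - t))  # L2 distance

print(transe_score("Sarah", "friend_of", "Kate"))
```

Training adjusts the embeddings so that true triples score low and corrupted triples score high; factorization-based and neural methods swap in different scoring functions over the same kind of lookup tables.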

Knowledge graph embeddings

These methods differ in the type of loss function they use and in the way they capture knowledge graph patterns.
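A loss function commonly paired with translation-based scorers is the margin ranking (hinge) loss, combined with negative sampling: a true triple is corrupted by swapping in a random entity, and the optimizer pushes the true triple's score below the corrupted one's by at least a fixed margin. A minimal sketch, assuming distance-style scores where lower is better and hypothetical example values:

```python
import numpy as np

rng = np.random.default_rng(1)

def margin_ranking_loss(pos_score: float, neg_score: float, margin: float = 1.0) -> float:
    """Hinge loss: zero once the true triple scores lower (better) than the
    corrupted triple by at least `margin`; positive otherwise."""
    return max(0.0, margin + pos_score - neg_score)

# Negative sampling: corrupt a true triple by replacing its tail with a
# random other entity, producing a (likely) false triple.
entities = ["Sarah", "Kate", "London", "Paris"]

def corrupt_tail(triple):
    head, rel, tail = triple
    candidates = [e for e in entities if e != tail]
    return (head, rel, rng.choice(candidates))

pos_score = 0.4  # distance of a true triple (hypothetical value)
neg_score = 1.1  # distance of its corrupted counterpart (hypothetical value)
print(margin_ranking_loss(pos_score, neg_score))  # 1.0 + 0.4 - 1.1 = 0.3
```

The gradient of this loss with respect to the embeddings is what an optimizer such as SGD or Adam follows during training.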

Graph patterns

Knowledge graphs exhibit several recurring relational patterns:

  • Symmetry: This is also called a reciprocal relationship. For instance, if Sarah is a friend of Kate, then Kate is also a friend of Sarah, i.e., the inverse of a triple is also true.

  • Asymmetry: This is the opposite of symmetry in that the inverse of a triple is not true. For instance, if Sarah is the mother of Kate, Kate is not the mother of Sarah.

  • Inversion: Here, two relationships ...
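Whether a scoring function can represent these patterns is a key difference between embedding methods. For example, the trilinear score used by the factorization-based model DistMult is unchanged when head and tail are swapped, so it naturally models symmetric relations but cannot distinguish asymmetric ones. A minimal numerical check with random vectors:

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 8  # small dimension, purely for illustration
h, r, t = (rng.normal(size=dim) for _ in range(3))

def distmult_score(h, r, t):
    """DistMult scoring function: elementwise trilinear product.
    Higher means more plausible."""
    return float(np.sum(h * r * t))

# Swapping head and tail leaves the score unchanged: score(h, r, t) == score(t, r, h).
print(np.isclose(distmult_score(h, r, t), distmult_score(t, r, h)))  # True
```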