...


What are Graph Embeddings?

Learn about the numerical vector representation of graphs.

Embeddings

As we've seen, graphs collect and store complex, real-life interactions, which makes them robust data structures that are intuitive and flexible. However, graphs are non-Euclidean data structures, so they can't be used directly as input to a machine learning algorithm. This is why we need to learn graph embeddings in a low-dimensional space. Embeddings are vector representations of a graph in an $n$-dimensional space, and we can perform different types of graph analytics tasks using them.

Graph embedding example

Graph embeddings are usually vectors of float values, where the embedding size is predetermined by the user. In the example above, every node is represented by an embedding of size 7. For instance, if we have a graph with 20 nodes, the resulting embedding matrix has shape (20, 7). The numbers capture the geometric relationships in the original graph.
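To make the shapes concrete, here is a minimal sketch in Python using NumPy. The embedding values are random placeholders standing in for vectors that a real method (such as node2vec or DeepWalk) would learn from the graph's structure; only the matrix shape and the idea of comparing node vectors geometrically are the point.

```python
import numpy as np

# Hypothetical example: a graph with 20 nodes, each node embedded into a
# 7-dimensional vector space. The values below are random placeholders;
# a real embedding method would learn them from the graph's structure.
num_nodes = 20
embedding_dim = 7

rng = np.random.default_rng(seed=42)
embeddings = rng.standard_normal((num_nodes, embedding_dim)).astype(np.float32)

print(embeddings.shape)  # (20, 7) -- one row per node
print(embeddings[0])     # the 7-dimensional embedding of node 0

# Geometric relationships between nodes in the embedding space can be
# compared with, e.g., cosine similarity between two node vectors.
def cosine_similarity(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings[0], embeddings[1]))
```

In a learned embedding matrix, nodes that play similar structural roles or sit close together in the graph would end up with similar rows, so comparisons like the one above become meaningful.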

Representation learning

The process of learning graph embeddings is also called graph representation learning. Formally, this can be defined as a task to learn a mapping function $f: G \to \mathbb{R}^n$, where ...