
In the context of deep learning, an embedding is a representation of data in a space different from the original one. Typically, the embedding is an n-dimensional vector of real numbers that preserves as much information as possible from the original data. Embeddings are widely used in Natural Language Processing, where they transform textual data into a numerical format that a model can process more easily. Additionally, given two embedded vectors, their similarity can be quantified with a distance measure such as the Euclidean distance.
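The sketch below illustrates the idea under some assumptions: it uses PyTorch, an arbitrary vocabulary size and embedding dimensionality, and hypothetical token ids. An embedding layer maps integer token ids to dense n-dimensional vectors, and the Euclidean distance between two embedded vectors can then be computed directly.

```python
import torch
import torch.nn as nn

vocab_size = 10_000   # assumed vocabulary size
embed_dim = 300       # assumed embedding dimensionality (n)

# Embedding layer: a lookup table mapping each token id to a 300-dimensional vector
embedding = nn.Embedding(vocab_size, embed_dim)

# Two hypothetical token ids, e.g. produced by a tokenizer
token_a = torch.tensor([42])
token_b = torch.tensor([1337])

vec_a = embedding(token_a)  # shape: (1, 300)
vec_b = embedding(token_b)  # shape: (1, 300)

# Euclidean distance between the two embedded vectors
distance = torch.dist(vec_a, vec_b, p=2)
print(distance.item())
```

Note that a freshly initialized embedding layer holds random vectors; the distances only become meaningful after the embedding weights have been learned during training (or loaded from a pre-trained model).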