Word vectors (also called word embeddings) are mathematical representations of words in a high-dimensional vector space. Each word is described by a numeric vector that encodes its meaning and its relationship to other words.
These vectors are learned with machine learning, typically with models such as Word2Vec or GloVe, or as contextual embeddings inside modern language models like GPT.
The goal is to capture semantic similarity numerically: words with similar meanings end up close together in the vector space. This lets AI systems not only read text but also interpret and process language in context.
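As a minimal sketch of what "close together in vector space" means, the following Python snippet compares a few hand-made toy vectors using cosine similarity. The vector values and dimensionality are purely illustrative assumptions, not taken from any real trained model.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: close to 1.0 = similar direction, close to 0.0 = unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional word vectors (illustrative values only).
# Real embeddings typically have tens to hundreds of dimensions.
vectors = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

# Semantically related words yield a high similarity score...
print(cosine_similarity(vectors["king"], vectors["queen"]))  # ~0.99
# ...while unrelated words yield a much lower one.
print(cosine_similarity(vectors["king"], vectors["apple"]))  # ~0.24
```

In a real system the vectors would come from a pretrained embedding model rather than being written by hand, but the distance comparison works the same way.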