Word Vectors

Word vectors (also called word embeddings) are mathematical representations of words in a high-dimensional space. Each word is represented by a numeric vector that encodes aspects of its meaning and its relationships to other words.
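
As a minimal illustration (the three-dimensional values below are made up; real models use hundreds of dimensions), such a representation is simply a mapping from words to arrays of numbers:

```python
import numpy as np

# Toy embedding table: each word maps to a fixed-length numeric vector.
# The values are invented for illustration, not taken from a real model.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

print(embeddings["king"])  # -> [0.8 0.3 0.1]
```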

These vectors are learned with machine learning, typically using dedicated embedding models such as Word2Vec or GloVe, or as internal representations inside modern language models such as GPT.
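
For example, a small Word2Vec model can be trained with the gensim library (a sketch assuming gensim is installed; the corpus here is a toy example):

```python
from gensim.models import Word2Vec

# Tiny corpus of tokenized sentences (toy data; real corpora are far larger)
sentences = [
    ["the", "cat", "sits", "on", "the", "mat"],
    ["the", "dog", "lies", "on", "the", "mat"],
    ["cats", "and", "dogs", "are", "animals"],
]

# Train a small model: vector_size sets the embedding dimensionality
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, epochs=50)

print(model.wv["cat"])                       # the learned 50-dimensional vector
print(model.wv.most_similar("cat", topn=2))  # nearest neighbors in vector space
```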

The goal is to capture semantic similarity numerically: words with similar meanings lie close together in vector space. This lets AI systems not only read language but also interpret and process it in context.
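
Closeness is usually measured with cosine similarity, the cosine of the angle between two vectors. A minimal sketch using the made-up vectors from above:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

king  = np.array([0.8, 0.3, 0.1])
queen = np.array([0.7, 0.4, 0.1])
apple = np.array([0.1, 0.2, 0.9])

print(cosine_similarity(king, queen))  # high (~0.99): related meanings
print(cosine_similarity(king, apple))  # low  (~0.29): unrelated meanings
```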

  • Mathematical representation of words as numerical vectors
  • Generated using machine learning (e.g. Word2Vec, GloVe)
  • Semantically similar words are close in vector space
  • Used in search, clustering, and AI-driven text processing (see the search sketch after this list)
  • Core component of modern language models
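
To illustrate the search use case, here is a brute-force semantic-search sketch over a toy embedding table (the values are invented; a real system would load pretrained vectors, e.g. from GloVe files):

```python
import numpy as np

# Toy embedding table with invented values
vocab = {
    "cat":   np.array([0.9, 0.1, 0.2]),
    "dog":   np.array([0.8, 0.2, 0.3]),
    "car":   np.array([0.1, 0.9, 0.4]),
    "truck": np.array([0.2, 0.8, 0.5]),
}

def search(query_vec: np.ndarray, k: int = 2):
    """Rank all words by cosine similarity to the query vector."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(word, cos(query_vec, vec)) for word, vec in vocab.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

print(search(vocab["cat"]))  # -> [('cat', 1.0), ('dog', ~0.98)]
```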