# Vector Embeddings
- [Word embedding | Wikipedia](https://en.wikipedia.org/wiki/Word_embedding)
- Any text can have an embedding, not just single words, e.g. snippets in [[rag|RAG]].
## Intro
- Recommendation Systems, NLP, Computer Vision, Gen AI, LLMs, etc. are all
  based on vector embeddings.
- Items are mapped to vectors (points) in an embedding space.
- Recommendations are made based on the distance between vectors, e.g. _cosine
  similarity_ (cosine distance = 1 - similarity); a code sketch follows the
  formula below.
$$
\frac{A \cdot B}{\Vert A \Vert\,\Vert B \Vert} =
\frac{\displaystyle\sum_{i=1}^{n} A_i B_i}
     {\sqrt{\displaystyle\sum_{i=1}^{n} A_i^2}\,
      \sqrt{\displaystyle\sum_{i=1}^{n} B_i^2}}
$$
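A minimal sketch of the formula above, assuming NumPy; the vectors are toy embeddings made up for illustration:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Dot product of a and b divided by the product of their norms."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy 4-dimensional "embeddings" (made up for illustration).
a = np.array([0.2, 0.9, 0.1, 0.4])
b = np.array([0.25, 0.8, 0.05, 0.5])
c = np.array([-0.7, 0.1, 0.9, -0.3])

print(cosine_similarity(a, b))      # similar direction   -> close to 1
print(cosine_similarity(a, c))      # different direction -> lower, can be negative
print(1 - cosine_similarity(a, b))  # cosine *distance*
```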
- Examples of directions (relationships) captured by the embedding space:
  male-female, verb tense, country-capital (e.g. king - man + woman ≈ queen;
  see the GloVe sketch at the end of this section)
- _Latent Space_, aka _Embedding Space_
- Embedding Models (word/text and image)
- Word2Vec
- GloVe
- BERT
- GPT
- VGGNet (image)
- GoogLeNet (image)
- [[nn|Neural Networks]] are used to obtain embeddings; see the
  sentence-embedding sketch below
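The "directions" bullet above can be checked with pretrained word vectors. A hedged sketch, assuming the gensim library and its downloadable `glove-wiki-gigaword-50` vectors (one concrete choice; any pretrained word-embedding model would do):

```python
import gensim.downloader as api

# Downloads a small pretrained GloVe model on first use (lowercase vocabulary).
glove = api.load("glove-wiki-gigaword-50")

# Male-female direction: king - man + woman ≈ queen
print(glove.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# Country-capital direction: paris - france + germany ≈ berlin
print(glove.most_similar(positive=["paris", "germany"], negative=["france"], topn=3))
```

And a sketch of using a neural network to embed whole text snippets (e.g. for [[rag|RAG]]), assuming the sentence-transformers library with `all-MiniLM-L6-v2` as one possible model:

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed model; others work similarly

snippets = [
    "How do I reset my password?",
    "I forgot my login credentials.",
    "Best hiking trails near Seattle.",
]
embeddings = model.encode(snippets)  # one vector per snippet

def cos(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(embeddings[0], embeddings[1]))  # related snippets  -> higher similarity
print(cos(embeddings[0], embeddings[2]))  # unrelated snippet -> lower similarity
```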