Basic Difference between Word Embeddings and Word Vectors

  • The first and most prominent difference is that word vectors rely on one-hot encoding (or simple counts) to represent words, whereas word embeddings are produced by learned models such as word2vec, which map each word to a dense vector.
  • For word vectors, the vector size is directly proportional to the vocabulary size, which makes them high-dimensional; word embeddings, on the other hand, have a fixed, low-dimensional size (e.g. 100, 300, 720, etc.).
  • In most cases, word vectors encode occurrence (or co-occurrence) information for words, i.e. discrete counts. Word embeddings encode contextual and semantic information, i.e. continuous real numbers.
  • Word vectors are useful when keywords matter or an exact match is needed. Word embeddings should be preferred when context or semantic similarity has high priority (e.g. "king" and "queen" should have similar vectors).
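The contrast above can be sketched in a few lines of NumPy. The embedding values below are made-up illustrative numbers, not output from a trained model such as word2vec; the point is only that one-hot vectors of distinct words are always orthogonal, while dense embeddings can place related words close together.

```python
import numpy as np

# --- Word vectors: one-hot encoding over a toy vocabulary ---
vocab = ["king", "queen", "apple"]
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}

# --- Word embeddings: small dense vectors (hypothetical values; a real
# model such as word2vec would learn these from a corpus) ---
embedding = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# One-hot vectors are orthogonal: every distinct word pair scores 0.
print(cosine(one_hot["king"], one_hot["queen"]))

# Dense embeddings capture similarity: king/queen scores higher than king/apple.
print(cosine(embedding["king"], embedding["queen"]))
print(cosine(embedding["king"], embedding["apple"]))
```

Note also the dimensionality difference: the one-hot vectors grow with the vocabulary (here 3, but millions of words means millions of dimensions), while the embedding size stays fixed no matter how large the vocabulary gets.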




Syed Shahzaib Ali
