Word Embeddings: Word2Vec, GloVe

Lesson 6/9 | Study Time: 120 Min

  • Topics Covered:

    • Understanding Word Embeddings: Introduction to word embeddings as dense vectors that represent words in a continuous vector space, where semantically similar words lie close together.

    • Word2Vec Model: Deep dive into Word2Vec, covering its two training architectures, skip-gram and continuous bag-of-words (CBOW) (a minimal Gensim sketch follows this list).

    • GloVe Model: Understanding Global Vectors for Word Representation (GloVe) and how it differs from Word2Vec: GloVe fits vectors to global word co-occurrence counts, while Word2Vec learns from local context windows (its objective is shown after this list).

    • Using Pre-trained Embeddings: How to leverage pre-trained embeddings (e.g., GloVe, FastText) to improve NLP model performance.

    • Custom Embeddings: Training custom word embeddings on domain-specific corpora.
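
A minimal sketch of the Word2Vec topic above, using Gensim's Word2Vec class (gensim 4.x); the toy corpus and every hyperparameter below are illustrative assumptions rather than values prescribed by the lesson:

```python
# A minimal Word2Vec sketch with Gensim; toy corpus and hyperparameters
# are illustrative assumptions.
from gensim.models import Word2Vec

# Each training document is a list of pre-tokenized, lowercased words.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "chased", "the", "cat"],
    ["dogs", "and", "cats", "are", "common", "pets"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the embedding space
    window=5,         # max distance between center and context words
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram (predict context from center word);
                      # 0 = CBOW (predict center word from its context)
    epochs=50,
)

# Explore the learned embeddings.
print(model.wv["cat"].shape)                 # (100,)
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors by cosine
```

Switching between the two architectures is a single flag (`sg`); skip-gram tends to represent rare words better, while CBOW trains faster.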
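
To make the GloVe-versus-Word2Vec contrast concrete, here is the weighted least-squares objective from the GloVe paper (Pennington et al., 2014):

$$
J = \sum_{i,j=1}^{V} f(X_{ij})\left(w_i^{\top}\tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij}\right)^2
$$

Here $X_{ij}$ counts how often word $j$ occurs in the context of word $i$ over the whole corpus, and $f$ is a weighting function that caps the influence of very frequent pairs; Word2Vec never materializes this global co-occurrence matrix and instead streams over local context windows.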

  • Practical Session:
    • Implementing Word2Vec using Gensim and exploring the embeddings.
    • Using pre-trained GloVe embeddings in a text classification model to improve accuracy (a sketch of this pipeline appears below).
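
A minimal sketch of the second exercise, under stated assumptions: the file glove.6B.100d.txt (from https://nlp.stanford.edu/projects/glove/) is assumed to be downloaded locally, the toy documents are placeholders, and averaging word vectors is one simple baseline rather than the course's prescribed pipeline:

```python
# A hedged sketch of GloVe-based text classification: each document's
# features are the average of its words' GloVe vectors, fed to logistic
# regression. The file path and toy data are placeholder assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

def load_glove(path):
    """Parse a GloVe .txt file into a {word: vector} dict."""
    embeddings = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.rstrip().split(" ")
            embeddings[word] = np.asarray(values, dtype="float32")
    return embeddings

def doc_vector(tokens, embeddings, dim=100):
    """Average the vectors of in-vocabulary tokens (zeros if none match)."""
    vecs = [embeddings[t] for t in tokens if t in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim, dtype="float32")

glove = load_glove("glove.6B.100d.txt")  # assumed local path

# Toy labeled data (1 = positive, 0 = negative); replace with a real corpus.
train_docs = [["great", "movie"], ["terrible", "plot"],
              ["loved", "it"], ["awful", "acting"]]
train_labels = [1, 0, 1, 0]

X = np.stack([doc_vector(d, glove) for d in train_docs])
clf = LogisticRegression().fit(X, train_labels)
print(clf.predict([doc_vector(["wonderful", "film"], glove)]))
```

Averaging discards word order but is a strong, cheap baseline; initializing a trainable embedding layer from the same GloVe matrix is a common next step when sequence order matters.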

  • Learning Outcome: Students will gain hands-on experience in creating and using word embeddings, a critical step in improving the performance of NLP models.