Understanding Word Embeddings: Introduction to the concept of word embeddings as a way to represent words in a continuous vector space.
Word2Vec Model: Deep dive into Word2Vec, explaining the skip-gram and continuous bag-of-words (CBOW) models (their training objectives are sketched after this list).
GloVe Model: Understanding Global Vectors for Word Representation (GloVe) and how its global co-occurrence objective differs from Word2Vec's local window prediction (see the contrast after this list).
Using Pre-trained Embeddings: How to leverage pre-trained embeddings (e.g., GloVe, FastText) to improve NLP model performance (a loading sketch follows this list).
Custom Embeddings: Training custom word embeddings on domain-specific corpora.
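For reference, the contrast between the two models comes down to their training objectives, as given in the original papers (Mikolov et al., 2013; Pennington et al., 2014). Skip-gram maximizes the average log-probability of the context words within a window of size c around each target word:

    \frac{1}{T} \sum_{t=1}^{T} \sum_{-c \le j \le c,\; j \ne 0} \log p(w_{t+j} \mid w_t)

GloVe instead minimizes a weighted least-squares loss over the global co-occurrence matrix X, where X_{ij} counts how often word j appears in the context of word i and the weighting function f caps the influence of very frequent pairs:

    J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2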
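As a quick taste of what pre-trained embeddings provide, here is a minimal sketch using Gensim's downloader API; the model name ("glove-wiki-gigaword-50") and the query word are illustrative choices, and the vectors are downloaded on first use.

    # Minimal sketch: load pre-trained GloVe vectors through Gensim's
    # downloader API (model name and query word are illustrative).
    import gensim.downloader as api

    glove = api.load("glove-wiki-gigaword-50")   # 50-dimensional GloVe vectors

    print(glove["king"][:5])                     # a word is just a dense vector
    print(glove.most_similar("king", topn=3))    # neighbours in the vector space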
Practical Session:
Implementing Word2Vec using Gensim and exploring the learned embeddings (a starter sketch follows this session outline).
Using pre-trained GloVe embeddings as features in a text classification model to improve accuracy (see the second sketch below).
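A starting point for the first exercise might look like the sketch below (Gensim 4.x API). The toy corpus is a placeholder; substituting a tokenized domain-specific corpus turns the same call into the custom-embedding training described above. The sg flag switches between the two architectures: sg=1 trains skip-gram, sg=0 trains CBOW.

    # Sketch for the Word2Vec exercise (Gensim 4.x); the toy corpus is a
    # placeholder for a real tokenized corpus.
    from gensim.models import Word2Vec

    sentences = [
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "chased", "the", "cat"],
        ["dogs", "and", "cats", "are", "pets"],
    ]

    # sg=1 selects skip-gram; sg=0 would select CBOW.
    model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

    print(model.wv["cat"].shape)             # (50,) dense vector per word
    print(model.wv.most_similar("cat"))      # explore the learned space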
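For the second exercise, one simple way to use GloVe in a classifier is to average each document's word vectors and feed the result to a linear model; the four labeled examples below are hypothetical placeholders for a real corpus.

    # Sketch: averaged GloVe vectors as features for a linear classifier.
    # The tiny labeled dataset is a hypothetical placeholder.
    import numpy as np
    import gensim.downloader as api
    from sklearn.linear_model import LogisticRegression

    glove = api.load("glove-wiki-gigaword-50")

    texts = [["great", "fun", "movie"], ["wonderful", "acting"],
             ["terrible", "boring", "film"], ["awful", "plot"]]
    labels = [1, 1, 0, 0]                    # 1 = positive, 0 = negative

    def doc_vector(tokens):
        # Average the vectors of in-vocabulary tokens; zeros if none match.
        vecs = [glove[t] for t in tokens if t in glove]
        return np.mean(vecs, axis=0) if vecs else np.zeros(glove.vector_size)

    X = np.stack([doc_vector(t) for t in texts])
    clf = LogisticRegression().fit(X, labels)
    print(clf.predict(X))                    # sanity check on the toy data

Averaging is a deliberately simple baseline; the session could equally load the same vectors into a neural model's embedding layer.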
Learning Outcome: Students will gain hands-on experience in creating and using word embeddings, a critical step in improving the performance of NLP models.