Skip-Gram-Model-PyTorch

PyTorch implementation of word2vec (the skip-gram model), with visualization of the trained embeddings using TSNE!
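
For orientation, here is a minimal sketch of what a skip-gram model looks like in PyTorch. The class name, hyperparameters, and full-softmax loss below are illustrative and not necessarily what main.py uses (the repo may, for instance, use negative sampling instead):

```python
import torch
import torch.nn as nn

class SkipGramModel(nn.Module):
    """Predicts context words from a center word via a single embedding layer."""
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, embed_dim)  # these vectors are the trained embeddings
        self.out_proj = nn.Linear(embed_dim, vocab_size)     # scores over the whole vocabulary

    def forward(self, center_ids):
        # center_ids: (batch,) word indices -> (batch, vocab_size) logits over context words
        return self.out_proj(self.in_embed(center_ids))

# One toy training step on random (center, context) index pairs
model = SkipGramModel(vocab_size=5000, embed_dim=128)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

center = torch.randint(0, 5000, (32,))
context = torch.randint(0, 5000, (32,))

loss = loss_fn(model(center), context)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```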

2D representation of some of the trained word embeddings
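
A rough sketch of how such a 2D plot can be produced from an embedding matrix, assuming scikit-learn's TSNE (scikit-learn is not in the requirements list, so the repo may compute the projection differently); the matrix and vocabulary below are random stand-ins:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Stand-ins for the trained embedding matrix and its vocabulary
words = [f"word_{i}" for i in range(200)]
embeddings = np.random.randn(200, 128).astype(np.float32)

# Project the high-dimensional vectors down to 2D
coords = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(embeddings)

plt.figure(figsize=(10, 10))
plt.scatter(coords[:, 0], coords[:, 1], s=5)
for (x, y), word in zip(coords, words):
    plt.annotate(word, (x, y), fontsize=7)
plt.savefig("tsne_embeddings.png")
```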

My TensorFlow implementation of the Skip-Gram Model can be found here.

Requirements

  • torch >= 1.4
  • numpy >= 1.18
  • matplotlib
  • tqdm
  • nltk
  • gensim
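
All of these can be installed with pip, for example:

pip install torch numpy matplotlib tqdm nltk gensim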

Training

python main.py

Visualizing real-time training loss in TensorBoard

tensorboard --logdir <PATH_TO_TENSORBOARD_EVENTS_FILE>

NOTE: By default, <PATH_TO_TENSORBOARD_EVENTS_FILE> is the SUMMARY_DIR defined in config.py
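
A minimal sketch of how training loss can be written to that directory with torch.utils.tensorboard; the tag name and dummy loop below are illustrative and not taken from main.py:

```python
from torch.utils.tensorboard import SummaryWriter

SUMMARY_DIR = "runs/skipgram"  # placeholder; the repo defines the real value in config.py
writer = SummaryWriter(log_dir=SUMMARY_DIR)

for step, loss in enumerate([2.31, 2.07, 1.88]):  # dummy loss values standing in for the training loop
    writer.add_scalar("train/loss", loss, global_step=step)

writer.close()
```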

Testing

python test.py

Inference

Nearest neighbours in the trained embedding space: each column lists the words closest to the query word in the header row.

| war | india | crime | guitar | movies | desert | physics | religion | football | computer |
|-----|-------|-------|--------|--------|--------|---------|----------|----------|----------|
| invasion | provinces | will | bass | movie | shore | mathematics | judaism | baseball | digital |
| soviet | pakistan | prosecution | drum | albums | hilly | mathematical | islam | championship | computers |
| troop | mainland | accusations | solo | songs | plateau | chemistry | religions | basketball | software |
| army | asian | provoke | quartet | cartoon | basin | theoretical | religious | coach | electronic |
| ally | colonial | prosecute | vocals | animate | highlands | analysis | jewish | wrestler | interface |
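
Such neighbours can be read off the trained embedding matrix with a plain cosine-similarity lookup. A minimal sketch with placeholder data (test.py may instead rely on gensim's similarity utilities):

```python
import numpy as np

def nearest_neighbours(query, embeddings, word_to_idx, idx_to_word, k=5):
    """Return the k words whose vectors are most cosine-similar to the query word."""
    vecs = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = vecs @ vecs[word_to_idx[query]]
    best = np.argsort(-sims)[1:k + 1]  # skip the first entry: the query word itself
    return [idx_to_word[i] for i in best]

# Toy example with a random embedding matrix
idx_to_word = ["war", "invasion", "soviet", "troop", "army", "ally"]
word_to_idx = {w: i for i, w in enumerate(idx_to_word)}
embeddings = np.random.randn(len(idx_to_word), 50)
print(nearest_neighbours("war", embeddings, word_to_idx, idx_to_word, k=3))
```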

Blog-Post

Check out my blog post on word2vec here.
