A curated list of pretrained sentence and word embedding models
Updated Apr 23, 2021 - Python
Using pre-trained word embeddings (fastText, Word2Vec)
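Pre-trained fastText and Word2Vec vectors are commonly distributed in a plain-text `.vec` format: a header line `num_words dim`, then one `word v1 ... vD` line per word. A minimal sketch of loading that format and comparing words by cosine similarity (the tiny inline "file" here is made-up example data, not real embeddings):

```python
# Minimal loader for the fastText/Word2Vec text (.vec) format.
# VEC_TEXT is a made-up 3-word, 4-dimensional example standing in for a file.
import io

import numpy as np

VEC_TEXT = """3 4
king 0.1 0.2 0.3 0.4
queen 0.2 0.2 0.3 0.5
apple 0.9 0.1 0.0 0.2
"""

def load_vec(handle):
    """Parse a .vec-style stream into a {word: np.ndarray} dict."""
    n_words, dim = map(int, handle.readline().split())
    vectors = {}
    for line in handle:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    # sanity-check the parsed table against the header
    assert len(vectors) == n_words and all(v.size == dim for v in vectors.values())
    return vectors

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

emb = load_vec(io.StringIO(VEC_TEXT))
# In this toy data, "king" is closer to "queen" than to "apple".
sim_kq = cosine(emb["king"], emb["queen"])
sim_ka = cosine(emb["king"], emb["apple"])
```

In practice the same parsing applies to real downloads such as the fastText `.vec` releases, just with a file handle instead of `io.StringIO`.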
Code for "Effective Dimensionality Reduction for Word Embeddings".
Dict2vec is a framework to learn word embeddings using lexical dictionaries.
Persian word embedding (word representation and embedding for Persian-language text)
How many news articles to extract, and by what method, when using stock-market news together with past stock prices as features for stock-price prediction
Bi-Directional Attention Flow for Machine Comprehension
An implementation, extension, and review of the paper "Sense2Vec - A Fast and Accurate Method For Word Sense Disambiguation In Neural Word Embeddings" by Andrew Trask, Phil Michalak, and John Liu.
Wrapper around Gensim word2vec with t-SNE visualization
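A minimal sketch of the visualization step such a wrapper performs: projecting word vectors to 2-D with scikit-learn's t-SNE. Random vectors stand in for trained Gensim embeddings so the example stays self-contained, and the scatter plot itself is omitted:

```python
# Project word vectors to 2-D with t-SNE (plotting step omitted).
# The random matrix is a stand-in for a trained model's word vectors.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
words = ["king", "queen", "man", "woman", "apple", "pear", "car", "bus"]
vectors = rng.normal(size=(len(words), 50))  # stand-in for real embeddings

# perplexity must be smaller than the number of points
coords = TSNE(n_components=2, perplexity=3, init="random",
              random_state=0).fit_transform(vectors)

for word, (x, y) in zip(words, coords):
    print(f"{word}: ({x:.2f}, {y:.2f})")  # these pairs would label a scatter plot
```

With real embeddings, semantically related words ("king"/"queen", "apple"/"pear") tend to land near each other in the 2-D projection.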
Final project for an Information Retrieval course: an implementation of a search engine
Showcase of Natural Language Processing (NLP): sentiment analysis of survey text
2021 Ajou University Spring SW capstone design - FindU NLP (gold prize winner, 2021 College Student Paper Contest of DCS)
Explore text classification with Logistic Regression and Naive Bayes models, implemented from scratch, comparing feature-engineering techniques such as Bag-of-Words, TF-IDF, and word embeddings for accurate labeling
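A hedged sketch of that comparison using scikit-learn rather than a from-scratch implementation, on a made-up toy sentiment dataset: Bag-of-Words and TF-IDF features, each paired with Logistic Regression and Multinomial Naive Bayes:

```python
# Compare BoW vs. TF-IDF features under two classifiers.
# The six labeled sentences are an invented toy dataset.
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["great movie loved it", "wonderful acting great fun",
         "terrible plot boring film", "awful waste of time",
         "loved the wonderful story", "boring and terrible pacing"]
labels = ["pos", "pos", "neg", "neg", "pos", "neg"]

results = {}
for feat_name, vectorizer in [("bow", CountVectorizer), ("tfidf", TfidfVectorizer)]:
    for clf_name, clf in [("logreg", LogisticRegression(max_iter=1000)),
                          ("nb", MultinomialNB())]:
        model = make_pipeline(vectorizer(), clf)
        model.fit(texts, labels)
        # training accuracy only -- a real comparison would use held-out data
        results[(feat_name, clf_name)] = model.score(texts, labels)
```

Word-embedding features would slot into the same loop by replacing the vectorizer with an averaged-embedding transformer.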
In this project, the authors use a contextual Word2Vec model to handle out-of-vocabulary (OOV) words. OOV candidates are extracted using left-right entropy and point information entropy; Word2Vec with CBOW (continuous bag of words) is then used to construct the word-vector space and capture each word's contextual information.
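The left-right (branching) entropy idea mentioned above can be sketched in a few lines: a candidate string that occurs with many distinct characters on each side is a plausible standalone word, while one locked into a single context is likely a fragment. A minimal character-level version on an invented toy corpus (not the project's actual extraction code):

```python
# Branching entropy of the characters adjacent to a candidate substring.
# High entropy on both sides suggests the candidate is a standalone word.
import math
from collections import Counter

def branching_entropy(corpus, candidate, side):
    """Shannon entropy of the left or right neighbor characters of `candidate`."""
    neighbors = Counter()
    start = corpus.find(candidate)
    while start != -1:
        idx = start - 1 if side == "left" else start + len(candidate)
        if 0 <= idx < len(corpus):
            neighbors[corpus[idx]] += 1
        start = corpus.find(candidate, start + 1)
    total = sum(neighbors.values())
    return -sum(c / total * math.log2(c / total) for c in neighbors.values())

# Toy corpus: "X" appears with four distinct left and right neighbors,
# so both entropies are log2(4) = 2 bits.
corpus = "abXcd efXgh ijXkl mnXop"
left_h = branching_entropy(corpus, "X", "left")
right_h = branching_entropy(corpus, "X", "right")
```

In a real pipeline the same statistic is computed over a large corpus for every candidate n-gram, and candidates scoring high on both sides are promoted to the vocabulary before training Word2Vec.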
Bengali word embedding using Polygot2
A practice project in text sentiment classification
Made in 8 hours during HackUmassXIII. Trained a model with supervised learning and word embeddings to detect phishing emails. The React front end uses the useEffect and useState hooks to take user-entered emails and display the model's verdict.