py-bpemb (python/py-bpemb) Updated: 9 months, 3 weeks ago
Byte pair embeddings in 275 languages
BPEmb is a collection of pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE) and trained on Wikipedia. Its intended use is as input for neural models in natural language processing.
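A minimal usage sketch, assuming the bpemb Python package installed by this port exposes the BPEmb class with encode and embed methods as described in the upstream documentation:

    from bpemb import BPEmb

    # Load English subword embeddings (downloads the model on first use);
    # vs is the BPE vocabulary size, dim the embedding dimension.
    bpemb_en = BPEmb(lang="en", vs=10000, dim=100)

    # Segment a word into BPE subwords and look up their embeddings.
    subwords = bpemb_en.encode("Stratford")   # e.g. ['▁strat', 'ford']
    vectors = bpemb_en.embed("Stratford")     # numpy array, one row per subword
    print(subwords, vectors.shape)

The resulting subword vectors can then be fed to a downstream neural model in place of word-level embeddings.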
Version: 0.3.5  License: MIT  GitHub