- Bellevue, Washington
- http://shaden.io/
Stars
- Learn how to design large-scale systems. Prep for the system design interview. Includes Anki flashcards.
- Tensors and dynamic neural networks in Python with strong GPU acceleration
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- Ray is an AI compute engine. Ray consists of a core distributed runtime and a set of AI libraries for accelerating ML workloads.
- Pretrain and finetune any AI model of any size on 1 to 10,000+ GPUs with zero code changes.
- Data validation using Python type hints
- Large-scale self-supervised pre-training across tasks, languages, and modalities
- Best practices on recommendation systems
- Ongoing research training transformer models at scale
- Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet.
- Python package built to ease deep learning on graphs, on top of existing DL frameworks.
- A PyTorch extension: tools for easy mixed precision and distributed training in PyTorch
- A library to generate LaTeX expressions from Python code.
- An implementation of model-parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries
- Example models using DeepSpeed
- Aim 💫: an easy-to-use and supercharged open-source experiment tracker.
- Web interface for browsing, searching, and filtering recent arXiv submissions
- A flexible package manager that supports multiple versions, configurations, platforms, and compilers.
- 16-bit CPU for Excel, and related files
- Easily turn large sets of image URLs into an image dataset. Can download, resize, and package 100M URLs in 20h on one machine.
- Fast Python collaborative filtering for implicit feedback datasets
- Simple, safe way to store and distribute tensors
- PyTorch domain library for recommendation systems