Machine learning resources
Online resources : dataloop, huggingface
Coding assistants : Windsurf VS Code plugin, Refact.ai, codeconvert
APIs : Anthropic API, Google GenAI, OpenAI-python (example call after this list)
LLMs : Ollama, Whisper, Gemma, spaCy (local-run sketch after this list)
Light LLMs : litgpt, pythia, TinyLlama, NanoGPT-128M
Physics : PhysicsNeMo (e.g. the Fourier neural operator Darcy-flow example, darcy_fno)
Geophysics : SPADE-Terrain-GAN, ada_multigrid_ppo
Training : Lookahead, Gymnasium (environment-loop example after this list)
Distributed training : NCCL, GLOO, MPI (init example after this list)
ML deployment : llama.cpp, PyTorch, ONNX, JAX ONNX Runtime, MLflow (ONNX export example after this list)
UI : ComfyUI
Agents : stanford-oval/storm, ottomator
Datasets : VQA, WeatherBench, COCO, FineWeb (streaming example below)
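
A minimal call with the OpenAI-python (v1) client, as a sketch for the API entries above; the model id is a placeholder and an API key is assumed to be set in the environment. The Anthropic and Google GenAI clients follow a similar request/response pattern.

```python
# Minimal chat call with the openai-python (v1) client.
# Assumes OPENAI_API_KEY is set in the environment; the model id is a placeholder.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model id
    messages=[{"role": "user", "content": "Summarize what ONNX is used for."}],
)
print(response.choices[0].message.content)
```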
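
For the local and light LLM entries (Ollama, Gemma, TinyLlama, etc.), a sketch that queries a locally running Ollama server over its REST API; the model tag is an example and the model must already be pulled.

```python
# Query a local Ollama server (default port 11434) over its REST API.
# Assumes `ollama serve` is running and the example model was pulled with `ollama pull gemma:2b`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "gemma:2b", "prompt": "Explain NCCL in one sentence.", "stream": False},
    timeout=120,
)
print(resp.json()["response"])
```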
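
For the training entry, a basic Gymnasium environment loop with a random policy, as a starting point before plugging in a learned agent or an optimizer wrapper such as Lookahead.

```python
# Basic Gymnasium loop on CartPole-v1 with a random policy.
import gymnasium as gym

env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)
for _ in range(200):
    action = env.action_space.sample()  # stand-in for a learned policy
    obs, reward, terminated, truncated, info = env.step(action)
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```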
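
For the distributed-training entry, a torch.distributed sketch that picks the NCCL backend on GPUs and falls back to Gloo on CPU; it assumes launch via torchrun.

```python
# Initialize torch.distributed: NCCL on GPUs, Gloo as CPU fallback.
# Intended to be launched with `torchrun --nproc_per_node=N this_script.py`.
import os
import torch
import torch.distributed as dist

backend = "nccl" if torch.cuda.is_available() else "gloo"
if backend == "nccl":
    torch.cuda.set_device(int(os.environ.get("LOCAL_RANK", 0)))
dist.init_process_group(backend=backend)

t = torch.ones(1, device="cuda" if backend == "nccl" else "cpu")
dist.all_reduce(t)  # sums the tensor across all ranks
print(f"rank {dist.get_rank()}/{dist.get_world_size()} ({backend}): {t.item()}")
dist.destroy_process_group()
```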
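
For the ML deployment entry, a sketch exporting a toy PyTorch model to ONNX and running it with ONNX Runtime on CPU; the model here is a stand-in, not one of the listed checkpoints.

```python
# Export a toy PyTorch model to ONNX and run it with ONNX Runtime (CPU).
import torch
import torch.nn as nn
import onnxruntime as ort

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4)).eval()
dummy = torch.randn(1, 16)
torch.onnx.export(model, dummy, "tiny.onnx", input_names=["x"], output_names=["y"])

sess = ort.InferenceSession("tiny.onnx", providers=["CPUExecutionProvider"])
(out,) = sess.run(None, {"x": dummy.numpy()})
print(out.shape)  # (1, 4)
```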
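
For the datasets entry, a sketch that streams a few FineWeb records through the Hugging Face datasets hub; the dataset id and the "text" field name are assumptions, and streaming avoids downloading the full corpus.

```python
# Stream a few FineWeb records without downloading the whole corpus.
# "HuggingFaceFW/fineweb" and the "text" field are assumed identifiers.
from datasets import load_dataset

ds = load_dataset("HuggingFaceFW/fineweb", split="train", streaming=True)
for i, record in enumerate(ds):
    print(record["text"][:80])
    if i == 2:
        break
```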
Computational costs, based on F. G. Raeini's MSc; per batch size of 128; GPU: N100, CPU: AMD 5700U.
Model | Num. params | Size (MB) | Inference time (GPU) | Train time (CPU) | Accuracy (VQA-v1/2, AOK) |
---|---|---|---|---|---|
ViLT | 82M | 470 | 2 s | 25 s | 72%, 44% |
ResAttLSTM | 80 | | 2.5 s | | 62%, 30% |
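
To reproduce timings of the kind reported in the table above, a single-sample sketch using the ViLT VQA checkpoint from transformers; note it times one example rather than a batch of 128, and the checkpoint id is an assumption about which ViLT variant was used.

```python
# Time a single ViLT VQA forward pass (one sample, not a batch of 128).
# The checkpoint id "dandelin/vilt-b32-finetuned-vqa" is an assumed example.
import time
import torch
from PIL import Image
from transformers import ViltProcessor, ViltForQuestionAnswering

processor = ViltProcessor.from_pretrained("dandelin/vilt-b32-finetuned-vqa")
model = ViltForQuestionAnswering.from_pretrained("dandelin/vilt-b32-finetuned-vqa").eval()

image = Image.new("RGB", (384, 384))  # placeholder image; use a real photo for a meaningful answer
inputs = processor(image, "What is shown in the picture?", return_tensors="pt")

start = time.perf_counter()
with torch.no_grad():
    logits = model(**inputs).logits
print(f"inference: {time.perf_counter() - start:.3f} s")
print("answer:", model.config.id2label[logits.argmax(-1).item()])
```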
Larger models:
GIT: 707MB | Qwen2-72B: 43GB | BLIP: 990MB-1.9GB | Florence-230M | LLaVA-7B: 15GB | LAVIS
VQA datasets:
Dataset | VQA-v2 | VQA-v1 | AOK-VQA | VizWiz |
---|---|---|---|---|
train, val | 443k, 214k | 214k, 121k | 17.0k, 1.14k | 200k, 40k |