Kazuki Fujii's Projects
🚀 A simple way to train and use PyTorch models with multi-GPU, TPU, mixed-precision
A competitive programming practice repository for coding interviews.
Binary search
Algorithms as recurrence relations: the Euclidean algorithm
A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
2021-3Q Assembly Language (Tokyo Tech)
C++ library for competitive programming (especially AtCoder)
Several types of attention modules written in PyTorch
A curated list of software- and architecture-related design patterns.
Awesome-LLM: a curated list of Large Language Models
Repository for preparing NLP startup course materials for new B4 (fourth-year undergraduate) students.
Support page for the book "BERTによる自然言語処理入門: Transformersを使った実践プログラミング" (Introduction to Natural Language Processing with BERT: Practical Programming with Transformers)
Ongoing research training transformer language models at scale, including: BERT & GPT-2
Making large AI models cheaper, faster and more accessible
Tokyo Institute of Technology 2022-2Q CSC.T372
2022-1Q Computer Logic Design (Tokyo Tech)
2022-1Q Computer Networks (Tokyo Tech)
course-review
For competitive programming (C++)
2021-1Q,2Q Creative Programming Exercises (Tokyo Tech)
A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep learning training and inference applications.
2021-4Q Data Structures and Algorithms (Tokyo Tech)
2022-1Q Databases (Tokyo Tech)