Topic: moe Goto Github
Something interesting about moe
moe,A cute (moe-style) web page visitor counter, PHP + MySQL version
User: 1834423612
Home Page: https://count.kjchmc.cn
moe,A summary of MoE experimental setups across a number of different papers.
User: adamg012
moe,:electron: An unofficial https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-profit tracker for ACG (anime, comics, and games), similar to Douban, serving as a third-party client for bgm.tv. Redesigned for mobile, it includes many enhanced features that are hard to achieve on the website, along with extensive customization options. Currently supports iOS / Android / WSA, mobile / basic tablet layouts, light / dark themes, and the mobile web.
User: czy0729
Home Page: https://bangumi-app.5t5.top/iframe.html?viewMode=story&id=screens-discovery--discovery
moe,PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538
User: davidmrau
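The Shazeer et al. layer re-implemented above routes each token through only a small top-k subset of expert networks, weighting their outputs by a renormalized softmax over the selected gate logits. A minimal, framework-agnostic NumPy sketch of that gating idea (not the repository's actual code; `top_k_gating`, `moe_layer`, and the toy experts are hypothetical names for illustration):

```python
import numpy as np

def top_k_gating(x, w_gate, k=2):
    """Sparse gating, simplified: keep the k largest gate logits per token,
    softmax over just those, and leave all other gate values at zero."""
    logits = x @ w_gate                       # (tokens, num_experts)
    gates = np.zeros_like(logits)
    for i, row in enumerate(logits):
        top = np.argsort(row)[-k:]            # indices of the k largest logits
        z = np.exp(row[top] - row[top].max()) # numerically stable softmax
        gates[i, top] = z / z.sum()
    return gates

def moe_layer(x, w_gate, experts, k=2):
    """Run each expert only on the tokens routed to it, then combine
    expert outputs weighted by the sparse gate values."""
    gates = top_k_gating(x, w_gate, k)
    out = np.zeros((x.shape[0], experts[0](x).shape[1]))
    for e, expert in enumerate(experts):
        mask = gates[:, e] > 0                # tokens routed to expert e
        if mask.any():
            out[mask] += gates[mask, e:e+1] * expert(x[mask])
    return out
```

In the full design each expert is its own feed-forward network and an auxiliary loss balances the load across experts; this sketch shows only the routing and sparse combination.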
moe,A command line tool for all things anime
User: dragonzurfer
moe,PyTorch open-source library for the paper "AdaTT: Adaptive Task-to-Task Fusion Network for Multitask Learning in Recommendations"
Organization: facebookresearch
moe,Inference framework for MoE layers based on TensorRT with Python binding
User: harry-chen
moe,Libgdx-based game for Android, iOS, and PC, following the tutorial on ForeignGuyMike's YouTube channel. Read more in README.md
User: haxpor
moe,Unify Efficient Fine-Tuning of 100+ LLMs
User: hiyouga
moe,MOE is an event-driven OS for 8/16/32-bit MCUs. MOE stands for "Minds Of Embedded system"; it's also the name of my lovely baby daughter :sunglasses:
User: ianhom
Home Page: https://ianhom.github.io/MOE
moe,ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.
Organization: ibm
moe,Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).
User: inferflow
moe,[arXiv'24] Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization
User: james-oldfield
Home Page: http://james-oldfield.github.io/MMoE
moe,Official repository for paper "MATERobot: Material Recognition in Wearable Robotics for People with Visual Impairments" at ICRA 2024, Best Paper Finalist on Human-Robot Interaction
User: junweizheng93
Home Page: https://arxiv.org/pdf/2302.14595.pdf
moe,Node.js enka.network API wrapper written in TypeScript that provides localization, caching, and convenience.
User: kravetsone
Home Page: https://kravets.gitbook.io/enkanetwork
moe,Implementation of "the first large-scale multimodal mixture of experts models" from the paper: "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"
User: kyegomez
Home Page: https://discord.gg/47ENfJQjMq
moe,Implementation of MoE-Mamba from the paper: "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Zeta
User: kyegomez
Home Page: https://discord.gg/GYbXvDGevY
moe,Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
User: kyegomez
Home Page: https://discord.gg/GYbXvDGevY
moe,Official LISTEN.moe Android app
Organization: listen-moe
Home Page: https://github.com/LISTEN-moe/android-app#download
moe,Official LISTEN.moe browser extension
Organization: listen-moe
moe,Official LISTEN.moe Desktop Client
Organization: listen-moe
Home Page: https://listen.moe/apps
moe,Official LISTEN.moe Discord Bot. Add it to your server!
Organization: listen-moe
Home Page: https://discordapp.com/oauth2/authorize?&client_id=222167140004790273&scope=bot&permissions=36702208
moe,Repo that hosts all of our docs
Organization: listen-moe
Home Page: https://docs.listen.moe
moe,Official LISTEN.moe Windows-only Client
Organization: listen-moe
moe,japReader is an app for breaking down Japanese sentences and tracking vocabulary progress
User: marisukukise
moe,Tutel MoE: An Optimized Mixture-of-Experts Implementation
Organization: microsoft
moe,Moebuntu-SetupHelperScript2 Japanese version
User: mifjpn
Home Page: https://metanchan.hatenablog.com/entry/2022/12/25/090348
moe,
User: mifjpn
Home Page: https://metanchan.hatenablog.com/entry/2022/12/25/090348
moe,MindSpore online courses: Step into LLM
Organization: mindspore-courses
moe,
Organization: nekos-moe
Home Page: https://docs.nekos.moe/
moe,A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI
Organization: open-compass
moe,Batch download high quality videos from https://twist.moe
User: phanirithvij
moe,⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training
Organization: pjlab-sys4nlp
moe,Mixture-of-Experts for Large Vision-Language Models
Organization: pku-yuangroup
Home Page: https://arxiv.org/abs/2401.15947
moe,Popcorn.moe API
Organization: popcorn-moe
Home Page: https://api.popcorn.moe
moe,Popcorn.moe Web
Organization: popcorn-moe
Home Page: https://popcorn.moe
moe,"President, I'm stuck in the tree" - a Princess Connect! Re:Dive voice pack for the vscode-rainbow-fart extension (Priconne extension vocal pack)
User: sahuang
moe,Moebooru is an image board for anime art.
User: tenpi
Home Page: https://moebooru.moe
moe,[ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal, Shiwei Liu, Zhangyang Wang
Organization: vita-group
moe,Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still a work in progress)*
User: xrsrke
moe,PyTorch implementation of MoE (mixture of experts)
User: yeonwoosung
moe,Chinese Mixtral mixture-of-experts large models (Chinese Mixtral MoE LLMs)
User: ymcui
Home Page: https://arxiv.org/abs/2403.01851