Name: Fang Chang
Type: User
Company: Peking University
Bio: I am Fang Chang, a student at Peking University. I want to learn on GitHub and try to make some contributions to the community.
Location: Peking University
Fang Chang's Projects
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Paper list of KBQA
BaiduSpider, a crawler for Baidu search results; currently supports Baidu web search, image search, Zhidao (Q&A) search, video search, news search, Wenku (document) search, Jingyan (how-to) search, and Baike (encyclopedia) search.
Code and source for the paper "How to Fine-Tune BERT for Text Classification?"
A summary of deep learning applications in recommender systems, with notes on related papers.
Code for the NeurIPS'17 paper "DropoutNet: Addressing Cold Start in Recommender Systems"
KangDaddy's personal blog
Code for the SIGIR'20 paper "Recommendation for New Users and New Items via Randomized Training and Mixture-of-Experts Transformation"
All my solutions and explanations, in Chinese, for the LeetCode coding problems.
Easy-to-use LLM fine-tuning framework (LLaMA, BLOOM, Mistral, Baichuan, Qwen, ChatGLM)
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
A structured summary of the knowledge an NLP engineer needs to accumulate, including interview questions, fundamentals, and engineering skills, to build core competitiveness.
Open Source Neural Machine Translation in PyTorch
Code and data accompanying Natural Language Processing with PyTorch published by O'Reilly Media https://nlproc.info
Python Code for paper Attention is All You Need For Chinese Word Segmentation.
The official implementation of Self-Play Fine-Tuning (SPIN)
WebRTC, WebRTC and WebRTC. Everything here is all about WebRTC!!
A PyTorch implementation of BiLSTM / BERT / RoBERTa (+ BiLSTM + CRF) models for Chinese Word Segmentation.
Zero-Shot Cross-Lingual Semantic Parsing (Sherborne & Lapata, ACL 2022)