Yuxing believes firmly in the saying "With four parameters I can fit an elephant, and with five I can make him wiggle his trunk", which Enrico Fermi attributed to John von Neumann. [1] So, he has devoted his entire, though short, research career to tuning the five hyperparameters of a model, convinced that it will eventually outperform state-of-the-art models such as GPT-3 and AlphaFold 2.
Yuxing is not a computer scientist, a physicist, or a chemist, but a HYPERPARAMETER TUNING SCIENTIST. "That makes a big difference. Hyperparameter tuning is the technique that changes our lives, especially right before publishing a paper," he said.
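A minimal sketch of what a day of five-hyperparameter tuning might look like, using plain random search over a hypothetical search space (all names and values below are made up for illustration, not Yuxing's actual model):

```python
import random

# Hypothetical five-hyperparameter search space (illustrative only).
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64],
    "dropout": [0.1, 0.3, 0.5],
    "num_layers": [2, 4, 8],
    "weight_decay": [0.0, 1e-5, 1e-4],
}

def sample_config(rng):
    """Draw one of the 3**5 = 243 possible configurations."""
    return {name: rng.choice(values) for name, values in SPACE.items()}

def tune(score_fn, trials=50, seed=0):
    """Random search: keep the best-scoring configuration seen so far."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = sample_config(rng)
        score = score_fn(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

With a toy scoring function, `tune(lambda c: -abs(c["dropout"] - 0.3))` will converge on `dropout=0.3` after a handful of trials; whether this ever beats GPT-3 is left as an exercise.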
[1] Dyson, F. A meeting with Enrico Fermi. Nature 427, 297 (2004).
From: 19 June 2024 - To: 26 June 2024
Python           12 hrs 20 mins ██████████████░░░░░░░░░░░ 57.44 %
Markdown         1 hr 53 mins   ██░░░░░░░░░░░░░░░░░░░░░░░ 08.79 %
C#               1 hr 35 mins   ██░░░░░░░░░░░░░░░░░░░░░░░ 07.38 %
TOML             1 hr 7 mins    █░░░░░░░░░░░░░░░░░░░░░░░░ 05.20 %
reStructuredText 1 hr 6 mins    █░░░░░░░░░░░░░░░░░░░░░░░░ 05.19 %