🥇 I'm proficient with:
- torch, scikit-learn, imblearn, numpy/scipy/statsmodels, pandas, xgboost/catboost/lgbm, gplearn, albumentations, category_encoders
🥈 I know my way around:
- spark, keras/tensorflow, scikit-optimize, OpenGL, OpenCV, transformers
🎖 Recently, I've learned how to:
- never accept the null hypothesis;
- stop misinterpreting p-values;
- read impurity-based, permutation and SHAP feature importances properly;
- spot outliers in several dimensions at once with the Mahalanobis distance;
- combat skewness with QuantileTransformer/PowerTransformer;
- calibrate classifier probabilities;
- avoid the PCA trap in classification (the highest-variance directions are not necessarily the most class-discriminative ones);
- engineer features with symbolic regression.
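As an illustration of the multivariate outlier point above, here is a minimal sketch of Mahalanobis-distance outlier detection; the synthetic data, seed, and the 0.999 chi-squared cutoff are illustrative assumptions, not a prescription.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(0)
# 200 inliers from a correlated 2-D Gaussian, plus three obvious outliers.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], cov, size=200)
X = np.vstack([X, [[6.0, -6.0], [7.0, 7.0], [-6.0, 6.0]]])

mean = X.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
diff = X - mean
# Squared Mahalanobis distance of each point from the sample mean.
d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)

# Under multivariate normality, d2 follows a chi-squared distribution
# with p degrees of freedom, giving a principled cutoff.
threshold = chi2.ppf(0.999, df=X.shape[1])
outliers = np.where(d2 > threshold)[0]
print(outliers)
```

The point `[7, 7]` lies on the main diagonal, so a per-axis check would not flag it; the Mahalanobis distance does, because it accounts for the correlation structure.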
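The calibration point can be sketched with scikit-learn's `CalibratedClassifierCV`; the dataset, random forest base model, and isotonic method here are illustrative choices.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.calibration import CalibratedClassifierCV
from sklearn.metrics import brier_score_loss

X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

# Uncalibrated baseline.
raw = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_tr, y_tr)

# Isotonic calibration fits the model on CV folds and learns a monotone
# mapping from raw scores to calibrated probabilities.
cal = CalibratedClassifierCV(
    RandomForestClassifier(n_estimators=100, random_state=42),
    method='isotonic', cv=5,
).fit(X_tr, y_tr)

raw_brier = brier_score_loss(y_te, raw.predict_proba(X_te)[:, 1])
cal_brier = brier_score_loss(y_te, cal.predict_proba(X_te)[:, 1])
print(f'raw Brier: {raw_brier:.3f}  calibrated: {cal_brier:.3f}')
```

The Brier score measures both discrimination and calibration; comparing it before and after is one quick way to see whether calibration helped on a given dataset.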
Username: dx2-66
Name: Максим Эмбаухов
Type: User
Bio: Data Scientist
Location: St. Petersburg