Implemented the algorithms below from scratch, using gradient descent where applicable, with NumPy and pandas.

Algorithms included:
- Linear Regression - one feature
- Multilinear Regression - vectorized
- Multilinear Regression - closed form
- Polynomial Regression
- Logistic Regression
- Naive Bayes
- KNN, Hard and Soft Parzen
- PCA - to reduce the dimensionality of the input features
- SVC - Support Vector Classifier
- Neural Network with ReLU, sigmoid, tanh, and leaky ReLU activations
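As a minimal sketch of the simplest item above, here is one-feature linear regression trained with batch gradient descent in NumPy. Names, learning rate, and epoch count are illustrative, not taken from the repo:

```python
import numpy as np

def linear_regression_gd(x, y, lr=0.1, epochs=1000):
    """Fit y ≈ w*x + b (one feature) by batch gradient descent on MSE."""
    w, b = 0.0, 0.0
    n = len(x)
    for _ in range(epochs):
        y_hat = w * x + b
        # Gradients of mean squared error with respect to w and b
        dw = (2.0 / n) * np.dot(x, y_hat - y)
        db = (2.0 / n) * np.sum(y_hat - y)
        w -= lr * dw
        b -= lr * db
    return w, b

# Recover a known line y = 3x + 1
x = np.linspace(0, 1, 50)
y = 3 * x + 1
w, b = linear_regression_gd(x, y)
```

The vectorized multilinear version replaces the scalar `w` with a weight vector and the products with matrix-vector operations; the closed form solves the normal equations instead of iterating.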
- Normalize the data for linear models; otherwise the weights and bias can explode. It also puts the input features on a comparable scale, which makes their learned weights easier to compare and contrast.
- Each algorithm has its own advantages and disadvantages. Understand them so you know why some models perform well while others do not, e.g. on linearly separable vs. non-linearly separable data.
- Every algorithm makes assumptions, such as independent and identically distributed (IID) samples or Gaussian features. If an assumption fails on the dataset, the algorithm will perform poorly.
- Use a validation set to select the right hyperparameters (these change from problem to problem) and to balance bias against variance. Make sure there is no data leakage.
- Gradients are how the model learns the right parameters, and the learning rate matters: if it is too high, gradient descent diverges; if it is too low, it takes many more epochs to converge.
- Normalization is important in neural networks as well: it keeps the inputs to the dot products well-scaled during forward and back propagation, which helps the model converge faster and reach high accuracy in fewer epochs compared with unnormalized data.
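As a concrete illustration of the normalization points above, here is a z-score standardization sketch (one common choice; the repo may use a different scheme such as min-max scaling):

```python
import numpy as np

def standardize(X):
    """Scale each feature to zero mean and unit variance (z-score)."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard against constant features
    return (X - mu) / sigma, mu, sigma

# Two features on wildly different scales
X = np.array([[1.0, 100.0],
              [2.0, 200.0],
              [3.0, 300.0]])
X_scaled, mu, sigma = standardize(X)
```

Returning `mu` and `sigma` matters: the same training statistics must be reused to transform validation and test data.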
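The data-leakage warning above can be made concrete with a shuffled train/validation split where scaling statistics come from the training split only (split sizes and the random seed here are arbitrary assumptions for the sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# Shuffle, then split 80/20 into train and validation sets
idx = rng.permutation(len(X))
train_idx, val_idx = idx[:80], idx[80:]
X_train, X_val = X[train_idx], X[val_idx]

# Avoid leakage: compute mean/std on the training split only,
# then apply those same statistics to the validation split.
mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
X_train_s = (X_train - mu) / sigma
X_val_s = (X_val - mu) / sigma
```

Computing `mu` and `sigma` on the full dataset before splitting would let validation information influence training, inflating the measured performance.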
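The learning-rate point above can be seen on a toy one-dimensional objective, f(w) = w^2 with gradient 2w (the thresholds here are specific to this toy function, not general rules):

```python
def gd_path(lr, steps=50, w0=1.0):
    """Run gradient descent on f(w) = w^2 (gradient 2w) from w0."""
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w
    return w

w_good = gd_path(lr=0.1)    # converges toward the minimum at 0
w_big  = gd_path(lr=1.5)    # overshoots every step and diverges
w_tiny = gd_path(lr=0.001)  # barely moves in 50 steps
```

Each update multiplies `w` by (1 - 2*lr), so any `lr` above 1 for this function makes the iterates grow without bound, while a tiny `lr` shrinks them too slowly.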