The goal of this week was to use Gapminder population data to create an animated scatterplot using Matplotlib and start becoming familiar with Pandas and the GitHub workflow.
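A minimal sketch of the animation idea: redraw a scatterplot once per year with Matplotlib's `FuncAnimation`. The column names (`year`, `gdp`, `life_exp`) and the tiny synthetic DataFrame are placeholders for the real Gapminder data.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
import numpy as np
import pandas as pd

# Synthetic stand-in for the Gapminder table (hypothetical column names).
rng = np.random.default_rng(0)
years = [1950, 1960, 1970]
df = pd.DataFrame({
    "year": np.repeat(years, 5),
    "gdp": rng.uniform(1e3, 5e4, 15),
    "life_exp": rng.uniform(40, 80, 15),
})

fig, ax = plt.subplots()
scat = ax.scatter([], [])
ax.set_xlim(0, 5e4)
ax.set_ylim(30, 90)

def update(year):
    # Show only the points belonging to the current year.
    frame = df[df["year"] == year]
    scat.set_offsets(frame[["gdp", "life_exp"]].to_numpy())
    ax.set_title(str(year))
    return (scat,)

# One frame per year; saving as GIF/MP4 is then a one-liner with anim.save().
anim = FuncAnimation(fig, update, frames=years, interval=300)
```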
In this first machine learning project, the aim was to use logistic regression and random forest models in scikit-learn to predict which passengers survived the sinking of the Titanic.
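A hedged sketch of that approach: fit both model families on a few encoded features and compare them with cross-validation. The column names (`Sex`, `Pclass`, `Age`, `Survived`) follow the well-known Kaggle Titanic dataset, but the tiny DataFrame below is synthetic stand-in data, not the real passenger list.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the Titanic training data.
df = pd.DataFrame({
    "Sex": ["male", "female", "female", "male", "male", "female"] * 10,
    "Pclass": [3, 1, 2, 3, 1, 2] * 10,
    "Age": [22, 38, 26, 35, 54, 27] * 10,
    "Survived": [0, 1, 1, 0, 0, 1] * 10,
})

# One-hot encode the categorical feature before fitting.
X = pd.get_dummies(df[["Sex", "Pclass", "Age"]], columns=["Sex"])
y = df["Survived"]

log_reg = LogisticRegression(max_iter=1000).fit(X, y)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Compare the two models on 5-fold cross-validated accuracy.
cv_acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
```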
The aim of this week was to engineer features for a regression model, while avoiding overfitting, to predict how many bicycles a bike-sharing service needs on any given day.
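A minimal sketch of that setup, under assumed feature names (`temp`, `weekday`, `month`): one-hot encode the categorical features, hold out a test set, and use Ridge regularization as a guard against overfitting. The data below is synthetic.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 200
df = pd.DataFrame({
    "temp": rng.uniform(0, 30, n),
    "weekday": rng.integers(0, 7, n),
    "month": rng.integers(1, 13, n),
})
# Synthetic target: demand rises with temperature, dips on weekends.
df["count"] = (100 + 20 * df["temp"]
               - 30 * (df["weekday"] >= 5)
               + rng.normal(0, 10, n))

# Feature engineering: encode weekday/month as dummies rather than
# treating them as ordinal numbers.
X = pd.get_dummies(df[["temp", "weekday", "month"]],
                   columns=["weekday", "month"])
y = df["count"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Ridge penalizes large coefficients, which limits overfitting.
model = Ridge(alpha=1.0).fit(X_train, y_train)
r2 = model.score(X_test, y_test)  # held-out R^2
```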
This project involved scraping song lyrics from the web for two artists and training a Naive Bayes classifier to predict the artist of a new piece of text.
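The classifier can be sketched as a bag-of-words pipeline with multinomial Naive Bayes. The four-line "corpora" below are placeholders, not real scraped lyrics.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Placeholder training texts, two per "artist".
lyrics = [
    "hello darkness my old friend",
    "sound of silence in my mind",
    "shake it off shake it off",
    "haters gonna hate hate hate",
]
artists = ["simon", "simon", "taylor", "taylor"]

# Bag-of-words counts feed a multinomial Naive Bayes classifier.
clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(lyrics, artists)

prediction = clf.predict(["the sound of silence"])[0]
```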
The goal of this week was to create a dashboard from fictional 'Northwind' company data using PostgreSQL and AWS cloud computing.
This project used the Twitter API to collect tweets and ran sentiment analysis on them through a pipeline built with Docker and MongoDB.
This project used historical temperature data for Berlin to build a model of trend and seasonality and to forecast temperatures for the coming five days.
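A simple stand-in for such a model (assumed here, not necessarily the project's exact method): fit a linear trend plus sine/cosine yearly-seasonality terms by least squares, then extrapolate five days ahead. The temperature series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(3 * 365)
# Synthetic daily series: slow trend + yearly cycle + noise.
temps = (9 + 0.001 * days
         + 10 * np.sin(2 * np.pi * days / 365.25)
         + rng.normal(0, 2, len(days)))

def design(d):
    # Columns: intercept, linear trend, yearly sin/cos seasonality.
    d = np.asarray(d, dtype=float)
    return np.column_stack([
        np.ones_like(d), d,
        np.sin(2 * np.pi * d / 365.25),
        np.cos(2 * np.pi * d / 365.25),
    ])

# Ordinary least squares fit of trend + seasonality.
coef, *_ = np.linalg.lstsq(design(days), temps, rcond=None)

# Forecast the next five days by extrapolating the fitted model.
future = np.arange(len(days), len(days) + 5)
forecast = design(future) @ coef
```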
The aim of this project was to use a Markov chain model to simulate the minute-to-minute movements of customers between sections of a supermarket over a single day.
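The core of such a simulation is a transition matrix over sections, with one step per simulated minute. The section names and probabilities below are illustrative, not the project's estimated values.

```python
import numpy as np

sections = ["entrance", "dairy", "fruit", "checkout"]
# P[i, j] = probability of moving from section i to section j in one minute.
P = np.array([
    [0.0, 0.5, 0.5, 0.0],   # from entrance
    [0.0, 0.6, 0.2, 0.2],   # from dairy
    [0.0, 0.2, 0.6, 0.2],   # from fruit
    [0.0, 0.0, 0.0, 1.0],   # checkout is absorbing
])

rng = np.random.default_rng(0)

def simulate_customer(max_minutes=60):
    # Walk the chain one minute at a time until checkout (or time runs out).
    path, state = ["entrance"], 0
    for _ in range(max_minutes):
        state = rng.choice(len(sections), p=P[state])
        path.append(sections[state])
        if sections[state] == "checkout":
            break
    return path

path = simulate_customer()
```

Simulating a whole day is then just running many such customers with staggered arrival times.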
The goal of this week was to fine-tune a pretrained neural network to detect the object held in front of a webcam, using a dataset of photos we took ourselves.
This project aimed to create a movie recommendation system using unsupervised learning and introduced us to building a web application in Flask.
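One common unsupervised approach, sketched here as an assumption about the method: non-negative matrix factorization (NMF) of a user-by-movie ratings matrix, with unrated entries naively stored as 0. The matrix and titles are made up for illustration.

```python
import numpy as np
from sklearn.decomposition import NMF

movies = ["Titanic", "Matrix", "Alien", "Amelie"]
# Rows are users, columns are movies; 0 means "not rated" (naive encoding).
R = np.array([
    [5, 0, 0, 4],
    [0, 5, 4, 0],
    [4, 0, 0, 5],
    [0, 4, 5, 0],
])

# Factor R into user-to-component (W) and component-to-movie (H) weights.
model = NMF(n_components=2, init="random", random_state=0, max_iter=500)
W = model.fit_transform(R)
H = model.components_
R_hat = W @ H  # reconstructed/predicted ratings

# Recommend user 0's highest-scoring unseen movie.
unseen = np.where(R[0] == 0)[0]
best = movies[unseen[np.argmax(R_hat[0, unseen])]]
```

In Flask, `best` would simply be rendered into a results template for the requesting user.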