My local gym Fitnessfabrik started offering a visitor count during the Corona period in 2020. The website only shows the current visitor count and offers no history or diagrams. So I created this small project to scrape the data from the website and render useful diagrams from it. The diagrams make it easy to find less busy time slots at the gym :)
Original website
The Python 3 script uses the Beautiful Soup library to scrape the visitor count from http://besucher.fitnessfabrik.de every minute. It writes the data to a simple .csv file and uses the pandas and Plotly libraries to render an HTML diagram from the data.
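The scrape-and-log step could look roughly like this. This is a minimal sketch, not the actual `get_count.py`: the page markup, the `URL` handling, file paths, and function names are assumptions for illustration (the sketch simply grabs the first integer in the page text).

```python
import csv
import re
from datetime import datetime

import requests
from bs4 import BeautifulSoup

URL = "http://besucher.fitnessfabrik.de"


def parse_count(html: str) -> int:
    """Extract the visitor count; assumes it is the first integer
    in the page text (the real markup may differ)."""
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    match = re.search(r"\d+", text)
    if match is None:
        raise ValueError("no visitor count found on page")
    return int(match.group())


def append_row(csv_path: str, timestamp: str, count: int) -> None:
    """Append one timestamped measurement to the CSV log."""
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerow([timestamp, count])


def scrape_once(csv_path: str) -> int:
    """Fetch the page once and log the current count."""
    html = requests.get(URL, timeout=10).text
    count = parse_count(html)
    append_row(csv_path, datetime.now().isoformat(timespec="minutes"), count)
    return count


def render_day(csv_path: str, html_path: str) -> None:
    """Render the logged measurements to a standalone HTML line chart."""
    import pandas as pd
    import plotly.express as px

    df = pd.read_csv(csv_path, names=["time", "count"], parse_dates=["time"])
    fig = px.line(df, x="time", y="count", title="Gym visitor count")
    fig.write_html(html_path)
```

Running `scrape_once` from a once-per-minute loop (or a scheduler) and `render_day` at the end of the day reproduces the workflow described above.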
I run the script on a Raspberry Pi 3, which has proven to be very reliable for the job.
A rendered HTML diagram produced by the script
- A simple way to start the script is to just run
python3 get_count.py
- It's recommended to set the script up as a systemd service and trigger it via cron
50 5 * * 1-5 systemctl start gym_visitor_count
-> Start at 5:50 GMT during the week
10 23 * * 1-5 systemctl stop gym_visitor_count
-> Stop at 23:10 GMT during the week
50 8 * * 6,0 systemctl start gym_visitor_count
-> Start at 08:50 GMT on weekends
10 21 * * 6,0 systemctl stop gym_visitor_count
-> Stop at 21:10 GMT on weekends
- One could also use systemd timers to schedule the service
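A service unit for the cron lines above, plus a matching systemd timer for the weekday start, might look like this. The paths, unit names, and `OnCalendar` schedule are assumptions; adapt them to your own setup.

```ini
# Hypothetical /etc/systemd/system/gym_visitor_count.service
[Unit]
Description=Scrape the gym visitor count
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
ExecStart=/usr/bin/python3 /home/pi/gym_visitor_count/get_count.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

# Hypothetical /etc/systemd/system/gym_visitor_count.timer
# (timer alternative to the weekday-start cron entry)
[Unit]
Description=Start the gym visitor count scraper on weekday mornings

[Timer]
OnCalendar=Mon..Fri 05:50
Persistent=true

[Install]
WantedBy=timers.target
```

With the timer approach, enable it via `systemctl enable --now gym_visitor_count.timer` instead of editing the crontab.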
There are multiple options to view the diagrams with the visitor statistics.
- Local workstation
- Go to the output folder specified in the script and open the <ISO_8601_date>.html file (for example 2020-06-30.html)
- Remote device without GUI (for example a Raspberry Pi with CLI only)
- A Raspberry Pi runs the script perfectly fine and has low power consumption
- A good way to view the HTML files is to run a webserver (like Nginx) with directory listing enabled
- The script copies the rendered HTML files to the /var/www/html/ directory, which can be served via Nginx
Nginx hosting the HTML files with directory listing enabled
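A minimal Nginx server block with directory listing enabled could look like the sketch below; the port and root path are assumptions matching the defaults mentioned above.

```nginx
server {
    listen 80;
    root /var/www/html;

    location / {
        # autoindex renders a browsable file listing of the diagrams
        autoindex on;
    }
}
```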
I am not associated with the company running the gym. This is just a small hobby project by a customer.