A scraping service built on Scrapy, served with Flask, and deployed using Acorn.
To get a local copy up and running, follow these steps.
- Clone the repo
  ```sh
  git clone https://github.com/Insiares/scraper.git
  ```
Usage examples, screenshots, and demos will be added here. For more examples, please refer to the Documentation.
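As a usage preview, the Flask layer might expose scraped items as JSON. Below is a minimal sketch, assuming an `/items` endpoint and an in-memory list standing in for the MongoDB store (both are assumptions, not the project's actual API):

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for the MongoDB-backed store (assumption, for illustration only).
ITEMS = [
    {"title": "Example listing", "url": "https://example.com/1"},
]

@app.route("/items")
def list_items():
    # Return every scraped item as JSON.
    return jsonify(ITEMS)

if __name__ == "__main__":
    app.run(debug=True)
```

With the server running, `curl http://localhost:5000/items` would return the stored items.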
- Create our spider
- Mount MongoDB
- Implement an item pipeline to populate our database from the spider findings
- Build an API on top of the system
- Present it with clean HTML/CSS
- Run unit tests
- Implement logging
- Write a Dockerfile
- Deploy on Acorn
- CI with Acorn/Docker Hub: add credentials to GitHub Actions and Acorn logic to build and push to Docker Hub
- Attempt CD with Acorn auto-update
- Security testing
- Documentation
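The Dockerfile step might start from a sketch like this, assuming a `requirements.txt` at the repo root and a Flask entrypoint named `app.py` (both assumptions about the project layout):

```dockerfile
# Slim Python base keeps the image small.
FROM python:3.11-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```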
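The item-pipeline step above can be sketched as a plain class implementing Scrapy's pipeline hook. The collection object is injected, so the MongoDB wiring (e.g. via pymongo) stays out of the sketch; the class and field names are assumptions, not the project's actual code:

```python
class MongoPipeline:
    """Scrapy-style item pipeline that writes each scraped item to a collection.

    `collection` only needs an `insert_one` method, so a real pymongo
    collection or any stand-in can be passed (the MongoDB details are
    assumptions, not this project's actual configuration).
    """

    def __init__(self, collection):
        self.collection = collection

    def process_item(self, item, spider):
        # Reject items without a title instead of storing junk rows
        # (in real Scrapy code this would raise scrapy.exceptions.DropItem).
        if not item.get("title"):
            raise ValueError("missing title")
        self.collection.insert_one(dict(item))
        return item
```

In a real project the pipeline would be registered in `settings.py` under `ITEM_PIPELINES`, and the collection would come from a MongoDB client opened in `open_spider`.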
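For the logging step, the standard library's `logging` module is enough. A minimal sketch (the logger name and the `scrape_page` helper are illustrative assumptions):

```python
import logging

# Configure handlers and format once at startup; modules then share
# the configuration through named loggers.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
logger = logging.getLogger("scraper")

def scrape_page(url):
    # Lazy %-formatting defers string building until the record is emitted.
    logger.info("fetching %s", url)
    # ... fetch and parse here ...
    logger.info("done with %s", url)
```

Scrapy also routes its own messages through `logging`, so one `basicConfig` call covers both the spiders and the Flask layer.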
See the open issues for a full list of proposed features (and known issues).