API built with Node and TypeScript, following DDD, Hexagonal Architecture, and OOP best practices.
The API allows the user to create news items and stores the top 5 feed items from different Spanish newspapers (El Mundo and El País for now).
To fetch the news, Puppeteer is used at the moment (Axios was having some encoding problems), together with Cheerio for handling and parsing the response data.
The architecture makes it easy to add more newspapers to be scraped via a ScrapperFactory.
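The factory idea can be sketched as follows. This is an illustrative example only: the interface, class, and method names are assumptions, not the project's actual API.

```typescript
// Hypothetical sketch of the ScrapperFactory pattern described above.
interface Scrapper {
  readonly source: string;
  scrape(): Promise<string[]>; // e.g. returns raw feed titles
}

class ElMundoScrapper implements Scrapper {
  readonly source = "ELMUNDO";
  async scrape(): Promise<string[]> {
    // The real implementation would use Puppeteer + Cheerio here.
    return [];
  }
}

class ElPaisScrapper implements Scrapper {
  readonly source = "ELPAIS";
  async scrape(): Promise<string[]> {
    return [];
  }
}

class ScrapperFactory {
  private static registry: Record<string, () => Scrapper> = {
    ELMUNDO: () => new ElMundoScrapper(),
    ELPAIS: () => new ElPaisScrapper(),
  };

  // Supporting a new newspaper only requires registering its Scrapper here;
  // the rest of the application depends on the Scrapper interface alone.
  static create(type: string): Scrapper {
    const build = ScrapperFactory.registry[type.toUpperCase()];
    if (!build) throw new Error(`Unknown scrapper type: ${type}`);
    return build();
  }
}
```

This keeps the domain decoupled from concrete newspapers, in line with the hexagonal approach.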
- TypeScript (v4)
- Prettier
- ESLint with:
  - Codely's config (includes ESLint's recommended rules, Prettier, the Import plugin, and more)
  - Jest plugin
- Jest with DOM Testing Library
- GitHub Action workflows set up to run tests and linting on push
- Supertest for integration testing
- Express.js
- MongoDB as the database
- Cheerio to parse the scraped data
- Axios and Puppeteer for page crawling
To get a local copy of the API up and running, follow these steps.
- Install the dependencies:
yarn install
- Run the app:
yarn dev
- Execute unit tests:
yarn test:unit
- Execute integration tests:
yarn test:int
- Check linter errors:
yarn lint
- Fix linter errors:
yarn lint:fix
!IMPORTANT
You need to provide a database connection for the persistence layer to work; it can be either a local MongoDB instance or a MongoDB Atlas cluster.
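One common way to wire this up is through an environment variable. This is a sketch under assumptions: the variable name `MONGO_URI` and the default database name are illustrative, not necessarily what the project reads.

```typescript
// Hypothetical config helper; MONGO_URI is an assumed variable name.
export function buildMongoUri(): string {
  // Works with a local instance (mongodb://localhost:27017/news)
  // or a MongoDB Atlas cluster (mongodb+srv://user:pass@cluster/db).
  return process.env.MONGO_URI ?? "mongodb://localhost:27017/news";
}

// The persistence layer would then connect with the official driver:
//   const client = new MongoClient(buildMongoUri());
//   await client.connect();
```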
API docs.
- Feed model
```json
{
  "id": "0766c602-d4d4-48b6-9d50-d3253123273e",
  "title": "Feed title",
  "description": "Feed description",
  "url": "https://example.com",
  "image": "https://example.com/image.png",
  "source": "ELMUNDO",
  "author": "",
  "location": "",
  "date": "2021/03/07"
}
```
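The shape implied by that example can be captured in a TypeScript interface. The field names mirror the JSON above; the interface name itself is an assumption.

```typescript
// Sketch of the Feed model implied by the example payload.
interface Feed {
  id: string;          // UUID
  title: string;
  description: string;
  url: string;
  image: string;
  source: string;      // e.g. "ELMUNDO"
  author: string;      // may be empty
  location: string;    // may be empty
  date: string;        // "YYYY/MM/DD"
}

const example: Feed = {
  id: "0766c602-d4d4-48b6-9d50-d3253123273e",
  title: "Feed title",
  description: "Feed description",
  url: "https://example.com",
  image: "https://example.com/image.png",
  source: "ELMUNDO",
  author: "",
  location: "",
  date: "2021/03/07",
};
```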
- PUT /feed
  Creates or updates a feed.
- GET /feed/:id
  Gets a feed by its :id.
- DELETE /api/feed/:id
  Deletes a feed by its :id.
- GET /api/feed-scrapper?type="sourcetype"
  Fetches the top 5 feeds from the selected source type and saves them to the database.