openaustralia / morph

Take the hassle out of web scraping

Home Page: https://morph.io

License: GNU Affero General Public License v3.0



morph.io: A scraping platform

  • A Heroku for Scrapers
  • All code and collaboration through GitHub
  • Write your scrapers in Ruby, Python, PHP, Perl or JavaScript (NodeJS, PhantomJS)
  • Simple API to grab data
  • Schedule scrapers or run manually
  • Process isolation via Docker
  • Email alerts for broken scrapers

Dependencies

Ruby, Docker, MySQL, SQLite 3, Redis, mitmproxy. (See below for more details about installing Docker)

Development is supported on Linux (Ubuntu 20.04) and Mac OS X.


Installing Docker

On Linux

Just follow the instructions on the Docker site.

Your user account should be able to manipulate Docker without sudo (e.g. add your user to the docker group with `sudo usermod -aG docker $USER`, then log out and back in).

On Mac OS X

Install Docker for Mac.

Starting up Elasticsearch

Morph needs Elasticsearch to run. For development, we make things easier by running Elasticsearch in Docker:

docker-compose up

Installing Morph

bundle install
cp config/database.yml.example config/database.yml
cp env-example .env

Edit config/database.yml with your database settings
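A minimal development section might look like this (all values hypothetical; adjust the username and password to match your local MySQL install):

```yaml
# Hypothetical development settings for config/database.yml
development:
  adapter: mysql2
  encoding: utf8
  database: morph_development
  username: root
  password:
  host: 127.0.0.1
```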

Tunnel GitHub webhook traffic back to your local development machine

We use ngrok, a tool that makes it easy to tunnel internet traffic to a local development machine. First download ngrok if you don't have it already. Then:

ngrok http 5100

Make note of the http://*.ngrok.io forwarding URL.

Creating a GitHub Application

You'll need to create an application on GitHub so that morph.io can talk to GitHub. We've pre-filled most of the important fields for a few different configurations below:

You will need to add and change a few values manually:

  • Disable "Expire user authorization tokens"
  • Add an image - you can use the standard logo at app/assets/images/logo.png (you can add this after the app is created)
  • If the webhooks are active and being used in production (currently not the case) then you'll also need to add a "Webhook secret" for security.

Next you'll need to fill in some values in the .env file which come from the GitHub App that you've just created.

  • GITHUB_APP_ID - Look for "App ID" near the top of the page. This should be an integer
  • GITHUB_APP_NAME - Look for "Public link". The name is what appears after "https://github.com/apps/". It's essentially a URL-friendly version of the name you gave the app.
  • GITHUB_APP_CLIENT_ID - Look for "Client ID" near the top of the page.
  • GITHUB_APP_CLIENT_SECRET - Go to "Generate a new client secret".

Also, a private key for the GitHub app is needed. This can be generated by clicking the "Generate a private key" button and will be automatically downloaded. Move and rename it to config/morph-github-app.private-key.pem.
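Putting it together, the relevant .env entries might look like this (all values here are hypothetical examples; use the ones from your own GitHub App):

```
GITHUB_APP_ID=123456
GITHUB_APP_NAME=my-morph-dev
GITHUB_APP_CLIENT_ID=Iv1.0123456789abcdef
GITHUB_APP_CLIENT_SECRET=your-generated-client-secret
```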

Now set up the databases:

bundle exec dotenv rake db:setup

Now you can start the server:

bundle exec dotenv foreman start

and point your browser at http://127.0.0.1:3000

To get started, log in with GitHub. There is a simple admin interface accessible at http://127.0.0.1:3000/admin. To access this, run the following to give your account admin rights:

bundle exec rake app:promote_to_admin

Running tests

If you're running guard (see below) the tests will also automatically run when you change a file.

By default, RSpec will skip tests that have been tagged as being slow. To change this behaviour, add the following to your .env:

RUN_SLOW_TESTS=1

By default, RSpec will run certain tests against a running Docker server. These tests are quite slow but have not been tagged as slow. To stop RSpec from running these tests, add the following to your .env:

DONT_RUN_DOCKER_TESTS=1
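The two flags above could be wired into RSpec's exclusion filters roughly like this. This is only a sketch, not the project's actual spec_helper.rb, and the tag names are assumptions:

```ruby
# Sketch only: how RUN_SLOW_TESTS and DONT_RUN_DOCKER_TESTS might map to
# RSpec exclusion tags. Tag names (:slow, :docker) are assumptions.
def excluded_tags(env)
  tags = []
  tags << :slow unless env["RUN_SLOW_TESTS"]       # slow specs skipped by default
  tags << :docker if env["DONT_RUN_DOCKER_TESTS"]  # Docker specs run by default
  tags
end
```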

Guard Livereload

We use Guard and LiveReload so that whenever you edit a view in development the web page is automatically reloaded. It's a massive time saver when you're doing design or lots of work in the views. To make it work, run:

bundle exec guard

Guard will also run tests when needed. Some of these are integration tests against a running Docker server and are very slow. If you want to disable them:

DONT_RUN_DOCKER_TESTS=1 bundle exec guard

Mail in development

By default in development, mail is sent to MailCatcher (its web interface runs at http://127.0.0.1:1080 by default). To install it:

gem install mailcatcher

Deploying to production

This section will not be relevant to most people. It will however be relevant if you're deploying to a production server.

Ansible Vault

We're using Ansible Vault to encrypt certain files, like the private key for the SSL certificate.

To make this work you will need to put the password in a file at ~/.infrastructure_ansible_vault_pass.txt. This is the same password as used in the openaustralia/infrastructure GitHub repository.
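Creating the file with owner-only permissions (the password itself comes from the openaustralia/infrastructure repository):

```shell
# Create the password file readable only by you, then paste the password in
touch ~/.infrastructure_ansible_vault_pass.txt
chmod 600 ~/.infrastructure_ansible_vault_pass.txt
```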

Restarting Discourse

Discourse runs in a container and should usually be restarted automatically by Docker.

However, if the container goes away for some reason, it can be restarted:

root@morph:/var/discourse# ./launcher rebuild app

This will pull down the latest docker image, rebuild, and restart the container.

Production devops development

This method defaults to creating a 4 GB VirtualBox VM, which can strain an 8 GB Mac. We suggest tweaking the Vagrantfile to restrict RAM usage to 2 GB at first, or using a machine with at least 12 GB of RAM.

Install Vagrant, VirtualBox and Ansible.

Install a couple of Vagrant plugins: vagrant plugin install vagrant-hostsupdater vagrant-disksize

Install rbenv and ruby-build.

If on Ubuntu, install libreadline-dev: sudo apt install libreadline-dev libsqlite3-dev

Install the required ruby version: rbenv install

Install capistrano: gem install capistrano

Run make roles to install some required Ansible roles.

Run vagrant up local. This will build and provision a box that looks and acts like production at dev.morph.io.

Once the box is created and provisioned, deploy the application to your Vagrant box:

cap local deploy

Now visit https://dev.morph.io/

Production provisioning and deployment

To deploy morph.io to production, normally you'll just want to deploy using Capistrano:

cap production deploy

When you've changed the Ansible playbooks to modify the infrastructure you'll want to run:

make ansible

SSL certificates

We're using Let's Encrypt for SSL certificates. It's not 100% automated. On a completely fresh install (with a new domain) as root:

certbot --nginx certonly -m [email protected] --agree-tos

It should show something like this:

Which names would you like to activate HTTPS for?
-------------------------------------------------------------------------------
1: morph.io
2: api.morph.io
3: faye.morph.io
4: help.morph.io

Leave your answer blank, which will install the certificate for all of them.

Installing certificates for local vagrant build

sudo certbot certonly --manual -d dev.morph.io --preferred-challenges dns -d api.dev.morph.io -d faye.dev.morph.io -d help.dev.morph.io

Scraper<->mitmdump SSL

Scrapers talk out to the internet by being routed through the mitmdump2 proxy container. The default container you'll get on a devops install has no SSL certificates. This makes it easy for traffic to get out, but means we can't replicate some problems that occur when SSL validation fails.

To work around this, you'll have to rebuild the mitmdump container. Look in /var/www/current/docker_images/morph-mitmdump; there's a Makefile that will aid in building the new image.

Once that's done, you'll need to build a new version of the openaustralia/buildstep:

cd
git clone https://github.com/openaustralia/buildstep.git
cd buildstep
cp /var/www/current/docker_images/morph-mitmdump/mitmproxy/mitmproxy-ca-cert.pem .
docker image build -t openaustralia/buildstep:latest .

Running docker image list --all should now show that your new image is ready. The next time you run a scraper it will be rebuilt using the new buildstep image.

How to contribute

If you find what looks like a bug:

  • Check the GitHub issue tracker to see if anyone else has reported the issue.
  • If you don't see anything, create an issue with information on how to reproduce it.

If you want to contribute an enhancement or a fix:

  • Fork the project on GitHub.
  • Make your changes with tests.
  • Commit the changes without touching any files that aren't related to your enhancement or fix.
  • Send a pull request.

We maintain a list of issues that are easy fixes. Fixing one of these is a great way to get started while you get familiar with the codebase.

Copyright & License

Copyright OpenAustralia Foundation Limited. Licensed under the Affero GPL. See LICENSE file for more details.

Contributors

anishka0107, auxesis, bxjx, chrismytton, clockwerx, drzax, ekaterinasemenova, emikulic, equivalentideas, henare, jamezpolley, jarib, jcartledge, joonjoonjoon, katkad, katska, lkundrak, mikeralphson, millette, mlandauer, njenkins, otherchirps, pezholio, sam0410, sebbacon, snoblenet, softgrow, waffle-with-pears, wfdd, zarino


morph's Issues

One click fork of a scraper

This is what it should do under the hood:

Fork a scraper in one step:

  1. Forks on github
  2. Connects scraper to platform
  3. Redirects to the new scraper page

Collect statistics (and totals) on scraper runs

Stats:

  • CPU time last run
  • User time last run
  • Number of pages scraped (urls)
  • Network down traffic
  • Network up traffic
  • Peak memory usage
  • Change in disk space used
  • Number of rows written to the database
  • And totals / averages across all the scraper runs for the stats above.

Optionally run scraper once per day

Initially do the easy thing and just make them all run at the same time of day (in a queue so they don't actually all run at the same time).

In the long run it would be much better to spread the load of running them throughout the day in much the same kind of way as we do with planningalerts.
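One simple way to spread the load (a hypothetical sketch, not the current implementation): hash each scraper's name to a stable minute of the day, so every scraper gets a deterministic, evenly distributed daily slot.

```ruby
require "digest"

# Hypothetical sketch: deterministically assign each scraper a stable
# minute-of-day (0..1439) so daily runs are spread across the day
# instead of all queuing at the same time.
def scheduled_minute_of_day(scraper_full_name)
  Digest::MD5.hexdigest(scraper_full_name).to_i(16) % (24 * 60)
end
```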

Give this thing a name

Currently one idea is neomaki (thank you Kat!). Registered neomaki.com on a whim. Did this a few days ago.

Now after leaving the name to rest I'm not totally convinced by it.

Easy scraper discovery

One of the dreams of ScraperWiki was that if a scraper already existed for the data you wanted, you wouldn't have to rewrite it.

That never really materialised as I'd often get half way through writing a scraper only to find someone else had kinda already done it.

I'd like a central place to really quickly check if someone has scraped something similar. It needn't be complex. Maybe just a quick search of URLs that scrapers using this platform have hit? (#32)

Show remaining CPU time

This is probably best done with a page showing all your scrapers, how much each scraper is using and then your total usage

Email alerts

  • Notify you when one of your scrapers breaks
  • Notify you when one of a set of scrapers with a particular tag breaks
  • Notify you when you go over a limit (CPU, user...)

GitHub organisation support

So that if you're a member of an organisation and have commit rights to a repo you can add and control a scraper
