Scrapyd is a service daemon for running [Scrapy](http://scrapy.org) spiders.
It allows you to deploy your Scrapy projects by building Python eggs and uploading them to the service through a JSON API, which you can also use to schedule spider runs. Multiple projects are supported.
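As a sketch of how the JSON API is used, the snippet below builds a request for the `schedule.json` endpoint, assuming Scrapyd is listening on its default port 6800; the project and spider names (`mybot`, `example`) are placeholders.

```python
# Hypothetical sketch: scheduling a spider run through Scrapyd's JSON API.
# Assumes the daemon listens on localhost:6800 (Scrapyd's default).
from urllib.parse import urlencode

SCRAPYD_URL = "http://localhost:6800"

def schedule_request(project, spider, **spider_args):
    """Build the POST target URL and form body for /schedule.json."""
    params = {"project": project, "spider": spider}
    params.update(spider_args)  # extra key/values are passed to the spider
    return SCRAPYD_URL + "/schedule.json", urlencode(params)

url, body = schedule_request("mybot", "example")
# POST `body` to `url` (e.g. with urllib or curl) to enqueue the run.
```

Sending that body as an HTTP POST to the URL enqueues the spider and returns a JSON response containing the job id.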
Requirements:

- Python 2.6 or later
- Works on Linux, Windows, Mac OS X, BSD
The quick way to install:

    pip install scrapyd
You can download the latest stable and development releases from: http://pypi.python.org/pypi/Scrapyd
- Community resources: http://scrapy.org/community/
- Contributing: http://doc.scrapy.org/en/latest/contributing.html
- Companies using Scrapy: http://scrapy.org/companies/