webrobots's Introduction

webrobots

This is a library to help write robots.txt compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

# Create a robots.txt parser identifying itself as this User-Agent.
robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
# disallowed? fetches and caches the site's robots.txt on first use,
# then checks whether this robot may access the given URI.
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...
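
The parser fetches and caches each site's robots.txt on demand. If you need control over how robots.txt is retrieved (for example, to reuse your crawler's HTTP client), the constructor takes an :http_get option in recent versions; the following is a minimal sketch, assuming that option is available in your installed version:

require 'webrobots'
require 'uri'
require 'net/http'

# Supply a custom fetcher for robots.txt via the :http_get option
# (assumed here; check your installed version). The lambda receives
# the robots.txt URI and returns the response body as a string.
robots = WebRobots.new('MyBot/1.0',
                       :http_get => lambda { |uri| Net::HTTP.get(uri) })

p robots.allowed?(URI('http://digg.com/news/24hr'))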

Requirements

  • Ruby 1.8.7 or 1.9.2+

Contributing to webrobots

  • Check out the latest master to make sure the feature hasn’t been implemented or the bug hasn’t been fixed yet

  • Check out the issue tracker to make sure someone hasn’t already requested it and/or contributed it

  • Fork the project

  • Start a feature/bugfix branch

  • Commit and push until you are happy with your contribution

  • Make sure to add tests for it. This is important so I don’t break it in a future version unintentionally.

  • Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or if it is otherwise necessary, that is fine, but please isolate it in its own commit so I can cherry-pick around it.

Copyright © 2010-2016 Akinori MUSHA. See LICENSE.txt for further details.

webrobots's Issues

gemspec date error

I'm trying to install the latest mechanize gem. However, it has a dependency on webrobots 0.0.10, and I'm getting an error when trying to install it with bundler:

.../webrobots-0.0.10.gemspec]: invalid date format in specification: "2011-07-01 00:00:00.000000000Z"
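
For context, this failure mode was typically caused by an older RubyGems that could not parse the fractional-second timestamp newer RubyGems versions wrote into generated gemspecs, so upgrading RubyGems was the usual fix. As an alternative, here is a hedged Ruby sketch that patches the cached gemspec in place; the path and the date string are assumptions taken from the report, so verify them on your system:

require 'rubygems'

# Locate the offending gemspec (the path is an assumption; adjust for
# your gem home) and strip the unparsable time portion of the date.
path = File.join(Gem.dir, 'specifications', 'webrobots-0.0.10.gemspec')
spec = File.read(path)
File.open(path, 'w') do |f|
  f.write(spec.sub(' 00:00:00.000000000Z', ''))
end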

Enable externalization of Crawl-delay directive enforcement from allowed?() and disallowed?().

Consider the following use case. A web crawler downloads an HTML page from a site whose robots.txt contains a Crawl-delay directive, and it wants to filter out the links on that page that it is not allowed to crawl. Each call to disallowed?(url) (or allowed?(url), for that matter) enforces the crawl delay, so filtering a long list of links blocks needlessly. It would be much better to let the web crawler respect the crawl delay only when it actually requests resources it is allowed to access, as in the sketch below.
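
A sketch of the requested pattern follows. It assumes crawl_delay(url) returns the parsed Crawl-delay value without sleeping (check whether your installed version exposes it), filters the links up front, and applies the delay only around the actual requests:

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

# Hypothetical links extracted from a downloaded HTML page.
links = %w[
  http://example.com/news/1
  http://example.com/news/2
].map { |s| URI(s) }

# Filter up front; under the proposed change these checks would not sleep.
fetchable = links.select { |uri| robots.allowed?(uri) }

# Apply the Crawl-delay around the actual requests instead.
delay = robots.crawl_delay(links.first)
fetchable.each do |uri|
  sleep delay if delay && delay > 0
  body = Net::HTTP.get(uri)
  # process body ...
end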
