Università di Catania - CdL Informatica Magistrale - Compilatori A.A 2014/15
Author:
- [Aiello Mirko Raimondo](http://www.mraiello.altervista.org) ([GitHub](https://github.com/R3m3r))
A crawler (also known as a web crawler, spider, or robot) is automated software that scans a network (or database) and collects the URLs found in web pages. This project is released under the GNU GPL v3 License.
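The core step a crawler repeats for every page it visits is extracting the URLs contained in that page. The project itself is written in C#; the following is only an illustrative sketch in Python of that link-extraction step, not the project's actual code:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href values of <a> tags found while parsing a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record every anchor tag's href attribute, if present.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# A crawler would fetch this HTML over the network; here it is a fixed sample.
page = '<a href="http://example.com/a">A</a> <a href="/b">B</a>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # -> ['http://example.com/a', '/b']
```

A real crawler then enqueues each discovered URL (after de-duplication) and repeats the fetch-and-extract cycle on it.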
To build, open `WebCrawler.sln` with Visual Studio 2010 and compile the solution.