Easy Spider is a distributed Perl web crawler project from 2006. It features code for crawling webpages, sending the results to a server, and generating XML files from them. The client side can be any computer (Windows or Linux), and the server stores all the data.
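
The crawl-and-report cycle described above can be pictured with a short Perl sketch. This is a hypothetical, minimal illustration of one crawl task: fetch a page, extract its links, and print an XML report of the kind a client node might send back to the server. The XML element names and the command-line interface here are assumptions for illustration, not EasySpider's actual schema:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTML::LinkExtor;

    # Hypothetical example: fetch one page, extract its links, and emit
    # a simple XML report such as a client node might send to the server.
    # The XML element names are illustrative, not EasySpider's real schema.
    my $url = shift @ARGV or die "Usage: $0 <url>\n";

    my $ua  = LWP::UserAgent->new(timeout => 15, agent => 'EasySpider-demo/0.1');
    my $res = $ua->get($url);
    die "Fetch failed: ", $res->status_line, "\n" unless $res->is_success;

    # Collect absolute hrefs from <a> tags; passing the response base URL
    # to HTML::LinkExtor resolves relative links for us.
    my @links;
    my $extor = HTML::LinkExtor->new(
        sub {
            my ($tag, %attr) = @_;
            push @links, "$attr{href}" if $tag eq 'a' && defined $attr{href};
        },
        $res->base,
    );
    $extor->parse($res->decoded_content);

    # Minimal XML escaping for the few characters that matter here.
    sub xml_escape {
        my ($s) = @_;
        $s =~ s/&/&amp;/g; $s =~ s/</&lt;/g; $s =~ s/>/&gt;/g; $s =~ s/"/&quot;/g;
        return $s;
    }

    print qq{<?xml version="1.0" encoding="UTF-8"?>\n};
    print '<page url="', xml_escape($url), qq{">\n};
    print "  <link>", xml_escape($_), "</link>\n" for @links;
    print "</page>\n";

Run as, for example, "perl crawl_task.pl https://www.example.org/" and redirect the output to a file that the server can later import.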

Websites that use EasySpider crawling for article writing software:
https://www.artikelschreiber.com/en/
https://www.unaique.net/en/
https://www.unaique.com/
https://www.artikelschreiben.com/
https://www.buzzerstar.com/
https://easyperlspider.sourceforge.io/
https://www.sebastianenger.com/
https://www.artikelschreiber.com/opensource/

It is fun to look at code written a few years ago and to see how much one has improved since then. If you want to generate text automatically, try https://www.artikelschreiber.com/en/ or https://www.unaique.net/en/!


License

GNU Library or Lesser General Public License version 2.0 (LGPLv2)


User Ratings

1 user rating (5 stars)

ease:     4 / 5
features: 4 / 5
design:   5 / 5
support:  5 / 5

User Reviews

  • EasySpider is a Perl client/server architecture for crawling the web for interesting webpages. The server can be any box that has internet access and allows Perl programs to run. The client connects to the server, gets its work task, fulfills it, and returns the results to the server as an XML stream. The server can then load that XML file into an Oracle/MySQL/MariaDB (or similar) database, or the file can be parsed by the sphinxsearch.com full-text indexer to generate searchable content for your webpage. Happy Hacking ;-)
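
To make the server side of that description concrete, here is a minimal, hypothetical sketch of how such an XML result file could be loaded into a MySQL database with DBI. The table name, columns, connection settings, and XML layout (matching the client sketch above) are assumptions for illustration, not EasySpider's actual schema:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;
    use XML::LibXML;

    # Hypothetical example: read a client's XML result file and insert the
    # crawled links into MySQL. Table, column, and element names are
    # illustrative; EasySpider's real schema may differ.
    my $file = shift @ARGV or die "Usage: $0 <results.xml>\n";
    my $doc  = XML::LibXML->load_xml(location => $file);

    my $dbh = DBI->connect(
        'DBI:mysql:database=spider;host=localhost',   # assumed DSN
        'spider_user', 'secret',                      # assumed credentials
        { RaiseError => 1, AutoCommit => 0 },
    );

    my $sth = $dbh->prepare(
        'INSERT INTO crawled_links (page_url, link_url) VALUES (?, ?)'
    );

    # Expecting <page url="..."><link>...</link>...</page>, as produced
    # by the client sketch earlier on this page.
    for my $page ($doc->findnodes('//page')) {
        my $page_url = $page->getAttribute('url');
        for my $link ($page->findnodes('./link')) {
            $sth->execute($page_url, $link->textContent);
        }
    }

    $dbh->commit;
    $dbh->disconnect;
    print "Import finished.\n";

The same XML could instead be transformed into the xmlpipe2 format that the Sphinx indexer consumes, which is the other path the review mentions.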

Additional Project Details

Languages: English, German
Intended Audience: Telecommunications Industry, System Administrators, Developers
User Interface: Console/Terminal, Command-line
Programming Language: Perl
Database Environment: Flat-file
Related Categories: Perl Search Engines, Perl Internet Software, Perl Web Scrapers
Registered: 2014-05-09