Spider trap

Set of web pages that can undermine web crawlers


A spider trap (or crawler trap) is a set of web pages that may intentionally or unintentionally be used to cause a web crawler or search bot to make an infinite number of requests or cause a poorly constructed crawler to crash. Web crawlers are also called web spiders, from which the name is derived. Spider traps may be created to "catch" spambots or other crawlers that waste a website's bandwidth. They may also be created unintentionally by calendars that use dynamic pages with links that continually point to the next day or year.

Common techniques used include:

  • Creation of indefinitely deep directory structures such as http://example.com/bar/foo/bar/foo/bar/foo/bar/...
  • Dynamic pages that produce an unbounded number of documents for a web crawler to follow. Examples include calendars[1] and algorithmically generated language poetry[2] (a calendar-trap sketch follows this list).
  • Documents filled with large numbers of characters, crashing the lexical analyzer that parses the document.
  • Documents with session IDs based on required cookies.
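
As an illustration of the calendar case mentioned above, the following is a minimal sketch, using only Python's standard library, of a dynamic page that always links to the next day, so a naive crawler never runs out of URLs to follow. The port and URL layout are invented for the example.

```python
# Minimal sketch of an (unintentional) calendar-style spider trap.
# Every /calendar/<date> page links to the next day, so the set of
# reachable pages is effectively unbounded.
import datetime
import re
from http.server import BaseHTTPRequestHandler, HTTPServer

class CalendarTrap(BaseHTTPRequestHandler):
    def do_GET(self):
        match = re.fullmatch(r"/calendar/(\d{4}-\d{2}-\d{2})", self.path)
        if not match:
            self.send_error(404)
            return
        day = datetime.date.fromisoformat(match.group(1))
        next_day = day + datetime.timedelta(days=1)
        body = (f"<html><body><h1>{day}</h1>"
                f'<a href="/calendar/{next_day}">next day</a>'
                "</body></html>").encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), CalendarTrap).serve_forever()
```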

There exists no universal algorithm capable of detecting all spider traps. While certain categories of traps can be identified through automated methods, novel and previously unrecognized traps continue to emerge rapidly.
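
One narrow class of traps that automated methods can catch is URLs with unusually deep or highly repetitive paths, such as the /bar/foo/bar/foo/... pattern above. The sketch below, with arbitrarily chosen thresholds, illustrates such a heuristic; it is deliberately limited and is not the universal detector that, as noted, does not exist.

```python
# Illustrative heuristic: flag URLs whose path is very deep or repeats
# the same segment many times. The thresholds are arbitrary examples.
from collections import Counter
from urllib.parse import urlparse

def looks_like_trap(url: str, max_depth: int = 16, max_repeats: int = 3) -> bool:
    segments = [s for s in urlparse(url).path.split("/") if s]
    if len(segments) > max_depth:
        return True
    counts = Counter(segments).most_common(1)
    return bool(counts) and counts[0][1] > max_repeats

print(looks_like_trap("http://example.com/bar/foo/bar/foo/bar/foo/bar/foo/"))  # True
print(looks_like_trap("http://example.com/articles/2024/spider-traps"))        # False
```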


Politeness

A spider trap causes a web crawler to enter something like an infinite loop,[3] which wastes the spider's resources,[4] lowers its productivity, and, in the case of a poorly written crawler, can crash the program. Polite spiders alternate requests between different hosts, and do not request documents from the same server more than once every several seconds,[5] meaning that a "polite" web crawler is affected to a much lesser degree than an "impolite" crawler.[citation needed]
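
A minimal sketch of such a politeness policy, keeping one queue per host and enforcing a per-host delay, might look like the following; the five-second interval and the class and method names are illustrative, not a standard API.

```python
# Sketch of a polite scheduler: rotate between hosts and never fetch
# from the same host more often than once per `per_host_delay` seconds,
# so an infinite trap costs at most one request per interval.
import time
from collections import defaultdict, deque
from typing import Optional
from urllib.parse import urlparse

class PoliteScheduler:
    def __init__(self, per_host_delay: float = 5.0):
        self.per_host_delay = per_host_delay
        self.queues = defaultdict(deque)      # host -> pending URLs
        self.last_fetch = defaultdict(float)  # host -> time of last request

    def add(self, url: str) -> None:
        self.queues[urlparse(url).netloc].append(url)

    def next_url(self) -> Optional[str]:
        """Return a URL from any host whose delay has elapsed, if one exists."""
        now = time.monotonic()
        for host, queue in self.queues.items():
            if queue and now - self.last_fetch[host] >= self.per_host_delay:
                self.last_fetch[host] = now
                return queue.popleft()
        return None  # nothing ready; the caller should wait briefly
```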

In addition, sites with spider traps usually have a robots.txt file telling bots not to visit the trap, so a legitimate "polite" bot would not fall into it, whereas an "impolite" bot that disregards the robots.txt settings would be affected by the trap.[6]
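
For example, a polite crawler written in Python could consult robots.txt with the standard library's urllib.robotparser before fetching; the Disallow rule and URLs below are hypothetical.

```python
# Sketch of honouring robots.txt. A real crawler would call
# rp.set_url("http://example.com/robots.txt") and rp.read();
# here an inline example fences off a hypothetical trap directory.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /calendar/",
])

print(rp.can_fetch("MyCrawler", "http://example.com/calendar/2024-01-01"))  # False
print(rp.can_fetch("MyCrawler", "http://example.com/about"))                # True
```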


See also

References
