(n.) 1. A program that automatically retrieves a document from the web, along with the documents it references, and so on recursively. Index-building robots retrieve a large share of the documents indexed by search engines. Also known as a crawler or a spider. 2. A program that poses as a user on Internet Relay Chat (IRC) or in a multiuser dungeon (MUD), usually to perform an automated function.
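The link-following behavior in sense 1 can be sketched in a few lines: a robot fetches a page, extracts the URLs it references, and queues those for retrieval in turn. The sketch below is a hypothetical illustration (not from this entry) using Python's standard `html.parser`; a stubbed `fetch` lookup stands in for real HTTP requests.

```python
from html.parser import HTMLParser
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags -- the references a robot follows."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch, limit=100):
    """Breadth-first retrieval: fetch a page, then the pages it references.
    `fetch` maps a URL to its HTML (stubbed here; a real robot would use HTTP
    and honor robots.txt)."""
    seen, queue, order = set(), deque([start_url]), []
    while queue and len(order) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        html = fetch(url)
        if html is None:
            continue  # unreachable page
        order.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        queue.extend(parser.links)
    return order

# A stub "web" of three pages, for demonstration only.
pages = {
    "/index": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/index">home</a>',
    "/b": '',
}
print(crawl("/index", pages.get))  # -> ['/index', '/a', '/b']
```

An index-building robot would additionally hand each retrieved document to an indexer; the `seen` set prevents revisiting pages that link back to each other.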