ABSTRACT

Sometimes called "agents," bots are small programs that run across a network and carry out information-gathering or processing tasks on behalf of the user. The tasks are repetitive and typically run several times on a schedule, and bots are most often employed in e-commerce, Web site administration, and software distribution. Web spiders are bots that visit Web sites, read their contents, and create entries for Web search engines; think of Web spiders as indexers. Bots such as ELIZA and Julia are examples of artificial intelligence (see ARTIFICIAL INTELLIGENCE).
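
To make the indexing role of a Web spider concrete, the following is a minimal sketch (not the crawler of any particular search engine): it starts from an assumed seed URL, follows the links it finds, and records each page's title as a tiny "index." The seed address, page limit, and function names here are illustrative assumptions; real spiders add politeness rules, robots.txt handling, and far richer indexing.

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkAndTitleParser(HTMLParser):
        """Collects the page <title> and the href links -- the raw material a spider indexes."""
        def __init__(self):
            super().__init__()
            self.title = ""
            self.links = []
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self._in_title = True
            elif tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    def spider(seed_url, max_pages=5):
        """Visit pages breadth-first from seed_url and build a small url -> title index."""
        to_visit = [seed_url]
        seen = set()
        index = {}
        while to_visit and len(index) < max_pages:
            url = to_visit.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    html = resp.read().decode("utf-8", errors="replace")
            except Exception:
                continue  # skip pages that fail to load
            parser = LinkAndTitleParser()
            parser.feed(html)
            index[url] = parser.title.strip()
            # Queue absolute versions of the links found on this page.
            to_visit.extend(urljoin(url, link) for link in parser.links)
        return index

    if __name__ == "__main__":
        # Hypothetical seed URL used purely for illustration.
        for url, title in spider("https://example.com").items():
            print(title, "-", url)

Run repeatedly on a schedule, a program of this shape is exactly the kind of repetitive, unattended task that the term "bot" describes.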