Web Spiders
A search engine needs a list of keywords and a list of the names and addresses of the sites where those words are found. These lists are called indexes because they work much like the indexes in books. Indexes are typically created by software programs called spiders, wanderers, or robots, which crawl the Web: they go to a page, compile a list of all the significant words it contains, then follow each link on that page to other pages and repeat the procedure. As they move around the Web, spiders process huge amounts of data and compile the results into a database accessible at a single Web site. When you visit that site and look up any word it has indexed, it can display a list of the other sites where that word appears. Because of the large amounts of data they transfer, spiders and search engines place a considerable strain on the Internet.
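To make the procedure concrete, here is a minimal sketch of such a spider in Python, using only the standard library. The starting URL, the page limit, and the rule for deciding which words are "significant" (here, simply alphabetic tokens) are illustrative assumptions, not features of any real search engine, which would also respect robots.txt, filter markup such as scripts, and store its index in a proper database.

    # Minimal web-spider sketch: fetch a page, record its words,
    # follow its links, repeat. Illustrative assumptions only.
    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class PageParser(HTMLParser):
        """Collects visible words and outgoing links from one page."""
        def __init__(self):
            super().__init__()
            self.words = []
            self.links = []

        def handle_starttag(self, tag, attrs):
            # Outgoing links are <a href="..."> tags.
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

        def handle_data(self, data):
            # Crude notion of "significant words": alphabetic tokens.
            self.words.extend(w.lower() for w in data.split() if w.isalpha())

    def crawl(start_url, max_pages=10):
        """Breadth-first crawl; returns the index {word: set of URLs}."""
        index = defaultdict(set)
        queue, seen = [start_url], {start_url}
        fetched = 0
        while queue and fetched < max_pages:
            url = queue.pop(0)
            try:
                html = urlopen(url).read().decode("utf-8", errors="replace")
            except OSError:
                continue  # skip unreachable pages
            fetched += 1
            parser = PageParser()
            parser.feed(html)
            # Record every word on this page under this page's address.
            for word in parser.words:
                index[word].add(url)
            # Follow links to pages not yet visited.
            for link in parser.links:
                absolute = urljoin(url, link)
                if absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
        return index

Looking up index["spider"] after a crawl would then return the set of page addresses containing that word, which is exactly the lookup a search site performs when you query it.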