Spider.
A spider is a computer program that performs the process of spidering, i.e. crawling the Web.
A spider is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot."
Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. Google uses a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider).
A web crawler visits websites, reads their pages and collects other important information in order to create entries for a search engine index. Web crawlers, or web robots, are programs that browse the internet in a methodical, automated manner to provide search engines with up-to-date data; a minimal sketch of this process follows.
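To make the idea concrete, the following is a minimal, illustrative crawler sketch in Python (not part of the original text). It assumes the third-party requests and beautifulsoup4 libraries and a hypothetical seed URL; it fetches pages breadth-first, follows hyperlinks, and builds a tiny word-to-URL index. Real crawlers such as Googlebot are far more sophisticated and also honour robots.txt, politeness delays, and duplicate detection.

    # Minimal illustrative crawler: fetch pages, follow links, build a tiny index.
    # This is a sketch under stated assumptions, not a production crawler.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup


    def crawl(seed_url, max_pages=10):
        """Breadth-first crawl starting from seed_url, indexing page text."""
        index = {}                      # word -> set of URLs containing it
        visited = set()
        queue = deque([seed_url])

        while queue and len(visited) < max_pages:
            url = queue.popleft()
            if url in visited:
                continue
            visited.add(url)

            try:
                response = requests.get(url, timeout=5)
                response.raise_for_status()
            except requests.RequestException:
                continue                # skip pages that fail to load

            soup = BeautifulSoup(response.text, "html.parser")

            # "Read the page": record each word under this URL in the index.
            for word in soup.get_text().lower().split():
                index.setdefault(word, set()).add(url)

            # Discover new pages by following hyperlinks on this page.
            for anchor in soup.find_all("a", href=True):
                link = urljoin(url, anchor["href"])
                if urlparse(link).scheme in ("http", "https"):
                    queue.append(link)

        return index


    if __name__ == "__main__":
        # Hypothetical seed URL used only for illustration.
        idx = crawl("https://example.com", max_pages=5)
        print(f"Indexed {len(idx)} distinct words")

The resulting index maps each word to the set of pages it appears on, which is the simplest form of the search engine index entries described above.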
The SEO Spider is a desktop website crawler and auditor for PC, Mac or Linux which spiders websites' links, images, CSS, scripts and apps like a search engine in order to evaluate on-site SEO.