Also known as a crawler, it is a computer program that "crawls" the pages of a website, reading their data and content and, where necessary, evaluating them against defined criteria. Crawl and indexing behavior can be controlled using robots.txt or the noindex and nofollow directives, as sketched below.
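For illustration, here is a minimal robots.txt (placed at the site root) alongside a page-level meta tag; the path and the choice of directives are placeholder examples, not recommendations:

    # robots.txt: ask all crawlers to skip a section of the site
    # (/internal/ is a placeholder path)
    User-agent: *
    Disallow: /internal/

    <!-- page-level alternative inside <head>: keep this page out of
         the index and ask crawlers not to follow its links -->
    <meta name="robots" content="noindex, nofollow">

A nofollow hint can also be set on a single link via rel="nofollow". Note that robots.txt is a request, not an enforcement mechanism: well-behaved crawlers honor it, but it does not technically block access.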