When search engines find a new website, they send their agents, called web robots (also known as spiders or crawlers), to explore it before sending any visitors. Those spiders first check whether you have provided any guidelines, which means they look for your robots.txt file. It's a plain text file in which you list the pages a spider is allowed to crawl using the Allow directive, and mark the pages it should not crawl using the Disallow directive. :)
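
For example, a minimal robots.txt placed at the root of your site might look like the sketch below. The user-agent and paths here are just hypothetical placeholders, not required values:

# Applies to all crawlers (the * is a wildcard)
User-agent: *
# Pages the spider may crawl (hypothetical path)
Allow: /blog/
# Pages the spider should not crawl (hypothetical path)
Disallow: /admin/

Each group starts with a User-agent line naming which crawler it applies to, followed by the Allow and Disallow rules for that crawler.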