The robots.txt file is used to tell robots (such as search engine crawlers) which parts or sections of a website they should not visit. Most robots will read your robots.txt file and follow the rules you have placed there. Keep in mind, however, that this file alone will not prevent access by robots that ignore robots.txt!
Use the robots.txt file to disallow crawling of content that you don't want indexed, such as decorative images, borders, or button images.
For example, the following rules block Google's and Yahoo's crawlers from the entire site while allowing msnbot everywhere (note that the actual crawler tokens are "Googlebot" for Google and "Slurp" for Yahoo):

User-agent: Googlebot
User-agent: Slurp
Disallow: /

User-agent: msnbot
Allow: /
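As a quick sanity check, you can test rules like these with Python's standard-library urllib.robotparser before deploying them. This is a minimal sketch; the robots.txt content and the example URL path are illustrative assumptions, not part of the original article:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for testing: block Googlebot and
# Yahoo's Slurp from the whole site, allow msnbot everywhere.
ROBOTS_TXT = """\
User-agent: Googlebot
User-agent: Slurp
Disallow: /

User-agent: msnbot
Allow: /
"""

parser = RobotFileParser()
# parse() accepts the file's lines directly, so no HTTP fetch is needed
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot is disallowed from everything; msnbot is allowed everywhere
print(parser.can_fetch("Googlebot", "/images/border.png"))
print(parser.can_fetch("msnbot", "/images/border.png"))
```

Checking rules this way catches mistakes such as a misplaced blank line, which in robots.txt separates one record (group of user-agents and their rules) from the next.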