Schematic example of a robots.txt:
# robots.txt for meineseite.com
# These web crawlers are excluded
User-agent: Sidewinder
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

# These directories/files must not be crawled
User-agent: *
Disallow: /default.html
Disallow: /Temp/ # this content will disappear soon
Disallow: /Privat/Familie/Geburtstage.html
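To double-check how a crawler actually reads such rules, the Python standard library's urllib.robotparser can parse them and answer per-URL queries. A minimal sketch; the rules string is an abbreviated copy of the example above, and meineseite.com is just its placeholder domain:

from urllib import robotparser

rules = """\
User-agent: Sidewinder
Disallow: /

User-agent: *
Disallow: /default.html
Disallow: /Temp/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Sidewinder is locked out completely, all other agents only from the listed paths.
print(rp.can_fetch("Sidewinder", "https://meineseite.com/index.html"))  # False
print(rp.can_fetch("Googlebot", "https://meineseite.com/Temp/a.html"))  # False
print(rp.can_fetch("Googlebot", "https://meineseite.com/index.html"))   # True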
My robots.txt for the XTC root server:
User-agent: *
Disallow: /address_book_process.php
Disallow: /account.php
Disallow: /account_edit.php
Disallow: /account_edit_process.php
Disallow: /account_history.php
Disallow: /account_history_info.php
Disallow: /address_book.php
Disallow: /checkout_process.php
Disallow: /advanced_search.php
Disallow: /advanced_search_result.php
Disallow: /checkout_address.php
Disallow: /checkout_confirmation.php
Disallow: /checkout_payment.php
Disallow: /checkout_success.php
Disallow: /contact_us.php
Disallow: /create_account.php
Disallow: /create_account_guest.php
Disallow: /create_account_process.php
Disallow: /create_account_success.php
Disallow: /info_shopping_cart.php
Disallow: /login.php
Disallow: /logoff.php
Disallow: /password_double_opt.php
Disallow: /popup_image.php
Disallow: /popup_search_help.php
Disallow: /privacy.php
Disallow: /product_notifications.php
Disallow: /product_reviews.php
Disallow: /product_reviews_info.php
Disallow: /reviews.php
Disallow: /shipping.php
Disallow: /admin/
Disallow: /export/
Disallow: /download/
Disallow: /includes/
Disallow: /pub/
Disallow: /media/
Disallow: /plesk-stat/
Disallow: /cache/
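Once the file is uploaded, the same parser can read it straight from the server to confirm that the account and checkout scripts and the internal directories are blocked while normal catalogue pages stay crawlable. A minimal sketch; shop.example is a stand-in for the real shop domain, and product_info.php is assumed to be a page that should remain indexable:

from urllib import robotparser

BASE = "https://shop.example"  # hypothetical domain; replace with the real shop URL

rp = robotparser.RobotFileParser()
rp.set_url(BASE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Account/checkout scripts and internal directories should be blocked,
# normal catalogue pages should remain crawlable.
for path in ("/login.php", "/checkout_payment.php", "/admin/", "/product_info.php"):
    allowed = rp.can_fetch("Googlebot", BASE + path)
    print(path, "crawlable" if allowed else "blocked")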