I am trying to disallow URLs in my website's robots.txt file, but I'm having difficulties.
Right now the robots.txt file has the following content:
User-agent: *
Allow: /
Disallow: /cgi-bin/
Sitemap: http://seriesgate.tv/sitemap.xml
I do not want Google to index URLs like the following:
http://seriesgate.tv/watch-breakingbad-online/season5/episode8/searchresult/
There are 8000 more URLs like this one. What code in the robots.txt file will block them?
I also want to disallow the search box via robots.txt so that the search pages are not crawled by Google. An example URL:
seriesgate.tv/search/indv_episodes/friends/
Any ideas?
Add
Disallow: /name_of_folder/
to prevent Google from crawling that folder, and add
Disallow: /file_name
to prevent Google from crawling a specific file.
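Applied to the URLs in the question, a sketch of the updated file might look like the one embedded below: a Disallow prefix rule for /search/ covers the search pages, and Google (though not every crawler) additionally understands wildcard patterns such as Disallow: /*/searchresult/ for the 8000 episode URLs. Prefix rules can be sanity-checked with Python's standard urllib.robotparser; note that the stdlib parser does plain prefix matching in file order and ignores wildcards, so only the prefix rules are exercised here:

```python
from urllib.robotparser import RobotFileParser

# Proposed robots.txt. Disallow lines come before "Allow: /" because
# the stdlib parser applies the first matching rule; Google itself
# uses the most specific (longest) match, so either order works there.
# Google would also accept a wildcard rule like
#   Disallow: /*/searchresult/
# for the episode pages, but urllib.robotparser ignores wildcards.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /search/
Allow: /
Sitemap: http://seriesgate.tv/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Search pages are blocked for all user agents...
print(parser.can_fetch("*", "http://seriesgate.tv/search/indv_episodes/friends/"))  # False

# ...while ordinary pages remain crawlable.
print(parser.can_fetch("*", "http://seriesgate.tv/watch-breakingbad-online/"))  # True
```

Keep in mind that Disallow only stops crawling; pages that are already indexed or linked from elsewhere may need a noindex robots meta tag or a removal request in Google Search Console to drop out of the index.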