Should I block duplicate pages using robots.txt?

VideoMeli.com is a YouTube-powered website.


Added By: GoogleWebmasterHelp

Description:
Halfdeck from Davis, CA asks: "If Google crawls 1,000 pages/day, Googlebot crawling many dupe content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?" Short answer: No, do...
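
For context, the two options in the question look like this in practice (a minimal sketch; the /print/ path is a hypothetical duplicate-content URL used only for illustration):

    # robots.txt — stops Googlebot from crawling the duplicate URLs at all
    User-agent: Googlebot
    Disallow: /print/

    <!-- meta robots tag on each duplicate page: page is crawled,
         but kept out of the index and its links are not followed -->
    <meta name="robots" content="noindex,nofollow">

The practical difference is that a robots.txt block prevents Google from fetching the page content at all, whereas a noindex directive only takes effect after the page has been crawled.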

