Halfdeck from Davis, CA asks: "If Google crawls 1,000 pages a day, Googlebot crawling many duplicate-content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt, or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?" Short answer: No, do...
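For reference, the two mechanisms the question contrasts look like this (the `/print/` path is just a hypothetical example of a duplicate-content section):

```
# robots.txt — tells crawlers not to FETCH these URLs at all
User-agent: Googlebot
Disallow: /print/
```

```html
<!-- meta robots tag — page is still crawled, but not indexed,
     and links on it are not followed -->
<meta name="robots" content="noindex,nofollow">
```

The practical difference: a robots.txt block saves crawl requests but leaves Google unable to see the page at all, while the meta tag still consumes crawl budget yet lets Googlebot read the page before deciding to drop it from the index.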