Robots.txt and Wordpress Duplicate Content

Discussion in 'Content Management' started by cipals15, Jan 16, 2010.

  1. #1
    I am currently practicing with robots.txt, and I have found that a freshly installed WordPress blog contains a lot of duplicate content.

    One major source of duplicate content is the archives section, since the same posts appear in multiple archives (date, category, tag). So I decided to block them from being indexed by search engine spiders.

    What other sections of a new WordPress blog should be blocked using robots.txt?
     
    cipals15, Jan 16, 2010 IP
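    For reference, a minimal robots.txt along those lines might look like the sketch below. These paths assume the WordPress defaults and a pretty permalink structure; adjust them to your own install before using anything like this:

    ```
    # Sketch only -- paths assume a default WordPress install.
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    # Archive sections that duplicate post content:
    Disallow: /category/
    Disallow: /tag/
    Disallow: /author/
    # Feeds also repeat post content:
    Disallow: /feed/
    ```

    The file goes in the site root (e.g. example.com/robots.txt) so spiders can find it.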
  2. internetmarketingiq

    internetmarketingiq Well-Known Member

    #2
    And this information has been discussed over and over since it was "found out". That's why plugins like All in One SEO give you a checkbox to choose whether to block them or not.

    But in reality it's not much of an issue, because as it turns out Google is already well aware of this.

    To date it has had no effect on my blog's rankings. I don't create robots.txt files and my blogs rank just fine.
     
    internetmarketingiq, Jan 16, 2010 IP
  3. bama boy

    bama boy Active Member

    #3
    Install the All in One SEO Pack plugin and use noindex for categories, archives, and tag archives.
    This will fix your problem.
     
    bama boy, Jan 24, 2010 IP
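    For anyone curious what that plugin setting actually does: roughly speaking, it adds a robots meta tag to the head of archive-type pages, something like this:

    ```html
    <!-- Emitted in the <head> of category/tag/date archive pages -->
    <meta name="robots" content="noindex,follow">
    ```

    The advantage over blocking in robots.txt is that spiders can still crawl the archive pages and follow the links on them, they just don't index the duplicate pages themselves.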