trixiemay

SEO for cart in separate directory


http://www.waikatocampervansites.co.nz

 

I have a website with a standard HTML5 homepage that links through to a sub-directory where my e-commerce software (OpenCart) sits.

 

What is the SEO protocol for best results?

 

I have added all the meta tags for both the homepage and the cart pages, and separate analytics for the homepage and the cart. But Google's Fetch and Render returns a 'partial' result. BTW I am using Concrete5 as the CMS platform for the homepage, which usually brings great results for me, but the index request is just sitting in my Webmaster Tools without being actioned.

 

Any advice appreciated. Thanks


Morning mate,

 

Can I just check: when you say "But Google's Fetch and Render returns a 'partial' result", we're talking about Webmaster Tools? If so, I'm thinking it's to do with your split URLs; e.g. your home page is on http://www.waikatocampervansites.co.nz/ and your other pages are on http://waikatocampervansites.co.nz, which could be causing issues in a few areas. The usual fix is a 301 redirect onto a single hostname, along the lines of the sketch below.
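A minimal sketch of that redirect, assuming the site runs on Apache with mod_rewrite enabled; it would go in the .htaccess at the web root, with the domain taken from your post:

    RewriteEngine On
    # Send bare-domain requests to the www hostname with a permanent redirect
    RewriteCond %{HTTP_HOST} ^waikatocampervansites\.co\.nz$ [NC]
    RewriteRule ^(.*)$ http://www.waikatocampervansites.co.nz/$1 [R=301,L]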

 

Another problem I noticed: for example, on /opencart-2.0.0.0/index.php?route=product/category&path=59 you're running 2 canonical tags; there should only be one. In addition, when you search Google for 'site:http://waikatocampervansites.co.nz/opencart-2.0.0.0/' you have 56 pages indexed (ish), some of which include stuff like ?sort=p.model&order=ASC .... this is URL clutter and can get out of hand if you're not using canonical tags correctly; see the example below.
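For reference, a hedged example of the single tag that category page (and each of its ?sort=... variations) might carry in its head; the href assumes the clean category URL is the version you want indexed:

    <!-- Exactly one canonical per page; sorted/filtered variants all point at the clean URL -->
    <link rel="canonical" href="http://www.waikatocampervansites.co.nz/opencart-2.0.0.0/index.php?route=product/category&amp;path=59" />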

 

Have you also logged http://waikatocampervansites.co.nz/ as a property in Webmaster Tools? This will need to be done in order to check both variations.


Great response - thanks.

Yeah, I am using Google Webmaster Tools. I'll check the canonicals. I also had a big issue with page load speed and optimised the code and the photos as much as I could.

 

The other issue was the robots.txt, which excluded a number of files used by the CMS (Concrete5). I have relaxed this.


The other issue was the robots.txt, which excluded a number of files used by the CMS (Concrete5). I have relaxed this.

 

Robots.txt I find is more of a curse than a blessing; excluding things can have side effects. E.g. excluding .js in robots.txt can lead to reputable search engines not fetching it, so when they try to render the page without the .js file, it's like the js file doesn't exist; that alone can produce the 'partial' Fetch and Render result you're seeing. I know it's old, but this is Matt Cutts' take. If you do keep a robots.txt, something like the sketch below is safer.
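A minimal sketch, assuming a Concrete5 install; the /concrete/ and /updates/ paths are illustrative guesses at what a default install might block, so check them against your actual file:

    User-agent: *
    # Explicitly allow the asset directories crawlers need to render pages
    # (the Allow directive is honoured by the major search engines)
    Allow: /concrete/js/
    Allow: /concrete/css/
    # Only keep genuinely crawl-worthless areas blocked
    Disallow: /updates/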

 

 

Secondly, if it's files which need to be secured, then robots.txt acts as a roadmap for bored people looking for something to do, or for bots wanting to know what you don't want them to access; hell, check your server logs for bots looking for /wp-admin/ and getting 404s (with any luck). If it's an includes directory you want to secure, then you can lock it down with .htaccess and the server can still pull the include files; something like the snippet below.
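A rough sketch of that .htaccess approach, assuming Apache 2.4 (on Apache 2.2 the equivalent is "Order deny,allow" plus "Deny from all"), dropped into the includes directory itself:

    # .htaccess in the includes directory: refuse all direct HTTP requests.
    # PHP's include() reads these files from disk, not over HTTP,
    # so the application can still pull them in.
    Require all denied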

