
SEO for cart in separate directory


3 replies to this topic

#1 trixiemay

    Forum Newcomer

  • Members
  • 7 posts
  • Gender:Male
  • Experience:Intermediate
  • Area of Expertise:Designer

Posted 04 June 2017 - 05:44 AM

http://www.waikatocampervansites.co.nz

 

I have a website with a standard HTML5 homepage that links through to a sub-directory where my e-commerce software (OpenCart) sits.

 

What is the SEO best practice for getting good results with this setup?

 

I have added all the meta tags for both the homepage and the cart pages, and set up separate analytics for the homepage and the cart. But Google's Fetch and Render returns a 'partial' result. BTW, I am using Concrete5 as the CMS platform for the homepage, which usually brings great results for me. But the index request is just sitting in my Webmaster Tools without being actioned.

 

Any advice appreciated. Thanks



#2 BrowserBugs

    Unhinged

  • Privileged
  • 2,060 posts
  • Gender:Male
  • Location:Surrey, UK
  • Experience:Intermediate
  • Area of Expertise:I'm Learning

Posted 05 June 2017 - 06:51 AM

Morning mate,

 

Can I just check, when you say "But Google's Fetch and Render returns a 'partial' result", we're talking about Webmaster Tools? If so, I'm thinking it's to do with your split URL links; e.g. your home page is on http://www.waikatoca...vansites.co.nz/ and your other pages are on http://waikatocampervansites.co.nz, which could be causing issues in a few areas.
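If the split-host issue is the culprit, the usual fix is a server-level 301 from the bare domain to the www host so only one variation is ever served. A minimal sketch, assuming Apache with mod_rewrite enabled and that the www host is the preferred one:

```
# Hypothetical .htaccess rule: 301-redirect the bare domain to the www host.
# Assumes Apache with mod_rewrite enabled; flip the hosts if the bare
# domain is actually the preferred version.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^waikatocampervansites\.co\.nz$ [NC]
RewriteRule ^(.*)$ http://www.waikatocampervansites.co.nz/$1 [R=301,L]
```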

 

Another problem I noticed: for example, on /opencart-2.0.0.0/index.php?route=product/category&path=59 you're running two canonical tags; there should only be one. In addition, when you search Google for 'site:http://waikatocamper...encart-2.0.0.0/' you have 56 pages indexed (ish), some of which include stuff like ?sort=p.model&order=ASC ... this is URL clutter and can get out of hand if you're not using canonicals correctly.
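To illustrate, each page should carry exactly one canonical tag, and parameter variants (sorts, filters) should point back at the clean URL. A sketch of what that category page's head might contain, assuming the clean category URL is the version you want indexed:

```
<!-- One canonical tag per page. Sorted/filtered variants such as
     ?sort=p.model&order=ASC should emit this same href, so Google folds
     the URL clutter back into the one clean page. -->
<link rel="canonical" href="http://www.waikatocampervansites.co.nz/opencart-2.0.0.0/index.php?route=product/category&amp;path=59" />
```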

 

Have you also added http://waikatocampervansites.co.nz/ as a property in Webmaster Tools? This will need to be done in order to check both variations.



#3 trixiemay

    Forum Newcomer

  • Members
  • 7 posts
  • Gender:Male
  • Experience:Intermediate
  • Area of Expertise:Designer

Posted 14 June 2017 - 08:30 AM

Great response - thanks.

Yeah, I am using Google Webmaster Tools. I'll check the canonicals. I also had a big issue with page load speed, so I optimised the code and the photos as much as I could.

 

The other issue was the robots.txt, which excluded a number of files used by the CMS (Concrete5). I have relaxed this.



#4 BrowserBugs

    Unhinged

  • Privileged
  • 2,060 posts
  • Gender:Male
  • Location:Surrey, UK
  • Experience:Intermediate
  • Area of Expertise:I'm Learning

Posted 20 June 2017 - 11:59 AM

Quoting trixiemay: "The other issue was the robots.txt, which excluded a number of files used by the CMS (Concrete5). I have relaxed this."

 

Robots.txt I find is more of a curse than a blessing; exclusions can cause side effects. E.g. excluding .js in robots.txt can lead to reputable search engines not fetching it, so when they try to render the page they can't use the .js file; it's as if the file doesn't exist. I know it's old, but this is Matt Cutts' take.
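By way of illustration, a robots.txt along these lines keeps crawlers out of back-end areas without starving the renderer of scripts and stylesheets (the Disallow path here is a made-up example, not taken from the site above):

```
# Hypothetical robots.txt sketch: block an admin area but leave .js and
# .css crawlable so search engines can render pages fully.
User-agent: *
Disallow: /admin/
Allow: /*.js$
Allow: /*.css$
```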

 

 

Secondly, if it's files which need to be secured, then robots.txt acts as a roadmap for bored people looking for something to do, or for bots wanting to know what you don't want them to access; hell, check your server logs for bots looking for /wp-admin/ and getting 404s (with any luck). If it's an includes directory you want to secure, you can lock it down with htaccess and the server can still pull in the include files.
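If the goal is hiding include files rather than guiding crawlers, htaccess does it without advertising the paths. A minimal sketch, assuming Apache 2.4 and a self-contained includes directory (the directory name is illustrative):

```
# Hypothetical .htaccess placed inside the includes directory: browsers
# get a 403, but server-side include()/require() still works, because
# those reads never pass through Apache's request handling.
Require all denied
# On Apache 2.2 the equivalent would be:
#   Order deny,allow
#   Deny from all
```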





