BrowserBugs

Privileged
  • Content Count: 2,555
  • Joined
  • Last visited
  • Days Won: 46

Everything posted by BrowserBugs

  1. Hi Gang, after some consideration reading posts on here I figured a good basic list of resources would be useful. Not sure if a mod will lock this so it can be updated at a later date; might be nice to stop the "what is this SEO?" type posts, but any other peeps with GOOD tools, feel free to chip in.

     The Beginners Guide to SEO - Moz
     Read this! And before asking anything, read this! Seriously, read it; it covers everything from how search engines work to tracking success. 90% of your questions will be answered before you ask anything.

     Search Engine Optimization (SEO) Starter Guide - Google

     PageSpeed Insights - Google
     Yes, site speed is a ranking factor, and no quantity of keywords will fix speed issues; this is 'SEO' too.

     Microdata - Schema.org
     A mixed debate and a personal favourite of mine; if nothing else you get cool breadcrumbs under your listing.

     Structured Data Testing Tool - Google
     See what Google picks up from your structured data and clear up errors.

     HTTP / HTTPS Header Check
     A great way to see if a supposed 301 is real or if you're serving 302s instead.

     Redirect Checker
     Similar, but also checks the number of redirects.

     Learn SEO - Moz
     Freaking great set of resources. If you're still stuffing keywords into titles and descriptions then you need to start here!

     Canonical vs NoIndex - Moz
     Yup, another Moz post, but before thinking canonical is a silver bullet, read this.

     XML Sitemaps: The Most Misunderstood Tool in the SEO's Toolbox - Michael Cottam, Moz
     Worth a read, especially the consistency section.

     News & Rumours
     Google Algorithm Change History - Moz
     SEMrush Sensor
     Google PageRank & Algorithm Updates - Search Engine Roundtable

     Paid Tools
     SEMrush
     Moz Pro
     DeepCrawl
     Screaming Frog SEO Spider (Free/Paid)

     Advice
     Link Spam In Forum Comments Work Very Badly (Thanks @Rich C)
  2. BrowserBugs

    Hi from Lincoln

    Welcome
  3. BrowserBugs

     Query related to redirect from HTTP to HTTPS?

     I'd suggest setting up the https version and then using htaccess on the new domain, so that if the website is accessed on any url other than the true url (e.g. the https://www. version) it gets forwarded over. Always remember most domains have 4 variants (http/https, with and without www), so technically the setup above has 8 possibilities when you only want 1.

     Options +FollowSymlinks
     Options -Indexes
     RewriteEngine On
     RewriteBase /
     RewriteCond %{HTTPS} off [OR]
     RewriteCond %{HTTP_HOST} !^www\.yourdomain\.com [NC]
     RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]
  4. I'd suggest the other way round after looking at the source; I've never seen anything quite like it. Structurally there's a major issue with the breadcrumbs! Here are the crumbs from /services/college-and-university-admissions/:

     <ul class="list-unstyled">
       <li><a href="http://capitalcollegeconsulting.com//">Home</a></li>
       <li><a href="http://capitalcollegeconsulting.com//College Admissions Counseling Services">College Admissions Counseling Services</a></li>
       <li class="active"><a href="http://capitalcollegeconsulting.com/services/college-and-university-admissions/">College and University Admissions Consultants</a></li>
     </ul>

     Those hrefs are all wrong; click any apart from the active one and pow, white page. Search engines and visitors are both going to have a hard time with this. Have you actually thought about the structure? Shouldn't /services/ be the parent of college-and-university-admissions?

     <ul class="list-unstyled">
       <li><a href="/">Home</a></li>
       <li><a href="/services/">Services</a></li>
       <li class="active"><a href="/services/college-and-university-admissions/">College and University Admissions Consultants</a></li>
     </ul>

     ... and please take a look at those js files; it makes me shudder how many there are.

     Edit: Just ran another check and those blank pages give 200 OK!

     HTTP/1.1 200 OK
     Content-Type => text/html; charset=UTF-8
     X-Port => port_10513
     X-Cacheable => YES:Forced
     Date => Thu, 17 Jan 2019 09:55:13 GMT
     Age => 0
     Connection => close
     Vary => Accept-Encoding, User-Agent
     X-Cache => uncached
     X-Cache-Hit => MISS
     X-Backend => all_requests

     Think you might also have a security issue ...

     ... 100% mate ...

     Last Edit: Might be easier to narrow down with inurl:.html as they seem to be using that pattern.
  5. BrowserBugs

    SKY Internet keeps dropping out.

    Hahaha yeah I remember getting Cable and Wireless back in 1998/9
  6. BrowserBugs

    SKY Internet keeps dropping out.

    The main problem is that Sky is at the mercy of BT Openreach, who maintain the lines; hell, even BT have to ask Openreach to carry out line work, it's independent. I "think" Virgin is the only operator who manages their own cables, and they run fibre to the property, whereas BT run fibre to the local box and then it's back to ancient copper telephone lines to the property. We got out of the Sky contract on the basis they couldn't provide the service. Just log the calls made and engineer visit dates and it makes it all a lot easier to prove.
  7. BrowserBugs

    SKY Internet keeps dropping out.

    Oddly I switched from Virgin to Sky and had the same problem; they kept saying it was interference on the line and a BT Openreach issue. I had BT here digging up the drive and all sorts, to no avail. After 6 weeks I got a full refund from Sky and went back to Virgin. The way I see it there are only two options, BT or Virgin cables. Of the two I find Virgin is much faster and goes down less often; that said, when Virgin drops I find it's gone for half a day or more, whereas BT glitches for half an hour or so. No winners here
  8. I think it looks good mate, love the logo as well. One thing I spotted on the home page is that the photo of the chap goes all squiffy (for want of a better word) depending on width etc; might want to look into it. Had a quick poke about in the source, and the home page has an unusual set of headings throughout, where it appears you're using them as a sort of font sizing rather than for their intended heading purpose. Also the H1 doesn't seem to appear on the screen at all but is in the source; sort of odd, unless I'm blind? If it's not on the page (e.g. masked or display: none) then crawlers will generally ignore it. Speed-wise, might I suggest running an audit with Lighthouse in Chrome Developer Tools; there are some quick wins there.
  9. As @fisicx said, search engines have no preference for ASP/PHP/Ruby etc; their only preference is semantic HTML and crawlable JavaScript, as that's what they're served. Best bet is to open Chrome Developer Tools and run an audit; this will identify some actionable points.
  10. Sounds like your browser might have cached the other version. How have you set up https in htaccess? If you can access both the https and http versions, something is wrong; with a www on https there are normally four variants: http://yourdomain.com, http://www.yourdomain.com, https://yourdomain.com and https://www.yourdomain.com. You should be forcing anything not using your preferred setup to the right version, easily done with htaccess (assuming you want to use https://www):

      Options +FollowSymlinks
      Options -Indexes
      RewriteEngine On
      RewriteBase /
      RewriteCond %{HTTPS} off [OR]
      RewriteCond %{HTTP_HOST} !^www\.yourdomain\.com [NC]
      RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [R=301,L]

      Edit: Forgot to say, this would then redirect the non-https css file to the real version, which I think is the current problem.
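The four-variant point above can also be sketched in plain JavaScript; this is a hypothetical helper (using the post's placeholder yourdomain.com), doing in code what the htaccess rules do server-side:

```javascript
// Sketch: collapse any of the four common domain variants to the
// one preferred URL. "yourdomain.com" is the post's placeholder.
function canonicalise(rawUrl) {
  const u = new URL(rawUrl);
  u.protocol = 'https:';              // force https
  if (!u.hostname.startsWith('www.')) {
    u.hostname = 'www.' + u.hostname; // force www
  }
  return u.href;
}

// http://yourdomain.com/, http://www.yourdomain.com/,
// https://yourdomain.com/ and https://www.yourdomain.com/
// all normalise to https://www.yourdomain.com/
```

The server-side 301 is still what matters for search engines; this just illustrates the mapping.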
  11. BrowserBugs

    Question About Displaying Database Info on Webpage

    I'd use an SQL database with PHP; then it's simply a SELECT WHERE the certification number equals the given number.
  12. BrowserBugs

    CSS Stylesheets for multiple devices

    Yes, I consider critical css to be what forms the first paint; anything not required to paint can go elsewhere. So for example, in your case style.css would be something like:

    @import url(font-awesome.min.css);

    /* Basic */
    body, input, select, textarea { font-size: 12pt; color: #646464; font-family: "gotham_mediumregular"; font-weight: 300; line-height: 1.75em; }
    body.is-loading * { -moz-animation: none !important; -webkit-animation: none !important; -o-animation: none !important; -ms-animation: none !important; animation: none !important; -moz-transition: none !important; -webkit-transition: none !important; -o-transition: none !important; -ms-transition: none !important; transition: none !important; }
    a { color: #fff; text-decoration: none; font-family: 'gotham'; }
    a:visited { color: #fff; }
    strong, b { color: #545454; font-weight: 700; }
    em, i { font-style: italic; }
    p { margin: 0 0 2em 0; font-family: 'gotham'; }
    @media all and (max-width: 800px) {
      a { color: #333333; }
      p { font-size: 0.65em; }
    }

    ... cleaned it up a bit for you. Then in a second core sheet (e.g. foot.css) something like:

    a { -moz-transition: color 0.2s ease-in-out, border-color 0.2s ease-in-out; -webkit-transition: color 0.2s ease-in-out, border-color 0.2s ease-in-out; -o-transition: color 0.2s ease-in-out, border-color 0.2s ease-in-out; -ms-transition: color 0.2s ease-in-out, border-color 0.2s ease-in-out; transition: color 0.2s ease-in-out, border-color 0.2s ease-in-out; }
    a:hover { text-decoration: none; color: #e2dede !important; border-bottom-color: transparent; }
    @media all and (max-width: 800px) {
      a:hover { color: #e2dede !important; }
    }

    ... as these bits are really only required once the page has loaded and can be interacted with. Second, I use JavaScript to load these non-critical css files, with noscript links in the head.
    <!DOCTYPE html>
    <html lang="en-GB">
    <head>
      <link href="/css/core.css" rel="stylesheet" type="text/css">
      <noscript>
        <link href="/css/foot.css" rel="stylesheet" type="text/css">
      </noscript>
    </head>
    <body>
    ....
    <script type="text/javascript">
    function loadjscssfile(filename, filetype){
      if(filetype=="js"){
        var fileref=document.createElement('script');
        fileref.setAttribute("type","text/javascript");
        fileref.setAttribute("src", filename);
        fileref.setAttribute("charset", "utf-8");
      }
      if(filetype=="css"){
        var fileref=document.createElement('link');
        fileref.setAttribute("rel", "stylesheet");
        fileref.setAttribute("type", "text/css");
        fileref.setAttribute("href", filename);
      }
      if(typeof fileref!="undefined") {
        document.getElementsByTagName("head")[0].appendChild(fileref);
      }
    }
    loadjscssfile("/css/foot.css", "css");
    loadjscssfile("/js/somejsfile.js", "js");
    </script>
    </body>
    </html>
  13. BrowserBugs

    CSS Stylesheets for multiple devices

    Ahoy. Taking a look, there's a lot of repetition between the style sheets. For example, I assume style.css is your foundation as it loads without restriction; taking lines 1 to 60, the difference between this file and say style-small.css or style-xsmall.css is tiny, but you've included all 60 lines again. The differences in those same lines in style-small.css and style-xsmall.css:

    a { color: #333333; }
    a:hover { color: #e2dede !important; }

    ... and one more for style-small.css:

    p { font-size: 0.65em; }

    You only need to target things when they're different; it's a lot less to process if you add a media query for max-width to the foundation with those couple of lines. Just my 2 pence worth as always

    Edit: Forgot to say, my general approach is to make the foundation the overall layout, then keep action effects (e.g. hover, lightboxes etc) in separate css sheets so they can be loaded in the background if required.
  14. There are many reasons for a drop; sometimes it's as simple as your competitors doing better. A good place to start would be a backlink audit; one possibility is that some of the sites where you were creating links have little to no value, or they're simply link farms, which go against the search engine guidelines. It's also worth checking the link text; overcooking terms is a common mistake with links. If you had 100 links and say 20% all had "Best web design company" as the anchor, then it would potentially look suspicious; most natural links use text like just the url, the company name or the article title, depending.
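The 20% figure above is easy to check mechanically once you have an export of your anchor texts; a minimal sketch (hypothetical helper name, sample data invented for illustration):

```javascript
// Count how often each anchor text appears in a backlink export
// and return each text's share of the total.
function anchorShares(anchors) {
  const counts = new Map();
  for (const text of anchors) {
    counts.set(text, (counts.get(text) || 0) + 1);
  }
  const total = anchors.length;
  const shares = {};
  for (const [text, n] of counts) {
    shares[text] = n / total;
  }
  return shares;
}

// Invented sample: 3 exact-match anchors out of 5 is a 0.6 share,
// which would look far more suspicious than a branded mix.
const sample = [
  'Acme Ltd',
  'acme.example',
  'Best web design company',
  'Best web design company',
  'Best web design company',
];
```

Anything dominated by one commercial phrase is worth a closer look; natural profiles skew towards brand names and bare urls.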
  15. BrowserBugs

    Schema.org - which ones to use?

    Microdata is about layering context onto the information found on any given page. Beyond featured snippets, it's there to help search engines gather a better understanding of the content provided and how the information connects together at a site-wide level. For example, a directory listing businesses can turn each result div into a "LocalBusiness", highlighting the "address", "telephone", "url" and "openingHours". This means the search engines need not guess what content belongs together, and by marking the page as a "SearchResultsPage" it now has a clearer idea what the purpose of the page is; it would see 10 businesses listed on a SearchResultsPage about that business type in that area. If you want some sites to scan, drop me a personal message; I can give you live examples so you can see its capability.
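The same LocalBusiness idea can be expressed as JSON-LD instead of inline microdata; a hedged sketch (helper name and all business details invented for illustration):

```javascript
// Build a schema.org LocalBusiness object ready to serialise as JSON-LD.
// Every value below is made up purely for illustration.
function localBusinessJsonLd({ name, telephone, url, address, openingHours }) {
  return {
    '@context': 'https://schema.org',
    '@type': 'LocalBusiness',
    name,
    telephone,
    url,
    address: { '@type': 'PostalAddress', ...address },
    openingHours,
  };
}

const jsonLd = localBusinessJsonLd({
  name: 'Example Cafe',
  telephone: '+44 1234 567890',
  url: 'https://example.com/',
  address: { streetAddress: '1 High Street', addressLocality: 'Epsom' },
  openingHours: 'Mo-Fr 09:00-17:00',
});
// Embed in the page head as:
// <script type="application/ld+json">JSON.stringify(jsonLd)</script>
```

One object per business result on a SearchResultsPage gives the crawler the same grouping the microdata attributes would.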
  16. BrowserBugs

    Schema.org - which ones to use?

    What you use really does depend on the page in question. I don't go for WebSite myself (unless referring to another website); I start with what type of page it is, e.g. WebPage, AboutPage, ContactPage etc. It's a big subject; if looking to dip your toe in with articles, then How to Boost Your SEO by Using Schema Markup by Neil Patel is not a bad place to start. The thing to remember with microdata is that it's about marking up what's relevant, turning things like an article or a site navigation element into entities.
  17. BrowserBugs

    Hello from Norfolk

    Welcome mate
  18. Just to follow on if using parameters be careful how much you let the search engines index / crawl. It's best to always use the same order and validate the final url, e.g. if using ?type=restaurant&orderby=rating don't start then using ?orderby=rating&type=restaurant. It can get out of hand very quickly, I've seen some horrific issues around this first hand. Currently working on an experiment with this testing blocking vs canonical (with a hint of noindex). Send me a PM if you want a link, still a WIP but the basics are in place.
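The same-order rule above can be enforced mechanically before a URL is ever emitted or canonicalised; a minimal sketch (hypothetical helper, example.com used as a stand-in) that sorts query parameters so both orderings collapse to one URL:

```javascript
// Normalise a URL so its query parameters always appear in one
// stable (alphabetical) order, collapsing duplicate variants.
function stableQuery(rawUrl) {
  const u = new URL(rawUrl);
  const entries = [...u.searchParams.entries()]
    .sort(([a], [b]) => a.localeCompare(b));
  u.search = new URLSearchParams(entries).toString();
  return u.href;
}

// ?type=restaurant&orderby=rating and ?orderby=rating&type=restaurant
// both normalise to the same single URL.
```

Generating links only through a helper like this is one way to stop the parameter-order explosion before it reaches the index.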
  19. This depends on the content; think of the structure like a file tree: where would the content correctly sit? If the content found at /portfolio is a list of all the portfolio categories / an "all portfolios" list, and kitchen renovations acts like a filter / section showcasing kitchen renovations, then I would use nesting like /portfolio/kitchen-renovations, /portfolio/bathrooms etc. Putting everything at root level is messy.

     Simply put, it's the difference between grabbing all your post and chucking it in one box vs putting it in folders, one marked statements, one bills etc. Inside statements you might need another folder to separate statements by account to manage them. Now you want to find the personal account statement for April 2016; which system would you rather look through? People are used to My Pictures > Spain 2016; it's comfortable.

     I have to disagree mate, and these points conflict: if internal linking matters, then why keep it simple and not use breadcrumbs? It's relevant internal linking in its purest form! I agree URL structure is the lesser of the two, but it helps search engines build blueprints of a website.

     Edit: Forgot to say, breadcrumbs and nesting also work better in search results, like:

     No crumbs:
     Kitchen Renovations | Company Name
     company.com/portfolio/kitchen-renovations

     With crumbs:
     Kitchen Renovations | Company Name
     company.com > Portfolio

     The searcher can see that "kitchen renovations" is just part of this company's "portfolio"; wonder what else they might do?
  20. Spot on @BlueDreamer ... horses for courses; it usually depends on whether the "thing" is mightier than the "location". This is where a url might not actually reflect the structure, which is where breadcrumbs come into their own.

     Home > Surrey > Epsom > Restaurants
     adomain.com/restaurants/surrey/epsom/

     Home > Surrey > Epsom > Hotels
     adomain.com/hotels/surrey/epsom/

     The breadcrumbs would still produce the desired architecture, with all restaurants vouching that they are in Epsom in Surrey. The same can be done for the actual establishments.

     Home > Surrey > Epsom > Restaurants > Big Bobs Burgers
     adomain.com/profiles/big-bobs-burgers/

     Edit: Would say from a visitor's point of view it would be clearer if they landed on the Restaurants in Epsom page. Logic would say they click one level up for all restaurants in Surrey, two levels up for all the restaurants on the site. Flip side: if it was more a market for things to do in Epsom, then your location example would be a better fit. Love this industry; no set way to do anything, it's what works best in each situation.
  21. There is a huge benefit to information architecture combined with structure, from both a user experience and a search engine perspective; you might not have seen this depending on the size of the websites you've worked on. It's not just the url as such; sometimes it's not possible for technical reasons to stack them perfectly, but thanks to breadcrumbs you can still shape a structure. What you should be looking for is parent and child relationships, encouraging them with logical urls and supporting breadcrumbs. So for example:

     Home > Restaurants
     adomain.com/restaurants/

     Home > Restaurants > Surrey
     adomain.com/restaurants/surrey/

     Home > Restaurants > Surrey > Epsom
     adomain.com/restaurants/surrey/epsom/

     This setup would be very clear for visitors and search engines alike. From the "Restaurants in Epsom" page it internally links to its parent "Restaurants in Surrey", and that page then links to its parent "Restaurants in the UK". You might also have adomain.com/restaurants/surrey/dorking/, which also votes via the breadcrumbs.

     Where this method really comes into its own is when the time comes to make a page for "Restaurants in Guildford": when you add the new page under Surrey it gains the benefit of the established parent pages, inheriting the strength of "Restaurants" and "Surrey" and leaving only the term "Guildford" as new. It doesn't take long for these pages to scoot right up the rankings; however, don't drill down to the point of thin content; divide only as much as required into logical content groups. Just my 2 pence as always
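The Restaurants > Surrey > Epsom chain above maps directly onto schema.org's BreadcrumbList; a sketch (hypothetical helper, using the post's adomain.com placeholder) that builds the JSON-LD from ordered [name, slug] pairs:

```javascript
// Build BreadcrumbList JSON-LD from an ordered list of [name, slug]
// pairs, accumulating the nested path as it goes.
function breadcrumbJsonLd(base, pairs) {
  let path = '';
  const itemListElement = pairs.map(([name, slug], i) => {
    path += '/' + slug;
    return {
      '@type': 'ListItem',
      position: i + 1,
      name,
      item: base + path + '/',
    };
  });
  return {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement,
  };
}

const crumbs = breadcrumbJsonLd('https://adomain.com', [
  ['Restaurants', 'restaurants'],
  ['Surrey', 'surrey'],
  ['Epsom', 'epsom'],
]);
```

Each item's URL is its parent's URL plus one segment, so the markup and the nested url structure reinforce the same parent/child votes.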
  22. Bang on mate. Got a client who started out his career as a Hod Carrier and now runs his own construction company. People like this inspire me to do better.
  23. BrowserBugs

    I'm Back.

    Welcome back 😄
  24. I can see your points @rbrtsmith and @Jack - but I don't think either of the companies you work (worked) for would charge $800 to switch a domain name; devil's advocate and all that
  25. BrowserBugs

    How Much Should I Charge. Pricing

    Not forgetting the +10% "no idea" scope. Also: forgot to ask, what are we measuring 3500-5000 in?