
The Three Ss of Technical SEO Explained

Search engine optimisation is becoming ever more user-focused. Content, context, and conversation are more important than ever, as search engine algorithms become better at understanding what is on a page, what is trusted content and opinion, and whether that page is relevant. With all that taken into account, it could be argued that the technical side of SEO matters less. Search engines have improved at extracting meaning from pages, allowing the best content to rise to the top and the most talked-about issues to stay prominent for longer.

I say that technical improvements for search are more important than ever, not for the sake of directly pleasing algorithms, but for improving the user experience, one of the largest factors in search. Whilst hundreds of individual technical tasks may fall under the remit of a search engine optimisation specialist, everything can be framed within three factors:

  1. Speed
  2. Structure
  3. Security

If you want a checklist of tasks to help you rank well in search, you’re in the wrong place (though there are plenty of other posts, on this site and others, which will help you with that). My goal here is to deliver the why behind technical optimisation: to help you understand what matters to your users and to search engines, and hopefully to help you think of Google, Bing or Yahoo as just another individual with needs and wants to satisfy.

Speed, or don’t be seen

Let me tell you a little story about speed.

Google’s company philosophy makes it clear that user focus and speed matter, stating that:

Fast is better than slow

Focus on the user and all else will follow

Those statements hold true for Google’s search algorithms as well. Since 2010, page speed has been a factor in Google’s rankings, covering a dozen metrics which consider page size, scripting, and on-page speed optimisation – all in the context of user experience.
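If you want a rough, first-pass feel for how quickly your own pages come back, a few lines of Python are enough to time a full page fetch. This is only a sketch (not what Google’s speed metrics actually measure), and the URL is a placeholder:

    # Rough sketch: time a full page fetch as a crude proxy for load speed.
    # Real tools (PageSpeed Insights, Pingdom) measure far more than this.
    import time
    import urllib.request

    PAGES = ["https://www.example.com/"]  # placeholder URLs - use your own

    for url in PAGES:
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as response:
            body = response.read()
        elapsed = time.perf_counter() - start
        print(f"{url}: {elapsed:.2f}s to download {len(body) / 1024:.0f} KB")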

So, why is speed considered a factor in search?

In the last five years, there has been an observable decrease in the attention spans of people online. There has also been a rapid shift toward mobile and tablet devices, changing the way we search and consume content. When browsing on mobile, information is often time-sensitive and subject to the constraints of mobile networks, meaning that slow-loading pages are frequently abandoned.

Some experts in the SEO industry call this behaviour ‘pogo-sticking’ – jumping away from a page before or just after it loads. This has a large negative impact on your ranking. Not only will Google rank your site less highly for that pogo-sticking user, but it will also factor into your overall ranking for that page. Google interprets pogo-sticking as a user not finding what they want, and so improves the user experience by decreasing the ranking of a slow-loading page.

In addition, speed has a massive effect on conversion. When Mozilla optimised its Firefox download page for speed, a 2.2-second decrease in page load time resulted in a 15.4% increase in downloads. Admittedly, it’s not clear how long this test was run for, but it demonstrates a strong correlation (for the data lovers), and the additional 60 million downloads this speed improvement delivered in 2010 aren’t to be sniffed at.

Speed is all about how quickly your users can get from a search engine results page to the information they want, which, when combined with a solid structure, is vital for users and search engines alike…

Solid structure supports everything

Algorithms seek structure.

Structure (done right) should almost give preference to search-engine bots over users. Users are adaptable; people, after all, have been developing and changing since the year… and, for better or worse, if your users want something, they will put up with a bit of momentary processing to reach their goal. Algorithms, on the other hand, are less fluid: they can only use the data given to them, rather than making assumptions like you or me.

Giving search engines more data has its risks and rewards. More data gives more points of comparison, which can positively affect your site’s performance or, if your site is intentionally misleading, poorly configured, or badly designed, provide more negative ranking factors. There are several steps you can take to make sure that search engines understand how all the pages fit together, sometimes by doing little more than creating links.

[Image: link pyramid]

Build your site with a logical structure with pages getting more specific the deeper you go – Moz
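One of the simplest ways to hand that hierarchy to a search engine is an XML sitemap. As an illustration only (the URLs are hypothetical; a real site would generate the list from its CMS or a crawl), a sitemap for a small, logically structured site can be produced in a few lines of Python:

    # Sketch: build a minimal XML sitemap for a small, hierarchical site.
    # The URLs are placeholders; list every indexable page on your own site.
    from xml.sax.saxutils import escape

    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/services/",
        "https://www.example.com/services/seo/",
        "https://www.example.com/services/seo/technical-audit/",
    ]

    entries = "\n".join(f"  <url><loc>{escape(url)}</loc></url>" for url in PAGES)
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )
    print(sitemap)  # save as sitemap.xml and reference it in robots.txt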

Remember a while back I said I would explain ‘the why’? Well, the why for structure is that algorithms, for all their knowledge, make assumptions in a way that attempts to mimic user interaction but in many cases comes nowhere close to parity.

Sure, you can use sitemaps to demonstrate the context of a website holistically. You can also use microdata to add extra semantic ‘meaning’ and information priority, guiding the eyes of search engines toward what matters, but neither (yet) replaces human monitoring and curation. That’s why search engines use data from your browsing habits to inform the value that a page or site provides.
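To give a flavour of what that semantic ‘meaning’ looks like in practice, here is a minimal sketch of schema.org structured data for a blog post, expressed as JSON-LD (an alternative syntax to inline microdata which the major search engines also read). The author and date values are placeholders:

    # Sketch: schema.org structured data for a blog post, output as JSON-LD.
    # Search engines accept this alongside (or instead of) inline microdata.
    import json

    article = {
        "@context": "https://schema.org",
        "@type": "BlogPosting",
        "headline": "The Three Ss of Technical SEO Explained",
        "author": {"@type": "Person", "name": "Example Author"},  # placeholder
        "datePublished": "2015-01-01",  # placeholder
    }

    # Embed the output in the page's <head> so crawlers can parse it.
    print('<script type="application/ld+json">'
          + json.dumps(article, indent=2)
          + "</script>")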

Remember back at the top of this section when I said structure is almost more for search engines than for users? That’s not entirely true. Search engines do currently prioritise site and page structure over data gathered about user habits, though for how long is hard to tell.

Security shouldn’t be an afterthought

Security is all about keeping the wrong users out and letting the right ones in.

Search engines like consistency: they value domains that have been established for years far more highly than those which are months old, for example. As such, consistency should run through every aspect of a search strategy, from content to engaging in conversation, something which is impossible if your site is down for days or is penalised due to bad code or practice.

Sites that are poorly secured, have limited time left on their domain registration, or have long periods of downtime are not friends of search engines. With that in mind, security and consistency should be your first priority. Everything from backups, preventing the insertion of malicious code, and server maintenance to ensuring adequate password policies comes under security. It’s about ensuring that you always have the greatest possible control over your online presence, and proving that what your site delivers is trustworthy and authoritative.
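By way of illustration only (the URL and header list are placeholders, and this is nowhere near a full security audit), a first sanity check is simply confirming that your site answers over HTTPS and sends a couple of common security headers:

    # Sketch: check a page responds over HTTPS and sends basic security headers.
    # This is an illustration, not a substitute for a proper security audit.
    import urllib.request

    URL = "https://www.example.com/"  # placeholder
    HEADERS_TO_CHECK = ["Strict-Transport-Security", "X-Content-Type-Options"]

    try:
        with urllib.request.urlopen(URL, timeout=10) as response:
            print(f"{URL} responded with HTTP {response.status}")
            for header in HEADERS_TO_CHECK:
                value = response.headers.get(header)
                print(f"  {header}: {value or 'missing'}")
    except OSError as exc:
        # Repeated failures here are exactly the downtime search engines dislike.
        print(f"{URL} is unreachable: {exc}")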

Despite being one of the most important elements of managing your ranking in search, security is the one most often overlooked. It’s an area which many ‘Search Engine Professionals’ don’t realise they know too little about until it’s too late.

[Image: hacked site warning]

Getting this warning will be the kiss of death for your click-through rate, at the very least.

Think about it this way: just as you’re trying to gain ranking in search, so are other businesses, and in highly competitive sectors some organisations consider reducing your ranking a valid element of their own optimisation strategy.

Let me give you a worst-case scenario of what lax security can lead to, one which happened to a previous client of mine…

A site in a highly competitive field saw increases in authority and ranking due to an aggressive link-building strategy, which involved gaining several thousand links a month, fuelled by content marketing.

Whilst the majority of these links were from respectable sources, this vast increase in links made it less obvious that negative SEO (the practice of building low-value links, or performing other actions which reduce a site’s authority) was being carried out on the site by a competitor.

Due to a lack of effective monitoring and limited concern about low-quality links, the site received a Google penalty, which took six months and several thousand pounds to recover from and resulted in tens of thousands of pounds in lost leads from search.

Technically perfect SEO

With search becoming ever more mobile-focused, and users expecting increasingly accurate and rapid responses from search, Google and the other engines have had to consider more than just links and content. As search marketing continues to progress, I envision a landscape where it is ever less acceptable for a site to be technically weak. Every user interaction with your site will (sooner or later) become a ranking factor – from bounce rate, to time on site, to engagement through social – and speed, structure and security are the three factors which, when neglected, will hurt usability more than any other. That is why you should value a solid technological foundation: when 90% of your users are lost before they even reach your site, achieving conversion or ranking targets becomes impossible.

If you have any questions about search and how it can benefit your business, I’ll be happy to answer them in the comments below.

…or if you fancy a chat I’m always available @YianniPelekanos on Twitter.

  • Phil Szomszor

    Good post. I’m not an SEO expert, more of a hobbyist, but it’s an interesting set of recommendations. Looking at site architecture is often overlooked. I started using some new tools recently and found out my blog was a complete mess (broken links, duplicate page titles, all that stuff). The trouble is, it all takes a while to fix. I guess that’s potentially an emerging role for the SEO industry, acting as a plumber to sort out years of mistakes.

    • YiannisAtKlaxon

      I think that SEO managers have had to be digital plumbers for a while, but you’re right, it has become a lot more important due to recent algorithm updates.

      Out of curiosity, what SEO software are you using? We use a number of suites at Klaxon, and I always find it fascinating to know what tools people use.

As for Search Engine Optimisation itself, I have a feeling its reach is going to keep on getting broader; social media and content marketing have already been pulled in…

      • Phil Szomszor

        We’re using Moz, plus GA, Google Webmaster tools and Keyword tools. Nothing too sophisticated, but we get enough from these to have some things to work on and monitor our progress.

  • Jon Buscall (http://jontusmedia.com/)

    Nicely put! I do think that speed is an incredibly important factor. One of the best reasons to keep plugins to a minimum on WordPress!

    We always include speed testing as part of the SWOT analysis for customers; it’s surprising how many customers will tolerate slower page load times on their own sites because they *know* what’s coming, but will click away in irritation on other sites.

    Right now, the layperson doesn’t really have a clue what is acceptable and what’s not, so it can actually be hard to have this discussion. But posts like this really help! It’s up to all of us in the industry to dispel the SEO tweak-the-code myths of days gone by. We still get customers asking us to “fine tune the code to appear at the top of google”. Funnily enough, they’ll pay for that but are more wary of investing in outsourced content creation.

    • YiannisAtKlaxon

      I completely agree with your load-time comment.

      I like to think about search engines as just another user, and as with any user, they have needs which a site has to accommodate. Deep, well-researched and engaging content is one of these needs, as are site structure, speed and security. Just as a site is unlikely to engage users if there are no interesting posts, videos, audio, etc., it will be the same for getting ranked in search.

      If clients are unwilling to appreciate the need for speed improvements, I would run their site through http://tools.pingdom.com/fpt/ and compare it to their favourite sites. It can be quite eye-opening to see in black and white that a site takes ages to load.
