What’s technical SEO? 8 technical aspects everyone should know
An SEO Basics post about technical SEO might seem like a contradiction in terms. Nevertheless, some basic knowledge about the more technical side of SEO can mean the difference between a high ranking site and a site that doesn’t rank at all. Technical SEO isn’t easy, but here we’ll explain – in layman’s language – which aspects you should (ask your developer to) pay attention to when working on the technical foundation of your website.
What is technical SEO?
Technical SEO refers to improving the technical aspects of a website in order to increase the ranking of its pages in the search engines. Making a website faster, easier to crawl and understandable for search engines are the pillars of technical optimization. Technical SEO is part of on-page SEO, which focuses on improving elements on your website to get higher rankings. It’s the opposite of off-page SEO, which is about generating exposure for a website through other channels.
Why should you optimize your site technically?
Google and other search engines want to present their users with the best possible results for their query. Therefore, Google’s robots crawl and evaluate web pages on a multitude of factors. Some factors are based on the user’s experience, like how fast a page loads. Other factors help search engine robots grasp what your pages are about. This is what structured data, among other things, does. So, by improving technical aspects, you help search engines crawl and understand your site. If you do this well, you might be rewarded with higher rankings or even rich results.
It also works the other way around: if you make serious technical mistakes on your site, they can cost you. You wouldn’t be the first to block search engines entirely from crawling your site by accidentally adding a trailing slash in the wrong place in your robots.txt file.
But it’s a misconception you should focus on technical details of a website just to please search engines. A website should work well – be fast, clear and easy to use – for your users in the first place. Fortunately, creating a strong technical foundation often coincides with a better experience for both users and search engines.
What are the characteristics of a technically optimized website?
A technically sound website is fast for users and easy to crawl for search engine robots. A proper technical setup helps search engines understand what a site is about, and it prevents confusion caused by, for instance, duplicate content. Moreover, it doesn’t send visitors or search engines down dead-end streets via broken links. Here, we’ll briefly go into some important characteristics of a technically optimized website.
- It’s fast
Nowadays, web pages need to load fast. People are impatient and don’t want to wait for a page to open. In 2016 already, research showed that 53% of mobile website visitors will leave if a webpage doesn’t open within three seconds. So if your website is slow, people get frustrated and move on to another website, and you’ll miss out on all that traffic.
Google knows slow web pages offer a less than optimal experience. Therefore, it prefers web pages that load faster. So, a slow web page also ends up further down the search results than its faster equivalent, resulting in even less traffic.
Wondering if your website is fast enough? Read how to easily test your site speed. Most tests will also give you pointers on what to improve. We’ll guide you through common site speed optimization tips here.
- It’s crawlable for search engines
Search engines use robots to crawl or spider your website. The robots follow links to discover content on your site. A great internal linking structure will make sure that they’ll understand what the most important content on your site is.
But there are more ways to guide robots. You can, for instance, block them from crawling certain content if you don’t want them to go there. You can also let them crawl a page, but tell them not to show this page in the search results or not to follow the links on that page.
You can give robots directions on your site by using the robots.txt file. It’s a powerful tool, which should be handled carefully. As we mentioned in the beginning, a small mistake might prevent robots from crawling (important parts of) your site. Sometimes, people unintentionally block their site’s CSS and JS files in the robots.txt file. These files contain code that tells browsers what your site should look like and how it works. If those files are blocked, search engines can’t find out if your site works properly.
All in all, we recommend really diving into robots.txt if you want to learn how it works. Or, perhaps even better, let a developer handle it for you!
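To give you an idea of what such a file looks like, here’s a minimal sketch of a robots.txt file. The blocked path and sitemap URL are hypothetical examples, not recommendations for your own site:

```txt
# These rules apply to all robots
User-agent: *

# Hypothetical example: keep robots out of internal search result pages
Disallow: /internal-search/

# Never block your CSS and JS files; search engines need them
# to check how your pages render and work

# Point robots to your XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap_index.xml
```

An empty `Disallow:` line (or no Disallow rules at all) means robots may crawl everything, which is a safe default for most sites.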
The meta robots tag
The meta robots tag is a piece of code that you won’t see on the page as a visitor. It sits in the source code, in the so-called head section of a page. Robots read this section when they find a page; in it, they’ll find instructions about what to do with the page’s content and links.
If you want search engine robots to crawl a page, but keep it out of the search results for some reason, you can tell them so with the meta robots tag. The meta robots tag also lets you instruct them to crawl a page, but not to follow the links on that page.
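As a sketch, this is what those two instructions look like in the head section of a page’s HTML:

```html
<head>
  <!-- Keep this page out of the search results, but do follow its links -->
  <meta name="robots" content="noindex, follow">

  <!-- Or the reverse: allow indexing, but don't follow links on this page -->
  <!-- <meta name="robots" content="index, nofollow"> -->
</head>
```

A page would use only one robots meta tag; the second is shown as a comment to illustrate the alternative.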
- It doesn’t have (many) dead links
We’ve discussed that slow websites are frustrating. What might be even more annoying for visitors than a slow page, is landing on a page that doesn’t exist at all. If a link leads to a non-existing page on your site, people will encounter a 404 error page. There goes your carefully crafted user experience!
What’s more, search engines don’t like to find these error pages either. And, they tend to find even more dead links than visitors encounter because they follow every link they bump into, even if it’s hidden.
Unfortunately, most sites have (at least) some dead links, because a website is a continuous work in progress: people make things and break things. Fortunately, there are tools that can help you retrieve dead links on your site. Read about those tools and how to solve 404 errors.
To prevent unnecessary dead links, you should always redirect the URL of a page when you delete it or move it. Ideally, you’d redirect it to a page that replaces the old page. With Yoast SEO Premium, you can easily make redirects yourself. No need for a developer!
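If you’re not using a plugin, a redirect can also be set up at the server level. As a hedged example, on an Apache server this one line in the .htaccess file would do it (the URLs are hypothetical):

```apacheconf
# 301 (permanent) redirect from a deleted page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```

The 301 status code tells search engines the move is permanent, so they can transfer the old page’s ranking signals to the new one.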
Read more: https://yoast.com/what-is-a-redirect/
- It doesn’t confuse search engines with duplicate content
If you have the same content on multiple pages of your site – or even on other sites – search engines might get confused. Because, if these pages show the same content, which one should they rank highest? As a result, they might rank all pages with the same content lower.
Unfortunately, you might have a duplicate content issue without even knowing it. For technical reasons, different URLs can show the same content. For a visitor, this doesn’t make any difference, but for a search engine it does: it’ll see the same content on different URLs.
Luckily, there’s a technical solution to this issue. With the so-called canonical link element, you can indicate which page is the original – or the page you’d like to rank in the search engines. In Yoast SEO, you can easily set a canonical URL for a page. And, to make it easy for you, Yoast SEO adds self-referencing canonical links to all your pages. This helps prevent duplicate content issues that you might not even be aware of.
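The canonical link element is just one line in the head section of a page. A minimal sketch, with an example URL:

```html
<head>
  <!-- Tells search engines which URL is the original version of this content -->
  <link rel="canonical" href="https://www.example.com/original-page/">
</head>
```

Every duplicate version of the page would carry this same tag, all pointing at the one URL you want to rank.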
- It’s secure
A technically optimized website is a secure website. Making your website safe for users and guaranteeing their privacy is a basic requirement nowadays. There are many things you can do to make your (WordPress) website secure, and one of the most crucial is implementing HTTPS.
HTTPS makes sure that no one can intercept the data that’s sent between the browser and the site. So, for instance, if people log in to your site, their credentials are safe. You’ll need a so-called SSL certificate to implement HTTPS on your site. Google acknowledges the importance of security and has therefore made HTTPS a ranking signal: secure websites rank higher than their unsafe equivalents.
You can easily check whether your website uses HTTPS in most browsers. On the left-hand side of your browser’s address bar, you’ll see a lock if the site is secure. If you see the words “not secure”, you (or your developer) have some work to do!
Read more: SEO Basics: What is HTTPS?
- Plus: it has structured data
Structured data helps search engines understand your website, content, or even your business better. With structured data, you can tell search engines what kind of products you sell or which recipes you have on your site. Plus, it gives you the opportunity to provide all kinds of details about those products or recipes.
Because there’s a fixed format (described on Schema.org) in which you should provide this information, search engines can easily find and understand it. It helps them to place your content in a bigger picture. Here, you can read a story about how it works and how Yoast SEO helps you with that. For instance, Yoast SEO has free structured data content blocks for How-to and FAQ content.
Implementing structured data can bring you more than just a better understanding by search engines. It also makes your content eligible for rich results; those shiny results with stars or details that stand out in the search results.
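To make this concrete, here’s a minimal sketch of structured data in the JSON-LD format, using the Schema.org Recipe type. The recipe name, author, and ingredients are made-up examples:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Example pancake recipe",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "recipeIngredient": ["flour", "milk", "eggs"]
}
</script>
```

Because the format is fixed, a search engine can read the `@type` field, recognize this as a recipe, and potentially show it as a rich result.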
- Plus: It has an XML sitemap
Simply put, an XML sitemap is a list of all pages of your site. It serves as a roadmap for search engines on your site. With it, you’ll make sure search engines won’t miss any important content on your site. The XML sitemap is often categorized in posts, pages, tags or other custom post types and includes the number of images and the last modified date for every page.
Ideally, a website doesn’t need an XML sitemap. If it has an internal linking structure which connects all content nicely, robots won’t need it. However, not all sites have a great structure, and having an XML sitemap won’t do any harm. So we’d always advise having an XML sitemap on your site.
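For illustration, here’s a minimal sketch of an XML sitemap with a single entry; the URL and date are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- The page's address and when it was last modified -->
    <loc>https://www.example.com/an-important-page/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```

A real sitemap simply repeats the `<url>` block for every page, and plugins like Yoast SEO generate and update it automatically.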
- Plus: International websites use hreflang
If your site targets more than one country, or several countries where the same language is spoken, search engines need a little help to understand which countries or languages you’re trying to reach. If you help them, they can show people the right website for their area in the search results.
Hreflang tags help you do just that. You can define for a page which country and language it is meant for. This also solves a possible duplicate content problem: even if your US and UK site show the same content, Google will know it’s written for a different region.
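As a sketch, hreflang tags go in the head section of each page and list every language/region version, including the page itself. The URLs here are hypothetical:

```html
<head>
  <!-- The same page, announced for US and UK English audiences -->
  <link rel="alternate" hreflang="en-us" href="https://www.example.com/us/">
  <link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/">
  <!-- Fallback for visitors from all other languages and regions -->
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/">
</head>
```

Note that each version should list all the others (and itself) in the same way, so the tags reciprocate across your US and UK pages.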