Why Every WordPress Developer Needs to Know SEO
A Checklist of the Functionality Features a Website Should Have
WordPress is the most popular content management system, used by 39% of all websites today. It is easy to set up and use. It’s flexible, updated on a regular basis, and there is no shortage of quality WordPress talent, either.
Aside from the traditional skills associated with WordPress development, every software engineer specializing in WordPress also needs to understand the basics of Search Engine Optimization (SEO). This is true of most web developers, but SEO skills are particularly important in WordPress work because of the platform's popularity, because it's often employed by small businesses, hobbyists, and nonprofits, and because WordPress sites are frequently maintained by non-technical users.
Why WordPress SEO Skills Matter
Let’s assume your client hired you to build a site with a good user experience on a tight budget, and you immediately conclude WordPress is the right platform, given their requirements and available resources. You are aware that the website must also be optimized for search engines before it launches. On top of providing the best user experience, you also have to lay the foundations for good SEO practices. Fortunately, WordPress is ready to deliver on that front as well, now more than ever.
WordPress is excellent for SEO, and as an experienced web developer who knows their way around advanced SEO techniques, I stand by WordPress and its capabilities:
- WordPress is focused on ensuring a good user experience by offering themes and plugins that help make websites appear professional and attractive. A good user experience is a huge SEO factor. The ability to easily craft a great user experience does wonders for the website’s rankings.
- With WordPress you can easily create attractive permalinks containing the keywords your client’s audience actually uses to search on Google. These permalinks are human-friendly, and visitors can guess what a page is about just by looking at the URL.
- WordPress makes metadata easily manageable
- You can easily optimize images for SEO by creating alt text and resizing them so that they won’t hurt the load speed
Finally, WordPress supports plugins that improve load times and mobile/responsive behavior and let you integrate your campaigns with social media. And let’s not forget the powerful SEO plugins that further facilitate optimization.
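The keyword-friendly permalinks mentioned above are configured under Settings → Permalinks in the WordPress admin; a common choice uses the built-in structure tags, for example:

```text
/%category%/%postname%/
```

With this setting, a post titled "Best SEO Tips" in the "Guides" category would get a URL ending in /guides/best-seo-tips/.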
Adding Microdata in WordPress
Moreover, you can add microdata in WordPress and enhance the SEO features of the site. By adding microdata, the search engines will better understand what the site’s content is about. You can add microdata for products or posts, but first you need to check whether your theme already uses microdata.
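As an illustration, here is a minimal snippet using schema.org Product microdata; the product name, image, and price are placeholders invented for the example, and it would only be added if the theme doesn't already emit microdata:

```html
<!-- Illustrative schema.org Product microdata; all values are placeholders -->
<div itemscope itemtype="https://schema.org/Product">
  <h1 itemprop="name">Example Product</h1>
  <img itemprop="image" src="example-product.jpg" alt="Example Product">
  <div itemprop="offers" itemscope itemtype="https://schema.org/Offer">
    <span itemprop="priceCurrency" content="USD">$</span>
    <span itemprop="price" content="19.99">19.99</span>
  </div>
</div>
```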
Finally, setting up Accelerated Mobile Pages (AMP) in WordPress affects the site’s speed, which is a critical factor when it comes to SEO and page rankings. Here is how to do it:
- Step 1: Install the AMP WordPress plugin
- Step 2: Activate the AMP WordPress plugin and note that it will be active on all pages; however, it won’t immediately redirect mobile visitors to your /amp pages.
- Step 3: Edit your .htaccess file
- Step 4: Check whether your AMP pages are working across the board
- Step 5: Edit the CSS to personalize your AMP look.
- Step 6: Verify your AMP pages on Google Search Console so that the Google crawling bots can crawl and index them. Also, Google Search Console will notify you whether there are any errors in your AMP pages.
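For Step 3, a rewrite rule along these lines is one way to send mobile visitors to the /amp versions. This is a sketch, not a drop-in rule: the user-agent pattern and the /amp URL suffix are assumptions that need adjusting to your plugin's URL scheme:

```apache
# Sketch: redirect mobile visitors to AMP pages (adjust patterns to your site)
RewriteEngine On
RewriteCond %{REQUEST_URI} !/amp$ [NC]
RewriteCond %{HTTP_USER_AGENT} (android|iphone|mobile) [NC]
RewriteRule ^(.*)$ /$1/amp [L,R=302]
```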
WordPress and Well-Performing Websites
These are just some of the reasons why WordPress can be used to create well-performing websites that deliver a good user experience while ensuring good rankings on search engines. It’s crucial that the developers who build WordPress sites, as well as those who later administer and maintain them, understand SEO basics so that they can apply best practices, especially the technical ones, in the early phases of building the site. Doing everything by the book enables Google to properly index and evaluate the website, which in the long run results in more search traffic, revenue, and satisfied clients.
Building a good-looking WordPress site that isn’t SEO-optimized won’t do much good, as it may end up buried dozens of pages deep in the Google Search Engine Results Pages (SERPs). Few people will visit it because few will find it, and the top results tend to outperform lower-ranking sites by a sizable margin. This is why WordPress developers need SEO skills and why they have to communicate the importance of SEO to their clients.
Developers should use their knowledge of SEO to steer the client in the right direction during development and, at the same time, make sure the client knows when to consult an SEO expert. Once the website is technically prepared to earn its place on Google and establish online visibility, the SEO experts take over. Since SEO is not done only on-page, developers should advise their clients to consider hiring SEO experts to continue the optimization work until the desired online performance is achieved. Also, developers must ensure that everything in their purview is implemented properly, as clients sometimes insist on developing their site in a way that may hurt SEO.
A competent developer should try to dissuade clients from making poor choices. It’s their professional responsibility, and the more knowledgeable they are, the easier it is to convince clients to abandon ideas that might lead to unsatisfactory outcomes.
Let’s get started with the basics: URL and SSL.
The Importance of URL Structures and SSL
What you need to know about URLs from an SEO point of view is that optimizing the structure according to WordPress best practices is of great importance to how your website will perform on the SERP.
Optimized URL structures provide a good user experience and are easy to understand for human users and search engines alike. So how do we make sure we are doing everything right?
The optimal format of a URL is as follows (an illustrative example):
https://example.com/category/subcategory/page-keyword
Needless to say, keywords that describe categories and subcategories should be included in the URL. Although a URL can be up to 2,048 characters long, keep in mind that the optimal length for SEO is around 60-75 characters, as URLs of that length are most conveniently crawled and indexed by search engine bots.
That being said, let’s take a look at the general benefits of having SEO-optimized URLs:
- Improved user experience – both users and search engines will easily understand what the page is about
- Better rankings – even though a website’s URL structure doesn’t directly influence its rankings, it does lend weight to the authority of the overall domain. Also, keywords in the URL can increase the likelihood of your site ranking for them.
- URLs as anchor text – when pasted as a link in forums, blogs, or social media networks, a descriptive URL gives users a clear idea of what is on the other side of the link
In a nutshell, a properly SEO-optimized URL is simple and concise, includes relevant keywords, and uses hyphens to separate words. There shouldn’t be any underscores, spaces, or special characters separating the words. Finally, URL parameters should be avoided since they are often a source of tracking and duplicate-content issues.
So, how can you optimize URLs according to these SEO recommendations?
- Keep URLs short and simple
- Choose static URLs over dynamic ones
- Keep them organized: use directories, categories, and subfolders rather than subdomains, as subdomains split authority
- Use the right keywords that target the page
- Make it reliable/trustworthy: use https instead of http
- If you are working on a multiregional/multilingual website, place the relevant language marker in the URL sequence as it will enhance the user experience and notify people that browsing in that specific directory will be in the same language
https://restaurantsite.com/en/menu (English language)
https://restaurantsite.com/fr/menu (French language)
- Don’t use stop words (a, an, the)
https://sitename.com/choose-a-gift (bad URL)
https://sitename.com/choose-gift (good URL)
- Don’t use special characters including question marks, apostrophes, exclamation points, ampersands, and asterisks as they will cause indexing problems with search engines. Use only hyphens.
https://sitename.com/what-do-you-need-to-know-&-learn? (bad URL)
https://sitename.com/what-do-you-need-to-know-and-learn (good URL)
The Importance of SSL
When it comes to the importance of SSL for search rankings, Google has stated that it gives a boost in the search rankings, even though it is not a major ranking signal like the quality of the content per se. Google gives an advantage to sites with an SSL certificate because it wants to “encourage all website owners to switch from HTTP to HTTPS to keep everyone safe on the web.” SSL certificates are proof of a website’s trustworthiness, which is, in turn, a valuable ranking factor for Google and other search engines.
According to Google’s Transparency Report, 90.2% of browsing time on Chrome is spent on HTTPS pages, so HTTPS pages have clearly become the default in Google search results. The conclusion: implement the SSL certificate correctly so that you provide an excellent visitor experience and avoid a drop in rankings.
What’s more, research suggests that 85% of online shoppers avoid unsecured websites (GlobalSign), which is another reason why Google has been using HTTPS as a ranking signal since 2014.
Security is crucial, especially for WooCommerce sites where transactions are completed: when users see that a site has an SSL certificate, they feel much more secure about using their credit card or other means of payment.
There are lots of things a developer can do to secure a WordPress site, among them:
- Use managed WordPress hosting
- Disable file editing, since editing theme and plugin files right from the WordPress admin area is a serious security risk
- Add two-factor authentication, a popular security measure that Google, Facebook, and Twitter also employ
The key URL and SSL takeaways WordPress developers need to be able to communicate to their clients are:
- Adhere to WordPress best practices for each new URL
- Use optimized URL structures
- Keep URLs short
- Use relevant keywords in the URLs
- Implement SSL
Most of these items should not be a hard sell, as they are low-hanging fruit. However, in case your client requests something that’s not fully in line with them, be prepared to explain why that wouldn’t be a good idea.
Must-have Pages vs. Good-to-have Pages/Files
Another practice that profoundly affects a WordPress website’s online performance is the inclusion and optimization of must-have pages and files. Here is a list of essential pages you need to include:
404 Page
HTTP 404, or Page Not Found, is a status code indicating that the requested page doesn’t exist. You need a custom 404 page to turn the potentially negative experience of hitting an error into a positive one: it gives users an exit out of the error page.
Archive Page
An archive page boosts the time visitors spend on the site by letting them dig through past content. Visitors find related articles or other topics they are interested in and stay on the website longer, reducing the bounce rate. Engaged visitors also generate more page views.
It is also a good idea to have the following files and pages on your WordPress site:
Category Pages
A category page groups individual pages based on a similar project or theme. Categories provide order and structure to the content or taxonomy of the website. By putting web pages into categories, you allow visitors to easily navigate the content on your site and access the information they seek, improving the overall user experience.
Taxonomy
Taxonomy is a classification system for a website that ensures a better user experience, thus further helping SEO rankings. If a website doesn’t use taxonomy, it will be challenging to navigate, and visitors won’t find what they are looking for as quickly as they could.
Author Page
The author page makes it straightforward to navigate blog content, as it lists everything written by a specific author along with the author’s bio. The byline and bio also let visitors know they are reading content crafted by a professional author rather than a team of anonymous copywriters.
Date Page
Like the author page, the date page lists all blog content published on a particular date. The date lets visitors know they are accessing up-to-date information, which can be very important in fast-paced industries (e.g., an article on SEO practices from 2020 is far more relevant than one from 2012).
Developers often include these pages without paying much attention to them, or they focus on the homepage and service pages and neglect these pages entirely. That is a big mistake, as these pages add significant value to the reliability and trustworthiness of the site. There are no special tricks or activities required to optimize them; developers simply need to remember to build them.
Communicating the importance of these pages and files should not be problematic, though certain clients may not be too keen on some of them. For example, if a client has a blog full of content generated by various freelance copywriters, they might not want to have an author page for each of them, or display the publication date. Try to convince them otherwise: explain why it is important to include them and how they can generate added value for the client.
Users can also write poor headings and title tags, forget to enter image alt text, and so on. Poorly formatted posts, badly written H2s, or images without alt text can adversely affect search rankings. Make sure clients are aware of this so they don’t try to cut any corners.
Files and Elements Crucial for WordPress SEO
Next, we will discuss crucial files and elements every WordPress developer must include in the structure of a website for it to be correctly SEO-optimized. These aren’t just WordPress SEO best practices; they apply to most sites, not just WordPress ones.
The first of these files/elements is the robots.txt file. Its allow and disallow rules determine which pages search engines will crawl and direct the web crawlers to the pages you want evaluated for ranking. If you want to block all web crawlers from the full content of the website, you use this:
User-agent: *
Disallow: /
If you want the opposite, letting the web crawlers crawl all the pages, you should use this:
User-agent: *
Disallow:
Alternatively, if you want to block a specific web crawler from a specific folder, you can use this:
User-agent: Googlebot
Disallow: /example-subfolder/
Finally, to block a specific web crawler from a particular web page, you use this:
User-agent: Googlebot
Disallow: /example-subfolder/blocked-page.html
Search engines update their cached copy of the file at least once a day, but to propagate changes more quickly, submit your robots.txt URL to Google yourself, either by submitting a WordPress SEO sitemap or by using Fetch as Google.
Why is the robots.txt file so important?
Apart from enabling web crawlers to crawl the website and discover what your content is about, robots.txt helps them index the content to appear on the SERP.
How do we optimize robots.txt?
First of all, you need to place it in the website’s top-level directory, and it must be named exactly “robots.txt.” Because this file is publicly accessible, anyone can see which pages you do and don’t want crawled, so don’t use the file to hide private user information.
What about robots meta tags?
Even if we don’t add `<meta name="robots">` to the code, the crawler will find the robots.txt file on the server. The main point of putting this line in the code is to make specific calls, for example, when you want to exclude particular pages from indexing.
To do that, you can add this line:
<meta name="robots" content="noindex" />
This way, the robots meta tag instructs search engines to exclude a specific page from the search results. Furthermore, if you want to address a specific crawler, all you need to do is replace the robots value of the name attribute with the name of the crawler you are addressing.
So, if you only want to prevent Googlebot from indexing the page, use this tag:
<meta name="googlebot" content="noindex" />
What are the benefits of a good robots.txt file?
It will prevent duplicate content from appearing on the SERP, keep sections of the website private per your wishes, prevent search engines from indexing unimportant files such as images and PDFs, and, lastly, specify the location of your sitemaps.
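Putting these pieces together, a small WordPress robots.txt might look like this (the sitemap URL is a placeholder; the wp-admin rules match the WordPress default):

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```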
To enjoy all these benefits, check that you are not blocking any content you want search engines to crawl. Also, since robots.txt should not be used to hide private user information, protect such content with passwords instead. Finally, keep in mind that search engines have multiple user-agents, so be careful which one you target when specifying directives.
The .htaccess file, short for Hypertext Access, is used by web servers to configure a server’s initial settings and make the server behave in the desired way. It belongs in the website’s structure because it configures, on a per-directory basis, features such as password protection, URL rewrites, and redirects.
The .htaccess file also makes it easy to block traffic based on IP address and to restrict access for search engine spiders. Additionally, .htaccess lets the server control how web browsers cache content, reducing data transfer and making pages load faster.
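For example, browser caching can be configured in .htaccess with mod_expires. This is a sketch that assumes the module is enabled on the server; the lifetimes are illustrative:

```apache
# Sketch: browser caching via mod_expires (requires the module to be enabled)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```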
Log files, or log entries, are lists of activities generated automatically by computer servers, network devices, operating systems, and applications. There are three types of log files: request log files, manager log files, and internal concurrent manager log files.
The request log files document the execution of a concurrent program running as the result of a concurrent request.
The manager log files document the performance of a concurrent manager running a request.
The internal concurrent manager log documents the performance of the Internal Concurrent manager.
Because there are different types of servers, log files are managed in different ways, so how you access them depends on the type of server you have: Apache, NGINX, or IIS. The data they include varies, but you can generally see the requesting IP address, the date and time of the request, the requested URL, the HTTP status code, and the user agent.
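As an illustration, here is a minimal Python sketch that extracts those fields from a line in Apache's combined log format; the sample line is made up:

```python
import re

# Combined log format: IP, timestamp, request line, status, size, referrer, user agent
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def parse_log_line(line):
    """Return a dict with ip, time, method, url, status, and agent, or None."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = ('66.249.66.1 - - [10/Mar/2021:13:55:36 +0000] '
          '"GET /blog/seo-tips/ HTTP/1.1" 200 5120 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = parse_log_line(sample)
print(entry["ip"], entry["status"], entry["url"])
```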
Log files can be parsed in a process that splits the data into chunks of information so that it is easier to manipulate and store. The main goal of log parsing is to recognize and group these chunks of information in a meaningful way. Parsing is done by software that executes it in two steps:
- Allocation and population of data structures
Analyzing log files in detail by manual parsing is impractical; instead, automated log parsing tools have emerged. They vary in accuracy, efficiency, and robustness. Examples include Graylog, Nagios, the ELK Stack, LOGalyze, Fluentd, and Logstash.
Raw Log File
A raw log file records every visit to the website, from both people and bots, along with a list of the activities the server created, performed, and maintained. Raw log files are essential because they show exactly how search engine bots process the website.
By analyzing raw log files, you can gain great insights for optimization. For example, you can validate what can or can’t be crawled, identify crawl issues with site-wide implications, understand which pages the search engine bots prioritize, and spot cases of crawl budget waste.
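For instance, grouping Googlebot requests by path can expose crawl-budget waste. Below is a sketch over a handful of already-parsed, made-up log entries; in practice the entries would come from parsing the raw log file:

```python
from collections import Counter

# Hypothetical, already-parsed log entries (url + user agent per request)
entries = [
    {"url": "/blog/seo-tips/", "agent": "Googlebot/2.1"},
    {"url": "/cart/?session=abc", "agent": "Googlebot/2.1"},
    {"url": "/cart/?session=def", "agent": "Googlebot/2.1"},
    {"url": "/blog/seo-tips/", "agent": "Mozilla/5.0"},
]

# Count Googlebot hits per path (query string stripped); a parameterized
# page crawled repeatedly is a classic sign of crawl-budget waste
bot_hits = Counter(e["url"].split("?")[0]
                   for e in entries if "Googlebot" in e["agent"])
for url, hits in bot_hits.most_common():
    print(url, hits)
```

Here Googlebot spends two of its three requests on session-specific /cart/ URLs, a hint that those URLs should be blocked or canonicalized.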
Error Codes and Redirects in WordPress
Based on log analysis, for example, if you find that a page has been permanently removed, you’ll need to serve a 410 response:
# 410 Gone: remove links permanently
Redirect gone /example-of-a-slug/example-of-a-title
ErrorDocument 410 default
However, you need to be careful when handling the 404 and 410 Gone error codes in your .htaccess file because the two differ from one another. The “404 Not Found” error code indicates that the webpage could not be found, which could mean that the page has been removed or moved without the URL being updated accordingly. Of course, the other possibility is that the user entered the URL incorrectly.
On the other hand, 410 Gone error code is a response that indicates a page requested by a user has been permanently deleted, hence the user should not expect any alternative redirection to another address. In that case, use the lines mentioned above.
410 vs. 404
Usually, the 404 code is used when there is no replacement content for a page. If there is a replacement, you need a 301 redirect instead.
When a service or product is no longer offered, you retire the page with no replacement; that is when you use the 404 error code. On the other hand, when you want to speed up the removal of a page from search engine results altogether, you set a 410 response code. With WordPress, this is quite easy: it can be done with a plugin (e.g., the premium version of Yoast SEO) or manually.
Here is how to do it manually:
- Log in to your hosting account
- Access cPanel
- Access the file manager
- Select the public_html folder
- Select the .htaccess file
- Select Edit
- Add the page you want to 410, using the syntax below:
Redirect 410 [page-path]
Example:
Redirect 410 /blog/title-title-title/
To fix a 404 error code, do this:
Redirect 301 /example-of-a-slug/example-of-a-title/ /example-of-a-new-slug/example-of-a-new-article
The 301, 302, and 307 redirects, as well as Meta Refresh, are all types of redirects that send users to a different URL from the one they intended to visit. One of the best optimization practices is to redirect one URL to another to avoid losing visitors.
Here is what you should do: redirect an old page to a new one, redirect an old domain to a new one, redirect the entire domain from non-www to www (or the other way around), redirect the whole domain from HTTP to HTTPS, or combine the two and redirect from non-www to www and from HTTP to HTTPS at the same time.
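For instance, the HTTP-to-HTTPS and non-www-to-www redirects can be combined in a single .htaccess rule. This is a common pattern, sketched here with a placeholder domain:

```apache
# Sketch: force HTTPS and www in one 301 redirect (replace example.com)
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```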
These are the simplest, yet some of the most important WordPress SEO tweaks you can perform.
A crucial element that helps search engines find, crawl, and index all of the content on your website is the sitemap. By creating one, you also tell search engines which pages on the site are most important, and by pointing the search engine bots to them, you significantly increase the chances of attracting traffic. Keep in mind, though, that there are four main types of sitemaps:
- A normal XML sitemap links to the different pages on the website.
- A video sitemap leads Google directly to the video content on a page.
- A news sitemap helps Google find content on pages approved for Google News.
- An image sitemap directs Google to all of the images featured on the site.
The importance of WordPress SEO sitemaps is illustrated in Google’s own words: “If your site’s pages are properly linked, our web crawlers can usually discover most of your site.”
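A minimal XML sitemap has the following shape; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-03-10</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/seo-tips/</loc>
    <lastmod>2021-03-01</lastmod>
  </url>
</urlset>
```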
Page Speed SEO Implications
Page speed has been officially recognized as a ranking signal since 2010. It is as straightforward as it sounds: if your website is slow, your Google ranking will suffer.
There are a few things WordPress developers and users can do to avoid poor performance:
- Compress images as they account for upwards of 20% of an average site’s weight
- Clean and compress the code, which means minifying the resources on your page
- Upgrade your hosting
- Activate browser caching
- Implement a CDN
- Perform routine speed tests to identify issues as they arise
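On the image front, for example, serving appropriately sized files and deferring offscreen images both help; native lazy loading is one low-effort option. A sketch, with made-up file names:

```html
<!-- Sketch: explicitly sized, lazily loaded image to cut initial page weight -->
<img src="dish-800w.jpg"
     srcset="dish-400w.jpg 400w, dish-800w.jpg 800w"
     sizes="(max-width: 600px) 400px, 800px"
     width="800" height="600"
     alt="Signature dish"
     loading="lazy">
```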
Bear in mind that the optimal page load speed is between one and two seconds, or a score of 90 or above in Google’s PageSpeed Insights (PSI). Note also that when PageSpeed Insights checks a URL, it consults the Chrome User Experience Report dataset and, if data is available, reports on FCP (First Contentful Paint), FID (First Input Delay), LCP (Largest Contentful Paint), and CLS (Cumulative Layout Shift); PSI classifies page speed data based on these metrics.
Since WordPress sites are often maintained by people lacking technical skills, do your best to explain why page load speed is so important and what they have to do to maintain it, e.g., what sort of images they should use and how to optimize them to improve load times.