It’s probably one of the most commonly asked questions in the digital marketing space: “It’s been three months now, why haven’t my search rankings improved? Where’s my ROI?” Although the time frame is clearly a fundamental factor (it can often take at least six months to rectify SEO issues and build the momentum needed to establish genuine authority online), there are plenty of other reasons why your website might not be flying up the SERPs.
Although rare, we’ve heard these questions even from our own clients, and it takes a little explanation to describe just what’s going on and to demonstrate that SEO is a long-term game – often a marathon, not a sprint.
To make things easier, we’ve picked out 3 of the top reasons why your SEO strategy might not be working:
1. Your Foundations are Weak: On-Site Technical SEO is the bedrock.
The vast majority of our initial SEO work for clients falls into the category of on-site technical remedials: essentially, fixing the underpinnings of page structures, markup and metadata. These fundamental issues are then multiplied and compounded as page after page of new content is created, replicating them time and time again site-wide. Some of the most common technical SEO issues include:
Header Tags (also known as <h> tags) – whilst recent algorithm changes have made it clear that header tags aren’t as critical to your rankings as they once were, they still serve an important function, both for user experience (UX) and for SEO. They can indirectly influence your rankings by making your content easier and more enjoyable for visitors to read, and by providing keyword-rich context about your content for search engine crawlers. Essentially, your headers should be structured much like a book: the H1 introduces the topic your page is all about, just as a title tells a reader what a book is about; H2s are like ‘chapters’, describing the main topics you’ll cover within each section of the article; and all subsequent headers, usually tagged as H3-H6, are essentially sub-headings within each section, just as a chapter may be split into sub-topics. One place this is becoming more and more important is in how search engines use Featured Snippets, and headers can influence this in two ways. The first is optimising a header tag for a long-tail voice search keyword, then answering the query directly below it using text within paragraph (<p>) tags; with voice search being utilised more and more, this will only become more of a feature over time. The second is to cleverly use subsequent H3-H6 headings to outline individual list items: search engines like Google use these headers to create the bulleted/numbered lists shown in featured snippet results. Very cool, and something which will only grow in presence as search engines look to serve conversational “answers” to questions, rather than simply list web pages.
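As a minimal sketch (the page content here is purely hypothetical), a book-like header structure might look something like this:

```html
<!-- H1: the 'title' of the book – one per page -->
<h1>A Beginner's Guide to Technical SEO</h1>

<!-- H2s: the 'chapters' -->
<h2>How long does SEO take to work?</h2>
<!-- Answering a long-tail query directly below its header can help
     target featured snippets and voice search results -->
<p>Most sites need at least six months to see meaningful movement...</p>

<h2>Fixing your on-site foundations</h2>
<!-- H3s: sub-topics within a chapter – search engines can also use
     these to build list-style featured snippets -->
<h3>Step 1: Audit your existing pages</h3>
<h3>Step 2: Repair markup and metadata</h3>
```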
Image Alt Tags – the number of sites we inherit or work on with a total disregard for this attribute has led us to raise awareness of it. Alt text (alternative text), also known as “alt attributes”, “alt descriptions”, or (technically incorrectly) as “alt tags”, is used within the HTML code of a web page to describe the appearance and/or function of an image – primarily to aid the visually impaired, and it’s also what’s displayed in place of an image that fails to load. Whilst, again, not as major an error as 404 pages and broken internal links (more on that to follow), omitting or neglecting alt text soon starts to add up site-wide. We recently worked on a site with over 4,000 missing image title and alt attributes… quite a time-consuming task to fix (properly). Images are becoming increasingly important in SERPs (just try searching for “Halloween Decorations” and look at how visual the first page of Google’s results is!), so ignoring this simple (and quick) task of adding alt attributes to images will have an increasingly negative impact on your SEO. They should include keywords, but be careful not to over-stuff them or you’ll undo all the hard work of actually using them in the first place!
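To make that concrete, here’s a quick sketch (file names and wording hypothetical) of what missing, good and over-stuffed alt text look like:

```html
<!-- Missing alt text: invisible to screen readers and uninformative to crawlers -->
<img src="pumpkin-string-lights.jpg">

<!-- Descriptive, naturally keyword-rich alt text -->
<img src="pumpkin-string-lights.jpg"
     alt="Orange pumpkin string lights hung up as Halloween decorations">

<!-- Keyword-stuffed alt text: avoid – this undoes the benefit -->
<img src="pumpkin-string-lights.jpg"
     alt="halloween decorations halloween lights cheap halloween decor buy">
```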
Meta Descriptions – it’s a bit of a myth that these directly affect rankings; back in 2009 Google confirmed it does not use them as a ranking factor. However, they do affect CTR (click-through rate) once you appear in SERPs, as they are the short description of exactly what is on the page (you know, the little bit of text under the page title in a search result listing?). Meta descriptions can technically be any length, but Google generally truncates snippets at around 155 characters. Ensure your meta description is long enough to sufficiently describe the content on the page, but not over the suggested maximum character count, and make sure it’s punchy and motivates that all-important click through to your site. By default, these are also used as the description text when a page is shared on social media. Oh, and stop duplicating meta descriptions – stop it now! If you clone or duplicate pages to get a head start on formatting or page templates, make overwriting the metadata a priority or you’ll harm your SEO.
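For reference, a meta description is just a single tag in the page’s head – a short sketch with an entirely made-up business and wording:

```html
<head>
  <title>Handmade Oak Dining Tables | Example Furniture Co.</title>

  <!-- Unique per page, punchy, and comfortably under ~155 characters -->
  <meta name="description"
        content="Solid oak dining tables, handmade to order. Free delivery
                 and a 10-year guarantee. Browse the full range today.">
</head>
```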
Getting these elements in shape, and ensuring they are implemented correctly on every page and with every piece of content that’s created, goes a long way to improving the underpinnings of your SEO strategy. Many of the sites we develop have built-in failsafes that make these mandatory pieces of information before a page even goes live, catching potential mishaps before they happen. This is vital on sites where content is regularly created.
2. Your Content is Duplicated. Your Content is Duplicated…
See what we did there? Duplicate content is generally defined as content (text, images, video…) that appears on the internet in more than one place, i.e. on more than one URL/web page. Although Google doesn’t always penalise duplicate content, it can still impact your website’s positioning in SERPs. When there are multiple pieces of “appreciably similar content” (as Google terms it) on multiple websites, it can be difficult for search engines to decide which version is most relevant to a search query, diluting the visibility of each duplicate. Link equity can be diluted further because other sites have to choose between the duplicates too: instead of all inbound links (another important ranking factor) pointing to one piece of content, they point to several, spreading the link equity among the duplicates.
There are a multitude of reasons why sites may have duplicate content, but one of the most common we come across with our e-commerce clients is where a product description is distributed to multiple retailers and used verbatim on site after site. If many different websites sell the same items and they all use the manufacturer’s descriptions, identical content winds up in multiple locations across the web. This is particularly troublesome for manufacturers who sell both B2B and B2C, as their own site is often ranked down despite being the original source of the product description. In this scenario, it’s important to write unique versions of product descriptions where possible. When working with clients facing these issues, we use a variety of methods to rectify duplicate content, not all of which require content to be rewritten or deleted altogether; much of it is technical remedial work, such as implementing 301 redirects and canonical URLs.
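A canonical URL is the simpler of those two fixes to illustrate: a single tag on each duplicate or variant page tells search engines which version is the “master” copy, consolidating visibility and link equity onto one URL. A minimal sketch (URLs entirely hypothetical):

```html
<!-- Placed in the <head> of a duplicate/variant URL, e.g.
     https://www.example.com/products/oak-table?colour=natural,
     this points search engines at the preferred version: -->
<link rel="canonical" href="https://www.example.com/products/oak-table">
```

A 301 redirect does a similar job at the server level, permanently forwarding both visitors and crawlers from a retired duplicate URL to the preferred one.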
3. Speed matters.
In 2010, Google added site speed (and, as a result, page speed) as one of the signals used in its ranking algorithm, with most citing the huge growth in mobile browsing as the reason – mobile networks and devices weren’t as powerful as desktops, meaning leaner, faster-loading websites gave a better experience. Aside from the obvious impact on user experience (let’s face it, none of us likes to wait forever for content to load), a slow page speed means that search engines can crawl fewer pages within their allocated crawl budget, negatively affecting how your site is indexed. Page speed is typically measured as either “page load time” (the time it takes to fully display the content on a page) or “time to first byte” (how long it takes your browser to receive the first byte of information). Page speed also matters because pages that take longer to load generally have higher bounce rates and lower average time on page (both signals to Google and co. that your site is of poor quality), and longer load times have been shown to hurt conversions too – so there really are a number of reasons to improve load speeds across your site. This isn’t always the first port of call in an SEO strategy but, like the on-site technical foundations above, getting it right has a big impact site-wide and, ultimately, affects the ranking of every page on your domain.
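Some speed wins can be made in the markup itself. This is only a rough sketch of a few common techniques (file names and hosts hypothetical) – server-side work such as caching, compression and image optimisation usually matters at least as much:

```html
<head>
  <!-- Defer non-critical JavaScript so it doesn't block rendering -->
  <script src="analytics.js" defer></script>

  <!-- Start the DNS/TLS handshake early for third-party hosts -->
  <link rel="preconnect" href="https://fonts.example-cdn.com">
</head>
<body>
  <!-- Native lazy loading: below-the-fold images only load when needed;
       explicit dimensions prevent layout shifts while they do -->
  <img src="showroom-large.jpg" alt="Oak dining table in a bright showroom"
       loading="lazy" width="1200" height="800">
</body>
```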
There’s so much more to it.
These are just three key areas you may need to focus on to maximise the returns on your wider SEO strategy. You can write all the content you like – really valuable content, in fact – but see little to no impact on your rankings for all of your hard work, simply because you’re missing the basics. Getting these right will put you in a good place to push on and start climbing up the search engine rankings.