The Canonical And Mobile-Indexing Bugs: What You Need To Know

September saw some noticeable canonical and indexing issues that affected Google’s search results. Although some in the SEO community originally thought that these spikes in the SERPs were the result of an as-yet-unannounced Google update, it was later revealed that two different bugs were causing all the commotion.

As confirmed by Google, one of the issues affected mobile indexing, while the other affected canonicalisation, which in turn affected how content was being shown in search results.

For clarification, canonicalisation refers to notifying search engines of which version of a URL is the ‘original’. If a site owner decides to syndicate content – meaning they allow their content to be republished on another site – then canonical tags are used to show search engines whether a URL is the original content page or a ‘duplicate’. This helps the site that originally provided the content to still rank in the SERPs when its content is reproduced elsewhere.
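To make this concrete, a canonical tag is a single line in a page’s <head>. Here’s a minimal sketch, using a hypothetical example.com URL, of how a republishing site would point its copy back at the original:

    <!-- In the <head> of the republished (duplicate) page -->
    <link rel="canonical" href="https://www.example.com/original-article/" />

This tells search engines to treat the example.com page as the original, so ranking signals consolidate there rather than on the syndicated copy.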

So, if your site’s organic traffic was affected on 21st September, it was likely due to the indexing issues. 

On 1st October, Google Search Liaison reported they were “working to resolve two separate indexing issues that have impacted some URLs.” 

They have since elaborated further via their Twitter thread.

What Happened 

On 23rd September, there was talk in the SEO community of another Google search ranking algorithm update, with automated tracking tools showing spikes. Users reported seeing unrelated results for their queries, and there were reports of previously indexed web pages no longer appearing in search results.  

The fluctuating SERPs could be seen in SERP volatility checkers at the time.

SEO veteran Glenn Gabe commented on 26th September: “I had a news publisher reach out to me that has seen 1,300 articles get canonicalised TO COMPLETELY DIFFERENT ARTICLES.”

Fibre previously tweeted about the issues on 30th September. That same morning, Search Engine Roundtable’s Barry Schwartz reported seeing several complaints on Twitter about pages being dropped from Google’s search index.

One user commented on the article: “Today I found out that the main page has disappeared from the index. The page has been in the top for more than 4 years, no prohibited methods were used, there were no alerts, no changes were made to the site either. Yesterday I filled out the questionnaire “Share your experience with Search Console Insights” and today the page is not in the index, and accordingly the drawdown of positions and traffic. Where to dig, what happened.”

However, Google Search Liaison said the canonicalisation problems actually started as early as 20th September, and that the mobile-indexing issues began even earlier (although these bugs really became more noticeable on 21st September).

Why It Matters 

The topic of duplicate content is already tricky in the SEO community, as you have to be careful that search engines don’t treat the wrong URL as the original. When Google started selecting the wrong page despite the canonical tag, the situation became even more complicated. The canonicalisation issue meant syndicated content was no longer linked to its original site, so original web pages stopped showing in the search results and sites lost organic traffic. URLs that once ranked well in the SERPs no longer rank.

The Current Situation 

Gary Illyes from Google said that they are “estimating the impact and potentially annotating the reports affected,” which means Google might be adding footnotes to Search Console reports to highlight the indexing bug.

Google said the indexing bug only impacted about 0.02% of its index and, as of 2nd October, about 10% of the affected URLs had been restored. This was followed by another update on 6th October from Google Search Liaison.

What Can Site Owners Do? 

Site owners don’t need to panic; in this case, it is down to Google to fix the problem.

On 2nd October, Google Search Liaison tweeted further progress updates.

Conclusion  

This isn’t the first time Google has experienced indexing issues; 2020 has seen numerous problems take place. For instance, just in September Google reported an indexing bug in the Top Stories section of search, and, back in July, Google Search Console flagged up a problem with indexing, although that was fixed relatively quickly and with less commotion. And let’s not forget the large glitch that occurred in August, caused by an indexing bug, which resulted in major, temporary changes to search rankings.

There has long been speculation that something is underway at Google; so perhaps these bugs are just the catalyst for bigger changes further down the line? We’ll keep an eye out for more developments.  

The Google Guaranteed Badge: What We Know So Far

There have been many signs that Google will launch a subscription-based plan that monetises local SEO, and we may now be another step closer.

The Google Guaranteed badge enables businesses to be screened by the search engine giant. If you pass, you’re awarded the badge, which in turn makes your Local Service Ad (LSA) stand out even more at the very top of search results, thanks to the bright green tick that appears underneath your business name and star rating.

But will this subscription plan truly catch on? Here’s what we know so far.

The Basics

Google My Business (GMB) started as a free marketing platform for all kinds of businesses, allowing business owners to control their presence online. But fraudulent listings slowly grew into more of a problem, and so Google realised that it needed to up its game when it came to building trust for its users. How can people really be sure that these businesses and sole traders are safe enough to invite into their homes?

Google tested ‘Home Services Ads’ for services such as plumbers and cleaners, allowing users to regain confidence in those who were advertising themselves via paid local search. The search giant then stated:

“In order to prevent fraudulent businesses from advertising on Google using false identities, Google Ads and Local Services advertisers in certain verticals will be required to complete Advanced Verification.”

In the past, businesses have relied on reviews and ratings left by their customers, which boosted local search rankings. But depending on everyday users made it harder for Google to verify each business, so now, it seems, Google is upping its game.

The Google Guaranteed Badge

The spam build-up resulted in the coveted badge (released to LSAs only in 2016, but first promoted to GMB members in 2020), allowing various service providers to improve their local ads. Over the past few years, Local Service Ads have slowly moved up to top the local SERPs for home maintenance searches, and this only confirms Google’s push for this channel.

Google Guarantee Badge Preview

So how can local businesses get the badges?

Over the summer months, Google sent out emails to GMB members in eligible industries who weren’t already using Local Service Ads, offering an ‘upgraded business profile’ for $50 a month. This was immediately met with suspicion throughout the SEO community, as it adds to the belief that Google is trying to shift organic search to paid.

Those who sign up will undergo background checks and must provide various documentation and credentials. Because Google uses third parties for the screening, the overall process can take weeks or months.

The Reaction

Overall, reaction to the badge has been mixed. Many saw it as a new opportunity to stand out against competitors operating in the same area, which may drive said competitors to also invest in the badge. There’s no denying that this may be an exciting era for small business owners – being able to say you’ve obtained a Google Guaranteed badge does give off a good impression.

However, one obstacle is the cost. Not all business owners – especially those just starting out – can afford to pay $50 a month. They risk losing out on customers because one or two of their competitors have obtained the Google Guaranteed badge, and so are more appealing to users than the listing without. SEO veteran Lily Ray tweeted that it “boils down to whether it’s worth $50 a month for your business to have a label that says ‘Google screened this business’.”

As more information rolls out about this badge system, we can only imagine what this means not just for paid local search, but for organic too. The fact that Google reached out to GMB members who were not utilising LSAs suggests that, as mentioned above, they are once again attempting to push organic search into the hands of online advertisers. As Search Engine Journal pointed out, what started as a move to combat fraud now appears to have a monetary aim.

We’ll certainly be keeping an eye on this one to see where things go.  

 


Google’s Recent SERP Feature Tests

In the past few weeks, Google has been testing new search features to decide how best to develop the services it offers its users – while also keeping those services effective for Google itself.

It’s essential that Google users get the best results and that businesses have the most impact with their SEO efforts, which is why it’s important to have a good understanding of where Google is heading with search results.

Google have been testing several changes and new features to discover what difference they can make to those who use them, including:

  • Categorised sections
  • Changes in advertisement headline sizes
  • Local Q&A box interface
  • A ‘For Context’ section
  • An ‘Also in the News’ box
  • The visual look of the SERPs

Testing Categorised Sections

One of the major aspects that Google has been testing is the way results are displayed on the SERPs. The test involved grouping the search results into categories, meaning that instead of giving searchers a list of websites that correlate to their search, the results are split into categories such as ‘Reviews’, related ‘Videos’, ‘Nearby stores’, related ‘Images’ and ‘Online stores’.

This is an interesting way to display the search results and could have knock-on effects for SEO. With a SERP that is split into categories, SEO strategies need to consider each separate category to have the maximum effect.

Google Ads Headline Size

Another change to the visual impact of Google’s SERPs being tested is that, in the top four ad results, a hyperlinked sub-heading is shown, with a significant difference in size between the Heading 1 and Heading 2. It is worth noting that in the test, this wasn’t the case for the results below the top four.

The larger font size will probably result in a higher click-through rate for the top advertisers – and be beneficial for Google too. This feeds the age-old debate that Google will eventually make paid search a more predominant method of search, making it harder for organic listings to earn traffic, especially for startups or smaller businesses whose budgets may not stretch to Google Ads.

Local Q&A Box Interface

Google already gives searchers the opportunity to ask local businesses questions through its Local Q&A box. However, they appear to be testing placing this box outside of the panel for each business. It is still unclear why they are considering this change, but it could be to allow more general replies as well as locally focussed ones.

It is important, therefore, that businesses ensure their Google My Business account is kept as up-to-date as possible.

‘For Context’ Section

One of the biggest issues with using search engines is that searchers sometimes get content out of context. The ‘For Context’ section, tested mainly in conjunction with news results, displays links to other articles that can provide extra context.

In terms of your business’s SEO, this means more opportunities to get your content visible, but it also makes getting your content accurate and relevant even more important.

‘Also in the News’ Section

In a similar way to the ‘For Context’ section, Google was also testing an ‘Also in the News’ section. This, too, appeared underneath the news section, and links to other news articles related to the original story. The main difference between the two is that the ‘Also in the News’ stories can be older.

The question that arises from this is whether or not your content needs to be indexed in Google News in order for your site to appear within this section.

https://twitter.com/robin_xing/status/1290186649918349312

SERPs Visually

Google has also been testing other aspects of its SERPs. Some tests have included the use of thumbnail pictures in the results page as well as in the search suggestions box, and thicker grey lines between the search results. Of course, these potential changes may be intended to help searchers understand the results more quickly, but there may be other reasons for them.

Google is continuously looking to update the services that they are offering both searchers and the businesses that use Google to attract people to their websites. This means that it is more important than ever to ensure that your website is visible in searches relevant to your business. In order for this to happen, it’s vital that the SERPs are observed on a regular basis, so that you know when it’s time to adapt your SEO strategy and optimise what you can to make the most of the latest features.

Mobile-First Indexing: What Is It, and Why Does It Matter?

Google recently announced that they will make the switch over to mobile-first indexing from March 2021, changing the way that website rankings are calculated. They originally planned to roll out these changes in September 2020, but the pressures of the coronavirus and the uncertain times we’re living in have pushed back the deadline.

Once this change happens, it’ll be more important than ever to have a mobile friendly website that delivers the best possible experience to your website visitors.

Here’s what you need to know about mobile-first indexing so you can protect (or even improve) your Google rankings, and help safeguard the future of your business.

What is mobile-first indexing?

‘Mobile-first indexing’ simply means that Google will use the mobile version of your website first for indexing and ranking.

However, despite what you might have heard, it won’t ignore your desktop website completely, or create an entirely new index when it does this. It will simply be a switch in focus that aims to deliver the best possible experience to 21st century users who generally spend more time on their mobile devices than on desktop.

Traditionally, Google primarily focused on the desktop version when calculating the rankings as it was presumed to be the ‘main version’ of the website. And for many years, it was.

However, over the past five years, the use of mobile devices, such as smartphones and tablets, has increased, encouraging Google to make these changes.

As Google said back in 2016, “…algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results…”

Why does mobile indexing matter?

As we’ve just mentioned, mobile indexing is becoming more important than ever because of the continuing surge in mobile searches from smartphones, tablets, and other devices.

According to Statista, “As of the first quarter of 2020, it was found that mobile devices accounted for 56 percent of organic search engine visits.”

Therefore, if your website isn’t working properly on mobile, your customers are unlikely to get an optimal experience when they visit.

They might get frustrated if links are too small to click, the page is unresponsive, or they simply can’t navigate your website as effectively as they can on a desktop computer. As a result, your bounce rates will increase, the image of your business will decline, and your customers are unlikely to want to come back. Not exactly ideal when you’re running a business, is it?

Additionally, once Google implements this mobile-first indexing, your website rankings are also likely to suffer if you don’t have a mobile-friendly version in place.

How can you improve your website for mobile indexing?

If you can make your website as mobile-friendly as possible, you’ll significantly improve your overall performance as a business. Your website visitors will be able to find the information they need quickly and easily, customer satisfaction will improve, and you’re much more likely to rank highly. Here are some tips that can help:

Make sure you have a responsive website

Above all else, you should make sure your website is responsive. This simply means that it will adjust according to what specific device is used to access your website, to help the reader enjoy the best possible user experience. If you don’t do this, it’s likely you’ll be hit hard by the switch to mobile-first indexing.
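As a minimal sketch of what ‘responsive’ means in practice (the class name here is hypothetical), a responsive page pairs a viewport meta tag with CSS media queries so the layout adapts to the screen width:

    <!-- Tell mobile browsers to use the device width, not a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Side-by-side layout on wide screens... */
      .content { display: flex; }

      /* ...collapsing to a single column on narrow screens */
      @media (max-width: 600px) {
        .content { display: block; }
      }
    </style>

Most modern themes and site builders handle this for you, but it’s worth checking that your pages behave this way on a real phone.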

Focus on mobile page speed

Mobile page speed and load time are key factors that won’t just influence your Google rankings, but they’ll also affect how your website visitors experience interacting with your website. After all, if it takes too long, they’ll just get frustrated and click away. 

“85% of mobile users expect pages to load as fast as, or faster than, they load on the desktop,” says leading technology service Radware, adding that, “Two out of three smartphone users say they expect pages to load in 4 seconds or less.”

For that reason, you should work hard on improving your load speed before the mobile-first switchover happens. An easy way to do this is to run a speed test in Google Search Console. The test will also highlight what you could improve so you can make the changes quickly and easily.

Create high quality content

If you’ve been reading our blog for a while, you’ll know that you should be consistently creating high quality website content if you want to rank highly. When the switchover to mobile-first indexing happens, you should also check that the mobile version of your website contains the same content as your desktop version. Also check that you’re using ‘alt attributes’ for your images.
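For reference, an alt attribute is just a short text description added to an image tag; the file name and wording below are hypothetical:

    <img src="blue-widget.jpg" alt="A blue widget photographed from the front">

It helps Google understand the image, and also makes your pages more accessible to screen-reader users.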

Add your mobile site to Google Search Console

If you haven’t done so already, now is the time to add and verify your mobile site with Google Search Console. You’ll be able to check how visitors interact with your mobile site and make any changes needed.

Test that your mobile site is accessible

Is your mobile site accessible to Googlebot? If you’re not 100% certain, you should use the robots.txt testing tool to find out.
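If Googlebot is blocked, the culprit is usually a disallow rule in your robots.txt file. As a minimal sketch (the /private/ path is hypothetical), this configuration keeps all crawlers, including Googlebot, out of one folder while leaving the rest of the site crawlable:

    # Keep all crawlers out of /private/ but allow everything else
    User-agent: *
    Disallow: /private/

The danger sign to look for is a broad rule such as ‘Disallow: /’, which blocks the whole site.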

Think about mobile user experience

You should also take time to consider how the elements on your page will look on mobile, as well as desktop. Choose your images carefully to ensure they don’t dominate the page, make sure any buttons are large enough to be clickable, make sure the text is large enough to read, and so on. It’s always worth viewing your website on different devices to see how the elements interact for your users.

 

There’s no need to be concerned about the switchover to mobile-first indexing. If you have a responsive, mobile-friendly website, you shouldn’t be too affected once it finally rolls out.

Otherwise, now is the perfect time to implement those tweaks and changes that will improve your website and help you maintain or even improve your rankings in 2021.

Google Discover’s Help Doc Update: What Does It Mean For SEO?

Earlier this week, Google overhauled their help document for Google Discover to make it clear what the web feature is and how it can supplement regular searches.

If you don’t already know or haven’t used it yet, Discover works by recommending pieces of content to users based on their previous Google searches. It doesn’t work in the same way as organic search as it isn’t based on something that a person has typed into Google, and searched for at a given moment in time. Instead, it takes a more long-term view of what you might like so it can send the right information your way.

What’s especially interesting with the help document update is the fact that they have added a section that refers to a key website ranking factor called ‘E-A-T’.

It reads:

“Our automated systems surface content in Discover from sites that have many individual pages that demonstrate expertise, authoritativeness and trustworthiness (E-A-T). Those looking to improve E-A-T can consider some of the same questions we encourage site owners to consider for Search.”

This highlights again the importance of following the ‘E-A-T’ principles, and suggests that if you do so, you are also more likely to be featured on Google Discover.

Here at Fibre Marketing, we spend a lot of time working on E-A-T for our clients. That’s why today we’d like to explain more about how E-A-T works and what you can do to improve your website with this in mind.

What is E-A-T, anyway?

The acronym ‘E-A-T’ stands for expertise, authoritativeness and trustworthiness – often shortened to expertise, authority and trust.

These three characteristics are what Google looks for when it evaluates the quality of your website. They’re becoming increasingly important when it comes to getting your web pages to rank highly.

  • Expertise means that you should be knowledgeable and capable in your chosen field (and often have the credentials to back yourself up).
  • Authority means that your website and its authors are respected by others in the industry.
  • Trust means that people should feel safe and secure on your website.

By using these characteristics to measure a website, Google can find the best quality content for its users and can avoid falling for some of the spammy SEO tricks used in the past.

This gives users a better experience, allows websites to showcase what they have to offer, and helps Google work better.

Although there isn’t currently a direct E-A-T score as such, these factors underpin everything else to do with rankings.

Therefore, if you want to get noticed online in 2020, you must focus on building and showing your expertise, authoritativeness and trustworthiness, and on providing the best possible user experience.

How can you improve your E-A-T scores?

The following tips can help this to happen:

1. Create great content

As we’ve explained many times on the blog, one of the best ways to improve your SEO and increase rankings is to consistently create high quality content.

By ensuring that it’s relevant, regularly updated, free from typos or grammatical errors, comprehensive and trustworthy, you will demonstrate your expertise in your field. Your website visitors will also enjoy the experience and want to come to your website again and again.

They’ll be more engaged, they’ll be more likely to be loyal to your brand and they’re more likely to make a purchase in the future.

To read more tips on creating outstanding content for your website, read our blog post “How to Create Great Web Content for SEO”.

2. Make your authors visible

If you want to build your website authority, Google needs to see that you have connections to other knowledgeable and authoritative figures in your field.

The simplest way to do this is to attribute your website content to an author by adding their linked name at the top of the page. You could also use an ‘author box’ at the bottom of the post that includes their name, biography, photo and a link to their website if possible.
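A minimal sketch of such an author box, with hypothetical names and URLs, using schema.org Person markup so search engines can parse the attribution:

    <div class="author-box" itemscope itemtype="https://schema.org/Person">
      <img src="/images/jane-smith.jpg" alt="Photo of Jane Smith" itemprop="image">
      <p>
        Written by
        <a href="https://janesmith.example.com" itemprop="url"><span itemprop="name">Jane Smith</span></a>,
        <span itemprop="description">a chartered accountant with 15 years' experience
        writing about small-business finance.</span>
      </p>
    </div>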

This won’t just help your website to gain authority, but will also help build the writer’s reputation, so it’s well worth taking the time to do.

3. Link to reputable websites

If you’ve done any link building to help build your site authority, you’ll know how beneficial it can be in terms of your SEO and web rankings. By linking out to authoritative websites within your niche, you’ll associate your site with them while backing up your knowledge.

However, it’s important that any site you link to is reputable and offers high quality content. The last thing you want is to link to a spammy site that ends up causing you trouble or even gets you blacklisted.

If you do find any unwanted links, you might want to get rid of them via a process called ‘disavowing’. Read this article to learn more about the process.

4. Improve your website security

Having a secure website isn’t just an excellent way to show your website visitor that they can trust you with their details.

Google also treats website security as a key ranking feature these days as it demonstrates that you take user experience seriously and are therefore more likely to provide a high quality experience.

The quickest way you can do this is to switch over to HTTPS, which encrypts the traffic between your site and its visitors to prevent attackers from intercepting their information.

There are also various pieces of security software and security certificates you can add to your website to keep it safe from hackers, spam and unwanted visitors. Speak to your web master or host to discuss the options available to you.
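Once a certificate is installed, you’ll also want to send visitors who arrive over plain HTTP to the HTTPS version automatically. As one common approach – a sketch assuming an Apache server with mod_rewrite enabled – a site-wide redirect can be added to your .htaccess file:

    # Permanently (301) redirect all HTTP requests to their HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Nginx and managed hosts have their own equivalents, so check with your host before making changes.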

According to SEO experts Search Engine Watch, 82% of customers said they’d leave a website if it wasn’t secure, so this is a tweak you need to make to your website sooner rather than later.

5. Focus your content

Instead of writing website content spread across a variety of topics, get focused.

Learn more about what your audience wants when they visit your website then create content that answers their questions, provides solutions and adds maximum value possible.

When you do this, you’ll be showing your audience that you understand and care about their problems and can provide a solution. Not only does that leave them feeling happy with their experience on your website, it builds your authority, trust and brand image at the same time.

Because this content is almost certain to include a variety of targeted, natural keywords which centre around a particular topic, Google are also more likely to see you as an authority in your field – another win.

Summary

The recent updates to the Google Discover help document demonstrate that E-A-T is more important than ever when it comes to SEO and improving your rankings. Keep working on your website expertise, authority and trust and you’ll see excellent results.

Introducing: Google’s Manual Actions

Imagine firing up your laptop and discovering that your website has disappeared completely from the search rankings.

Despite all of the hard work you poured into boosting your organic rankings and building your brand, your website is simply nowhere to be found.

You’re back to square one.

That’s exactly what could happen if Google issues a manual action against you.

Luckily, these ‘website red cards’ aren’t issued often or without careful consideration. Nor do they always have such a devastating impact.

But given the strict penalties, it’s vital to understand what Google’s manual actions are, why they are issued and, importantly, what you can do if the worst should happen.

Here’s our short guide.

What is a manual action?

Manual actions are penalties that are given out by the humans at Google.

They’re given to websites that are using unethical practices to boost their website rankings and disregarding the Google Webmaster Quality Guidelines.

These penalties can affect individual pages or even entire websites, and can cause them to drop significantly in rankings or even disappear completely. This happens quickly – often overnight.

They’re a serious problem that can have a profound effect on your business.

How do websites get a manual action?

Provided you have created a high quality website and you aren’t using those underhand, spammy SEO practices that aim to cheat the system (often called black hat SEO), you shouldn’t ever be issued with a manual action.

“Experience shows that manual penalties are infrequently issued and only for serious offences,” agree industry experts, Search Engine Land.

Having said that, sometimes Google can issue a manual action as a result of other people’s behaviour, such as dodgy links that point towards your website, blog or forum spam comments, or even a hacked shared server. This is why it always pays to be aware of potential problems and monitor your website consistently.

With this in mind, here are some of the reasons why these manual actions are issued:

  • User-generated spam: This can come from spammy, self-promoting blog comments, forum posts, or other user-created content.
  • Using a free host: If infected with spam, shared servers can cause Google to issue a manual action, even if your website itself isn’t infected.
  • Unnatural links to your site: Unusual links that appear to be manipulating page rankings will be penalised.
  • Unnatural links from your site: Likewise, unusual outbound links could also be creating spam or point to disallowed practices.
  • Cloaking and/or sneaky redirects: Giving users a hidden page or image instead of the one submitted to Google, or redirecting users to a page they didn’t click on.
  • Pure spam: Blatant use of spam techniques.
  • Hidden text and/or keyword stuffing: Repeating keywords excessively or hiding them on the page.
  • Sneaky mobile redirects: When mobile users are directed to a different URL than clicked on.

It’s important to quickly note that you won’t be issued with a manual action when a user manually reports a website for spam.

According to Gary Illyes’ post on the Google Webmaster blog last Friday, “…we use spam reports only to improve our spam detection algorithms,” adding that although spam reports play a helpful role, they are an inefficient way of detecting spam.

How do you know if you’ve got a manual action on your website?

If you have a manual action on your website, you’ll immediately receive a message from Google via Google Search Console. This will tell you what the problem is, which page or pages have been affected, and what steps you can take to resolve the problem.

If you haven’t received a message or haven’t noticed one, you can also check whether your website has a manual action against it by looking at your Google Search Console.

What should you do if you have a manual action?

If you discover that you have a manual action, don’t panic! Although it’s certainly not great news for your website rankings, you can fix the problem and then appeal to Google to reconsider.

Here are the steps you should take if this is the case:

1. Read Google’s message carefully

Before you do anything else, take your time to read the notification message carefully so you understand why your website has been penalised. Often, you’ll see how you can solve the issue.

2. Understand the problem

If the reason for the problem isn’t immediately clear, you will need to gather and evaluate key data so you can identify the cause. Depending on the size of your website and the issue in question, this can take anywhere from a few hours to a few weeks.

At this stage, it’s also a good idea to review Google’s Webmaster Guidelines to help guide your investigations.

3. Fix the problem

The data you’ve gathered in the previous step should have identified what is flagging Google’s system and causing you problems. Now you need to resolve the issue by taking one or more of the steps below.

User-generated spam

Check your website for malicious content then delete it. Ensure all comments and content submitted to your site are moderated.

Using a free host

Contact the hosting company to inform them of the problem and consider moving to a secure host instead.

Unnatural links to your site

Disavow any links that appear suspicious.

Unnatural links from your site

Remove excessive links or low-quality links and use tags to identify where there are affiliate links.

Cloaking and/or sneaky redirects

Fix any pages that contain cloaking code, hidden content or sneaky redirects.

Pure spam

Check Google’s Webmaster Guidelines and remove spam.

Cloaked images

Ensure that both Google and your users can see the same images.

Hidden text and/or keyword stuffing

Remove excessive use of keywords and hidden text including in the html and CSS of your site.
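As an illustration of what to hunt for (the class names are hypothetical), hidden text is often buried in CSS rules like these, which keep keyword-stuffed text invisible to visitors while leaving it in the page source:

    /* Red flags: text rendered invisible purely to hold keywords */
    .seo-keywords {
      display: none;            /* removed from view entirely */
    }
    .offscreen-keywords {
      position: absolute;
      left: -9999px;            /* pushed far off the visible page */
    }

Legitimate uses of these techniques exist (accessible labels, for instance), so judge each case by whether the hidden text is there for users or purely for search engines.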

Sneaky mobile redirects

Remove any redirects and check for malware or hacking.

4. Ensure full company compliance

If the manual action was issued because of a team member’s mistake, take steps to prevent the same happening again in the future.

Call a meeting and ensure that everyone understands why it happened and its severity then issue clear guidelines to safeguard your business.

5. Request a reconsideration

Finally, once you’ve fixed the problem you should request a reconsideration, including detailed information on how you fixed the problem and whether your website was hacked. Don’t request one before you’ve fixed your website as you’ll only be wasting Google’s time and your own.

At this stage you’ll need to be patient because it can take them anywhere from a few days to a few weeks to process your request and make updates if relevant.

Once Google has come to a decision, they’ll let you know whether your request was accepted.

When will my website recover?

For obvious reasons, a manual action affecting just one page on your website will be easier to recover from than problems with your entire site.

Generally speaking, it can take anywhere from a few weeks to a few years to recover from a manual action. You’ll need to provide useful, valuable content to your users and rebuild trust so you can move up the rankings again.

To round up, Google manual actions can cause huge problems for your website rankings and even cause them to disappear completely.

The good news is that they aren’t routinely issued and are never the result of a spam report issued by a user. By adhering to the guidelines and fixing any issues, most businesses can avoid ever receiving one.

A Breakdown of the Nofollow Link Attributes

In SEO, nothing stays the same for long; a link strategy that works one day may not work the next. Google originally launched its nofollow link attribute in 2005 in an attempt to stop comment spam and untrusted links. Googlebot didn’t crawl, index or rank nofollowed links, whether internal or external; it ignored them completely. The rel=“nofollow” attribute excluded a link from the search algorithm, and that was that.

In September 2019, Google revealed that the nofollow attribute would begin weighing in on how Google ranks a site, i.e. the content, links and anchor text would feature in spam measurements, but more as a “hint” than an explicit directive. With this update Google could now recognise content and anchor text, and follow the links it deemed necessary, regardless of nofollow. Importantly, Google claimed it still would not crawl or index these links in any way. With this update we were also introduced to two new related link attributes, rel=“sponsored” and rel=“ugc” (where UGC stands for user-generated content). Both convey more detailed information about the link.

What each attribute means

  • rel=“nofollow” applies whenever you want to link to another webpage but don’t want to endorse or give any credit to that page in the process. This attribute basically tells Google to treat the link as untrusted or spammy, and it will likely be ignored.
  • rel=“sponsored” is to help Google recognise those links that you’ve deliberately added for advertising, sponsorships, or other paid agreements.
  • rel=“ugc” tells Google what you consider user-generated content, for example legitimate comments on a blog or a forum post.

It’s possible to give a single link more than one attribute, for example rel=“nofollow sponsored.” However, the impact of a link will be reduced if it’s not an accurate description of the link, for example if you’ve chosen rel=“sponsored” for a blog comment.
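Concretely, each attribute sits in the link’s rel value. A short sketch with hypothetical URLs:

    <!-- A link you don't vouch for -->
    <a href="https://example.com/some-page" rel="nofollow">some page</a>

    <!-- A paid or sponsored placement -->
    <a href="https://advertiser.example.com" rel="sponsored">our sponsor</a>

    <!-- A link left by a user, e.g. in a blog comment -->
    <a href="https://commenter.example.com" rel="ugc">commenter's site</a>

    <!-- Attributes can be combined -->
    <a href="https://advertiser.example.com" rel="nofollow sponsored">sponsor link</a>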

The March 2020 update

From March 2020, Google will treat the nofollow attribute as a hint for conducting spam analysis, and possibly for rankings, depending on just how relevant and high-quality the links are. If Google feels, after crawling the surrounding content and anchor text, that the link is in fact relevant, then it may crawl the link despite the nofollow tag.

What’s the confusion?

Google claimed in its update that links carrying any of these three attributes would not simply be ignored; rather, the attribute would give Google’s algorithms a “hint” about the kind of link being seen. In other words, data is still collected from those links to track and rank the link scheme, ultimately having the potential to improve the overall website search ranking. This can be confusing unless you understand exactly why Google has made these updates.

The reason for this change is to keep discouraging spammers – Google attempts to identify spammy, unnatural link patterns so these links can be properly omitted. The only way to do this is to gather the data for these links so Google can correctly recognise and categorise them. By calling these attributes a “hint”, Google can still access the important data it needs.

To add to the discussion, John Mueller recently stated that all guest blog post links should have at least the nofollow attribute on them, regardless of whether they’re paid or not, and irrespective of how natural the link seems (for example, if you’ve put a link in the byline). This claim divided the SEO community, as some links in guest posts are included for natural reasons, such as genuinely using a site for research purposes, or because the link appears in an author’s bio. People also felt that, if a change like this were ever to be implemented, Google would be trying to lay a larger claim over the internet.

Overall the mixed reaction to this change is yet to go away, with questions frequently appearing across social media platforms and forums. Furthermore, Google’s Gary Illyes mentioned in June that more nofollow link changes may be on the horizon.

What this means for your search results

Google will use these attributes to gather data to feed into its search ranking schemes, tagging links with information that marks them as excluded from or included in searches. Though Google claims there should be no significant effect on search results, some people are naturally worried. It’s important to remember that even after the March update, the attributes are still just hints.

What should you do?

You don’t have to make any changes if you don’t want to. You could opt to do nothing and keep using nofollow attributes as you have been, though you could always consult your SEO team if you’re concerned about a few key links. For paid links, however, you will need to keep a nofollow attribute, or amend it to a sponsored attribute or a mix of sponsored and nofollow.

Disavowing Links in 2020

The topic of disavowing links has always been a hotly debated one, with SEO experts disagreeing on whether to disavow “bad” links, and on how this might affect site rankings on Google. Though the extent is debatable, ranking algorithms still factor in link quality, so a poor link could very well damage search rankings. With all the changes and challenges brought by 2020, we find it wise to regularly audit links for their relevance and quality. Let’s take a closer look at how and why to disavow backlinks – and how to determine whether you need to do it at all.

A brief intro on disavowing

In the past, Google ranked pages by link quality via PageRank, but this system was quickly exploited. The Penguin algorithm update in 2012 aimed to discourage sites with an abundance of spammy links. These sites could only recover by removing those bad links, and hence Google’s Disavow tool was born. Later, Penguin 4.0 saw an important change: Google wouldn’t penalise low quality links so much as devalue them, meaning you only needed to disavow a link if you’d received a notice to this effect, i.e. a “manual action.”

Fast forward to today, and Google maintains that disavowing links will not do much to help your site’s ranking. The unofficial position, however, is a little more ambiguous, with Google’s John Mueller claiming that disavowing may in fact benefit some sites. Ultimately, disavowing can help in some cases, but Google makes it rather difficult to use the necessary tool – and going too far can definitely backfire.

Understanding manual actions

You can find your site’s status in Google Search Console under “manual actions.” This is simply Google letting you know that it plans to omit or penalise certain lower quality content in search results. Penalised practices can include anything from keyword stuffing or hidden text, unnatural or inorganically received links, automated links, PBNs, comment and forum spam, influencer-inspired/paid links and suspicious redirects, to thin content and the like.

Is it a good idea to disavow?

As with most things SEO-related, it all depends. Just how low quality are the links, and how much are they affecting rankings? A good rule of thumb is to only disavow when you literally have no other option. For example, first reach out to the owners of the sites hosting low-quality links that point to yours and ask for them to be removed, using disavowal only as a last resort. Ultimately, it’s an advanced feature that should be used carefully, since it could cause more harm than good.

What if I don’t have a manual action?

Before you do anything, you need a comprehensive link audit to properly understand how your links are performing. Rest assured that Google will ignore all but those links that your SEO team are directly responsible for. In appraising link quality, look for anything that violates Google’s terms, such as paid links, link schemes and reciprocal linking/swapping, suspicious anchor texts, articles with links to dodgy sites, malware and cloaked sites, or “pills, poker and porn.”

If these links don’t generate revenue and also don’t affect your organic search traffic, then the solution may be as simple as removing the page completely.
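If disavowal really is the only option, Google’s tool accepts a plain-text file listing one URL or domain per line, with lines starting # treated as comments. A minimal sketch with hypothetical domains:

    # Links from this domain could not be removed despite outreach
    domain:spammy-links.example.com

    # Disavow a single bad page rather than a whole domain
    https://some-forum.example.net/thread/1234

The domain: prefix disavows every link from that site, so use it only when you’re sure nothing from the domain is worth keeping.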

Tips for effective disavowing

  • Naturally, remove any links you’ve received a manual action for.
  • Focus on creating links with value to human readers, and which will boost your site’s domain authority and trust.
  • Unless you’re ultra-confident with SEO, use a backlink monitoring tool or consult an SEO team for expert advice on exactly which links to disavow as part of your routine site audit.
  • Check that you’re receiving links from high-authority, industry relevant sites.
  • Links to expired domains or overly regional content may slip through ordinary filters, so keep an eye out for them.
  • Generally, the most likely culprits are links that have over-optimised anchor text, links that are not industry relevant or links that seem spammy, so start with those.
  • Don’t assume that removing any old link will automatically improve rankings – it may just do the opposite.
  • Finally, there’s plenty of wiggle room, and many links will fall in a generous grey area.

The best strategy

Overall, your focus should be on maintaining the highest quality link profile possible and disavowing only those links that are obviously and significantly harming your organic search results – otherwise tread carefully. Keep an eagle eye on any incoming links that could be harming your site’s reputation. Even scrupulous link audits can miss a few bad links, but this relatively small risk should be weighed against your overall SEO priorities, the risk of disavowing incorrectly, and your marketing strategy in general.

All You Need to Know About the Page Experience Update

Google have recently announced that they’ll be rolling out another new ranking factor in 2021 called Google Page Experience.

This will judge your website based on how user-friendly it is perceived to be, using metrics such as:

  • Loading performance, interactivity and visual stability (the ‘Core Web Vitals’)
  • Mobile-friendliness
  • Safe browsing (no malware or deceptive content)
  • HTTPS security
  • No intrusive interstitials (pop-ups that block the content)

This means that if your website visitors don’t have a great user experience (UX), your Google rankings are likely to suffer after this new update is rolled out.

In this short article we’d like to explain more about what the Google Page Experience update is, why it matters and how you can prepare your website for the changes so your search performance isn’t affected.

What is page experience?

Think back to when you last visited a website and found yourself getting frustrated.

Perhaps the page took ages to load. Or annoying pop-up adverts and opt-ins kept getting in your way. Maybe you could barely read the content because it was designed for desktop use and was tiny.

When this happened, you almost certainly just gave up, clicked the back button and found another website instead.

In this case, the problem wasn’t that the website didn’t offer the right information. It may very well have. The problem was that you didn’t have a good experience using it.

This is what we call page experience.

How much will the update affect your rankings?

At this stage, it’s unclear how much the Google page experience update will affect your search performance. We’ll only know for sure when the update rolls out in 2021.

Having said that, the metrics which Google will be focusing on with this update do already have an impact on your search performance, but in an organic, audience-led way.

As we mentioned at the beginning, visitors are less likely to stick around on your website if they can’t read your information on their mobile, annoying ads dominate the screen or it takes ages for your page to load.

If you haven’t already optimised these factors, it’s more than likely that your website won’t perform as well as it potentially could, regardless of the update.

It’s also important to note that the page experience update factors won’t be the only ones used in ranking your website. Producing engaging, useful content is still the most important factor.

How to prepare for the Page Experience update

Although there aren’t currently any specific tools that can measure page experience, you can identify what needs improvement and make changes with several other tools. These will help you improve your website and get ready for the update.

1. Optimise your site for mobile

If you’re running a business in the 21st century, you must make sure that your website is mobile friendly.

According to market and consumer data provider Statista, around 52% of traffic in 2019 came from mobile devices. This figure looks set to grow over the coming years, especially as the online market expands as a result of COVID-19.

2. Make your site more secure

These days you must have a secure website if you want to succeed in the online world. You need to keep hackers away, you want to protect your customers’ sensitive personal information and you want to provide users with a safe user experience.

When you can do this, your customers will trust your business, your search rankings are likely to improve and your website will be safer.

There are many ways you can do this, including getting an SSL certificate and switching to HTTPS, which encrypts traffic to and from your website, keeping threats out and helping to boost your SEO rankings.

3. Check your site for security issues

Most businesses will already know if their site has been hacked, contains malware or hosts content that is deceptive or could potentially harm a visitor or their computer.

However, you can check if your site has any issues that could influence page experience, user experience (UX) or search rankings by using Google Search Console to access a security report.

4. Review how you use ads and images

Using ads and visuals on your website will drive customer engagement, improve the user experience and help boost conversions.

However, it’s vitally important that these add value rather than detracting from your content and providing a poor user experience.

“Pages that show intrusive interstitials provide a poorer experience to users than other pages where content is immediately accessible. This can be problematic on mobile devices where screens are often smaller,” say Google Webmasters.

Check through your website pages and ensure that any visuals used such as images, infographics, and ads don’t dominate the page. Always opt for high quality, professional standard images instead of stock images.

If you do use pop ups to offer opt-in gifts or encourage newsletter sign ups, keep them to a minimum, avoid using them for mobile and make them easy to close.

5. Improve your page load speed

If you haven’t followed the advice in our blog post ‘Why Low Speed Scores Could Be Killing Your Traffic’, now is the time to do it.

As we said in the blog, your page load speed makes a huge difference when it comes to user experience, your Google rankings and attracting website visitors and it can be relatively straightforward to improve.

“Longer page load times have a severe effect on bounce rates,” say Google Webmasters. “If page load time increases from 1 second to 3 seconds, bounce rate increases 32%.”

Start by running a speed test in Google Search Console, then read through their advice and optimise the elements suggested. This can include using a design that loads quickly, optimising images, and minifying CSS, JavaScript and HTML.
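As a small sketch of the kind of quick wins available (the file names are hypothetical), two HTML-level changes that often improve load times are deferring non-critical JavaScript and lazy-loading images that sit below the fold:

    <!-- Defer script execution until the page has been parsed -->
    <script src="analytics.js" defer></script>

    <!-- Only fetch this image when it's about to scroll into view -->
    <img src="team-photo.jpg" alt="Our team at a 2020 conference" loading="lazy">

Neither change affects what visitors see; they only change when the browser does the work.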

Summary

To summarise, page experience won’t be the dominant ranking factor when the page experience update happens, but combined with other factors, it will influence your rankings overall.

So make sure your site is optimised for mobile, make site security a priority, review your images and opt-ins, and improve your page load speed. You’ll be better placed to maintain your rankings and make your website a more user-friendly, professional place for your customers to visit.

The Complete Rundown of Google’s Algorithm Updates 2020 (so far)

Google makes thousands of changes to search each year – in fact, it was reported that Google conducted 3,234 updates in 2018 alone.

While we await the total for 2019, it is clear that, when it comes to managing your website, you should not sit idle. Studies have shown that between 70% and 80% of users research a business online before making a purchase, meaning that being found online is vital for your company. As Google continues to roll out algorithm updates, it’s important to adjust your website as needed in order to boost your search visibility and gain first-page rankings.

But with so many changes being made throughout the year, it can be hard to keep up. So, we’ve compiled a list of all algorithmic changes of 2020 so far, including the winners and losers, which we’ll update as time goes on.

January 13th – January 2020 Core Update

The first update of the year was announced by Google on January 13th, rolling out across the world and affecting all languages. Tracking tools showed high volatility for three days, with Google confirming that the update was ‘mostly done’ rolling out on the 16th.

Core updates do not target a particular industry, and anyone can be affected. They are designed to improve how Google’s systems rank content. The search engine giant explains this in more detail on their blog:

“…imagine you made a list of the top 100 movies in 2015. A few years later, in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.”

Winners and Losers

Towards the end of 2019, most updates saw YMYL (your money or your life) sites as the most affected. This update was no different, particularly for the health and finance sectors. Verywell Health was one of the top winners, according to multiple sources, as was Yahoo Finance. A few dictionary sites came out triumphant, too. Meanwhile, a further group of healthcare sites fell victim.

Overall, this update has been dubbed by many as large and fierce, causing great tremors throughout top ten rankings.

January 23rd – Featured Snippet Deduplication

While this wasn’t exactly an algorithm update, the deduplication of featured snippets was still a considerable change that majorly affected websites’ click-through rates (CTR).

Previously, featured snippets were counted as their own, stand-alone search engine results page (SERP) feature, not an organic search listing. If a site obtained a featured snippet, the same URL would also appear in the listings below as an ordinary listing.

On the 23rd, Google announced that, if a URL is featured in a snippet, it would no longer be repeated in the organic listings on the first page of search results. This aligned the SERP feature with Google’s claim that a featured snippet is an organic entity, counting as position number one. Before this deduplication, a featured snippet effectively counted as ‘position zero’.

Winners and Losers

In this case, there were no obvious winners and losers as such, but many sites did report losses of traffic. The deduplication left many site owners having to decide which they’d rather lose: a first-page ranking or a featured snippet. Moz tried conducting a CTR study to see which loss would have the bigger impact, but unfortunately, it is impossible to determine whether clicks to a URL came from the featured snippet or the organic listing.

Overall, this major change was met with confusion and outcry and sparked much discussion over the future of featured snippets.

February 9th – Unconfirmed Search Ranking Update

While Google has denied that this was a core update, we have decided to include it in this blog, as there were significant amounts of chatter and tracker tools were off the charts for five days – longer than for standard algorithm updates.

From February 9th, discussions of a suspected search update started to arise on Twitter and various web forums. SEO journalist Barry Schwartz reported the fluctuations as ‘really big, maybe even massive’ changes that were taking place. Many sites experienced severe traffic drops and spikes.

On the 13th, Google’s Danny Sullivan stated, ‘We do updates all the time’ in response to this speculation. This suggests that algorithmic changes were made during this time, just not on the same scale as a core update.

Winners and Losers

Unfortunately, it’s hard to identify a clear sector that either benefitted or suffered the most following these changes. The suspected update occurred across the globe, and there were both winners and losers in a range of industries. There was even some speculation that several updates actually took place, as many of the sites that saw drops then experienced traffic increases a few days later. The one thing that was certain following the confusing five days was that some sort of change was made – it’s just unclear what exactly that change was.

May 4th – May 2020 Core Update

The second core update of the year began rolling out on May 4th and appeared to have mostly ended by the 7th. Rank Ranger dubbed this update an ‘absolute monster’, as the effects appeared more brutal than those of the January update. Furthermore, this update took place amidst the coronavirus pandemic, which had already significantly affected a wide range of sites and caused a change in search patterns.

Winners and Losers

Unlike many other updates, the May core update appeared to have less focus on typical E-A-T areas and was broader than usual, making it harder to detect a clear winner or loser. Marie Haynes claimed that Google is getting better at recognising which links are ‘truly recommendations for your content and which are self-made for SEO reasons.’

It was interesting to note that Spotify took a hit, despite it progressing steadily over the last few years.

As of today, 15th May, information surrounding this update is still rolling out, and so we will continue to update our blog with our findings.

June 23rd – Suspected Federal Update

On June 23rd, there were reports of another algorithm update taking place, although Google is yet to comment. SEO trackers were showing high volatility in the SERPs, and many reported significant gains.

What’s interesting about this update is that it seemed to target .gov sites, along with other YMYL industries, a trend which appeared to be mirrored across the globe. This pattern is what led Barry Schwartz to label this update ‘The Federal Update.’

Winners and Losers

.gov sites were not the only winners of this update; health and news sites were also affected. But it is clear to see that government sites were the ones impacted the most. It’s possible that Google now prefers these sites over other domains, due to their authority. Authority is part of the E-A-T metric that Google refers to in its Quality Rater Guidelines, so it’s only natural for .gov and .org sites to experience a boost in search visibility.

July 6th – Small Search Ranking Update

A small search ranking update may have taken place on 6th July, lasting until the 8th. While there wasn’t as much chatter surrounding this update as the last, some webmasters still detected some changes in their site traffic. This algorithm update targeted niche websites, and our SERP volatility tracker reported nothing more than a slight increase in movement amongst rankings.

Winners and Losers

It’s harder to say who the winners and losers were with this update, as only a small number of sites were affected. Education sites reportedly saw some movement, but it’s highly likely that a particular niche of sites was amongst those most affected, due to the small amount of volatility.

July 24th – Suspected Search Ranking Update & What SEOs Get Wrong About Google’s Updates

Not long after the previous suspected update, another one came along. Only this time, it was much bigger, with our volatility tracker showing fluctuations amongst the SERPs for most industries.

To add to this, a few days later (28th July), Barry Schwartz published an article on Search Engine Land exploring what SEOs often get wrong about Google algorithm updates. Schwartz promotes the idea that we should stop obsessing over what Google is changing each time an update rolls out and focus on our websites instead. This is because most people blame their drops in traffic on updates when, in reality, it’s usually because their websites aren’t up to scratch. The time spent trying to crack the algo updates should be spent on improving content, UX, link profiles and other ranking factors.

This coincides with a statement Google made in August 2019. They stated that, when it comes to improving your site following updates, ‘There’s nothing to fix.’ Nothing really changes when Google rolls out updates, only how they assess content. The search engine giant then followed this up with: ‘We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.’ Schwartz’s advice, therefore, aligns with Google’s own – but what do you think?

Marie Haynes chimed into this discussion, offering a few tips to help you decipher if any traffic losses are the result of a recent update. You should start by looking at any recent changes made to your site, as well as the number of pages that were affected. If a noticeable drop in traffic occurred within 48 hours of an algorithm’s start date, then that was likely the cause.

Winners and Losers

Many industries experienced some fluctuations, according to our volatility tracker. The sports, news and arts & entertainment industries saw drastic changes. According to SEMrush, victims included WorkingTitleFilms.com, Marvel and Talk Sport, whereas winners included GuitarWorld, Teen Vogue and TheStage.com.

August 10th – August 11th – 2020 Google Glitch

On Monday afternoon, Barry Schwartz reported one of the largest Google updates many had seen for a long time. He commented that it looked like there was “a big Google search ranking algorithm rolling out”, although it was not confirmed by Google.

By Monday night, many in the SEO community were noticing huge discrepancies in search results. Content that was once high-ranking suddenly seemed to drop several pages, while other low-ranking sites started popping up in the first pages of the SERPs.

However, it was soon revealed that this was not an update, but in fact a glitch.

Fortunately, things seemed to revert to normal by August 11th, and Google’s John Mueller later posted on Twitter that the glitch had been fixed.

On August 11th, Google commented on the glitch, “On Monday we detected an issue with our indexing systems that affected Google search results. Once the issue was identified, it was promptly fixed by our Site Reliability Engineers and by now it has been mitigated.”

Winners and Losers

Technically, there weren’t really any winners or losers in this scenario – as it was a glitch – but it was certainly interesting to see which sites temporarily ranked higher.

According to Marie Haynes, sites that experienced a boost in traffic included medical articles that did not include supporting references or links to authoritative sites. These articles referenced natural remedies that were not backed by scientific evidence. Haynes also commented that many of the sites that started ranking highly included unnatural links.

But on August 15th, it appeared the glitch trauma was not over. Strong amounts of chatter picked up in the SEO community following significant changes reported in the SERPs, similar to those seen during the initial glitch. By the 16th, things seemed to have calmed down, leading to speculation of another issue on Google’s side.

The fluctuations were captured by our volatility checker, which was off the charts throughout the 15th.

If this was another glitch, it mirrors the pattern that’s been forming recently when it comes to indexing. Speculation was rife in May amid more indexing issues, although Google’s John Mueller stated that there wasn’t a bug – at most, a few minor glitches. There were also reports in June, when Google experienced more problems. Furthermore, throughout 2019, there were many problems with content being de-indexed, and at one point new content wasn’t being indexed at all.

If your website has been affected by an algorithm update and you’re struggling to find a solution, get in touch with us at Fibre Marketing, and we’ll help you get back on track.