In the past few weeks, Google has been testing new search features to decide how best to develop the services it offers its users – while making sure those services stay effective for Google too.
Ensuring that Google users get the best results and that businesses have the most impact with their SEO efforts is essential, which is why it’s important to understand where Google is heading with search results.
Google has been testing several changes and new features to discover what difference they can make for those who use them, including:
Categorised sections
Changes in advertisement headline sizes
Local Q&A box interface
A ‘For Context’ section
An ‘Also in the News’ Box
The visual look of the SERPs
Testing Categorised Sections
One of the major aspects Google has been testing is the way it displays results on its SERPs. The test involved grouping the search results into categories, meaning that instead of giving searchers a single list of websites that correlate to their search, the results are split into categories such as ‘Reviews’, related ‘Videos’, ‘Nearby stores’, related ‘Images’ and ‘Online stores’.
This is an interesting way to display the search results and could have knock-on effects for SEO. With a SERP that is split into categories, SEO strategies need to consider each separate category to have the maximum effect.
“Crazy to me how broken up the SERPs are now… Specific sections for online store and reviews (as well as images, local, etc.)” – @rustybrick
Another change to the visual impact of Google’s SERPs being tested is that, in the top four ad results, a hyperlinked sub-heading is shown, and Headline 1 and Headline 2 differ significantly in size. It is worth noting that in the test, this wasn’t the case for the results below the top four.
With a larger font size, this will probably result in a higher click-through rate for the top advertisers – and be beneficial for Google too. It also feeds the age-old debate about whether Google will eventually make paid search the predominant route to traffic, making it harder for organic listings to compete – especially for startups or smaller businesses whose budgets may not stretch to Google Ads.
“Google testing new format for Headlines in top ad slots. Looks like Headline 1 is much larger than Headline 2. Doesn’t appear to be happening for lower ad slots outside of top 4” – @rustybrick
Google already gives searchers the opportunity to ask local businesses questions through its Local Q&A box. However, it appears to be testing placing this box outside of each business’s panel. It is still unclear why Google is considering this change, but it could be to allow more general replies as well as locally focussed ones.
It is important, therefore, that businesses keep their Google My Business account as up-to-date as possible.
One of the biggest issues with using search engines is that sometimes searchers get content out of context. The ‘For Context’ section that has been tested mainly in conjunction with news, displays links to other articles that can provide extra context.
In terms of your business’s SEO, this means that you would be able to have more opportunities to get your content visible, but also makes getting your content accurate and relevant even more important.
“New feature within the top stories carousel: ‘for context’”
In a similar way to the ‘For Context’ section, Google was also testing an ‘Also in the News’ section. This appeared underneath the news section and links to other news articles related to the original story. The main difference between the two sections is that the ‘Also in the News’ stories can be older.
The question that arises from this is whether your content needs to be indexed in Google News in order for your site to appear within this section.
Google has also been testing out some other aspects of their SERPs. Some tests have included the use of thumbnail pictures in the results page as well as in the search suggestions box, and thicker grey lines between the search results. Of course, these potential changes may be to help searchers to understand the results quicker, but there may be other reasons for them.
Google is continuously looking to update the services that they are offering both searchers and the businesses that use Google to attract people to their websites. This means that it is more important than ever to ensure that your website is visible in searches relevant to your business. In order for this to happen, it’s vital that the SERPs are observed on a regular basis, so that you know when it’s time to adapt your SEO strategy and optimise what you can to make the most of the latest features.
Google recently announced that they will make the switch over to mobile-first indexing from March 2021, changing the way that website rankings are calculated. They originally planned to roll out these changes in September 2020, but the pressures of the coronavirus and the uncertain times we’re living in have pushed back the deadline.
Once this change happens, it’ll be more important than ever to have a mobile friendly website that delivers the best possible experience to your website visitors.
Here’s what you need to know about mobile-first indexing so you can protect (or even improve) your Google rankings, and help safeguard the future of your business.
What is mobile-first indexing?
‘Mobile-first indexing’ simply means that Google will use the mobile version of your website first for indexing and ranking.
However, despite what you might have heard, it won’t ignore your desktop website completely, or create an entirely new index when it does this. It will simply be a switch in focus that aims to deliver the best possible experience to 21st century users who generally spend more time on their mobile devices than on desktop.
Traditionally, Google primarily focused on the desktop version when calculating the rankings as it was presumed to be the ‘main version’ of the website. And for many years, it was.
However, over the past five years, the use of mobile devices such as smartphones and tablets has increased, encouraging Google to make these changes.
As Google said back in 2016, “…algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results…”
Why does mobile indexing matter?
As we’ve just mentioned, mobile indexing is becoming more important than ever because of the continuing surge in mobile searches from smartphones, tablets, and other devices.
According to Statista, “As of the first quarter of 2020, it was found that mobile devices accounted for 56 percent of organic search engine visits.”
Therefore, if your website isn’t working optimally on mobile, your customers are unlikely to be getting an optimal experience when they visit your website.
They might get frustrated if links are too small to click, the page is unresponsive, or they simply can’t navigate your website as effectively as they can on a desktop computer. As a result, your bounce rates will increase, the image of your business will decline, and your customers are unlikely to want to come back. Not exactly ideal when you’re running a business, is it?
Additionally, once Google implements this mobile-first indexing, your website rankings are also likely to suffer if you don’t have a mobile-friendly version in place.
How can you improve your website for mobile indexing?
If you can make your website as mobile-friendly as possible, you’ll significantly improve your overall performance as a business. Your website visitors will be able to find the information they need quickly and easily, customer satisfaction will improve, and you’re much more likely to rank highly. Here are some tips that can help:
Make sure you have a responsive website
Above all else, you should make sure your website is responsive. This simply means that it will adjust according to what specific device is used to access your website, to help the reader enjoy the best possible user experience. If you don’t do this, it’s likely you’ll be hit hard by the switch to mobile-first indexing.
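As a minimal sketch of what ‘responsive’ means in practice (the class name and 600px breakpoint below are purely illustrative), a viewport meta tag plus a media query lets the same page adapt to different screen widths:

```html
<!-- Tell mobile browsers to use the device width rather than a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Two columns side by side on larger screens */
  .content-columns { display: flex; gap: 1rem; }

  /* Stack the columns on narrow (mobile) screens */
  @media (max-width: 600px) {
    .content-columns { flex-direction: column; }
  }
</style>
```

Most modern website themes handle this for you, but viewing your site in a narrowed browser window is a quick way to check that the layout actually reflows.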
Focus on mobile page speed
Mobile page speed and load time are key factors that won’t just influence your Google rankings – they’ll also shape how visitors experience your website. After all, if a page takes too long to load, they’ll just get frustrated and click away.
“85% of mobile users expect pages to load as fast as, or faster than, they load on the desktop,” says leading technology service Radware, adding that “Two out of three smartphone users say they expect pages to load in 4 seconds or less.”
For that reason, you should work hard on improving your load speed before the mobile-first switchover happens. An easy way to do this is to run a speed test in Google Search Console. The test will also highlight what you could improve so you can make the changes quickly and easily.
Create high quality content
If you’ve been reading our blog for a while, you’ll know that you should be consistently creating high quality website content if you want to rank highly. When the switchover to mobile-first indexing happens, you should also check that the mobile version of your website contains the same content as your desktop. Also check that you’re using ‘alt-attributes’ for your images.
Is your mobile site accessible to Googlebot? If you’re not 100% certain, you should use the robots.txt testing tool to find out.
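For illustration, a robots.txt along these lines keeps the whole site open to Googlebot while blocking a hypothetical private area (the /admin/ path is an assumption for the example):

```
# robots.txt – this rule applies to all crawlers, including Googlebot
User-agent: *
# Block only an illustrative private section; everything else stays crawlable
Disallow: /admin/
```

If the testing tool reports that a page you want indexed is blocked, check for an overly broad Disallow rule like this one.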
Think about mobile user experience
You should also take time to consider how the elements on your page will look on mobile, as well as desktop. Choose your images carefully to ensure they don’t dominate the page, make sure any buttons are large enough to be clickable, make sure the text is large enough to read, and so on. It’s always worth viewing your website on different devices to see how the elements interact for your users.
There’s no need to be concerned about the switchover to mobile-first indexing. If you have a responsive, mobile-friendly website, you shouldn’t be too affected once it finally rolls out.
Otherwise, now is the perfect time to implement those tweaks and changes that will improve your website and help you maintain or even improve your rankings in 2021.
Earlier this week, Google overhauled their help document for Google Discover to make it clear what the web feature is and how it can supplement regular searches.
If you don’t already know or haven’t used it yet, Discover works by recommending pieces of content to users based on their previous Google searches. It doesn’t work in the same way as organic search as it isn’t based on something that a person has typed into Google, and searched for at a given moment in time. Instead, it takes a more long-term view of what you might like so it can send the right information your way.
What’s especially interesting with the help document update is the fact that they have added a section that refers to a key website ranking factor called ‘E-A-T’.
It reads:
“Our automated systems surface content in Discover from sites that have many individual pages that demonstrate expertise, authoritativeness and trustworthiness (E-A-T). Those looking to improve E-A-T can consider some of the same questions we encourage site owners to consider for Search.”
This highlights again the importance of following the ‘E-A-T’ principles, and suggests that if you do so, you are also more likely to be featured on Google Discover.
Here at Fibre Marketing, we spend a lot of time working on E-A-T for our clients. That’s why today we’d like to explain more about how E-A-T works and what you can do to improve your website with this in mind.
What is E-A-T, anyway?
The acronym ‘E-A-T’ stands for expertise, authoritativeness and trustworthiness.
These three characteristics are what Google looks for when it evaluates the quality of your website. They’re becoming increasingly important when it comes to getting your web pages to rank highly.
Expertise means that you should be knowledgeable and capable in your chosen field (and often have the credentials to back yourself up.)
Authority means that your website and its authors are respected by others in the industry.
Trust means that people should feel safe and secure on your website.
By using these characteristics to measure a website, Google can find the best quality content for its users and can avoid falling for some of the spammy SEO tricks used in the past.
This gives users a better experience, allows websites to showcase what they have to offer, and helps Google work better.
Although there isn’t currently a direct E-A-T score as such, these factors underpin everything else to do with rankings.
Therefore, if you want to get noticed online in 2020, you must focus on building and demonstrating your expertise, authoritativeness and trustworthiness, and on providing the best possible user experience.
How can you improve your E-A-T scores?
The following tips can help this to happen:
1. Create great content
As we’ve explained many times on the blog, one of the best ways to improve your SEO and increase rankings is to consistently create high quality content.
By ensuring that it’s relevant, regularly updated, free from typos or grammatical errors, comprehensive and trustworthy, you will demonstrate your expertise in your field. Your website visitors will also enjoy the experience and want to come to your website again and again.
They’ll be more engaged, they’ll be more likely to be loyal to your brand and they’re more likely to make a purchase in the future.
2. Attribute your content to authors
If you want to build your website authority, Google needs to see that you have connections to other knowledgeable and authoritative figures in your field.
The simplest way to do this is to attribute your website content to an author by adding their linked name to the top of the page. You could also use an ‘author box’ at the bottom of the post that includes their name, biography, photo and, if possible, a link to their website.
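An author box of this kind is usually just a small block of markup. Here’s a sketch – the name, class names and URL are all hypothetical, so adapt them to your own site:

```html
<!-- Illustrative author box; adjust the class names to match your theme -->
<div class="author-box">
  <img src="/images/jane-doe.jpg" alt="Photo of Jane Doe">
  <p>
    <strong><a href="https://janedoe.example">Jane Doe</a></strong>
    has written about search marketing for ten years and speaks
    regularly at industry events.
  </p>
</div>
```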
This won’t just help your website gain authority – it will also build the writer’s reputation, so it’s well worth taking the time to do.
3. Link to reputable websites
If you’ve done any link building to help build your site authority, you’ll know how beneficial it can be for your SEO and web rankings. By linking out to authoritative websites within your niche, you’ll associate your site with them while backing up your knowledge.
However, it’s important that any site you link to is reputable and offers high quality content. The last thing you want is to link to a spammy site that ends up causing you trouble or even gets you blacklisted.
If you find unwanted links pointing to your site, you might want to get rid of them via a process called ‘disavowing’. Read this article to learn more about the process.
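For reference, disavowing involves uploading a plain text file through Google’s Disavow Links tool. A minimal sketch of the file looks like this (the domains are placeholders):

```
# Lines starting with # are comments.
# Disavow a single spammy page:
http://spam-site.example/paid-links.html
# Disavow every link from an entire domain:
domain:shady-directory.example
```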
4. Improve your website security
Having a secure website isn’t just an excellent way to show your website visitors that they can trust you with their details.
Google also treats website security as a key ranking feature these days as it demonstrates that you take user experience seriously and are therefore more likely to provide a high quality experience.
The quickest way to do this is to switch over to HTTPS, which encrypts the data sent between your site and its visitors so hackers can’t intercept it.
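Once an SSL certificate is installed, you’ll also want visitors who arrive over HTTP to be sent to the HTTPS version. Assuming an Apache server with mod_rewrite enabled (many hosts can set this up for you), a common .htaccess sketch is:

```
# Redirect all HTTP requests to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```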
There are also various pieces of security software and security certificates you can add to your website to keep it safe from hackers, spam and unwanted visitors. Speak to your webmaster or host to discuss the options available to you.
According to SEO publication Search Engine Watch, 82% of customers said they’d leave a website if it wasn’t secure, so this is a tweak you need to make sooner rather than later.
5. Focus your content
Instead of writing website content spread across a variety of topics, get focused.
Learn more about what your audience wants when they visit your website, then create content that answers their questions, provides solutions and adds the maximum value possible.
When you do this, you’ll be showing your audience that you understand and care about their problems and can provide a solution. Not only does that leave them feeling happy with their experience on your website, it builds your authority, trust and brand image at the same time.
Because this content is almost certain to include a variety of targeted, natural keywords centred around a particular topic, Google is also more likely to see you as an authority in your field – another win.
Summary
The recent updates to the Google Discover help document demonstrate that E-A-T is more important than ever when it comes to SEO and improving your rankings. Keep working on your website expertise, authority and trust and you’ll see excellent results.
Imagine firing up your laptop and discovering that your website has disappeared completely from the search rankings.
Despite all of the hard work you poured into boosting your organic rankings and building your brand, your website is simply nowhere to be found.
You’re back to square one.
That’s exactly what could happen if Google issues a manual action against you.
Luckily, these ‘website red cards’ aren’t issued often or without careful consideration. Nor do they always have such a devastating impact.
But given the strict penalties, it’s vital to understand what Google’s manual actions are, why they are issued and, most importantly, what you can do if the worst should happen.
Here’s our short guide.
What is a manual action?
Manual actions are penalties that are given out by the humans at Google.
They’re given to websites that are using unethical practices to boost their website rankings and disregarding the Google Webmaster Quality Guidelines.
These penalties can affect individual pages or even entire websites, and can cause them to drop significantly in rankings or even disappear completely. This happens quickly – often overnight.
They’re a serious problem that can have a profound effect on your business.
How do websites get a manual action?
Provided you have created a high quality website and you aren’t using those underhand, spammy SEO practices that aim to cheat the system (often called black hat SEO), you shouldn’t ever be issued with a manual action.
Having said that, sometimes Google can issue a manual action as a result of other people’s behaviour such as dodgy links that point towards your website, blog or forum spam comments or even a hacked shared server. This is why it always pays to be aware of potential problems and monitor your website consistently.
With this in mind, here are some of the reasons why these manual actions are issued:
User-generated spam: This can come from spammy, self-promoting blog comments, forum posts, or other user-created content.
Using a free host: If infected with spam, shared servers can cause Google to issue a manual action, even if your website itself isn’t infected.
Unnatural links to your site: Unusual links that appear to be manipulating page rankings will be penalised.
Unnatural links from your site: Likewise, unusual outbound links could also be creating spam or point to disallowed practices.
Cloaking and/or sneaky redirects: Giving users a hidden page or image instead of the one submitted to Google, or redirecting users to a page they didn’t click on.
Pure spam: Blatant use of spam.
Hidden text and/or keyword stuffing: Repeating keywords excessively or hiding them on the page.
Sneaky mobile redirects: When mobile users are directed to a different URL than the one they clicked on.
It’s important to quickly note that you won’t be issued with a manual action when a user manually reports a website for spam.
According to Gary Illyes’ post on the Google Webmaster blog last Friday, “…we use spam reports only to improve our spam detection algorithms,” adding that although spam reports play a helpful role, they’re an inefficient way of detecting spam.
How do you know if you’ve got a manual action on your website?
If you have a manual action on your website, you’ll immediately receive a message from Google via Google Search Console. This will tell you what the problem is, which page or pages have been affected, and what steps you can take to resolve the problem.
If you haven’t received a message or haven’t noticed one, you can also check whether your website has a manual action against it by looking at your Google Search Console.
What should you do if you have a manual action?
If you discover that you have a manual action, don’t panic! Although it’s certainly not great news for your website rankings, you can fix the problem and then appeal to Google to reconsider.
Here are the steps you should take if this is the case:
1. Read Google’s message carefully
Before you do anything else, take your time to read the notification message carefully so you understand why your website has been penalised. Often, you’ll see how you can solve the issue.
2. Understand the problem
If the reason for the problem isn’t immediately clear, you will need to gather and evaluate key data so you can identify the cause. Depending on the size of your website and the issue in question, this can take anywhere from a few hours to a few weeks.
3. Fix the issue
The data you’ve gathered in the previous step should have identified what is flagging Google’s systems and causing you problems. Now you need to resolve the issue by following one or more of the following steps.
User-generated spam
Check your website for malicious content then delete it. Ensure all comments and content submitted to your site are moderated.
Using a free host
Contact the hosting company to inform them of the problem and consider moving to a secure host instead.
Unnatural links to your site
Disavow any links that appear suspicious.
Unnatural links from your site
Remove excessive links or low-quality links and use tags to identify where there are affiliate links.
Cloaking and/or sneaky redirects
Fix any pages that contain code, content or sneaky redirects that are hiding content.
Pure spam
Check Google’s Webmaster Guidelines and remove spam.
Cloaked images
Ensure that both Google and your users can see the same images.
Hidden text and/or keyword stuffing
Remove excessive use of keywords and hidden text including in the html and CSS of your site.
Sneaky mobile redirects
Remove any redirects and check for malware or hacking.
4. Ensure full company compliance
If the manual action was issued because of a team member’s mistake, take steps to prevent the same thing happening again in the future.
Call a meeting and ensure that everyone understands why it happened and how serious it is, then issue clear guidelines to safeguard your business.
5. Request a reconsideration
Finally, once you’ve fixed the problem you should request a reconsideration, including detailed information on how you fixed the problem and whether your website was hacked. Don’t request one before you’ve fixed your website as you’ll only be wasting Google’s time and your own.
At this stage you’ll need to be patient because it can take them anywhere from a few days to a few weeks to process your request and make updates if relevant.
Once Google has come to a decision, they’ll let you know whether your request was accepted.
When will my website recover?
For obvious reasons, a manual action affecting just one page on your website will be easier to recover from than problems with your entire site.
Generally speaking, it can take anywhere from a few weeks to a few years to recover from a manual action. You’ll need to provide useful, valuable content to your users and rebuild trust so you can move up the rankings again.
To round up, Google manual actions can cause huge problems for your website rankings and even cause them to disappear completely.
The good news is that they aren’t routinely issued and are never the result of a spam report issued by a user. By adhering to the guidelines and fixing any issues, most businesses can avoid ever receiving one.
In SEO, nothing stays the same for long; a link strategy that works one day may not work the next. Google originally launched its nofollow link attribute in 2005 in an attempt to stop comment spam and untrusted links. Googlebot didn’t crawl, index or rank nofollowed links, whether internal or external – it ignored them completely. The rel=“nofollow” attribute excluded a link from the search algorithm, and that was that.
In September 2019, Google revealed that the nofollow attribute would begin weighing in on how it ranks sites, i.e. the content, links and anchor text would feature in spam measurements, but more as a “hint” than an explicit directive. With this update, Google could now recognise content and anchor text and follow the links it deemed necessary, regardless of nofollow. Importantly, Google claimed it still would not crawl or index these links in any way. With this update we were also introduced to two new related link attributes, rel=“sponsored” and rel=“ugc” (where UGC stands for user-generated content). Both of these convey more detailed information about the link.
What each attribute means
rel=“nofollow” applies whenever you want to link to another webpage but don’t want to endorse it or give it any credit in the process. This attribute essentially tells Google to treat the link as untrusted or spammy, and it will likely be ignored.
rel=“sponsored” is to help Google recognise those links that you’ve deliberately added for advertising, sponsorships, or other paid agreements.
rel=“ugc” tells Google what you consider user-generated content, for example legitimate comments on a blog or a forum post.
It’s possible to give a single link more than one attribute, for example rel=“nofollow sponsored”. However, the impact of an attribute will be reduced if it’s not an accurate description of the link – for example, if you’ve chosen rel=“sponsored” for a blog comment.
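In the page’s HTML, these attributes sit on the anchor tag itself; the links below are illustrative:

```html
<!-- A paid or sponsored placement -->
<a href="https://advertiser.example" rel="sponsored">Partner offer</a>

<!-- A link inside a user-submitted comment -->
<a href="https://commenter.example" rel="ugc">Commenter's site</a>

<!-- A link you don't want to vouch for; attributes can be combined -->
<a href="https://unvetted.example" rel="nofollow sponsored">Sponsored review</a>
```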
The March 2020 update
Since March 2020, Google has treated the nofollow attribute as a hint for conducting spam analysis, and possibly for rankings, depending on just how relevant and high-quality the links are. If Google feels that, after crawling the surrounding content and anchor text, the link is in fact relevant, then it may crawl the link despite the nofollow tag.
What’s the confusion?
Google claimed in its update that links would not be ignored if they had any of these three attributes; rather, the attribute would give Google algorithms a “hint” about the kind of link it was seeing. In other words, data is still being collected from those links to track and rank the link scheme, ultimately having the potential to improve the overall website search ranking. This can be confusing unless you understand exactly why Google has made these updates.
The reason for this feature is to constantly discourage spammers – Google attempts to identify spammy, unnatural link patterns so these links can be properly omitted. The only way to do this is to gather the data for these links so Google can correctly recognise and categorise them. By calling these features a “hint” Google can still access the important data it needs.
To add to the discussion, John Mueller recently stated that all guest blog post links should have at minimum the nofollow attribute on them, regardless of whether they’re paid, and irrespective of how natural the link seems (for example, a link in the byline). This claim divided the SEO community, as some links in guest posts are included for natural reasons, such as genuinely using a site for research purposes, or a link in an author’s bio. People also felt that, if a change like this were ever implemented, Google would be trying to claim greater control over the internet.
Overall the mixed reaction to this change is yet to go away, with questions frequently appearing across social media platforms and forums. Furthermore, Google’s Gary Illyes mentioned in June that more nofollow link changes may be on the horizon.
What this means for your search results
Google will use these attributes to gather data to feed into its search ranking schemes, tagging links with information that marks them as excluded from or included in searches. Though Google claims there should be no significant effect on search results, some people are naturally worried. It’s important to remember that even after the March update, the attributes are still just hints.
What should you do?
You don’t have to make any changes if you don’t want to. You could opt to do nothing and keep using nofollow attributes as you have been, though you could always consult your SEO team if you’re concerned about a few key links. For paid links, you will need to keep the nofollow attribute or change it to sponsored (or a combination of the two).
The topic of disavowing links has always been a hotly debated one, with SEO experts disagreeing on whether to disavow “bad” links, and how this might affect site rankings on Google. Though the extent is debatable, ranking algorithms still do factor in link quality, so a poor link could very well damage search rankings. With all the changes and challenges brought by 2020, we find it wise to regularly audit links for their relevance and quality. Let’s take a closer look at how and why to disavow backlinks – and how to determine whether you need to do it at all.
A brief intro on disavowing
In the past, Google ranked pages by link quality via PageRank, but this system was quickly exploited. The Penguin algorithm update in 2012 aimed to penalise sites with an abundance of spammy links. These sites could only recover by removing those bad links, and hence Google’s Disavow tool was born. Later, Penguin 4.0 brought an important change: Google would devalue low-quality links rather than penalise them, meaning you only needed to disavow a link if you’d received a notice to that effect, i.e. a “manual action.”
Fast forward to today, and Google maintains that disavowing links will not do much to help your site’s ranking. The unofficial position, however, is a little more ambiguous, with Google’s John Mueller claiming that disavowing may in fact benefit some sites. Ultimately, disavowing can help in some cases, but Google makes it rather difficult to use the necessary tool – and going too far can definitely backfire.
Understanding manual actions
You can find out your site status by looking in the Google Search Console under “manual actions.” This is simply when Google lets you know that it plans to omit/penalise certain lower quality content from search results. Penalised links can include anything from keyword stuffing or hidden text, unnatural or inorganically received links, automated links, PBNs, comment and forum spam, influencer-inspired/paid links, suspicious redirects, thin content and the like.
Is it a good idea to disavow?
As with most things SEO-related, it all depends. Just how low quality are the links and how much are they affecting rankings? A good rule of thumb is to only disavow when you literally have no other option. For example, first reach out to the owners of low-quality links pointing to your site and ask for them to be removed, using disavowal only as a last resort. Ultimately, it’s an advanced feature that should be used carefully since it could cause more harm than good.
What if I don’t have a manual action?
Before you do anything, carry out a comprehensive link audit to properly understand how your links are performing. Rest assured that Google will ignore all but those links that your SEO team is directly responsible for. When appraising link quality, look for anything that violates Google’s terms, such as paid links, link schemes and reciprocal linking/swapping, suspicious anchor text, articles with links to dodgy sites, malware and cloaked sites, or “pills, poker and porn.”
If these links don’t generate revenue and also don’t affect your organic search traffic, then the solution may be as simple as removing the page completely.
Tips for effective disavowing
Naturally, remove any links you’ve received a manual action for.
Focus on creating links with value to human readers, and which will boost your site’s domain authority and trust.
Unless you’re ultra-confident with SEO, use a backlink monitoring tool or consult an SEO team for expert advice on exactly which links to disavow as part of your routine site audit.
Check that you’re receiving links from high-authority, industry relevant sites.
Links from expired domains or overly regional content may slip through ordinary filters, so keep an eye out for them.
Generally, the most likely culprits are links that have over-optimised anchor text, links that are not industry relevant or links that seem spammy, so start with those.
Don’t assume that removing any old link will automatically improve rankings – it may just do the opposite.
Finally, there’s plenty of wiggle room, and many links will fall in a generous grey area.
The best strategy
Overall, your focus should be on maintaining the highest-quality backlink profile possible and disavowing only those links that are obviously and significantly harming your organic search results – otherwise tread carefully. Keep an eagle eye on any incoming links that could be harming your site’s reputation. Even scrupulous link audits can miss a few bad links, but this relatively small risk should be weighed up against your overall SEO priorities, the risk of disavowing incorrectly, and your marketing strategy in general.
This means that if your website visitors don’t have a great user experience (UX), your Google rankings are likely to suffer after this new update is rolled out.
In this short article we’d like to explain more about what the Google Page Experience update is, why it matters and how you can prepare your website for the changes so your search performance isn’t affected.
What is page experience?
Think back to when you last visited a website and found yourself getting frustrated.
Perhaps the page took ages to load. Or annoying pop-up adverts and opt-ins kept getting in your way. Maybe you could barely read the content because it was designed for desktop use and was tiny.
When this happened, you almost certainly just gave up, clicked the back button and found another website instead.
In this case, the problem wasn’t that the website didn’t offer the right information. It may very well have. The problem was that you didn’t have a good experience using it.
At this stage, it’s unclear how much the Google page experience update will affect your search performance. We’ll only know for sure when the update rolls out in 2021.
Having said that, the metrics which Google will be focusing on with this update do already have an impact on your search performance, but in an organic, audience-led way.
As we mentioned at the beginning, visitors are less likely to stick around on your website if they can’t read your information on their mobile, annoying ads dominate the screen or it takes ages for your page to load.
If you haven’t already optimised these factors, it’s more than likely that your website won’t perform as well as it potentially could, regardless of the update.
It’s also important to note that the page experience update factors won’t be the only ones used in ranking your website. Producing engaging, useful content is still the most important factor.
How to prepare for the Page Experience update
Although there aren’t currently any specific tools that can measure page experience, you can identify what needs improvement and make changes with several other tools. These will help you improve your website and get ready for the update.
1. Optimise your site for mobile.
If you’re running a business in the 21st century, you must make sure that your website is mobile friendly.
According to market and consumer data provider Statista, around 52% of traffic in 2019 came from mobile devices. This figure looks set to grow over the coming years, especially as the online market expands as a result of COVID-19.
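One quick check you can make today: ensure every page declares a viewport, which tells mobile browsers to scale the layout to the device’s width instead of rendering a shrunken desktop view:

```html
<!-- Place inside <head> on every page -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without this tag, even a responsive design can render zoomed out and unreadable on phones.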
2. Make your site more secure
These days you must have a secure website if you want to succeed online. You need to keep hackers away, protect your customers’ sensitive personal information and provide users with a safe experience.
When you can do this, your customers will trust your business, your search rankings are likely to improve and your website will be safer.
There are many ways you can do this, including getting an SSL certificate and switching to HTTPS. This encrypts the traffic between your website and its visitors, keeping threats out and helping to boost your SEO rankings.
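Once the certificate is installed, you’ll also want to force visitors (and Googlebot) onto the secure version. As a sketch, assuming an Apache server with mod_rewrite enabled – nginx and other servers have equivalent directives – a site-wide redirect looks like this:

```apache
# .htaccess – permanently redirect all HTTP requests to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Using a 301 (permanent) redirect also passes link equity to the HTTPS URLs, which matters for SEO.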
3. Check your site for security issues
Most businesses will already know if their site has been hacked, contains malware or hosts deceptive content that could potentially harm a visitor or their computer.
However, you can check if your site has any issues that could influence page experience, user experience (UX) or search rankings by using Google Search Console to access a security report.
4. Review how you use ads and images
Using ads and visuals on your website will drive customer engagement, improve the user experience and help boost conversions.
However, it’s vitally important that these add value rather than detracting from your content or creating a poor user experience.
“Pages that show intrusive interstitials provide a poorer experience to users than other pages where content is immediately accessible. This can be problematic on mobile devices where screens are often smaller,” say Google Webmasters.
Check through your website pages and ensure that any visuals used such as images, infographics, and ads don’t dominate the page. Always opt for high quality, professional standard images instead of stock images.
If you do use pop ups to offer opt-in gifts or encourage newsletter sign ups, keep them to a minimum, avoid using them for mobile and make them easy to close.
5. Improve your page load speed
As we mentioned earlier, your page load speed makes a huge difference when it comes to user experience, your Google rankings and attracting website visitors – and it can be relatively straightforward to improve.
“Longer page load times have a severe effect on bounce rates,” say Google Webmasters. “If page load time increases from 1 second to 3 seconds, bounce rate increases 32%.”
Start by running a speed test in Google Search Console, then read through their advice and optimise the elements suggested. This can include using a design that loads quickly, optimising images, and minifying CSS, JavaScript and HTML.
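To see why minification shrinks payloads, here’s a deliberately crude Python sketch of the idea (a real build tool such as cssnano or csso does far more – this is illustration only):

```python
import re

def minify_css(css: str) -> str:
    """Crude CSS minifier: strips comments and collapses whitespace.
    Real minifiers also rewrite values, merge rules, etc."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # remove /* ... */ comments
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # drop spaces around punctuation
    return css.strip()

before = """
/* main styles */
body {
    margin: 0 ;
    font-family: sans-serif ;
}
"""
after = minify_css(before)
print(after)                        # body{margin:0;font-family:sans-serif;}
print(len(before), "->", len(after))
```

In practice you’d let your build pipeline handle this; the point is simply that comments and whitespace are wasted bytes on the wire.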
Summary
To summarise, page experience won’t be the only ranking factor when the page experience update happens, but combined with other factors, it will influence your rankings overall.
So make sure your site is optimised for mobile, make site security a priority, review your images and opt-ins, and improve your page load speed. You’ll be better placed to maintain your rankings and make your website a more user-friendly, professional place for your customers to visit.
Several months into lockdown and the country is still cautiously trying to navigate its way into the post-COVID world. Some industries have been massively and permanently altered by the pandemic, while others are scrambling to find creative ways to stay afloat in changing times. How has the SEO world been affected, and more specifically, what lies in store for SEO workers and for the way businesses show up in searches?
Google’s Gary Illyes recently set up four Twitter polls and asked SEOs around the world how the virus had impacted their work. Perhaps unsurprisingly, the results were mixed, with some reporting barely any change and others reporting massive differences.
Almost half claimed that the pandemic actually increased their SEO-related workload; just 20% claimed it had decreased, and 30% noticed no change. When asked whether it was harder to convince decision makers of the value of SEO services, the results were again mixed – 37% said it had become harder, while 31% said it had become easier and 32% reported no change.
Did this happen because SEO workers were pitching differently during the crisis? When Illyes asked about this, the result was split three ways, with roughly a third each claiming they spent less time, more time, or roughly the same time pitching as before. Perhaps the most revealing result came when people were asked about their experience working with developers on SEO projects. A full 55% said it had stayed the same, 30% said it had become harder, and only around 15% said it had become easier.
What are we to make of these results? As with most things in SEO, there is seldom a simple, straightforward answer. Polls like this show that SEO’s range is so broad that it can sometimes be challenging to pinpoint trends. In the end, exactly how any one industry or business is affected will depend heavily on its marketing strategy before the pandemic, the nature of the business, and how swiftly it responds to new challenges.
The virus has changed what people search for. For example, e-commerce is experiencing increases across the board, though less so for non-essential items. Health and wellness sites are seeing traffic boosts, as are some recipe sites, but the travel industry is a little more complicated – searches for flights may be up, but this could simply be due to people seeking cheaper deals. Most restaurants are having to pivot to home delivery (making features like Google Posts useful for updates).
Ultimately, the answer to how the Coronavirus has affected SEO marketers and their clients’ campaigns is: it depends. The virus has affected every industry differently. The job of any SEO expert is to understand these unique changes and respond accordingly.
What have we been doing at Fibre Marketing?
When the pandemic hit, we had to adapt quickly and as efficiently as possible. Our client portfolio spans across a variety of industries that were affected in different ways. Because of this, each strategy we crafted was unique and adaptable, as the growing uncertainty resulted in almost daily updates and advice on what to do next.
Here’s a breakdown of some of the ways we helped navigate our clients through the Coronavirus crisis:
Google My Business Listings (GMBs)
One of the first major changes to search was the sudden release of new GMB features, which allowed businesses to inform customers of their situation more accurately, while other features were temporarily held back. The option to mark your business as ‘Temporarily Closed’ was long-awaited, so we implemented this feature across clients’ GMBs ASAP. We also published a post for each one to assure their customers and clients that they were operating smoothly, and to flag any changes to opening hours.
As well as this, we made sure that all contact details were up to date so that customers could reach the client as easily as possible – especially important after Google temporarily suspended the posting of new GMB reviews.
As the pandemic changed search behaviour, we needed to ensure that our clients were making the most of any opportunities that arose. Monitoring the rapidly evolving trends and search frequencies, we worked with our clients to create content that suited users’ interests, answered their questions, and capitalised on new, untapped searches to bring in organic traffic.
This took the form of dedicated COVID-19 sections, which included resources designed with customers in mind. For our elderly care client, this section covered arranging support during lockdown, how carers and customers were keeping safe, and even a video on setting up a Zoom account so family members could stay in touch with their loved ones.
A shipping container client’s section was catered towards medical facilities urgently requiring extra storage or modular hospital buildings. The content explores the options available, offering discounts to both medical and charitable organisations. The portable hospital page, which is linked in this guide, now ranks 2nd for the term ‘portable hospital.’
Link Building
As more and more news updates, theories, and advice emerged daily, the type of content that users found valuable changed drastically. So, we had to quickly change our link building strategies to ensure that we’d continue building great backlink profiles.
To do this, we shifted the focus of our content to suit users’ demands and needs, including working from home and mental health. We also monitored Twitter for story requests from journalists and bloggers, recommending our clients as a valuable resource for their articles. For example, a journalist was on the hunt for recruitment experts who could provide tips on writing CVs. One of our clients – an engineering recruitment company – had a blog article already on their site, providing the information the journo wanted, so we reached out, shared our blog with her, and thus gained a link on a top-tier site.
This wasn’t the only campaign we created. Others included a guide to making care packages and a tribute to home carers, to name a few – all of which positioned our clients as experts.
While we await the final update count for 2019, it is still clear that, when it comes to managing your website, you should not sit idle. Studies have shown that between 70% and 80% of users research a business online before making a purchase, meaning that being found online is vital for your company. As Google continues to roll out algorithm updates, it’s important to adjust your website as needed in order to boost your search visibility and gain first-page rankings.
But with so many changes being made throughout the year, it can be hard to keep up. So, we’ve compiled a list of all algorithmic changes of 2020 so far, including the winners and losers, which we’ll update as time goes on.
January 13th – January 2020 Core Update
The first update of the year was announced by Google on January 13th, rolling out across the world and affecting all languages. Tracking tools showed high volatility for three days, with Google confirming that the update was ‘mostly done’ rolling out on the 16th.
Core updates do not target a particular industry, and anyone can be affected. They are designed to improve how Google’s systems rank content. The search engine giant explains this in more detail on their blog:
“…imagine you made a list of the top 100 movies in 2015. A few years later, in 2019, you refresh the list. It’s going to naturally change. Some new and wonderful movies that never existed before will now be candidates for inclusion. You might also reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren’t bad. There are simply more deserving films that are coming before them.”
Winners and Losers
Towards the end of 2019, most updates saw YMYL (your money or your life) sites as the most affected. This update was no different, particularly for the health and finance sectors. Verywell Health was one of the top winners, according to multiple sources, as was Yahoo Finance. A few dictionary sites came out triumphant, too. Meanwhile, a further group of healthcare sites fell victim.
Overall, this update has been dubbed by many as large and fierce, causing great tremors throughout the top ten rankings.
January 23rd – Featured Snippet Deduplication
While this wasn’t exactly an algorithm update, the deduplication of featured snippets was still a considerable change that majorly affected websites’ click-through rates (CTR).
Previously, featured snippets were counted as their own stand-alone search engine results page (SERP) feature, not an organic search listing. If a site obtained a featured snippet, the same URL would also appear in the listings below as an ordinary result.
On the 23rd, Google announced that, if a URL is featured in a snippet, it would no longer also appear in the organic listings on the first page of search results. This aligned the SERP feature with Google’s claim that a featured snippet is an organic entity, counting as position one; before deduplication, a featured snippet effectively counted as ‘position zero’ in addition to its organic listing.
Winners and Losers
In this case, there were no obvious winners or losers as such, but many sites did report losses of traffic. The deduplication left many site owners having to decide which they’d rather lose: a first-page ranking or a featured snippet. Moz tried conducting a CTR study to see which loss would have the bigger impact, but unfortunately it is impossible to determine whether clicks to a URL came from the featured snippet or the organic listing.
Overall, this major change was met with confusion and outcry and sparked much discussion over the future of featured snippets.
February 9th – Unconfirmed Search Ranking Update
While Google has denied that this was a core update, we have decided to include it in this blog, as there was a significant amount of chatter and tracking tools were off the charts for five days – longer than for standard algorithm updates.
From February 9th, discussions of a suspected search update started to arise on Twitter and various web forums. SEO commentator Barry Schwartz reported the fluctuations as ‘really big, maybe even massive’ changes. Many sites experienced severe traffic drops and spikes.
On the 13th, Google’s Danny Sullivan stated, ‘We do updates all the time’ in response to this speculation. This suggests that algorithmic changes were made during this time, just not on the same scale as a core update.
Winners and Losers
Unfortunately, it’s hard to decipher a clear sector that either benefitted or suffered the most following these changes. The suggested update occurred across the globe, and there were both winners and losers in a range of industries. There was even some speculation that several updates actually took place, as many of the sites that saw drops then experienced traffic increases a few days later. The one thing that was certain following the confusing five days was that some sort of changes were made – it’s just unclear what exactly those changes were.
May 4th – May 2020 Core Update
The second core update of the year began rolling out on May 4th and appeared to have mostly ended by the 7th. Rank Ranger dubbed this update an ‘absolute monster’, as the effects appeared more brutal than those of the January update. Furthermore, this update took place amidst the Coronavirus pandemic, which had already significantly affected a wide range of sites and caused a change in search patterns.
Winners and Losers
Unlike many other updates, the May core update appeared to have less focus on typical E-A-T areas and was broader than usual, making it harder to detect a clear winner or loser. Marie Haynes claimed that Google is getting better at recognising which links are ‘truly recommendations for your content and which are self-made for SEO reasons.’
It was interesting to note that Spotify took a hit, despite having progressed steadily over the last few years.
As of today, 15th May, information surrounding this update is still rolling out, and so we will continue to update our blog with our findings.
June 23rd – Suspected Federal Update
On June 23rd, there were reports of another algorithm update taking place, although Google is yet to comment. SEO trackers were showing high volatility in the SERPs, and many reported significant gains.
What’s interesting about this update is that it seemed to target .gov sites, along with other YMYL industries, a trend which appears to be mirrored across the globe. This pattern is what led Barry Schwartz to label this update ‘The Federal Update.’
Winners and Losers
.gov sites were not the only winners of this update; health and news sites were also affected. But it is clear to see that government sites were the ones impacted the most. It’s possible that Google now prefers these sites over other domains, due to their authority. Authority is part of the E-A-T metric that Google refers to in its Quality Rater Guidelines, so it’s only natural for .gov and .org sites to experience a boost in search visibility.
July 6th – Small Search Ranking Update
A small search ranking update may have taken place on 6th July, lasting until the 8th. While there wasn’t as much chatter surrounding this update as the last, some webmasters still detected changes in their site traffic. This update appeared to target niche websites, and our SERP volatility tracker reported nothing more than a slight increase in movement amongst rankings.
Winners and Losers
It’s harder to say who the winners and losers were with this update, as only a small number of sites were affected. Education sites reportedly saw some movement, but it’s highly likely that a particular niche of sites was amongst those most affected, due to the small amount of volatility.
July 24th – Suspected Search Ranking Update & What SEOs Get Wrong About Google’s Updates
Not long after the previous suspected update, another one came along. Only this time, it was much bigger, with our volatility tracker showing fluctuations amongst the SERPs for most industries.
To add to this, a few days later (28th July), Barry Schwartz published an article on Search Engine Land exploring what SEOs often get wrong about Google algorithm updates. Schwartz promotes the idea that we should stop obsessing over what Google is changing each time an update rolls out and focus on our websites instead. This is because most people blame their drops in traffic on updates when, in reality, their websites usually aren’t up to scratch. The time spent trying to crack algorithm updates would be better spent improving content, UX, link profiles and other ranking factors.
This coincides with a statement Google made in August 2019. They stated that, when it comes to improving your site following updates, ‘There’s nothing to fix.’ Nothing really changes when Google rolls out updates, only how they assess content. The search engine giant then followed this up with ‘We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.’ Schwartz’s advice, therefore, aligns with Google’s own – but what do you think?
Marie Haynes chimed into this discussion, offering a few tips to help you work out whether any traffic losses are the result of a recent update. You should start by looking at any recent changes made to your site, as well as the number of pages that were affected. If a noticeable drop in traffic occurred within 48 hours of an update’s start date, then the update was likely the cause.
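That 48-hour rule of thumb is easy to sanity-check against your own analytics export. Below is a minimal Python sketch (our own illustration, not a Google tool – the function name, two-day window and 20% threshold are arbitrary assumptions) that compares average daily sessions just before and just after a known update date:

```python
from datetime import date, timedelta

def drop_after_update(daily_sessions, update_day, window_days=2, threshold=0.2):
    """Flag a traffic drop around an update date.

    Compares the average sessions over `window_days` before the update
    with the window starting on the update day, and flags any relative
    drop bigger than `threshold` (default 20%). Purely a heuristic.
    """
    before = [daily_sessions[update_day - timedelta(days=i)]
              for i in range(1, window_days + 1)]
    after = [daily_sessions[update_day + timedelta(days=i)]
             for i in range(window_days)]
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    change = (avg_after - avg_before) / avg_before
    return change <= -threshold, change

# Toy data around the May 4th core update
sessions = {
    date(2020, 5, 2): 1000, date(2020, 5, 3): 980,
    date(2020, 5, 4): 700,  date(2020, 5, 5): 650,
}
flagged, change = drop_after_update(sessions, date(2020, 5, 4))
print(flagged, round(change, 2))  # True -0.32
```

A flagged drop isn’t proof, of course – cross-check it against any site changes you made in the same window, as Haynes suggests.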
Winners and Losers
Many industries experienced some fluctuations, according to our volatility tracker. The sports, news and arts & entertainment industries saw drastic changes. According to SEMrush, victims included WorkingTitleFilms.com, Marvel and Talk Sport, whereas winners included GuitarWorld, Teen Vogue and TheStage.com.
August 10th – August 11th – 2020 Google Glitch
On Monday afternoon, Barry Schwartz reported one of the largest Google updates many had seen for a long time. He commented that it looked like there was “a big Google search ranking algorithm rolling out”, although it was not confirmed by Google.
By Monday night, many in the SEO community were noticing huge discrepancies in search results. Content that was once high-ranking suddenly seemed to drop several pages, while other low-ranking sites started popping up in the first pages of the SERPs.
However, it was soon revealed that this was not an update, but in fact a glitch.
On August 11th, Google commented on the glitch, “On Monday we detected an issue with our indexing systems that affected Google search results. Once the issue was identified, it was promptly fixed by our Site Reliability Engineers and by now it has been mitigated.”
Winners and Losers
Technically, there weren’t really any winners or losers in this scenario – as it was a glitch – but it was certainly interesting to see which sites temporarily ranked higher.
According to Marie Haynes, sites that experienced a boost in traffic included medical articles with no supporting references or links to authoritative sites; these articles promoted natural remedies that were not backed by scientific evidence. Haynes also commented that many of the sites that started ranking highly had unnatural links.
But on August 15th, it appeared the glitch trauma was not over. Chatter picked up in the SEO community following reports of significant changes in the SERPs, similar to those seen during the initial glitch. By the 16th, things seemed to have calmed down, leading to speculation of another issue on Google’s side.
The fluctuations were captured by our volatility checker, which was off the charts throughout the 15th.
If this is another glitch, this mirrors the pattern that’s been forming recently when it comes to indexing. Speculation was rife in May with more indexing issues, although Google’s John Mueller stated that there wasn’t a bug, at most a few minor glitches. There were also reports in June, when Google experienced more problems. Furthermore, throughout 2019, there were many problems with content being de-indexed, and at one point, new content wasn’t being indexed at all.
If your website has been affected by an algorithm update and you’re struggling to find a solution, get in touch with us at Fibre Marketing, and we’ll help you get back on track.
Google is constantly making updates – thousands a year – although the more extensive core updates understandably play a greater role in how search results are generated. On 4 May 2020, Google announced that they were rolling out a core algorithm update, the second of its kind this year.
This is a broad update which will affect everyone regardless of location, industry or language – but what does it mean for your business? Part of our work at Fibre is to properly understand exactly how these updates affect our clients, and how they can position themselves to stay ahead. We’re here to answer a few questions about the upcoming changes.
What are core updates essentially for?
Though the hundreds of smaller updates throughout the year go by largely unnoticed, core updates cause changes that can be very noticeable internationally, and they are likely to have obvious effects on the way your business shows up in search results.
This may not be a bad thing however – at Fibre we’ve noticed that some sectors are more negatively impacted while others will actually see a gain in their search rankings. Updates generally aim to improve content relevancy for users, but this does mean that businesses need to routinely reappraise their SEO strategy and be aware of how any updates affect their visibility.
How long does the update take to have effect?
Though the rollout is currently live, it’s likely still too early for businesses to notice any major effects. Typically, updates of this size take around a week or two to become fully established. (Google’s first core update of this year was in January.)
What kind of content will fare better after the update?
Google’s algorithms are designed to reward content that is more likely to satisfy users’ queries. If your content has dropped in relevancy since the last core update, its new search ranking will reflect this. On the other hand, some content will in fact perform better. There are no hard and fast rules – every business needs to do a careful assessment of pre-existing content to find out what’s working and what isn’t.
Wouldn’t the Coronavirus prevent Google’s updates?
The straightforward answer is no. Google have confirmed that scheduled updates will take place regardless. This could be worrisome, especially for those businesses who are already experiencing some volatility. Though some are choosing to essentially pause during lockdown and hold off on making strategic changes to their search marketing plan, it’s worth remembering that the new algorithms have not paused, and any losses experienced during this period may be difficult to recoup after lockdown lifts.
The unavoidable truth is that the Covid-19 pandemic has significantly altered not only consumer behaviour but also online search behaviour, with Google reporting, for example, that searches related to the virus have been the single most popular topic in the search engine’s history. Even casual users will have noticed the many changes Google has made to provide accurate information and data, as well as prioritise certain helpful content in the public interest.
Thus, Google’s update has rolled out as planned, but the criteria for ‘relevant’ content are greatly shaped by the needs of a world ravaged by the Coronavirus. The May core update may prove to be disruptive primarily because people’s searching behaviour is vastly different from what it was just six months ago. Some unexpected content topics are gaining in relevance, while previously high-performing ones will take a knock.
What you can do
In Google’s constant attempts to refine their search algorithms, businesses are forced to respond by increasing the quality of their sites. Though the changes can seem intimidating, many of the old rules still apply: businesses should always seek to be an authoritative and trustworthy source of truly valuable information, and reassess continually to make sure they’re hitting the mark.
A flexible SEO team can help you narrow down an intelligent strategy that exploits rising trends – such as increasing mobile use, the decline of text-rich content in favour of other formats, the rise of Google My Business and a growing focus on a more local perspective.