The Importance of Disavowing Links In 2019

Disavowing links has been a hotly debated topic within SEO circles for many years now.

Many SEO experts are unsure whether they still need to submit disavows for low-quality links, or whether this practice could actually have a negative impact on their website ranking, as Google suggests.

Regardless of the debate, Google's algorithms still use link quality as a way to rank websites, and Google still hands out penalties if a site doesn't comply with its tight guidelines.

This means that in 2019 it's just as important as ever to audit your links and file a disavow with Google when low-quality links are negatively impacting your website ranking.

In this article, we’re going to look closer at the debate in the SEO industry, remind ourselves what manual actions are, and understand which sites we should still disavow.

Why the debate?

To clear up the debate, we need to travel back in time to those days when Britney Spears shaved her head, 'to google' was officially declared a verb and MySpace was the best thing since sliced bread. We're talking about the early days of the internet as we know it – from the late 90s through the 2000s.

During that time, SEO was still in its infancy and people did what they could to get their webpages to rank highly. This involved a lot of spammy practices such as keyword stuffing and hidden text on the website.

PageRank

Google had always set itself apart from the competition by focussing on link quality, and PageRank was the result. The thinking was that if lots of sites linked to a website, it must be of high quality. But people simply added their details to directories, left comments on websites and paid for links, so the algorithm didn't work as effectively as hoped.
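
To see why this was so easy to game, it helps to look at the original published PageRank formula (simplified here):

    PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1…Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor (around 0.85). Every additional inbound link adds another term to the sum, which is why scattering links across directories and comment sections could directly inflate a page's score.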

The Penguin algorithm & the disavow tool

When the Penguin algorithm update went live in April 2012, Google could suddenly penalise your website for having low-quality or spammy links. Linking practices changed completely and many websites found that their traffic sharply declined overnight. The only way to recover was by going in and removing these links.

To help webmasters recover and remove these links, Google created the disavow tool. By uploading a text file, you could ask Google to stop counting the links of your choice, so that your web traffic could recover.
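
The disavow file itself is just a plain text (.txt) file, with one URL or domain per line and comments marked with a hash. A minimal sketch (the domains below are made up for illustration):

    # Removal requests sent to these webmasters went unanswered
    # Disavow an entire domain:
    domain:spammy-directory-example.com
    # Disavow a single page:
    http://link-farm-example.net/widgets/bad-page.html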

Penguin 4.0

'Problem solved!' you're probably thinking. But the case isn't quite so straightforward, because in September 2016, Penguin 4.0 was released. This meant that Google would no longer penalise your site for low-quality links but would instead devalue them.

According to Google, you wouldn't need to submit a disavow unless you'd actually been issued a manual action (more on that below) or you were actively trying to prevent one.

Basically, if you hadn’t suffered from problems arising from your links and you didn’t have cause for concern, there wouldn’t be any need to do anything.

Google’s current position on disavowing links

In the years since, Google has stated that disavows don't help websites, but there has always been some confusion as to whether that's correct.

But in a Google Help Hangout this year, John Mueller from Google stated that disavowing can actually help some websites, especially when there are ‘bad links’ that haven’t resulted in a manual action.

…So, it’s something where our algorithms when we look at it and they see, oh, there are a bunch of really bad links here. Then maybe they’ll be a bit more cautious with regards to the links in general for the website. So, if you clean that up, then the algorithms look at it and say, oh, there’s– there’s kind of– it’s OK. It’s not bad.

To clarify, this means that it’s still worth disavowing your links, even if you haven’t received a manual action yet. Unnatural links can still influence the trust that Google places in your site and affect your ranking.

Having said that, Google still isn't keen on us using the disavow tool and has ensured that it's hard to find in Google Search Console. The thinking is that people are disavowing too many links, and according to Gary Illyes of Google, "If you don't know what you're doing, you can shoot yourself in the foot."

What is a manual action?

Much of the debate centres on the idea of a manual action, so it's worth quickly recapping what that actually is.

As the name suggests, a manual action is taken by a Google staff member and lets you know that Google isn’t happy with some of your content. It tells you that it will omit or demote certain pages of your website from its search results.

The content and pages that can be penalised in this way include:

  • Unnatural links to your site: Just like it says on the tin, this refers to both inbound and outbound links which don’t seem quite right.
  • Thin content: This includes content created purely to promote, auto-generated content and guest posts that add no value.
  • Hidden text and keyword stuffing: Keywords and text that appear too often and aren't always visible to the reader.
  • Cloaking and suspicious redirects: Hidden content and conditional redirects fall into this category.
  • User-generated spam: Spammy blog comments, forum posts and user profiles.
  • Spammy freehosts: If your website is hosted with other spammy websites, Google might paint you with the same brush.
  • Spammy structured markup: Does your markup match your site? If not, Google is likely to penalise you.

You can find out if you are subject to a manual action by going to Google Search Console and looking under ‘manual actions’.

Should you disavow your links?

This is a difficult question to answer. It all depends on just how bad those links are and whether they are having a significant impact on your organic search traffic.

Disavowing should always be done with care, and only once you’ve done everything else you can to remove the link manually.

It's a serious action to take and, as Google say, 'This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site's performance in Google's search results.'

In essence, if you use the disavow tool incorrectly, you could harm your SEO.

If you discover spammy, artificial, or low-quality links pointing back to your site, or you have received a manual action, you should first reach out to the owner of the website in question and ask for the link to be removed.

Usually this is as simple as visiting the 'contact us' page on their website or searching for them on social media. Then you can send them a polite email asking them to remove the link. Only then should you consider disavowing your links.

What about when there’s not a manual action against you?

It gets trickier still if you don’t have a manual action against you, yet you’re still concerned about the quality of your links.

The best way to start is by conducting a link audit that can shed light on the situation. From there, you can go through the links to check their quality.

Don't worry too much about the random low-quality links that often appear, because Google can happily ignore them. Google will only issue a manual action for low-quality links that you or your SEO team are responsible for.

Instead focus on those linking practices which violate Google’s terms.

This includes things like:

  • Paid links or link schemes
  • Paid articles containing links
  • Publishing articles containing links to other sites (often through guest posting)
  • Product reviews and links which offer free products
  • Excessive use of reciprocal linking
  • Widgets that require linking
  • A high number of suspicious anchor texts

It’s also worth considering non-editorial links such as:

  • Malware
  • Cloaked sites (these show Google one set of results but the user a different set)
  • Suspicious 404s
  • Pills, poker and porn

You can often use your common sense to decide how bad a link is and whether to disavow. If there’s a borderline case and it seems like a matter of time before you receive a manual action, it might be worth disavowing anyway.

Generally speaking, if these links only point to pages that you don't care about, that don't generate revenue for your business, and that don't negatively impact your organic search traffic and rankings as a whole, then you might not need to worry. You could simply delete the page altogether and move on.

Conclusion: Disavowing links and the future of Google algorithms

Maintaining high quality links and disavowing those which are low quality remains just as important in 2019 as it ever has.

The continual advancement of algorithms such as Penguin 4.0 can minimise the problem, but there will always be cases which slip through the net and negatively affect your website ranking.

The key remains to focus on creating high-quality outbound links on your website, regularly monitoring the quality of incoming links via Google Search Console or your SEO team, and disavowing those which are harming your online presence.

How To Recover From The Google Medic Update

What Was The Google Medic Update?

The Google Medic update was one of the largest core algorithm updates in recent memory. It began rolling out on 1st August 2018 and was confirmed by Google on 2nd August.

Who Was Most Affected by the Medic Update?

Analysis of the impacted websites showed that Healthcare and Ecommerce were the most affected areas.

[Pie chart from SERoundtable: breakdown of the most affected industries]

Why Were These Sites Affected?

The Google Medic update saw E-A-T rise in importance on YMYL sites. Both E-A-T and YMYL are acronyms that had appeared in Google's Quality Rater Guidelines for quite some time prior to the update.

YMYL sites that displayed low levels of E-A-T were affected the most, as Google deemed them low quality and potentially dangerous to the website visitor.

What Is EAT?

E-A-T stands for "Expertise, Authoritativeness, Trustworthiness", and it's the yardstick by which Google's quality evaluators rate pages.

Google hires evaluators to analyse content and rate it on a scale from "Lowest" to "Highest" quality. The evaluators help Google understand how the algorithm updates are performing, but they don't have direct input into the algorithm itself.

Though the evaluator guidelines document was created to help with quality rating, it’s an excellent source to better understand how Google defines quality.

Expertise – You need to be an expert in your field. This means demonstrating the expertise of the creator of the Main Content (MC) and mentioning it within your text.

Authoritativeness – You need to show that you, or the creator of the main content, are an authority – this comes from the expertise of your writers or organisation.

Trustworthiness – You need to show users they can trust the creator or company of the main content and the website.

What Is YMYL?

YMYL stands for 'Your Money or Your Life' and refers to queries that involve your money and finances, as well as your life, health and well-being. The reason these queries are important for Google to tighten up on is that they can strongly affect a person's life.

If non-authoritative and non-factual sites are prominent for these kinds of queries and are giving bad advice, that could be dangerous to the well-being or financial state of the searcher.

What’s The Relationship Between EAT & YMYL?

High-ranking YMYL pages will show a high level of E-A-T. That’s because the safer a user feels while visiting a page, and the more the content meets their search query, the more it will meet the needs of E-A-T. Sites that are genuinely offering helpful advice or a solution to a problem will meet these needs more readily than sites that try to game Google’s system.

Google defines pages that “could potentially impact the future happiness, health, financial stability, or safety of users” as “Your Money or Your Life” pages, and these include:

  • Shopping or financial transaction pages
  • Financial information pages
  • Medical information pages
  • Legal information pages
  • News articles or public/official information pages

What Do Google's Quality Rater Guidelines Say About These Sites?

“High E-A-T medical advice should be written or produced by people or organizations with appropriate medical expertise or accreditation. High E-A-T medical advice or information should be written or produced in a professional style and should be edited, reviewed, and updated on a regular basis.

We will consider content to be low quality if it is created without adequate time, effort, expertise, or talent/skill.” 

By rolling out the Medic update, Google decided to reduce the organic visibility of domains that are deemed to be offering services or selling products that it deems unsubstantiated by the necessary bodies — be that official government institutions, medical associations, charities for illness/disease/medical conditions, or external reviewers (including customers and industry experts) — or that are publishing content written by unqualified professionals.

What Is Google Looking For?

  1. Enough main content (MC): content should be ample enough to satisfy the needs of a user for a page’s unique topic and purpose (broad topics require more information than narrow topics, for example)
  2. The page and its associated content is expert, authoritative, and trustworthy for the topic they discuss
  3. The website has a positive reputation for its page topics
  4. The website features enough auxiliary information, for example, “About Us,” “Contact,” “Customer Service” information
  5. The website features supplementary content (SC) that enhances the user’s enjoyment and experience of a web page
  6. The page is designed in a functional fashion that allows users to easily locate the information they want
  7. The website is maintained, edited regularly and frequently
  8. Is your website easy to use on mobile?
  9. Is your website easy to navigate?
  10. Is the main content easy to read?
  11. Are there too many ads interrupting the reading and flow of content?
  12. Does the page take too long to load?
  13. Is your content original?
  14. Is your website's NAP (name, address, phone number) consistent throughout the web?
  15. Does your organisation have a positive reputation across the web?

How Do We Achieve This?

This process needs to be applied across your core ranking and key performance pages – supplementary pages on your site won't require such extensive attention to detail or E-A-T, as they aren't used to generate convertible organic traffic.

Always remember to write for the user first, not the search engine. By doing this, you commit to creating credible, accurate and trustworthy information that is displayed in a safe environment.

Please see our checklist on how to achieve this below:

Expertise

You need to be an expert in your field. This means demonstrating the expertise of the creator of the Main Content (MC) and mentioning it in your content.

Points to consider:

  • Can you demonstrate why you are in a position to offer advice?
  • What experience can show expertise for this topic?
  • Write in your brand’s voice, directed towards your audience
  • Make sure you include accurate statistics and attribute the correct sources
  • Make sure your content is updated and contains accurate facts
  • Review content so that there is a greater chance of publishing the most accurate content possible

Authoritativeness

You need to show that you, or the creator of the main content, are an authority – this comes from the expertise of your writers or organisation.

Points to consider:

  • Why are you credible?
  • Can you demonstrate accolades, awards, industry recognition for the topic in question?
  • Credentials are important, but so are personal experiences
  • If you can’t demonstrate authority on the chosen subject, cite your research
  • Do not display business advice without a credible source or author
  • Customer reviews/case studies will help demonstrate reputation, experience and authority

Trustworthiness

You need to show users they can trust the creator or company of the main content and the web site. This is especially important for websites that could directly impact a searcher’s life and/or finances.

Points to consider:

  • Do not keyword stuff your content
  • Content should be written to help users, not to chase search rankings
  • Your content should either help or teach your users something on a topic related to your business
  • Adding links to other related articles will increase trust on the chosen subject

Google SERPs: A Potential New Layout?

Google makes changes to its core search systems multiple times every month, and the pace seems to increase each year. In 2018, Google stated that it had made over 3,000 improvements to search, compared to 2009, when only 350–400 changes were reported. Some of these are hardly noticeable, while others have a significant impact on search engine results and SEO rankings, which is why it is vital that any algorithm update Google releases is closely monitored.

Here at Fibre Marketing, we track these search changes and help our clients beat the updates to improve their rankings.

Recently (28th September 2019), it became apparent that Google was testing a new search results page design for desktop – something that caught our attention.

With this particular test, Google has added several additional options to the areas at the right and left of the search results that were previously blank white space. The left side now offers more search filters, while the right side has related search options that allow users to expand their search and look at related search result pages.

Types of searches this will apply to

It appears that the new search results design update will only be available on certain types of searches. This may include searches for:

  • Songs
  • Games
  • News
  • Video

So far, it seems that the new design cannot be replicated and may not apply to searches for movies, books, artists, or bands, according to Barry Schwartz and Adarsh Verma, who reported the test – although this could of course change at any time in the near future.

Impact on search results

As with all Google updates, this has left people questioning what effect the changes will have on search results.

Decrease In Click-Through-Rate?

Firstly, it has the potential to lower click-through rates, as users have more options when searching for answers and information online. This also includes YouTube searches, as videos from the platform are integrated into the results page here, although it is currently unclear whether the video will play within the SERPs or open in a new tab on YouTube itself.

On the other hand, music-sharing platforms such as Spotify will clearly benefit from this change, as they are linked directly underneath the video, above the fold.

Of course, all of the above features will likely see search results themselves pushed further down the SERPs than they already are. Google tailors the design of their results pages to the user in order to optimise their search experience. This means presenting the answers within the SERPs themselves, as shown by this potential design, which will likely result in an increase in zero-click searches.

In June 2019, 49% of all Google searches ended without a click. Google has now become a competitor within a variety of sectors, including hotels, flights and song lyrics, which has left many website owners in a panic as they watch their organic traffic decrease. This potential search design is unlikely to help the situation.

A UX-Based Design

The changes could result in quicker searches, as users will be able to locate the information they are looking for more efficiently. Google is constantly looking for ways to improve the user experience, and these changes to the results page design could make search results more UX-focused, offering shortcuts straight to what users need – particularly for media searches.

A Tough Challenge For Organic Search Results

However, this new SERP design will push the organic results further down the page – a continuing trend with every new search update. Over the years, Google has added a staggering number of features to its results pages – 39 overall, according to Paige Hobart's talk at BrightonSEO – from featured snippets to map packs and knowledge cards. And then there are the Ads.

If your site currently ranks in the top position, this does not necessarily mean that your listing will be above the fold. Site owners have therefore adapted their strategies over the years, creating more user-friendly content that can appear in featured snippets, and implementing schema to take up more space in the results. It is not clear from this recent test how featured snippets will show up – beneath the media results, or above them.

Regardless of this, if Google does go ahead with the proposed search design, site owners will continue to watch their organic click-throughs drop.

Final Insights

So far, the only real information we have is that Google has made changes to its search results page design. The full effect of these changes on search results is still unclear, but it is likely to create a more UX-based experience that could decrease organic traffic to your website – although it is too early to say for sure. Currently, it appears that these changes only apply to a limited number of search types including songs and games, and it is unknown whether the changes will apply to other search types at a later date.

All You Need To Know About The Google Update To Reviews Rich Results

Making your business stand out online can be tough with all the competition out there, so it's vital that your listings stand out in search results. Rich results, created by schema markup, have long been one of the best and most popular ways to do this, as they allow prices, dates, star ratings and more to show underneath your meta title. These features can work wonders for your organic performance, as over 80% of local business consumers trust online reviews as much as personal recommendations.

However, on 16th September 2019, Google implemented changes to its reviews rich results policies and procedures, affecting how it shows review rich results and rating stars. The overall aim of this change is to improve rich results for search users, as well as to address the abusive implementations (e.g. 'self-serving' reviews) that have occurred over the years.

All website developers should strive to understand these new changes to ensure that their knowledge is up to date and relevant.

What Are Reviews Rich Results?

Reviews rich results are those results which show at the top of the Google Search Results, displaying star ratings based on the reviews and ratings of a product, service, or production collected by a well-reputed and established website. There are many different types of products and services for which a review can be left, including books, events, guides, local businesses and establishments, software and applications, recipes, and other similar features.

[Image: an example of a reviews rich result]

The Google Update

The changes that Google have released are, at their very simplest, designed to limit the number of reviews rich results that can be shown; notably, self-serving reviews are no longer allowed.

The schema types that are still eligible for review rich results are:

  • Book
  • Course
  • Creative work season
  • Creative work series
  • Episode
  • Event
  • Game
  • How to
  • Local business
  • Media object
  • Movie
  • Music playlist
  • Music recording
  • Organization
  • Product
  • Recipe
  • Software application

Clarification

Google's primary goal with the new regulations was to prevent businesses from self-promoting their own material, content, services, and the like – in other words, creating 'self-serving' reviews. For example, reviews about business A that have been posted on business A's own website will no longer feature as a reviews rich result; only reviews made by unbiased third parties will be considered.

A more extreme case would be a search result whose review markup shows something like 5,000+ reviews when the business realistically has 100. These extra reviews have been generated by the business itself, and therefore do not count.

This update is therefore intended to protect the integrity of the content and to ensure that the results are as relevant and useful to the searcher as possible; biased self-reviews, unsurprisingly, do not meet this requirement.

This algorithmic update was met with confusion amongst the SEO community, so the team at Google Webmasters updated their blog to clarify the regulations. They stated that, essentially, you can't review your own local business and then host that review on your own website.

But Why Do We Care?

You might be wondering why you need to worry about these new changes. While they aren't necessarily all that consequential, they are worth considering, as they could impact a page's ability to show its own star rating. Everyone should therefore review and analyse the new changes to ensure that their markup meets the regulations, to avoid their rich results being dropped.

In addition, for people who make use of these reviews rich results when making a search, the new changes could also be beneficial. The changes are heavily based on the idea of making it easier for people to use the system without having to worry about reviews from biased sources. This means that search users can feel confident in the quality of the reviews that they are finding for the products or services that they are searching for, thus receiving a better user experience from Google.

How should you respond to rich results review display limitations?

Even though this update is only a small change, it may have an indirect impact on a site’s ranking as these reviews can affect organic performance. Therefore, if your website uses review rich results, you should strive to understand the new changes.

Elimination of self-serving reviews

The changes to self-serving reviews now apply to entities that are a local business or organization, or anything in between. The same is true for third-party reviews – such as a TripAdvisor review – that are embedded into a business's website. Already, a large number of sites have lost their review rich results, as reported by numerous tools including Mozcast (35.8% rich results, down from 39.2%) and SEMrush (47.6%, down from 52%) – statistics dated two days after Google's announcement.

It should be noted that there will not be any penalties for businesses who still display these self-serving reviews; rather, the case will simply be that the snippet won’t appear in the Google Search results.

Mandatory "name" property in reviews rich results

In addition to this, Google have also announced changes to the way you name your reviews. For a review to be eligible to show as a reviews rich result, the markup must now include the name of the product or service being reviewed – failing to do so will mean the review cannot appear as a reviews rich result.
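
As a minimal sketch of what this means in practice (the product and reviewer below are made up for illustration), a review marked up with schema.org JSON-LD now needs the "name" property on the item being reviewed:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "Review",
      "itemReviewed": {
        "@type": "Product",
        "name": "Example Widget"
      },
      "reviewRating": {
        "@type": "Rating",
        "ratingValue": "4"
      },
      "author": {
        "@type": "Person",
        "name": "Jane Smith"
      }
    }
    </script>

Without the "name" field inside itemReviewed, the review would no longer be eligible to show as a rich result.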

Final Thoughts

If you're deeply involved with SEO, you'll know that review rich results are not a ranking factor, so this update won't directly affect your site's position. But adding this markup has taken websites a long way, as it can affect the number of sales and users – information that search engines use when selecting the best results for the search user. So this small change will likely have a significant impact on the SEO community and on search results, as we've already started to see. It is therefore vital that you get to grips with the latest regulations to ensure that your organic performance isn't affected a great deal.

Author Authority in Search: Does it Matter to Google?

The ability of author reputation to influence search rankings has been an ongoing dispute for many years. Google Authorship was a feature that appeared in Google search results for around three years, from June 2011 until August 2014. The feature allowed and encouraged content creators to identify themselves when posting a piece of content – be it a blog post, article or other type of web-based copy – by displaying a profile image and linking to their Google Plus account. Theoretically, this aimed to help authors stand out in the SERP and bolster their click-through rate. The Google Authorship markup fell under Google's E-A-T (Expertise, Authoritativeness, and Trustworthiness) umbrella, whereby the verification of authors on the internet would improve users' overall search experience.

Unfortunately, the Google Authorship initiative was retired in August 2014, when Google removed all author photos in its mission to better marry the user experience across mobile and desktop search – which involved decluttering the search page. Google had also established that participation in the authorship markup was extremely low (almost non-existent in many verticals), with searchers receiving little to no value from the addition. It was reported that when the markup was removed, there was little difference in 'click behaviour' on the search results page compared to when the authorship feature was in play.

However, conversations around author authority in the SEO space have since crept back into the headlines. In July, there was an interesting discussion on Twitter surrounding the weight of author authority in the health industry. Google's John Mueller referenced YMYL sites as an example of why authorship is a necessary factor to consider when publishing or reading sensitive content, such as online medical advice. He mentioned that if you are writing about a topic on health and you're not an expert in that field, then you're already starting off on a "shaky foundation." He added that it makes sense for writers to find experts to write or review the content so that it is "correct and trustworthy." Despite this, there is still little evidence to suggest that content authorship is a ranking factor in Google search. However, this doesn't necessarily mean that it isn't an important element. In fact, many people believe that author authority has a number of benefits. With this in mind, let's look at some of the evidence for and against the importance of author authority in Google search.

Considerations for content authorship

There was renewed interest in the impact of author authority on Google search following Google's Search Quality Raters Guidelines (SQRG) update back in July 2018. Under these guidelines, web pages could appear higher in the search results if they rank highly in Google's three attributes of content quality – Expertise, Authority, and Trustworthiness. An addition to the July 2018 update was the inclusion of content creators as part of the measure of content quality. By implication, the reputation and expertise of the author could still be an important component of the overall E-A-T rating.

What’s more (while this has not been confirmed), should Google want to identify and evaluate authors on the web according to the E-A-T specification, using something like Machine-Readable Entity IDs (MREIDs) would be essential. Google is constantly looking for ways to enhance the user experience, so it makes sense that at some point Google will begin to look for signs of authoritative and reputable authorship on content pages. For that reason, it’s advisable that publishers only accept content from creators with good reputations who have experience in the specified field. Publishers should also give preference to content creators who have a clear, positive presence online i.e. creators with active social media accounts.

The authorship veto

Back in August 2014, Google removed the ability for publishers to display the author's name, photograph and the number of Google Plus circles the author had been added to. Once the Google Authorship feature was removed, many marketers and publishers no longer recognised the importance of content authorship. What's more, as mentioned above, Google has publicly declared that content authorship is not a ranking factor. According to information on searchenginejournal.com – "Google's John Mueller has clarified that the search engine's algorithms do not look at author reputation when ranking websites." For that reason, many businesses don't consider it an important factor in their marketing efforts.

Final thoughts

There is little hard evidence on either side of the author authority debate. However, this doesn't mean that it's not an important factor to consider in your marketing efforts – there is certainly no harm in attaching a quality, transparent and accountable author reputation to your content. It's also important to keep in mind that Google's Search Quality Rater Guidelines could be updated at any point in the near future, perhaps giving author authority more weight.

The Untapped Potential of Google Image Search

When it comes to SEO, most people automatically think of words. Keywords, meta descriptions and links are all important parts of SEO, but one area which is often either overlooked or not given much attention is images.

In fact, images can have a massive effect on your SEO – Google Image Search plays an important role in the world today. As everything becomes more visual, images, infographics and the like are increasingly important and can be a valuable asset in terms of SEO.

Understanding how Google Image Search can help your SEO is important in today's world, and there are a number of things you can do to tap into its power.

Optimising your current images

Before you start anything else, it is important to make sure that your current images are optimised for search engines like Google. This basically means giving each image the best chance of appearing high in the Google Images rankings.

The aim, again, is to have your images come out at the top of the Google Images rankings – and image results show many more results per page than text searches do.

  • Edit your file names to something that is relevant to the picture (instead of a file name such as DSC_052.jpg as assigned by the camera) – see the sketch after this list.
  • Edit your alt tags to something rich in keywords and as short as possible. Alt tags can help with SEO, but their main purpose is to help people who are visually impaired, so keep this in mind when you are choosing the words.
  • Compress your images. The time a page takes to load is vital for SEO and for keeping visitors. A large image file can take a long time to load, put people off and, ultimately, push up your bounce rate. Compressed images take up less memory, load quicker and improve your SEO.
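
As a quick sketch of the first two points (the file name and alt text below are made up for illustration), an optimised image might be marked up like this:

    <!-- Descriptive file name plus short, keyword-rich alt text -->
    <img src="red-trek-mountain-bike.jpg"
         alt="Red Trek mountain bike leaning against a garden fence">

rather than:

    <img src="DSC_052.jpg" alt="">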

Avoid Stock Photos

Stock photos are great because they’re free. But that’s about where their usefulness ends, as far as SEO is concerned anyway.

Stock photography usually looks low quality and doesn't do much for your own specific branding. It can be used sometimes, but definitely not all of the time!

Using your own images can be vital to good image SEO – in a similar way to other content. If you have an image that is unique to your website, it will always link back to your website and not to the hundreds of others that have used the same picture. People are also likely to reuse your image and include a backlink.

With Google Images, the original image will typically appear first in the search results, so it is worth the effort of getting your own images on there.

Content management involves ensuring that you have the best, most effective images, which can help to attract as much traffic to your website as possible. And this involves using unique images which are specific to your website.

A Reason for your Images

If you are putting an image onto a page, don't put it there just for the sake of it. Make sure that it is there for a reason. Just as with any other SEO content marketing strategy, irrelevant and unconnected images will only increase your bounce rate.

You should also keep this in mind when you are labelling your images. Make sure that you label them in a way that connects them with your content. If, for example, you have a blog about cacti, make sure that you label your image accordingly and not as 'Mexican landscape'. People who are looking for pictures of landscapes in Mexico might not want to learn about the inner workings of a cactus.

Infographics are an excellent way of attracting relevant traffic to your website, and, as long as you are making them, they can be both completely unique to you and informative.

Image Sitemap

Sitemaps are useful to search engines like Google as they help tell them which images are present on your website and where. This means that all of the images you want to be found are found quickly.

WordPress with the Yoast plugin automatically adds images to your sitemap, or you can create your own image sitemap for your website.
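
For illustration, a minimal image sitemap using Google's image sitemap extension (with made-up URLs) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/cactus-care-guide</loc>
        <image:image>
          <image:loc>https://www.example.com/images/saguaro-cactus.jpg</image:loc>
        </image:image>
      </url>
    </urlset>

Each <url> entry lists a page and the images on it, so crawlers can discover images they might otherwise miss.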

Understanding the power that images have in terms of SEO is important and, judging by current online trends, it is only likely to become more important. By getting your image SEO right, you can add to the other work that keeps your digital marketing efforts effective, and see the results not only in an SEO audit but also in the amount of traffic coming to your website.

5 Biggest Observations From Google’s Latest Cheat Sheet

Many companies find that their site’s search rankings are negatively affected following Google’s many algorithm updates. And, seeing how many changes are made throughout the year, it can be frustrating when you can’t see a way to regain any losses.

Fortunately, for the first time Google has released a post giving advice on how companies can take advantage of new core updates to improve their SEO rankings. Here are the key things you should be aware of following Google’s advice on core updates.

1. There is nothing to fix

Google has advised that there is nothing you can do to fix a site if you see a decline in search rankings following an update. “We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.” They then added – “There’s nothing wrong with pages that may perform less well in a core update.”

Many companies are therefore questioning how they can improve their site's search rankings and what they should do following core updates. Google said that what has changed is simply how it assesses the value of content. Companies should instead focus on producing the highest-quality content possible. Google advised – "We suggest focusing on ensuring you're offering the best content you can. That's what our algorithms seek to reward."

2. Content should be written for the user/audience

While Google have reiterated that there is no quick fix following updates, they have given advice on how to improve the quality of your content. Google has released a list of questions alongside their blog post that companies should ask themselves if their site has been negatively affected by the core update. This includes things such as:

  • Does the content provide original information, reporting, research or analysis?
  • Was the content produced well, or does it appear sloppy or hastily produced?
  • Is the content free from spelling or stylistic issues?
  • Does the headline and/or page title provide a descriptive, helpful summary of the content?
  • Does the content provide a substantial, complete or comprehensive description of the topic?

Recent updates prove that writing for the user/audience is the most effective way to write quality content and will be favoured by the Google algorithms. Their advice on assessing content asks questions like “Is this the sort of page you’d want to bookmark, share with a friend, or recommend?” and “Does the content present information in a way that makes you want to trust it?” The emphasis is on the reader and their opinion on the quality and value of the content.

3. Content must focus on E-A-T

To improve search rankings, businesses must adapt their content to meet the new guidelines. One of the simplest ways to produce quality content is by focusing on E-A-T, which stands for Expertise, Authoritativeness and Trustworthiness. Companies can read the search quality raters guidelines to get further advice on how to improve the overall quality and effectiveness of their content. According to Google – “Reading the guidelines may help you assess how your content is doing from an E-A-T perspective and improvements to consider.” They added “If you understand how raters learn to assess good content, that might help you improve your own content. In turn, you might perhaps do better in Search.”

The list of questions that Google published in their blog include a section on E-A-T and what you should ask when writing content for your site.

4. There are no further updates confirmed

The post was not confirmation of another update. Google confirmed their June core update, but since then no other rumoured updates have been confirmed. However, companies should be aware that unannounced updates occur on a regular basis. Google said – "We are constantly making updates to our search algorithms, including smaller core updates." They added, "we don't announce all of these because they're generally not widely noticeable."

5. Google wants to improve the user experience

It is clear that Google’s ultimate aim in introducing the core updates is to improve the user experience as much as possible. In summary, companies should focus on building high-quality websites that offer users high-quality, trustworthy content which has been produced in line with the search quality raters guidelines and E-A-T.

All You Need to Know About The robots.txt File

If you know a little about SEO then you might have heard of the robots.txt file. It can be a useful tool for anyone with a website that hopes to attract visitors through search engines. The robots.txt file essentially tells search engines where they can and can't go on your website, meaning that more crawl time is spent on the useful pages.

The robots.txt file is also known as the "Robots Exclusion Protocol." Some people put a 'noindex' rule inside their robots.txt file, but this is all about to change: Google has announced that from September 1st, 2019, it will no longer support robots.txt files that contain a noindex directive.

Google is changing the rules to keep the ecosystem as healthy as possible and to prepare for future open-source releases. This is a complex issue, and one that businesses with websites need to respond to. To do so, it is important to fully understand what the robots.txt file is and its role in the SEO space.

The robots.txt File in Action

Search engines like Google have web crawlers – known as spiders – which look at millions of websites every day, reading information that helps them decide which websites to put at the top of their search results.

If there are pages that you don't want Google to crawl, you can put rules in a robots.txt file. This can be dangerous, however, as you can accidentally prevent Google from crawling your entire site – so pay close attention to what you are blocking.
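
As a minimal sketch (the paths below are made up for illustration), a robots.txt file sits at the root of your domain and looks like this:

    # Rules for all crawlers
    User-agent: *
    # Keep crawlers out of these sections
    Disallow: /admin/
    Disallow: /staging/
    # Beware: "Disallow: /" on its own would block the entire site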

Putting these rules in place will also prevent any links on the blocked pages from being followed.

It is important to note, however, that although you are blocking the robots, the page can still be indexed by Google. This means it can still appear in the search results – albeit without any details. If you don't want your page to be indexed by Google, you must use the 'noindex' rule instead.
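
The 'noindex' rule lives in the page itself rather than in robots.txt – typically as a meta tag in the page's <head>:

    <meta name="robots" content="noindex">

Note that a crawler has to be able to reach the page to see this tag, which is why combining it with a robots.txt block defeats the purpose.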

Why is it Important?

The main reason robots.txt files are important and useful to a website is that they can stop search engines from wasting their resources on pages that offer little value, leaving more capacity for crawling the pages with the most useful information.

They can also be useful for blocking non-public pages on your website, such as log-in pages or a staging version of a page.

They can also be used to prevent the crawling of multimedia resources such as images and PDFs.

The robots.txt file can also be useful if you have duplicate pages on your website. Without it, you might find that the search engine serves up duplicate results, which can be harmful to your website's SEO.

The Pros of Using the robots.txt File

Most crawlers have a pre-determined number of pages that they can crawl, or at least a certain amount of resource that they can spend on each website (often referred to as a 'crawl budget'). This is why it is important to be able to block certain sections of your website from being crawled, allowing the robots to spend their 'allowance' only on sections that are useful.

There are some instances when using robots.txt files can be useful, including:

  • Preventing duplicate content
  • Keeping sections of your website for private use only
  • Preventing certain files on your website from being indexed e.g. images and PDFs
  • Specifying crawl delays to prevent your servers being overloaded

The Cons of Using the robots.txt File

Blocking your page with a robots.txt file won't stop it from being indexed by search engines, meaning it won't necessarily be removed from the search results. The 'noindex' tag is what prevents a search engine from indexing your page – but remember, if you have blocked the page in robots.txt, the robots won't actually see the 'noindex' tag, so it won't be effective.

Another potential issue with the robots.txt file is that if you block the robot, none of the links on that page can pass value, because the crawler never follows them from one page or site to another.

The Consequences

Google's decision to no longer support the 'noindex' directive in robots.txt files has a number of consequences for brands.

The main concern is making sure that, if you have relied on the robots.txt file previously, you have an alternative to fall back on.

If you are using 'noindex' in a robots.txt file, you should look for alternatives and set these in place before the deadline of 1st September. These include:

  • Noindex in robots meta tags (also available as an HTTP header – see the note after this list)
  • 404 and 410 HTTP status codes
  • Use password protection to hide a page from search engines
  • Disallow in robots.txt
  • Search Console Remove URL tool
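
The meta-tag version of 'noindex' was sketched earlier. For non-HTML resources such as PDFs, where a meta tag isn't possible, the same rule can be sent as an HTTP response header instead – a minimal sketch (exactly how you configure it depends on your server):

    X-Robots-Tag: noindex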

Here at Fibre, we know that the majority (91%) of people who use a search engine use Google, which is why it is important to stay up to date with any changes in its policies. By understanding and acting on Google's directives, you can ensure that your website is found – and will continue to be found – by search engines as the rules change.

The Complete Rundown Of Google’s 2019 Algorithm Updates

Over the past year, Google made a whopping 3,200 changes to its search system – almost nine changes a day. It's safe to say that this number is a far cry from the lowly one change a day reported back in 2010. As SEO experts, we also know that these changes often arrive unannounced and take their time to settle in across the world wide web.

The full effects of Google's algorithm updates on the search system are still unknown, leaving many of us in the dark. However, it's not all doom and gloom. While some consequences include drops in traffic and shifts in positions on the SERP, those practising ethical SEO strategies can adapt to these changes and still achieve high-quality rankings. Remember that the ultimate aim of every Google algorithm update is to create a high-quality user experience and improve the understanding of search queries.

By understanding where and when these changes occur, industry professionals can account for side-effects on webpages in the update aftermath and devise the best strategy to neutralise them.

To keep you informed, we’ve compiled a comprehensive list of all Google’s major algorithm updates in 2019 to date.

March 12th 2019, Google Core Algorithm Update

Touted as a 'broad core update', this change was a big one. The term 'core' indicates that Google was not targeting any particular website type or quality signal – almost like an NHS health check. The underlying goal was to improve overall user satisfaction through neural matching – an algorithm first introduced in 2018 that enables the generation of more diverse search results.

Winners and Losers

Interestingly, this algorithm update affected search queries relating to sensitive topics in the legal, financial and, most prominently, health industries. Some have said that this update reversed the effects of the E-A-T (Expertise, Authoritativeness, Trust) update rolled out in August 2018 – otherwise known as the Medic Update – which also targeted webpages in the health arena. Everydayhealth.com and verywellhealth.com were among those hit the hardest.

However, it has also been reported that other medical sites with a strong brand profile and topical focus experienced improvements in site visibility. A key factor behind these positive side-effects was the favouring of YMYL keywords. This acronym, 'your money or your life', is what Google uses for webpages that impact the future of users' happiness, i.e. health, financial stability etc. In layman's terms, webpages that could successfully answer user search queries relating to YMYL keywords went up in the world, as they could provide a higher level of trust.

June 3rd 2019, Google Core Update

The June 2019 core algorithm update was the first update that Google announced ahead of its implementation – a stark indication that this would likely have major effects on the SEO ecosystem. It has been speculated that this update targeted news providers that offered low-value content to users.

Winners and Losers

The five-day rollout impacted a plethora of large digital publishers, such as the Daily Mail (who saw a 50% drop in search traffic) and CNN, who still claim to be in recovery. On the flip side, others including the Mirror, the Sun and the Metro experienced positive spikes in search traffic. The underlying causes of the Daily Mail's dismal post-update results are still unknown. Some blame the lack of user trust in their content, their political positioning and the volume of poor-quality advertisements. What we do know is that SEO analysis of the core update continues, made doubly difficult by its overlap with the June Diversity update (see below).

June 6th 2019, Google Diversity Update

Google's diversity update was the antithesis of the March core algorithm update: side-effects were minor and the change itself was narrowly targeted, the result of a set of lesser-known improvements. The aim of the update was to stop SERPs from displaying multiple results from the same website and to improve the assortment of options on offer for the user.

Winners and Losers

Some say that because this update overlapped with the June core update (which finished rolling out on the 8th June), its effects on webpages were minimal. Users continue to demand a greater array of websites on the SERP, raising the question of whether Google has done enough to address the issue of limited diversity. While there have been mostly positive results from this update, we have spotted a few discussions on Twitter that suggest otherwise. SEO industry professionals predict more algorithm updates aimed at filtering out similar content on the SERP in the near future, so results may continue to improve in time.

Possible updates sighted in July 2019

On 11th – 13th July we spotted possible signs of an algorithm update. While this was unannounced, some have reported changes to YMYL sites, many health-related. Prior to this, around 1st – 9th July, there were also sightings of a possible change to the June 3rd update. Again, information on this is in scarce supply, which further highlights the importance of consistent site monitoring and metric analysis.

Winners and Losers

The unconfirmed July update has yet to expose those who benefited and those who suffered from the change. Some argue that the scale of the update was not momentous enough to impose significant damage, while others claim to have experienced rank fluctuations to the bottom half of the SERP across all markets.

August 16th 2019, Google Search Ranking Update

Over the weekend there were signs of an algorithm update – labelled the Google Search Ranking Algorithm Update – which began on August 16th 2019. While we personally have not yet seen many changes to our clients' websites (as of 19th August 2019), there are ongoing discussions on WebmasterWorld as well as Twitter.

Winners and Losers

Chatter surrounding this update died down quickly, and Google are yet to confirm if an update was released.

The issue with small updates or tweaks to search is that, because we receive so little information, it's harder to detect what the updates specifically target – and that applies to this small August update. Not much insight has been presented to us, and any changes in traffic or rankings appear minor compared to the many other updates from this year. Many suspect the changes were the result of seasonal factors, such as the summer holidays being in full swing.

August 29th – 30th 2019, Unnamed Update

Following on from the suspected update just the week before, Google launched an algorithm update that commenced on the 28th of August, continuing until the 30th.

What was interesting about this particular update was that a self-proclaimed Google employee, 'Bill Lambert', warned the SEO community a few weeks beforehand that this update was imminent, and advised site owners to build as much traffic as possible while trying to keep users on the site for longer.

Winners and Losers

The update appeared to affect a wide range of industries, as opposed to the usual YMYL sites or those struggling with elements of E-A-T.

There were a few associations between this update and affiliate linking websites, which are typically sites of lower quality. While this theory is yet to be confirmed, it is also interesting to note that Marie Haynes reported this as a 'possible link related update' after her clients experienced gains following recent disavow work. This strengthens the affiliate linking theory, as the update may have been related to Google's ability to assess link trust.

September 13th – 18th 2019, Potential Update

From September 13th – 18th, SERP volatility trackers were incredibly high, indicating another Google update. However, the effects did not appear to be as significant as previous updates.

Winners and Losers

The winners and losers of this update were more transparent than during previous algorithm changes. As the update began, sites related to health, finance, law and government were affected in more negative ways than others. This indicated to us that YMYL sites were the predominant target.

However, the update then spiked on the 18th of September, which flipped the above predictions on their head. The categories that were heavily hit included arts & entertainment, science, gaming and more, while YMYL sites were mostly absent.

This suggests either that the September 13th update did not last as long as we thought, or that a separate update occurred on the 18th.

September 24th 2019, September 2019 Core Update

Google announced on their Twitter account that a broad core update was incoming, formally named the 'September 2019 Core Update'. The announcement itself told us the update would be significant, as Google rarely declares updates beforehand. The search engine giant also stated that their advice for recovering from such updates remains the same.

This update took a while to get going and for the effects to be noticed – the day after the announcement, Barry Schwartz posted a Twitter poll asking if anyone had detected any changes, and 61% claimed that they had not. But the next day, search results were certainly shaking up.

Winners and Losers

At first, there appeared to be quite a mix of recoveries and losses across the board, but either way the results did not appear as problematic as those of previous broad updates.

Similar to the suspected August update, links appeared to be a factor in this update. One example, as reported by Search Engine Journal, came from members of the Proper PBN Facebook Group – a group specialising in grey hat SEO tactics – who reported negative effects, especially after employing the 301 spam trick. This is where a separate domain, similar to the main site, is purchased, spammy links are built to that domain, and it is then redirected to the main site with a 301 (a rough sketch of such a redirect is shown below). Sites with relevant 301s, however, retained their rankings.
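For anyone unfamiliar with the mechanics, a 301 is simply a permanent redirect configured at server level. A minimal sketch, assuming an Apache server and placeholder domain names:

# .htaccess on the secondary (spam-linked) domain – placeholder names throughout
# Permanently redirects every URL on this domain to the main site,
# which is what passes its link signals along
Redirect 301 / https://www.main-site.example/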

Furthermore, earlier in the month, Google announced small changes regarding how nofollow links will be treated, so it would make sense for links to be an update target during this time period.
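That announcement introduced two new link attributes alongside the classic nofollow, all of which Google now treats as hints rather than strict directives. The markup itself is straightforward – the domains here are placeholders:

<!-- Paid or sponsored placements -->
<a href="https://advertiser.example" rel="sponsored">sponsored link</a>

<!-- User-generated content, such as comments and forum posts -->
<a href="https://commenter.example" rel="ugc">user-submitted link</a>

<!-- The original catch-all for links you don't want to vouch for -->
<a href="https://unvetted.example" rel="nofollow">unendorsed link</a>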

October 25th 2019, BERT Algorithm Update

At the start of the week, signs of an algorithm update were extremely noticeable, with clear changes in traffic. Then, on Friday 25th, Google announced that they had released one of the most significant changes to search in recent years – the BERT algorithm update.

Similar to RankBrain, the aim of BERT is to help Google better understand the language used in search queries – in Google’s own example, the query ‘2019 brazil traveler to usa need a visa’ previously surfaced results about US citizens travelling to Brazil, because the significance of the word ‘to’ was missed. This means that the search company will be able to provide users with more accurate results within the SERPs as well as more useful featured snippets.

Winners and Losers

While Google described this update as one of the biggest changes they’ve made to search in a while, it certainly did not feel as big as previous major updates such as Penguin. This is likely because, over the years, keyword stuffing and writing purely for SEO have died down, replaced by writing genuine, useful content for users. You could say that the majority of websites have been preparing for this sort of update for a long time.

Of course, what will likely be affected is traffic, especially if your site often holds featured snippets. These SERP features may change – possibly quite dramatically – as Google can now recognise more accurately whether or not content answers a given search query.

The main reason for a change in traffic is that Google will be changing which sites suit which queries, so the keywords your site ranks for may alter slightly, depending on the quality of your content.

For more info about this update, click here. 

November 7th – 8th 2019, Search Ranking Algorithm Update

There was a considerable amount of chatter within the SEO community from November 7th, suspecting a significant search ranking algorithm update from Google. However, there was also widespread confusion, as the majority of tracking tools did not pick up on the fluctuations.

But a week later, Google came forward to confirm the update, stating in a tweet that they had rolled out several updates that were no different from usual. This suggests that these updates were smaller by Google’s standards – nothing more than routine, despite the number of tremors the community picked up.

Google’s Danny Sullivan also stated that he believed this update was unrelated to BERT.

Winners and Losers

Victims of this update mostly came from the US, with many sites affected significantly by these algorithm changes. These were primarily small and medium affiliate websites, notably from the travel, food and health industries. SEO veteran Barry Schwartz asked on Twitter what people’s thoughts were, and the answers appeared to support these findings. Overall, the update has reportedly been ‘aggressive on small, affiliate websites.’

There has been much discussion surrounding the online health industry recently, what with Google announcing that they are working on a search tool to help with medical research, and several updates this year touting YMYL sites as targets. Marie Haynes has been looking into these changes, believing that Google has been working to crack down on alternative health sites that go against general scientific consensus – even where those sites backed up their claims thoroughly. According to Searchmetrics, many sites saw organic traffic drops of more than 30% because of this update.

November 4th – 10th 2019, Bedlam Update / Local Search Update

We, alongside many within the SEO community, saw significant tremors across local map rankings throughout the week. There were shifts in rankings, and it took a while for the Local RankFlux tracker to settle. Local updates are quite rare, so any changes are usually quite noticeable.

Coined the ‘Bedlam’ Update by local search expert Joy Hawkins, this update was ‘a scene of uproar and confusion’ as changes were made across the board, with theories emerging left, right and centre, only to be retracted again as further effects took place.

A few weeks later, Google confirmed that the local update did take place, naming it the November 2019 Local Search Update.

Winners and Losers

According to Google, the update focused on adding neural matching to local queries, following its implementation in organic search back in 2018. This means that, while the fundamentals of local rankings remain the same, the way Google understands these search queries has changed.

This supports Joy Hawkins’ belief that relevance, not proximity, was the priority of the update. Google stated that there’s no need for businesses to do anything in the aftermath of this update, except follow the fundamental advice already available on their website.

In her blog post published during the early days of the update, Joy Hawkins said that many of the drastic changes had relaxed by the 10th, with only a few not being reversed. As an example, she noted that one lawyer started ranking in several zip codes he never had before, and these rankings continued to increase as the update carried on.

 

With thousands of changes rolling out each year and over 200 contributing factors to Google’s algorithm, navigating your website through the Google update minefield remains no easy task. However, in the event of a major update, those in the SEO community should take the time to scrutinise traffic and rankings in order to understand whether a site has been hit at a given time, and to pinpoint whether any other contributing factors have come into play. A simple way to sanity-check this is sketched below.
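As a rough illustration of that kind of check, here is a minimal Python sketch. It assumes you have exported your site’s daily performance data from Search Console as a CSV with ‘Date’ and ‘Clicks’ columns – the filename and update date below are placeholders:

import pandas as pd

# Placeholder filename: a Search Console performance export with Date/Clicks columns
df = pd.read_csv("search_console_dates.csv", parse_dates=["Date"])

# Placeholder: the first day of the suspected update
update_date = pd.Timestamp("2019-09-24")

# Compare average daily clicks in the fortnight either side of the update
before = df[(df["Date"] >= update_date - pd.Timedelta(days=14)) & (df["Date"] < update_date)]
after = df[(df["Date"] >= update_date) & (df["Date"] < update_date + pd.Timedelta(days=14))]

change = (after["Clicks"].mean() - before["Clicks"].mean()) / before["Clicks"].mean() * 100
print(f"Average daily clicks moved {change:+.1f}% after the update")

A swing well outside your site’s normal weekly variation is a reasonable prompt to dig further; a small one may simply be seasonality.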

If you’d like to know how to protect your site from future Google updates, click here for our breakdown of Google’s blog, or get in touch today.

3D Animals – The Latest Google Feature

Thanks to Google, you can now interact with different animals by using your phone.

In May, Google announced that some searches would feature augmented reality results that users can interact with. Now, we’re starting to see them in action.

You can now see different animals through your phone camera, similar to Pokémon Go.

Google has not yet clarified how extensively they plan to implement AR into search results, but, right now, it is certainly providing a lot of fun!

How To Use It

If you search ‘giant panda’, for example, and scroll through the results, you will find an option that reads ‘Meet a life-sized giant panda up close.’

Once you’ve tapped on ‘View in 3D,’ a small version of the panda will appear on your screen. Tap ‘View in your space,’ and it’s inserted into your current surroundings.

You can then change its size, move it around and even take photos with it. The animal is fully animated, so it moves the same way a real panda would – it sits there eating a stick of bamboo.

Most of the other animals make sounds as well, although not much can be heard from the panda.

How do I know if my phone supports this feature?

For Android, you will need Android 7.0 or later, and your device should have originally come with Google Play Store installed.

For iPhone, you will need iOS 11.0 or later.

Which animals does it work for?

These are just a few that have been found so far:

Giant panda

Lion

Shark (for us, the option only appeared when searching ‘sharks’ on Android; on Apple devices, ‘shark’ works)

Octopus

Wolf

Tiger

 

There is a giant thread on Twitter showing more animals that have been discovered, such as an alligator, a turtle and a pony. And there are likely more if you’re willing to search for them all!