Author Authority in Search: Does it Matter to Google?

The ability of author reputation to influence rankings has been debated for many years. Google Authorship was a feature that appeared in Google search results for around three years, from June 2011 until August 2014. The feature allowed and encouraged content creators to identify themselves when publishing a piece of content – be it a blog post, article or other type of web-based copy – by displaying a profile image and linking to their Google Plus account. In theory, this helped authors stand out in the SERP and bolstered their click-through rate. The Google Authorship markup fell under Google’s E-A-T (Expertise, Authoritativeness, and Trustworthiness) umbrella, the idea being that verifying authors on the internet would improve users’ overall search experience.

Unfortunately, the Google Authorship initiative was retired in August 2014, when Google removed all author photos as part of an effort to unify the mobile and desktop search experience – which involved decluttering the search page. Google had also established that participation in the authorship markup was extremely low (almost non-existent in many verticals), with searchers receiving little to no value from the addition. It was reported that when the markup was removed, there was little difference in ‘click behaviour’ on the search results page compared with when the authorship feature was in play.

However, conversations around author authority in the SEO space have since crept back into the headlines. In July there was an interesting discussion on Twitter surrounding the weight of author authority in the health industry. Google’s John Mueller referenced YMYL sites as an example of why authorship is a necessary factor to consider when publishing or reading sensitive content, such as online medical advice. He mentioned that if you are writing about a health topic and you’re not an expert in that field, then you’re already starting off on a “shaky foundation.” He added that it makes sense for writers to find experts to write or review the content so that it is “correct and trustworthy.” Despite this, there is still little evidence to suggest that content authorship is a ranking factor in Google search. However, this doesn’t necessarily mean that it isn’t an important element; in fact, many people believe that author authority has a number of benefits. With this in mind, let’s look at some of the evidence for and against the importance of author authority in Google search.

Considerations for content authorship

There was renewed interest in the impact of author authority on Google search following Google’s Search Quality Raters Guidelines (SQRG) update back in July 2018. Under these guidelines, web pages could appear higher in the search results if they rate highly on Google’s three attributes of content quality – Expertise, Authoritativeness, and Trustworthiness. An addition to the July 2018 update was the inclusion of content creators as part of the measure of content quality. This could imply that the reputation and expertise of the author is still an important component of the overall E-A-T rating.

What’s more (while this has not been confirmed), should Google want to identify and evaluate authors on the web according to the E-A-T specification, using something like Machine-Readable Entity IDs (MREIDs) would be essential. Google is constantly looking for ways to enhance the user experience, so it makes sense that at some point Google will begin to look for signs of authoritative and reputable authorship on content pages. For that reason, it’s advisable that publishers only accept content from creators with good reputations who have experience in the specified field. Publishers should also give preference to content creators who have a clear, positive presence online, e.g. creators with active social media accounts.

The authorship veto

Back in August 2014, Google removed the ability for publishers to display the author’s name, photograph and the number of Google Plus circles the author had been added to. Once the Google Authorship feature was removed, many marketers and publishers stopped seeing content authorship as important. What’s more, as mentioned above, Google has publicly declared that content authorship is not a ranking factor. According to searchenginejournal.com – “Google’s John Mueller has clarified that the search engine’s algorithms do not look at author reputation when ranking websites.” For that reason, many businesses don’t consider it an important factor in their marketing efforts.

Final thoughts

There is little hard evidence on either side of the author authority debate. However, this doesn’t mean that it’s not an important factor to consider in your marketing efforts – there is certainly no harm in attaching a quality, transparent and accountable author reputation to your content. It’s also important to keep in mind that Google’s Search Quality Rater Guidelines could be updated at any point in the near future, and may yet give more weight to author authority.

The Untapped Potential of Google Image Search

When it comes to SEO, most people automatically think of words. Keywords, meta descriptions and links are all important parts of SEO, but one area which is often either overlooked or not given much attention is images.

In fact, images can have a massive effect on your SEO – Google Image Search plays an important role in the world today. As the web becomes more visual, images, infographics and the like are increasingly important and can be a valuable asset in terms of SEO.

Understanding how Google Image Search can help your SEO is important in today’s world. There are a number of things that you can do to tap into the power that Google Image can have for you.

Optimising your current images

Before you start anything else, it is important to make sure that your current images are optimised for search engines like Google. In practice, this means giving the search engine the information it needs to rank your images in Google Images.

The aim, again, is to have your images come out at the top of the Google Images rankings – and image results pages display many more results per page than text searches do.

  • Edit your file names to something relevant to the picture (instead of the default camera file name, such as DSC_052.jpg).
  • Edit your alt tags to something rich in keywords but as short as possible. Alt tags can help with SEO, but their main purpose is to help people who are visually impaired, so keep this in mind when choosing the words (see the sketch after this list).
  • Compress your images. The time a page takes to load is vital for SEO and for attracting people. A large image file can take a long time to load, put people off and, ultimately, bring up your bounce rate. Compressed images take up less space, load quicker and improve your SEO.
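
To make the first two points concrete, here is a minimal sketch of what an optimised image tag might look like in HTML – the file name and alt text are purely illustrative:

    <!-- Descriptive file name plus concise, human-first alt text
         (both illustrative) -->
    <img src="/images/watering-a-cactus.jpg"
         alt="Gardener watering a potted cactus"
         width="800" height="600">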

Avoid Stock Photos

Stock photos are great because they’re free. But that’s about where their usefulness ends, as far as SEO is concerned anyway.

Stock photography often looks low quality and does little for your own branding. It can be used occasionally, but definitely not all of the time!

Using your own images can be vital to good image SEO – in a similar way to other content. If you have an image which is unique to your website, it will always link to your website and not to the hundreds of others that have used the same picture. People are also likely to use your image and include a backlink.

With Google Images, the original image tends to appear first in the search results, so it is worth the effort of getting your own images on there.

Content management involves ensuring that you have the best, most effective images, which can help to attract as much traffic to your website as possible. And this involves using unique images which are specific to your website.

A Reason for your Images

If you are putting an image onto a page, don’t put it there just for the sake of it; make sure that it is there for a reason. Just as with any other SEO content strategy, irrelevant and unconnected images will only increase your bounce rate.

You should also keep this in mind when you are labelling your images. Make sure that you label them in a way that connects them with your content. If, for example, you have a blog about cacti, label the image that way and not as ‘Mexican landscape’. People who are looking for pictures of landscapes in Mexico might not want to learn about the inner workings of a cactus.

Infographics are an excellent way of attracting relevant traffic to your website and, as long as you make them yourself, they can be both completely unique to you and informative.

Image Sitemap

Sitemaps are useful to search engines like Google as they help to tell them what images are present on your website and where. This means that all of the images you want to be found are found quickly.
WordPress with the Yoast plugin automatically adds images to your sitemap, or you can create your own image sitemap for your website.
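
If you do build your own, a rough sketch of an image sitemap entry using Google’s image sitemap extension looks like this – the page and image URLs are just examples:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/cactus-care</loc>
        <!-- List each image that appears on this page -->
        <image:image>
          <image:loc>https://www.example.com/images/cactus-care-guide.jpg</image:loc>
        </image:image>
      </url>
    </urlset>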

Understanding the power that images have in terms of SEO is important and, judging by current online trends, it is only likely to become more so. By getting your image SEO right, you can build on the other work that keeps your digital marketing effective – and see the results not only in an SEO audit but also in the amount of traffic coming to your website.

5 Biggest Observations From Google’s Latest Cheat Sheet

Many companies find that their site’s search rankings are negatively affected following Google’s many algorithm updates. And, seeing how many changes are made throughout the year, it can be frustrating when you can’t see a way to regain any losses.

Fortunately, for the first time Google has released a post giving advice on how companies can take advantage of new core updates to improve their SEO rankings. Here are the key things you should be aware of following Google’s advice on core updates.

1. There is nothing to fix

Google has advised that there is nothing you can do to fix a site if you see a decline in search rankings following an update. “We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.” They then added – “There’s nothing wrong with pages that may perform less well in a core update.”

Many companies are therefore questioning how they can improve their site’s search rankings and what they should do following core updates. Google said that what has changed is simply how Google assesses the value of content. Companies should therefore focus on producing the highest quality content possible. Google advised – “We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.”

2. Content should be written for the user/audience

While Google have reiterated that there is no quick fix following updates, they have given advice on how to improve the quality of your content. Google released, alongside their blog post, a list of questions that companies should ask themselves if their site has been negatively affected by a core update. These include:

  • Does the content provide original information, reporting, research or analysis?
  • Was the content produced well, or does it appear sloppy or hastily produced?
  • Is the content free from spelling or stylistic issues?
  • Does the headline and/or page title provide a descriptive, helpful summary of the content?
  • Does the content provide a substantial, complete or comprehensive description of the topic?

Recent updates suggest that writing for the user/audience is the most effective way to produce quality content that the Google algorithms will favour. Google’s advice on assessing content asks questions like “Is this the sort of page you’d want to bookmark, share with a friend, or recommend?” and “Does the content present information in a way that makes you want to trust it?” The emphasis is on the reader and their opinion of the quality and value of the content.

3. Content must focus on E-A-T

To improve search rankings, businesses must adapt their content to meet the new guidelines. One of the simplest ways to produce quality content is by focusing on E-A-T, which stands for Expertise, Authoritativeness and Trustworthiness. Companies can read the search quality raters guidelines to get further advice on how to improve the overall quality and effectiveness of their content. According to Google – “Reading the guidelines may help you assess how your content is doing from an E-A-T perspective and improvements to consider.” They added “If you understand how raters learn to assess good content, that might help you improve your own content. In turn, you might perhaps do better in Search.”

The list of questions that Google published in their blog include a section on E-A-T and what you should ask when writing content for your site.

4. There are no further updates confirmed

The post was not confirmation of another update. Google confirmed their June core update but, since then, no other rumoured updates have been confirmed. However, companies should be aware that unannounced updates occur on a regular basis. Google said – “We are constantly making updates to our search algorithms, including smaller core updates.” They added, “we don’t announce all of these because they’re generally not widely noticeable.”

5. Google wants to improve the user experience

It is clear that Google’s ultimate aim in introducing the core updates is to improve the user experience as much as possible. In summary, companies should focus on building high-quality websites that offer users high-quality, trustworthy content which has been produced in line with the search quality raters guidelines and E-A-T.

All You Need to Know About The robots.txt File

If you know a little about SEO then you might have heard of the robots.txt file. It can be a useful tool for anyone with a website that hopes to attract visitors through search engines. The robots.txt file essentially tells search engines which parts of your website they may crawl, meaning that more time is spent crawling the useful pages.

The robots.txt file is also known as the “Robots Exclusion Protocol.” Some people put a ‘noindex’ rule inside their robots.txt file, but this is about to change: Google has announced that from September 1st, 2019, it will no longer support robots.txt files that list a noindex directive.

Google is changing the rules to keep the ecosystem as healthy as possible and to be well prepared for any future open source releases. This is a complex issue, and one that businesses with websites need to respond to. To do so, it is important to fully understand what the robots.txt file is and its role in the SEO space.

The robots.txt File in Action

Search engines like Google have web crawlers – known as spiders – which look at millions of websites every day, reading information which helps them to decide which websites they will put at the top of their search results.

If there are pages that you don’t want Google to crawl, you can put a robots.txt file in place. This can be dangerous, however, as you can accidentally prevent Google from crawling your entire site, so pay attention to what you are blocking when adding these rules.
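
As a minimal sketch, a robots.txt file that blocks crawlers from part of a site looks like this – the paths are hypothetical:

    # Applies to all crawlers
    User-agent: *
    # Block the staging area and internal search results (hypothetical paths)
    Disallow: /staging/
    Disallow: /internal-search/
    # Beware: "Disallow: /" on its own would block the entire site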

Blocking a page this way will also prevent any links on that page from being followed.

It is important to note, however, that although you are blocking the robots, the page can still be indexed by Google. This means that it can still appear in the search results – although without any details. If you don’t want your page to be indexed by Google, you must use the ‘noindex’ directive.
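
The ‘noindex’ directive lives in the page itself, not in robots.txt. A minimal example, placed in the page’s <head>:

    <!-- Keeps the page out of Google's index. The page must stay
         crawlable, or Googlebot will never see this tag. -->
    <meta name="robots" content="noindex">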

Why is it Important?

The main reason robots.txt files are important and useful to a website is that they stop search engines from wasting their resources on pages that won’t give useful results, leaving more capacity for crawling the pages with the most useful information.

They can also be useful for blocking non-public pages on your website – pages such as log-in pages or a staging version of a page.

They can also be used to prevent the crawling of resources such as images and PDFs.

The robots.txt file can also be useful if you have duplicate pages on your website. Without it, you might find that the search engine returns duplicate results, which can be harmful to your website’s SEO.

The Pros of Using the robots.txt File

Most crawlers have a pre-determined number of pages that they can crawl, or at least a certain amount of resource that they can spend on each website. This is why it is important to be able to block certain sections of your website from being crawled, allowing the robots to spend their ‘allowance’ only on sections which are useful.

There are some instances when using robots.txt files can be useful, including:

  • Preventing duplicate content
  • Keeping sections of your website for private use only
  • Preventing certain files on your website from being crawled, e.g. images and PDFs
  • Specifying crawl delays to prevent your servers being overloaded (see the sketch after this list)
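
A sketch combining several of these uses, with hypothetical paths. One caveat worth knowing: Googlebot ignores the Crawl-delay rule, though other crawlers such as Bingbot respect it:

    User-agent: *
    # Keep a private members' area and duplicate print versions out of the crawl
    Disallow: /members/
    Disallow: /print/
    # Block PDFs from being crawled (hypothetical path)
    Disallow: /downloads/pdfs/
    # Wait 10 seconds between requests - respected by e.g. Bingbot, ignored by Googlebot
    Crawl-delay: 10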

The Cons of Using the robots.txt File

Blocking your page with a robots.txt file won’t stop it from being indexed by the search engines, which means it won’t actually be removed from the search results. The ‘noindex’ tag is what prevents the search engine from indexing your page – but remember, if you have blocked the page in robots.txt, the robots won’t ever see the ‘noindex’ tag, and therefore it won’t be effective.

Another potential issue with the robots.txt file is that if you block the robots from a page, none of the links on that page can pass value from one page or site to another.

The Consequences

There are a number of consequences for brands of Google’s decision to no longer support robots.txt files with the ‘noindex’ directive.

The main concern is that, if you have used ‘noindex’ in robots.txt previously, you will need an alternative to fall back on.

If you are using ‘noindex’ in a robots.txt file, you should look for alternatives and set these in place before the deadline of 1st September. These include:

  • Noindex in robots meta tags (or the equivalent X-Robots-Tag HTTP header – see the sketch after this list)
  • 404 and 410 HTTP status codes
  • Password protection to hide a page from search engines
  • Disallow in robots.txt
  • Search Console Remove URL tool
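
For non-HTML files such as PDFs, which cannot carry a meta tag, the noindex rule can instead be sent as an X-Robots-Tag HTTP response header. A sketch using Apache – the file pattern is illustrative, and mod_headers must be enabled:

    # Apache (mod_headers): send a noindex header with every PDF response
    <FilesMatch "\.pdf$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>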

Here at Fibre, we know that the majority (91%) of people who use a search engine use Google, which is why it is important to stay up to date with any changes to its policies. By understanding and acting on Google’s directives, you can ensure that your website is found – and continues to be found – by search engines as the rules change.

The Complete Rundown Of Google’s 2019 Algorithm Updates

Over the past year, Google made a whopping 3,200 changes to its search system, amounting to an impressive 8 changes a day. It’s safe to say that this number is a far cry from the lowly one change a day back in 2010. As SEO experts, we also know that these changes often arrive unannounced and take time to settle in and roll out fully across the web.

The full effects of Google’s algorithm updates on the search system are still unknown, leaving many of us in the dark. However, it’s not all doom and gloom. While some consequences include drops in traffic and shifts in positions on the SERP, those practising ethical SEO strategies can adapt to these changes and still achieve high-quality rankings. Remember that the ultimate aim of all Google algorithm updates is to create a high-quality user experience and improve the understanding of search queries.

In understanding where and when these changes occur, industry professionals are able to account for certain side-effects to webpages in the update aftermath and devise the best strategy to neutralise them.

To keep you informed, we’ve compiled a comprehensive list of all Google’s major algorithm updates in 2019 to date.

March 12th 2019, Google Core Algorithm Update

Touted as a ‘broad core update’, this change was a big one. The term ‘core’ indicates that Google was not targeting any particular website or quality, almost like an NHS health check. The underlying goal was to improve overall user satisfaction through the process of Neural Matching – an algorithm first introduced in 2018 that enables the generation of more diverse search results.

Winners and Losers

Interestingly, this algorithm update affected search queries relating to sensitive topics in the legal, financial and, most prominently, health industries. Some have said that this update reversed the effects of the E-A-T (Expertise, Authoritativeness, Trustworthiness) update rolled out in August 2018 – otherwise known as the Medic Update – which also targeted webpages in the health arena. Everydayhealth.com and verywellhealth.com were among those hit the hardest.

However, it has also been reported that other medical sites with a strong brand profile and topical focus experienced improvements in site visibility. A key factor thought to account for these positive side-effects was the favouring of YMYL keywords. YMYL, short for ‘your money or your life’, is the label Google uses for webpages that impact users’ future happiness, health, financial stability and so on. In layman’s terms, webpages that could successfully answer user search queries relating to YMYL keywords went up in the world because they could provide a higher level of trust.

June 3rd 2019, Google Core Update

The June 2019 core algorithm update was the first update that Google announced ahead of its implementation – a stark indication that this would likely have major effects on the SEO ecosystem. It has been speculated that this update targeted news providers that offered low-value content to users.

Winners and Losers

The five-day roll-out impacted a plethora of large digital publishers, such as the Daily Mail (who saw a 50% drop in search traffic) and CNN, who still claim to be in recovery. On the flip side, others including the Mirror, the Sun and the Metro experienced positive spikes in search traffic. The underlying causes of the Daily Mail’s dismal post-update results are still unknown; some blame the lack of user trust in their content, their political positioning and the volume of poor-quality advertisements. What we do know is that SEO analysis of the core update continues, made doubly difficult by its overlap with the June Diversity update (see below).

June 6th 2019, Google Diversity Update

Google’s diversity update was the antithesis of the March core algorithm update: side-effects were minor and the change itself was narrowly targeted. The aim of the update was to limit SERPs from displaying multiple results from the same website and to improve the assortment of options on offer for the user.

Winners and Losers

Some say that because this update overlapped with the June core update (which finished rolling out on 8th June), its effects on webpages were minimal. Users continue to demand a greater array of websites on the SERP, raising the question of whether Google has done enough to address limited diversity. While the results of this update have been mostly positive, we have spotted a few discussions on Twitter that suggest otherwise. SEO industry professionals predict more algorithm updates aimed at filtering out similar content on the SERP in the near future, so results may continue to improve with time.

Possible updates sighted in July 2019

On 11th – 13th July we spotted possible signs of an algorithm update. While this was unannounced, some reported changes to YMYL sites, many of them health-related. Prior to this, around 1st – 9th July, there were also sightings of a possible change to the June 3rd update. Again, information on this is in scarce supply, which further highlights the importance of consistent site monitoring and metric analysis.

Winners and Losers

The unconfirmed July update has yet to expose who benefited and who suffered from the change. Some argue that the scale of the update was not momentous enough to impose significant damage, while others claim to have experienced rank fluctuations towards the bottom half of the SERP across all markets.

August 16th 2019, Google Search Ranking Update

Over the weekend there were signs of an algorithm update, labelled the Google Search Ranking Algorithm Update, that began on August 16th 2019. While we personally have not yet seen many changes to our clients’ websites (as of 19th August 2019), there are ongoing discussions on WebmasterWorld as well as Twitter.

Winners and Losers

Chatter surrounding this update died down quickly, and Google are yet to confirm if an update was released.

The issue with small updates or tweaks to search is that, with so little information to go on, it’s harder to detect what the updates specifically target – which applies to this small August update. Not much insight has been presented to us, and any changes in traffic or rankings appear minor compared to the many other updates from this year. Many suspect the changes were the result of seasonal factors, such as the summer holidays being in full swing.

August 29th – 30th 2019, Unnamed Update

Following on from the suspected update just the week before, Google launched an algorithm update that commenced on the 28th of August, continuing until the 30th.

What was interesting about this particular update was that self-proclaimed Google employee Bill Lambert warned the SEO community a few weeks beforehand that the update was imminent, advising site owners to build as much traffic as possible while trying to keep users on their sites for longer.

Winners and Losers

The update appeared to affect a wide range of industries, as opposed to the usual YMYL sites or those struggling with elements of E-A-T.

A few associations were drawn between this update and affiliate-linking websites, which are typically sites of lower quality. While this theory is yet to be confirmed, it is also interesting to note that Marie Haynes reported this as a ‘possible link related update’ after her clients experienced gains following recent disavow work. This strengthens the affiliate-linking theory, as the update may have been related to Google’s ability to assess link trust.

September 13th – 18th 2019, Potential Update

From September 13th – 18th, SERP volatility trackers registered incredibly high scores, indicating another Google update. However, the effects did not appear to be as significant as previous updates.

Winners and Losers

The winners and losers of this update were more transparent than during previous algorithm changes. As the update began, sites related to health, finance, law and government were affected in more negative ways than others. This indicated to us that YMYL sites were the predominant target.

However, the update then spiked on the 18th of September, which flipped the above predictions on their head. The categories that were heavily hit included arts & entertainment, science, gaming and more, while YMYL sites were largely absent.

This suggests either that the September 13th update did not last as long as we thought, or that a separate update occurred on the 18th.

September 24th 2019, September 2019 Core Update

Google announced on their Twitter account that a broad core update was incoming, formally named the “September 2019 Core Update.” This announcement signalled that it would be significant, as Google rarely declares updates beforehand. The search engine giant also stated that their advice for recovering from such updates remains the same.

This update took a while to get going and for the effects to be noticed – the day after the announcement, Barry Schwartz posted a Twitter poll asking if anyone had detected any changes, and 61% claimed that they had not. But by the next day, search results were certainly shaking up.

Winners and Losers

At first, there appeared to be quite a mix of recoveries and losses across the board, but either way the results did not appear as problematic as those of previous broad updates.

Similar to the last suspected August update, links appeared to be a factor. One example, as reported by Search Engine Journal, came from members of the Proper PBN Facebook Group – a group specialising in grey hat SEO tactics – who reported negative effects, especially after employing the 301 spam trick (buying a different domain similar to the main site, building spammy links to that domain, and then redirecting it to the main site with a 301). Sites with relevant 301s, however, retained their rankings.
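
For contrast, a legitimate, relevant 301 of the kind that retained its rankings might look like this in an Apache config – the URLs are hypothetical:

    # Apache: a legitimate, relevant 301 - the old URL permanently
    # points to its equivalent page on the current site
    Redirect 301 /old-services-page https://www.example.com/services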

Furthermore, earlier in the month Google announced small changes to how nofollow links are treated, so it would make sense for links to be an update target during this time period.
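
That announcement introduced two new rel values alongside nofollow, with all three now treated as hints rather than directives. A sketch of the markup – the URLs are illustrative:

    <!-- Paid or sponsored placements -->
    <a href="https://example.com/partner" rel="sponsored">Partner offer</a>
    <!-- User-generated content such as comments and forum posts -->
    <a href="https://example.com/thread" rel="ugc">Forum thread</a>
    <!-- Any other link you don't want to vouch for -->
    <a href="https://example.com/page" rel="nofollow">Unendorsed link</a>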

October 25th 2019, BERT Algorithm Update

At the start of that week, signs of an algorithm update were extremely noticeable, with changes in traffic widely reported. Then, on Friday 25th, Google announced that they had released one of the most significant changes to search in recent years – the BERT algorithm update.

Similar to RankBrain, the aim of BERT was to help Google better understand language used in search queries. This means that the search company will be able to provide users with more accurate results within the SERPs as well as more useful featured snippets.

Winners and Losers

While Google described this update as one of the biggest changes they’ve made to search for a while, it certainly did not feel as big as previous major updates such as Penguin. This is likely because, over the years, the idea of keyword stuffing and writing for SEO has died down and been replaced with writing genuine, useful content for users. So you could say that the majority of websites have been preparing for this sort of update for a long time.

Of course, what will likely be affected is traffic, especially if your site often holds featured snippets. These SERP features will change – possibly quite dramatically – as Google can now recognise more accurately whether or not content answers a given search query.

The main reason for a change in traffic is because Google will be changing which sites suit which queries, so any keywords your site ranks for may or may not alter slightly, depending on the quality of your content.

For more info about this update, click here. 

November 7th – 8th 2019, Search Ranking Algorithm Update

There was a considerable amount of chatter within the SEO community from November 7th, suspecting a significant search ranking algorithm update from Google. However, confusion was also at large as the majority of tracking tools did not pick up on the fluctuations.

But a week later, Google came forward to confirm the update, stating in a tweet that they had rolled out several updates that were no different from usual. This signifies that these updates were smaller by Google’s standards – nothing more than routine, despite the number of tremors the community picked up.

Google’s Danny Sullivan also stated that he believed this update was unrelated to BERT.

Winners and Losers

Victims of this update mostly came from the US, with many sites affected significantly by these algorithm changes. These were primarily small and medium affiliate websites, notably from the travel, food, and health industries. SEO veteran Barry Schwartz asked on Twitter what people’s thoughts were, and the answers appear to support these findings. Overall, the update has reportedly been ‘aggressive on small, affiliate websites.’

There has been much discussion surrounding the online health industry recently, what with Google announcing that they are working on a search tool to help with medical research, and several updates this year touting YMYL sites as targets. Marie Haynes has been looking into these changes, believing that Google has been working to crack down on alternative health sites that go against general scientific consensus – even where those sites backed up their claims thoroughly. According to Searchmetrics, many of the organic traffic drops caused by this update were more than 30%.

November 4th – 10th 2019, Bedlam Update / Local Search Update

We, alongside many within the SEO community, saw significant tremors in local map rankings throughout the week. There were shifts in rankings, and it took a while for the Local RankFlux tracker to settle. Local updates are quite rare, so any changes are usually quite noticeable.

Coined the ‘Bedlam’ Update by local search expert Joy Hawkins, this update was ‘a scene of uproar and confusion’ as changes were made across the board, with theories coming out left, right and centre, only to be shelved again as further effects continued to take place.

A few weeks later, Google confirmed that the local update did take place, naming it the Nov.19 Local Search Update.

Winners and Losers

According to Google, the update focused on adding neural matching to local queries, following its implementation in organic search back in 2018. This means that, while the fundamentals of local ranking remain the same, the way Google understands these search queries has changed.

This supports Joy Hawkins’ belief that relevance, not proximity, was the priority of the update. Google stated that there’s no need for businesses to do anything in the aftermath of this update, except follow the fundamental advice already available on their website.

In her blog post published during the early days of the update, Joy Hawkins said that many of the drastic changes had relaxed by the 10th, with only a few not being reversed. As an example, she noted that a lawyer started ranking for several zip codes he never had before, and this continued to increase as the update carried on.

 

With thousands of changes rolling out each year and over 200 factors contributing to Google’s algorithm, navigating your website through a Google update minefield remains no easy task. However, in the event of a major update, those in the SEO community should take the time to scrutinise traffic and rankings in order to understand whether a site has been hit at a given time, and to pinpoint whether any other contributing factors have come into play.

If you’d like to know how to protect your site from future Google updates, click here for our breakdown of Google’s blog, or get in touch today.

3D Animals – The Latest Google Feature

Thanks to Google, you can now interact with different animals by using your phone.

In May, Google announced that some searches would feature augmented reality results that users can interact with. Now, we’re starting to see them in action.

You can now see different animals using your phone camera, similar to Pokemon Go.

Google has not yet clarified how extensively they plan to implement AR into search results, but, right now, it is certainly providing a lot of fun!

How To Use It

If you search ‘giant panda’ as an example, and scroll through the results, you will find an option that reads ‘Meet a life-sized giant panda up close.’

Once you’ve tapped on ‘View in 3D,’ a small version of the panda will appear on your screen. Tap ‘View in your space,’ and it’s inserted into your current surroundings.

You can then change its size, move it around and even take photos with it. The animal is fully animated, so it moves the same way a real panda would – it sits there eating a stick of bamboo.

Most of the other animals make sounds as well, although not much can be heard from the panda.

How do I know if my phone supports this feature?

For Android, you will need Android 7.0 or later, and your device should have originally come with Google Play Store installed.

For iPhone, you will need iOS 11.0 or later.

Which animals does it work for?

These are just a few that have been found so far:

  • Giant panda
  • Lion
  • Shark
  • Octopus
  • Wolf
  • Tiger

We found that the shark option is only available if you search ‘sharks’ on Android, while on Apple devices ‘shark’ works.

 

There is a giant thread on Twitter showing more animals that have been discovered, such as an alligator, a turtle, and a pony. However, there are likely more if you’re willing to search for them all!

Google’s Latest Design of Search Engine Results Pages

You are probably already aware that Google makes regular minor changes that affect the design of search results pages, including several small changes to the font and size of ads over the years. However, last week Google announced a major change to the design of mobile search results and ads. Google states that the design update aims to put “website’s branding front and center” so that web users “better understand where the information is coming from”. The new layout should also create a more streamlined design on search engine results pages. With this in mind, here’s everything you need to know about Google’s new update.

What are the changes?

Black ad sign

The green ‘Ad’ label has been replaced with bold, black text. This may seem minor; however, it could have a significant impact on search behaviour, as it helps ads blend in with the rest of the search results and appear more like organic content.

Name of the site

With the new design, the site name and information appears next to the favicon in search results. Prior to the update, the name of the site appeared in smaller green text below the title and there were no logos or images present. This helps search users to understand where their information is coming from.

No grey line

Previously, ads stood out from the rest of the search results and were easily recognisable. For example, back in 2013 Google ads were displayed in a single block with a coloured background, clearly separate from other result types. More recently, ads and organic search results were differentiated by the presence of a grey line, but Google has now removed this to give every result a more uniform look.

[Image from Twitter: the new Google SERP design]

Presence of a favicon

The new design layout takes a brand-first approach, and the site’s favicon will now appear at the top of the search result. This means that the user will be able to instantly see where the information is coming from – an important feature, as it will help you build an online brand.

These changes mean that organic and paid results will appear more similar on search result pages. The new design also brings some branding to search results and allows users to get quick information on the businesses and sites included in their search results.

Do I have to have a favicon?

A favicon is essentially a small icon associated with your site. Previously, favicons would usually only be seen in the address bar of a browser, or next to a bookmark for your site. With Google’s new update, however, having a favicon is more important, as favicons will be displayed next to search results. Not having one may affect how your business appears to consumers and have a negative effect on traffic to your site. Your favicon should act as a visual representation of your business and brand, and it needs to be the correct size and format so that it can be supported. There are plenty of free tools available online that can convert a standard image into a favicon file. Check Google’s guidelines for more information.
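
Adding one is a single line in each page’s <head>. A minimal sketch – the file path is illustrative, and Google’s guidelines ask for a square icon whose size is a multiple of 48px:

    <!-- Path is illustrative; use a square icon that is a multiple of 48px -->
    <link rel="shortcut icon" href="/images/favicon.png">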

Will the changes increase traffic?

The most notable benefit of Google’s new update is that it helps ads appear more like organic search results. This may help improve click-through rates and traffic for paid search results. Although it may be too early to notice any impact yet, it is definitely worth monitoring your click-through rates on both organic pages and paid results to check for any changes. Another benefit of the new design is that it provides users with details on where their search results are coming from, so they can get instant information on the businesses and sites involved. This will help build brand awareness and make your business more recognisable. Consumers are far more likely to buy from a business they know and trust. For that reason, the new update could help to improve organic traffic to your site, leading to more conversions and an increase in revenue.

Final thought

Overall, the changes have created a more streamlined design for search results pages. It is too early to say whether Google’s new update will improve traffic to paid search results. However, redesigning ads to appear more similar to organic search results will certainly make it harder for consumers to differentiate between the two. It is important for all marketers and site owners to monitor the potential impact of the new updates on traffic from Google mobile search.

Google’s Quality Raters Guidelines: What They Mean

The Quality Raters Guidelines (QRG) are intended to help the team of Quality Raters that work for Google to assess the quality of websites. It makes sense that updates to the QRG are important as the Quality Raters help Google engineers to understand whether changes they make to the algorithms will work as intended.

This is why, in the past, updates to the QRG have been reflected in Google algorithm updates that have followed. For instance, after the QRG was last updated, in July 2018, relevant algorithm updates seemed to follow with the Google Medic Update of August 2018 and the following update in September. These updates centred around the “safety of users”. So, what can we learn from the latest QRG update that might help us to predict future Google algorithm changes?

Interstitial ads

It’s no secret that Google does not have much love for interstitial ads, especially when they are used on mobile content. These full-screen ads that pop up as the user moves through a site or app are especially disliked by Google when they are displayed at the start of the user experience.

In the latest QRG update, Google has added the phrase “interstitial page” when discussing ease of access for users. This is interesting as Google already has an algorithm that deals with sites that use interstitial ads immediately as users arrive on a page.

It could be that the phrase is intended to clarify the situation regarding interstitial ads for Quality Raters.

E-A-T, how formal does expertise need to be?

Expertise, Authority and Trust (E-A-T) are factors that Google uses when it measures the trust that can be placed in a website or the brand it represents.

There has always been a question about how formal expertise needs to be. For instance, does an individual or brand need to have qualifications in order to be considered an expert in a specific area? In some cases, such as when medical advice is being given, qualifications are important.

However, in the latest QRG update Google suggests that a medium rating should be given to creators of content who display expertise that is not formal. This change is probably more to do with better evaluating E-A-T than a suggestion that the importance of E-A-T is being reduced.

It could be that Google is recognising that not all experience is best gauged by professional qualifications. This could mean that people who have in-depth expertise as an amateur will rank better in the future.

Authority of brand and content

We all know that authority of brand and authority of content are not always the same thing. Some content that is not of a particularly high standard seems to still be able to rank highly if it’s produced by an authoritative brand.

Interestingly, at one point in the updated QRG Google refers to the authority of a brand “making the information on this page highly authoritative”, rather than suggesting that the information on the page is authoritative in its own right.

This could seem to suggest that the authority of the brand itself can be seen to make content authoritative. This can make it difficult for content creators to compete for position in search results against the big brands. It means that expectations need to be considered when trying to compete for certain keywords and phrases.

Emphasis on the quality of pages

One interesting change that is noticeable in the latest QRG update is that the phrase “page quality” has replaced the E-A-T acronym on several occasions. While it’s obvious that E-A-T still has huge importance, it seems as though there is a move towards assessing the quality of individual pages. This is different from the previous emphasis on assessing the E-A-T of a website and of its author.

The impact of the QRG update on future algorithm changes

While it’s not possible for anyone to predict the precise effect that the QRG changes may have on future Google algorithm changes, it is possible to suggest what the effect may be.

The biggest change that could be coming is an increased emphasis on page quality. While E-A-T is not going away it could be that sites that have found a way to be valuable in other ways will rank higher than they do now.

It will be interesting to see if and when this happens.

A Guide to Featured Snippets

 

A Google snippet, or featured snippet, is a small summary of your website on the Google results page. It offers short segments of detailed information with the aim of helping a Google user decide whether to visit your website, and it is programmatically chosen from your web pages. If you want to land a featured snippet, you’ll have to optimise your content, conduct some keyword research and answer the questions that your target audience is asking.

Where Do Google Snippets Come From?

Information is extracted from what a reader sees on your web page. If the Google algorithm determines that your website offers an answer to a Google user’s question, then that answer will be presented in the featured snippet.

The snippet shows directly under the paid ads, but before the rest of the search results on the SERP (search engine results page).

The snippets come in a range of formats:

  • Paragraph
  • List
  • Table
  • Video

Is There Anything I Can Do To Affect Snippets?

At a basic level, all you can do is offer a suitable depth of information or content on your web pages. For example, if your web page has a single image and a single line of text, then there is not much for the algorithm to go on.

To optimise your content for featured snippets, there are many things you can do. But it all comes down to content.

  • Keep your content concise. And make sure it’s quite specific as well. Add bullet points or short paragraphs (the average length of a paragraph snippet is 45 words), as these will help user retention.
  • Answer the questions that people are asking. Your content needs to provide the information that they are looking for, so do your research (the ‘People also ask’ tab on SERPs is useful for this) and lay out the answers searchers want on your page (see the markup sketch after this list).
  • Do your keyword research. Try and fit a range of keywords with a decent search volume into your content – just make sure you don’t spam, else you’ll likely be penalised by Google. It would also help if your keyword research included a mix, such as long-tail keywords and questions.
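
Putting those points together, snippet-friendly markup might pose the question as a heading and answer it directly underneath, in a paragraph of around 45 words. A sketch with illustrative content:

    <h2>How often should you water a cactus?</h2>
    <!-- A direct, self-contained answer of roughly 45 words gives
         Google a clean candidate for a paragraph snippet -->
    <p>Most indoor cacti need watering once every one to two weeks during
    the growing season and about once a month in winter. Water deeply,
    let the soil dry out completely between waterings, and always use a
    pot with drainage holes.</p>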

Are there any search terms that usually give featured snippets?

There are so many search terms that generate featured snippets. While many are questions, a few short searches do frequently show them as well, such as ‘definition,’ ‘vs,’ and ‘recipe.’

Build an Online Brand

If you want your website message and your Google snippet messages to match up, then simply brand your website consistently. If your website is not sending mixed messages, then your Google snippets will not transmit mixed messages. There is nothing wrong with website wackiness, but do not break or bend your brand principles. If you make a claim, live up to it consistently and your online brand will hold.