Why Low Speed Scores Could Be Killing Your Traffic

Earlier this month, Google rolled out its new speed reports in Google Search Console to the general public. The reports aim to identify webpages that are slow to load and provide advice on how to improve the website’s performance.

Use this new feature and you could potentially boost your website rankings and improve user experience, allowing you to stand out from the competition and get the business results you need.

Keep reading to learn why your website speed matters, how the new Google speed reports can provide useful insight, and how you can improve your website speed.

Why does site speed matter?

Site speed matters because it can potentially make or break user experience and influence your Google rankings.

Since the Google Speed update in 2018, speed has been one of the direct ranking factors used by the search algorithms to rank your pages and is used for both mobile and desktop sites. If your website is slow to load, the search engines could crawl fewer pages and your rankings are likely to suffer as a result.

Unsurprisingly, slow load speeds also affect the overall user experience. When a website visitor has to wait longer than they expect for a page to load, they’re more likely to click away from your page and choose your competition instead.

According to an article titled “Why Performance Matters” on the Google developer guide, “The BBC found they lost an additional 10% of users for every additional second their site took to load.” This can result in a higher bounce rate and a lower than average time spent on the page, which damages the reputation of your brand and negatively impacts your conversions.

Clearly, for both SEO and user experience, we need to keep website load speed in mind and troubleshoot any issues we come across. But how do you know how your website is performing?

Enter Google Search Console speed reports.

Using the Google Search Console speed report

Google Search Console speed reports were officially released this month after several months of testing.

Available in the Google Search Console interface under the ‘Enhancements’ tab, the reports let users quickly test the speed of the sections and URLs on their websites and identify any potential problems.

The results are divided between ‘fast’, ‘moderate’ and ‘slow’ for both mobile and desktop sites, and colour-coded for ease of reference.

Google also provide useful tips on how you can fix any problematic pages and increase your page load speed. Once you make these changes, you can continue to track your performance and make further changes until your website performs as well as possible.

How to increase your website speed

If you identify a problem with one or several of your webpages, don’t panic.

There are many ways you can boost the speed of your website. Here are some tips:

Compress your images

By decreasing the file size of your images and choosing the right file format, you can speed up your website load time significantly.

While we do need crisp, compelling images to drive conversions, according to HTTP Archive they can take up around 21% of the weight of an entire webpage. Resource-heavy images hurt user experience and slow down the page load speed considerably.

There are many tools, programmes, plugins and scripts that can help you to achieve this relatively seamlessly.

Popular options include Affinity Photo, which works in a similar way to Adobe Photoshop, and GIMP, a free programme that can help you achieve the same. You might prefer to use an online tool such as JPEG Mini or ImageResizer.com.

When you compress your images they will naturally lose quality so it’s important to find the balance between file size and quality. Experiment to find what works best for your website.

It’s also important to save images in the right format. Generally speaking, it’s better to choose JPEG for larger images as these have more flexibility with resizing and compression, WebP for smaller images and SVG for logos and icons as this format is vector-based.
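
As a rough illustration, one common way to use WebP alongside JPEG is the HTML picture element, which serves the smaller WebP file to browsers that support it and falls back to JPEG everywhere else. The file names below are placeholders:

```html
<!-- Illustrative only: WebP where supported, JPEG as the fallback -->
<picture>
  <source srcset="/images/product-photo.webp" type="image/webp">
  <img src="/images/product-photo.jpg" alt="Product photo" width="1200" height="800">
</picture>
```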

Minify CSS, JavaScript and HTML

By cleaning up your website code and removing any unnecessary characters, spaces, commas, comments, formatting and unused code, you can significantly boost your website speed and improve both UX and your search rankings.

Although these tiny pieces of unwanted code might not seem like much, they can slow down the time it takes to load your website and increase the crawl time needed by the Google bots to do their job.

Ask your web developer to do this or use one of the minifying resources recommended by Google such as HTML Minifier, CSSNano and UglifyJS.
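
To illustrate what minification actually does, here is a small, hypothetical CSS rule before and after minification – the meaning is identical, only the unnecessary characters have been removed:

```css
/* Before minification */
.hero-banner {
    margin-top: 16px;
    margin-bottom: 16px;
    color: #ffffff; /* white heading text */
}

/* After minification */
.hero-banner{margin-top:16px;margin-bottom:16px;color:#fff}
```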

Reduce redirects

Redirects are more than just annoying. They force your website visitor and Google to wait longer before they can access the information they are looking for.

Start fixing the problem by first identifying where you have redirects on your website. Tools such as Redirect Mapper are excellent for this. Once you’ve found them, ask yourself why each redirect exists and how it affects the rest of your site. If it’s not essential, remove it where possible.
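
As a hypothetical example (Apache .htaccess syntax, with placeholder paths), collapsing a redirect chain means pointing every legacy URL straight at the final destination rather than hopping through intermediate pages:

```apache
# Before: /old-page -> /interim-page -> /new-page (two hops for visitors and for Googlebot)
Redirect 301 /old-page /interim-page
Redirect 301 /interim-page /new-page

# After: both legacy URLs go straight to the final destination (one hop each)
Redirect 301 /old-page /new-page
Redirect 301 /interim-page /new-page
```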

Use browser caching

When a user accesses a website, certain assets such as images and stylesheets are stored in their browser. This allows them to be accessed quickly the next time they visit.

You can leverage this process by telling browsers what they should do with the various information on your website.

This can be done in several ways: ask your developer to add the relevant code directly to your website, or select a plugin that will handle the process for you. For WordPress, W3 Total Cache and WP Rocket are excellent choices.
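
If you go the manual route on an Apache server, a minimal sketch of the kind of rules involved looks like this (the file types and lifetimes are illustrative, not recommendations):

```apache
# Tell browsers how long they may keep each file type before re-downloading it
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```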

Boost server response time

Provide faster results to your website visitors by improving your server response time.

Ensure that you’re using the best host and server to meet the unique needs of your business. It should provide enough resources, offer excellent customisation options and give you fast results. Also, configure the server to use HTTP/2 and enable caching so your website loads faster.
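
As a rough sketch (nginx syntax, with a placeholder domain), enabling HTTP/2 and text compression typically comes down to a few lines of server configuration:

```nginx
server {
    listen 443 ssl http2;          # serve the site over HTTP/2
    server_name www.example.com;

    gzip on;                        # compress text responses before sending them
    gzip_types text/css application/javascript application/json;

    # ... certificates and the rest of the site configuration ...
}
```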

Use a content distribution network

Content distribution networks (CDNs) serve your website content from servers that are geographically closer to your users, so pages load more quickly. For example, a website visitor in Bristol will load your site from a London-based CDN node much quicker than a visitor based in Sydney, Australia.

They can also provide a range of benefits including improving site accessibility, reducing website downtime, compressing images and delivering a more stable website to your visitors. Ask a professional if you’d like to do this for your business.

A slow website is more than just a minor annoyance. It directly affects SEO and user experience, increases bounce rate, reduces conversions and harms your brand image. Use the new Google Search Console speed reports and you can identify any issues, find a solution and boost your flow of website traffic again.

All You Need To Know About BERT: Google’s Latest Update

Last Friday, Google announced a major algorithm update that is being referred to by SEO experts as one of the most important updates in the past five years. The BERT update is thought to impact around 1 in 10 search queries and has been rolled out in Google Search over the past week. Below, we are going to discuss everything you need to know about the BERT algorithm update and how it could impact your SEO strategies.

What is the Google BERT update?

Google state that BERT (Bidirectional Encoder Representations from Transformers) is a neural network-based technique for natural language processing.

Essentially, BERT helps Google recognise natural language and understand what words in a sentence mean to the user. This helps search engines better understand the intentions behind queries and provide users with more useful and relevant information.

Why was the update brought in?

According to Google, this update will aid complicated search queries that depend largely on context. Google said in their blog: “These improvements are oriented around improving language understanding, particularly for more natural language/conversational queries, as BERT is able to help search better understand the nuance and context of words in Searches and better match those queries with helpful results. You can search in a way that feels natural for you.” The BERT update means that Google Search has gotten better at understanding more prolonged and more conversational queries, thus improving the overall user experience.

Search Engine Journal provides a useful example of how the BERT update has improved search engine results. They state: “In New England, the word “cow” in the context of fishing means a large striped bass.” They go on to describe how they typed the phrase, “how to catch a cow fishing” in Google and were provided with results related to livestock and cows, despite using the word “fishing” to provide context.

Now, following the BERT update, the same query brought up search results that are related to fishing and bass in particular. They conclude: “The BERT algorithm appeared to have understood the context of the word “fishing” as important and changed the search results to focus on fishing-related web pages.”

How does the update affect SEO?

Since last week, Google has been using the BERT algorithm to display search results. It has been confirmed that the new algorithm will impact around 10% of all search queries entered on Google Search.

This update will also affect Featured Snippets, as Google will select different content from sites to display in its snippets; previously, for many queries the snippets displayed were not that useful. Now, because of BERT, users will see featured snippets that actually answer the posed question.

The BERT algorithm update has not been designed to punish sites, it simply aims to improve how Google understands search queries and user intent. Google has in fact already stated that there is no real way to optimise for the BERT update – Google’s Danny Sullivan tweeted: “There is nothing to optimise for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.”

However, it’s clear that the new update will not favour sites with poorly written content. This means that all businesses can benefit from implementing strategies to improve the quality of their on-site content in order to get a boost in the rankings while avoiding being trampled by the new update. According to Google’s E-A-T guidelines, content creators should be writing for human enjoyment and real people, rather than focusing on writing for search engines.

How can I improve the quality of my content?

If you have noticed a decline in your search engine rankings and organic traffic in the past week or so, then it’s likely that your site has been affected by the BERT update. As mentioned above, there isn’t really a simple answer for optimising in order to avoid a traffic decline. Strictly speaking, there is nothing you can do.

But looking at your content may help. The update is more content-focused so technical improvements to your site are unlikely to improve your rankings. One of the main ways to boost your search engine rankings is by implementing strategies to improve the quality of your content in order to further meet Google’s guidelines.

With that in mind, here are some simple ways to enhance your on-site content and increase organic traffic to your site:

  • Update the content on your site regularly and ensure that it’s well-written, informative, and relevant. Remember to write for real people rather than machines.
  • Work with an SEO consultancy to create effective content strategies. SEO experts can work with you to build brand authority, improve search engine page rankings, and generate increased traffic flow and quality leads to your website.
  • Use a variety of different content on your site and experiment with various marketing techniques to engage as many users as possible.
  • Use natural internal linking within your content to help your website rank higher in search engine results.
  • While BERT is not directly linked to E-A-T, they do vaguely fit together as E-A-T guidelines strongly affect how Google reads your content – therefore, ensure that your site works with these factors.

Final thoughts

It is clear that the BERT algorithm update has had a significant impact on search queries – according to a press release by Google, BERT represents one of the biggest improvements in five years.

Overall, Google’s aim with updates is to enhance the user experience. BERT allows users to find relevant information quicker and use Google Search to find more satisfying results.

Remember, there is no real way for businesses to optimise their websites for the update – the best and most effective strategy you can implement is producing regular high-quality content that has been written to engage and inform your readership, thus improving your search engine rankings.

The Importance of Disavowing Links In 2019

Disavowing links has been a hotly debated topic within SEO circles for many years now.

Many SEO experts are unsure whether they still need to submit disavows for low quality links or whether this practice could actually have a negative impact on their website ranking, as Google suggest.

Regardless of the debate, Google’s algorithms do still use quality of links as a way to rank websites and they do still hand out penalties if the site doesn’t comply with their tight guidelines.

This means that it’s just as important in 2019 to audit links and file a disavow with Google when low-quality links are negatively impacting your website ranking.

In this article, we’re going to look closer at the debate in the SEO industry, remind ourselves what manual actions are, and understand which sites we should still disavow.

Why the debate?

To clear up the debate, we need to travel back in time to those days when Britney Spears shaved her head, ‘to google’ was officially declared a verb and MySpace was the best thing since sliced bread. We’re talking about the early days of the internet as we know it, from the late 90s through the mid-2000s.

During that time, SEO was still in its infancy and people did what they could to get their webpages to rank highly. This involved a lot of spammy practices such as keyword stuffing and hidden text on the website.

PageRank

Google had always set itself ahead of the competition by focussing on link quality, so it created PageRank. The thinking was that if there were lots of links to a website, then it must be of high quality. But people simply added their details to directories, left comments on websites and paid for links, so the algorithm didn’t work as effectively as hoped.

The Penguin algorithm & the disavow tool

When the Penguin algorithm update went live in April 2012, Google could penalise your website for having low quality or spammy links. Linking practices changed completely and many websites found that their traffic sharply declined overnight. The only way to get around this was by going in and removing these links.

To help webmasters to recover and remove these links, Google created the disavow tool. By uploading a text file, you could ask Google to stop looking at the links of your choice and so your web traffic could recover.
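
The disavow file itself is just a plain text file, with one URL or domain per line. A minimal, illustrative example (the domains below are invented) might look like this:

```
# Whole domains we could not get removed by contacting the site owners
domain:spammy-directory.example
domain:link-farm.example

# Individual URLs
http://blog-network.example/page-with-paid-link.html
```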

Penguin 4.0

‘Problem solved!’ you’re probably thinking. But the case isn’t quite so straightforward because in September 2016, Penguin 4.0 was released. This meant that Google would no longer penalise your site for low quality links but instead devalue them.

According to Google, you wouldn’t need to submit a disavow unless you’d been dealt a manual action (more on that below) or you were actively trying to prevent one.

Basically, if you hadn’t suffered from problems arising from your links and you didn’t have cause for concern, there wouldn’t be any need to do anything.

Google’s current position on disavowing links

Over the years since, Google has stated that disavows don’t help websites, but there has always been some confusion as to whether that’s correct.

But in a Google Help Hangout this year, John Mueller from Google stated that disavowing can actually help some websites, especially when there are ‘bad links’ that haven’t resulted in a manual action.

…So, it’s something where our algorithms when we look at it and they see, oh, there are a bunch of really bad links here. Then maybe they’ll be a bit more cautious with regards to the links in general for the website. So, if you clean that up, then the algorithms look at it and say, oh, there’s– there’s kind of– it’s OK. It’s not bad.

To clarify, this means that it’s still worth disavowing your links, even if you haven’t received a manual action yet. Unnatural links can still influence the trust that Google places in your site and affect your ranking.

Having said that, Google still isn’t a fan of allowing us to use the disavow tool and has ensured that it’s hard to find in the Google Search Console. The thinking is that people are disavowing too many links, and according to Gary Illyes of Google, “If you don’t know what you’re doing, you can shoot yourself in the foot.”

What is a manual action?

Much of the debate centres on the idea of a manual action, so it’s worth quickly recapping what that actually is.

As the name suggests, a manual action is taken by a Google staff member and lets you know that Google isn’t happy with some of your content. It tells you that it will omit or demote certain pages of your website from its search results.

The content and pages that can be penalised in this way include:

  • Unnatural links to your site: Just like it says on the tin, this refers to both inbound and outbound links which don’t seem quite right.
  • Thin content: This includes content created purely to promote, auto-generated content and content created by guest posters that don’t add value.
  • Hidden text and keyword stuffing: Keywords and text that appear too often and aren’t always visible to the reader.
  • Cloaking and suspicious redirects: Hidden content and conditional redirects fall into this category.
  • User-generated spam: Spammy blog comments, forum posts and profiles.
  • Spammy freehosts: If your website is hosted with other spammy websites, Google might paint you with the same brush.
  • Spammy structured markup: Does your markup match your site? If not, Google is likely to penalise you.

You can find out if you are subject to a manual action by going to Google Search Console and looking under ‘manual actions’.

Should you disavow your links?

This is a difficult question to answer. It all depends on just how bad those links are and whether they are having a significant impact upon your organic search traffic.

Disavowing should always be done with care, and only once you’ve done everything else you can to remove the link manually.

It’s a serious action to take and as Google say, ‘This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results.’

In essence, if you use the disavow tool incorrectly, you could harm your SEO.

If you discover spammy, artificial, or low-quality links pointing back toward your site or you have received a manual action, you should first reach out to the owner of the website in question and ask for the link to be removed.

Usually this is as simple as visiting the ‘contact us’ page on their website or searching for them on social media. Then you can send them a polite email asking them to remove the link. Only then should you consider disavowing your links.

What about when there’s not a manual action against you?

It gets trickier still if you don’t have a manual action against you, yet you’re still concerned about the quality of your links.

The best way to start is by conducting a link audit that can shed light on the situation. From there, you can go through the links to check their quality.

Don’t worry too much about those random low-quality links that can often appear because Google can happily ignore them. Google will only issue a manual action for low quality links that you or your SEO team are responsible for.

Instead focus on those linking practices which violate Google’s terms.

This includes things like:

  • Paid links or link schemes
  • Paid articles containing links
  • Publishing articles containing links to other sites (often through guest posting)
  • Product reviews and links which offer free products
  • Excessive use of reciprocal linking
  • Widgets that require linking
  • A high number of suspicious anchor texts

It’s also worth considering non-editorial links such as:

  • Malware
  • Cloaked sites (which show Google one set of content but the user a different set)
  • Suspicious 404s
  • Pills, poker and porn

You can often use your common sense to decide how bad a link is and whether to disavow. If there’s a borderline case and it seems like a matter of time before you receive a manual action, it might be worth disavowing anyway.

Generally speaking, if these links only point to pages that you don’t care about, that don’t generate revenue for your business, and they don’t negatively impact your organic search traffic and rankings as a whole, then you might not need to worry. You could simply delete the page altogether and move on.

Conclusion: Disavowing links and the future of Google algorithms

Maintaining high quality links and disavowing those which are low quality remains just as important in 2019 as it ever has.

The continual advancement of algorithms such as Penguin 4.0 can minimise the problem, but there will always be cases which slip through the net and negatively affect your website ranking.

The key remains to focus on creating high quality outbound links on your website, regularly monitoring the quality of incoming links via the Google Search Console or your SEO team and disavowing those which are harming your online presence.

Google SERPs Potential New Layout?

Google makes regular updates to its search algorithms multiple times every month, and the pace seems to increase each year. In 2018, Google stated that they had made over 3,000 improvements to search, compared to 2009, when there were only 350-400 changes reported. Some of these are hardly noticeable, while others have a significant impact on search engine results and SEO rankings, which is why it is vital that any algorithm update Google releases is closely monitored.

Here at Fibre Marketing, we track these search changes and help our clients beat the updates to improve their rankings.

Recently (28th September 2019), it became apparent that Google was testing a new search results page design for desktop – something that caught our attention.

With this particular test, Google has added several additional options to the area at the right and left of the search results that was previously a blank white space. The left side now offers more search filters, while the right side has related search options that allow users to expand their search and look at related search result pages.


Types of searches this will apply to

It appears that the new search results design update will only be available on certain types of searches. This may include searches for:

  • Songs
  • Games
  • News
  • Video

So far, it seems that the new design cannot be replicated and may not apply to searches for movies, books, artists, or bands, according to Barry Schwartz and Adarsh Verma, who reported the test – although this could of course change at any time in the near future.

Impact on search results

As with all Google updates, this has left people questioning what effect the changes will have on search results.

Decrease In Click-Through-Rate?

Firstly, it has the potential to lower click-through rates, as users have more options when searching for answers and information online. This also includes YouTube searches, as videos from the platform are integrated into the results page here, although it is currently unclear whether the video will play in the SERPs or open in a new tab on YouTube itself.

On the other hand, music-sharing platforms such as Spotify will clearly benefit from this change, as they are linked directly underneath the video, above the fold.

Of course, all of the above features will likely see search results themselves pushed further down the SERPs than they already are. Google tailors the design of their results pages to the user in order to optimise their search experience. This means presenting the answers within the SERPs themselves, as shown by this potential design, which will likely result in an increase in zero-click searches.

In June 2019, 49% of all Google searches ended without a click. Google has now become a competitor within a variety of sectors, including hotels, flights, song lyrics, etc., which has left many website owners in a panic as they watch their organic traffic decrease. This potential search design is not likely to help this situation.

A UX-Based Design

The changes will likely result in quicker searches, as users will be able to locate the information they are looking for more efficiently using keywords. Google is constantly looking for ways to improve the user experience, and these changes to the results page design could make search results more UX-based, offering more shortcuts to what users need when they are searching for media.

A Tough Challenge For Organic Search Results

However, this new SERP design will push the organic results further down the page, a continuing trend with every new search update. Over the years, Google has added a staggering number of features to their results pages – 39 overall, according to Paige Hobart’s talk at BrightonSEO – from featured snippets and map packs to knowledge cards. And then, there’s the Ads.

If your site is currently ranking in the top position, this does not necessarily mean that your listing will be above the fold. Therefore, site owners have adapted their strategy over the years to create more user-friendly content which will appear in featured snippets, as well as implementing schema to take up more space in the results. It is not clear from this recent test how featured snippets will show up – beneath the media results, or above.

Regardless of this, if Google does go ahead with the proposed search design, site owners will continue to watch their organic visibility drop.

Final Insights

So far, the only real information we have is that Google has made changes to its search results page design. The full effect of these changes on search results is still unclear, but it is likely to create a more UX-based experience that could decrease organic traffic to your website – although it is too early to say for sure. Currently, it appears that these changes only apply to a limited number of search types including songs and games, and it is unknown whether the changes will apply to other search types at a later date.

All You Need To Know About The Google Update To Reviews Rich Results

Making your business stand out online can be tough with all the competition out there – it’s therefore vital that your listings stand out in search results. Rich results, created by schema markup, have always been one of the best and most popular ways to do this, as they allow prices, dates, star ratings and more to show underneath your meta title. These features can work wonders for your organic performance, as over 80% of local business consumers trust online reviews as much as personal recommendations.

However, on 16th September 2019, Google implemented changes to their reviews rich results policies and procedures, affecting how it shows review rich results and rating stars. The overall aim of this change is to improve rich results for search users as well as addressing the abusive implementations (e.g. ‘self-serving’ reviews) that have occurred over the years.

All website developers should strive to understand these new changes to ensure that their knowledge is up to date and relevant.

What Are Reviews Rich Results?

Reviews rich results are those results which show at the top of the Google search results. They are based on the reviews and ratings of a product, service, or production that have been produced by a well reputed and established website. There are many different types of products and services for which a review can be left, including books, events, guides, local businesses and establishments, software and applications, recipes, and other similar features.

Reviews rich results tend to look similar to the image shown on the right.

The Google Update

The changes that Google have released, at their very simplest, are designed to limit the number of reviews rich search results that can be made; notably, self-serving reviews are no longer allowed.

The schema types that are now eligible for review rich results are:

  • Book
  • Course
  • Creative work season
  • Creative work series
  • Episode
  • Event
  • Game
  • How to
  • Local business
  • Media object
  • Movie
  • Music playlist
  • Music recording
  • Organization
  • Product
  • Recipe
  • Software application

Clarification

Google’s primary goal with the new regulations was to prevent businesses from self-promoting their own material, content, services, and the like – hence, creating ‘self-serving’ reviews. For example, reviews about business A that have also been posted on business A’s website will no longer feature as a reviews rich result; only reviews which have been made by unbiased third party individuals will be considered.

A more detailed case could be a search result featuring the review markup showing something like 5000+ reviews, when realistically they actually have 100. These extra reviews have been generated by the business themselves, and therefore do not count.

This update is therefore intended to protect the integrity of the content and ensure that the results are as relevant and useful to the searcher as possible; biased self-reviews, unsurprisingly, do not meet this requirement.

This algorithmic update was met with confusion amongst the SEO community, and so the team at Google Webmasters updated their blog to clarify the regulations. They stated that, essentially, you can’t review your own local business and then host the review on your own website. See the Twitter discussion here.

But Why Do We Care?

You might be wondering why you need to be worried about these new changes. While they aren’t necessarily all that consequential, they should be considered as they could impact a page’s ability to show its own star rating. This means that everyone should review and analyse the new changes to ensure that their markup meets the regulations, in order to avoid their rich results being dropped.

In addition, for people who make use of these reviews rich results when making a search, the new changes could also be beneficial. The changes are heavily based on the idea of making it easier for people to use the system without having to worry about reviews from biased sources. This means that search users can feel confident in the quality of the reviews that they are finding for the products or services that they are searching for, thus receiving a better user experience from Google.

How should you respond to rich results review display limitations?

Even though this update is only a small change, it may have an indirect impact on a site’s ranking as these reviews can affect organic performance. Therefore, if your website uses review rich results, you should strive to understand the new changes.

Elimination of self-serving reviews

The changes to self-serving reviews are now in place for entities that are a local business, organization, or anything in between. The same will be the case for third party reviews—such as a TripAdvisor review—that are embedded into a business’ website. Already, there have been a large number of cases of sites losing their review rich results, as reported by numerous tools including Mozcast (35.8% rich results, down from 39.2%) and SemRush (47.6%, down from 52%) – statistics dated two days after Google’s announcement.

It should be noted that there will not be any penalties for businesses who still display these self-serving reviews; rather, the case will simply be that the snippet won’t appear in the Google Search results.

Mandatory “name” property in review rich results

In addition to this, Google have also announced changes to the way you name your reviews. In order for a review to be eligible to show as a reviews rich result, it will now need to feature the name of the product or service being reviewed in its markup – failing to do so will mean that the review cannot appear as a reviews rich result.
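
As a rough, hedged sketch of what this looks like in practice (the titles, names and rating below are invented), the reviewed item’s “name” property now has to be present in the markup:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "Book",
    "name": "An Example Book Title"
  },
  "reviewRating": { "@type": "Rating", "ratingValue": "4" },
  "author": { "@type": "Person", "name": "Jane Reader" }
}
</script>
```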

Final Thoughts

If you’re deeply involved with SEO, you’ll know that review rich results are not a ranking factor, so this won’t affect your site’s position directly. But adding this markup can take websites a long way, as it can affect the number of sales and users – information that search engines use when selecting the best results for the search user. So this small change will likely have a significant impact on the SEO community and search results, as we’ve already started to see. It is therefore vital that you get to grips with the latest regulations to ensure that your organic performance isn’t affected a great deal.

Why Your ‘About Us’ Page Is Replacing Your Business Card

Before the digital world took over, business cards were one of the best ways to promote your brand and foster new business. However, with a plethora of online alternatives providing instant information for consumers at the touch of an enter key, the humble business card is slowly vanishing from the enterprise space.

According to data from Internet Live Stats, the number of daily searches on Google is rising exponentially – currently standing at 3.5 billion – and when consumers arrive at your website’s ‘About’ page, you need to be confident that you have followed SEO best practice to ensure maximum conversions.

Designing your website doesn’t have to be a complicated feat, so long as each component is carefully considered, planned and executed within Google’s guidelines. 81% of shoppers will conduct online research before making a purchasing decision, so it’s important that your website represents your business accurately and assures users that your company is the stand out competitor.

When users arrive organically at the website of a business they’ve never heard of before, they will likely conduct a quick online investigation. According to a study led by KoMarketing, 52% of visitors will be looking for an ‘About’ page. Without one, businesses risk being hidden from potential customers.

An ‘About’ page:

  1. Allows customers to get to know you, the basis of a relationship between them and your brand. The better the relationship, the more successful your business will be.
  2. Presents your achievements, experience and credentials, improving your site’s E-A-T (Expertise, Authoritativeness and Trust) and thus helping your rankings.

The problem is, many businesses fail to create ‘About’ pages that effectively convey their message and brand mission to consumers. It is also all too common for companies to overlook the weight of this page entirely, despite evidence from Google Analytics that suggests ‘About’ pages are one of the most frequented webpages.

With this in mind, let’s look at the key features of an ‘About’ page, and what is needed to assure users that your company is one that they can trust.

Tell Us Your Story

When creating content of any form, it is vital that you keep your audience engaged. They need to be interested enough to carry on reading.

This page does not need to be an essay, but it does need to explain where you came from, what your goals are and how you or your business got started. Focus on key elements that have influenced you in some way and how they’ve impacted your company. Your readers don’t need to know how many suppliers you’ve been with or a year-by-year growth timeline – just the parts that emphasise your relevance to your target audience. It is generally advised to cut out any jargon – you’re better off using a simple tone of voice that suits your company ethos and writing in a way that users will actually understand.

Make Your Mission, Values and Visions Clear

This is probably one of the most important features to consider when creating your ‘About’ page. Your mission, values and vision provide an insight into what your business entails – they reflect pretty much every facet that makes up your company.

Is your company environmentally conscious? What do you believe in? These are your values, and they say a lot about your company culture.

Note, your values are not the same as your mission statement. Your mission is why your company exists and what it wants to achieve in the short term. This matters to users as it’s more action orientated and thus suggests what your team are doing to put their values into practice.

Now on to your vision statement. Your vision statement should explain what your business is aiming to achieve in the long run. The simplest way to differentiate this from your mission statement is to think about what goal your mission serves. How will your company change its sector? Where does your company want to sit within your industry or even general society?

E-A-T Factors

If you know anything about SEO, you’ll know that E-A-T is a major ranking factor (if you weren’t aware of this, click here).

There are many reasons why an ‘About’ page can work wonders for E-A-T. You can include your credentials, testimonials and experience within your content to show that you are a reliable business to go to within your field. When talking about experience, don’t forget to mention how long your company has been operating – this will fit in nicely with your story.

To demonstrate your expertise, you’ll also need to include information on any qualifications and awards that you’ve obtained. Have you been mentioned by relevant experts from your industry? Integrate this into your page (and link to the source to make it easier for Google to associate you with them).

By following the factors above, Google is more likely to recognise your business’ website as credible and will favour you when ranking sites. Just remember that the aim of your content is to clearly demonstrate you or your business’ purpose, and how you intend to meet it. This information should always be easily accessible.

Include A Variety Of Features

User experience (UX) is an indirect ranking factor when it comes to websites. Google’s aim is to improve UX as much as possible by presenting high-quality sites to its users.

Visuals are a good way to achieve this, as they break up your text and make the page look less intimidating. They can add value to your content and, if they’re personal, they can add an element of company culture which users often find appealing. They don’t necessarily need to be headshots of your team members – there is a large variety of media you can go for instead, such as images, videos, infographics, and timelines. Don’t forget to optimise images for SEO.

A Quick Recap

There are many key elements that make up a quality ‘About’ page. We’ve explored the most essential features above, but here is a quick checklist for you to take note of:

  • The story of your business
  • Your values, mission and vision statements
  • E-A-T factors such as credentials and experience
  • Different forms of media

By including these different elements, your ‘About’ page will speak volumes and assure users that you and your team are capable of meeting their requirements. This will contribute to your overall visibility and keep customers coming back to you, boosting your rankings and future conversion rates.

Leveraging Schema To Increase Organic Performance

Update:

From 16th September 2019, Google made changes regarding the regulations of Reviews Rich Snippets. Please see here before implementing this schema markup. 

If you’re somewhat involved in SEO, you will have likely heard of phrases such as schema and structured data. You also may have heard that by leveraging your schema or other forms of structured data, you can help improve your site’s rankings and visibility through featured snippets, knowledge graphs and other search features.

But what exactly is schema? Is it essential for your website? And how does it help increase your organic performance?

In 2018, John Mueller stated that while structured data will not give you a ranking boost, Google understands that schema is important for a website as it can help them understand your content more, potentially leading to higher rankings.

Therefore, schema implementation is recommended for any website, especially now that content is one of the most significant factors in SEO. Even if you’ve created quality content that complies with the Quality Guidelines, it may not rank if Google does not understand what exactly is on the webpage.

To understand what schema and structured data can do for your business, follow our guide below.

Schema and Structured Data – what’s the difference?

Schema

Schema, also known as schema.org, is a project formed in 2011 across multiple search engines. It consists of different mark-ups made up of tags that you can add to your site’s HTML to help search engines understand your webpage and the way that page is displayed, thus resulting in better representation in the SERPs.

Essentially, it is these tags that tell search engines whether your information is about a specific place, person, movie, book, etc. These tags are classed as structured data.

Structured Data

When it comes to structured data, it is essentially a way of adding a standard set of values to the text on your webpage that will help search engines understand your content. It’s a bit like talking them through your website section by section so they can make sense of every part of it.

These set values are part of a hierarchy:

  • Itemscope – this simply paves the way for a new item on the webpage.
  • Itemtype – there are many different itemtypes, not just movies, people and places as we mentioned earlier. The broadest type is ‘Thing’, which has four different properties: name, description, URL, image. There are then more specific properties within these broader types, known as Itemprops, the final instalment in the hierarchy.

Here’s an example from the Schema.org webpage to break this down a little more.

So you have a webpage about the movie Avatar which includes details about a ‘Person’, which is a specific ‘Thing’. Say that person is James Cameron (director of Avatar), so that’s a specific type of ‘Person’. You can go even further by saying that the film he directed is Sci-Fi, a specific type of ‘Movie’.

So, by adding these properties into your schema markup with the correct tags, the search engine now knows that this page features a link to the Avatar trailer, a Sci-Fi film directed by James Cameron. See here for how this would look in HTML code.

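A simplified reconstruction of the schema.org getting-started example, using microdata tags, looks roughly like this (the exact mark-up on schema.org may differ slightly):

```html
<div itemscope itemtype="https://schema.org/Movie">
  <h1 itemprop="name">Avatar</h1>
  <div itemprop="director" itemscope itemtype="https://schema.org/Person">
    Director: <span itemprop="name">James Cameron</span>
  </div>
  <span itemprop="genre">Science fiction</span>
  <a href="/movies/avatar-theatrical-trailer.html" itemprop="trailer">Trailer</a>
</div>
```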

And the difference?

To summarise, structured data puts names with values within your HTML code to help ensure that search engines thoroughly understand your content and index the webpage correctly. Schema on the other hand is the overall project that provides the set of standard values and definitions for the tags.

To check that your structured data is validated, you can test using Google’s Structured Data Testing tool.

How do you leverage schema to improve organic search performance?

There are a variety of ways to use schema that can help boost your visibility and presentation in the SERPs. This can be done by adding different features that can help promote your site and company as a whole.

The way that these results are presented was previously labelled ‘rich snippets’, but they are now ‘rich results’, and can take many different forms such as carousels, additional data, star ratings under the meta title, etc.

Here are a few examples of different types of schema that can help boost your visibility.

Location Schema

Schema can work wonders for local SEO, as it allows search engines to understand location-related information for your business, such as addresses, phone numbers, HQ location, events and more.

The problem with location schema is that not many people realise it can help with local SEO; because they have verified GMB listings, they believe they do not need to do anything more. But the reality is that using schema can help feed essential information into your business’ knowledge panel.

Here are some ways you can do this:

  • You can apply Postal Address schema, which adds your local address for your business. Click here to see how.
  • There is also the Local Business schema mark-up which is ideal for brick and mortar businesses. This schema allows you to add opening hours and types of payments that your company offers.
  • The Organization mark-up allows you to specify your logo, social links and contact information.

As an example, the screenshot below shows The Lab Spa using schema markup for their business.
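
For illustration, a minimal Local Business mark-up in JSON-LD might look something like this (every detail below is a placeholder, not a recommendation):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Day Spa",
  "telephone": "+44 117 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Bristol",
    "postalCode": "BS1 1AA",
    "addressCountry": "GB"
  },
  "openingHours": "Mo-Fr 09:00-18:00"
}
</script>
```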

Reviews and Ratings

By using Ratings schema mark-up, star ratings and reviews that your business has received on its website will show in the SERPs. This can be added to certain content types, including:

  • Local businesses
  • Movies
  • Music
  • Recipes
  • Products
  • Books

This schema can be particularly useful for restaurants, hotels and bars, as shown here:
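
A minimal, illustrative sketch of ratings mark-up for a restaurant (the name and figures are invented) is shown below – bearing in mind the self-serving review restrictions covered in the update note at the top of this article:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```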

Events:

This is incredibly beneficial for spreading the word about an event that you have organised. When your events show in the search results, you’re able to direct users to the listing most relevant to them while also taking up more space in the SERPs – streamlining traffic to your site.  As you can see below, the dates and locations of your event (so in this case, performances of Les Miserables) will be listed underneath the meta description and will take you straight to the landing page when clicked.


You can find the vocabulary for this schema mark-up here.
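
As a hedged sketch, a single performance might be marked up along these lines (the name, date and venue fields are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Les Miserables",
  "startDate": "2019-12-14T19:30",
  "location": {
    "@type": "Place",
    "name": "Example Theatre",
    "address": "London"
  }
}
</script>
```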

Product Information:

The Product schema mark-up allows businesses to show more information about the products they offer directly in the SERPs. So, underneath the meta title and the green URL in search results, users can see the product’s price, ratings and if it’s in stock, as seen below.


This particular schema can be beneficial for small businesses who are competing against well-known brands and organisations, such as eBay, Ikea, Amazon, or other large companies that operate within your industry. If your offerings are more price competitive, the searcher will likely come to you.
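
A minimal sketch of Product mark-up with a price, rating and stock status (all values invented) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Headphones",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.4",
    "reviewCount": "89"
  },
  "offers": {
    "@type": "Offer",
    "price": "59.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```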

Breadcrumb:

This mark-up allows users to understand a website’s hierarchy by presenting a series of links in the SERP. It’s a useful schema to apply when the ranking page doesn’t appeal to the user, as you can present a few other pages that you have on offer. To expand on the image below, the blue links under the meta description are known as ‘child pages’ or ‘breadcrumbs’. These results clearly emphasise the ‘Enhance Your Site’s Attributes’ page, but in case that doesn’t interest you, two more pages (such as the ‘Establish Your Business’ page) display below as an alternative option for the user. In doing this, you will increase your chances of users arriving at your website from an array of different search queries.
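
A minimal sketch of breadcrumb mark-up in JSON-LD (the URLs and page names are invented) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Guides",
      "item": "https://www.example.com/guides"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Enhance Your Site's Attributes",
      "item": "https://www.example.com/guides/enhance-your-sites-attributes"
    }
  ]
}
</script>
```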

The schema mark-ups listed above are just a small selection of the possible options available to you and your website. While schema and structured data are not quick fixes for the architecture of a site, they can certainly enhance your chances of gaining featured snippets while improving your organic listings. This way, your website is more likely to be noticed over your competitors, improving your click-through rate and organic traffic. With so many different schema and structured data mark-ups available, they can truly be a benefit to any business looking to improve their organic search visibility.

The Untapped Potential of Google Image Search

When it comes to SEO, most people automatically think of words. Keywords, meta descriptions and links are all important parts of SEO, but one area which is often either overlooked or not given much attention is images.

In fact, images can have a massive effect on your SEO – Google Image Search plays an important role today. As everything becomes more visual, images, infographics and the like are increasingly important and can be a valuable asset in terms of SEO.

Understanding how Google Image Search can help your SEO is important in today’s world. There are a number of things that you can do to tap into the power that Google Image Search can have for you.

Optimising your current images

Before you start anything else, it is important to make sure that your current images are optimised for search engines like Google. Optimisation helps your images appear in the Google Images rankings.

The aim is to have your images come out at the top of the Google Images rankings – where there are many more results per page than in text searches.

  • Edit your file names to something that is relevant to the picture (instead of a camera-generated name such as DSC_052.jpg) – see the example after this list.
  • Edit your alt tags to be keyword-relevant and as short as possible. Alt tags can help with SEO, but their main purpose is to help people who are visually impaired, so keep this in mind when you are choosing the words.
  • Compress your images. The time that a page takes to load is vital for SEO and attracting people. A large image file can take a long time to load, put people off, and, ultimately, bring up your bounce rate. By compressing your images, they will take up less memory, load quicker and improve your SEO.
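
For example, the first two points might look like this in practice (the file name and alt text are invented):

```html
<!-- Descriptive file name plus a short, human-readable alt attribute -->
<img src="/images/flowering-cactus-close-up.jpg"
     alt="Close-up of a flowering cactus"
     width="800" height="600">
```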

Avoid Stock Photos

Stock photos are great because they’re free. But that’s about where their usefulness ends, as far as SEO is concerned anyway.

Stock photography often looks low quality and doesn’t do much for your own specific branding. It can be used sometimes, but definitely not all of the time!

Using your own images can be vital to good image SEO – in a similar way to other content. If you have an image which is unique to your website, it will always link to your website and not to the hundreds of others that have used the same picture. People are also likely to use your image and include a backlink.

With Google Images, the first result on the search results list is usually the original image, so it is worth the effort of getting your own images on there.

Content management involves ensuring that you have the best, most effective images, which can help to attract as much traffic to your website as possible. And this involves using unique images which are specific to your website.

A Reason for your Images

If you are putting an image onto a page, don’t put it on there just for the sake of it. Make sure that it is there for a reason. Just as with any other SEO content marketing strategy, you will only increase your bounce rate if you have irrelevant and unconnected images.

You should also keep this in mind when you are labelling your images. Make sure that you label them in a way that connects them with your content. If, for example, you have a blog about cacti, make sure that you label the image this way and not ‘Mexican landscape’. People who are looking for pictures of landscapes in Mexico might not want to learn about the inner workings of a cactus.

Infographics are an excellent way of attracting relevant traffic to your website and, as long as you are making them yourself, they can be both completely unique to you and informative.

Image Sitemap

Sitemaps are useful to search engines like Google as they can help to tell them what images are present on your website and where. This means that all of the images that you want to be found are found quickly.
WordPress and Yoast automatically add images to your sitemap or you can create your own image sitemap for your website.
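
If you do build your own, an image sitemap entry uses Google’s image extension to the standard sitemap format. A minimal, illustrative example (the URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/cactus-care-guide</loc>
    <image:image>
      <image:loc>https://www.example.com/images/flowering-cactus-close-up.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```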

Understanding the power that images have in terms of SEO is important and, judging by current online trends, it is only likely to become more important. By getting your image SEO right, you can add to the other work that is done to keep your digital marketing efforts effective and see it show not only in an SEO audit but also in the amount of traffic that you are getting to your website.

5 Biggest Observations From Google’s Latest Cheat Sheet

Many companies find that their site’s search rankings are negatively affected following Google’s many algorithm updates. And, seeing how many changes are made throughout the year, it can be frustrating when you can’t see a way to regain any losses.

Fortunately, for the first time Google has released a post giving advice on how companies can take advantage of new core updates to improve their SEO rankings. Here are the key things you should be aware of following Google’s advice on core updates.

1. There is nothing to fix

Google has advised that there is nothing you can do to fix a site if you see a decline in search rankings following an update. “We know those with sites that experience drops will be looking for a fix, and we want to ensure they don’t try to fix the wrong things. Moreover, there might not be anything to fix at all.” They then added – “There’s nothing wrong with pages that may perform less well in a core update.”

Many companies are therefore questioning how they can improve their site’s search rankings and what they should do following core updates. Google said that what has changed is simply how Google assesses the value of content. Companies should therefore focus on producing the highest quality content possible. Google advised – “We suggest focusing on ensuring you’re offering the best content you can. That’s what our algorithms seek to reward.”

2. Content should be written for the user/audience

While Google have reiterated the fact that there is no quick fix following updates, they have given advice on how to improve the quality of your content. Google has released a list of questions in their blog that companies should ask themselves if their site has been negatively affected by the core update. This includes things such as:

  • Does the content provide original information, reporting, research or analysis?
  • Was the content produced well, or does it appear sloppy or hastily produced?
  • Is the content free from spelling or stylistic issues?
  • Does the headline and/or page title provide a descriptive, helpful summary of the content?
  • Does the content provide a substantial, complete or comprehensive description of the topic?

Recent updates prove that writing for the user/audience is the most effective way to write quality content and will be favoured by the Google algorithms. Their advice on assessing content asks questions like “Is this the sort of page you’d want to bookmark, share with a friend, or recommend?” and “Does the content present information in a way that makes you want to trust it?” The emphasis is on the reader and their opinion on the quality and value of the content.

3. Content must focus on E-A-T

To improve search rankings, businesses must adapt their content to meet the new guidelines. One of the simplest ways to produce quality content is by focusing on E-A-T, which stands for Expertise, Authoritativeness and Trustworthiness. Companies can read the search quality raters guidelines to get further advice on how to improve the overall quality and effectiveness of their content. According to Google – “Reading the guidelines may help you assess how your content is doing from an E-A-T perspective and improvements to consider.” They added “If you understand how raters learn to assess good content, that might help you improve your own content. In turn, you might perhaps do better in Search.”

The list of questions that Google published in their blog include a section on E-A-T and what you should ask when writing content for your site.

4. There are no further updates confirmed

The post was not confirmation of another update. Google confirmed their June core update but since then, no other rumoured updates have been confirmed by Google. However, companies should be aware that unannounced updates occur on a regular basis. Google said – “We are constantly making updates to our search algorithms, including smaller core updates.” They added, “we don’t announce all of these because they’re generally not widely noticeable.”

5. Google wants to improve the user experience

It is clear that Google’s ultimate aim in introducing the core updates is to improve the user experience as much as possible. In summary, companies should focus on building high-quality websites that offer users high-quality, trustworthy content which has been produced in line with the search quality raters guidelines and E-A-T.

All You Need to Know About The robots.txt File

If you know a little about SEO then you might have heard of the robots.txt file. It can be a useful tool for anyone who has a website that hopes to attract visitors through search engines. The robots.txt file essentially tells search engines where they can crawl on your website, meaning that more time is spent crawling the useful pages.

The robots.txt file is also known as the “Robots Exclusion Protocol.” Some people put their ‘noindex’ rule inside their robots.txt file, but this is all about to change, as Google has announced that from September 1st, 2019 they will no longer support a noindex directive listed in the robots.txt file.

They are changing their rules to try to keep the eco-system as healthy as possible and to be as well prepared as possible for any future open source releases. This is a complex issue and one which businesses with websites need to respond to. And to enable them to do this, it is important to fully understand what the robots.txt file is and its role in the SEO space.

The robots.txt File in Action

Search engines like Google have web crawlers – known as spiders – which look at millions of websites every day, reading information which helps them to decide which websites they will put at the top of their search results.

If there are pages that you don’t want Google to crawl, you can put a robots.txt file in place. This can be dangerous however, as you can accidentally prevent Google from crawling your entire site. So make sure you pay attention to what you are blocking when it comes to adding these files.
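
As a simple, hypothetical example (the paths are placeholders), a robots.txt file that keeps all crawlers out of two sections of a site might look like this:

```
# Allow all crawlers, but keep them out of internal search results and the staging area
User-agent: *
Disallow: /internal-search/
Disallow: /staging/

Sitemap: https://www.example.com/sitemap.xml
```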

Putting these files in place will also prevent any links on that page from being followed.

It is important to note, however, that although you are blocking the robots, the page can still be indexed by Google. This means that they will still appear in the search results – although without any details. If you don’t want your page to be indexed on Google, you must use the ‘noindex’ function.
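
The ‘noindex’ rule itself is a single tag placed in the head of the page you want kept out of the index, along these lines:

```html
<!-- Placed in the <head> of the page to be kept out of Google's index -->
<meta name="robots" content="noindex">
```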

Why is it Important?

The main reason why robots.txt files are important and useful to a website is that they can stop search engines from wasting their resources on pages that won’t give useful results, leaving extra capacity to crawl the pages displaying the most useful information.

They can also be useful for blocking non-public pages on your website – pages such as log-in pages or a staging version of a page.

They can also be used to prevent multimedia resources, such as images and PDFs, from being crawled.

The robots.txt file can also be useful if you have duplicate pages on your website. Without using the file, you might find that the search engine comes up with duplicate results, which can be harmful to your website’s SEO.

The Pros of Using the robots.txt File

Most crawlers will have a pre-determined number of pages that they can crawl, or at least a certain amount of resource that they can spend on each website. This is why it is important to be able to block certain sections of your website from being crawled, allowing the robots to spend their ‘allowance’ only on sections which are useful.

There are some instances when using robots.txt files can be useful, including:

  • Preventing duplicate content
  • Keeping sections of your website for private use only
  • Preventing certain files on your website from being indexed e.g. images and PDFs
  • Specifying crawl delays to prevent your servers being overloaded

The Cons of Using the robots.txt File

Blocking your page with a robots.txt file won’t stop it from being indexed by the search engines. This means that it won’t actually be removed from the search results. The ‘noindex’ tag is what is important in preventing the search engine from indexing your page, but remember, if you have blocked the page in the robots.txt file, the robots won’t actually see the ‘noindex’ tag, and therefore it won’t be effective.

Another potential issue with the robots.txt file is that if you block the robot, the links on that page can no longer pass value from one page or site to another.

The Consequences

There are a number of consequences for brands of Google’s decision to no longer support robots.txt files with the ‘noindex’ directive.

The main concern is making sure that, if you have previously relied on ‘noindex’ in a robots.txt file, you have an alternative to fall back on.

If you are using ‘noindex’ in a robots.txt file, you should look for alternatives and set these in place before the deadline of 1st September. These include:

  • Noindex in meta tags
  • 404 and 410 HTTP status codes
  • Use password protection to hide a page from search engines
  • Disallow in robots.txt
  • Search Console Remove URL tool

Here at Fibre, we know that the majority (91%) of people who use a search engine use Google, and this is why it is important to stay up to date with any changes in their policies. By being able to understand and act on their directives, you can ensure that your website is found and will continue to be found by search engines as other rules change.