A straight-to-the-point approach to SEO that will teach you the basics you need to know.
Search engines like Google can be the starting point of new marketing ideas and profitable customer-business relationships.
As consumers, when we are looking for a business or organization that can provide solutions to our current pain point, we go to Google. On average, 3.5 billion searches are conducted on this popular search engine each day.
Therefore, if you want potential customers to find you online, you have to invest in search engine optimization (SEO).
When it comes to SEO, many business owners struggle to create and implement a strategy that will help their website rank. Others don't know where to start.
To ensure your website is on the right track, we have created a comprehensive beginner's guide to SEO. In it, we highlight and explain the fundamentals you need to drive your website's search engine rankings.
SEO, short for search engine optimization, stands at the forefront of all digital marketing efforts. For all marketers of B2B companies, it is essential to understand what’s at the core of all SEO efforts and the key components you need to follow to build a reliable and effective SEO strategy.
While there is no single definition, put simply, SEO is the practice of increasing the quantity and quality of web traffic to your business website through organic search results from search engines.
Optimizing your site is an ongoing process that ensures Google and other search engines can index and properly read every webpage on it. Doing so allows sites and pages to rank for desktop, mobile, and voice-activated search queries. The cleaner and more optimized your website and its pages are, the more web traffic, and the higher-quality traffic, your site receives.
For a successful B2B SEO strategy, a website’s content must provide quality information that answers users’ questions and helps them solve their pain points. It is equally important that the material is easy to read, so consumers spend more time on a website. Quality content should always be written with the user in mind and should be structured in such a way that search engines can readily understand what it is about and how it relates to other sections on your site.
For example, if you are a construction business, you do not want Google or other users to think you’re an architecture firm. What you want is to attract visitors who are interested in the products and services you can offer—appealing to your target audience impacts the quality of traffic your site receives and increases the chances that a visitor is interested in becoming a paying customer.
When you identify the audience you want to attract, you’re able to reel them in. Understanding your audience’s profile includes knowing:
In its purest form, SEO is all about providing search engine users with the highest level of value. If you’re aware of what an average customer wants and needs, and you’re able to deliver, your SEO efforts will result in higher levels of success.
In the following chapters, we’ll walk you through all the SEO basics you need to know to get started with search engine optimization.
Delivering value to customers is only one piece of the SEO pie, though it is the most critical piece. You can create the most powerful and meaningful content for your customers, but if search engines are unable to crawl and index your site, you might as well have never produced it, because no one will be able to find it.
More than ever before, users want quality answers to their questions, fast. The quicker your site can answer a user’s pressing questions, the more likely they are to revisit your website later on.
The key to creating content that ranks among the top search results for search engines is understanding the way Google finds, analyzes, and ranks your content.
Content ranks primarily on two factors: relevance and authority.
If the content is not relevant, it has little chance of ranking even if it is authoritative. If your website is not trustworthy, it has a small chance of ranking well on SERPS, no matter how relevant to your topic it is.
To build an efficient SEO-friendly website, it helps to understand the way search engine bots select the webpages featured on SERPs. Google’s algorithm uses over 200 variables to match keyword terms with the content that users are given. HubSpot provides an entertaining way of breaking down these factors.
Every business that aims for an SEO-friendly website must learn certain technicalities to enable them to work alongside Google bots and achieve top SEO results. Without this technical knowledge, it becomes increasingly difficult to compete with other businesses that are adapting.
301 redirect: An HTTP response code that indicates your requested page has permanently moved to a different URL.
302 redirect: An HTTP response code that indicates your requested page has temporarily moved to a different URL.
401 (Unauthorized): A status code used when a user does not have the necessary credentials to access the page.
403 (Forbidden): A status code signaling that a user may not have the required permissions to access a page or may need an account to access it.
404 (Not Found): A status code used when a requested page couldn’t be found but may be available in the future.
503 (Service Unavailable): A server status code that tells Google your site is temporarily down for maintenance and will be back online soon.
Alt-text is the text in HTML code that describes the images on webpages. You can learn how to write practical alt text here.
Anchor text: The words that make up a hyperlink and link to another webpage or document. They typically appear as underlined blue text.
Backlinks: Also known as “inbound links,” these are links from other websites that point back to your site.
Black hat SEO: SEO practices that violate Google’s quality guidelines.
BIMI (Brand Indicators for Message Identification): An email standard that uses brand logos as an indicator of email authenticity and helps consumers avoid fraudulent emails.
Call-to-action (CTA): An image or line of text that persuades a reader to take the desired action.
Canonical tag: A piece of HTML code that lets search engines know that one URL is the original version of another. It is often represented as rel="canonical".
CSS (Cascading Style Sheets): The code that makes a website look a certain way; this includes fonts and colors.
CAPTCHA: An anti-spam tool that helps determine whether a user is human or not.
Content management system (CMS): Software that allows users to publish, edit, and modify webpages within a website.
Conversion rate: A metric that estimates the percentage of consumers who take the desired action. The desired action can differ between campaigns and can range from watching a video to visiting a webpage or buying a product.
Conversion rate optimization (CRO): The process of changing marketing strategies, ads, and websites to increase conversion rates.
Crawler: A program run by search engines that scours the Internet for web content. Crawlers are also known as search engine bots or spiders.
Customer experience: A client’s overall experience with your business, including what they think about your brand.
Data management platform (DMP): A tech platform that is used for collecting and managing data from various sources.
Deep link ratio: The percentage of all the backlinks on your website that point to pages other than your homepage.
Domain name system (DNS): An information system that translates a domain name such as “cnn.com” into IP addresses so browsers can load Internet resources.
Domain authority: A search engine ranking score that predicts how well a website will rank on search engine results pages (SERPs). It was created by Moz, an SEO software company.
DMARC (Domain-based Message Authentication, Reporting, and Conformance): A protocol often used for email validation, policy, and reporting. It aims to prevent email spoofing and fraudulent emails that take advantage of a recipient by contacting them through a forged sender address.
DKIM (DomainKeys Identified Mail): An email authentication method that allows recipients to verify if an email was legitimately sent and authorized using a business’s domain. This authorization takes the form of an email with a digital signature.
FBML (Facebook Markup Language): A markup language that is similar to HTML and allows users to modify their business Facebook profile.
Google cache: An HTML version of a web page that Google creates and stores after it is indexed.
Google Analytics: A web analytics platform developed by Google that allows users to track and report their website's traffic.
Google Images: A Google search feature that retrieves content-based images for a search query.
Google Search Console: A Google service that helps users monitor, maintain, and troubleshoot their website’s appearance in Google search results.
Google Tag Manager: A tool that allows webmasters to manage and arrange marketing tags (snippets of code or tracking pixels) on their website or mobile app without having to modify any code.
A Google platform that evaluates and alerts webmasters of issues with their website and provides solutions to fix them.
HTML (HyperText Markup Language): A standard markup language used to create webpages.
Header tags: Tags meant to define headings in HTML; they range from H1 (the most important) to H6 (the least important).
Hyperlink: HTML code that's used to create a link to another page.
HTTP (HyperText Transfer Protocol): A communications system used to connect to web servers on the Internet or on local networks (intranets). Its primary function is to establish a connection with a server and deliver HTML pages to the user’s browser.
HTTPS: An encrypted version of HTTP.
Keyword research: The process of searching for keywords relevant to your website and determining which of these can yield the highest ROI. These keywords should answer: What are people searching for? How many people are searching for it? Where do they search for this information?
A form of off-page SEO in which your website earns links from other sites that direct readers to your own.
Landing page optimization: The process of improving aspects of a landing page to increase conversions.
Link building: The practice of earning hyperlinks that link from other websites to your own.
Local SEO: An SEO strategy that's used to increase the visibility of local businesses on search engines.
Meta tags: Snippets of text visible in a webpage’s source code that tell search engines what the page is about.
The practice of highlighting the important elements of content on a webpage to help it stand out in SERPS.
A code language that's designed to provide crawlers with information on a website's contents.
NAP (name, address, phone number): Essential business information that should be available online for the public to see. An up-to-date NAP will help organizations rank better in local organic search results.
Natural links: Backlinks that happen organically. They are used to refer to a credible website, a piece of content, or a source.
One-way link building: A link building strategy in which a site gets inbound links from other domains without linking back to them.
On-page SEO: The practice of improving aspects of a webpage to achieve a higher ranking on search results and earn quality website traffic.
Outbound link: A backlink that redirects to a webpage on another site.
Page authority: A score developed by Moz that predicts how well a specific webpage is likely to perform on search engine results pages (SERPs).
Pageview: A view on a page. Google Analytics attaches a tracking code to page views to help SEO specialists identify the webpages that receive the most traffic.
PageRank: A metric used by Google to determine the relevance of a webpage.
Referring domain: When using backlinks, the referring domain is the original domain that people came from before visiting your website.
Schema markup: Pre-defined tags of semantic vocabulary that you can add to your HTML to improve how search engines read your webpage on SERPs.
Search engine optimization (SEO): The strategy of increasing the quality and the quantity of web traffic on your site by improving its rankings on search engines.
The rank of a webpage on an organic search results page (SERP).
The position a webpage ranks for on a SERP.
Search engine results pages (SERPs): Webpages presented to users after they have searched for something online using a search engine.
SPF (Sender Policy Framework): An email authentication method that's used to prevent spammers from sending messages to others using your domain name.
SSL (Secure Sockets Layer): Protocols for establishing authenticated and encrypted links between networked computers.
Topical context: A measure of how topically relevant the backlinks to your website are.
URL (uniform resource locator): The name of a web address.
An analytics tool used by marketers to track the impact of their online efforts, understand their audience’s behavior and measure performance.
User experience (UX): Includes every aspect of a user’s experience with a company and its services. In SEO, UX refers to a user’s experience while navigating a website.
White hat SEO: The use of optimization strategies and techniques that focus on the human experience and follow search engine policies and regulations.
Webspam: Any webpage that’s created to improve search engine rankings by violating Google’s Webmaster Guidelines but doesn’t provide any value to readers.
When you conduct a search query, it is useful to understand the process that gives you the SERPs you receive. Understanding how search engines retrieve the results they display for users will help you to know how you can design content on your page to improve your site’s rankings. With this, your webpages can hopefully show up when someone searches for a service you provide.
A business’s online success largely depends on the construction of an SEO-friendly website. Ranking on the first SERP is important for clicks; 75% of search engine users do not scroll past the first page, ever. So it is pretty evident that if you’re not featured on page one, most users will not know your webpage exists.
The key to SEO success is to facilitate a search engine crawler’s job of scanning your content and categorizing it accordingly. This way, when a prospective customer is looking for information you provide, Google has every reason to rank your webpages in the top results.
Businesses should integrate a variety of different elements to achieve optimum SEO results. We’ve categorized these elements into the four major types of SEO every company should be implementing. These are on-page SEO, off-page SEO, technical SEO, and local SEO, which we’ll be covering in the following chapters.
While countless factors influence web page ranking on Google, some carry more weight than others. To become an SEO powerhouse, consider the following when devising your SEO strategy.
On-page SEO refers to everything a user can see on a webpage. It includes strategies businesses can use to optimize an individual page on their site.
Every element that contributes to on-page SEO should work together so that search engines understand the content of a page, identify if the content is relevant to a search query, consider the site a valuable source worthy of displaying on SERPs, and create a good user experience.
Putting time and effort into developing your on-page SEO brings you closer to higher site traffic and an increase in your overall web presence.
Before publishing a webpage worthy of high rankings, you need to invest some time in keyword research. During this process, a digital marketer looks for sets of keywords that mirror what their site’s target audience is likely to be using.
Keyword research has become an integral step in the content creation process. It often dictates the various topics that new content will cover.
In other words, it is not enough to provide users with the right answers. As websites get better at giving useful information, how a site presents its content becomes a significant tiebreaker for ranking under more competitive keywords.
Remember, the goal of SEO is to get your website’s content indexed by Google and found by your target audience. A whopping 75% of people do not scroll past the first page of a SERP. This fact makes it ever so important that you target a majority of the keywords your target audience is likely to use.
Tools like Moz Keyword Research Tool, Ahrefs Keywords Explorer, and Answer the Public help businesses determine the keywords they want to target for more significant SERP results. These tools do their best to reverse-engineer Google’s algorithm and understand which search results are shown to users.
They work well in helping you develop an understanding of what words people are searching for and how much competition there is for those specific words. These tools will also provide you with other similar words and phrases that people may search for that you can target. If you’re feeling a little lost, tools like Moz Keyword Research Tool are excellent aids for uncovering a searcher's intent and determining the problems they are trying to solve.
When choosing the right keywords from which to build your new content, investing time to discover the ones your direct competitors use can help you build a better SEO-enhanced content strategy. Doing so can work to your advantage if you create content that can take some of their traffic as well as expose potential gaps in their content.
To help in this endeavor, you can opt to use a data management platform (DMP). DMPs like HubSpot, Google BigQuery, and Amazon Redshift help businesses understand their customers' demographic and psychographic information better so that they can draw in their target audiences and persuade them to purchase their products or services.
When it comes to keyword research, DMPs can help businesses combine the information they have about their buyer personas to brainstorm better keywords that would best align with their different personas.
If you’d like more information on performing keyword research, you can consult Moz’s in-depth guide.
If you want to rank higher and see an increase in your web traffic, it is crucial to understand what readers want and why they are searching for it.
Google Analytics and Google Search Console give businesses insight into the user intent behind search queries. Google Search Console helps companies identify the typical search terms that people use to get to their site, as well as those terms that do not drive much web traffic.
Once a person lands on your website, Google Analytics helps businesses identify the content that users engage with the most and how long they engage with it. Keep in mind that the information this platform provides is a generalized overview of all site visitors and cannot be broken down to track specific users.
As a general rule, search queries typically fall within one of four user intents: informational, navigational, transactional, or commercial investigation.
Building webpages that target a specific search intent and use relevant keyword phrases increases their chances of showing up on a search engine results page (SERP).
The Internet is jam-packed with new information every day: an average of 380 new websites are created every minute, and 500 hours of video are uploaded to YouTube every minute. For sites working hard to make their content visible, the quality of the information they produce, in the form of text, blog posts, graphics, infographics, and videos, needs to be top-notch.
When deciding how to rank a webpage against others that address similar topics, Google bots will rank webpages based on the answer to these questions:
How many times does this page use keywords from the search query?
Is this page high quality or low quality?
For a webpage to successfully rank on page one, its content must use the targeted keywords and keyword phrases in a way that respects the natural flow of ideas. Practices like keyword stuffing are prohibited and considered a black hat activity.
To help readers identify the main points within a piece of text, you can choose to highlight, bold, or use italics to emphasize parts of your writing. Use header tags, title tags, and alt text to help Google index your content better.
Considering Google’s top priority of providing users with great content that: 1) serves a purpose and 2) can be of service to someone, everything on your site must fulfill these criteria.
Websites that provide quality content receive a boost in their page rank, and those that create low-quality content (per Google standards) are not rewarded and receive less online visibility.
What criteria does Google use to determine what is an excellent, a good, and a bad piece of content? The answer lies in the Search Quality Evaluator Guidelines, in which the search engine giant outlines the golden principles it uses to evaluate webpages and determine the quality of their information.
These three principles are: beneficial purpose; expertise, authoritativeness, and trustworthiness (E-A-T); and YMYL (Your Money or Your Life).
The beneficial purpose principle emphasizes that every webpage must have a user-intended purpose, which can range from providing information to entertaining others or selling them something. However, a website cannot be created with the sole intention of making money without giving any value to others.
Under this principle, all web content needs to possess the following qualities.
Expertise: The person behind every piece of content must have a certain level of knowledge on the topic at hand. Whether they have the credentials to support their expertise or real-life experience, the information they publish on the Internet must be legitimate and correct.
Authoritativeness: Every webpage and the business behind it should have a level of authority in their field. For this reason, establishing domain authority and earning good, high-quality backlinks is essential. Every company should aim to create content that is interesting and authoritative enough to attract links from other websites.
Trustworthiness: Users should be able to trust that the content, the site, and the business behind it provide correct and accurate information about the topics under discussion.
This principle addresses the type of content that, if misrepresented, can directly affect a user’s health, safety, or financial stability. For this reason, every website that publishes content in one of these industries, such as health, finance, or safety, needs a clear understanding of Google’s quality expectations.
While Google’s quality standards may change slightly over time to adjust to the ever-changing digital landscape, its emphasis on ranking the highest-quality content first will always stay the same.
There is often a fine line between the right and wrong way to write content that is sufficiently long, relevant, engaging, and backed by reliable sources. When brainstorming a content piece, consider whether the information you want to include supports the 2,100-2,400 word count that is the norm for good content today.
We know the importance of targeting keywords on a webpage. When a user searches for a string of keywords, a search engine will retrieve results that are high quality, authoritative, and relevant to the search query.
However, as queries have become more and more complex, merely targeting specific keywords on a blog post or article is not enough. People today are comfortable asking complicated questions, and they want accurate search results.
To stay on top of this growing trend, search engines have become smarter at identifying the connections between different keyword searches. Search engines understand the topical context behind a user’s query, so they can relate one person’s keyword search to those they have had to solve in the past. For example, if you search for “1980’s song whose music video included Michelle Pfeiffer”, Google will retrieve many relevant results for this query using the past search results of other users.
For this reason, today’s websites need to focus on building content around topics, NOT specific keywords. You may be asking: what’s the difference between a keyword and a topic to search engines? A keyword is a word or phrase used to perform a search; a topic is the broader subject matter that many related keywords point to.
To start creating content with topical relevance, you need to identify which topics you want your site to be associated with. For example, if you provide SaaS to other businesses, then you will want to create content that answers any possible question that your customer base is likely to have about that subject matter.
Doing so will generate more relevant web traffic to your site, increase your chances of ranking on the first page for searches related to your target topics, and grow your site’s authority in that area.
To help with this endeavor, you can use data management platforms to merge any data your customers and prospects have provided with data from third parties. Information such as demographics, geographical location, lifestyle choices, and purchase behavior can be used to have a greater understanding of the interests, wants, and needs of your target audience. You can use this knowledge to craft more relevant and persuasive content that can help convert your audience.
The best advice anyone can give you on succeeding at SEO is to focus on the user (your target audience), and not on search engine algorithms.
Algorithms can be reprogrammed, but your target audience will always want one thing: quality information that answers their questions. If you focus your efforts on optimizing your business website for users now, adjusting your SEO strategy to mirror Google algorithm updates later may not be so bad.
Keeping tabs on these changes is very important, not only for your SEO game but also for your site’s reputation. Operating a website that does not follow general SEO guidelines may be a cause for penalties such as lower rankings or manual actions.
In some cases, a site can receive a penalty for over-optimization because Google sees this as a way of trying to cheat the system through the use of black hat tactics. Over-optimizing a site can include adding webspam, creating unnatural links, and keyword stuffing.
Google has taken a focused approach toward addressing these practices by introducing updates like the Penguin and Panda algorithm changes that aim to penalize these sites.
We know Google wants quality websites and quality content, but what does quality mean?
Quality refers to in-depth coverage of a topic. The key to quality is providing real value to users in an informative and entertaining way.
A common way websites measure the quality of their published material is the amount of time people spend on their pages and the number of shares their content receives on social media. You can learn more about building quality content here.
You want high-quality information on your website to give value to your readers, rank higher on search engines, increase the time users spend on your page(s), and lower the bounce rate.
When optimizing your content, your aim should be to answer any questions a user may have about the topic.
For example, if someone searches for “how to start meditating,” you ideally want your webpages to rank in the top search results. You want this person to click on your webpage, read it, take what they need from it, and move on.
What you DO NOT want them to do is click on your site, leave within a few seconds of landing on it, and continue looking through other search results for information. This practice is called “pogo-sticking,” and it is bad news for sites: it lets Google know that the first result (and the others that follow) did not answer the user’s question, which in Google’s eyes is a big thumbs down.
In November of 2011, Google released its freshness algorithm update, which at the time was estimated to impact around 35% of search queries. This meant that if the content your site was publishing fit into any one of the specified categories, and you wanted to maintain your SERP rankings, you needed to make some serious updates.
Updating past articles with new content and images can increase a site’s organic traffic by 106%. From an SEO standpoint, you want that for your website!
Within this category, we can find the latest trending topics that need the most up-to-date information. When you search for a recent event, you will likely see search results that were published a few minutes to a few hours ago. Examples of this category are the Coronavirus pandemic and the state of the world economy.
Regularly occurring events can be anything from yearly sports tournaments to presidential elections –what’s important is that users expect the latest information on these topics.
Most website content is likely to fall within this category, which can include informational blogs ranging from the latest money-saving tips to web design trends.
Updating your content ensures anyone reading it will get the latest information, and lets others know you care about delivering the best and most current information available. Publishing current and relevant content will help bring in more web traffic and is more likely to encourage social media engagement than outdated or inaccurate content.
However, not every piece of content online will require an update. Articles that cover historical events will never need it. The closer a written account is to the historical development, the higher it’s likely to rank when compared to a more recent article written about the same event.
Evergreen content doesn’t require frequent updating because it never stops being relevant. Content of this type is sustainable because it continues to be relevant, regardless of when it was initially published, so web traffic is likely to increase over time.
How-to guides, how-to videos, beginner’s guides, articles that answer common questions, case studies, lists, checklists, statistics, and product reviews are all types of evergreen content.
Examples of B2B evergreen content include articles like: “How to use Google Analytics to track campaigns,” “What is marketing automation, and why do you need it,” “Top 15 SEO words and phrases you need to know”, and “SEO mobile marketing case studies.”
While this content stays relevant over time, it will require updates as information changes and new features are added. Doing so can prevent Google from ranking another business’s evergreen content above your own on SERPS.
When designing an SEO-friendly webpage, you need great content and a proper page structure.
As more websites improve their content marketing and provide their readers with in-depth answers to their pain points, the page structure on a site becomes a significant tiebreaker when ranking for more competitive keywords.
When deciding how to rank different pages for similar keywords, Google bases its search results on the answer to one question: do the keywords in this search query appear in the title, the headline, and the URL of a page?
A good content structure is one that adopts a logical flow of ideas; you achieve this naturally by using headers, subheaders, and other HTML tags to help readers understand the breakdown of your thoughts.
Using the H1 HTML tag in your article headline and including one or more of your targeted keywords will encourage users who just clicked on your page to stay on it.
Headings H2-H6 can and should be used to organize the rest of your content, making it easier for readers to scan the title of each heading and instantly know what each section is about without having to read the text.
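To make this concrete, here is a minimal sketch of how headings might nest on a page (the headline and section names are hypothetical):

<h1>The Beginner's Guide to On-Page SEO</h1>
<h2>Why page structure matters</h2>
<h3>Readers scan before they read</h3>
<h2>How to structure your own content</h2>

Each page should have exactly one H1, with H2-H6 nested in descending order beneath it.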
Accurately citing your sources and including bullet points, lists, and images all work together to create more visually appealing and engaging material for others to read.
Other things you can do to improve your content structure include:
A heading is used to draw a reader’s attention. By adding keywords to headings, you reiterate how relevant your article is to the reader’s search intent.
You can do this by emphasizing them through different formatting, fonts, or font color. Keep in mind that any emphasis you make should follow your brand’s design to look appealing.
The bottom line for the content structure is this: the more engaging your content, the longer a site visitor will likely stay on your website. This is an excellent thing – the average time per session on your site is a key ranking factor for Google.
A landing page is a single webpage meant for a specific marketing or advertising campaign. It is the page you are taken to after clicking on a link in an email, an ad, or a SERP (search engine results page).
As Seth Godin puts it, a landing page can have one of five intentions:
Landing pages are standard in Lead Capture, PPC, and Email Marketing efforts. They are meant to provide a “Welcome” message to visitors who’ve just landed on your site. At their baseline, they aim to create a positive user experience.
In general, Google’s algorithms and SEO strategy prioritize creating excellent user experiences. It is for this reason that using landing pages for SEO makes sense.
You may be asking yourself, “but why would I want to optimize a landing page for SEO?”
The short answer is:
In reality, most of the web’s landing pages do not rank in the first search results on a SERP. The reason is that competing webpages have A LOT higher word counts than most landing pages, and this is how it is supposed to be.
Landing pages serve a specific purpose: to help website visitors convert. They are meant to get visitors to take a particular action, and they do this by using minimal text with a hint of visual appeal. If they were filled with endless text, they wouldn’t do a great job of increasing conversions.
Landing pages created for maximum SEO results need to focus on ranking instead of converting. While we ideally want our SEO-enhanced landing pages to accomplish both, the reality is that it is difficult but not impossible.
Let’s look at how you can tweak an existing landing page on your site for more effective landing page optimization.
Optimize your URL, page title, and headers to reflect the keywords you’re targeting.
Including useful content is a crucial distinction between a conversion and an SEO-focused landing page. To get your landing page to rank, you will need lots of helpful content that can also appeal to other sites for link building.
For example, say you recently published a landing page meant to promote a holiday discount your company is offering on business management software. To optimize this landing page for SEO, you will need to include a case study on how others have improved their business’s overall customer experience using this software. You will also want to earn links from other pages that recommend your software or from other sites that write content related to customer acquisition.
A landing page wouldn’t be one without a good CTA that encourages users to take a specific action. You want Google to rank your page for a search that’s relevant to the product or service your landing page is promoting. Help your target audience by telling them what you would like them to do.
Conversion rate optimization (CRO) refers to the process of increasing the percentage of site visitors who convert. This process involves determining how individuals navigate through your site, which pages pique their interest, and what is stopping them from completing your conversion goals.
Before we get into the finer details, let’s recall that a conversion is the desired action that a business wants prospects to take.
Every business and every marketing campaign will have a different idea of what a conversion is to them. For example, a conversion for a B2B software company could be that visitors to their site sign up for a free demo. Once a certain percentage of these individuals sign up and try the software demo, another conversion goal could be to persuade them to purchase an annual subscription for this software.
These two very different conversion goals follow a specific order: one cannot be achieved without doing the other first. You cannot get people to purchase your software if they know nothing about you or how your software works.
There are two types of conversions.
Macro-conversions, which include primary goals such as purchasing a product or subscribing to a service.
Micro-conversions, such as signing up for a free demo, creating an account, or downloading a piece of content.
Before a macro conversion can take place, a micro-conversion needs to happen, which is the case in our example and many real-life situations.
Conversions are essential because a website receives only a finite amount of web traffic. Smart businesses invest their marketing budgets in making the most of their web traffic by increasing their conversion rates; that way, they can spend less on advertising and receive the same benefits.
A conversion rate is the number of times site visitors complete a conversion goal (e.g., creating an account) divided by your site’s total traffic, expressed as a percentage.
conversion rate = (total number of conversions / total number of sessions) × 100
Let’s use the example of the B2B software business. Let’s imagine its website receives 150,000 different visitors in one month. Within this month, 3,000 different people decided to sign up for a free software demo.
conversion rate = (3,000 / 150,000) × 100
conversion rate = 2%
In this example, the conversion rate comes out to 2%. Considering that the average conversion rate for new B2B websites is 3%, the marketing campaign for the business in our example will likely need some optimization.
Keeping track of your site’s conversion rate will help you determine how well your marketing campaigns, your webpages (including landing pages), and apps (if applicable) are performing.
When your marketing campaigns consistently achieve healthy conversion rates and increase over time, a smart move is to start considering how you can optimize your conversion rates.
To start optimizing your conversion rates, consider which marketing campaigns would benefit the most from optimization. Improving a campaign with a 0.5% conversion rate might not give the same return as improving one with a 1% conversion rate. Experts suggest ranking your campaigns based on their potential, their importance, and the ease with which you could optimize them.
To help you choose which campaigns are worth optimizing, your marketing team can answer questions like:
Here are some things you can do to improve your conversion campaigns:
Include landing pages after blogs
Landing pages serve as a platform for websites to ask site visitors for more information.
Writing compelling copy that engages readers and including a landing page pop up ad is a winning combination. For best results, run split tests to get the most from this combination. You can test website copy, images, form questions, the number of form fields, page design, calls-to-action (on your blog post and landing page), and your content offer.
Use video for greater engagement
A HubSpot study from 2017 determined that 54% of consumers would prefer to see brands use more videos in their content. Video content that’s entertaining, informative, and funny tends to perform well with audiences.
Videos also provide brands with the opportunity to make their value proposition evident without coming off as pushy or salesy – two attributes that often put consumers off. Videos do not have to be lengthy. The average attention span for audiences is 9 seconds. When a video’s script and the visuals it uses, be it animation, client testimonials, or authentic footage, are well put together, 9 seconds is enough.
Using video on a landing page can help prospects learn about your brand, persuade them to buy, and entertain them. Not only that, but including videos on landing pages has the potential to boost your conversion rates by an astounding 80%.
Every website needs some flair to contrast against blocks of text.
Images and videos make the content on a page more exciting and easier on the eyes. They help to retain a visitor’s attention, reinforce the concepts your written content addresses and positively influence your SEO efforts by increasing engagement levels.
Optimizing the images and videos you include on a webpage can do much more than make it look good. When someone searches for a keyword or a topic relevant to an image or video on your page, it has a chance of featuring in the “Images” or “Videos” sections on Google.
While most people may stick to browsing through images, others could be interested in visiting the webpage an image came from. As for videos, someone may find the information in one very helpful or insightful and want to take a look at the website behind it.
The amount of exposure a single image or a video embedded on your webpages receives can vary. However, to build a solid SEO strategy for maximum organic search traffic, there are a couple of things you can do to help make this happen.
This first piece of advice is exclusive to images.
Adding an image or images that are relevant to your content is a no-brainer. Images need to be high quality, meaning that the photo’s resolution is clear and the pixels are not visible. You can find an extensive explanation of the image resolution your pictures need to look their best here.
If you have the resources, most of your images can be captured using your professional equipment. If that’s not within your budget, there are plenty of sources online.
While most pictures you will find on Google Images are subject to copyright, there are sites where you can purchase custom-made images. You can check out these great options: 500px, Shutterstock, and Adobe Stock.
If that is not an option for you either, some websites provide free, high-quality stock photos. These are great stock photo resources you can check out: Pik Wizard, Pixabay, and Unsplash.
Captions create a connection between an image and its accompanying copy text.
Captions are commonly placed under an image because they are meant to guide a reader; they help them discern the image’s connection with text. Not only that, but they also help search engine crawlers understand more about the picture.
Most visitors scan a page before deciding if it is worth reading. Image captions are one of the criteria they use. Quick fact: captions are read 300% more than the body copy itself.
An image’s file name should be a description of the image itself.
Alt-text is an HTML tag that describes the appearance and function of an image on a page. It helps Image SEO because its concise description of the picture makes it easier for crawlers to determine the content of the image.
Make sure the words in the file name are separated by hyphens and not spaces; the alt text itself should read as a natural descriptive phrase.
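For example (both file names below are hypothetical), crawlers can infer far more from a descriptive, hyphenated file name than from a camera default:

<img src="customer-support-agent-illustration.png" alt="Illustration of a customer support agent with headset and laptop">
<img src="IMG_20483.png" alt="">

The first tag gives search engines a readable file name and a natural-language description; the second gives them nothing to work with.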
Schema markup is the language of search engines; it is what produces the rich snippets you see when you search for your favorite chocolate chip cookie recipe. The advantage of adding schema markup to your images and videos is that these rich results can appear above even the number one ranked webpage for a given search!
This handy feature works best for webpages featuring videos, product images, recipes, or movies, or ones presenting information such as a real or fictional person’s profile or the name, address, phone number, and rating of a business.
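As a sketch of what this looks like in practice (the recipe name and values are hypothetical), schema markup is often added as a JSON-LD snippet in a page’s HTML:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Chocolate Chip Cookies",
  "image": "https://example.com/images/chocolate-chip-cookies.jpg",
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.8", "ratingCount": "214" }
}
</script>

Crawlers that read this markup can display the rating, image, and other details as a rich snippet.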
If your website produces videos and you need a bigger audience, you should reconsider where you’re uploading them. Consider uploading your videos on video platforms with a broader audience such as YouTube or Vimeo in addition to featuring them on your site. If big companies like HubSpot and Moz upload theirs on other platforms, why shouldn’t you?
Videos with captions repeatedly have higher engagement rates. They also tend to get about 40% more views, and audiences get an additional stimulus that can help bring home the video’s message.
While captions are an excellent addition to your video, you can also opt for sharing a transcript of it. Some people may not be too keen on listening to a video and reading captions at the same time. But, they may want to refer to its transcript at a later time.
Not only that, but video transcripts help to boost your SEO efforts. The reason: for crawlers, it is easier to understand text and index it accordingly. Google crawlers are software, after all!
We all know that a good ol’ catchy title can be all we need to click on a video and watch it. This is true for videos on your personal YouTube account, and it holds for business videos aimed at individuals seeking further education in your subject of expertise.
Make sure you put together a concise meta description that includes some of the keywords you’re targeting and does an excellent job of summarizing in a sentence the main idea of your video.
When it comes to videos (and YouTube), video thumbnails are everything. We live in a visual world, and if a thumbnail looks fresh, exciting, or can capture the essence of a video’s message, it will receive many views. However, this is mostly true for videos with a more casual tone.
A page’s URL is its web address, that is, where it is found on the Internet. It is where crawlers and users alike go to find what they need to know about your business. The URL also tells browsers how to retrieve your webpage and show it on a user’s computer or mobile screen.
For this reason, every website that wants more search engine real estate needs to structure its URLs properly to ensure Google can access its pages and serve them on SERPs.
The construction of a proper URL can do a lot for your overall SEO strategy.
From the onset, users and search engines get some insight about the content available under the URL.
While URLs themselves are not a ranking factor, the keywords used in a URL are, and this works best when a keyword in the URL is also part of the article’s title.
A well-written URL can serve as its own anchor text, which comes in handy for users.
URLs that follow this structure are more successful in search engines because they provide the pathway Google crawlers need. Every search engine result contains a webpage URL.
(Source: Moz)
The protocol refers to the HyperText Transfer Protocol in use, HTTP or HTTPS.
The subdomain is the third level of the domain hierarchy and is found before the root domain and separated from the domain name with a period. For most websites, the subdomain is the “www” part of a web address.
The domain refers to the unique name of a website. For example, www.theiamarketing.com
The top-level domain (TLD) refers to the suffix that appears after the domain name. Examples include .com, .org, .net, and .edu.
The top-level domain is the highest level of the hierarchy system that is the domain name system (DNS).
The folders and page portions of a URL refer to the location of a webpage within a website.
Let’s take a look at two different examples.
A bad example of URL structure:
https://www.theiamarketing.com/guide/blog/232349
A good example of URL structure:
https://www.theiamarketing.com/guide/blog/creating-the-perfect-blogpost-with-hubspot-crm
The first URL does not provide users with the information they need to predict whether the page they click on can provide the answers they want. The second example, however, shows the pathway that a link follows within the site: the article “Creating the perfect blogpost with HubSpot” is found within the blog section, located in a section called “guide.” We can decipher the name of the blog post because the slug, the string of text at the end of the URL, separates each keyword with hyphens, making it easier for Google to understand the central topic of the post.
Every webpage on your site needs a purpose. Whether it is to give information, entertain, or sell a service, you want its target audience to see it. For this reason, crawlers must discover it. You can help by including terms that describe the page’s subject. More often than not, this incorporates your targeted keywords.
A correctly structured URL will allow both crawlers and users to see a “breakdown” of where the webpage is located within your site. Let’s use this example:
https://www.theiamarketing.com/guide/blog/creating-the-perfect-blogpost-with-hubspot-crm
Using this URL, we know the website we will be accessing is called Theia Marketing. Its blog is located within the “guide” section, and the blog post, found under “blog” is likely about creating a perfect blog post using HubSpot’s CRM. Separating words using a hyphen (-) makes each keyword visible to site visitors.
A well-structured web address allows the user to see a logical flow from domain to category to sub-category to product. The second URL goes from domain to product.
It also gives the user insight into what they’ll find on the page. Coordinating this with your article headline/title will leave no room for doubt about the content’s subject matter.
Keeping URLs relevant and straightforward is the secret to getting both humans and search engines to understand them. Although some URLs do include ID numbers and codes, SEO best practices dictate that it is more useful to use words that people can comprehend.
A word of advice: avoid using uppercase letters in your URL because it can cause issues related to duplicate pages. For example, www.theiamarketing.com/guide/blog and www.theiamarketing.com/guide/Blog.
Duplicate pages are something you want to avoid. When faced with two or more pages with similar content, Google bots have to choose which one to rank over the other. A situation like this can work against your best SEO efforts and may lead to lower ranking for all of your webpage(s).
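One common safeguard, sketched here with a hypothetical URL, is the canonical tag defined in the glossary above: placed in the <head> of a duplicate page, it tells Google which version is the original.

<link rel="canonical" href="https://www.theiamarketing.com/guide/blog">

This consolidates ranking signals on the canonical URL instead of splitting them across duplicates.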
Just as essential as building webpages that look good to users and search engines alike is improving the “behind the scenes” aspects of SEO. These include all of the elements that play an active role in making your page worthy of high SERP rankings and sustainable website traffic.
User experience (UX) refers to how the design of a website shapes a user’s experience while on your site. It includes the person’s subjective feelings about your site, including how easy to use and engaging it is. Good user experience is vital for your site’s readers and directly affects your site’s overall traffic and engagement rates.
Consider your website the backbone of your marketing strategy: 86.6% of small and medium-sized businesses in the U.S. said their website is their most important digital marketing asset.
So, creating a good user experience on your website is essential to all forms of marketing, but it is especially important when using contextual marketing.
Considering that today’s digital marketing is all about the customer, everything about your website needs to be tailored to your target audience.
Your only goal shouldn’t be to rank higher on SERPS and have users visit your website. Your final objective should be to have them stay on your site for as long as possible, get familiar with your brand, and become paying customers.
Questions every SEO professional should ask when deciding how to improve their site’s UX include:
Without a concrete answer to these questions, creating a UX optimization strategy that fits your current needs can be a challenge.
Responsive web design is a web design strategy that allows websites and web pages to display on all devices and screen sizes automatically.
It uses a combination of CSS settings to adjust the style properties according to the screen size, the orientation of the device (such as tilting your smartphone or tablet), and image resolution.
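As a minimal sketch of how this works (the class name and breakpoint are hypothetical), a CSS media query applies different style properties depending on screen width:

<style>
  .sidebar { width: 30%; }
  /* On screens narrower than 768px, let the sidebar span the full width */
  @media (max-width: 768px) {
    .sidebar { width: 100%; }
  }
</style>

Combined with a viewport meta tag such as <meta name="viewport" content="width=device-width, initial-scale=1">, rules like this let a single page adapt to any device.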
Why do you need a responsive web design? The obvious answer is that it makes it easier for users to navigate on your website from any device. It also means your site is mobile-friendly, so visitors will want to spend more time on it, and search engines are inclined to consider it a quality website.
Responsive design aims to avoid the annoying task of resizing or zooming in on a site because it displays awkwardly on a screen. Websites that require these adjustments tend to have high bounce rates and are virtually nonexistent on SERPs.
So how can you tell if your site or any other site is responsive? There are two ways to go about this. The first: visit the same webpage on all your devices and check to see if the images, resolution, and size of all the website’s elements adjust to your screen size.
The other method is to use your browser’s developer tools, which in most browsers let you preview a page at different screen sizes without switching devices.
To build a website for optimal UX power, you have to know what you’re aiming to accomplish. This includes the following:
We have included sections throughout this page that exclusively focus on each of these aspects. All of these elements, when adequately enhanced, work together to provide the best UX for your site visitors.
Meta tags are pieces of text used to describe a page’s content. The “meta” stands for “metadata,” the kind of data these tags provide – data about the data on your page. The information these tags provide is called microdata.
They are essential for SEO purposes because these short content descriptions help search engine crawlers determine the topic of a webpage. This data is used to display snippets of a webpage on search results and for ranking purposes.
Examples of common meta tags include:
<title> and <meta name="description">
Part of using meta tag optimization is to give your content a “leg up” using your page’s HTML code. Because Google bots sift through thousands of webpages at a time, you need to specify which information you want them to crawl and index. They’ll consider this data when ranking all available content that’s relevant to your targeted keywords.
While their purpose is an important one, meta tags are not visible on the webpage itself, only in a page’s source code. To access a page’s source code, right-click on any part of the page and click “View page source.”
The difference between tags you can see (like on a blog post) and those you cannot see is where they are placed. Meta tags, which provide the information called “metadata,” are found in a webpage’s HTML and are only visible to search engines and web crawlers.
There are 20+ tags you can use to enhance your website; however, to build a solid foundation in meta tags, you only really need to know about a few of them. If you’re interested in learning about all the different kinds, you can read about them here.
The meta title tag, <title>, is located in the header and is the most important meta tag of all. Every webpage on your site should have a different title tag that describes the page. It indicates to Google crawlers and users the title of a blog post.
While the meta description’s end goal is to convince the user to click on the link, the title tag acts as the “clickable link” that gives a search result its title.
The meta title is useful when shuffling through multiple open tabs on your web browser; it helps you to identify which tab corresponds to the content on a web page.
The meta description tag, <meta name="description">, summarizes the content on a webpage so that bots know how to index the page, and humans can make an educated decision before clicking on it.
While you should put some time into writing a very brief one- to two-sentence description of the content, keep in mind that Google will sometimes pull up a snippet of it from your page if it considers it more relevant to the search query.
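Put together, the two tags sit in the <head> of a page’s HTML. A minimal sketch (the wording here is hypothetical):

<head>
  <title>The Beginner's Guide to SEO | Theia Marketing</title>
  <meta name="description" content="A straight-to-the-point guide to the SEO fundamentals you need to rank your website.">
</head>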
The meta robots tag, <meta name=”robots”>, tells search engines if and how they should crawl your webpages.
The default configuration of a webpage is to be crawled and indexed by a bot. In this case, a meta robots tag is not necessary. However, if you do not want your page crawled and indexed, use this tag.
Let’s look at a breakdown of the values this tag can take: index or noindex tells search engines whether to show the page in results, while follow or nofollow tells them whether to crawl and credit the links on the page.
Key reminder: the above meta robots tags should be used only when you want to limit the way Google crawls a page.
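For instance, to keep a page out of search results while still letting bots follow its links, a sketch of the tag would be:

<meta name="robots" content="noindex, follow">

Swapping in content="noindex, nofollow" would additionally tell crawlers to ignore every link on the page.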
Alt text, short for alternative text, is a type of tag used in HTML that describes the appearance and function of an image on a webpage.
Images play a pivotal role in the quality level and engagement that a webpage’s content receives. There are various reasons why you need to use it when building your SEO-friendly webpages.
This is especially true when a site is slow to load. The alt text helps a user by providing some context on what an image is before it fully loads.
Google crawlers are created to scan through text-based content but are not programmed to scan an image and determine its subject matter. Therefore, an explicit and concise description of the pictures and graphics on your page helps these bots do their job.
Have you noticed that Google Image Search displays images from hundreds of different sites that are relevant to a search you have just performed? Google Images uses the alt text descriptions from all the websites bots have indexed to retrieve the most relevant images for a search.
Alt text is written into the HTML code of a webpage. In the source code, the alt tag looks like the sketch below (the file name and description are hypothetical):
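<img src="customer-support-agent.png" alt="Illustration of a customer support agent with headset and laptop">

The alt attribute holds the description that crawlers and screen readers rely on.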
Alt text should be specific and descriptive of an image and should also consider its context.
Let’s compare a bad and a good alt text description for the same image.
Bad example: “Illustration of a woman with headset”.
This alt text description is serviceable because it conveys what a user sees. However, it could be more specific – you want Google bots to associate the image with the relevant section or sections of an article.
Good example: “Illustration of a customer support agent with headset and laptop”.
The alt text description above explains the context of the image in more detail.
Many of today’s top businesses have grown and extended their customer base by using social media. That said, if you want a greater outreach, make it easy for anyone who reads your content online to share it with others on social networks.
Here are two things you can do to make this happen.
1. Add social sharing buttons
While the itch to include social sharing buttons for every major social network is strong, resist it. You do not want to give your audience too many options and risk your content not gaining momentum on any specific network.
Most marketers opt to include social sharing buttons to the networks they know their customer base frequents.
For example, if you write for an engineering firm, your target audience is likely present on platforms like LinkedIn, Facebook, or Twitter. Strategically placing sharing buttons for those networks on your webpage will work better than adding one for Instagram.
2. Create short and descriptive URLs
The goal is to create a URL that makes it easy for anyone to know what the webpage is addressing.
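To illustrate, compare these two hypothetical URLs for the same page:
https://www.example.com/p?id=83792
https://www.example.com/blog/seo-basics
The second one tells both users and search engines exactly what the page addresses; the first tells them nothing.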
Domain authority is a metric developed by the SEO software company, Moz. Businesses should be paying attention to their site’s domain authority because it plays a vital role in the overall SEO success of any website.
Domain authority is not a Google ranking factor and will not impact a site’s ranking position on a SERP. However, it predicts the likelihood that a website will rank based on how well it has been optimized for SEO and the number of backlinks it has.
It is measured on a scale from 0 to 100, where 100 means a site is most likely to rank on the first SERPs.
Generally speaking, a website can raise its authority on the web in several ways.
Multiple factors influence which sites are considered authoritative in their industry. Ranking among the first ten spots of a Google search is one of them.
How do search engines determine which websites have authority? While Google itself does not use the term “domain authority,” part of creating quality content is having that content be authoritative.
The number and quality of backlinks your webpages earn is Google’s most reliable indication of authority, and they directly impact a page’s ranking.
Indeed, Google wants to give users search results that are authoritative and relevant to what they are looking for. Therefore, your content should match specific search queries particularly well.
While a Moz domain authority score will not impact your SERP ranking, it will give you an insight into the probabilities of ranking high in that particular moment. If your score is lacking and you are not ranking as well as you’d like on Google, you can do the following:
Establishing your website’s authority may take time, but it is the key to succeeding at SEO and reaping its many benefits.
Let’s look at an example of how an organization can start building content authority.
A business named 3D Designs is known for the printed prosthetics it delivers to children who need them. These prosthetics are only one of many things the company 3D prints. It wants to establish its authority on 3D printing and increase its search engine rankings. It can achieve these goals by creating as much content as it can on 3D printing, comparing that content with its direct competitors’, and targeting a set of keywords in every piece of written content. It can also ask clients to write about the business and link to its site.
When search engine bots proceed to crawl and index their website, there will be no room for doubt that 3D Designs specializes in 3D printing, in prosthetic 3D printing, and is considered an authority in those topics by others.
Much like domain authority, page authority is a score created by the SEO software company Moz.
Unlike domain authority, page authority predicts how well a specific webpage could rank on the first search engine results pages. Its score also ranges from 0 to 100, where 100 is most likely to rank and 0 the least likely.
The lower your initial page authority score is (say, 20-30), the easier it can be to raise your score. However, pages with higher scores (70-80) may find it more challenging to improve due to the higher quality of competitor websites.
Domain and page authorities are based on information from Moz’s web index and are measured using machine learning that compares a website’s or web page's performance against thousands of SERPs. Moz does not consider factors like keyword use or content optimization when calculating page authority, but it does evaluate the number of links that point to a specific site or webpage.
Much like with domain authority, the best thing an SEO professional can do is to increase the number of quality links that redirect to their webpages.
Local SEO, a subcategory of search engine optimization, is a strategy that helps local businesses show up in local search results. Considering that 46% of all Google searches are looking for local business information, every company with a physical location would do well to invest in local SEO.
It also consists of optimizing a business’s online presence to attract more traffic from local search results.
When performing a Google search for “bakery near me,” you will get two types of results: organic search results and “snack pack” results.
The organic results are those we aim to rank high by implementing SEO strategies.
The “snack pack” search results, also called the Google 3-pack, consists of the top three business listings around a user’s current location that match their keyword search. This is what you want to optimize if you want your local business to land on the coveted three-pack.
Since an estimated 92% of users will choose to purchase from a company located on the first page (first three spots) of local search results, you too should want your business featured there.
Mobile use is inherently connected with local searches because users looking for local information will almost always reach for a mobile device.
As Google continues to weight mobile heavily in local search rankings, it becomes just as important to optimize your business’s online presence for stronger Local SEO results.
There are multiple things a local business can do to increase its chances of ranking among the top three local search results.
If you run an engineering firm, this includes brainstorming all of the search queries someone could use on a search engine when looking for an engineering firm – for example, “engineering firm near me” or “civil engineering firm in Denver.”
If you have a hard time brainstorming the different combinations of keyword searches your customer base could use, Moz Keyword Explorer and Ahref’s Keyword Explorer are good options. They are also great allies for analyzing the keywords your direct competitors are using to rank.
Using Google’s autocomplete feature can also surface more insightful keyword options.
By claiming and completing your Google My Business account, you make sure Google has the most up-to-date information on your business. It is also pivotal to greater local traffic because visitors will have a website and social media profiles they can use to contact you.
In addition to updating your business information, keeping your Google My Business account active – for example, by posting updates, adding photos, and responding to reviews – will help ensure local search results include your business.
Claiming your business citations on every major directory listing for your business is smart. Making sure each profile features your latest NAP (name, address, and phone number) is even smarter.
Correct citations across these listings are a top ranking factor for Local SEO because a consistent NAP helps Google crawlers verify a business exists and corroborates the information gathered about it from across the Internet.
On top of this, consistent listing information helps potential customers know of your business and can help drive your sales and revenue.
Remember that link building is one of the most significant factors influencing a website’s rank on SERPs. This rule also applies to Local SEO.
For a local business, building links across the web that redirect back to your site can help up your Local SEO game. Three different situations can apply for a local business:
No links: for small and hyper-local businesses, keeping your Google My Business profile up to date, maintaining business citations, and optimizing your page may be enough.
Location-specific links: credible links from sources in your area, such as local news outlets, non-profits, or a fundraiser you have organized.
Industry-specific links: links from sites relevant to your industry (including any guest blogging) are good options.
Staying active on social media lets local businesses stay in contact with their potential customers. Social media platforms come with the bonus that their customer base can learn about special offers, events, or news related to the organization.
While on-page SEO emphasizes all of the elements that go into constructing a good SEO website that users can enjoy navigating, off-page SEO focuses on all of the efforts a business makes to build backlinks and establish their authority from other sources.
It works to strengthen and influence the relationships your website builds with other sites across the web, and it includes strategies a business can apply to help search engines view its website as a reliable, trusted source worthy of ranking on the first SERP.
Throughout this section, you will notice that most of this content relates to building a volume of high-quality links.
Per Moz, link building is the process of acquiring hyperlinks from other websites to your own.
As we know, a hyperlink is a link identified by its URL (a web address). Links are how a user gets from one webpage to the next.
The solution to a successful SEO strategy is a delicate balance: you need great content, a crawlable site, and links that point back to you.
When a Google crawler scans a webpage and sees a link, it will “crawl” onto that link and scan it as well. This means crawlers will follow the links on your website, and they will follow the links on the other websites you refer to. The more often crawlers arrive at a site or an individual webpage because other sites point to it, the more they will treat that page as authoritative and relevant to its subject matter.
However, creating great content on your website is not enough to rank high on search engines. The truth is that Google cares how interesting and helpful other people think your content is – and in SEO, that means a higher number of links pointing to your website.
Link building was not always as it is today, so how did it get to be this way?
When Google was first introduced, the popular search engines of the time, like Yahoo and AltaVista, ranked their search results based on the content of a page. Webpages needed links from other sites, but it did not matter whether a linking website was “high-quality” or “low-quality”; all links carried the same weight.
Google became a major player because it changed the way links were evaluated. It scanned the content on webpages and looked at how many other pages linked back to them. This was the beginning of the PageRank algorithm, which treated a link as a “vote of confidence” that one webpage gave to another.
Once websites caught on, many of them started linking to other sites and receiving links. Many sites considered link building as a way to repay a favor, and others sold links to domains willing to pay for them.
With its Penguin update, Google cracked down on manipulative link building and required sites to focus on the quality, not just the quantity, of their links.
Link building is all about links. We previously looked at a breakdown of the elements that comprise a URL.
In this section, we will classify the different parts of a link. When a link is placed in the HTML code of a webpage, it looks like this:
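(A representative sketch; the URL and text are placeholders.)
<a href="https://www.example.com/seo-basics">SEO basics guide</a>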
Let’s look at each element individually.
The <a portion opens the link tag (which is why it is called an anchor tag) and tells search engines that a link follows.
The </a> portion closes the link tag and lets search engines know the link ends there.
Href is short for hypertext reference. The URL within the quotation marks is where the reader will be redirected once they click on the hyperlink. While an href usually points to a webpage, it can also point to an image, a PDF, or a file to download.
The anchor text is the short stretch of text you see on a page for a clickable link. This text is usually underlined and blue.
Using what we already know about the inner workings of crawlers scanning one page to the next, we can conclude that search engines do two things with the links they read on a webpage: they follow them to discover new pages, and they use them to help judge how those pages should rank.
With the knowledge gained from these webpages, crawlers then index the information. Every search query serves a user the most relevant results from Google’s index, not from the live Internet.
When it comes to link building, Google weighs questions like these to determine how to rank a webpage: Who links to it? How trustworthy are those sites? How relevant is the linking content?
The terms link building and backlinks are often used when referring to the efforts a website makes to have other websites refer back to it.
To avoid confusion, it is essential to establish the difference between the two terms.
Link building is a process: it is the work of acquiring hyperlinks from other websites to your own. A backlink is the result of that work: a link on another website that points back to yours.
A hyperlink (commonly known as a link) lets users navigate between webpages on the same site or across different sites. Search engines use links to crawl the Internet; they crawl the hyperlinks between the pages on your website, and they crawl the links between entire websites.
We know that link building is a process in which Website 1 works to get Websites 2, 3, and 4 to link back to it.
There are many things that sites can do to have other websites link to them.
In this post-Penguin age, every effort to get more links to your site should aim at creating natural links – backlinks that happen organically when other sites link to your content because they think it is useful and adds value to their readers.
It is here where we can refer to link building as link earning. When done right, link building and link earning are the same things.
Before the Penguin algorithm update, Google faced issues with websites that traded and sold links. Many of these links led to low-quality pages and did not provide any value to readers, but were still rewarded with high rankings because these practices were “technically allowed.”
After Penguin, Google began prioritizing high-quality websites that earn links from other quality sites.
The premise of link building is creating content that other sites in your industry want to endorse because it 1) provides value to their readers and 2) serves as a valid resource that backs up their claims.
How does Google know which sites fulfill these requirements? It uses a site’s link profile to determine this.
A link profile is an overview of all the inbound links your website has earned. One of the things it measures is the diversity of your linking domains.
Link diversity tells Google whether a variety of sites keep linking back to the same URL or whether just a handful of domains link back hundreds of times. The latter case could raise suspicions of spammy linking.
You’re going to want a diversity of websites linking back to you. Why? Because building links from the same repertoire of sites will have less impact on the quality of your link building over time. This is why it is crucial to build good relationships with other websites in your industry.
Google rewards webpages that link to trusted sources and earn links from other authoritative pages.
When a website writes its content and mentions a business or organization, it will likely link to a specific page on the organization’s website that addresses the same topic. However, there are times when a writer will instead link to the organization’s homepage.
While this minute detail seems harmless, it should be avoided because it makes it difficult for individual webpages to attain high rankings. A single page cannot generate much link equity on its own; it depends on external and internal links to boost its SERP rank.
To help with this issue, make sure you’re highlighting any articles or blog posts that are not getting enough web traffic through internal linking.
Backlinks are not created equal – those that come from high-quality websites tend to influence a webpage’s rank position more than those from low-quality sites.
Deep-link ratio is the percentage of your backlinks that point to pages other than your homepage. For example, if 80 of your 100 backlinks point to inner pages, your deep-link ratio is 80%. It is another factor that distinguishes backlink value: Google prefers when sites link to inner pages rather than a homepage, the logic being that links are a kind of endorsement that a particular page is essential. Links to a homepage do not endorse a page; they endorse a website.
Curious to learn which sites use inbound links to refer back to yours? Use Moz’s Link Explorer to find out and start building a healthy link profile.
Now that we know the importance of linking on a webpage and the positive effect of high-quality links on a page’s SEO ranking, the question to ask is: How many links do you need to rank on page one?
The reality is that there is no set number.
However, there is something you can do to find out how many links a page needs to be competitive: look at the pages featured on the first SERP. Having roughly as many links as they do gives you a substantial baseline to target.
So how can you go about this?
1. Search Google for a few topics relevant to your business and pull up the top-ranking pages. These topics should be broad enough to have many long-tail variations that users can search – for example, playground manufacturing.
2. Determine the average number of linking domains that point to these webpages.
Open each page and run it through a link checking tool. Moz Link Explorer, for example, will tell you how many linking domains point to a URL. Once you know the average number of domains linking to those first-page results, you have a substantial baseline: to compete with them, your content will need to earn roughly that many links.
To rank within the top 10 search results for the keywords “top leadership skills in the workplace,” for instance, you would have to earn an average of 106 links to your webpage. The number of links you need to rank on the first SERP depends on how competitive the search is – which is why it’s important to choose less competitive keywords to rank for.
Now that you know how many links you need on your website to compete with other first page contenders, you may be wondering, How do I know which content should contain all these links?
Again, you will have to look at the first ten results on the SERP for this answer.
Remember that if you want your webpages to rank at the top, you need to emulate these pages. This is not a free pass to plagiarize or copy anything from a website, but you can use it as inspiration.
As you’re looking through these pages, ask yourself these questions:
Before starting from scratch, assess how well your existing content already reflects the answers to these questions. If you see similarities between your content and the top 10 search results, that is the piece of content you want to optimize with more backlinks.
If you haven’t published any content that resembles the top search results, you can use the material you already have to build on.
One half of earning links for your webpages is optimizing content so that others naturally want to link to it.
The other half depends on the relationships you build with other sites that can and will want to link to your website.
To get a popular blog in your niche to link to your site, you will need a relationship with the people running it. The extent of your relationships with those in your industry can vary; what matters is that they know about your business and the content you produce, and that both parties can relate their industry interests where appropriate.
Some link building techniques may require more time and resources, primarily if your website aims to earn links from large or well-known corporations within your industry. Small sites may particularly struggle with this, but once they are successful, the publicity that may come from this exposure will help cement their authority online.
Strategies like guest blogging or featuring on another site’s resource page are not wasted effort. However, partnering with websites that have at least an average domain authority will make those efforts worthwhile. Moz’s Link Explorer tool will help you determine whether a partnership is a first step in the right direction.
The quickest way to establish a useful and mutually beneficial relationship with other websites that will lead to more backlinks is by delivering some value to them first.
For example, if you’re interested in building a relationship with a particular blogger, you could reach out to them, saying you’re a long-time admirer of their blog. Tell them you would like to collaborate with their brand and are wondering if they are looking for a guest post your marketing team can write on a specific topic. Uncover their personal and business goals and create a compelling case that demonstrates the value of a partnership with your brand.
To get started on building relationships with other players in your field, ask yourself these questions:
The more meaningful relationships you can build with folks who can link to you from high-quality websites, the more effectively you will scale your link building efforts over time.
As digital marketers, we know that Google collects information for SERPs through crawlers that scan webpages and visit the links within them. Links that point to another part of the same website are called internal links; they tell crawlers which topics relate to each other. Links from other websites that point to your site are called backlinks.
A backlink is a link (also called a hyperlink) that’s created when Website 1 creates an external link that takes users to Website 2. When this happens, we say that Website 1 is the referring domain that people came from when visiting Website 2. On a webpage, backlinks are placed within a text, on an image, or as a button.
The more pages that link back to the same site, the more authority Google considers it to have. This usually happens when the site is seen as a thought leader whose published content is insightful and valuable to readers. While it takes time for a website to earn that reputation, building quality content and quality links is vital.
Backlinks are essential for successful SEO because they are solid proof that a website is trustworthy. If many sites treat your content as reliable, search engines have every reason to place you in the top search results – after all, they want to deliver great content to their users.
After the Penguin update, earning links became the standard. It always should have been, but practices like spamming comment sections and forums with links, or submitting them to directory listings, were common. Now Google heavily penalizes these techniques by keeping offending sites out of the top SERPs.
We know that link building and backlinks are a must. Let’s take a look at the different types of links a single webpage can have.
Inbound links are also known as editorial links. An inbound link is one that comes from another site to your own. When we discuss how necessary it is to earn “links,” we’re talking about these guys.
As we know, inbound links are essential; they tell search engine crawlers that a webpage contains valuable information.
Therefore, the more inbound links your content receives from sites with healthy domain authority scores, the higher your chances of improving your overall SEO.
For example, Wilson & Company, a civil engineering company based in Colorado, has 256,700 inbound links from other websites, including The University of New Mexico and The American Planning Association.
Outreach is a link building strategy common among small businesses, newly created websites, and freelance writers. It consists of contacting bloggers, influencers, and website owners and asking them to link to your site. Approach them by offering value before asking for a link referral.
An internal link is a link to another page on the same website.
Internal links are essential to your SEO strategy because, like inbound links, they help build up the authority of your pages. The difference between inbound and internal links is that internals are 100% within a site’s control.
To optimize how you use internal links, make sure each link’s anchor text makes the topic of the destination page evident, as in the sketch below. This best practice can lead to an excellent boost in rankings.
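A quick hypothetical comparison:
Weak: <a href="/blog/page-speed">Click here</a> to learn more.
Better: Learn how to <a href="/blog/page-speed">improve your page speed</a>.
The second version tells crawlers (and users) what the destination page is about before anyone clicks.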
An outbound link is a link on your site that directs visitors to another site. Since outbound links act as endorsements of one site by another, they can help build goodwill among sites, which may help you earn an inbound link of your own later down the line.
A one-way backlink is a link building strategy in which Website 1 receives inbound links from other sites (Websites 2, 3, and 4) without having to link back to them.
PageRank is a system Google uses to count the number of “link votes” a website receives. The result determines which pages are considered “authoritative” and which are not.
A no-follow link will not count toward a page’s PageRank score and will not help nor hinder the page’s SERP rankings.
You can create a no-follow link in your page’s HTML code by inserting this no-follow tag:
<a href="http://www.anexample.com/" rel="nofollow">Link Text</a>
Why would a site not want bots to consider a link that could increase their PageRank?
A common reason: the website has noticed that a webpage’s comment section is being spammed with links to other sites. Instead of erasing every spammy comment, the webmaster can add a no-follow tag to the page’s code and avoid possible penalties for link spamming.
In contrast, a follow link is the opposite of no-follow. Follow links do not have a tag because the default action is for a bot to crawl a webpage; it does not need instructions to do this.
Quality backlinks should have a unique set of traits that set them apart from low-quality links.
There is a difference between an editorial and an acquired link. An editorial link is earned because other sites genuinely want to share the information on a page or because it can help them prove a point on theirs. Acquired links are purchased through link advertisements, directory links, or comments on a blog or forum.
Websites shouldn’t be linking to low-quality sites. Aim to earn links from sites with good reputations.
As time progresses, your website’s quality and the content you publish on it should improve. Simultaneously, so should the number of links that direct to your site.
Undertake efforts to ensure your website earns links from trustworthy sites with higher domain authority, not from low-quality sites. Not doing so could undermine your SEO efforts.
Ideally, you want the links you earn to relate to the topics your site addresses. If your website is known for event planning, you won't want to receive links from domains that focus on extreme sports.
A word on anchor text: Google bots get a sense of the endorsed page’s topic(s) from the anchor text used on the referring domain. When creating a hyperlink, make sure its anchor text is relevant to the subject of the page it points to.
You should aim to earn links from relevant sites that can bring qualified traffic to your site.
Your link profile says a lot about what other sites in your industry think of the content you produce. Ideally, it includes a natural mix of follow and no-follow links.
Off-page SEO is all about extending your SEO efforts to other places where your website’s sphere of influence can grow. Most importantly, it is about other sites recognizing how helpful and insightful the information you publish is.
It would be fantastic if getting any site to endorse yours through link building were easy. The reality is that many websites do not want to risk their hard-earned web traffic and online reputations by associating with sites that may not have much to offer their readers.
To make your link building experience less cumbersome, we will address three of the biggest challenges all websites face when trying to improve their Link Profile.
Most sites struggle with these issues because they are out of their control.
As the owner, webmaster, or writer of a website optimizing its content for higher search rankings, you have no oversight of how others respond to your outreach efforts. Simply put, when you reach out for link building, you have no say in whether a site links to yours.
Let’s look at how we can approach these issues.
In SEO, the sites with more links pointing back to them win – though keep in mind that the quality of these links matters as much as the quantity. You can earn more links by:
Spending more time on keyword research, building topic clusters, comparing your performance to top-ranking websites, interviewing professionals in your field, and investing in educating your writers is the first step to creating quality content.
Once you have created more great content, do not be afraid to promote it anywhere it is relevant. Actively share it with your friends, family, and industry contacts on social media. Join online conversations that are related to your topics. Create a webinar or an opt-in offer others can download when they become email subscribers.
If customers can purchase a product or a service from your website, reach out to well-known influencers in your industry and encourage them to test it. Ask them to write a blog post or record a video with their sincere impressions and share it with their audience.
We have already discussed the importance of creating relationships with other industry sites. When reaching out, volunteer a couple of your links that they can easily relate to their content. Make suggestions as to how your brands can collaborate and build a mutually beneficial partnership.
Link building is not something you can accomplish in a day, so cultivate persistence, determination, and patience. Keep in mind that shortcuts like buying or trading links violate Google’s guidelines and can result in severe penalties for your site.
The answer is to strike a delicate balance between guiding someone and pushing them to do what you want. You do not want to appear “pushy” – these websites are already doing you a favor by linking to your site. Try one of these methods:
Do not be afraid to be explicit and friendly about the things you would like. Gently let them know where you would want your link placed on their site. Give them a sound justification for what you’re saying – you do not want to come off as ungrateful for your editorial link.
Asking an industry favorite to feature in your content is a HUGE compliment. It lets them know you value their opinion and want your audience to learn how they are making an impact. If they agree to your offer, they will want their audience to know about it and will link to your site.
Nothing is more embarrassing for a business than other people pointing out inconsistencies in their content after asking them to share it. This demeans the hard work that’s gone into writing content. To avoid this, double-check your sources.
Recall that anchor text is the short stretch of text that marks a clickable link on a page, usually underlined and blue.
The emphasis on the words used in an anchor text goes back to the Penguin algorithm update that highlighted Google’s concern for keyword use. While an anchor text is small, you want to make sure its text accurately describes the topics discussed in the source page.
Finding who links to your website is possible with tools like Google Search Console, Moz, and Ahrefs Site Explorer. Once you know the exact links, you can check the anchor texts that lead to your webpages. Encouraging others to use better anchor text comes down to educating them on SEO best practices; however, do not worry too much about poorly written anchor text unless a site entirely irrelevant to your niche is linking to you.
When determining how to rank webpages, Google bots consider the following: is this page spammy?
In the past, websites used spammy comments on forums and blog posts to add outbound links that led back to their sites. In response to this SEO malpractice and black hat technique, Google crawlers became smarter at telling “natural links” apart from spammy ones.
To help with this task, SEO powerhouse Moz created the spam score, a metric meant to assess spammy links.
A spam score measures the percentage of websites that have been penalized or banned by Google for spammy malpractice. To measure the spam score, Moz uses a machine learning model that has pinpointed 27 popular features among the millions of sites that have met an ill fate in Google’s hands.
Knowing your site’s – or any site’s – spam score is vital for SEO. Why? There are two reasons:
If there are sites that consistently link back to you, but they do not appear to be linking to any relevant material on your website, this can be a warning of spammy activity.
These two pieces of information are critical for when you’re choosing backlinks. They give you a better idea if you could face penalization for spam for the backlinks or sites you’re linking to.
Spam scores are measured as a percentage from 1-100% and are color-coded: a score of 1-30% is considered low and shows as green (safe), 31-60% is medium and shows as yellow (okay), and 61-100% is high and shows as red (at risk).
If your website scores within the 61-100% range, it does not necessarily mean your site is spammy. It could be a forewarning that you should investigate the quality and relevance of your backlinks.
Spammy websites that link to yours are a real threat for many businesses. An association with a low-quality site, or one with a bad reputation on the Internet, will hurt a site’s spam score, domain authority, page ranking, and link profile.
You can check the spam score for your site’s homepage or any other webpage using Moz Link Explorer.
The Moz Pro tool also attaches a visual representation of a webpage's backlink profile Spam Score, which can help you determine the number of linking domains within each Spam Score range.
In simple terms, you get a breakdown of the spam score for the webpages that link to your page.
Once you know your site’s spam score, you shouldn’t worry if it falls within the low or medium range scores as long as Google has never flagged you for spam or any other black hat technique. However, you should look into the quality of the inbound links to your site and determine if you need to investigate or even remove them.
The same rules apply when checking other sites’ spam scores. A low-to-medium spam score does not make a website spammy; it merely indicates that the risk of spam exists. However, if such a website routinely links back to your pages, you should investigate its content and its relevance in linking to your site.
Moz’s spam score comes with a breakdown of the criteria used to calculate the result. Any highlighted criteria you see will be problem areas you can address to avoid any future penalties from Google.
Technical SEO refers to all of the behind-the-scenes aspects that help dictate how well a website and its webpages fare on search engines.
Technical SEO focuses on addressing the non-content elements of a website.
It includes strategies to improve the backend structure and foundation of a website, as well as to enhance a site’s readability, making it easy for search engines to crawl and understand a site. It also focuses on providing a good user experience (UX), which search engines value because it means that a domain anticipates users’ needs.
On-page SEO focuses on the visible elements of a webpage. These include content quality, the images and multimedia used to complement this content, and a user’s experience when navigating the site.
In contrast, off-page SEO centers on a website’s efforts to earn quality links from external sites. Link building is a requirement for a successful SEO strategy because it signals to search engines that other sites consider the information trustworthy and reliable.
Page speed is the measurement of how fast a page’s content loads. Page load speed can impact search engine optimization, engagement levels, and conversion rates.
The speed at which a webpage loads in a web browser is critical to the overall user experience. If a visitor has to wait longer for your page to load than they usually wait for other pages, they could lose interest and leave.
After running extensive A/B testing on its site, Walmart determined that every extra second its pages took to load translated into a 7% decrease in conversion rates – for the eCommerce giant, a 7% decrease in sales. On top of this, a 500-millisecond delay in page load was equivalent to a 20% decrease in site traffic. It is pretty apparent what was happening: users got impatient and exited the site. This is further proof of how critical it is to measure, test, and optimize a page’s loading speed.
Websites that load in five seconds or less see users’ sessions on their website increase by 70% when compared to those sites which load after 19 seconds. This is a HUGE contrast that can be the difference between someone leaving a website or becoming a prospect and later converting into a paying customer. Every SEO specialist wants to avoid this common pitfall. It is for this reason that so much time and resources are employed to keep websites in optimum shape.
A laggy webpage will frustrate the user, hinder the quality of their experience, and discourage them from making a purchase.
Page speed also plays an important role in Google’s ranking algorithm. It has been a signal for desktop search rankings since 2010. In July 2018, Google rolled out its Speed Update, which made page speed a ranking factor for mobile searches as well.
Since 2015, search queries performed on mobile devices have outnumbered those done on desktops. A study conducted by Backlinko analyzing 5 million desktop and mobile web pages found that the average webpage takes 87.84% longer to load on a mobile device than on its desktop counterpart.
We can describe page load time as the time it takes for a page’s content to load fully, and time to first byte as the time it takes a web browser to receive the first byte of a page’s information.
Mobile page speed best practices consider less than 1.3 seconds ideal. However, in this study, the average speed for mobile devices was 2.6 seconds.
A second study from 2018 evaluated the page speed of 11 million mobile ads’ landing pages from 213 countries. The verdict: the longer a landing page took to load, the higher the bounce rate.
(Source: Think with Google)
Similarly, the number of elements on a page – text, headings, titles, images, and multimedia – contributes to the total size of a webpage (its page weight in bytes) and correlates directly with conversion rates.
As the number of elements on a page climbs from around 400 to 6,000, the probability that a prospect converts drops by 95%. Whatever the exact numbers, as consumers we can agree on one thing: most of us choose to purchase something because of the quality or utility of what we’re buying, not because of a flashy ad. Do your best to provide high-quality content and multimedia to engage readers, while also keeping page speed in mind.
Improving page speed on desktops has been a priority for Google since 2010.
We know page speed is vital for users to stay on our website longer, for lower bounce rates and higher conversion rates, but what can we do to improve it?
Luckily, there are a lot of tools out there that promise to measure the average page speed for websites. However, not all of the options you will see online may deliver accurate results.
Among those we recommend is Google’s PageSpeed Insights, which gauges a page’s load speed by assigning it a speed score. You just plug in a URL, and the tool assigns a rating from 0 to 100 along with tips for fixing any issues currently slowing the page down.
While it is a great tool, PageSpeed Insights’ biggest drawback is that it assesses only one URL at a time.
If you’re looking for an overall assessment of your whole site’s speed, Google Search Console offers a Speed report that shows which pages load slowly and why.
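If you would rather automate these checks, PageSpeed Insights also exposes an API. At the time of writing, a request looks roughly like this (the target URL is a placeholder, and the exact parameters may change):
https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://www.example.com&strategy=mobile
The JSON response includes the same 0-100 score, so you can track several pages on a schedule instead of testing them one by one.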
Now that you know how to check your site’s page speed, let’s dive into some strategies you can use to increase it:
A critical factor in page speed is the total weight of a webpage’s files. Too many elements on any one page make it hard for browsers to load everything within the ideal 1.3-second window, and large text, image, and video files are no different.
Images heavier than about 70 KB or saved at resolutions above 72 dpi will inevitably slow your website down. To solve this, sites like TinyPNG and Squoosh.app offer free file compression.
If you’re looking for a more structured option and currently use HubSpot’s CRM, its CompressHub integration may be a good option for you. CompressHub compresses new and existing images on your HubSpot website, improves your page speed, and you don’t have to worry about losing image quality.
An easy and efficient way to optimize your website code is by eliminating any extra spaces, commas, and other characters that will not affect how information is displayed.
Your CSS and HTML files should be consolidated to keep them from becoming bulky and dragging your website’s performance down. You can do this by combining external CSS files into a single stylesheet.
You should also group CSS selectors that share the same instructions, which shrinks your stylesheet and speeds up loading. A sketch of this follows.
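As a simple illustration, two selectors with identical instructions:
h1 { color: #333333; margin-bottom: 16px; }
h2 { color: #333333; margin-bottom: 16px; }
...can be grouped into one rule that does the same work in fewer bytes:
h1, h2 { color: #333333; margin-bottom: 16px; }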
Web browsers cache a lot of information (images, data filled into forms, JavaScript files, and stylesheets) from previously visited pages so that repeat visits load faster. If your site does not set expiration dates for this cached information, browsers cannot take full advantage of it; tools like GTmetrix let you check and set those expiration dates.
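How you set expiration dates depends on your server. On Apache, for example, the mod_expires module handles it (the time windows below are illustrative, not recommendations):
ExpiresActive On
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
Longer windows mean returning visitors re-download less, at the cost of a slower rollout of updated files.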
The time it takes for a server to respond to a page request depends on many factors. These include the amount of traffic a site receives, the resources each page uses, the software the server uses, and the web hosting provider used.
You can improve your server’s response time by examining your website’s query performance and fixing any memory errors it may have.
A content delivery network (CDN) is a set of servers distributed across many geographic locations that work together so a website’s content loads in users’ web browsers faster.
CDNs work by serving your content from the server closest to each user. By using a content delivery network, your site’s page speed can improve dramatically: even if thousands of users request access to your site at the same time, the distributed network of servers can deliver the content quickly.
HubSpot CMS’s CDN gives site visitors a better user experience by hosting web servers around the world so that sites load quickly and efficiently. With HubSpot’s CDN working for you, your website’s page load time can stay well below the Internet’s average.
While these solutions can alleviate the majority of page speed issues, some websites may have other underlying problems; in those cases, your best bet is to have a developer take a look.
In January of 2018, Google made clear on its Webmaster Central Blog that it prioritizes mobile device use in its crawling and indexing process:
“Starting today, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.”
In July 2019 it also published the following:
“Mobile-first indexing is enabled by default for all new websites (new to the web or previously unknown to Google Search).”
What this means for all websites is that Google’s crawling and indexing bots are prioritizing mobile websites above desktop sites. If you do not have a mobile-friendly site, then you will likely see a hit in your SERP rankings.
The key takeaway from this: it is critical, now more than ever, to create a website that’s accessible from any major device.
As of 2019, 63% of Google’s organic search traffic in the US comes from mobile devices. Creating a mobile-friendly website is no longer a matter of “staying ahead of the game”; a business that wants to stay in the game must integrate mobile-savvy elements into its site. By comparison, a few years ago, mobile devices accounted for as little as 34% of Google searches.
For this reason, Google updates its algorithm to serve users fine-tuned results that accurately reflect what they want, need, and how they search for information. If your website does not work well on a mobile device, people will leave it.
Fortunately, most content management systems (CMS), like HubSpot and WordPress, offer website builders that let businesses preview what their mobile sites will look like before publishing, using a What You See Is What You Get (WYSIWYG) builder. In addition, Google Webmaster Tools provides valuable tools to make sure your site can keep up with mobile-centric changes.
To earn higher rankings in organic mobile search results, optimizing your site for mobile search is vital.
Mobile optimization is all about enhancing your site’s design, site structure, and page speed to make navigating on your mobile site a pleasant experience.
For Local SEO mobile optimization, combine the best practices to feature on Google My Business three-pack results and the optimization strategies outlined below.
Mobile devices make it easy for users to scroll through a page endlessly. To help users choose where to stop and read, avoid these mobile site pitfalls:
The Flash plugin is not available on mobile devices (and it will not be available on desktops either come December 2020). Use HTML5 to create special effects that anyone can see on any device.
Popups, in general, are frustrating to close when you just want to read an article. On a mobile device, they can be enough to make someone exit the website altogether – a dangerous combination that contributes to a higher bounce rate.
Mobile devices require users to use their fingertips to navigate from one item to the next. When designing your mobile site, avoid adding buttons that may be too big or too small.
Landing pages and forms on desktops are much bigger and provide more space to work with. On a mobile device, users have less space to fill in forms and view images. To adjust to smaller screens, resize images, trim the amount of content you include, and adjust the code on landing pages so form fields can magnify and let users input their information.
The SEO elements of a webpage include title tags, H1-H6 headings, image alt text, your URL, and meta descriptions – all of which may need to be readjusted to fit and look well on the smaller screen space of mobile devices.
Schema markup is used to make your site’s search results more engaging with additional images and bits of information that users may want to know about their search query topic before even clicking on a search result. Because mobile devices have a smaller screen, using the right type(s) of schema markups for your site is a surefire way to get more organic traffic.
When optimizing your website for greater mobile-friendliness, consider integrating voice search capabilities. About 20% of all searches on a mobile device are voice-activated, and with the widespread use of assistants like Siri, Cortana, Alexa, and Google Home, using more concise language in your content and schema markup can increase your site’s chances of ranking.
The speed at which a site loads across any device is crucial to getting users to stay. To improve your mobile page speed, you will want to reduce redirects to other websites, improve your browser caching, and make changes to your site’s coding.
As of December 2019, 92% of the world’s internet search queries are done through Google. This is why we will focus on the search query process from Google’s standpoint.
Before a user conducts a Google search, crawlers (software used to discover publicly available web pages) have already searched through the web for pages and followed their links. They gather data from every website they visit, bring it back to Google’s servers, and index this information.
As a business, you want crawlers to see your website, analyze it, and retrieve every relevant piece of information about it. Sometimes, however, we overlook vital elements that can negatively affect this process – such as how links are tagged. Compare a followed link with a nofollowed one:
Followed:
<a href="https://ahrefs.com">blue text</a>
Nofollowed:
<a href="https://ahrefs.com" rel="nofollow">blue text</a>
Add a nofollow tag only when you do not want Google crawlers to count a link as they collect information. Nofollow links are useful when you do not want to pass your domain authority to another website or when you do not endorse the other site.
A common case for keeping a crawler from following a link is paid links: passing PageRank through paid links is strictly prohibited and considered black hat SEO, so any paid link should carry the nofollow attribute.
Overlooking any of these elements cuts off Google’s natural crawl process. Once crawlers finish collecting data, it is indexed.
A quick word on robots.txt
There are three situations in which a Google bot may interact with a robots.txt file: if the bot cannot find one, it proceeds to crawl the site; if it finds one, it usually abides by the file’s directives while crawling; and if it encounters an error trying to access the file, it will not crawl the site at all.
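A robots.txt file lives at the root of a site and needs only a few lines; here is a minimal sketch (the /admin/ directory is hypothetical):
User-agent: *
Disallow: /admin/
This tells every bot it may crawl the whole site except the /admin/ directory.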
Once a website is crawled, Google’s servers use signals like keywords, website freshness, and website authority to index the information in its Caffeine index. Pages are added to a virtual library from which Google extracts URLs to present as search results.
Google processes and stores the information found on the Internet in an index, which is a massive database with all of the content they’ve discovered and consider worthy of showing to search users.
The index contains billions of webpages that Google must sort through within milliseconds of a search query. Each indexed webpage is stored as a cached page (an HTML version of the page that Google creates and stores after indexing) that Google can access in milliseconds when assembling the results pages that fit a search query.
So, when a user searches the keywords “SEO marketing agency,” Google’s ranking system (which consists of many algorithms) considers factors such as the meaning of the query, the relevance and quality of matching pages, and the user’s location, language, and device to deliver the best and most relevant results.
These factors are essential to the ranking order we see in SERPs because they are meant to give the user personalized, quality results. For this reason, if you have a clear idea of the information you are after, use specific words in your search to reflect it; this makes it easier for the search engine to retrieve the results you want.
When someone performs a search, Google thoroughly searches its index for highly relevant content and then orders that content by search engine ranking (the position a webpage holds on a SERP), aiming to provide a solution to the search query. In general, anyone who uses Google search can assume that the higher a website ranks on a SERP, the more relevant that site is to the search query.
It is possible to block search engine crawlers from scanning a part or all of your site through the use of the robots.txt tag or by instructing search engines not to index individual pages. For the most part, however, websites want their content found, so it needs to be accessible to crawlers and should be indexable.
A website’s architecture focuses on how the pages on a site are arranged, how its structure can help users find the information they need, and on assisting businesses to drive conversions. It is all about building a website that’s intuitive and works for you and not against you in retaining an audience’s attention.
It also refers to structuring your site so that Google can find and index your webpages easily, and so that your pages are recognized as high-quality – which means any other sites linking back to you benefit as well.
Let’s take a look at the top priorities for site architecture:
Your site architecture influences a user’s experience, as most visitors will use a navigation menu to find what they are looking for. You want to make it easy for them to find things quickly, so they stay on your website longer and come back another time.
When you create your navigation, keep the coding simple. HTML and CSS are your safest approach. Coding in JavaScript, Flash, and AJAX will limit a crawler’s ability to scan your site’s well-thought-out navigation.
Make it easy for people to know where to go. Your navigation menu should be simple, easy to locate, and clear about the options available. If someone is looking for the subscription prices of software, they should find an option labeled “Pricing.” Likewise, provide an easy navigation path back to other menu options. A sketch of a crawler-friendly menu follows.
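A plain HTML navigation menu that crawlers can read might look like this (the labels and paths are placeholders):
<nav>
  <ul>
    <li><a href="/services">Services</a></li>
    <li><a href="/pricing">Pricing</a></li>
    <li><a href="/contact">Contact</a></li>
  </ul>
</nav>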
Even if each of your menu options leads to 100 pages, each webpage should be no more than four clicks away from the homepage – remember, you want to make it easy for anyone to navigate on your site.
The structure of a URL should make it easy for both humans and search engines to determine the topic of an individual webpage. It is just as important to arrange your website into subdirectories so that each page is at most four webpages away from your homepage, making it easier for users and crawlers to locate a page quickly. This is particularly helpful for websites that have hundreds of different pages.
Consider this URL:
https://www.theiamarketing.com/guide/blog/marketing/creating-the-perfect-blogpost-with-hubspot-crm
From its structure, we can see that the blog post titled “Creating the perfect blogpost with HubSpot CRM” sits within four clicks of the Theia Marketing homepage.
For websites that juggle lots of different menu options, consider using a footer site map. We’re all for making your site’s navigation intuitive, and users and search engines alike are familiar with footer site maps.
While site architecture will not make or break your current Google rankings, it is one of the first things you should consider. It influences the average time a visitor stays on your website, and if they return at all, and this does affect your site’s traffic and its overall SEO success.
(Source: Jetpack)
Schema markup, also known as structured data, is the language search engines use to understand your content and how you organize it. You can learn more about the technical side of schema markup at Schema.org.
Schema markup uses a semantic vocabulary that helps search engines break down a webpage’s code. At its core, schema markup is a type of code that provides information to search engines so that they can crawl, organize, and show your content and identify any other crucial information on a site.
In return, the search engine can provide the end-user with rich snippets of information. They may help them make better sense of their search results.
We are all familiar with rich snippets: the reviews, star ratings, product images with prices, and business location details we often see in our search results. The great thing is that the content featured in these snippets can appear above the #1 organic search result even if its corresponding link does not rank #1. It pays to know how to rank your content not only for standard search but also for newer organic search features like rich snippets.
There are many types of schema markup, mostly because users perform hundreds of different kinds of search queries.
The organization schema defines the most valuable information about a business or organization so that users can gain a quick glimpse of it. This includes the organization’s name, location information, phone number, logo, star ratings, and social media profiles.
The person schema markup type provides information about a known individual such as a celebrity, politician, or entrepreneur.
Information provided by this schema type includes their full name, birthday, education level, family relationships, nationality, and physical characteristics such as height.
For local businesses investing their time in enhancing their local SEO, reputation management, and directory optimization efforts, the local business schema markup is essential.
It is beneficial for local companies or a company’s regional branches. The information it provides includes location, opening hours, and contact information.
The product and offer schema types are used to sell specific items or services. Both provide information about a product, such as its name and price. When in use, they give potential customers insight into the necessary information about a product and give businesses a leg up over direct competitors that may not have product schema.
The breadcrumb schema shows users the path of links that leads from the source domain to the webpage they are viewing. It helps users see a breakdown of the website and can reduce bounce rates.
The article schema, most commonly used for news articles and blog posts, helps search engines understand key information, including the article’s title, the time it was published, and a featured image.
The video schema markup helps Google crawlers index the videos on your website, and it helps your videos appear under video search results.
The event schema provides additional information for scheduled events, such as concerts, webinars, and workshops, which may include their location, date, and price. Its purpose is to encourage interested people to attend these events.
The recipe schema shows a rich snippet of a recipe, including one or more images, a star rating, and a meta description of the page.
The review schema markup type gives users information regarding the star rating of a product or service and provides a direct link to its reviews.
Implementing schema markup into your SEO strategy is a sure-fire way to engage search engine users with your website’s information before they decide to visit it.
While its use does not directly improve your site’s search result rankings, it is considered a best practice because it makes it easier for search engines to find and show your content on SERPs.
With structured data, you can “tell” a search engine which tidbits of information you would like included within the schema markup type of your choice. For example, if your business is hosting an in-person event, you can choose the information you want an event schema markup to display. Anyone who searches for your event or a keyword relevant to it will likely retrieve the specific information you have pinpointed in the schema code.
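To illustrate, a hedged sketch of an event schema added to a page as JSON-LD might look like the following; every value below is hypothetical:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Spring Marketing Workshop",
  "startDate": "2021-05-20T09:00",
  "location": {
    "@type": "Place",
    "name": "Example Conference Center",
    "address": "123 Main St, Springfield"
  },
  "offers": {
    "@type": "Offer",
    "price": "25.00",
    "priceCurrency": "USD"
  }
}
</script>

Search engines that read this block can surface the event’s name, date, venue, and ticket price directly in a rich result.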
Getting started with schema markup can be a hassle for those of us who are not tech-proficient because it needs to be added manually, especially if your website is well built out and has many webpages. Google Search’s guide to understanding how structured data works is an excellent resource for implementing schema markup on your webpages in a way that surfaces the information you most want search engines to see.
Everyone has seen an http:// or an https:// in the address bar of a browser at least once. Do you know what they are and why they are there?
HTTP stands for HyperText Transfer Protocol, the communications system in charge of transferring information from a website to a user’s web browser. It is used to connect to web servers on the Internet or on local networks (intranets), and its main function is to establish a connection with a server and deliver HTML pages to the user’s browser.
HTTPS, characterized by an added “s,” is the secure and encrypted version of this protocol, which means that any data you enter on a website is shared and stored safely. HTTPS also assures Google that the sensitive information found on the site is safe for indexing. However, you will not always see the https:// in your browser’s address bar; often, it is replaced by a padlock.
Having this transfer protocol on your website is a significant factor for Google’s search algorithms. Given that Google’s priority is to provide users with secure and relevant content, websites need to assure their visitors that any information they enter on the site will be safe from unwanted interception.
The little “s” in https means the connection between a web server and a web browser is secure. That security has traditionally been provided by SSL, which stands for Secure Sockets Layer.
SSL is a standard security technology that ensures the information passed between a web server and a web browser stays private. Up until 2018, this was the latest security measure for web browsers; since then, a newer standard known as TLS, or Transport Layer Security, has taken its place.
In reality, the differences between SSL and TLS are minor and are not apparent to the everyday user. The way web security works in the exchange of sensitive information will appear to be the same. The website-building provider of your choice will offer an SSL certificate with TLS updates.
However, a tech-oriented professional may be interested in learning the minute differences between the two. For more on this, you can consult here.
Think of the last online purchase you made. Before submitting your payment information, you would have had to create an account on the website, choose your items, and enter your personal information. All of this data is important to you and the online retailer. It also means something to a hacker waiting to intercept an unsecured website and steal it.
How does a hacker do this?
It can happen in several ways, but a standard course of action goes like this: A hacker places a small, undetected listening program on a web server, and that program waits in the background until a visitor types information on the site. Once this happens, the program activates and starts capturing information and then sends it back to the hacker.
However, when you interact with a website encrypted with SSL/TLS, your web browser will form a connection with the webserver, inspect the SSL/TLS certificate, and bind your web browser and server together. This binding ensures the connection is secure so that no one aside from you and the website can see or access any information you input.
This connection is instantaneous. You only have to visit a website with an SSL/TLS certificate, and your connection and data will automatically be secured.
Web browsers have made it very easy for users to tell a secure SSL/TLS certified website from an insecure one. Let’s look at a couple of ways you can know:
Clicking on the padlock icon will give you more information about the website and the company that provided the certificate.
A website with a visible https:// and a padlock icon could still have an expired certificate. Double-checking for a valid certificate is only a 30-second effort that can ensure your personal and business information is safe.
Luckily, the vast majority of websites built using a website builder or an integrated hosting platform come preconfigured with a security certificate, so your site already has HTTPS implemented.
If you’re unsure whether your site or any other website is secure with the latest protection, you can use HubSpot’s SSL tool to find out.
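If you’re comfortable with a command line, one common way to inspect a certificate directly is with OpenSSL (assuming it is installed); this one-liner prints the certificate’s validity dates for a placeholder domain:

echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null | openssl x509 -noout -dates

The notBefore and notAfter lines in the output tell you when the certificate became valid and when it expires.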
How to get an SSL/TLS certification for your website
Because the SSL/TLS certificate is the latest in internet browser security, every website should have one. For most industries, a standard SSL/TLS certificate is enough. However, if you host content on different domains or subdomains, you may need additional coverage, such as a wildcard or multi-domain certificate.
Most web hosting providers offer an SSL certificate as part of their package.
The finance and insurance industries, among others, may require additional security certifications due to the sensitivity of the information they manage. If your website falls within one of these categories or you know your site needs an additional certification, contact an IT professional for more details about specific requirements within your industry.
Protecting your domain is just as important as building a reputable website with a good domain score. A straightforward way to protect your domain is to avoid any black hat activities on your website—for example, building backlinks from sites with a high spam score.
Implementing certain DNS text records such as SPF, DKIM, DMARC, and BIMI is a simple yet effective way to protect your email domain from spoofing. A spoofing attack takes place when a human or a malware program successfully impersonates an individual or a business to gain access to sensitive information.
These authentication protocols review an email as soon as it is sent to prove that a legitimate business is sending it.
Any business that exchanges commercial or transactional emails with clients and providers would benefit from using email authentication standards like SPF, DKIM, DMARC, and BIMI. These standards help organizations verify that the emails others receive in their name are legitimate.
For these email security protocols to work, a business’s email domain administrator must enable them in the DNS (the information system that translates a domain name such as “cnn.com” into IP addresses so browsers can load Internet resources) through the use of DNS records. Once they have been enabled, anyone who receives an email from this business can verify whether it was legitimately sent from the business’s domain.
Let’s break down these email security standards:
The Sender Policy Framework (SPF) functions as an email authentication method used to prevent spammers from sending messages to others using your domain.
It provides a baseline for email servers to verify that any incoming mail claiming to come from your domain was sent by an authorized server. To enable SPF for your email domain, you create a DNS text record listing the mail servers and IP addresses that are allowed to send email on your domain’s behalf.
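As an illustration, an SPF record is a single DNS TXT record on your domain. A hypothetical record authorizing Google’s mail servers plus one additional sending IP might read:

example.com.  IN  TXT  "v=spf1 include:_spf.google.com ip4:203.0.113.10 -all"

The -all at the end tells receiving servers to reject mail from any server not on the list.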
DomainKeys Identified Mail (DKIM) is another email authentication method that allows the recipient of an email to verify that it was sent and authorized by the domain’s owner. This authorization is completed by giving the email a digital signature, which is found in the “header” of a message.
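A DKIM record is likewise published as a DNS TXT record, under a selector subdomain. A truncated, hypothetical example:

selector1._domainkey.example.com.  IN  TXT  "v=DKIM1; k=rsa; p=MIGfMA0GCSq...truncated public key..."

The p= value is the public key that receiving servers use to verify the digital signature in each message header.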
Both SPF and DKIM allow the recipients of an email to make sure the sending and claimed domains of an address are legitimate. This covers the domain portion of an email address (the @sample.com part); the name before the @ could still be spoofed.
Domain-based Message Authentication, Reporting, and Conformance (DMARC) is a protocol used for email validation, policy, and reporting that specifies to inbox providers what they can do with an email that is not authenticated by SPF or DKIM.
In other words, DMARC makes it easier for an internet service provider to prevent spoofing and phishing. It achieves this by telling receiving servers what to do with emails that fail SPF and DKIM verification; often, this means sending these questionable emails to the junk folder or blocking the domain that sent them.
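A DMARC policy is also published as a DNS TXT record, under the _dmarc subdomain. A hedged example that quarantines failing mail and sends aggregate reports to a hypothetical address:

_dmarc.example.com.  IN  TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"

The p= tag sets the policy (none, quarantine, or reject), and rua= names the mailbox that receives the aggregate reports.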
Brand Indicators for Message Identification, or BIMI for short, is an email standard that uses brand logos as an indicator of email authenticity, helping consumers avoid fraudulent emails.
As a business, BIMI will help you gain your recipients’ trust because they will be protected against potential spoofing and fraudulent information. The logo BIMI attaches to the correspondence you send to your clients and email subscribers provides an extra layer of protection against any type of spoofing or phishing.
When used together, these four email authentication protocols work in unison to ensure that no one can spoof an individual’s emails by posing as a legitimate business and stealing a user’s sensitive information.
If you’re curious about your brand’s email reputation or that of a business or entity that you regularly receive emails from, you can check their sender score.
A sender score is a number ranging from 0 to 100 (100 is a high score) that measures a sender’s reputation and shows how a mailbox provider (e.g., Gmail, Outlook, Hotmail) views your IP address or that of the entity you’re checking on.
Sender scores are classified into three categories:
You can check out your business’s Sender Score.
CAPTCHA, short for Completely Automated Public Turing test to tell Computers and Humans Apart, is a widely used anti-spam tool that helps a site determine whether a user is human or not.
We are all familiar with CAPTCHA tests. They involve clicking on pictures containing a traffic light or a bridge, or typing a string of distorted letters and numbers. While annoying, these short tests were designed to protect a site from hackers who use bots to enter false information, leak passwords, or post spam on a website.
The idea is to create a barrier (i.e., a test) that a human can easily bypass, but a machine cannot. The theory is that once the test is completed, there is enough evidence that a human is trying to access the site and not a bot.
The updated version of CAPTCHA is reCAPTCHA v3, which uses risk analysis algorithms to produce a score indicating the likelihood that a visitor is a human or a bot. reCAPTCHA v3 does not ask users to complete a test; instead, it allows webmasters to choose when to run a risk analysis and how to respond, which could range from requiring email verification to enabling a 2-step verification process before logging in.
To implement this, webmasters must add a reCAPTCHA script tag and a short code snippet to their pages.
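Broadly, following Google’s reCAPTCHA v3 documentation, the integration loads the API script with your site key and requests a token for a named action; the site key below is a placeholder:

<script src="https://www.google.com/recaptcha/api.js?render=YOUR_SITE_KEY"></script>
<script>
  grecaptcha.ready(function () {
    grecaptcha.execute('YOUR_SITE_KEY', { action: 'login' }).then(function (token) {
      // Send the token to your backend, which verifies it with Google
      // and receives the 0.0-1.0 risk score for this visitor.
    });
  });
</script>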
Building and executing a successful SEO strategy is tough. There are many ins and outs, metrics, KPIs, and best practices to follow for a fighting chance at landing in the coveted top 10 search results for a given search query.
Naturally, there will be SEO professionals who want to give their site(s) a leg up on SERPs and will do what they can to ensure SEO success. We can categorize these individuals into two distinct opposites: white hats and black hats.
White hat refers to the practice of improving a site’s position on a search engine results page (SERP) while respecting Google’s (or any other search engine’s) SEO guidelines. Given that white hat practitioners aim to align with the search engine’s SEO standards, they are considered the opposite of black hats.
We know white hat practices are used to enhance a site’s SEO rankings without violating any search engine guidelines, which includes building a website and SEO strategy that aims to satisfy the target audience’s needs. Many of these strategies are identical to the topics we have covered in this SEO guide. Examples of these practices include creating high-quality content that answers users’ questions, building an intuitive site architecture, earning backlinks naturally, and keeping pages fast, secure, and crawlable.
At its core, a white hat practice is any strategy that aims to improve your site’s SERP rankings without violating SEO guidelines or gaining an unfair advantage over other websites.
Aside from the moral responsibility of following pre-established rules that ensure healthy competition between webpages targeting similar keywords, violating the guidelines behind white hat practices like the ones mentioned above can get your website banned from Google or any other search engine.
With over 3.5 billion search queries performed each day, losing the ability to rank on Google could cause many sites to lose much of their online visibility, web traffic, and even business.
For any business in today’s market, this would be a setback that is challenging to recover from. For all websites on the Internet, the risks are too high to jeopardize the work and effort invested in building and optimizing an SEO-friendly site. Google’s Webmaster Guidelines are an excellent resource to consult for any concerns you may have about your site’s alignment with the expected criteria.
In contrast to white hat SEO practices, there is black hat SEO. Black hat refers to a set of practices used to wrongly increase a website’s or a webpage’s search engine positioning on search engines through methods that directly or indirectly violate Google’s SEO guidelines. Unlike white hat SEO practices that aim to satisfy the end-user, black hat practices do not.
The term “black hat” originates from the movie cliché that the good guys wear white hats and the bad guys wear black hats.
For this reason, black hat SEO is often associated with the “bad guys” of the Internet: hackers, virus creators, individuals who install spyware or malware on websites or software, and anyone performing other unethical actions on computers.
If you want to build and maintain a long-lasting, reputable, and successful online business and website, you should avoid at all costs falling into any one of these popular unethical practices:
Keyword stuffing is the practice of packing a webpage with keywords or numbers to try to manipulate a search engine’s ranking. Remember, when Google’s crawlers scan a webpage, they consider the keywords and the number of times these keywords are used and correlate these with a user’s search intent to determine the webpages that rank first for a given search query.
Keyword stuffing is SEO malpractice because it implies using keywords or numbers in a way that is out of context or illogical and only serves to provide a negative user experience.
Examples of keyword stuffing include text that gratuitously repeats the words a page is trying to rank for, the same words repeated over and over again, or targeted keywords made invisible on a page by hiding them within the page’s code, as in the cautionary sketch below.
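As a cautionary sketch only, hidden keywords are often implemented by matching the text color to the background, which is exactly the kind of trick crawlers now penalize:

<!-- Do NOT do this: keywords hidden from users but left visible to crawlers -->
<p style="color:#ffffff; background-color:#ffffff;">cheap tableware cheap tableware cheap tableware</p>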
Redirecting is the act of sending a user or a search engine crawler from one URL to a different one. Redirects usually take place when a site has moved to another web address. The white hat practice of redirecting is done through the use of status codes 301 and 302 or by using JavaScript.
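For example, on an Apache server, a legitimate permanent redirect can be declared in the .htaccess file; the paths here are placeholders:

Redirect 301 /old-page.html https://www.example.com/new-page.html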
Sneaky redirects are designed to mislead search engines and users. This type of redirect guides crawlers to one site while displaying another to users: the search engine crawler scans and indexes the first destination, and users see a different one. This is a black hat practice because it aims to deceive crawlers by showing them one type of information and users another. In most cases, sneaky redirects send users to a website that may be infected with malicious software.
An example of this practice includes redirecting desktop users to a site relevant to the page’s subject matter while redirecting mobile users to an irrelevant and often spammy source.
We know how important it is for websites to create and publish high-quality content for significant SEO success, a white hat practice. The black hat practice implies the complete opposite – creating bad content.
Content plagued by plagiarism, claims and data without any sources to back them up, and articles filled with stuffed keywords are all characteristics of poor-quality material.
Link building is one of SEO’s pillar strategies. Without it, Google crawlers could have a difficult time determining which websites have earned the “vote of confidence” of other sites.
In black hat SEO, earning backlinks naturally is not a priority. The go-to approach is to buy or sell them, which includes giving or receiving money, posts, services, or products in exchange for links. It also includes establishing agreements between websites to mutually link to each other’s sites and using automated programs to create links to a site.
Google’s Webmaster Guidelines page on link schemes is very descriptive about the actions that constitute this unethical behavior.
Anyone who reads articles or watches videos on the Internet can attest to the practice of blog comment spam. This black hat tactic consists of leaving a comment in an article or video and leaving a link to a specific website in said comment with the hope that others will click on it.
While users across the web are now savvy enough to know this practice is spammy, and Google’s Penguin algorithm update made it harder for webpages with comment spam to rank on SERPs, the practice still exists, and there are still users who click on these links.
For more information on other black hat SEO practices, check out this HubSpot article.
While successfully executing a black hat practice may result in better search engine rankings and web traffic for a site, Google is becoming increasingly adept at recognizing these techniques aimed at fooling its system.
The biggest risk of getting caught is getting booted off a search engine, a consequence that could lead to significant losses for a business.
It is crucial to a business’s website survival and success that its team be aware and able to identify black hat practices in action.
There are two reasons for this: to have more knowledge on the current SEO ecosystem that you are investing time and money to succeed in and to know when to file a black hat report.
Situations in which a business will want to file a black hat report are: if their website has been hacked, if it is the target of spammy links, or if someone on their team notices a search result with spammy material for a keyword they are targeting.
Use Google Search Console’s webspam report page to start a report. Make sure your suspicions are on point before initiating it, because a wrongful report is itself considered a black hat practice.
If you suspect your site is the object of a malware attack or has been hacked, request a Google website review.
In case your site has been the target of a negative SEO campaign of spammy links, head on over to the disavow backlinks option for further guidance.
Status codes are three-digit numbers that an HTTP server returns to a web browser to describe the outcome of a request.
When you visit a site, your browser sends a request to the site’s server. The server then responds to the browser’s request with an HTTP status code.
Essentially, status codes are conversations between your web browser and server. They communicate whether the sending and receiving of a website’s information are okay or whether something is wrong.
Understanding status codes and how to use them makes it easier to diagnose errors on your site quickly and minimize downtime. Some of these status codes are important because they let search engines know where on the web they can locate a site when it has a different URL. They also come in handy when informing people about the status of a URL.
Status codes are divided into different sections, ranging from 1XX-5XX. You can find information about all of the HTTP status codes here.
For this article’s purpose of teaching the basics of HTTP status codes, we will focus on the fundamental ones you need to know.
This group of 3XX status codes is essential in signaling to Google whether a page’s move is permanent or temporary so that it can crawl these pages and attribute them to their correct URLs.
The 301 status code appears on your browser when a site has migrated, when pages have been relocated or moved, or when a URL has been renamed.
Typically, a permanent redirect signals search engines to carry the ranking information of the old URL over to the new one.
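Illustratively, the raw server response for a permanent redirect looks something like this (the URL is hypothetical):

HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page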
The 302 status code informs Google that your page was temporarily moved to a new location.
Traditionally, technical SEO best practices dictate that temporary redirects do not transfer the ranking information of the old location to the new one in the same way that a 301 Permanent Redirect would.
While this is technically correct, Google tends to treat a long-lived temporary redirect as a permanent one over time.
These 4XX codes indicate a problem with the request, which is quite often a request to access a webpage. These errors simply say that the resource is not available and cannot be loaded.
This can be a result of a temporary server outage; however, what’s important is that the result is a bad request – the server is not serving your pages.
This 401 Unauthorized code appears when a user attempts to access a website without having the necessary credentials. These protected resources require a username and password.
The 401 status code is often used on sites where a person needs to log in to their user account to access personal information. However, if users should not need to log in to your website and all or part of your content is hidden behind a login anyway, it is likely to cause frustration.
The 403 Forbidden status code often arises because directory browsing is forbidden for the site. Some websites want users to navigate using the URLs on their pages, so they do not allow people to browse the file directory portion of the site.
A 404 Not Found appears when the browser can communicate with the server, but the server cannot find the requested resource.
This set of 5XX status codes informs users that something is wrong at the server level that prevents it from fulfilling a webpage request.
The result will almost always be that the page does not load and is not available to the user attempting to view it.
A 500-level error is similar to a “404 Not Found” for a complete computer instead of just a page.
Usually, the computer or the server that receives the request is under maintenance or may have an issue with faulty memory or CPU power.
A temporary 503 code is not a problem. Search engines prefer that a website use this status code when their server is temporarily unavailable. However, a lasting 503 code may result in the removal of a URL from a search engine’s index.
A healthy, SEO-friendly website is one whose technical SEO has found a stable middle ground. In essence, its backend structure allows search engine crawlers to “read” its webpages and categorize them accordingly.
For many sites, however, finding this middle ground does not come quite so easily. Websites need consistent updates, and this can create inconsistencies for search engines.
Let’s look at a couple of strategies websites can implement to optimize their technical SEO efforts.
Duplicate content on one website or across many websites is a disaster waiting to happen. Even though crawlers are experts at “reading” and indexing webpages across the web, they are not programmed to determine which duplicate to rank highest. As a result, when they encounter duplicate content on various webpages, they rank each page lower than it perhaps should be ranked.
This course of action is a logical search engine response when we consider that the main objective for search engines like Google is to provide unique, high-quality content and reward sites that deliver this with higher search engine rankings.
Despite the many efforts your team may employ to ensure your content is not duplicated throughout your site, different URLs may contain similar information that crawlers identify as identical. To avoid this common pitfall, include the canonical link element.
Inserting the rel="canonical" tag within a page’s code lets crawlers know which page is the master copy, that is, the page you would like to rank. It tells search engines which piece of content you prefer bots to index and protects your website from duplicate-content penalties.
Here is a representative example of the canonical tag embedded in a webpage’s head section; the URL is a placeholder:
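<link rel="canonical" href="https://www.example.com/master-page/" />

This single line sits inside the duplicate page’s <head> and points crawlers to the version you want indexed.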
Another option to avoid duplicate information on your webpages is to prevent your CMS software (software that allows users to publish, edit, and modify webpages within a website) from publishing multiple versions of a page. How you go about this will depend on the CMS software you use. For websites using HubSpot’s CMS, the canonical tag is not necessary as the software will take care of this automatically.
A crawl error occurs when search engine bots try to “read” a webpage on your site but are unsuccessful. Identifying crawl errors may be difficult, but tools like Moz’s Site Explorer and Google Search Console provide detailed reports on any crawl errors that occur.
As a webmaster, one of your main objectives should be to ensure that every page on your site can be crawled. Since every page on a website should have a purpose, as we learned when covering quality content, technically every page should be up for grabs by crawlers. After all, every webpage is a worthy candidate for ranking on SERPs, provided it is high quality and relevant to a keyword search query.
There are two types of crawl errors; let’s take a look:
Site errors are those that you want to avoid at all costs because they mean that a crawler cannot access any of the pages on your site. They can consist of:
A DNS error means a search engine is unable to communicate with your web server. Most of the time, this means the server is unavailable due to maintenance issues. When this happens, Google crawlers will attempt to access your site later. However, if the problem persists, your Google Search Console account will reflect this error.
Server errors can occur for several reasons. They can signal that a crawler’s request to access your website timed out because the server took too long to load the site. They can also occur when an error inside a page’s code prevents it from loading for crawlers and human eyes alike, or when the server is saturated by an excess of visitors on the site. These errors correspond to the 5XX status codes.
Remember that the robots.txt file in a website’s code instructs Google crawlers about any pages you would prefer they not index. If these bots cannot access your robots.txt file, they will not know which pages you do not want indexed and will postpone the crawling process until the file is available.
In contrast, a URL error signals that a search engine crawler is unable to access a specific webpage on a site. In these cases, you might see a 404 Not Found (a status code used when a requested page couldn’t be found but may be available in the future) or a 301 Redirect (an HTTP response code indicating the requested page has permanently moved to a different URL) status code. Commonly, URL errors show up when a page is removed but the internal links pointing to it remain in other pages’ code. When a bot tries to follow such a link, it reaches a dead end and thus brings up an error.
Common URL errors include:
Mobile-specific errors: as the name says, these errors are specific to webpages optimized for mobile devices. They could result from a piece of Flash-enabled content that cannot be reproduced on a smartphone or from an error that originates on a separate mobile subdomain.
Malware errors: this type of error points to malware present on a specific webpage. The best thing you can do is search for the malicious software on the page and eliminate it immediately.
While there are many different types of errors that can prevent a crawler from accessing your site and properly indexing it, these are some of the most common. For additional guidance, Moz provides a helpful resource for finding and fixing other common crawl errors.
An XML sitemap is a listing of all the pages on a website. It functions as a map that tells search engines about the important content on your site. Sitemaps are especially useful for larger or more complex sites with hundreds of webpages, where crawlers may have trouble finding and indexing every important page due to the sheer volume of the site.
XML sitemaps can categorize a site’s content by posts, pages, or tags, and will include the number of images on each page and the date each was last updated.
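A minimal sitemap, with placeholder URLs, follows the standard sitemaps.org format:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/sample-post</loc>
    <lastmod>2021-01-10</lastmod>
  </url>
</urlset>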
You can create an XML sitemap for your site using the XML-sitemaps website. Once you have completed this step, be sure to submit it to Google’s Search Console. Completing this action ensures that Google crawlers will index all the webpages you have specified in your XML sitemap. Through your Google Search Console account, you can review the pages Google bots have indexed.
An SEO audit is the process of reviewing the technical aspects of your website’s SEO to make sure they are optimized and up to par. It is essentially a health checkup for your website to make sure everything is working toward better organic search rankings.
Depending on your experience with technical SEO audits, your overall SEO knowledge, and how deep of an audit you do, an audit can take anywhere between one to two hours.
So how do you go about conducting an audit?
Before employing an SEO platform to check for possible crawl issues, see whether there are things you can quickly fix right now that may be causing them. These include removing any duplicate content across your website, setting your robots.txt file so that you only restrict the indexation of the webpages you do not want indexed, and fixing any redirect issues.
Both the Google Search Console and Moz platforms analyze any potential crawl issues your website or URL may have.
You should be checking that your site’s backlinks direct users to the page you intend them to.
As for your site’s internal links, you want to check the following:
You do not have any orphan pages, that is, pages not linked from anywhere else on your site, which makes it difficult for search engines to find them.
Driving qualified traffic and scaling your business can be achieved in two ways: through organic search traffic and paid traffic.
By investing time and money in making your website a winner in the SEO game, you’re committing to a continuous process of optimizing your site and the content within it so that search engines reward it with a feature on a SERP’s first page.
Let’s take a look at some SEO best practices you can implement to boost your webpage rankings.
On-page SEO is all about making sure your webpages, and the content within them, look good. It also includes convincing users to click on your page once they see it.
Therefore, you need an interesting title tag and meta description – you want users to believe they will find what they are looking for by visiting your page.
To help content rank higher for your targeted keywords, include them in a page’s title tag, URL, and H1 heading. Doing so helps search engines determine the topic of a webpage.
Titles tell users what content a webpage contains and how relevant it is to their search.
The best way to help your page stand out from others is by using the main topic of the webpage’s content and the keywords you’re targeting together.
This is done through “front-loading” your primary keyword: starting your title tag with your target keyword. Remember that search engines scan both the title of a page and its content; they catch the primary keyword in your title and also pay attention to the other words and phrases in your title tag.
So, if you can, include your targeted keyword early in your title, where it feels natural. For example, if you want to rank for “plastic tableware,” try to lead with that keyword.
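Following that advice, a hypothetical front-loaded title tag and meta description might read:

<title>Plastic Tableware for Parties: A Complete Buying Guide</title>
<meta name="description" content="Compare durable, budget-friendly plastic tableware for your next event.">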
You should also write an engaging title; it needs to pique a user’s interest if you want them to click on your site’s link before any of the others. The more people who click on your link, the more organic traffic your site gets, and that engagement is one of the signals search engines weigh most heavily.
Take this title, for example: “A complete list of productivity apps for everyone working from home.” It is specifically catered to those who are working from home, and it makes a promise: the list is complete, so readers get the sense that they will not need to look anywhere else; they will find what they need here.
Every page with published content should have a heading. Including the targeted keyword where it appears natural is vital so that users know the title they clicked on and the article’s topic match.
Remember that the structure and the keywords that make up your URL are essential factors in determining whether a particular page ranks well for the keywords and topic it targets.
Internal links connect one page to another on the same site. In general, the more internal and external sources that point to a single webpage, the higher its PageRank. While Google no longer makes PageRank public, it is still common knowledge that links are essential for SEO.
Whenever you publish new content, be sure to add internal links of relevant pages from your site. This helps users discover other content your company publishes and will improve the pages’ web traffic.
To keep a tab on the internal links your marketing team is creating, search on Google for site:yourdomain.com [page topic].
This trick will show you the most relevant pages your site has on a specific topic. It can also help your team find other relevant pages and add internal links where it is appropriate.
One of the worst things that can happen when creating content for your website is basing it on something people are not searching for.
For this reason, keyword research plays a pivotal role in optimizing for search engines.
Let’s use the following keyword search as an example: “how to create a kite.” Using Moz’s Keyword Explorer tool, we see that this query garners zero monthly search volume.
For a well-rounded idea of the real potential for a topic’s search traffic, you should consult the “search volume” and “estimated traffic to pages” metrics in Keyword Explorer. Doing so will ensure that you invest your time and energy in building content that addresses a topic that can be beneficial to your website and earn more organic traffic.
HTTPS encrypts the information shared between the website visitor and the site’s server. As of 2017, it was the tenth most important ranking factor search engines evaluate when determining page rankings.
Aside from HTTPS’s position among the top 10 ranking factors, an excellent user experience requires that site visitors have the assurance that the information they provide on a website is secure from potential hackers.
Page speed has been a ranking factor for desktop since 2010 and for mobile devices since 2018. There is a reason for this: more than ever, users want content delivered fast.
Google’s PageSpeed Insights tool will give you an understanding of the upload speed of a page by assigning it a speed score. Just plug in a URL, and the tool will deliver a score from 0-100 and some advice to make improvements.
These are some tips to keep your page speed in check:
Compress your images: when an image file is too big, it takes longer to load on a website, which often makes for a poor user experience. By compressing an image, you decrease its file size, making it faster to load.
Clean up your CSS and HTML: messy CSS and HTML files are known to be bulky and slow down a site’s performance. By combining external CSS files into a single HTML page, you can simplify the internal work of your pages.
Minify your code: increase your page speed by deleting any extra spaces, line breaks, and other characters that do not affect how information is delivered.
Use a content delivery network (CDN): websites that live on a single server are typically slower to load because their data has to travel long distances before it reaches a web browser. With a CDN, a network of servers caches a site’s data so that it is always ready to be served by a server near the visitor.
There is no way around it: backlinks are a significant ranking factor, and they are here to stay. To be successful at SEO, you need Google to see that other sites in your industry link to your pages.
Here are general tips on how you can increase the backlinks leading to your site:
Publish long-form content, usually between 3,000 and 10,000 words. Blog posts that answer the “why” and “what” of a topic, like “What we can do to fight climate change” or “Why is climate change a controversial issue,” tend to attract more backlinks because they discuss topics from which others can derive differing points of view.
Backlinks are like endorsements that websites give to others they trust. If you want sites to link to your content, make sure you’re writing about the latest trends. If you can, conduct a study, interview customers, or present information in an entertaining way that others will want to share with their audience.
Reaching out to other industry websites is another way to earn links. When you do, make sure you’re upfront about any collaboration ideas you may have. Once your relationship with them is underway and they are linking to your webpages, you can offer links that complement the content they are publishing.
SEO success often translates into business success. The reason for this is the strong correlation between the two – the more pages a website has that rank in the top search results for a search query, the greater exposure its content, products, services, and marketing campaigns receive. With search engines as an ally, it is easier for target audiences to find it online, increasing the chances of more sales down the line.
While every organization will require a different SEO strategy that can address their various pain points, successful SEO will always have common characteristics. These include higher search engine rankings, better brand building, more marketing qualified and sales qualified leads, an increase in conversion rates, more backlinks, and higher engagement levels on their website and social media.
Regardless of a business’s SEO goals, creating engaging and high-quality content will always be at the top of the list. Unfortunately, many organizations fail to rank on search engines or generate substantial sales due to low-quality content that does not interest or resonate with their target audience.
Such was the case for Northmill, a Fintech company featured on Ahrefs. By tuning their SEO strategy, they were able to bring in more website traffic and higher conversion rates.
We will look at how Northmill achieved this, but first, let’s learn a little more about them.
Northmill is based in Sweden and provides financial services, including online loans, to more than 150,000 customers across three countries. Their website content primarily focuses on topics like lending money online and personal finance.
Before changing their SEO strategy, the content they were publishing was not gaining much web traffic. They also wanted to market their brand primarily through social media, which was not going as they expected. After digging a little deeper, they realized their customer base, much like anybody else, was not sharing the fact they had been approved for an online loan. After all, this is very private information that most of us do not talk about.
Many businesses are built on the idea of providing a service and addressing a common pain point. If they do not do any significant market research, they are likely to grow based on a false idea of how they will connect with their customer base. By doing some research, a growing business can make sure it engages its target audience in a way that makes them more likely to respond positively. For the Northmill marketing team, discovering that their audience was not talking about them on social media was a wake-up call to take their marketing in another direction.
To solve the problem of low engagement levels on social media, Northmill decided to focus on improving the quality of the content they were publishing. They knew they wanted to write on topics that their target audience would want to read. In this case, the theme was online loans.
Before writing great content, Northmill invested in identifying where their direct competitors were getting their web traffic.
With Ahrefs Site Explorer, the Northmill marketing team examined their direct competitors’ domains for a summary of the keyword(s) that were generating a large portion of their organic traffic.
(Source: Ahrefs)
The results indicated that most of their competitors’ site traffic was coming from the content they had on tax refunds. Not coincidentally, Northmill had previously written a couple of articles on the topic.
Instead of starting from scratch, the team decided to write a brand-new article on tax refunds and combine their previous articles on the subject into it. This way, the two previous articles could share authority with the new one.
Northmill used the Ahrefs Keyword Explorer tool and Google Keyword Planner to find the top keywords their competitors were using to rank on Google. These keywords became the ones they would choose to target.
(Source: Ahrefs)
They used the Skyscraper Technique to try to rank higher than the sites already ranking on the first SERP. By reading every search result available, they would determine how they could improve on the existing content to create new material that was unique and more compelling to readers.
Once they knew how to proceed, they targeted the keyword with the most search traffic, skatteåterbäring (tax refund), which had 8,100 monthly user searches. They did not have to worry too much about backlinks because this keyword was not competitive.
(Source: Ahrefs)
When they repeated this process for the second most searched keyword, skatteverket deklaration (Swedish Tax Authority declaration), which had 6,600 monthly searches, things were different. They saw that the same website, Skatteverket, took the majority of the search results. It would likely be harder to compete for a top spot because pages from the same site build up each other’s authority through internal links.
When you’re researching the keywords to target for a specific piece of content, you have to be strategic. When the top ten search results for a given keyword come from a variety of websites, it will likely be easier to create better content than those results and win the external links they currently earn. However, when the same site dominates a keyword, it gets harder. In these cases, you are better off finding another keyword to target.
Unfortunately, no keyword research software can give you 100% accurate results. To confirm their findings, the Northmill team used Ubersuggest.
(Source: Ahrefs)
They also used Google Keyword Planner to double-check the results.
(Source: Ahrefs)
As it turns out, the keywords that Northmill would be targeting in their new article would be skatteåterbäring 2016 and skatteåterbäring 2015, meaning tax refund followed by the year.
As we know, Northmill had previously written articles on tax refunds, but they weren’t ranking at all. Instead of throwing these away, the marketing team wrote a new one and pointed the old ones to it through internal linking. This way, anyone who read any one of the three articles would have a clear path to the other two. They did have to make sure not to target the same keywords in all three articles. Sounds like a solid plan!
They published their third article on tax refunds in early November and waited a month before tracking its performance. During this time, Google bots were able to crawl the webpage and the links in it and index them accordingly.
When the time was right, they used Google Analytics to understand how users were engaging with the article.
(Source: Ahrefs)
As you can see in the screenshot, the bounce rate and exit frequency were initially above average, meaning they would need to adjust the webpage for better results. They decided to add a top bar to make it more user-friendly.
The first version of the webpage looked like this.
(Source: Ahrefs)
The second version of the webpage included the top bar.
The top bar was added for better UX, and it helped create a direct path for readers to ask any questions they might have on the topic.
Up until this point, the Northmill team did not have a concrete idea of the general user intent for visiting that page. They could assume the majority of users read the article for information on tax refunds and were leaving as quickly as they came. By adding the top bar, it was easier for readers to ask questions and receive quick answers. From this dynamic, the marketing team discovered that the majority of users were asking questions that indicated they needed money and were looking for alternative solutions.
(Source: Ahrefs)
Once the top bar was added, Northmill saw slight changes in specific metrics, including the average time on the page, the bounce rate, and the number of exits, the latter two of which slightly decreased.
Now that they knew readers’ big motivators for landing on their page, they were able to take another step toward improving the user’s experience. Their solution was to add an overlay that would pop up when a reader wanted to leave the page.
(Source: Ahrefs)
It reads: Waiting for your refund? We offer a 10% discount on all your loans. Apply for free.
Once the overlay was added, their ranking for the targeted keyword, Skatteåterbäring 2015 (2015 tax refund), climbed to the number three spot in the top 10 search results.
Not only were they able to rank within the top 3 search results for their chosen keyword, but they also earned:
The key takeaway from this case study is that by conducting extensive keyword research, a business can pinpoint the words or phrases for which its website has a higher chance of ranking.
Once you know the best keywords to target, you can start creating content that provides higher value than what is currently available to users.
While this study did not address the importance of backlinks from external websites, keep in mind that they are uniquely valuable. Search engines want to know that other sites consider your content trustworthy.
Key Performance Indicators (KPIs) are essential in digital marketing and every industry because they serve to measure what is working and what’s not working about any given strategy.
Measuring the efficiency and success of any SEO strategy using specific KPIs is no exception. When you know how your team can measure its SEO success, it is easier to determine how your website is performing and the areas where you can improve.
Assessing the effectiveness of your current SEO strategy is less complicated when you consider the following factors:
Organic traffic refers to the number of visitors who land on your site after seeing it on a search results page. It is the opposite of paid traffic: websites do not pay to be featured; they “earn” their spot there.
According to a SEMrush Ranking Factors study, direct website visits, that is, organic web traffic, are the number one factor affecting a domain’s SEO success. This metric is so important because it gives numerical evidence that a page is successfully attracting a larger audience, and it is what drives conversion rates and sales.
The most effective way to increase your site’s organic traffic is to regularly create and publish good content that stays on top of the latest trends in your industry.
Other factors that can influence your site traffic include social shares on social media platforms and, if you’re a local business, your online presence in business directories.
To track where your site’s traffic is coming from, webmasters can include a UTM code, or tracking code (an analytics tag used by marketers to track the impact of their online efforts, understand their audience’s behavior, and measure performance), in a webpage’s URL. This additional text within the URL helps marketers determine the number of people who land on their site from organic search results, social media, paid search ads, or an email.
Here’s an example of a URL and its tracking code:
http://blog.hubspot.com/9-reasons-you-cant-resist-list?utm_campaign=blogpost&utm_medium=social&utm_source=facebook
In this example, any web traffic that reaches the HubSpot website through this URL will be attributed to Facebook because it is the source the user is coming from. Everything to the right of the “?” in the URL tells your analytics software that someone got to the URL from a specific source.
There are various types of UTM tracking codes that track specific things. For example, they can track multiple pieces of content from one campaign, a specific source, or all web traffic coming from a particular medium. They can also help identify the keywords you’ve paid for in a PPC ad so that your marketing team knows which pieces of content to attribute its marketing success to.
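As a generic template, a URL carrying the most common UTM parameters might look like this; every value is hypothetical:

https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale&utm_content=cta_button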
Impressions are the number of times one or more pages from your website appear in users’ search results.
Clicks are the number of search engine users who click on your website. Every click is an increase in your site’s organic traffic.
The click-through rate (CTR) is the percentage of impressions that turn into clicks on your search result. Higher CTRs imply that a search result’s headline and meta description appeal to users even before they access the website.
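As a quick worked example: CTR = (clicks ÷ impressions) × 100, so a search result that earns 50 clicks from 1,000 impressions has a 5% CTR.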
When assessed together, these KPIs serve as a good indication that a site’s SEO strategy is working.
The average session duration is the average time a user spends on your site. A session begins once a person lands on it and ends when they leave your website or after 30 minutes of inactivity. Everything a user does during this time is considered one session.
Measuring the average time a user spends on your website is now easier than ever. Website builders like HubSpot and WordPress help businesses gain insight into how well their site is performing.
If your analytics show your site is not keeping visitors for very long, you will want to investigate why. If people are leaving within a few minutes of landing on your site, consider making your pages more visually appealing, improving your headlines, including videos, or making navigation easier.
If they are leaving within a few seconds of landing on your page, your audience may not be a match to the purpose of your page, or your page could look low quality and may need a revamp.
Visually appealing sites tend to get people to stay on for a few more seconds; engaging and valuable content gets them to stay longer and keep coming back for more.
With Google Analytics, you can see what users spend their time doing while on your site. You can use this information to see what’s gaining traction and what is not. Emulate your most popular pages and apply what works on your least popular ones.
For example, if you see that graphs, infographics, and videos tend to garner more attention, add more of this content to other pages. If you cannot pinpoint what’s working and what is not, split testing various features will help you determine how to approach your content.
By testing each element individually, you can see what site visitors like and what they do not. From there, you can make data-driven decisions to improve your site’s UX.
The pages-visited-per-session metric refers to how many webpages someone interacts with every time they go to a website. The more pages visited, the more engaged people are and the more interested they are in learning more.
The key to getting users to visit many pages is to produce relevant, engaging content, include enough internal links to guide them to other related pages, and make navigating your site intuitive.
The most significant factor determining this KPI, and whether visitors come back at all, is the quality of their user experience.
The bounce rate measures the percentage of people who visit a webpage but do not engage with it. There are many ways this can happen: a visitor can hit the back button, close the browser tab or window, type a new URL, or simply remain inactive until the session times out.
While all of these behaviors are normal for every Internet user, if you observe your site’s bounce rate increase over time, it means there is something you need to address: either your website is not doing enough to keep visitors interested, or your site is attracting the wrong audience.
Though it is essential to keep in mind that every website experiences bounce rates, it is normal for people to go to your site, find what they are looking for, and leave.
Whether your site’s bounce rate is healthy or not will depend on the industry and the type of webpage you’re analyzing. For example, the average bounce rate for B2B websites ranges between 25% and 55%, while the average bounce rate for landing pages falls somewhere between 60% and 90%.
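To ground those percentages (the page paths and session counts below are hypothetical), bounce rate is the share of sessions that end without any further engagement, and it is most useful computed per page:

```python
# Hypothetical per-page session counts from an analytics export.
pages = {
    "/blog/seo-basics":   {"sessions": 1_200, "bounces": 540},
    "/landing/free-demo": {"sessions":   400, "bounces": 300},
}

for path, stats in pages.items():
    rate = stats["bounces"] / stats["sessions"] * 100
    print(f"{path}: {rate:.0f}% bounce rate")
# /blog/seo-basics: 45% (within the typical B2B range)
# /landing/free-demo: 75% (typical for a landing page)
```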
Measuring your bounce rate is not enough to pinpoint the problem areas on your site; it should be used together with the number of clicks, CTR, and conversion rates on each webpage to determine the changes you need to make.
For most businesses, the primary goal is to sell more. Every digital marketing effort aims to convince consumers that the brand is worth buying from. A company can measure this through its conversion rate.
Conversion rate from organic traffic measures the percentage of site visitors arriving from search engines who complete the desired action you have defined, whether that is signing up for an email list, filling out a form, or making a purchase.
The key to gaining the most insight from this KPI is to look for fluctuations in your data. Have you recently made changes to your SEO strategy? Have you been optimizing your content? Your website? You should also compare the conversions you’re getting from organic traffic with those from other channels like social media.
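As an illustrative sketch (the channel names and figures are made up), that comparison is a simple per-channel division:

```python
# Hypothetical visits and conversions per acquisition channel.
channels = {
    "organic":  {"visitors": 1_800, "conversions": 54},
    "social":   {"visitors": 2_600, "conversions": 39},
    "referral": {"visitors":   700, "conversions": 28},
}

for name, stats in channels.items():
    rate = stats["conversions"] / stats["visitors"] * 100
    print(f"{name:>8}: {rate:.1f}% conversion rate")
# organic: 3.0%, social: 1.5%, referral: 4.0%
```

A channel that brings many visitors but few conversions may be attracting the wrong audience.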
If your website’s conversion rates are low, consider how to optimize your site for a better user experience. Remember, your content and marketing campaigns should target all stages of the customer journey: awareness, consideration, and decision.
The intent behind using SEO best practices for higher conversion rates is to get users onto your site. Most people are oblivious to the companies out there offering a service or product until they need it. When users have a need to fulfill, they will search for a solution, and if your SEO succeeds, they will find you.
While optimizing your website and content for SEO costs nothing in itself, unless you have expertise in web development and Google’s algorithms, hiring an ongoing SEO service is often inevitable. Even so, the potential benefits are vast and can far outweigh the investment required.
SEO is a fundamental tool for building brand awareness and establishing brand credibility. It works by making your content widely available to a larger audience interested in topics related to what you offer.
On average, the first five organic search results for a given keyword (or string of keywords) capture about 67.6% of all clicks. This is HUGE. By ranking in the top search results, your website can be found by millions of new people.
At Theia Marketing, we make sure that the SEO strategy we provide aligns with your business’s marketing goals and needs. Every organization is unique and has distinct pain points. From the moment we receive your inquiry, our focus is on making your business website rank in the top search results for your subject matter of choice.
The more people know about your business and are happy with the content and service(s) they receive, the more likely they are to refer you to the people they know.
Greater site exposure through inbound marketing strategies like SEO, content marketing, social media, and referrals brings the opportunity for businesses to gain new qualified prospects, possibly leading to higher conversion rates.
SEO can also reduce customer care and sales costs. Companies that invest in creating better content can answer their customers’ frequently asked questions, address and resolve their most common issues, and provide additional support, like tutorials that help customers make better use of their products and services. Focusing on these aspects helps decrease the cost of addressing customer issues.
Whether your business is already implementing SEO and getting good results, not getting enough results, or has no SEO strategy in place at all, you need one.