Google used to make much of its ad hoc keyword search functionality freely available as well, but now the Keyword Planner is behind a paywall in AdWords as a premium feature. Difficulty scores are inspired by the way Google calculates its Competition Score metric in AdWords, though most vendors calculate difficulty using PA and DA numbers correlated with search engine positions, without any AdWords data blended in at all. Search Volume is a different matter: it is almost always lifted directly from AdWords. The same goes for keyword suggestions and related-keyword data, which in many tools come from Google's Suggest and Autocomplete application programming interfaces (APIs).
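To make that concrete, here is a minimal sketch of how a keyword tool might pull suggestions from Google's public autocomplete endpoint. The suggestqueries.google.com URL and its client=firefox JSON response shape are unofficial and can change without notice, so treat this as an illustrative assumption rather than a documented API.

```python
# Hypothetical sketch: fetching keyword suggestions from Google's
# unofficial autocomplete endpoint (format may change without notice).
import json
import requests

def fetch_suggestions(seed_keyword):
    """Return a list of autocomplete suggestions for a seed keyword."""
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params={"client": "firefox", "q": seed_keyword},  # 'firefox' client returns JSON
        timeout=10,
    )
    resp.raise_for_status()
    # Response shape is roughly: [query, [suggestion1, suggestion2, ...]]
    data = json.loads(resp.text)
    return data[1]

if __name__ == "__main__":
    for suggestion in fetch_suggestions("keyword research"):
        print(suggestion)
```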
That's why PA and DA metrics often vary from tool to tool. Each ad hoc keyword tool we tested came up with slightly different numbers based on what it pulls from Google and other sources, and how it does its calculations. The shortcoming of PA and DA is that, even though they give you a sense of how authoritative a page might be in the eyes of Google, they don't tell you how easy or difficult it will be to rank that page for a particular keyword. That gap is why a third, newer metric is beginning to emerge among the self-service SEO players: difficulty scores.
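As an illustration of the PA/DA approach, here is a rough sketch of how a difficulty score could be derived from the authority of the pages already ranking for a keyword. The position weighting and the 60/40 PA-to-DA blend are assumptions made for the example, not any vendor's actual formula.

```python
# Illustrative (hypothetical) keyword difficulty score: a weighted average of
# the Page Authority and Domain Authority of the current top-10 results,
# with higher-ranking results weighted more heavily.

def difficulty_score(top_results):
    """top_results: list of (page_authority, domain_authority) tuples,
    ordered from position 1 to position 10. Returns a 0-100 score."""
    if not top_results:
        return 0.0
    weighted_sum = 0.0
    weight_total = 0.0
    for position, (pa, da) in enumerate(top_results, start=1):
        weight = 1.0 / position                          # position 1 counts most
        weighted_sum += weight * (0.6 * pa + 0.4 * da)   # assumed PA/DA blend
        weight_total += weight
    return round(weighted_sum / weight_total, 1)

# Example: a SERP dominated by strong pages yields a high difficulty score.
serp = [(72, 90), (65, 88), (60, 75), (55, 70), (50, 68),
        (48, 66), (45, 60), (40, 58), (38, 55), (35, 50)]
print(difficulty_score(serp))
```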
WebSite Auditor scans pages for code errors, duplicate content, and other structure-related issues. It also includes an on-page optimization module that helps determine ideal keyword placement and identifies page elements that can be optimized. In WebSite Auditor you can also analyze competitors' pages to compare against and improve your own on-page strategy. There are more features than I will list here, but this is the best on-page optimization solution I have found so far.
Over the past year or two, we've also seen Google begin to fundamentally alter how its search algorithm works. Google, like many of the tech giants, has begun to bill itself as an artificial intelligence (AI) and machine learning (ML) company rather than as a search company. AI tools will provide a way to spot anomalies in search results and collect insights. In essence, Google is changing what it considers its crown jewels. As the company builds ML into its entire product stack, its core search product has begun to behave very differently. This is heating up the cat-and-mouse game of SEO and sending the industry chasing after Google once again.
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.
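If you prefer to pull that Performance Report data programmatically, here is a minimal sketch using the Search Console Search Analytics API. It assumes you have a Google service account with read access to the property; the service-account.json path, the site URL, and the date range are placeholders.

```python
# Minimal sketch: top search queries from Google Search Console's
# Search Analytics API (the property URL and key file are placeholders).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    },
).execute()

# Each row carries the query plus its clicks and impressions for the period.
for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```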
A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
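For reference, here is a small sketch that generates schema.org BreadcrumbList structured data as JSON-LD, which you would embed in the page inside a script tag of type application/ld+json. The page names and URLs are placeholders.

```python
# Sketch: generating schema.org BreadcrumbList JSON-LD for a page
# (names and URLs below are placeholders).
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) tuples ordered from root page to current page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {
                "@type": "ListItem",
                "position": i,
                "name": name,
                "item": url,
            }
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://www.example.com/"),
    ("Books", "https://www.example.com/books"),
    ("Science Fiction", "https://www.example.com/books/sciencefiction"),
]))
```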
Most keyword databases consist of a small sample of the overall search universe. This means keyword databases tend to skew toward commercial terms and core/head industry terms, with slightly less coverage of mid-tail terms. Many rarely searched-for long-tail terms are not covered because of database size limitations and a lack of commercial data around those terms; even if they were covered, the data would carry large sampling errors. Google handles over 2 trillion searches per year and claims that 15% of its searches are unique, which means it sees searches for over 300 billion unique keywords each year. The good news about limited tail coverage is that almost any keyword we return data on has some commercial value. And with Google's RankBrain algorithm, if you rank well on core industry terms, your pages will often tend to rank well for other related tail keywords.
We expect advertisements to be visible. However, you should not let the advertisements distract users or prevent them from consuming the site content. For example, avoid advertisements, supplementary content, or interstitial pages (pages displayed before or after the content you are expecting) that make it difficult to use the website. Learn more about this topic.
For traditional SEO, this has meant some loss of key real estate. On SERPs that once had 10 positions, it's not uncommon now to see seven organic search results below a Featured Snippet or Quick Answer box. Rather than relying on the PageRank algorithm for a specific keyword, Google search queries rely increasingly on ML algorithms and the Google Knowledge Graph to trigger a Quick Answer or pull a description into a snippet atop the SERP.
LinkResearchTools makes backlink tracking its core mission and provides a wide swath of backlink analysis tools. LinkResearchTools and Majestic provide the best backlink crawling of this bunch. Aside from these two backlink powerhouses, many of the other tools we tested, such as Ahrefs, Moz Pro, Searchmetrics, SEMrush, and SpyFu, also include solid backlink tracking capabilities.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search,' where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words. For content publishers and writers, Hummingbird is intended to resolve these issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on its creators to be 'trusted' authors.