Last update: February 23, 2021.
"Things not strings" is a key mantra in current-era Google. It refers to Google evaluating search queries for Things, i.e. entities in Google's Knowledge Graph, rather than simply the string of words that make up the search query. If you search for [san diego zoo] Google will first look for an entity contained in the query. That entity, if it exists, will be the core of the search results. You can watch Google's Paul Haahr explain the process in this presentation from SMX West 2016. (Watch the whole video while you're at it. It is by far the best Google Search video I've seen.)
Many queries contain multiple entities, ambiguous references to one or more entities, or one or more entities plus additional words that add information about the intent of the search.
The query [troy brown san diego chargers] contains two entities (Troy Brown and San Diego Chargers) and returns an interesting result that contains a Knowledge Panel for former San Diego Chargers safety Marlon McCree.
A Knowledge Panel is a way for Google to present information about an entity on a search results page.
This illustrates the importance of not only entities but also the relationships between entities, because there is indeed a relationship between McCree and former New England Patriots wide receiver Troy Brown. In a 2006 season AFC Divisional playoff game between the Chargers and the visiting Patriots, McCree intercepted a fourth-quarter pass from Tom Brady. The interception would have dashed the Patriots' comeback had Brown not ripped the ball out of McCree's hands. Another Patriots player recovered the fumble and the Patriots went on to win the game.
Key takeaways: The Knowledge Graph strongly influences Google SERP and Troy Brown was an awesome football player.
Links to AMP web stories I made in 2018, when Web Stories were an experimental part of AMP, plus one web story I made recently.
February 22 update: I added two new web stories. One is a straightforward multi-page story while the other just looks like one.
Google search results include a lot more than ads and the traditional organic search results (often referred to as "the ten blue links"). Here's a non-exhaustive collection of SERP features and how they can be used by publishers to get more exposure on the first page of Google search results.
A perhaps interesting note is that Google has not yet indexed the page, which until this posting had been linked to only by a tweet. (February 2, 2021 update: The page showed up in Google's index today. Links work.)
Google rolled out a Broad Core Algorithm Update on December 3, some six weeks later than I thought would be the drop-dead date for a 2020 rollout.
A Google Broad Core Algorithm Update (BCAU) is more or less a reassessment of quality and relevance. The March 2018 update placed more emphasis on relevance, while the August 2018 update was heavy on the quality aspects of web sites. Most BCAUs seem to fall somewhere between those two extremes.
A BCAU tends to affect the competitiveness of a website broadly. Small ranking changes can have a major impact on traffic. One site lost the number one rankings it had for a set of queries, and with that 30% of the site's traffic, even though it had fallen only to second or maybe third for most of the keywords. Fortunately for that particular web site the query set wasn't central to the site's business, so the traffic loss had very little impact on revenue.
A web site negatively impacted by a BCAU has two options:
1) Do nothing and hope for better luck with the next update. This can work. I have seen it work.
2) Attempt to improve aspects of the website. This can also work. I think I've seen this work but it's frankly impossible to know whether the changes made the difference or Google just changed its mind about the site's relevance and quality. This is especially true since a site can gain traffic from a BCAU in one language but lose in a different language.
The most challenging thing about losing traffic to a BCAU is that it usually takes another BCAU to recover the lost traffic.
A not particularly important question that occasionally gets asked is: how many links will Google crawl on one web page? I call it an unimportant question because for the vast majority of web sites the answer is quite simply "enough." Experimentation makes it possible to hazard an educated guess that is a bit more specific than that.
One thing to keep in mind is that the answer could mean a couple different things. The maximum number of links Google extracts from a page may be more than the maximum number of links that it will use in the linkgraph.
Someone has created a web page that contains links to 1,000,026 different domains. As it happens, Google has indexed the page. But not the entire page. A site-colon (site:) search for the page combined with an anchor text returns the page only if that anchor text belongs to one of the first (almost exactly) 50,000 links; after that you get nothing. This suggests the possibility that Google extracts, and possibly indexes, that many links before simply stopping. It could be that the total number of characters in the first 50,000 anchor texts happens to match an unrelated character limit, but that seems less likely.
Even if that 50,000 count is off by as much as 40,000 in either direction, the answer "enough" remains quite correct.
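For anyone who wants to try reproducing this kind of experiment, here is a minimal sketch in Python that generates a test page with more links than the apparent cutoff. The domain pattern, anchor texts, and file name are all hypothetical — I don't know how the original million-link page was built.

```python
# Sketch: generate a test page with tens of thousands of outbound links,
# in the spirit of the million-link experiment described above.
# The example-{i}.test domains and anchor texts are made up for illustration.

def build_link_page(n_links: int) -> str:
    """Return an HTML page containing n_links anchors, each pointing
    to a unique (hypothetical) domain with a unique anchor text."""
    anchors = "\n".join(
        f'<a href="https://example-{i}.test/">anchor text {i}</a>'
        for i in range(n_links)
    )
    return f"<!doctype html>\n<html><body>\n{anchors}\n</body></html>\n"

if __name__ == "__main__":
    page = build_link_page(60_000)  # comfortably past the apparent ~50,000 cutoff
    with open("link-test.html", "w") as f:
        f.write(page)
    print(page.count('<a href="'))  # → 60000
```

Publish the generated page, wait for Google to index it, then run site: searches for anchor texts at various positions to see where extraction stops.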
Google is pushing AMP Web Stories again. In Google search results they are presented as Visual Stories:
A web story is a single-screen, multi-pane presentation format that allows text, images, video, and animations. I have been told it's basically Instagram Stories.
Google currently isn't giving Visual Stories the kind of preferential treatment that Top Stories enjoys, so it's not likely to be a traffic driver of nearly the same proportions. However, if Visual Stories get good play in Discover they could have a material impact on traffic flow.
Since 2018, Google has announced a small number of so-called Broad Core Algorithm Updates (BCAU) each year. These updates can greatly affect the organic search traffic a web site receives from Google. A negatively affected web site can lose 30% or 40% of its traffic, pretty much overnight. Other websites can see gains of a similar magnitude. For most web sites a BCAU does not dramatically or even noticeably affect traffic.
The most recent BCAU rolled out on May 4, 2020. Google usually avoids major SERP or ranking shake-ups during the run-up to the holiday shopping season, which at latest starts on Black Friday in the United States. If this historical pattern holds, the window for the next BCAU is basically any day between now and the end of October (Halloween). My guess is that it is going to happen no later than October 15, but I think it is going to roll out in September. A September rollout will give Google an opportunity to adjust results that aren't to the company's liking. It will also give the company time to add smaller but significant changes as well as adjust SERP layouts and features.
Can Google choose not to roll out another BCAU this year? I think the answer to that question is both no and yes. As I understand it, the purpose of a BCAU is to improve the quality of search results, and it seems odd that Google would decide that it is acceptable to let that quality slide. On the other hand, everything Google does in search it can probably do differently. In other words, Google might change the way it rolls out updates so that BCAUs become obsolete.