Google rolled out a Broad Core Algorithm Update on December 3, some six weeks later than I thought would be the drop-dead date for a 2020 rollout.
A Google Broad Core Algorithm Update (BCAU) is more or less a reassessment of quality and relevance. The March 2018 update placed more emphasis on relevance, while the August 2018 update was heavy on the quality aspects of web sites. Most BCAUs seem to fall somewhere between those two extremes.
A BCAU tends to affect the competitiveness of a website broadly. Small ranking changes can have a major impact on traffic. One site lost the number one rankings it had for a set of queries, and with them 30% of the site's traffic, even though it had fallen only to second or maybe third for most of the keywords. Fortunately for that particular web site the query set wasn't central to the site's business, so the traffic loss had very little impact on revenue.
A web site negatively impacted by a BCAU has two options:
1) Do nothing and hope for better luck with the next update. This can work. I have seen it work.
2) Attempt to improve aspects of the website. This can also work. I think I've seen it work, but it's frankly impossible to know whether the changes made the difference or Google just changed its mind about the site's relevance and quality. This is especially true since a site can gain traffic from a BCAU in one language but lose traffic in another.
The most challenging thing about losing traffic to a BCAU is that it usually takes another BCAU to recover the lost traffic.
A not particularly important question that occasionally gets asked is how many links Google will crawl on one web page. I call it an unimportant question because for the vast majority of web sites the answer is quite simply "enough." Experimentation, however, makes it possible to hazard an educated guess that is a bit more specific than that.
One thing to keep in mind is that the answer could mean a couple different things. The maximum number of links Google extracts from a page may be more than the maximum number of links that it will use in the linkgraph.
Someone has created a web page that contains links to 1,000,026 different domains. As it happens, Google has indexed the page, but not the entire page. If you do a site-colon (site:) search for the page along with an anchor text, anchor texts that are among the first 50,000 links are returned; after that you get nothing. This suggests that Google extracts, and possibly indexes, that many links before simply stopping. It could be that the total number of characters in those 50,000 anchor texts happens to match an unrelated character limit, but that seems less likely.
Assuming that 50,000 is a count with an error range of no more than 40,000, the answer "enough" remains quite correct.
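The behavior the experiment suggests can be sketched as a link extractor with a hard cap. This is a minimal illustration, not Google's actual pipeline: the class name is made up, and the 50,000 figure is the observed estimate from the experiment above, not a documented limit.

```python
from html.parser import HTMLParser

class CappedLinkExtractor(HTMLParser):
    """Collects href values from <a> tags, ignoring links past a fixed cap.

    Hypothetical sketch: the default cap of 50,000 mirrors the estimate
    from the site: search experiment, not any documented Google limit.
    """

    def __init__(self, cap=50_000):
        super().__init__()
        self.cap = cap
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Stop collecting once the cap is reached, as the experiment
        # suggests Google does.
        if tag == "a" and len(self.links) < self.cap:
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A toy page with 70 outbound links and a cap of 50 for demonstration.
html = "".join(
    f'<a href="https://example-{i}.test/">anchor {i}</a>' for i in range(70)
)
parser = CappedLinkExtractor(cap=50)
parser.feed(html)
print(len(parser.links))  # → 50; the last 20 links are never extracted
```

Whether the real cutoff counts links, characters, or bytes is exactly the ambiguity the experiment can't resolve; the sketch simply counts links.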
Google is pushing AMP Web Stories again. In Google search results they are presented as Visual Stories:
A web story is a single-screen, multi-pane presentation format that allows text, images, video, and animations. I have been told it's basically Instagram Stories.
Google currently isn't giving Visual Stories the kind of preferential treatment that Top Stories enjoys, so it's not likely to be a traffic driver of nearly the same proportions. However, if Visual Stories get good play in Discover it could have material impact on traffic flow.
Since 2018, Google has announced a small number of so-called Broad Core Algorithm Updates (BCAU) each year. These updates can greatly affect the organic search traffic a web site receives from Google. A negatively affected web site can lose 30% or 40% of its traffic, pretty much overnight. Other websites can see gains of a similar magnitude. For most web sites, though, a BCAU does not dramatically or even noticeably affect traffic.
The most recent BCAU rolled out on May 4, 2020. Google usually avoids major SERP or ranking shake-ups during the run-up to the holiday shopping season, which starts at the latest on Black Friday in the United States. If this historical pattern holds, the window for the next BCAU is basically any day between now and the end of October (Halloween). My guess is that it is going to happen no later than October 15, but I think it is going to roll out in September. A September rollout will give Google an opportunity to adjust results that aren't to the company's liking. It will also give the company time to add smaller but significant changes as well as adjust SERP layouts and features.
Can Google choose not to roll out another BCAU this year? I think the answer to that question is no and yes. As I understand it, the purpose of a BCAU is to improve the quality of search results. It seems odd that Google would decide that it is acceptable to let that quality slide. On the other hand, everything Google does in search it can probably do differently. In other words, Google might change the way it rolls out updates so that BCAUs become obsolete.