Search Engine Optimization

Last update: July 20, 2023.

An examination of Google Search Generative Experience

Search Generative Experience is Google's AI-powered partial response to ChatGPT. An examination of SGE's handling of the query [is Margot Robbie mid].

Big Site Tech SEO FTW?

How tech SEO may have helped big sites grow traffic.

Google Explore SERP Feature

Google has added a new SERP feature called Explore.

June and July Broad Core Algorithm Updates, the launch of Page Experience and much more

Google had a busy couple of months in June and July, 2021.

Page Experience shakes up Top Stories content

Google finally launched the Page Experience signal in July, only a couple of months later than originally announced, which isn't bad given the difficulties since March 2020. The most obvious effect has been on content in the Top Stories carousels in mobile SERPs. With the Page Experience rollout, AMP pages are no longer required for eligibility in those carousels.

A two-part Broad Core Algorithm Update

Google also rolled out a Broad Core Algorithm Update in two parts. The first one started on June 3, the second on July 1. As always, some sites gained traffic, other sites lost traffic, and some sites weren't particularly affected at all. The rollout of the first update came on the heels of Memorial Day and the second just before Independence Day, which made week-over-week comparisons a bit tricky.

Probably the single most interesting observation I made is that a site that had disallowed a large number of pages saw a marked increase in traffic to those pages immediately with the July 1 wave. I speculate that the increase was caused by a reset of the pages' importance (the pages belong to a category-leading website).

Page Experience signal delayed but added to GSC

On April 21, Google announced that the Page Experience signal, first announced in May last year, will not be rolled out in May this year as previously stated but instead in mid-June. Google expects the rollout to be complete in August.

At the same time, Google added Page Experience reporting to Google Search Console.
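Since the Page Experience signal is built largely on the Core Web Vitals, one way to sanity-check a page outside of Search Console is to pull the field data from the Chrome UX Report (CrUX) API. Here is a minimal sketch; the API key and URL are placeholders, not values from this post.

```python
# Minimal sketch: query the CrUX API for the Core Web Vitals field data
# that feeds the Page Experience signal. API key and URL are placeholders.
import json
import urllib.request

API_KEY = "YOUR_CRUX_API_KEY"  # assumption: a Google Cloud project with the CrUX API enabled
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

payload = json.dumps({"url": "https://www.example.com/", "formFactor": "PHONE"}).encode("utf-8")
request = urllib.request.Request(ENDPOINT, data=payload, headers={"Content-Type": "application/json"})

with urllib.request.urlopen(request) as response:
    record = json.load(response)["record"]

# Print the 75th-percentile value for each Core Web Vital present in the record.
for metric in ("largest_contentful_paint", "first_input_delay", "cumulative_layout_shift"):
    data = record["metrics"].get(metric)
    if data:
        print(metric, data["percentiles"]["p75"])
```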

An example of Google's struggle with article dates

Google representatives have discussed many times how difficult it can be to get the right date for an article, so this entry is by no means a gotcha.

On March 8, 2021, NBC Sports Boston reporter Tom E. Curran posted an article in which he opined that New England Patriots head coach Bill Belichick needs to publicly address the state and plans of the team, something the selectively tight-lipped coach is unlikely to do. The article was posted on NBC Sports and republished on Yahoo. As of this writing, Google has identified three different publishing dates for the version posted on NBC Sports:

Top Stories tile that identifies the linked article as being published 21 hours ago.

Top Stories tile that identifies the linked article as being published four days ago.

Top Stories tile that identifies the linked article as being published in 2017, almost four years ago.

Google used the right date for the republished article on Yahoo, but twice got the date wrong for the article on NBC Sports. What might have caused Google to sometimes pick the wrong date? The 2017 date was probably picked from the following paragraph:
Screengrab of paragraph from article that contains the phrase 'Since trading Jimmy Garoppolo in October, 2017...'

My guess is that Google went with the 15th simply because a mid-month date is never more than about two weeks off.

I hazard no guess for what provoked Google to guess "4 days ago."

The article does contain valid structured data that identifies the publish date as March 8, 2021, and there is on-page text that states "Mar 8, 2021" as the publishing date.
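For reference, the date Google should have picked lives in the page's JSON-LD. Here is a minimal sketch of pulling datePublished out of an article's structured data; the URL is a placeholder, and requests plus BeautifulSoup are assumed to be installed.

```python
# Minimal sketch: extract the datePublished value from a page's JSON-LD
# structured data, the same field Google can use to date an article.
import json

import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/some-article", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        continue
    # A JSON-LD block can hold a single object or a list of objects.
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict) and "datePublished" in item:
            print(item.get("@type"), item["datePublished"])
```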

This entry was posted on March 9, 2021.

Checking in on BERT 16 months later

On October 25, 2019, Google announced that it was using BERT to better understand queries. The announcement used five example queries to illustrate how BERT improves Google's query understanding: [2019 brazil traveler to usa need a visa], [do estheticians stand a lot at work], [can you get medicine for someone pharmacy], [parking on a hill with no curb], and [math practice books for adults].

The purpose of the "2019 brazil traveler to usa need a visa" example was to show how BERT understands that the query is about a person needing a visa to travel from Brazil to the U.S., something Google's BERT-less query analyzer didn't understand, instead interpreting it as a question about whether someone from the U.S. needs a visa to travel to Brazil. This is what it looks like now:
Screengrab of SERP for query '2019 brazil traveler to usa need a visa'. The Featured Snippet states that U.S. citizens do not need a visa to travel to Brazil.
For whatever reason, Google's understanding of the query has regressed to pre-BERT days. As you can see, Google also suggests an alternative query, and it understands that query the same way:
Screengrab of SERP for query '2019 brazil travel to usa need a visa'. The Featured Snippet states that U.S. citizens do not need a visa to travel to Brazil.
Changing the query to "2021 brazil traveler to usa need a visa" does not change Google's understanding of the query. Contrary to what the 2019 blog post claims, Google sees someone looking for information about whether an American needs a visa to travel to Brazil.
Screengrab of SERP for query '2021 brazil traveler to usa need a visa'. The Featured Snippet states that U.S. citizens do not need a visa to travel to Brazil.

Google does better with the second query, [do estheticians stand a lot at work]. Before BERT, Google failed to understand that the query was related to the physical demands of an esthetician's work. However, the first result for the query is a Featured Snippet for a web page that was created by an SEO professional in response to the BERT announcement, and the result feels more like traditional keyword matching than ML/AI.
Featured Snippet returned for the query 'do estheticians stand a lot at work'

The third query - "can you get medicine for someone pharmacy" - is a clear win for Google. The number one result is the same page as 16 months ago, but now it is a featured snippet that even more clearly states that you may be able pick up prescription medicine for someone else at a phramcy.
Featured Snippet returned for the query 'can you get medicine for someone pharmacy'

"Parking on a hill with no curb," the fourth example query, caused Google problems because of the word "no" which Google ignored, pre-BERT, in effect inverting the meaning of the query. (RankBrain allegedly made some headway with understanding when to use certain stop words to understand a query but apparently the word "no" wasn't among them). Here it's more of a tie for Google. Once again the number one result ius a featured snippet, but it starts off addressing how to park when there is a curb.
Featured snippet for the query 'parking on a hill with no curb'. The result addresses parking where there is a curb before addressing no-curb situations.

The final query in the BERT announcement is "math practice books for adults." Before BERT, Google didn't see the query as being about math books for adults. Now it pulls up a set of results that mostly are math books for adults. A clear win for Google.
Top results for query 'math practice books for adults'

The important role of Knowledge Graph entities in Google search results.

"Things not strings" is a key mantra in current-era Google. It refers to Google evaluating search queries for Things, ie entities in Google's Knowledge Graph, rather than the simply the string of words that make up the search query. If you search for [san diego zoo] Google will first look for an entity that is contained in the query. That entity, if it exists, will be the core of the search results. You can watch Google's Paul Haahr explain the process in this presentation from SMX West 2016 (Watch the whole video while you're at it. It is by far the best Google Search video I've seen).

Many queries contain multiple entities or ambiguous references to one or more entities, or contain one or more entities plus additional words that add information about the intent of the search.

The query [troy brown san diego chargers] contains two entities (Troy Brown and San Diego Chargers) and returns an interesting result that contains a Knowledge Panel for former San Diego Chargers safety Marlon McCree.
A Google Knowledge Panel for former San Diego Chargers safety Marlon McCree.
A Knowledge Panel is a way for Google to present information about an entity on a search results page.

This illustrates the importance not only of entities but of the relationships between entities, because there is indeed a relationship between McCree and former New England Patriots wide receiver Troy Brown. In a 2006 AFC Divisional playoff game between the Chargers and the visiting Patriots, McCree intercepted a fourth-quarter pass from Tom Brady. The interception would have dashed the Patriots' comeback had Brown not ripped the ball out of McCree's hands. Another Patriots player recovered the fumble, and the Patriots went on to win the game.

A reasonable question is how much of the connection between McCree and Brown is based on text (corpus) and how much is based on queries along the lines of "who did troy brown strip the ball from against the chargers in 2006 playoffs?"

Key takeaways: The Knowledge Graph strongly influences Google SERP and Troy Brown was an awesome football player.

A few AMP Web Stories

Links to AMP web stories I made in 2018, when Web Stories were an experimental part of AMP, plus one web story I made recently.

February 22 update: I added two new web stories. One is a straightforward multi-page story while the other just looks like one.

SERP Features Observations

Google search results include a lot more than ads and the traditional organic search results (often referred to as "the ten blue links"). Here's a non-exhaustive collection of SERP features and how they can be used by publishers to get more exposure on the first page of Google search results.

A perhaps interesting note is that Google has not yet indexed the page, which until this posting had been linked to only by a tweet. (February 2, 2021 update: The page showed up in Google's index today. Links work.)

Google's December 2020 Broad Core Algorithm Update

Google rolled out a Broad Core Algorithm Update on December 3, some six weeks later than I thought would be the drop-dead date for a 2020 rollout.

A Google Broad Core Algorithm Update (BCAU) is more or less a reassessment of quality and relevance. The March 2018 update placed more emphasis on relevance, while the August 2018 update was heavy on the quality aspects of websites. Most BCAUs seem to fall somewhere between those two extremes.

A BCAU tends to affect the competitiveness of a website broadly, and small ranking changes can have a major impact on traffic. One site lost the number one rankings it had for a set of queries, and with them 30% of the site's traffic, even though it had fallen only to second or maybe third for most of the keywords. Fortunately for that particular website, the query set wasn't central to the site's business, so the traffic loss had very little impact on revenue.
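To put some rough numbers on that, here is a back-of-the-envelope sketch. The click-through rates and impression volume below are illustrative assumptions, not data from the site mentioned above.

```python
# Back-of-the-envelope sketch of why a one- or two-position drop can cost a
# large share of traffic. CTR values and impressions are hypothetical.
assumed_ctr = {1: 0.28, 2: 0.15, 3: 0.10}  # assumed click-through rate by position

monthly_impressions = 100_000  # hypothetical impressions for the affected query set

before = monthly_impressions * assumed_ctr[1]
after = monthly_impressions * assumed_ctr[2]

loss = (before - after) / before
print(f"Clicks before: {before:.0f}, after: {after:.0f}, loss: {loss:.0%}")
# With these assumptions, falling from #1 to #2 alone costs roughly 46% of the
# query set's clicks, so a ~30% hit to that traffic segment is easy to picture.
```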

A web site negatively impacted by a BCAU has two options:
1) Do nothing and hope for better luck with the next update. This can work. I have seen it work.
2) Attempt to improve aspects of the website. This can also work. I think I've seen it work, but it's frankly impossible to know whether the changes made the difference or Google simply changed its mind about the site's relevance and quality. This is especially true since a site can gain traffic from a BCAU in one language but lose traffic in another.

The most challenging thing about losing traffic to a BCAU is that it usually takes another BCAU to recover the lost traffic.

How many links will Google crawl on one page?

A question that occasionally comes up, though it's not particularly important, is how many links Google will crawl on one web page. I call it an unimportant question because for the vast majority of websites the answer is quite simply "enough." Still, experimentation makes it possible to hazard an educated guess that is a bit more specific than that.

One thing to keep in mind is that the answer could mean a couple of different things: the maximum number of links Google extracts from a page may be higher than the maximum number of links it will use in the link graph.

Someone has created a web page that contains links to 1,000,026 different domains. As it happens, Google has indexed the page, but not the entire page. The anchor texts for almost exactly the first 50,000 links are returned if you do a site: search for the page combined with an anchor text. Anchor texts among the first 50,000 links return the page; after that you get nothing. This suggests that Google extracts, and possibly indexes, that many links before simply stopping. It could be that the total number of characters in the first 50,000 anchor texts happens to match an unrelated character limit, but that seems less likely.
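For anyone who wants to repeat a piece of this experiment, here is a minimal sketch that counts the links a parser can extract from a single page. The URL is a placeholder, and requests plus BeautifulSoup are assumed to be available.

```python
# Minimal sketch: count the outgoing links on a single page and flag whether
# the total clears the ~50,000 mark observed above. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

OBSERVED_CUTOFF = 50_000  # approximate cutoff suggested by the site: searches above

html = requests.get("https://www.example.com/huge-link-page", timeout=30).text
soup = BeautifulSoup(html, "html.parser")

links = [a["href"] for a in soup.find_all("a", href=True)]
print(f"Links found: {len(links)}")
if len(links) > OBSERVED_CUTOFF:
    print(f"Only the first {OBSERVED_CUTOFF:,} anchors appear to be searchable via site: queries.")
```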

Assuming that 50,000 is a count with an error range of no more than 40,000, the answer "enough" remains quite correct.

Google is pushing AMP Web Stories again

Google is pushing AMP Web Stories again. In Google search results they are presented as Visual Stories:
An example of how Google presents AMP Web Stories (Visual Stories) in search results.
A web story is a single-screen, multi-pane presentation format that allows text, images, video, and animations. I have been told it's basically Instagram Stories.

I created a few web stories a couple of years ago as I prepared for a brief presentation at Next10x 2018. One was a single-pane version that uses animation to show the brackets for the NHL Stanley Cup playoffs (it probably doesn't look quite right in desktop browsers). It turned out that the version I created back then is no longer AMP valid, as it was calling in a version of the AMP Project's Web Stories component JavaScript that has since been deprecated. So, an AMP Web Story that's 2.5 years old was no longer AMP valid.
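A quick way to check for this kind of rot is to look at which amp-story component script a story pulls in. Here is a minimal sketch, assuming the old 0.1 build is the deprecated one; the URL is a placeholder.

```python
# Minimal sketch: report which version of the AMP Web Stories component script
# a story page loads. URL is a placeholder; version strings are assumptions.
import re

import requests

html = requests.get("https://www.example.com/my-web-story/", timeout=10).text

scripts = re.findall(r'https://cdn\.ampproject\.org/v0/amp-story-([\d.]+)\.js', html)
for version in scripts:
    status = "deprecated" if version == "0.1" else "current"
    print(f"amp-story component version {version} ({status})")
if not scripts:
    print("No amp-story component script found.")
```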

Google currently isn't giving Visual Stories the kind of preferential treatment that Top Stories enjoys, so it's not likely to be a traffic driver of nearly the same proportions. However, if Visual Stories get good play in Discover it could have material impact on traffic flow.

When is the next Google Broad Core Algorithm Update coming?

Since 2018, Google has announced a small number of so-called Broad Core Algorithm Updates (BCAU) each year. These updates can greatly affect the organic search traffic a website receives from Google. A negatively affected website can lose 30% or 40% of its traffic, pretty much overnight, while other websites can see gains of a similar magnitude. For most websites, a BCAU does not dramatically, or even noticeably, affect traffic.

The most recent BCAU rolled out on May 4, 2020. Google usually avoids major SERP or ranking shake-ups during the run-up to the holiday shopping season, which at the latest starts on Black Friday in the United States. If this historical pattern holds, the window for the next BCAU is basically any day between now and the end of October (Halloween). My guess is that it is going to happen no later than October 15, but I think it will roll out in September. A September rollout would give Google an opportunity to adjust results that aren't to the company's liking, and it would also give the company time to add smaller but significant changes as well as adjust SERP layouts and features.

Can Google choose not to roll out another BCAU this year? I think the answer is no and yes. As I understand it, the purpose of a BCAU is to improve the quality of search results, and it seems odd that Google would decide it is acceptable to let that quality slide. On the other hand, everything Google does in search it can probably do differently. In other words, Google might change the way it rolls out updates so that BCAUs become obsolete.