Links = Rank
The old Google (pre-Panda) largely boiled down to the following: links = rank.
Once you had enough links pointing at a site, you could literally pour content into that site like water and rely on the sitewide domain authority to help everything on the site rank quickly.
As much as raw PageRank was hyped and important, it was also important to have a diverse set of linking domains and keyword-focused anchor text.
Brand = Rank
After Vince and then Panda, the reputation of a site (or, rather, the ranking signals that could best approximate it) became integral to the ability to rank well.
Panda took into account factors beyond links, and during its first rollout it hit everything on a given domain or subdomain. Some sites like HubPages moved their user-generated content onto per-author subdomains. And some aggressive spammers moved their entire site to a fresh subdomain each time a Panda update rolled out. That let those sites recover immediately from the first couple of Panda updates, but Google eventually closed the loophole.
Any signal you rely on ends up being intentionally or unintentionally abused. And over time this leads to a "sameness" in the result set unless other signals are layered in:
Google is absolute junk when searching for anything related to a product. If I am trying to learn something, invariably I have to append another source like Reddit to the Google query. For example, I discovered the concept of weighted blankets and was intrigued. So I searched for "why use a weighted blanket" and "benefits of weighted blankets." Searching just the phrase "weighted blanket," I got pages and pages of nothing but ads trying to sell them, and no meaningful discussion of why I would use one.
Over time, as Google refined Panda, broad horizontal sites outside the news vertical often had a tough time unless they were dedicated to a specific media format or had strong user engagement metrics like a solid social network. This is a big part of the reason the New York Times sold About.com for less than they paid for it, and after IAC bought it they split it into a variety of niche sites: Verywell (health), The Spruce (home decor), The Balance (personal finance), Lifewire (technology), TripSavvy (travel) and ThoughtCo (education and personal development).
Penguin also demoted sites with aggressive anchor text built on low-quality links. When the Penguin update launched, Google also rolled out an on-page spam classifier to further obscure the update. And the Penguin update was sandwiched between Panda updates on either side, making it difficult to reverse engineer any single signal from the weekly winner/loser lists published by services that aggregate huge amounts of keyword ranking data.
Much of the link graph has been decimated, and Google reversed its position on nofollow: as of March 1 this year, they treat it as a hint rather than a directive for ranking purposes. Many mainstream media websites either overuse nofollow or don't cite sources at all, so this extra layer of obfuscation lets Google find more signal in the noise.
May 4, 2020 Algo Update
On May 4, Google rolled out another core update.
Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we've covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G — Google SearchLiaison (@searchliaison) May 4, 2020
I have seen some sites that had been buried for years see a big jump. But many things changed at the same time.
On some political search queries that were previously categorized as news-related, Google now tries to limit political blowback by displaying official sites and data extracted from those official sites, rather than putting news coverage first.
"Google has made it clear that they're not going to push news sites on election-related queries. You scroll down and you get a giant election widget on your phone showing all the different data on the primary results, then you scroll further and find Wikipedia and other similar historical references. Before you even get to a single news article, it's pretty crazy to see how Google has changed the way the SERP is laid out."
This change reflects a permanent shift in the news media ecosystem caused by the web.
The Internet has commoditized the distribution of facts. The "news" media responded by pivoting wholesale into opinions and entertainment. — Naval (@naval) May 26, 2016
A blog post by Lily Ray of Path Interactive used Sistrix data to show that many of the sites seeing high volatility were in the healthcare sector and other your-money-your-life (YMYL) categories.
One of the most interesting comments on the update came from Rank Ranger, which looked at specific pages that jumped or fell hard on the update. They noticed that sites which prominently featured ads, or content formatted to look like ads, may have seen heavy drops on some of those heavily monetized money pages:
…all of which cements the notion (in my mind at least) that Google did not want content unrelated to the page's main purpose appearing above the fold ahead of the page's main content! Now for the second wrinkle in my theory… Many of the pages that swapped in for the losers did not use the format described above, where a series of "navigation boxes" dominated the page above the fold.
The above change has had a big impact on some sites worth serious money. Intuit paid over $7 billion to acquire Credit Karma, but their credit card affiliate pages recently slid hard.
Credit Karma lost 40% of their traffic on the May core update. It's crazy. They run big TV ad campaigns and probably pay millions in SEO fees. Think about it. Your site is not safe. Google radically changes what it wants with each update, without telling us anything! – SEOwner (@tehseowner) May 14, 2020
The type of change above reflects how much more granular Google's algorithms have become. At first, Panda was all or nothing. Then it began to have different levels of impact across different parts of a site.
Brand was in some ways a band-aid, or a rising tide that lifted all (branded) boats. Now we are seeing Google get more granular with their algorithms, where even a strong brand may not be enough if they consider a page's monetization excessive. The same focus on layout can have an even bigger negative impact on smaller niche websites.
One of my old legacy clients had a site primarily monetized through the Amazon affiliate program. About a month ago, Amazon cut affiliate commissions roughly in half, and then aggressive ad placement cut the site's search traffic in half when rankings slid on this update.
The site had trended downward over the past two years, largely from neglect, as it was still a small side project. They improved some of the content about a month ago, and that helped, but then this update arrived. As long as the ad layout doesn't change, the declines are likely to continue.
They have since removed that ad unit, but that meant another drop in revenue, because until another major update rolls out they will likely remain at around half their prior search traffic. So now they have half of a half of a half. Fortunately the site had no full-time employees, or those people would be among the millions of newly unemployed. The experience, though, really shows how websites can become almost like distressed, debt-laden businesses overnight. Who can watch income slide roughly 88% and then invest more into the property with the remaining 12% while waiting a quarter or more for the site to be re-evaluated?
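The compounding behind "half of a half of a half" works out roughly like this. A back-of-the-envelope sketch; the 50% figures are approximations drawn from the anecdote above, not measured values:

```python
# Three roughly independent ~50% hits compound multiplicatively.
# These figures are illustrative estimates, not measured data.
commission_cut = 0.5   # Amazon roughly halved affiliate commissions
traffic_cut = 0.5      # rankings slid on the core update
ad_removal_cut = 0.5   # pulling the aggressive ad unit halved RPM again

remaining = commission_cut * traffic_cut * ad_removal_cut
print(f"Revenue remaining: {remaining:.1%}")   # 12.5%
print(f"Revenue decline:   {1 - remaining:.1%}")  # 87.5%
```

Which is where the "slide around 88% ... remaining 12%" figures come from.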
"If you have been negatively impacted by a core update, you (mostly) cannot see recovery from that update until another core update rolls out. Furthermore, you will only see recovery if you have improved the site considerably over the long term. If you haven't done enough to improve the site overall, you may have to wait through several updates to see an increase as you continue improving the site. And since core updates are usually separated by 3-4 months, that means you may be waiting quite a while."
Almost no one can afford to do this unless the site is just a side project.
Google could choose to run major updates more frequently, allowing sites to recover faster, but they gain an economic advantage by dampening SEO investment and adding opportunity cost to aggressive SEO strategies, ensuring that demotions from major updates last a season or longer.
Choose a strategy or let things come to you
They probably should have reduced their ad density when they made those other improvements. If they had, their worst case would likely have been stable rankings, with increases likely as competing websites fell. Instead, they are rolling with half of a half of a half on the revenue front. Glenn Gabe preaches the importance of fixing everything you can find rather than fixing one or two things and hoping it's enough. If you have a borderline site, you have to weigh the trade-offs between different approaches to monetization:
- monetize lightly and hope the site performs well for many years
- monetize somewhat aggressively, using the extra revenue to further improve the site elsewhere and to make sure you can survive the lean months
- aggressively monetize shortly after a major ranking update (if the site was previously lightly monetized), then hope to sell it a month or two later, before the next major algorithm update
The results will depend partly on timing and luck, but consciously choosing a strategy is likely to yield better returns than a bit of mix-n-match with your head buried in the sand.
Reading Algo Updates
You can spend 50 or 100 hours reading blog posts about an update and learn precisely nothing in the process if you don't know which authors are full of it and which are reading the right signals.
But how do you know who knows what they’re talking about?
This is more than a little tricky, because the people who know the most often have no economic incentive to publish detailed analyses of an update. If you primarily monetize your own websites, the broader market remaining ignorant is a big part of your competitive advantage.
Making it even more difficult: the less you know, the more likely Google is to trust you as a channel for their official messaging. If you syndicate their message without questioning it, you get a treat: more exclusives. If you question their message in ways that undermine their goals, you quickly become persona non grata, something CNET learned many years ago after publishing personal details about Eric Schmidt that were sourced from Google searches.
You would be unlikely to see the following type of Tweet from the likes of Blue Hat SEO or Fantomaster:
I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e., if the Washington Post mentions you, that's good.
To read the algorithms well, you need to track certain market sectors and keyword groups that you know well. Passively accumulating an archive of historical data makes major shifts stand out quickly.
Everyone who depends on SEO for a living should subscribe to an online rank tracking service or run something like Serposcope locally to track at least a dozen or two dozen keywords. If you track rankings locally, it's a good idea to use a set of web proxies and run the queries slowly through each one so you don't get blocked.
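The "run queries slowly through each proxy" idea can be sketched as a simple scheduler. A minimal sketch with made-up proxy hostnames; the actual fetching function is left to the caller, since any real rank check depends on the tool or scraper you use:

```python
import itertools
import time

def schedule_checks(keywords, proxies, delay_seconds=30, fetch=None):
    """Rotate each ranking query through a pool of proxies, pausing
    between real requests so no single proxy hammers the search engine.
    `fetch` is a caller-supplied function (keyword, proxy) -> result;
    when it is None we just record the plan instead of hitting the network."""
    plan = []
    proxy_cycle = itertools.cycle(proxies)  # round-robin over the pool
    for keyword in keywords:
        proxy = next(proxy_cycle)
        if fetch is not None:
            fetch(keyword, proxy)
            time.sleep(delay_seconds)  # throttle real requests
        plan.append((keyword, proxy))
    return plan

# Example: plan 4 keyword checks across 2 hypothetical proxies.
plan = schedule_checks(
    ["weighted blanket", "credit karma", "serposcope", "seo tools"],
    ["proxy-a.example:8080", "proxy-b.example:8080"],
)
print(plan)
```

Round-robin rotation plus a fixed delay is the simplest safe pattern; a real setup might randomize the delay as well.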
You need to track at least a reasonably diverse cross-section to get a real sense of algorithmic changes:
- a few different industries
- a couple of different geographic markets (or at least some local-intent vs. national-intent terms within one country)
- a mix of head, midtail, and longtail keywords
- sites of different sizes, ages, and brand strength within a given market
Some tools make it easy to quickly add or remove graphs of anything that moved within the top 50 or 100 results, which helps you spot outliers quickly. And some tools also make it easy to compare rankings over time. As updates roll out, you will often see multiple sites make big moves at the same time, and if you know the keyword, the market, and the sites well, you may have a good idea of what changed to cause those moves.
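Spotting those simultaneous big movers is straightforward once you have before/after positions. A minimal sketch using invented example data; a real version would pull positions from your rank tracker on either side of the update date:

```python
# Flag big ranking moves across an update from before/after snapshots.
# The rank data below is made up for illustration.
def biggest_movers(before, after, min_move=10):
    """Return (site, delta) pairs whose rank changed by at least
    `min_move` positions, sorted by size of the move. A negative delta
    means the site moved up (e.g. 25 -> 4 is a delta of -21)."""
    movers = []
    for site in before.keys() & after.keys():  # only sites in both snapshots
        delta = after[site] - before[site]
        if abs(delta) >= min_move:
            movers.append((site, delta))
    return sorted(movers, key=lambda m: abs(m[1]), reverse=True)

# Hypothetical positions for one keyword, before and after an update.
before = {"bigbrand.example": 3, "niche-blog.example": 25, "aggregator.example": 8}
after = {"bigbrand.example": 5, "niche-blog.example": 4, "aggregator.example": 41}

print(biggest_movers(before, after))
# → [('aggregator.example', 33), ('niche-blog.example', -21)]
```

Here the aggregator fell 33 spots while the niche blog jumped 21, exactly the kind of paired outlier movement worth investigating, while the small 2-spot wobble on the big brand is filtered out as noise.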
Once you see someone mention outliers that most people missed, and it matches what you see in your own data set, your confidence in them rises and you can spend more time trying to understand which signals changed.
I have read influential industry writers state that links were heavily devalued in this update. I have also read tweets like this one that could potentially indicate otherwise.
If I had little or no data of my own, I would not be able to extract a signal from that range of opinions. I would be stuck at "who knows."
By monitoring my own data, I can quickly determine which claim better matches what I saw in my subset of the data and form a more solid hypothesis.
No smoking gun
As Glenn Gabe likes to say, sites that get hit usually have several major problems.
Google rolls out major updates rarely enough that they can sandwich several different changes into the same update, making reverse engineering harder. So it helps to read widely with an open mind and imagine which signal changes could produce the kinds of ranking shifts you see.
Sometimes site-level data is enough to understand what changed, but as the Credit Karma example above showed, sometimes you need to be much more precise and look at page-level data to form a solid hypothesis.
As the world changes, the web also changes
About 15 years ago, online dating was seen as a weird niche for recluses who might be afraid of meeting real people in person. There are now all kinds of specialized niche dating sites, including a variety of DTF-ish apps. What was once weird and fringe became normal over time.
Fear of COVID-19 will cause lasting changes in consumer behavior that accelerate the shift to online commerce. A decade of change will happen in a year or two across many markets.
Telemedicine will grow rapidly. Facebook is adding commerce directly to its platform through a partnership with Shopify. Spotify is spending heavily to purchase exclusive distribution rights to widely followed podcasters like Joe Rogan. Uber recently offered to acquire Grubhub. Google and Apple will keep adding payment functionality to their mobile devices. Movie theaters have lost much of their appeal.
Tons of leveraged offline businesses ended up with no value after months of vanished revenue while large outstanding debts kept accruing interest. There is a belief that some of these brands will retain high latent brand value that carries over online, but if they were weak even when offline stores acting as interactive billboards subsidized consumer awareness of their brands, then as those stores close, the awareness and loyalty built through in-person interactions will dry up too. It is unlikely that a shell of a company rebuilt around the Toys "R" Us brand will win out over Amazon's parallel offering, or over a company that still operates offline stores.
Big box retailers like Target and Walmart have seen online sales grow on the order of 100% year-over-year.
There will be waves of bankruptcies, dramatic changes in commercial real estate prices (already reflected in falling REIT prices), and more people working remotely (shifting residential real estate demand from urban cores to the suburbs).
Remote workers are easier to hire and fire. Those who keep honing their skills will keep being rewarded, while those who don't will churn every year or two. The lack of stability will increase demand for education, though much of that incremental demand will be for new technologies and specific sectors: certificates or informal training programs rather than degrees.
More and more activities will become normal online activities.
The University of California system has about half a million students, and in the fall semester they will try to hold most of those courses online. How much usage data does Google acquire as thousands of institutions increasingly move their infrastructure and services online?
Colleges have to convince students that next year's remote education is worth as much as in-person education, and then walk it back before students start to believe it.
It's like being forced to sell your competitor's product for one year. – Naval (@naval) May 6, 2020
Lots of B- and C-tier schools will go under as apples-to-apples comparisons become easier. When I was running a membership site here, a college paid us so their students could access our membership area. As online education normalizes, many unofficial trade-related sites will look more economically attractive on a relative basis.
If major state institutions deliver most of their services online, other organizations are likely to follow. When big cities publish lists of crimes they will not respond to during an economic downturn, they effectively subsidize more crime. That in turn makes moving to a slightly more rural, cheaper location sensible, especially when you no longer need to live near your employer.
The most important implication of this permanent shift to working from home may be state income taxes.
Warm, sunny states with affordable housing and zero income taxes will see an influx of educated, wealthy workers. Other states will have to cut taxes to keep up.
Originally posted 2020-06-06 01:17:35.