The short version:
The long version: Inside the invisible government: war, propaganda, Clinton & Trump
Or, if you prefer video:
Remember the whole shtick about good, legitimate, high-quality content being created for readers without concern for search engines – as though search engines did not exist?
Whatever happened to that?
We quickly shifted from the above “ideology” to this:
The red triangle/exclamation point icon was arrived at after the Chrome team commissioned research around the world to figure out which symbols alarmed users the most.
Google is explicitly spreading the message that they are doing testing on how to create maximum fear to try to manipulate & coerce the ecosystem to suit their needs & wants.
At the same time, the Google AMP project is being used as the foundation of effective phishing campaigns.
Scare users off of using HTTP sites AND host phishing campaigns.
Killer job Google.
Someone deserves a raise & some stock options. Unfortunately that person is in the PR team, not the product team.
I’d like to tell you that I was preparing the launch of http://ift.tt/2evTSLc but awareness of past ecosystem shifts makes me unwilling to make that move.
I see it as arbitrary hoop jumping not worth the pain.
If you are an undifferentiated publisher without much in the way of original thought, then jumping through the hoops makes sense. But if you deeply care about a topic and put a lot of effort into knowing it well, there’s no reason to do the arbitrary hoop jumping.
Remember how mobilegeddon was going to be the biggest thing ever? Well, I never updated our site layout here & we still outrank a company which raised & spent tens of millions of dollars for core industry terms like [seo tools].
Though it is also worth noting that after factoring in increased ad load with small screen sizes & the scrape graph featured answer stuff, a #1 ranking no longer gets it done, as we are well below the fold on mobile.
In the above example I am not complaining about ranking #5 and wishing I ranked #2, but rather stating that ranking #1 organically has little to no actual value when it is a couple screens down the page.
Google indicated their interstitial penalty might apply to pop ups that appear on scroll, yet Google takes the liberty of installing a toxic enhanced version of the Diggbar at the top of AMP pages, which persistently eats 15% of the screen & can’t be dismissed. An attempt to dismiss the bar leads the person back to Google to click on a listing other than your site.
As bad as I may have made mobile search results appear earlier, I was perhaps being a little too kind. Google doesn’t even have mass adoption of AMP yet & they already have 4 AdWords ads in their mobile search results AND when you scroll down the page they are testing an ugly “back to top” button which outright blocks a user’s view of the organic search results.
What happens when Google suggests what people should read next as an overlay on your content & sells that as an ad unit where if you’re lucky you get a tiny taste of the revenues?
Is it worth doing anything that makes your desktop website worse in an attempt to try to rank a little higher on mobile devices?
Given the small screen size of phones & the heavy ad load, the answer is no.
I realize that optimizing a site design for mobile or desktop is not mutually exclusive. But it is an issue we will revisit later on in this post.
Many people new to SEO likely don’t remember the importance of using Google Checkout integration to lower AdWords ad pricing.
You either supported Google Checkout & got about a 10% CTR lift (& thus 10% reduction in click cost) or you failed to adopt it and got priced out of the market on the margin difference.
And if you chose to adopt it, the bad news was you were then spending yet again to undo it when the service was no longer worth running for Google.
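For anyone who never ran AdWords in that era, here is a rough sketch of why a CTR lift flows straight through to the click price. It assumes the simplified second-price formula Google has described publicly (actual CPC = the ad rank of the advertiser below you, divided by your quality score, plus a penny) and treats CTR as a stand-in for quality score; the figures are made up purely for illustration.

```typescript
// Simplified AdWords pricing: actualCpc = rivalAdRank / yourQualityScore + 0.01.
// CTR is used here as a rough proxy for quality score; all numbers are hypothetical.
function actualCpc(rivalAdRank: number, yourQualityScore: number): number {
  return rivalAdRank / yourQualityScore + 0.01;
}

const rivalAdRank = 10;      // bid x quality of the advertiser ranked just below you
const baselineQuality = 1.0; // indexed CTR before adding the Checkout badge
const withCheckout = 1.1;    // ~10% CTR lift from the badge

const before = actualCpc(rivalAdRank, baselineQuality); // ~$10.01
const after = actualCpc(rivalAdRank, withCheckout);     // ~$9.10

console.log(`CPC before: $${before.toFixed(2)}, after: $${after.toFixed(2)}`);
console.log(`Reduction: ${(100 * (before - after) / before).toFixed(1)}%`); // ~9%
```

The exact percentages varied by auction, but the mechanism is why sitting out the badge meant getting priced out on the margin.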
How about when Google first started hyping HTTPS & publishers using AdSense saw their ad revenue crash because the ads were no longer anywhere near as relevant?
Not like Google cared much, as it is their goal to shift as much of the ad spend as they can onto Google.com & YouTube.
It is not an accident that Google funds an ad blocker which allows ads to stream through on Google.com while leaving ads blocked across the rest of the web.
Android Pay might be worth integrating. But then it also might go away.
It could be like Google’s authorship. Hugely important & yet utterly trivial.
Faces help people trust the content.
Then they are distracting visual clutter that needs to be expunged.
Then they once again re-appear but ONLY on the Google Home Service ad units.
They were once again good for users!!!
Neat how that works.
Or it could be like Google Reader. A free service which defunded all competing products & then was shut down because it didn’t have a legitimate business model, having been built explicitly to prevent competition. With the death of Google Reader many blogs also slid into irrelevancy.
Their FeedBurner acquisition was icing on the cake.
Techdirt is known for generally being pro-Google & they recently summed up FeedBurner nicely:
Thanks, Google, For Fucking Over A Bunch Of Media Websites – Mike Masnick
Ultimately Google is a horrible business partner.
And they are an even worse one if there is no formal contract.
When Google routinely acts so anti-competitive & abusive it is no surprise that some of the “standards” they propose go nowhere.
You can only get screwed so many times before you adopt a spirit of ambivalence to the avarice.
Google is the type of “partner” that conducts security opposition research on their leading distribution partner, while conveniently ignoring nearly a billion OTHER Android phones with existing security issues that Google can’t be bothered with patching.
Deliberately screwing direct business partners is far worse than coding algorithms which belligerently penalize some competing services all the while ignoring that the payday loan shop funded by Google leverages doorway pages.
BackChannel recently published an article foaming at the mouth promoting the excitement of Google’s AI:
“This 2016-to-2017 Transition is going to move us from systems that are explicitly taught to ones that implicitly learn.” … the engineers might make up a rule to test against—for instance, that “usual” might mean a place within a 10-minute drive that you visited three times in the last six months. “It almost doesn’t matter what it is — just make up some rule,” says Huffman. “The machine learning starts after that.”
The part of the article I found most interesting was the following bit:
After three years, Google had a sufficient supply of phonemes that it could begin doing things like voice dictation. So it discontinued the [phone information] service.
Google launches “free” services with an ulterior data motive & then when it suits their needs, they’ll shut it off and leave users in the cold.
As Google keeps advancing their AI, what do you think happens to your AMP content they are hosting? How much do they squeeze down on your payout percentage on those pages? How long until the AI is used to recap / rewrite content? What ad revenue do you get when Google offers voice answers pulled from your content but sends you no visitor?
A recent Wall Street Journal article highlighting the fast ad revenue growth at Google & Facebook also mentioned how the broader online advertising ecosystem was doing:
Facebook and Google together garnered 68% of spending on U.S. online advertising in the second quarter—accounting for all the growth, Mr. Wieser said. When excluding those two companies, revenue generated by other players in the U.S. digital ad market shrank 5%
The issue is NOT that online advertising has stalled, but rather that Google & Facebook have choked off their partners from tasting any of the revenue growth. This problem will only get worse as mobile grows to a larger share of total online advertising:
By 2018, nearly three-quarters of Google’s net ad revenues worldwide will come from mobile internet ad placements. – eMarketer
Media companies keep trusting these platforms with greater influence over their business & these platforms keep screwing those same businesses repeatedly.
You pay to get likes, but that is no longer enough as edgerank declines. Thanks for adopting Instant Articles, but users would rather see live videos & read posts from their friends. You are welcome to pay once again to advertise to the following you already built. The bigger your audience, the more we will charge you! Oh, and your direct competitors can use people liking your business as an ad targeting group.
Worse yet, Facebook & Google are even partnering on core Internet infrastructure.
Any hope of AMP turning the corner on the revenue front is a “no go”:
“We want to drive the ecosystem forward, but obviously these things don’t happen overnight,” Mr. Gingras said. “The objective of AMP is to have it drive more revenue for publishers than non-AMP pages. We’re not there yet”.
Publishers who are critical of AMP were reluctant to speak publicly about their frustrations, or to remove their AMP content. One executive said he would not comment on the record for fear that Google might “turn some knob that hurts the company.”
Look at that.
Leadership through fear once again.
At least they are consistent.
As more publishers adopt AMP, each publisher in the program will get a smaller share of the overall pie.
Just look at Google’s quarterly results for their current partners. They keep showing Google growing their ad clicks at 20% to 40% while partners oscillate between -15% and +5% quarter after quarter, year after year.
In the past quarter Google grew their ad clicks 42% YoY by pushing a bunch of YouTube auto play video ads, faster search growth in third world markets with cheaper ad prices, driving a bunch of lower quality mobile search ad clicks (with 3 then 4 ads on mobile) & increasing the percent of ad clicks on “own brand” terms (while sending the FTC after anyone who agrees to not cross bid on competitor’s brands).
The lower quality video ads & mobile ads in turn drove their average CPC on their sites down 13% YoY.
The partner network is relatively squeezed out on mobile, which makes it shocking to see the partner CPC off more than core Google, with a 14% YoY decline.
What ends up happening is that eventually the media outlets get defunded to the point where they are sold for a song to a tech company or an executive at a tech company. Alibaba buying SCMP is akin to Jeff Bezos buying The Washington Post.
The Wall Street Journal recently laid off reporters. The New York Times announced they were cutting back local cultural & crime coverage.
If news organizations of that caliber can’t get the numbers to work then the system has failed.
The Tribune Company, already through bankruptcy & perhaps the dumbest of the lot, plans to publish thousands of AI assisted auto-play videos in their articles every day. That will guarantee their user experience on their owned & operated sites is worse than just about anywhere else their content gets distributed to, which in turn means they are not only competing against themselves but they are making their own site absolutely redundant & a chore to use.
That the Denver Guardian (an utterly fake paper running fearmongering false stories) goes viral is just icing on the cake.
These tech companies are literally reshaping society & are sucking the life out of the economy, destroying adjacent markets & bulldozing regulatory concerns, all while offloading costs onto everyone else around them.
An FTC report recommended suing Google for their anti-competitive practices, but no suit was brought. The US Copyright Office Register was relieved of her job after she went against Google’s views on set top boxes.
And in spite of the growing importance of tech, media coverage of the industry is a trainwreck:
This is what it’s like to be a technology reporter in 2016. Freebies are everywhere, but real access is scant. Powerful companies like Facebook and Google are major distributors of journalistic work, meaning newsrooms increasingly rely on tech giants to reach readers, a relationship that’s awkward at best and potentially disastrous at worst.
Being a conduit breeds exclusives. Challenging the grand narrative gets one blackballed.
Google announced they are releasing a mobile first search index:
Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.
There are some forms of content that simply don’t work well on a 350 pixel wide screen, unless they use a pinch to zoom format. But using that format is seen as not being mobile friendly.
Imagine you have an auto part database which lists alternate part numbers, price, stock status, nearest store with part in stock, time to delivery, etc. … it is exceptionally hard to get that information to look good on a mobile device. And good luck if you want to add sorting features on such a table.
There is a valid point to the argument that using the desktop version of a page to rank mobile results is flawed, because users might find something in the search results which is only available on the desktop version of a site. BUT, at the same time, a publisher may need to simplify the mobile site & hide data to improve usability on small screens & then only allow certain data to become visible through user interactions. Not showing those automotive part databases to desktop users would ultimately make desktop search results worse for users by leaving huge gaps in the search results. And a search engine choosing to not index the desktop version of a site because there is a mobile version is equally short sighted. Desktop users would no longer be able to find & compare information from those automotive parts databases.
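For what it is worth, the hide-data-then-reveal-on-interaction approach is straightforward to sketch. Here is a minimal illustration of my own (not anything prescribed by Google): assume the secondary columns of the parts table (alternate part numbers, stock status, delivery time, etc.) carry a hypothetical "secondary" class and the table uses the id "parts-table".

```typescript
// Progressive disclosure sketch: hide secondary columns on small screens,
// reveal them per-row when the user taps the row. Class and id names are
// assumptions made up for this example.
const SMALL_SCREEN = window.matchMedia("(max-width: 480px)");

function applyDensity(table: HTMLTableElement): void {
  // Collapse secondary cells on small screens; show everything on desktop.
  table.querySelectorAll<HTMLElement>(".secondary").forEach(cell => {
    cell.style.display = SMALL_SCREEN.matches ? "none" : "";
  });
}

function addRowToggles(table: HTMLTableElement): void {
  table.querySelectorAll<HTMLTableRowElement>("tbody tr").forEach(row => {
    row.addEventListener("click", () => {
      // Toggle the hidden detail cells for just this row.
      row.querySelectorAll<HTMLElement>(".secondary").forEach(cell => {
        cell.style.display = cell.style.display === "none" ? "" : "none";
      });
    });
  });
}

const partsTable = document.querySelector<HTMLTableElement>("#parts-table");
if (partsTable) {
  applyDensity(partsTable);
  addRowToggles(partsTable);
  SMALL_SCREEN.addEventListener("change", () => applyDensity(partsTable));
}
```

The desktop version keeps its full information density, while mobile users still get to the same data with a tap, which is the trade-off the mobile-first index glosses over.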
Once again money drives search “relevancy” signals.
Since Google will soon make 3/4 of their ad revenues on mobile that should be the primary view of the web for everyone else & alternate versions of sites which are not mobile friendly should be disappeared from the search index if a crappier lite mobile-friendly version of the page is available.
Amazon converts well on mobile in part because people already trust Amazon & already have an account registered with them. Most other merchants won’t be able to convert at anywhere near as high a rate on mobile as they do on desktop, so if you have to choose between having a mobile friendly version that leaves differentiated aspects hidden or a desktop friendly version that is differentiated & establishes a relationship with the consumer, the deeper & more engaging desktop version is the way to go.
The heavy ad load on mobile search results further combines with the low conversion rates on mobile to make building a relationship on desktop that much more important.
Even TripAdvisor is struggling to monetize mobile traffic, monetizing it at only about 30% to 33% of the rate they monetize desktop & tablet traffic. Google already owns most of the profits from that market.
Webmasters are better off NOT going mobile friendly than going mobile friendly in a way that compromises the ability of their desktop site.
Mobile-first: with ONLY a desktop site you’ll still be in the results & be findable. Recall how mobilegeddon didn’t send anyone to oblivion?— Gary Illyes (@methode) November 6, 2016
I am not the only one suggesting an over-simplified mobile design that carries over to a desktop site is a losing proposition. Consider Nielsen Norman Group’s take:
in the current world of responsive design, we’ve seen a trend towards insufficient information density and simplifying sites so that they work well on small screens but suboptimally on big screens.
Publishers are getting squeezed to subsidize the primary web ad networks. But the narrative is that as cross-device tracking improves some of those benefits will eventually spill back out into the partner network.
I am rather skeptical of that theory.
Facebook already makes 84% of their ad revenue from mobile devices where they have great user data.
They are paying to bring new types of content onto their platform, but they are only just now beginning to get around to test pricing their Audience Network traffic based on quality.
Priorities are based on business goals and objectives.
When these ad networks are strong & growing quickly they may be able to take a stand, but when growth slows the stock prices crumble, data security becomes less important during downsizing when morale is shattered & talent flees. Further, creating alternative revenue streams becomes vital “to save the company” even if it means selling user data to dangerous dictators.
The other big risk of such tracking is how data can be used by other parties.
Data is being used in all sorts of crazy ways the central ad networks are utterly unaware of. These crazy policies are not limited to other countries. Buying dog food with your credit card can lead to pet licensing fees. Even cheerful “wellness” programs may come with surprises.
On Friday Google’s Gary Illyes announced Penguin 4.0 was now live.
Key points highlighted in their post are:
Things not mentioned in the post
Since the update was announced, the search results have become more stable.
No signs of major SERP movement yesterday – the two days since Penguin started rolling out have been quieter than most of September.— Dr. Pete Meyers (@dr_pete) September 24, 2016
They still may be testing out fine tuning the filters a bit…
Fyi they’re still split testing at least 3 different sets of results. I assume they’re trying to determine how tight to set the filters.— SEOwner (@tehseowner) September 24, 2016
…but what exists now is likely to be what sticks for an extended period of time.
Now that Penguin is baked into Google’s core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed “quality” updates.
Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations in spite of the search results being more volatile than they have been in over 4 years.
There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:
Just about any of the algorithm volatility tools showed far more significant shift earlier in this month than over the past few days.
One issue with looking at any of the indexes is the rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the algorithm volatility scores are much higher than the actual shifts in search traffic (the least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news).
You can use AWR’s flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
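To make the point concrete, here is a rough back-of-the-envelope sketch: weight each ranking move by an assumed organic CTR curve and the "volatility" largely evaporates once you get past the first handful of positions. The CTR figures and keywords below are made-up placeholders, not AWR's (or anyone else's) actual methodology.

```typescript
// Compare raw rank movement (what flux tools count) against the change in the
// share of searchers who actually click, using an assumed CTR-by-position curve.
const assumedCtrByPosition: Record<number, number> = {
  1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
  6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018,
};

function ctr(position: number): number {
  return assumedCtrByPosition[position] ?? 0.005; // positions 11+ barely get clicked
}

interface RankMove { keyword: string; before: number; after: number; }

function summarize(moves: RankMove[]): void {
  for (const m of moves) {
    const rawShift = Math.abs(m.after - m.before);               // flux-tool view
    const trafficShift = Math.abs(ctr(m.after) - ctr(m.before)); // what you actually feel
    console.log(
      `${m.keyword}: moved ${rawShift} spots, ~${(trafficShift * 100).toFixed(1)} points of click share`
    );
  }
}

summarize([
  { keyword: "example term A", before: 1, after: 2 },   // 1 spot, big click-share swing
  { keyword: "example term B", before: 14, after: 24 }, // 10 spots, ~0 click-share change
]);
```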
I shut down our membership site in April & now spend most of my time reading books & news to figure out what’s next after search, but a couple legacy clients I am winding down still have me tracking a few keywords, & one of the terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.
Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.
As a comparison, here is that chart over the past 3 months.
Notice the big ranking moves which became common over the past month were not common the 2 months prior.
There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.
As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.
Part of the reason many people think there was no Penguin update, or responded to the update with “that’s it?”, is that few sites which were hit in the past recovered, relative to the number of sites which had ranked well until they got clipped by this algorithm update.
When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.
Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.
Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new “better than ever” spam fighting algorithm.
They’ll let some time pass before the penalized sites can recover.
Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they’ve accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.
So here are some of the obvious algorithmic holes left by the new Penguin approach…
The trite advice is to make quality content, focus on the user, and build a strong brand.
But you can do all of those well enough that you change the political landscape yet still lose money.
“Mother Jones published groundbreaking story on prisons that contributed to change in govt policy. Cost $350k & generated $5k in ad revenue”— SEA☔☔LE SEO (@searchsleuth998) August 22, 2016
Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.
And that is without getting hit by a penalty.
It is getting harder to win in search, period.
And it is getting almost impossible to win in search by focusing on search as an isolated channel.
I never understood mentality behind Penguin “recovery” people. The spam links ranked you, why do you expect to recover once they’re removed?— SEOwner (@tehseowner) September 25, 2016
Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.
Obviously removing them may get you out of algorithm, but then you’ll only have enough power to rank where you started before spam links.— SEOwner (@tehseowner) September 25, 2016
Anyone operating at scale chasing SEO with automation is likely to step into a trap.
When it happens, that player better have some serious savings or some non-Google revenues, because even with “instant” algorithm updates you can go months or years on reduced revenues waiting for an update.
And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.
“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts
Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.
They have not only grouped similar terms, but then they broadened out the data ranges to absurdly wide ranges like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even with that, there are limitations. Try to view too many terms and you get:
“You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours.”
Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:
“I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks.”
So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.
Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas with SEO it can take months or years to see if the investment paid off.
But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.
There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.). And there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.
In light of Google’s push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interest of big advertisers*, we decided to do the opposite & recently upgraded our keyword tool to add the following features…
Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. It does require free account registration, but it is a one-time registration and the tool is free. And we don’t collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.
Export is lightning quick AND, more importantly, we have a panda in our logo!
Here is what the web interface looks like:
And here is a screenshot of data in Excel with the various keyword match types:
If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.
“Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them.” — Ha-Joon Chang
What a fun past couple years it has been in the digital marketing landscape; we’ve seen hummingbirds, ads displacing organic listings, phantoms, ads displacing organic listings, rank brain, and of course ads displacing organic listings. It has been such a long time since my last post that, back when I was last writing for SEObook, we still believed in the timelines provided by Google employees on when Penguin was going to run next. Remember that? Oh, the memories.
The reason I’m back is to share a tip. Normally I don’t share SEO tips because by sharing information on a tactic, I end up burning the tactic and killing whatever potential usable market value remained on its shelf life. Why share then? Because this isn’t something you can kill; it involves people. And killing people is bad. To explain how it works though, I need to explain the two concepts I’m smashing together like chocolate and peanut butter.
The chocolate, aka Influencer Marketing – my definition of influencer marketing is having someone tell your story for you. Some people view influencer marketing as paying someone like Kim Kardashian $50,000 to post a picture of herself on Instagram holding a sample of your new line of kosher pickles. While that does fit under my definition as well, I consider that aspirational influencer marketing since her audience is primarily comprised of people aspiring to be Kim. Also equally valid is having Sally your foodie neighbor posting that picture in exchange for getting a free jar of those delicious pickles; in this particular case though the influence would be considered peer level influence since Sally’s audience is going to be comprised largely of people that view Sally as their equal, and possibly recognize that Sally as a foodie knows her food. Personally, I am biased, but I prefer lots of peer influence campaigns over a single big budget aspirational influence campaign, but I digress. If you want to learn a lot more about differences in the campaign types, I spoke with Bronco on the ins and outs of influence.
The peanut butter, aka Online Reputation Management, aka ORM – while I would hope reputation management doesn’t need to be specifically defined, I’ll define it anyhow as changing the online landscape for the benefit of a client’s (or your own) reputation. Peanut butter is a really good analogy for ORM because a lot of work gets spread around in a lot of directions, from creating hundreds of thousands of properties designed to flood the SERPs and social channels as a tail that wags the dog, to straight up negative SEO. Yeah, I said it. If negative SEO hadn’t been made so much more accessible by Panda, Penguin, and the philosophical shift toward assuming the negative a priori, ORM would not be the industry that it is today.
So what’s the tip? You can combine these two concepts for your clients, and you can do it in a variety of different ways. Let’s walk through a few…
My business partner at my influencer marketing network Intellifluence, Terry Godier, and I also refer to some of the above topics under the umbrella of dark influence. I’m sure this list isn’t even close to exhaustive, mainly because I don’t want to go too deep on how scary one can get. If you need to address such things, I still take on select ORM clients at Digital Heretix and can help you out or refer you to a quality professional that will. Combining concepts and tactics is often a lot more fun than trying to approach a tactic singularly; when possible, work in multiple dimensions.
Think of a way that I missed or some cool concepts that could be paired to be more powerful? Let me know on Twitter.
So far this year publishers have lost 52% of their Facebook distribution due to:
Instant Articles may have worked for an instant, but many publishers are likely where they were before they made the Faustian bargain, except they now have less control over their content distribution and advertising while having the higher cost structure of supporting another content format.
When Facebook announced their news feed update to fight off clickbait headlines, it sure sounded a lot like the equivalent of Google’s Panda update. Glenn Gabe is one of the sharpest guys in the SEO field, regularly publishing insightful content without blindly shilling for the various platform monopolies dominating the online publishing industry, & he had the same view I did.
Further cementing the “this is Panda” view was an AdAge article quoting some Facebook-reliant publishers. Glad we have already shifted our ways. Nice to see them moving in the same direction we are. etc. … It felt like reading a Richard Rosenblatt quote in 2011 about Demand Media’s strong working relationship with Google or how right after Panda their aggregate traffic level was flat.
Peter Kafka: Do you think that Google post was directed at you in any way?
Richard Rosenblatt: It’s not directed at us in any way.
P K: They wrote this post, which talks about content farms, and even though you say they weren’t talking about you, it left a lot of people scratching their heads.
R R: Let’s just say that we know what they’re trying to do. … He’s talking about duplicate, non-original content. Every single piece of ours is original. … our relationship is synergistic, and it’s a great partnership.
Kara Swisher: What were you trying to communicate in the call, especially since investors seemed very focused on Panda?
R R: What I also wanted to show was that third-party data sources should not be relied on. We did get affected, for sure. But I was not just being optimistic, we wanted to use that to really understand what we can do better.
K S: Given Google’s shift in its algorithm, are you shifting your distribution, such as toward social and mobile?
R R: If you look at where trends are going, that’s where we are going to be.
K S: How are you changing the continued perception that Demand is a content farm?
R R: I don’t think anyone has defined what a content farm is and I am not sure what it means either. We obviously don’t think we are a content farm and I am not sure we can counter every impact if some people think we are.
A couple years later Richard Rosenblatt left the company.
Since the Google Panda update eHow has removed millions of articles from their site. As a company they remain unprofitable a half-decade later & keep seeing YoY media ad revenue declines in the 30% to 40% range.
Over-reliance on any platform allows that platform to kill you. And, in most cases, you are unlikely to be able to restore your former status until & unless you build influence via other traffic channels:
I think in general, media companies have lost sight of building relationships with their end users that will bring them in directly, as opposed to just posting links on social networks and hoping people will click. I think publishers that do that are shooting themselves in the foot. Media companies in general are way too focused on being where our readers are, as opposed to being so necessary to our readers that they will seek us out. – Jessica Lessin, founder of TheInformation
Recovering former status requires extra investment far above and beyond what led to the penalty. And if the core business model still has the same core problems there is no solution.
“I feel pretty confident about the algorithm on Suite 101.” – Matt Cutts
Some big news publishers are trying to leverage video equivalents of a Narrative Science or Automated Insights (from Wochit and Wibbitz) to embed thousands of autogenerated autoplay videos in their articles daily.
But is that a real long-term solution to turn the corner? Even if they see a short term pop in ad revenues by using some dumbed-down AI-enhanced low cost content, all that really does is teach people that they are a source of noise while increasing the number of web users who install ad blockers.
And the whole time penalized publishers try to recover the old position of glory, the platform monopolies are boosting their AI skills in the background while they eat the playing field.
The companies which run the primary ad networks can easily get around the ad blockers, but third party publishers can’t. As the monopoly platforms broadly defund ad-based publishing, they can put users “in control” while speaking about taking the principle-based approach:
“This isn’t motivated by inventory; it’s not an opportunity for Facebook from that perspective,” Mr. Bosworth said. “We’re doing it more for the principle of the thing. We want to help lead the discussion on this.” … Mr. Bosworth said Facebook hasn’t paid any ad-blocking software company to have its ads pass through their filters and that it doesn’t intend to.
Google recently worked out a deal with Wikimedia to actually cite the source of the content shown in the search results:
it hasn’t always been the easiest to see that the material came from Wikipedia while on mobile devices. At the Wikimedia Foundation, we’ve been working to change that.
While the various platforms ride the edge on what is considered reasonable disclosure, regulatory bodies crack down on individuals participating on those platforms unless they are far more transparent than the platforms are:
Users need to be clear when they’re getting paid to promote something, and hashtags like #ad, #sp, #sponsored –common forms of identification– are not always enough.
The whole “eating the playing field” trend is vastly under-reported, largely because almost everyone engaged in the ecosystem needs to sell the idea that they have some growth strategy.
The reality is as the platform gets eaten it only gets harder to build a sustainable business. The mobile search interface is literally nothing but ads in most key categories. More ads. Larger ads. Nothing but ads.
And a bit of scrape after the ads to ensure the second or third screen still shows zero organic results.
And more scraping, across more categories.
What’s more, even large scaled companies in big money fields are struggling to monetize mobile users. On the most recent quarterly conference call TripAdvisor executives stated they monetize mobile users at about 30% the rate they monetize desktop or tablet users.
What happens when the big brand advertisers stop believing in the narrative of the value of precise user tracking?
P&G two years ago tried targeting ads for its Febreze air freshener at pet owners and households with large families. The brand found that sales stagnated during the effort, but rose when the campaign on Facebook and elsewhere was expanded last March to include anyone over 18.
P&G’s push to find broader reach with its advertising is also evident in the company’s recent increases in television spending. Toward the end of last year P&G began moving more money back into television, according to people familiar with the matter.
For mobile to work well you need to be a destination & a habit. But there is tiny screen space and navigational searches are also re-routed through Google hosted content (which will, of course, get monetized).
In fact, what would happen to an advertiser if they partnered with other advertisers to prevent brand bidding? Why that advertiser would get sued by the FTC for limiting user choice:
The bidding agreements harm consumers, according to the complaint, by restraining competition for, and distorting the prices of, advertising in relevant online auctions, by reducing the number of relevant, useful, truthful and non-misleading advertisements, by restraining competition among online sellers of contact lenses, and in some cases, by resulting in consumers paying higher retail prices for contact lenses.
If the above restraint of competition & market distortion is worth suing over, how exactly can Google make the mobile interface AMP exclusive without earning a similar lawsuit?
AMP content presented in both sections will be “de-duplicated” in order to avoid redundancies, Google says. The move is significant in that AMP results will now take up an entire phone screen, based on the example Google shows in its pitch deck.
Are many publishers in a rush to support Google AMP after the bait-n-switch on Facebook Instant Articles?
When markets are new they are unproven, thus they often have limited investment targeting them.
That in turn means it can be easy to win in new markets just by virtue of existing.
It wouldn’t be hard to rank well creating a blog today about the evolution of the 3D printing industry, or a how to site focused on Arduino or Raspberry Pi devices.
Couple a bit of passion with significant effort & limited competition and winning is quite easy.
Likewise in a small niche geographic market one can easily win with a generic, because the location acts as a market filter which limits competition.
But as markets age and become more proven, capital rushes in, which pushes out most of the generic unbranded players.
Back in 2011 I wrote about how Google had effectively killed the concept of category killer domains through the combination of ad displacement, vertical search & the algorithmic ranking shift moving away from relevancy toward awareness. 2 months before I wrote that post Walgreen Co. acquired Drugstore.com for about $429 million. At the time Drugstore.com was one of the top 10 biggest ecommerce pure plays.
On Thursday Walgreens Boots announced it would shut down Drugstore.com & Beauty.com:
The company is still trying to fine tune its e-commerce strategy but clearly wants to focus more of its resources on one main site. “They want to make sure they can invest more of the equity in Walgreens.com,” said Brian Owens, a director at the consultancy Kantar Retail. “Drugstore.com and Beauty.com are distractions.”
Big brands can sometimes get coverage of “meh” content by virtue of being associated with a big brand, but when they buy out pure-play secondary e-commerce sites those often fail to gain traction and get shuttered:
Other retailers have picked up pure-play e-commerce sites, only to shut them down shortly thereafter. Target Corp. last year shuttered ChefsCatalog.com and Cooking.com, less than three years after buying them.
The lack of publishing savvy among most large retailers means there will be a water cycle of opportunity which keeps re-appearing; however, as the web gets more saturated many of these opportunities are going to become increasingly niche options riding new market trends.
If you invest in zero-sum markets there needs to be some point of differentiation to drive switching. There might be opportunity for a cooking.com or a drugstore.com targeting emerging and frontier markets where brands are under-represented online (much like launching Drugstore.com in the US back in 1999), but it is unlikely pure-play ecommerce sites will be able to win in established markets if they use generically descriptive domains which make building brand awareness and perceived differentiation next to impossible.
Target not only shut down cooking.com, but they didn’t even bother redirecting the domain name to an associated part of their website.
It is now listed for sale.
Many short & generic domain names are guaranteed to remain in a purgatory status.
If you are new to SEO it is hard to appreciate how easy SEO was say 6 to 8 years ago.
Almost everything worked quickly, cheaply, and predictably.
Go back a few years earlier and you could rank a site without even looking at it. 😀
Links, links, links.
Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.
These days most of the best minds in SEO don’t blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.
Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.
Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.
These days breaking in can be much harder. I see some 3 or 4 month old sites with over 1,000 high quality links, which have clearly invested deep into 6 figures, that appear to be getting about 80 organic search visitors a month.
From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.
Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.
Most of the types of people who have the confidence and knowledge to invest deep into 6 figures on a brand new project aren’t creating “how to” SEO information and giving it away free. Doing so would only harm their earnings and lower their competitive advantage.
Most of the info created about SEO today is derivative (written by people who write about SEO but don’t practice it) or comes from people overstating the risks and claiming x and y and z don’t work, can’t work, and will never work.
And then from there you get the derivative amplifications of don’t, can’t, won’t.
And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.
If you are using lagging knowledge from derivative “experts” to drive strategy you are most likely going to lose money.
With all the misinformation, how do you find out what works?
You can pay for good advice. But most people don’t want to do that, they’d rather lose. 😉
The other option is to do your own testing. Then when you find a spot where conventional wisdom is wrong, invest aggressively.
“To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right.” – Jeff Bezos
That doesn’t mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn’t consider doing. That is how you stand out & differentiate.
But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you’re hosed.
And, even if you do nothing wrong, if you don’t build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.
Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.
I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”
“How did you go bankrupt?”
“Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises
A lot of SEMrush charts look like the following:
What happened there?
Well, obviously that site stopped ranking.
You can’t be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.
That said, there are constant shifts in the algorithms across regions and across time.
Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested…
He also explained the hole Google has in their Arabic index, with spam being much more effective there due to there being little useful content to index and rank & Google modeling their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is also less of a priority because they view evolving with mobile friendly, AMP, etc. as being a higher priority. They algorithmically ignore many localized issues & try to clean up some aspects of that manually. But whoever is winning with the spam stuff at the moment might not only lose to an algorithm update or manual clean up; once Google has something great to rank there, it will eventually win, displacing some of the older spam on a near permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.
Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways, it doesn’t mean that a site which once ranked will keep ranking.
In fact, sites which don’t get a constant stream of effort & investment are more likely to slide than have their rankings sustained.
The above SEMrush chart is for a site which uses the following as their header graphic:
When there is literally no competition and the algorithms are weak, something like that can rank.
But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.
Further, a site like that would struggle to get any quality inbound links or shares.
If nobody reads it then nobody will share it.
The content on the page could be Pulitzer prize level writing and few would take it seriously.
With that design, death is certain in many markets.
The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.
Excessive keyword repetition like the footer with the phrase repeated 100 times.
Excessive focus on monetization to where most visitors quickly bounce back to the search results to click on a different listing.
Ignoring the growing impact of mobile.
Blowing out the content footprint with pagination and tons of lower quality backfill content.
Stale content full of outdated information and broken links.
A lack of investment in new content creation AND promotion.
Aggressive link anchor text combined with low quality links.
The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.
Why is Facebook doing so well? In part because Google did the search equivalent to what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn’t want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.
Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.
Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.
People can’t explicitly look for you in a differentiated way unless they are already aware you exist.
Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However if you are selling a product the customer already bought or you are marketing to marketers there is a good chance such investments will be money wasted while you alienate past customers.
Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads so aggressively that you can’t scroll far enough down the page for their ad to disappear before seeing it once again.
If you don’t have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.
And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you’ll likely stay penalized for a long, long time.
While waiting for an update, you may find you are Waiting for Godot.
Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”
What sort of strategy is helping to drive that industry transformation?
How about doorway pages.
These sorts of doorway pages are still live to this day. Simply look at the footer area of http://ift.tt/1rXbGWd
This is in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.
March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm
Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.
Today we get journalists acting as conduits for Google’s public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.
Today those sorts of stories are literally everywhere.
Tomorrow the story will be over.
And when it is.
Precisely zero journalists will have covered the above contrasting behaviors.
As they weren’t in the press release.
Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service & by the time the P2P loan bubble pops some of the payday lenders will have followed LendUp’s lead in re-branding their offers as being something else in name.
Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.
Don’t expect to see a link to this blog post on TechCrunch.
There you’ll read some hard-hitting cutting edge tech news like:
Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.
As far as being an investable business goes, news is horrible.
And it is getting worse by the day.
Look at these top performers.
The above chart looks ugly, but in reality it puts an optimistic spin on things…
Almost all the solutions to the problems faced by the mainstream media are incomplete and ultimately will fail.
That doesn’t stop the market from selling magic push button solutions. The worse the fundamentals get, the more incentive (need) there is to sell the dream.
Video will save us.
No it won’t.
Video is expensive to do well and almost nobody at any sort of scale on YouTube has an enviable profit margin. Even the successful individuals who are held up as the examples of success are being squeezed out and Google is trying to push to make the site more like TV. As they get buy in from big players they’ll further squeeze out the indy players – just like general web search.
The New York Times is cutting back on their operations in Paris.
What impact does it have on Marketwatch’s brand if you go there for stocks information and they advise you on weight loss tips?
And, once again, when everyone starts doing that it is no longer a competitive advantage.
There have also been cases where newspapers like The New York Times acquired About.com only to later sell it for a loss. And now even About.com is unbundling itself.
The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.
When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.
Further, as it gets more pervasive it will lead to questions of editorial integrity.
It won’t scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they’ll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate’s cookie lasts for a shorter duration.
It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.
“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”
Further, as it gets more pervasive it will lead to questions of editorial integrity.
It won’t work, as it undermines the social proof of value the site would otherwise have from having many comments on it.
Absurd. And a sign of extreme desperation.
Here is Doug Edwards on Larry Page:
He wondered how Google could become like a better version of the RIAA – not just a mediator of digital music licensing – but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.
If we just give Google or Facebook greater control, they will save us.
No they won’t.
You are probably better off selling meal kits.
As time passes, Google and Facebook keep getting a larger share of the pie, growing their rake faster than the pie is growing. If, say, the overall pie grows 5% in a year while the platforms’ cut grows 20%, everyone else is left fighting over a relatively smaller share of the market.
Here is the RIAA’s Cary Sherman on Google & Facebook:
Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.
Over time, media sites are becoming more reliant on platforms for distribution, with visitors showing only fleeting interest: “bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today.”
Accelerated Mobile Pages and Instant Articles?
These are not solutions. They are only a further acceleration of the problem.
How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?
If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself with an entity that can arbitrarily shift the terms on you whenever they feel like it.
“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”
Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google’s redesigned image search shunted traffic away from photographers. Google’s remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished, & if you don’t, good luck negotiating with a monopoly. You’ll probably need the EU to see any remedy there.
When something is an embarrassment to Google & can harm their PR, fixing it becomes a priority; otherwise most of the costs of rights management fall on the creative industry & Google will go out of their way to add cost to that process. Facebook is, of course, playing the same game with video freebooting.
As the platforms aim to expand into new verticals they create new opportunities, but those opportunities are temporary.
Whatever happened to Zynga?
Even Buzzfeed, the current example of success on Facebook, badly missed its revenue target, even as it becomes more dependent on the Facebook feed.
“One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate.” – Ben Thompson
Long after benefit stops passing to the creative person, the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places its own AI-driven news rewrites in front of users?
Who then will fund journalism?
Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media’s bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero-sum to a negative-sum market, particularly as Google keeps growing their knowledge scraper graph.
Now maybe if you dumb it down with celebrity garbage you get quick clicks from other channels and long-term SEO traffic doesn’t matter as much.
But if everyone is pumping the same crap into the feed it is hard to stand out. When everyone starts doing it, the strategy is no longer a competitive advantage. Further, a business that is algorithmically optimized for short-term clicks is also optimizing for its own long-term irrelevancy.
Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.
Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed that these big names should have been able to support themselves, garner their own large audiences, and shouldn’t have relied on placement on the home page to achieve large audiences. As a result, they were expected to sink or swim on their own.
“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”
That is why Yahoo! ultimately had to shut down almost all their verticals. They were optimized algorithmically for short-term wins rather than building things with long-term resonance.
Death by bean counter.
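For illustration only, here is a minimal Python sketch of the kind of click-optimized home page ranking described above. It is not Yahoo’s actual system, and the story names and numbers are made up; it just shows the general pattern. Rank purely on recent click-through rate and the celebrity filler wins every slot, while the substantive reporting never surfaces.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    impressions: int   # how often the story was shown recently
    clicks: int        # how many of those impressions were clicked

def ctr(story: Story) -> float:
    """Recent click-through rate: the only signal this toy ranker considers."""
    return story.clicks / story.impressions if story.impressions else 0.0

def rank_home_page(stories: list[Story], slots: int = 2) -> list[Story]:
    """Fill the home page purely by short-term CTR -- no editorial judgment,
    no notion of long-term audience value or trust."""
    return sorted(stories, key=ctr, reverse=True)[:slots]

stories = [
    Story("Celebrity diet secrets", impressions=10_000, clicks=900),
    Story("Quiz: which snack are you?", impressions=10_000, clicks=700),
    Story("Investigation: how the 2008 liquidity collapse unfolded", impressions=10_000, clicks=250),
]
for s in rank_home_page(stories):
    print(f"{ctr(s):.1%}  {s.title}")
```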
This short-term optimization also has an incredibly damaging knock-on effect on society.
People miss the key news: “what articles got the most views, and thus ‘clicks.’ Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite.” – Karl Denninger
The other issue is that PR is outright displacing journalism. As bad as that is at creating general disinformation, it gets worse when people presume that diversity of coverage means a diversity of thought, a diversity of work, and a diversity of sources. Even people inside the current presidential administration state how horrible this trend is for society:
“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” … “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”
That is basically the government complaining to the press about it being “too easy” to manipulate the press.
Much of what “seems” like an algorithm on the tech platforms is actually a bunch of low-paid humans pretending to be an algorithm.
This goes back to the problem of the limited diversity in original sources and rise of thin “take” pieces. Stories with an inconvenient truth can get suppressed, but “newsworthy” stories with multiple sources covering them may all use the same biased source.
After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. … A topic was often blacklisted if it didn’t have at least three traditional news sources covering it.
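As a rough illustration of that reported rule, here is a hypothetical Python sketch. Only the “at least three traditional news sources” threshold comes from the quote above; the outlet list, data structures, and function names are my assumptions. The problem it illustrates is that counting outlets measures repetition, not original reporting, so three rewrites of the same wire story pass while a lone original scoop gets suppressed.

```python
# Hypothetical sketch of a "require N traditional news sources" filter.
# The outlet list is illustrative, not any platform's actual whitelist.
TRADITIONAL_OUTLETS = {
    "nytimes.com", "washingtonpost.com", "bbc.com", "reuters.com", "apnews.com",
}

def is_blacklisted(covering_outlets: list[str], min_sources: int = 3) -> bool:
    """Suppress a topic unless at least `min_sources` traditional outlets cover
    it -- regardless of whether they did original reporting or all rewrote the
    same source."""
    traditional = {o for o in covering_outlets if o in TRADITIONAL_OUTLETS}
    return len(traditional) < min_sources

coverage = {
    "inconvenient scoop": ["smallblog.example"],                     # one original source
    "viral take": ["nytimes.com", "washingtonpost.com", "bbc.com"],  # three rewrites of one wire story
}
for topic, outlets in coverage.items():
    status = "blacklisted" if is_blacklisted(outlets) else "eligible to trend"
    print(f"{topic}: {status}")
```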
As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.
The strategy of sacrificing the long term to hit the short-term numbers can seem popular. And then, suddenly, death.
You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
– Neil Young, Stringman
It is getting cheap enough that just about anyone can run a paid membership site, but it is quite hard to create something worth paying for on a recurring basis.
There are a few big issues with paywalls:
“It’s only after we’ve lost everything that we’re free to do anything.” ― Chuck Palahniuk, Fight Club