Google & Facebook Squeezing Out Partners


Just Make Great Content…

Remember the whole shtick about good, legitimate, high-quality content being created for readers without concern for search engines – as though search engines did not exist?

Whatever happened to that?

We quickly shifted from the above “ideology” to this:

The red triangle/exclamation point icon was arrived at after the Chrome team commissioned research around the world to figure out which symbols alarmed users the most.

Search Engine Engineering Fear

Google is explicitly spreading the message that they are doing testing on how to create maximum fear to try to manipulate & coerce the ecosystem to suit their needs & wants.

At the same time, the Google AMP project is being used as the foundation of effective phishing campaigns.

Scare users off of using HTTP sites AND host phishing campaigns.

Killer job, Google.

Someone deserves a raise & some stock options. Unfortunately that person is in the PR team, not the product team.

Ignore The Eye Candy, It’s Poisoned

I’d like to tell you that I was preparing the launch of http://ift.tt/2evTSLc but awareness of past ecosystem shifts makes me unwilling to make that move.

I see it as arbitrary hoop jumping not worth the pain.

If you are an undifferentiated publisher without much in the way of original thought, then jumping through the hoops makes sense. But if you deeply care about a topic and put a lot of effort into knowing it well, there’s no reason to do the arbitrary hoop jumping.

Remember how mobilegeddon was going to be the biggest thing ever? Well I never updated our site layout here & we still outrank a company which raised & spent tens of millions of dollars for core industry terms like [seo tools].

Though it is also worth noting that after factoring in increased ad load with small screen sizes & the scrape graph featured answer stuff, a #1 ranking no longer gets it done, as we are well below the fold on mobile.

   

Below the Fold = Out of Mind

In the above example I am not complaining about ranking #5 and wishing I ranked #2, but rather stating that ranking #1 organically has little to no actual value when it is a couple screens down the page.

Google indicated their interstitial penalty might apply to pop ups that appear on scroll, yet Google helps itself to installing a toxic, enhanced version of the Diggbar at the top of AMP pages, which persistently eats 15% of the screen & can’t be dismissed. An attempt to dismiss the bar leads the person back to Google to click on a different listing instead of your site.

As bad as I may have made mobile search results appear earlier, I was perhaps being a little too kind. Google doesn’t even have mass adoption of AMP yet & they already have 4 AdWords ads in their mobile search results, AND when you scroll down the page they are testing an ugly “back to top” button which outright blocks a user’s view of the organic search results.

What happens when Google suggests what people should read next as an overlay on your content & sells that as an ad unit where if you’re lucky you get a tiny taste of the revenues?

Is it worth doing anything that makes your desktop website worse in an attempt to try to rank a little higher on mobile devices?

Given the small screen size of phones & the heavy ad load, the answer is no.

I realize that optimizing a site design for mobile or desktop is not mutually exclusive. But it is an issue we will revisit later on in this post.

Coercion Which Failed

Many people new to SEO likely don’t remember the importance of using Google Checkout integration to lower AdWords ad pricing.

You either supported Google Checkout & got about a 10% CTR lift (& thus 10% reduction in click cost) or you failed to adopt it and got priced out of the market on the margin difference.
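
As a rough back-of-the-envelope (and only that), here is why a CTR lift translates into a similarly sized cut in click cost under quality-based auction pricing. The bid × CTR model and every number below are simplified assumptions for illustration, not AdWords’ actual Quality Score math:

```typescript
// Back-of-the-envelope only: AdWords ordering roughly follows bid x CTR
// (a big simplification of Quality Score), so holding the same ad rank
// with a higher CTR lets you bid proportionally less. All numbers are
// hypothetical.

const baselineCtr = 0.05; // 5% CTR without the Checkout badge
const baselineBid = 1.0;  // $1.00 bid needed to hold the position

const adRankToMatch = baselineBid * baselineCtr;

const liftedCtr = baselineCtr * 1.1;         // ~10% CTR lift from the badge
const bidNeeded = adRankToMatch / liftedCtr; // bid that holds the same ad rank

console.log(bidNeeded.toFixed(3)); // "0.909": roughly a 9-10% cheaper click
```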

And if you chose to adopt it, the bad news was you then had to spend yet again to undo the integration once Google decided the service was no longer worth running.

How about when Google first started hyping HTTPS & publishers using AdSense saw their ad revenue crash because the ads were no longer anywhere near as relevant?

Oops.

Not like Google cared much, as it is their goal to shift as much of the ad spend as they can onto Google.com & YouTube.

It is not an accident that Google funds an ad blocker which allows ads to stream through on Google.com while leaving ads blocked across the rest of the web.

Android Pay might be worth integrating. But then it also might go away.

It could be like Google’s authorship. Hugely important & yet utterly trivial.
Faces help people trust the content.
Then they are distracting visual clutter that needs to be expunged.
Then they once again re-appear, but ONLY on the Google Home Services ad units.
They were once again good for users!!!

Neat how that works.

Embrace, Extend, Extinguish

Or it could be like Google Reader. A free service which defunded all competing products & was then shut down because it didn’t have a legitimate business model, having been built explicitly to prevent competition. With the death of Google Reader many blogs also slid into irrelevancy.

Their FeedBurner acquisition was icing on the cake.

Techdirt is known for generally being pro-Google & they recently summed up FeedBurner nicely:

Thanks, Google, For Fucking Over A Bunch Of Media Websites – Mike Masnick

Ultimately Google is a horrible business partner.

And they are an even worse one if there is no formal contract.

Dumb Pipes, Dumb Partnerships

They tried their best to force broadband providers to be dumb pipes. At the same time they promoted regulation which will prevent broadband providers from tracking their own users the way that Google does, all the while broadening out Google’s privacy policy to allow personally identifiable web tracking across their network. Once Google knew they would retain an indefinite tracking advantage over broadband providers they were free to rescind their (heavily marketed) free tier of Google Fiber & they halted the Google Fiber build out.

When Google routinely acts so anti-competitive & abusive it is no surprise that some of the “standards” they propose go nowhere.

You can only get screwed so many times before you adopt a spirit of ambivalence to the avarice.

Google is the type of “partner” that conducts security opposition research on their leading distribution partner, while conveniently ignoring nearly a billion OTHER Android phones with existing security issues that Google can’t be bothered with patching.

Deliberately screwing direct business partners is far worse than coding algorithms which belligerently penalize some competing services all the while ignoring that the payday loan shop funded by Google leverages doorway pages.

“User” Friendly

BackChannel recently published an article foaming at the mouth promoting the excitement of Google’s AI:

“This 2016-to-2017 Transition is going to move us from systems that are explicitly taught to ones that implicitly learn.” … the engineers might make up a rule to test against—for instance, that “usual” might mean a place within a 10-minute drive that you visited three times in the last six months. “It almost doesn’t matter what it is — just make up some rule,” says Huffman. “The machine learning starts after that.”
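
As a purely illustrative sketch of the kind of hand-written bootstrap rule Huffman describes (the data shape, names & thresholds below are hypothetical, not anything Google has published):

```typescript
// Purely illustrative: a hand-written rule used to label training data
// before any model exists. The Visit shape, the 10-minute drive & the
// "3 visits in 6 months" threshold are just the made-up rule from the
// quote, not a real Google API.

interface Visit {
  placeId: string;
  timestamp: Date;      // when the visit happened
  driveMinutes: number; // drive time from home at that moment
}

const SIX_MONTHS_MS = 1000 * 60 * 60 * 24 * 182;

function isUsualPlace(placeId: string, visits: Visit[], now: Date): boolean {
  const qualifying = visits.filter(
    (v) =>
      v.placeId === placeId &&
      now.getTime() - v.timestamp.getTime() <= SIX_MONTHS_MS &&
      v.driveMinutes <= 10
  );
  // "Just make up some rule": the machine learning starts after that.
  return qualifying.length >= 3;
}
```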

The part of the article I found most interesting was the following bit:

After three years, Google had a sufficient supply of phonemes that it could begin doing things like voice dictation. So it discontinued the [phone information] service.

Google launches “free” services with an ulterior data motive & then when it suits their needs, they’ll shut it off and leave users in the cold.

As Google keeps advancing their AI, what do you think happens to your AMP content they are hosting? How much do they squeeze down on your payout percentage on those pages? How long until the AI is used to recap / rewrite content? What ad revenue do you get when Google offers voice answers pulled from your content but sends you no visitor?

The Numbers Can’t Work

A recent Wall Street Journal article highlighting the fast ad revenue growth at Google & Facebook also mentioned how the broader online advertising ecosystem was doing:

Facebook and Google together garnered 68% of spending on U.S. online advertising in the second quarter—accounting for all the growth, Mr. Wieser said. When excluding those two companies, revenue generated by other players in the U.S. digital ad market shrank 5%

The issue is NOT that online advertising has stalled, but rather that Google & Facebook have choked off their partners from tasting any of the revenue growth. This problem will only get worse as mobile grows to a larger share of total online advertising:

By 2018, nearly three-quarters of Google’s net ad revenues worldwide will come from mobile internet ad placements. – eMarketer

Media companies keep trusting these platforms with greater influence over their business & these platforms keep screwing those same businesses repeatedly.

You pay to get likes, but that is no longer enough as edgerank declines. Thanks for adopting Instant Articles, but users would rather see live videos & read posts from their friends. You are welcome to pay once again to advertise to the following you already built. The bigger your audience, the more we will charge you! Oh, and your direct competitors can use people liking your business as an ad targeting group.

Worse yet, Facebook & Google are even partnering on core Internet infrastructure.

Any hope of AMP turning the corner on the revenue front is a “no go”:

“We want to drive the ecosystem forward, but obviously these things don’t happen overnight,” Mr. Gingras said. “The objective of AMP is to have it drive more revenue for publishers than non-AMP pages. We’re not there yet”.

Publishers who are critical of AMP were reluctant to speak publicly about their frustrations, or to remove their AMP content. One executive said he would not comment on the record for fear that Google might “turn some knob that hurts the company.”

Look at that.

Leadership through fear once again.

At least they are consistent.

As more publishers adopt AMP, each publisher in the program will get a smaller share of the overall pie.

Just look at Google’s quarterly results for their current partners. They keep showing Google growing their ad clicks at 20% to 40% while partners oscillate between -15% and +5% quarter after quarter, year after year.

In the past quarter Google grew their ad clicks 42% YoY by pushing a bunch of YouTube autoplay video ads, faster search growth in third world markets with cheaper ad prices, driving a bunch of lower-quality mobile search ad clicks (with 3, then 4, ads on mobile) & increasing the percent of ad clicks on “own brand” terms (while sending the FTC after anyone who agrees to not cross bid on competitors’ brands).

The lower quality video ads & mobile ads in turn drove their average CPC on their sites down 13% YoY.
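
A back-of-the-envelope check, assuming revenue is roughly clicks × CPC (a simplification that ignores mix effects between YouTube, mobile & desktop inventory), shows how those two figures still net out to strong revenue growth on Google’s own properties:

```typescript
// Back-of-the-envelope only: treating revenue as clicks x CPC and
// ignoring mix effects, the two YoY figures cited above imply roughly
// 24% revenue growth on Google's own sites.

const clickGrowth = 0.42; // +42% paid clicks YoY on Google sites
const cpcChange = -0.13;  // -13% average cost-per-click YoY on Google sites

const impliedRevenueGrowth = (1 + clickGrowth) * (1 + cpcChange) - 1;

console.log(`${(impliedRevenueGrowth * 100).toFixed(1)}%`); // "23.5%"
```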

The partner network is relatively squeezed out on mobile, which makes it shocking to see the partner CPC off more than core Google, with a 14% YoY decline.

What ends up happening is eventually the media outlets get sufficiently defunded to where they are sold for a song to a tech company or an executive at a tech company. Alibaba buying SCMP is akin to Jeff Bezos buying The Washington Post.

The Wall Street Journal recently laid off reporters. The New York Times announced they were cutting back local cultural & crime coverage.

If news organizations of that caliber can’t get the numbers to work then the system has failed.

The Guardian is literally incinerating over 5 million pounds per month. ABC is staging fake crime scenes (that’s one way to get an exclusive).

The Tribune Company, already through bankruptcy & perhaps the dumbest of the lot, plans to publish thousands of AI assisted auto-play videos in their articles every day. That will guarantee their user experience on their owned & operated sites is worse than just about anywhere else their content gets distributed to, which in turn means they are not only competing against themselves but they are making their own site absolutely redundant & a chore to use.

That the Denver Guardian (an utterly fake paper running fearmongering false stories) goes viral is just icing on the cake.

These tech companies are literally reshaping society & are sucking the life out of the economy, destroying adjacent markets & bulldozing regulatory concerns, all while offloading costs onto everyone else around them.

An FTC report recommended suing Google for their anti-competitive practices, but no suit was brought. The US Copyright Office Register was relieved of her job after she went against Google’s views on set top boxes.

And in spite of the growing importance of tech, media coverage of the industry is a trainwreck:

This is what it’s like to be a technology reporter in 2016. Freebies are everywhere, but real access is scant. Powerful companies like Facebook and Google are major distributors of journalistic work, meaning newsrooms increasingly rely on tech giants to reach readers, a relationship that’s awkward at best and potentially disastrous at worst.

Being a conduit breeds exclusives. Challenging the grand narrative gets one blackballed.

Mobile Search Index

Google announced they are releasing a mobile first search index:

Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.

There are some forms of content that simply don’t work well on a 350-pixel-wide screen, unless they use a pinch-to-zoom format. But using that format is seen as not being mobile friendly.

Imagine you have an auto part database which lists alternate part numbers, price, stock status, nearest store with part in stock, time to delivery, etc. … it is exceptionally hard to get that information to look good on a mobile device. And good luck if you want to add sorting features on such a table.

The argument that ranking mobile results off the desktop version of a page is flawed (because users might click through to something only available on the desktop version of a site) is a valid one. BUT, at the same time, a publisher may need to simplify the mobile site & hide data to improve usability on small screens, then only make certain data visible through user interactions. Not showing those automotive part databases to desktop users would ultimately make desktop search results worse for users by leaving huge gaps in the search results. And a search engine choosing not to index the desktop version of a site because a mobile version exists is equally short-sighted: desktop users would no longer be able to find & compare information from those automotive parts databases.
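
One common way to square that circle is progressive disclosure: keep the full table in the markup so it stays indexable & available to desktop users, but collapse the secondary columns on small screens until the user asks for them. The sketch below is a minimal, hypothetical illustration of that approach; the column names, breakpoint & markup hooks are invented for the example:

```typescript
// Minimal, hypothetical sketch of progressive disclosure for a large
// data table: the full parts table stays in the markup (so it remains
// crawlable & available on desktop), while phones initially collapse
// the secondary columns.

const SECONDARY_COLUMNS = ["alt-part-number", "nearest-store", "delivery-time"];
const phoneQuery = window.matchMedia("(max-width: 480px)");

function applyLayout(table: HTMLTableElement): void {
  for (const name of SECONDARY_COLUMNS) {
    table
      .querySelectorAll<HTMLElement>(`[data-column="${name}"]`)
      .forEach((cell) => {
        // Collapsed, not removed: a "More details" toggle can restore it.
        cell.hidden = phoneQuery.matches;
      });
  }
}

document.querySelectorAll<HTMLTableElement>("table.parts").forEach((table) => {
  applyLayout(table);
  // Re-apply if the viewport crosses the breakpoint (e.g. on rotation).
  phoneQuery.addEventListener("change", () => applyLayout(table));
});
```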

Once again money drives search “relevancy” signals.

Since Google will soon make 3/4 of their ad revenues on mobile, that should become the primary view of the web for everyone else, & desktop versions of sites which are not mobile friendly should be disappeared from the search index whenever a crappier, lite, mobile-friendly version of the page is available.

Amazon converts well on mobile in part because people already trust Amazon & already have an account registered with them. Most other merchants won’t be able to convert at anywhere near the same rate on mobile as they do on desktop, so if you have to choose between a mobile-friendly version that leaves differentiated aspects hidden or a desktop-friendly version that is differentiated & establishes a relationship with the consumer, the deeper & more engaging desktop version is the way to go.

The heavy ad load on mobile search results only further combines with the low conversion rates on mobile to make building a relationship on desktop that much more important.

Even TripAdvisor is struggling to monetize mobile traffic, monetizing it at only about 30% to 33% of the rate they monetize desktop & tablet traffic. Google already owns most of the profits from that market.

Webmasters are better off NOT going mobile friendly than going mobile friendly in a way that compromises the ability of their desktop site.

I am not the only one suggesting an over-simplified mobile design that carries over to a desktop site is a losing proposition. Consider Nielsen Norman Group’s take:

in the current world of responsive design, we’ve seen a trend towards insufficient information density and simplifying sites so that they work well on small screens but suboptimally on big screens.

Tracking Users

Publishers are getting squeezed to subsidize the primary web ad networks. But the narrative is that as cross-device tracking improves some of those benefits will eventually spill back out into the partner network.

I am rather skeptical of that theory.

Facebook already makes 84% of their ad revenue from mobile devices where they have great user data.

They are paying to bring new types of content onto their platform, but they are only just now beginning to get around to test pricing their Audience Network traffic based on quality.

Priorities are based on business goals and objectives.

Both Google & Facebook paid fines & faced public backlash for how they track users. Those tracking programs were considered high priority.

When these ad networks are strong & growing quickly they may be able to take a stand, but when growth slows, stock prices crumble, data security becomes less important amid downsizing, morale is shattered & talent flees. Further, creating alternative revenue streams becomes vital “to save the company” even if it means selling user data to dangerous dictators.

The other big risk of such tracking is how data can be used by other parties.

Spooks preferred to use the Google cookie to spy on users. And now Google allows personally identifiable web tracking.

Data is being used in all sorts of crazy ways the central ad networks are utterly unaware of. These crazy policies are not limited to other countries. Buying dog food with your credit card can lead to pet licensing fees. Even cheerful “wellness” programs may come with surprises.


from SEO Book http://ift.tt/2eOgnvK
via google

Penguin 4.0 Update

On Friday Google’s Gary Illyes announced Penguin 4.0 was now live.

Key points highlighted in their post are:

  • Penguin is a part of their core ranking algorithm
  • Penguin is now real-time, rather than something which periodically refreshes
  • Penguin has shifted from being a sitewide negative ranking factor to a more granular factor

Things not mentioned in the post

  • if it has been tested extensively over the past month
  • if the algorithm is just now rolling out or if it is already done rolling out
  • if the launch of a new version of Penguin rolled into the core ranking algorithm means old sites hit by the older versions of Penguin have recovered or will recover anytime soon

Since the update was announced, the search results have become more stable.

They still may be testing out fine tuning the filters a bit…

…but what exists now is likely to be what sticks for an extended period of time.

Penguin Algorithm Update History

  • Penguin 1: April 24, 2012
  • Penguin 2: May 26, 2012
  • Penguin 3: October 5, 2012
  • Penguin 4: May 22, 2013 (AKA Penguin 2.0)
  • Penguin 5: October 4, 2013 (AKA Penguin 2.1)
  • Penguin 6: rolling update which began on October 17, 2014 (AKA Penguin 3.0)
  • Penguin 7: September 23, 2016 (AKA Penguin 4.0)

Now that Penguin is baked into Google’s core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed “quality” updates.

Volatility Over the Long Holiday Weekend

Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations in spite of the search results being more volatile than they have been in over 4 years.

There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:

  • no media coverage: few journalists on the job & a lack of expectation that the PR team will answer any questions. no official word beyond rumors from self-promotional marketers = no story
  • many SEOs outside of work: few are watching as the algorithms tip their cards.
  • declining search volumes: long holiday weekends generally have less search volume associated with them. Thus anyone who is aggressively investing in SEO may wonder if their site was hit, even if it wasn’t.
    The communications conflicts this causes between in-house SEOs and their bosses, as well as between SEO companies and their clients, make the job of the SEO more miserable and make the client more likely to pull back on investment, while ensuring the SEO has family issues back home as work ruins their vacation.
  • fresh users: as people travel their search usage changes, thus they have fresh sets of eyes & are doing somewhat different types of searches. This in turn makes their search usage data more dynamic and useful as a feedback mechanism on any changes made to the underlying search relevancy algorithm or search result interface.

Algo Flux Testing Tools

Just about any of the algorithm volatility tools showed a far more significant shift earlier this month than over the past few days.

Take your pick (Mozcast, RankRanger, SERPmetrics, Algaroo, Ayima Pulse, AWR, Accuranker, SERP Watch) & the results came out something like this graph from Rank Ranger:

One issue with looking at any of the indexes is the rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the algorithm volatility scores are much higher than the actual shifts in search traffic (the least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news).

You can use AWR’s flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
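
None of these tools publish their exact formulas, but a toy example shows why raw position churn overstates the traffic impact: weight each ranking change by a rough click-through-rate curve and the big swings outside the top few positions barely register. Everything below (the CTR curve, the sample keyword positions) is hypothetical:

```typescript
// Hypothetical illustration of why flux scores overstate traffic impact:
// weight each ranking change by an approximate CTR for the position, and
// big swings at #15 barely move the needle compared with small swings at
// #2. The CTR values are rough illustrative figures, not any tool's
// actual curve.

const approxCtr = (pos: number): number => 0.3 / pos; // crude 1/rank decay

function rawFlux(before: number[], after: number[]): number {
  return before.reduce((sum, b, i) => sum + Math.abs(b - after[i]), 0);
}

function trafficWeightedFlux(before: number[], after: number[]): number {
  return before.reduce(
    (sum, b, i) => sum + Math.abs(approxCtr(b) - approxCtr(after[i])),
    0
  );
}

// Ten tracked keywords: most of the movement happens well below the fold.
const before = [1, 2, 3, 8, 9, 12, 15, 18, 25, 40];
const after  = [1, 3, 2, 12, 14, 9, 22, 30, 18, 55];

console.log(rawFlux(before, after));             // 55: large absolute churn
console.log(trafficWeightedFlux(before, after)); // ~0.15: tiny change in expected clicks
```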

Example Ranking Shifts

I shut down our membership site in April & now spend most of my time reading books & news to figure out what’s next after search, but a couple of legacy clients I am winding down work with still have me tracking a few keywords, & one of those terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.

Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.

As a comparison, here is that chart over the past 3 months.

Notice the big ranking moves which became common over the past month were not common in the 2 months prior.

Negative SEO Was Real

There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.

As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.

Update != Penalty Recovery

Part of the reason many people think there was no Penguin update, or responded to the update with “that’s it?”, is that few sites which were hit in the past recovered, relative to the number of sites which had ranked well until recently & just got clipped by this algorithm update.

When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.

Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.

Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new “better than ever” spam fighting algorithm.

They’ll let some time pass before the penalized sites can recover.

Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they’ve accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.

What to do?

So here are some of the obvious algorithmic holes left by the new Penguin approach…

  • only kidding
  • not sure that would even be a valid mindset in the current market
  • hell, the whole ecosystem is built on quicksand

The trite advice is to make quality content, focus on the user, and build a strong brand.

But you can do all of those well enough that you change the political landscape yet still lose money.

Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.

Even some of the top brands in big money verticals which were known as the canonical examples of SEO success stories are seeing revenue hits and getting squeezed out of the search ecosystem.

And that is without getting hit by a penalty.

It is getting harder to win in search, period.

And it is getting almost impossible to win in search by focusing on search as an isolated channel.

Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.

Anyone operating at scale chasing SEO with automation is likely to step into a trap.

When it happens, that player better have some serious savings or some non-Google revenues, because even with “instant” algorithm updates you can go months or years on reduced revenues waiting for an update.

And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.

“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts


from SEO Book http://ift.tt/2daK50j
via google

Free Google AdWords Keyword Suggestion Tool Alternative

Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.

They have not only grouped similar terms, but also broadened the data into absurdly wide ranges like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even with that, there are limitations. Try to view too many terms and you get:

“You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours.”

Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:

“I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks.”

So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.

Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas SEO investments can take months or years to back out.

But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.

There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.). And there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.

In light of Google’s push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interests of big advertisers*, we decided to do the opposite & recently upgraded our keyword tool to add the following features…

  • expanded the results per search to 500
  • added negative match and modified broad match to the keyword export spreadsheet (alongside the existing phrase, broad & exact match)

Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. Using it does require a one-time free account registration, but the tool itself is free. And we don’t collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.

Export is lightning quick AND, more importantly, we have a panda in our logo!

Here is what the web interface looks like

And here is a screenshot of data in Excel with the various keyword match types
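
For anyone wiring keyword exports into their own spreadsheets, the match-type formatting itself is simple: broad is the bare term, phrase is quoted, exact is bracketed, modified broad prefixes each word with a plus sign, and negatives get a leading minus. The sketch below is a hypothetical illustration of generating those variants, not the tool’s actual export code:

```typescript
// Hypothetical sketch of formatting keywords into AdWords match-type
// columns for a spreadsheet export. The column layout is illustrative,
// not the tool's actual export format; negative match is shown in its
// simple (broad) form.

interface MatchTypeRow {
  broad: string;
  phrase: string;
  exact: string;
  modifiedBroad: string;
  negative: string;
}

function toMatchTypes(keyword: string): MatchTypeRow {
  const k = keyword.trim().toLowerCase();
  return {
    broad: k,          // seo tools
    phrase: `"${k}"`,  // "seo tools"
    exact: `[${k}]`,   // [seo tools]
    modifiedBroad: k.split(/\s+/).map((w) => `+${w}`).join(" "), // +seo +tools
    negative: `-${k}`, // -seo tools
  };
}

console.log(toMatchTypes("seo tools"));
```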

If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.

“Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them.” – Ha-Joon Chang


from SEO Book http://ift.tt/2cu8rkh
via google

How I Learned to Start Loving Social Media’s Darkside

I’m baaaaaaack.

Organic Listings

What a fun past couple of years it has been in the digital marketing landscape; we’ve seen hummingbirds, ads displacing organic listings, phantoms, ads displacing organic listings, RankBrain, and of course ads displacing organic listings. It has been such a long time since my last post that back when I was last writing for SEObook, we still believed in the timelines provided by Google employees on when Penguin was going to run next. Remember that? Oh, the memories.

Idiot Proof SEO Concepts You Better Not Screw Up For Me

The reason I’m back is to share a tip. Normally I don’t share SEO tips because by sharing information on a tactic, I end up burning the tactic and killing whatever potential usable market value remained on its shelf life. Why share then? Because this isn’t something you can kill; it involves people. And killing people is bad. To explain how it works though, I need to explain the two concepts I’m smashing together like chocolate and peanut butter.

Keepin' it REAL.

Chocolate

The chocolate, aka Influencer Marketing – my definition of influencer marketing is having someone tell your story for you. Some people view influencer marketing as paying someone like Kim Kardashian $50,000 to post a picture of herself on Instagram holding a sample of your new line of kosher pickles. While that does fit under my definition as well, I consider that aspirational influencer marketing, since her audience is primarily comprised of people aspiring to be Kim. Equally valid is having Sally, your foodie neighbor, post that picture in exchange for a free jar of those delicious pickles; in this particular case the influence would be considered peer-level influence, since Sally’s audience is comprised largely of people that view Sally as their equal, and possibly recognize that Sally, as a foodie, knows her food. Personally, I am biased, but I prefer lots of peer influence campaigns over a single big-budget aspirational influence campaign, but I digress. If you want to learn a lot more about the differences in the campaign types, I spoke with Bronco on the ins and outs of influence.

Peanut Butter

The peanut butter, aka Online Reputation Management, aka ORM – while I would hope reputation management doesn’t need to be specifically defined, I’ll define it anyhow as changing the online landscape for the benefit of a client’s (or your own) reputation. Peanut butter is a really good analogy for ORM because a lot of work gets spread around in a lot of directions, from creating hundreds of thousands of properties designed to flood the SERPs and social channels as a tail that wags the dog, to straight-up negative SEO. Yeah, I said it. If negative SEO hadn’t been made so much more accessible by Panda, Penguin, and the philosophical negative-a-priori shift, ORM would not be the industry that it is today.

So what’s the tip? You can combine these two concepts for your clients, and you can do it in a variety of different ways. Let’s walk through a few…

POSITIVE/BENIGN Focus

  1. Use aspirational influence to find a blogger/writer to talk about your client or product.
  2. Use peer influence indirectly to let a more difficult to approach blogger/writer “discover” your client and write about him or her.
  3. Use aspirational influence as a means to gain links to some properties. Seriously, this works really well. Some audiences will write a series of articles on whatever certain individuals write about.
  4. Use peer influence to change tone/meaning of a negative article to something more benign.
  5. Use peer influence to find bloggers/writers to discuss concepts that can only be discussed by referencing you or your client.

NEGATIVE Focus

  1. Use peer pressure influence to get material removed.
  2. Use aspirational influence to change the mind of blogger/writer (think politics – this works).
  3. Use peer influence to change links from one target to another in source material (this occurs quite a bit on Wikipedia too).
  4. THE TRUMP® CARD©: Use aspirational influence and peer influence in combination, which I call compulsion marketing, to inspire frightening movements and witch hunts (coordinated DoS attacks, protests, crap link blasts, et al.).

My business partner at my influencer marketing network Intellifluence, Terry Godier, and I also refer to some of the above topics under the umbrella of dark influence. I’m sure this list isn’t even close to exhaustive, mainly because I don’t want to go too deep on how scary one can get. If you need to address such things, I still take on select ORM clients at Digital Heretix and can help you out or refer you to a quality professional that will. Combining concepts and tactics is often a lot more fun than trying to approach a tactic singularly; when possible, work in multiple dimensions.

Think of a way that I missed or some cool concepts that could be paired to be more powerful? Let me know on Twitter.

Cheers,
Joe Sinkwitz


from SEO Book http://ift.tt/2aQxk9X
via google

Facebook’s Panda Update

So far this year publishers have lost 52% of their Facebook distribution due to:

Instant Articles may have worked for an instant, but many publishers are likely where they were before they made the Faustian bargain, except they now have less control over their content distribution and advertising while having the higher cost structure of supporting another content format.

When Facebook announced their news feed update to fight off clickbait headlines, it sure sounded a lot like the equivalent of Google’s Panda update. Glenn Gabe is one of the sharpest guys in the SEO field, who regularly publishes insightful content & doesn’t blindly shill for the various platform monopolies dominating the online publishing industry, & he had the same view I did.

Further cementing the “this is Panda” view was an AdAge article quoting some Facebook-reliant publishers: glad we have already shifted our ways, nice to see them moving in the same direction we are, etc. … It felt like reading a Richard Rosenblatt quote in 2011 about Demand Media’s strong working relationship with Google, or how right after Panda their aggregate traffic level was flat.

January 27, 2011

Peter Kafka: Do you think that Google post was directed at you in any way?

Richard Rosenblatt: It’s not directed at us in any way.

P K: they wrote this post, which talks about content farms, and even though you say they weren’t talking about you, it left a lot of people scratching their heads.

R R: Let’s just say that we know what they’re trying to do. … He’s talking about duplicate, non-original content. Every single piece of ours is original. … our relationship is synergistic, and it’s a great partnership.

May 9, 2011

Kara Swisher: What were you trying to communicate in the call, especially since investors seemed very focused on Panda?

R R: What I also wanted to show was that third-party data sources should not be relied on. We did get affected, for sure. But I was not just being optimistic, we wanted to use that to really understand what we can do better.

K S: Given Google’s shift in its algorithm, are you shifting your distribution, such as toward social and mobile?

R R: If you look at where trends are going, that’s where we are going to be.

K S: How are you changing the continued perception that Demand is a content farm?

R R: I don’t think anyone has defined what a content farm is and I am not sure what it means either. We obviously don’t think we are a content farm and I am not sure we can counter every impact if some people think we are.

A couple years later Richard Rosenblatt left the company.

Since the Google Panda update eHow has removed millions of articles from their site. As a company they remain unprofitable a half-decade later & keep seeing YoY media ad revenue declines in the 30% to 40% range.

Over-reliance on any platform allows that platform to kill you. And, in most cases, you are unlikely to be able to restore your former status until & unless you build influence via other traffic channels:

I think in general, media companies have lost sight of building relationships with their end users that will bring them in directly, as opposed to just posting links on social networks and hoping people will click. I think publishers that do that are shooting themselves in the foot. Media companies in general are way too focused on being where our readers are, as opposed to being so necessary to our readers that they will seek us out. – Jessica Lessin, founder of TheInformation

Recovering former status requires extra investment far above and beyond what led to the penalty. And if the core business model still has the same core problems there is no solution.

“I feel pretty confident about the algorithm on Suite 101.” – Matt Cutts

Some big news publishers are trying to leverage video equivalents of a Narrative Science or Automated Insights (from Wochit and Wibbitz) to embed thousands of autogenerated autoplay videos in their articles daily.

But is that a real long-term solution to turn the corner? Even if they see a short term pop in ad revenues by using some dumbed-down AI-enhanced low cost content, all that really does is teach people that they are a source of noise while increasing the number of web users who install ad blockers.

And the whole time penalized publishers try to recover the old position of glory, the platform monopolies are boosting their AI skills in the background while they eat the playing field.

The companies which run the primary ad networks can easily get around the ad blockers, but third party publishers can’t. As the monopoly platforms broadly defund ad-based publishing, they can put users “in control” while speaking about taking the principle-based approach:

“This isn’t motivated by inventory; it’s not an opportunity for Facebook from that perspective,” Mr. Bosworth said. “We’re doing it more for the principle of the thing. We want to help lead the discussion on this.” … Mr. Bosworth said Facebook hasn’t paid any ad-blocking software company to have its ads pass through their filters and that it doesn’t intend to.

Google recently worked out a deal with Wikimedia to actually cite the source of the content shown in the search results:

it hasn’t always been the easiest to see that the material came from Wikipedia while on mobile devices. At the Wikimedia Foundation, we’ve been working to change that.

While the various platforms ride the edge on what is considered reasonable disclosure, regulatory bodies crack down on individuals participating on those platforms unless they are far more transparent than the platforms are:

Users need to be clear when they’re getting paid to promote something, and hashtags like #ad, #sp, #sponsored – common forms of identification – are not always enough.

The whole “eating the playing field” trend is vastly under-reported, largely because almost everyone engaged in the ecosystem needs to sell the idea that they have some growth strategy.

The reality is as the platform gets eaten it only gets harder to build a sustainable business. The mobile search interface is literally nothing but ads in most key categories. More ads. Larger ads. Nothing but ads.

And a bit of scrape after the ads to ensure the second or third screen still shows zero organic results.

And more scraping, across more categories.

What’s more, even large scaled companies in big money fields are struggling to monetize mobile users. On the most recent quarterly conference call TripAdvisor executives stated they monetize mobile users at about 30% the rate they monetize desktop or tablet users.

What happens when the big brand advertisers stop believing in the narrative of the value of precise user tracking?

We may soon find out:

P&G two years ago tried targeting ads for its Febreze air freshener at pet owners and households with large families. The brand found that sales stagnated during the effort, but rose when the campaign on Facebook and elsewhere was expanded last March to include anyone over 18.

P&G’s push to find broader reach with its advertising is also evident in the company’s recent increases in television spending. Toward the end of last year P&G began moving more money back into television, according to people familiar with the matter.

For mobile to work well you need to be a destination & a habit. But screen space is tiny and navigational searches are also re-routed through Google-hosted content (which will, of course, get monetized).

In fact, what would happen to an advertiser if they partnered with other advertisers to prevent brand bidding? Why that advertiser would get sued by the FTC for limiting user choice:

The bidding agreements harm consumers, according to the complaint, by restraining competition for, and distorting the prices of, advertising in relevant online auctions, by reducing the number of relevant, useful, truthful and non-misleading advertisements, by restraining competition among online sellers of contact lenses, and in some cases, by resulting in consumers paying higher retail prices for contact lenses.

If the above restraint of competition & market distortion is worth suing over, how exactly can Google make the mobile interface AMP exclusive without earning a similar lawsuit?

AMP content presented in both sections will be “de-duplicated” in order to avoid redundancies, Google says. The move is significant in that AMP results will now take up an entire phone screen, based on the example Google shows in its pitch deck.

Are many publishers in a rush to support Google AMP after the bait-n-switch on Facebook Instant Articles?


from SEO Book http://ift.tt/2aXBj3I
via google

Brands Beat Generics

When markets are new they are unproven, thus they often have limited investment targeting them.

That in turn means it can be easy to win in new markets just by virtue of existing.

It wouldn’t be hard to rank well creating a blog today about the evolution of the 3D printing industry, or a how-to site focused on Arduino or Raspberry Pi devices.

Couple a bit of passion with significant effort & limited competition and winning is quite easy.

Likewise in a small niche geographic market one can easily win with a generic, because the location acts as a market filter which limits competition.

But as markets age and become more proven, capital rushes in, which pushes out most of the generic unbranded players.

Back in 2011 I wrote about how Google had effectively killed the concept of category killer domains through the combination of ad displacement, vertical search & the algorithmic ranking shift moving away from relevancy toward awareness. Two months before I wrote that post, Walgreen Co. acquired Drugstore.com for about $429 million. At the time Drugstore.com was one of the top 10 biggest ecommerce pure plays.

On Thursday Walgreens Boots announced it would shut down Drugstore.com & Beauty.com:

The company is still trying to fine tune its e-commerce strategy but clearly wants to focus more of its resources on one main site. “They want to make sure they can invest more of the equity in Walgreens.com,” said Brian Owens, a director at the consultancy Kantar Retail. “Drugstore.com and Beauty.com are distractions.”

Big brands can sometimes get coverage of “meh” content by virtue of being associated with a big brand, but when they buy out pure-play secondary e-commerce sites those often fail to gain traction and get shuttered:

Other retailers have picked up pure-play e-commerce sites, only to shut them down shortly thereafter. Target Corp. last year shuttered ChefsCatalog.com and Cooking.com, less than three years after buying them.

The lack of publishing savvy among most large retailers means there will be a water cycle of opportunity which keeps re-appearing; however, as the web gets more saturated, many of these opportunities are going to become increasingly niche options riding new market trends.

If you invest in zero-sum markets there needs to be some point of differentiation to drive switching. There might be opportunity for a cooking.com or a drugstore.com targeting emerging and frontier markets where brands are under-represented online (much like launching Drugstore.com in the US back in 1999), but it is unlikely pure-play ecommerce sites will be able to win in established markets if they use generically descriptive domains which make building brand awareness and perceived differentiation next to impossible.

Target not only shut down cooking.com, but they didn’t even bother redirecting the domain name to an associated part of their website.

It is now listed for sale.

Many short & generic domain names are guaranteed to remain in a purgatory status.

  • The price point is typically far too high for a passionate hobbyist to buy them & attempt to turn them into something differentiated.
  • The names are too generic for a bigger company to do much with them as a secondary option:
    • the search relevancy & social discovery algorithms are moving away from generic toward brand
    • retailers have to save their best ideas for their main branded site
    • the rise of cross-device tracking + ad retargeting further incentivizes them to focus exclusively on a single bigger site

from SEO Book http://ift.tt/2aH5IGb
via google