Search Engine Optimization: SEO News

SocialPro 2016 recap: What Your Social Data Should Be Teaching You

Sphinn: Hot Topics - Fri, 07/08/2016 - 7:39am
Are you getting the most out of your social data? Columnist Benjamin Spiegel summarizes a SocialPro session that included some surprising data-driven findings.

Please visit Marketing Land for the full article.

MarTech Today: Yahoo bots for Facebook Messenger, leveraging marketing data & more

Sphinn: Hot Topics - Fri, 07/08/2016 - 6:30am
Here’s our daily recap of what happened in marketing technology, as reported on Marketing Land and other places across the web. From Marketing Land: Yahoo delivers stocks, news, weather & adoptable monkey bots to Facebook Messenger Jul 7, 2016 by Greg Finn Thanks to a slew of new Yahoo...

Please visit Marketing Land for the full article.

Is an internal linking strategy paying off for Mail Online?

Search Engine Watch - Fri, 07/08/2016 - 5:55am

Combining hub pages for key topics with well-planned internal linking can be a very effective strategy to secure consistent search rankings for target keywords. 

It’s become an essential tactic for publishers and others, especially when you are regularly creating content around a particular topic.

The risk of producing a lot of content around the same topic is that you can end up with multiple pages which have similar keywords which compete against each other in Google for the same search terms.

For example, USA Today had ten different articles ranking for the term ‘Kylie Jenner’ during a six-month period last year. As each new one came along, it battled with the existing articles, with the end result being very inconsistent search performance.

The answer to this problem is to decide on a page that you want your site to rank for a given keyword or phrase, and concentrate on that. This hub, category or landing page (however you want to describe it) can then be the page that ranks for the term.

Sites can then consistently link to that page from new articles on the topic, eventually creating a useful resource, and one that stands a better chance of gaining high rankings than lots of individual pages.

One such example is the BBC’s Euro 2016 category page. Here it is:

It’s a repository for all of the site’s content around the tournament, and it ranks consistently.

It should also be noted that the groundwork for this was carried out well in advance of the start of Euro 2016 in early June so that, when the spike in interest around the term happened, the BBC was in position to attract plenty of traffic.

This is the BBC’s search rankings for the term ‘Euro 2016’ for the five months up to the start of the tournament. Nice and consistent.

This well-planned use of hub pages, along with consistent internal linking, can really pay off. In the BBC’s case, it has ensured that its Euro 2016 page is in a great position to capitalise on increased interest from searchers around the tournament.

Of course, other factors have to be in place too. The BBC is an authority site with some excellent content and a formidable number of backlinks. Effective linking and theming will help any site, but other factors have to be in place to achieve high rankings for competitive search terms.

That said, it should not be beyond major publishers to profit from this strategy, and the example I’m going to use here is Mail Online. It is, by some accounts, the most visited English-language newspaper site on the web. Make of that what you will.

Mail Online and internal linking

Mail Online, until late last year, hadn’t been implementing a hub page / internal linking strategy at all.

We know this thanks to Dan Barker (@danbarker on Twitter) who pointed this out. He estimates that Mail started this strategy around October 25 last year.

Mail Online creates and publishes huge quantities of articles about celebrities and news. While each new article performs relatively well in search, it only does so for a limited time: the article ages and its search positions drop until it is usurped by a newer article, and so on. This is where a proper hub page strategy can help.

As we can see from the example below for the term ‘chelsea news’, ranking was inconsistent until early November 2015.

The chart shows search results for this term across the entire Daily Mail domain.

The consistent results post-November are for this hub page, which collates all the articles around that term.

Essentially, Mail Online has sent clear signals to Google, through (relatively) consistent internal linking, that this is the page it wants to rank for the term in question.

The hub page had existed before, but without the right linking strategy to promote it. Here we can see the difference in performance before and after the Mail improved its linking strategy. 

The charts above (all charts are from PI Datametrics btw) show performance up to January 2016, but we can also see how it performed in the last six months.

The chart below shows the Daily Mail’s Chelsea landing page performance for the term ‘Chelsea news’.

Since January, there have only been 26 URL changes, and performance has been a lot steadier. As a result, visibility for this page has improved by 33.28%, and this URL is visible 98.1% of the time.

The chart below shows the hub / landing page’s performance. It’s mainly consistent, but shows that for the odd day or two, the page wasn’t visible.

This landing page hasn’t beaten its previous ranking of number five on Google.

The reason? Inconsistent linking. For maximum effectiveness, all mentions of the term on new articles should be linked back to the hub page. If this is not implemented, then newer pages can end up competing with the hub page for rankings. This is why it was visible for 98.1%, not 100% of the period shown.

Here’s another example, for the search term ‘David Cameron’. As the British PM (though not for much longer) he obviously attracts a lot of searches and mentions in the news.

This is the Daily Mail domain view for ‘David Cameron’. As with ‘chelsea news’, performance is inconsistent until November 2015.

After November, the Mail is linking to a landing/hub page more consistently (maybe the result of a staff training day on SEO?) and it has led to steadier rankings.

Here’s one example. It’s easy enough to implement.

However, as was the case with the previous example term, inconsistent linking means that Mail Online isn’t getting the full benefit.

Here’s a recent article mentioning David Cameron. No internal links.

Here’s the view of the David Cameron landing page for the past (almost) 12 months.

There’s been an increased number of URL changes, as newer pages compete with the hub page, but the overall visibility of this URL has improved and the ranking has increased by two positions.

Thanks to the EU referendum, there has obviously been a lot more content produced about David Cameron recently. Had the Mail linked consistently back to the landing page, this content would have been a lot more visible.

In summary: could do better

These examples show how effective hub pages and internal linking can be, especially for sites that produce a lot of content around the same themes.

They also demonstrate how quickly sites can achieve results with this strategy. However, consistent implementation is key for maximum effect.

When applied consistently across a range of popular terms, this strategy delivers higher and steadier rankings, putting the site in a position to attract more search traffic.

Five most interesting search marketing news stories of the week

Search Engine Watch - Fri, 07/08/2016 - 3:55am

Rebecca here, filling in for Christopher this week – and yes, these are some nice, roomy shoes I’m standing in, thanks for asking.

I don’t have any witty comments to make about the current state of UK politics, so let’s dive straight into the search-related news you might have missed while you were outside cloud-gazing, collecting butterflies, or (if you’re British) making the most of the fact that we can still travel freely between other EU countries…

As usual, Google is where it’s at this week, with news that HTTPS websites account for a third of first-page search results, an increase in the number of queries that receive a Quick Answer Box, and a new acquisition that might mean Google is finding new ways to watch us all creepily.

Google is increasing the number of queries that receive a Quick Answer box

Jim Yu reported for Search Engine Watch this week on the fact that the portion of Google search results which received a Quick Answer box has increased from just over 20% in December 2014 to more than 30% in May 2016.

A Quick Answer, also known as a featured snippet, is when Google pulls content from a trusted, high-ranking website that will directly answer a user’s query and places it at the top of the SERP so that they can find the information they need without having to click through to another site. It can be an awesome way to dominate the SERP without having to fight for the top position.

Jim looked at the impact that Google Quick Answers have on brands, and broke down the three-step framework for getting your content into a quick answer box. So now you can win the game without even playing it, too!

Why should you use featured snippets? "It lets you win the game without playing" @STATRob #BrightonSEO

— Search Engine Watch (@sewatch) April 22, 2016

30% of Google search results are HTTPS websites

A new study from Moz has revealed that more than 30% of websites on page one of Google search use the HTTPS protocol. We know that HTTPS has been a “lightweight ranking signal” for Google since 2014, and the data that Moz has been tracking bears that information out.

Christopher Ratcliff looked at how the share of Google search results on page one that use HTTPS has climbed from an initially tiny fraction in August 2014 to a significant share of the results.

As Christopher put it,

“The results are definitely enough to give SEOs pause for thought when it comes to considering whether to switch their sites to a secure protocol.”

Mobile searches on Google have now exceeded desktop – how has the landscape of search changed?

Jason Tabeling looked at how the landscape of search has changed now that, for the first time ever, mobile searches on Google have exceeded desktop. “To account for this massive shift, Google has made some of the most drastic changes to search results in years,” including removing right-hand side ads and adding a fourth paid listing above organic search results, causing mobile results to be filled with ads.

Jason broke down the data on the number of times paid ads, shopping results or local listings appear in search results and evaluated how the information should affect your search strategy.

Google acquires image recognition startup Moodstocks

Google announced yesterday that it has acquired Moodstocks, a French startup specialising in machine-learning-based image recognition technology for smartphones.

As the International Business Times reported,

“Following the acquisition, which is expected to be completed in the next few weeks, the Moodstocks team will join Google’s R&D team in Paris where they will continue to “build great image recognition tools within Google”.”

Between the Twitter acquisition of Magic Pony two weeks ago and Amazon’s acquisition of AI startup Orbeus in April, it seems that visual processing and machine learning are where it’s at for major tech companies.

The Sun gave a particularly hysterical take on this development by announcing that Google had revealed plans to put “eyes in machines” and that “campaigners” had urged Britons to “cover up cameras on smartphones and computers”. Er… that sounds a little impractical.

Photo by Patrick Barry, available via CC BY-SA 2.0

But in all seriousness, this latest addition to Google’s R&D department could be the first step towards giving Google the capability to identify and run a search for objects in the physical world, à la CamFind.

And if Google really is watching me, well, maybe it’ll finally be able to tell me where I left my keys.

Google is experimenting with another use for Google Posts

The SEM Post reported this week that Google has been spotted expanding its use of Google Posts, a new(ish) feature combining elements of social publishing and rich cards, into more ‘ordinary’ search results.

Moz marketing scientist Dr. Pete Meyers originally noticed the posts appearing in search results for a charter school in New York, KIPP NYC. Google debuted the feature, which I believe still lacks an official name (but has been dubbed “Google Posts” by the search commentariat), back in March as a platform for US presidential candidates to put across their policies.

It was later seen expanding the feature to include a select handful of local businesses, and then using it to cover the I/O developer conference in May. None of these past uses of Posts show up in search results any more – and at the time of writing, KIPP’s appears to have disappeared too – making them a bit like a pop-up soapbox for select entities (and keeping us all guessing about what Google’s eventual plan is for Posts).

What’s interesting is that although the KIPP NYC posts were only just spotted in search, a scroll down their Google Posts page shows that the school has been using Google’s new feature since April. In other words, there could be any number of other lucky users or groups quietly using the platform and waiting for the hallowed spotlight of Google to finally, finally shine on them. And we wouldn’t know.

If you want a shot at joining their ranks, the waiting list is still open.

SearchCap: Bing ads Keyword Planner, SEO dirt & more

Search Engine Land - Thu, 07/07/2016 - 2:00pm
Below is what happened in search today, as reported on Search Engine Land and from other places across the web. The post SearchCap: Bing ads Keyword Planner, SEO dirt & more appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Marketing Day: Yahoo Facebook Messenger bot, Android Wear & Twitter Wimbledon

Sphinn: Hot Topics - Thu, 07/07/2016 - 2:00pm
Here’s our recap of what happened in online marketing today, as reported on Marketing Land and other places across the web.

Please visit Marketing Land for the full article.

Bing Ads bolsters Keyword Planner targeting and rolls out access to the UK, Canada & Australia

Search Engine Land - Thu, 07/07/2016 - 1:47pm
Keyword Planner begins expansion outside of the US and gives users some new ways to discover keywords in the process. The post Bing Ads bolsters Keyword Planner targeting and rolls out access to the UK, Canada & Australia appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

SMX Advanced recap: Lies, Damn Lies, and Search Marketing Statistics

Search Engine Land - Thu, 07/07/2016 - 11:00am
Are your marketing tests giving you valid results? Columnist Mark Traphagen summarizes insights from a presentation by Vistaprint's Adria Kyne at SMX Advanced. The post SMX Advanced recap: Lies, Damn Lies, and Search Marketing Statistics appeared first on Search Engine Land.

Please visit Search Engine Land for the full article.

Eight key improvements from an ecommerce site relaunch

Search Engine Watch - Thu, 07/07/2016 - 7:40am

Ecommerce site Simply Hammocks relaunched a few months ago, and has so far achieved some impressive results. 

The changes to the site have led to higher conversion rates and a 60% year on year increase in sales in the two months since relaunch.

The site, along with other brands from The Simply Group, was acquired last year by Scott Woodhead, who has founded other successful commerce sites, such as My Glasses Club and Loving Outdoors.

It seems that Simply Hammocks needed to work to bring it up to date and to speed (literally), hence the relaunch.

Here, I’ll look at some of the changes made to the site as part of this process, and some of the early gains that have resulted.

Making the site mobile-friendly

Simply Hammocks has managed to increase mobile conversion rates from 0.06% to 2.3% by optimising for mobile.

Previously, the site wasn’t mobile-friendly at all, so mobile users had to work hard to make it through checkout.

Here’s the old version on mobile:

And the new version:

Optimising for speed

Site speed matters. It’s about providing a better user experience, which in turn helps to improve conversion rates and reduce site abandonment.

(image taken from Tom Bennet’s Brighton SEO presentation)

It will also be increasingly important for SEO, as Google is looking to make page speed a factor in its next mobile algorithm update.

According to The Simply Group co-founder Scott Woodhead:

We ran all images through minify, which saved us 1.4GB in file size. We moved over to Shopify, which utilises a content delivery network and load balancing servers, so the website was faster under traffic.

We also removed unnecessary apps. One in particular was an installed module that wasn’t being used but was loading JavaScript on every page from a company that didn’t exist. This added three seconds of load time to every page!

In addition, the homepage banner, which used to be a 16MB image, was optimised for speed.

Social proof

Social proof can be very effective for ecommerce sites when used well. Here, Simply Hammocks has introduced pop-ups which appear to show which products other customers are buying right now.

They are relatively subtle pop-ups which don’t interrupt the user, but instead give people a gentle nudge.

They appear roughly every five to ten seconds (at least around 10am on a Monday morning) and give the impression that:

  1. People are buying from the site, which reinforces trust. If other customers are buying, then it must be OK etc…
  2. People are buying at regular intervals, which creates a sense of urgency.

According to Scott, these pop-ups have increased conversions by 40% since implementation.

Improved product imagery

The importance of product images for online retailers shouldn’t be underestimated, as they form a key part of the customer’s decision on whether or not to make a purchase.

Here, the product pages have multiple images which allow potential buyers to see the product in context, as well as more practical shots which show a close up of the material, as well as the storage bag.

The site has also added some quirky imagery – you’ll see the odd tortoise or dropped ice cream on some of the product shots.

Previously, images were mainly garden shots, meaning that the category pages were a sea of green. This made it harder for individual products to stand out.

A focus on delivery

Delivery is a key part of the customer experience, and shoppers increasingly expect greater convenience in terms of shipping options and speed.

Indeed, a recent survey found that 42% of customers have higher expectations around delivery now than two years ago, with factors such as greater control a key issue.

The site offers free express delivery, with all items shipped the next day. It displays the rate of next day arrivals (97%) at the top of the site as a sales driver.

Product page detail

Most of these hammocks are in the £70-£250 price bracket, and naturally customers want some detailed information before committing to buy.

To this end, there’s plenty of detail about the products, including buyer’s guides, detailed information on size and materials, as well as product videos.

The site also offers downloadable instruction manuals, useful for customers who may have misplaced theirs, but also a great SEO tactic, as people searching for manuals are likely to end up on this site.

As mentioned above, the product images and variety of views also help to make the pages more effective than before.

An improved content strategy

There are some who would cast doubt on the value of blogging for online retailers, but it can be very effective when done well, and I’ve seen examples where relatively niche retailers can gain an advantage.

Content which directly addresses customer queries and concerns around products (such as how-to guides) can be very effective.

Here’s an example, which answers a common customer question:

It’s useful content for customers, but also helps pick up search traffic. The key thing here is that this content is targeting a key audience of potential hammock buyers.

Thanks to this content, the site ranks well for this query, and will continue to attract traffic for some time to come.

Smooth checkout

There’s no forced registration, the checkout has been enclosed with distractions removed, and forms work well. All key factors in a successful checkout process.

In summary

The site is a work in progress, but some of the key issues which affect conversion have been addressed already. The site is faster, works on mobile, product pages are effective, and the checkout works as it should.

There are more improvements planned, such as hammock buying tools, greater choice over delivery slots, and other social proof features such as showing items which are frequently purchased together.

The improvements here offer a lesson on the areas to prioritise when redesigning ecommerce sites, as well as the potential gains which can be made.

For more information on this topic, see our Ecommerce Checkout Best Practice Guide.

For more reports, including guides on mobile commerce, customer experience, and social customer service, head to ClickZ Intelligence

Imitating search algorithms for a successful link building strategy

Search Engine Watch - Thu, 07/07/2016 - 4:34am

When putting together a link building strategy – or any other type of SEO-related strategy, for that matter – you might find the sheer amount of data available to us at relatively low cost quite staggering.

However, that abundance can create a problem in itself: selecting the best data to use, and then using it in a way that makes it useful and actionable, can be a lot more difficult.

Creating a link building strategy and getting it right is essential – links are one of the top two ranking signals for Google.

This may seem obvious, but to ensure you are building a link strategy that will benefit from this major ranking signal and can be used on an ongoing basis, you need to do more than just provide metrics on your competitors. You should be identifying link building and PR opportunities that guide you towards the sites you need to acquire links from for the most benefit.

To us at Zazzle Media, the best way to do this is by performing a large-scale competitor link intersect.

Rather than just showing you how to implement a large scale link intersect, which has been done many times before, I’m going to explain a smarter way of doing one by mimicking well-known search algorithms.

Doing your link intersect this way will always return better results. Below are the two algorithms we will be trying to replicate, along with a short description of both.

Topic-sensitive PageRank

Topic-sensitive PageRank is an evolved version of the PageRank algorithm that passes an additional signal on top of the traditional authority and trust scores. This additional signal is topical relevance.

The basis of this algorithm is that seed pages are grouped by the topic to which they belong. For example, the sports section of the BBC website would be categorised as being about sport, the politics section about politics, and so on.

All external links from those parts of the site would then pass on sport or politics-related topical PageRank to the linked site. This topical score would then be passed around the web via external links just like the traditional PageRank algorithm would do authority.

You can read more about topic-sensitive PageRank in this paper by Taher Haveliwala; not long after writing it, he went on to become a Google software engineer. Matt Cutts also mentioned topical PageRank in this video.
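As a rough illustration of the idea – and emphatically not Google’s actual implementation – topic-sensitive PageRank can be sketched as ordinary PageRank where the teleportation step is biased towards a set of topical seed pages, so authority flows preferentially along topical paths:

```python
# A toy sketch of topic-sensitive PageRank. The only change from classic
# PageRank is the teleport vector: instead of jumping to a random page,
# the surfer jumps back to a seed page known to be about the topic.

DAMPING = 0.85

def topic_pagerank(links, topic_seeds, iterations=50):
    """links: dict mapping each page to a list of pages it links to.
    topic_seeds: set of pages categorised as being about the topic."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    # Teleport only to the topical seed pages, not uniformly across the web.
    teleport = {p: (1.0 / len(topic_seeds) if p in topic_seeds else 0.0)
                for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            inbound = sum(rank[q] / len(links[q])
                          for q in pages if p in links[q])
            new_rank[p] = (1 - DAMPING) * teleport[p] + DAMPING * inbound
        rank = new_rank
    return rank
```

With a uniform teleport vector this reduces to classic PageRank; biasing it towards, say, the BBC’s sport section gives sport-related pages a head start that then propagates out through their external links.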

Hub and Authority Pages

The idea of the internet being full of hub (or expert) and authority pages has been around for quite a while, and Google will be using some form of this algorithm.

You can see this topic being written about in this paper on the Hilltop algorithm by Krishna Bharat or Authoritative Sources in a Hyperlinked Environment by Jon M. Kleinberg.

In the first paper by Krishna Bharat, an expert page is defined as a ‘page that is about a certain topic and has links to many non-affiliated pages on that topic’.

A page is defined as an authority if ‘some of the best experts on the query topic point to it’. Here is a diagram from the Kleinberg paper showing hubs and authorities, along with unrelated pages linking to a site:

We will be replicating the above diagram with backlink data later on!

From this paper we can gather that to become an authority and rank well for a particular term or topic, we should be looking for links from these expert/hub pages.

We need to do this because these sites are used to decide who is an authority and should rank well for a given term. Rather than replicating the above algorithm at the page level, we will instead be doing it at the domain level, simply because hub domains are more likely to produce hub pages.
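For the curious, the hub/authority iteration from Kleinberg’s paper (commonly known as HITS) is simple enough to sketch in a few lines of Python. Again, this is an illustration of the concept rather than anything Google is known to run as-is:

```python
import math

def hits(links, iterations=20):
    """links: dict mapping each page to a list of pages it links to.
    Returns (hub_scores, authority_scores)."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A good authority is linked to by good hubs...
        for p in pages:
            auth[p] = sum(hub[q] for q in links if p in links[q])
        # ...and a good hub links to good authorities.
        for p in pages:
            hub[p] = sum(auth[q] for q in links.get(p, []))
        # Normalise so the scores stay bounded between iterations.
        for scores in (hub, auth):
            norm = math.sqrt(sum(v * v for v in scores.values())) or 1.0
            for p in scores:
                scores[p] /= norm
    return hub, auth
```

The mutual reinforcement is the key point: a page earns authority by being linked from strong hubs, and a hub earns its score by linking to strong authorities – which is exactly why links from hub domains are the ones worth chasing.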

Relevancy as a Link Signal

You probably noticed both of the above algorithms are aiming to do very similar things with passing authority depending on the relevance of the source page.

You will find similar aims in other link based search algorithms including phrase-based indexing. This algorithm is slightly beyond the scope of this blog post, but if we get links from the hub sites we should also be ticking the box to benefit from phrase-based indexing.

If anything, reading about these algorithms should be influencing you to build relationships with topically relevant authority sites to improve rankings. Continue below to find out exactly how to find these sites.

1 – Picking your target topic/keywords

Before we find the hub/expert pages we are aiming to get links from, we first need to determine which pages are our authorities.

This is easy to do, as Google tells us which sites it deems authorities within a topic in its search results. We just need to scrape the search results for related keywords that you want to improve rankings for and collect the top-ranking sites. In this example we have chosen the following keywords:

  • bridesmaid dresses
  • wedding dresses
  • wedding gowns
  • bridal gowns
  • bridal dresses

For scraping search results we use our own in-house tool, but the Simple SERP Scraper by the creators of URL Profiler will also work. I recommend scraping the top 20 results for each term.
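Whichever scraper you use, the collection step itself boils down to a loop over your keywords. A minimal sketch, where `fetch_serp` is a placeholder for whatever scraping tool or API export you have available (it is not a real library call):

```python
def collect_ranking_urls(keywords, fetch_serp, top_n=20):
    """Collect (keyword, url) pairs for the top_n results per keyword.
    fetch_serp is any callable returning a ranked list of URLs for a
    keyword -- a stand-in for your scraper or its CSV export."""
    rows = []
    for keyword in keywords:
        for url in fetch_serp(keyword)[:top_n]:
            rows.append((keyword, url))
    return rows
```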

2 – Finding your authorities

You should now have a list of URLs ranking for your target terms in Excel. Delete some columns so that you have just the URL that is ranking. Now, in column B, add a header called ‘Root Domain’. In cell B2, add the following formula:

=IF(ISERROR(FIND("//www.",A2)), MID(A2,FIND(":",A2,4)+3,FIND("/",A2,9)-FIND(":",A2,4)-3), MID(A2,FIND(":",A2,4)+7,FIND("/",A2,9)-FIND(":",A2,4)-7))

Expand the results downwards so that you have the root domain for every URL. Your spreadsheet should now look like this:
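If you would rather do this step outside the spreadsheet, the same root-domain extraction takes a couple of lines of Python using only the standard library:

```python
from urllib.parse import urlparse

def root_domain(url):
    """Extract the host from a URL, stripping any leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host
```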

Next add another heading in column C called ‘Count’ and in C2 add and drag down the following formula:

=COUNTIF(B:B,B2)

This will just count how many times that domain is showing up for the keywords we scraped. Next we need to copy column C and paste as a value to remove the formula. Then just sort the table using column C from largest to smallest. Your spreadsheet should now look like this:

Now we just need to remove the duplicate domains. We do this by going into the ‘Data’ tab in the ribbon at the top of Excel and selecting remove duplicates. Then just remove duplicates on column B. We can also delete column A so we just have our root domains and the number of times the domain is found within the search results.

We now have the domains that Google believes to be authorities on the wedding dresses topic.
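The counting and deduplication above can also be done in one pass outside Excel. A sketch, assuming you already have the full list of ranking URLs from step 1:

```python
from collections import Counter
from urllib.parse import urlparse

def authority_domains(ranking_urls):
    """Return (root domain, count) pairs sorted by how often each domain
    appears across the scraped search results, most frequent first."""
    hosts = []
    for url in ranking_urls:
        host = urlparse(url).netloc.lower()
        hosts.append(host[4:] if host.startswith("www.") else host)
    return Counter(hosts).most_common()
```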

3 – Export referring domains for authority sites

We usually use Majestic to get the referring domains for the authority sites – mainly because they have an extensive database of links. Plus, you get their metrics, such as Citation Flow and Trust Flow, as well as Topical Trust Flow (more on Topical Trust Flow later).

If you want to use Ahrefs or another service, you could use a metric that they provide that is similar to Trust Flow. You will, however, miss out on Topical Trust Flow. We’ll make use of this later on.

Pick the top domains with a high count from the spreadsheet we have just created. Then enter them into Majestic and export the referring domains. Once we have exported the first domain, we need to insert an empty column in column A. Give this column a header called ‘Competitor’ and then input the root domain in A2 and drag down.

Repeat this process and move onto the next competitor, except this time copy and paste the new export into the first sheet we exported (excluding the headers) so we have all the backlink data in one sheet.

I recommend repeating this until you have at least ten competitors in the sheet.

4 – Tidy up the spreadsheet

Now that we have all the referring domains we need, we can clean up our spreadsheet and remove unnecessary columns.

I usually delete all columns except the competitor, domain, TrustFlow, CitationFlow, Topical Trust Flow Topic 0 and Topical Trust Flow Value 0 columns.

I also rename the domain header to be ‘URL’, and tidy up the Topical Trust Flow headers.

5 – Count repeat domains and mark already acquired links

Now that we have the data, we need to use the same formula as earlier to highlight the hub/expert domains that are linking to multiple topically relevant domains.

Add a header called ‘Times Linked to Competitors’ in column G and add this formula in G2:

=COUNTIF(B:B,B2)

This will tell you how many competitors the site in column B is linking to. You will also want to mark domains that are already linking to your site, so that we are not building links on the same domain multiple times.

To do this, firstly add a heading in column H called ‘Already Linking to Site?’. Next, create a new sheet in your spreadsheet called ‘My Site Links’ and export all your referring domains from Majestic for your site’s domain. Then paste the export into the newly created sheet.

Now, in cell H2 in our first sheet, add the following formula:

=IFERROR(IF(MATCH(B2,'My Site Links'!B:B,0),"Yes",),"No")

This checks whether the URL in cell B2 appears in column B of the ‘My Site Links’ sheet and returns Yes or No depending on the result. Now copy columns G and H and paste them as values, just to remove the formulas again.

In this example, I have added ellisbridals.co.uk as our site.
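For anyone working outside Excel, the whole of step 5 – counting how many competitors each domain links to and flagging domains that already link to you – can be sketched as follows (both domain lists are assumed to be already exported from Majestic):

```python
def mark_existing_links(competitor_referrers, my_referrers):
    """competitor_referrers: list of referring domains across all competitor
    exports, one entry per (competitor, referrer) pair. my_referrers:
    domains already linking to our own site. Returns one row per unique
    referring domain."""
    already = {d.lower() for d in my_referrers}
    counts = {}
    for domain in competitor_referrers:
        d = domain.lower()
        counts[d] = counts.get(d, 0) + 1
    return [{"domain": d,
             "times_linked_to_competitors": n,
             "already_linking_to_site": "Yes" if d in already else "No"}
            for d, n in counts.items()]
```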

6 – Organic traffic estimate (optional)

This step is entirely optional but at this point usually I will also pull in some metrics from SEMrush.

I like to use SEMrush’s Organic Traffic metric to give a further indication of how well a website is ranking for its target keywords. If it is not ranking very well or has a low organic traffic score, this a pretty good indication the site has either been penalised, de-indexed or is just low quality.

Move on to step 8 if you do not want to do this.

To get this information from SEMrush, you can use URL Profiler. Just save your spreadsheet as a CSV, right click in the URL list area in URL Profiler then import CSV and merge data.

Next, you need to tick ‘SEMrush Rank’ in the domain level data, input your API key, then run the profiler.

If you are running this on a large set of data and want to speed up collecting the SEMrush metrics, you can remove domains with a Trust Flow score of 0–5 in Excel before importing. This eliminates the bulk of the low-quality, untrustworthy domains that you do not want to be building links from anyway. It also saves some SEMrush API credits!

7 – Clean-up URL profiler output (still optional)

Now that we have a new spreadsheet that includes the SEMrush metrics, you just need to clean up the output in the combined results sheet.

I will usually remove all columns that have been added by URL Profiler and just leave the new SEMrush Organic Traffic.

8 – Visualise the data using a network graph

I usually do steps 8, 9 and 10 so the data is more presentable, rather than having the end product as just a spreadsheet. It also makes it easier to filter the data to find your ideal metrics.

These are very quick and easy to create using Google Fusion Tables, so I recommend doing them.

To create them, save your spreadsheet as a CSV file, then go to create a new file in Google Drive and select the ‘Connect More Apps’ option. Search for ‘Fusion Tables’ and connect the app to your Drive account.

Once that is done, create a new file in Google Drive and select Google Fusion Tables. We then just need to upload our CSV file from our computer and select next in the bottom right.

After the CSV has been loaded, you will have to import the table by clicking next again. Give your table a name and select finish.

9 – Create a hub/expert network graph

Now that the rows are imported, we need to create our network graph by clicking on the red + icon and then ‘add chart’.

Next, choose the network graph at the bottom and configure the graph with the following settings:

Your graph should now look something like below. You may need to increase the number of nodes it shows so more sites begin to show up. Be mindful that the more nodes there are, the more demanding it is on your computer. You will also just need to select ‘done’ in the top right corner of the chart so we are no longer configuring it.

If you have not already figured it out, the yellow circles on the chart are our competitors, and the blue circles are the sites linking to our competitors. The bigger a competitor’s circle, the more referring domains it has. The linking sites’ circles get bigger depending on how much of a hub each domain is, because when setting up the chart we weighted them by the number of times they link to our competitors.

While the above graph looks pretty great, we have quite a lot of sites in it that fit the ‘unrelated page of large in-degree’ categorisation mentioned in the Kleinberg paper earlier as they only link to one authority site.

We want to be turning the diagram on the right into the diagram on the left:

This is really simple to do by adding the below filters.

Filtering so that only sites which link to more than one competitor are shown leaves just the hub/expert domains, while filtering by Trust Flow and SEMrush Organic Traffic removes lower-quality, untrustworthy domains.

You will need to play around with the Trust Flow and SEMrush Organic Traffic metrics depending on the sites you are trying to target. Our chart has now gone down to 422 domains from 13,212.
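If you prefer to prototype these filters before building the chart, the same logic is easy to express in a few lines of Python. The rows and thresholds below are invented stand-ins for the spreadsheet data:

```python
from collections import Counter

# Each row: (linking domain, competitor linked to, Trust Flow, organic traffic).
# Values are made up purely for illustration.
rows = [
    ("weddinghub.com", "competitor-a.com", 40, 5000),
    ("weddinghub.com", "competitor-b.com", 40, 5000),
    ("onelink.biz", "competitor-a.com", 2, 10),
    ("magazine.co.uk", "competitor-a.com", 55, 20000),
    ("magazine.co.uk", "competitor-c.com", 55, 20000),
]

# Count how many competitors each domain links to...
competitors_linked = Counter(domain for domain, _, _, _ in rows)

# ...then keep only hub-like domains above the quality thresholds.
MIN_TRUST_FLOW, MIN_TRAFFIC = 10, 1000
hubs = sorted({
    domain
    for domain, _, trust_flow, traffic in rows
    if competitors_linked[domain] > 1
    and trust_flow >= MIN_TRUST_FLOW
    and traffic >= MIN_TRAFFIC
})
```

The surviving domains are exactly the hub/expert sites the filtered chart shows: linked to more than one competitor, and trustworthy enough to be worth outreach.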

If you want to, at this point you can also add another filter to only show sites that aren’t already linking to your site. Here is what our chart now looks like:

The above chart is now a lot more manageable. You can see our top hub/expert domains we want to be building relationships with floating around the middle of the chart. Here is a close-up of some of those domains:

Your PR/Outreach team should now have plenty to be getting on with! You can see the results are pretty good, with heaps of wedding related sites that you should start building relationships with.

10 – Filter to show pages that will pass high topical PageRank

Before we get into creating this graph, I am first going to explain why we can substitute Topical Trust Flow by Majestic for Topic-sensitive PageRank.

Topical Trust Flow works in a very similar way to Topic-sensitive PageRank in that it is calculated via a manual review of a set of seed sites. Then this topical data is propagated throughout the entire web to give a Topical Trust Flow Score for every page and domain on the internet.

This gives you a good idea of what topic an individual site is an authority on. In this case, if a site has a high Topical Trust Flow score for a wedding related subject, we want some of that wedding related authority to be passed onto us via a link.

Now, onto creating the graph. Since we know how these graphs work, doing this should be a lot quicker.

Create another network graph just as before, except this time weight it by Topical Trust Flow value. Create a filter for Topical Trust Flow Topic and pick any topics related to your site.

For this site, I have chosen Shopping/Weddings and Shopping/Clothing. I usually also use a similar filter to the previous chart for Trust Flow and organic traffic to prevent showing any low-quality results.

Fewer results are returned for this chart, but if you want more authority passed within a topic, these are the sites you want to be building relationships with.

You can play around with the different topics depending on the sites you want to try and find. For example, you may want to find sites within the News/Magazines and E-Zines topic for PR.
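The topic filter works the same way. As a rough sketch, assuming a per-domain export with hypothetical Topical Trust Flow values:

```python
# Hypothetical per-domain topical data, standing in for the Majestic export.
domains = [
    {"domain": "bridemag.co.uk", "topic": "Shopping/Weddings", "topical_tf": 45},
    {"domain": "styleblog.com", "topic": "Shopping/Clothing", "topical_tf": 30},
    {"domain": "technews.io", "topic": "Computers/Internet", "topical_tf": 60},
]

TARGET_TOPICS = {"Shopping/Weddings", "Shopping/Clothing"}

# Keep only sites authoritative in the topics we care about, ordered by the
# Topical Trust Flow value the chart is weighted by.
topical_targets = sorted(
    (d for d in domains if d["topic"] in TARGET_TOPICS),
    key=lambda d: d["topical_tf"],
    reverse=True,
)
```

Swapping the topic set, as suggested above for News/Magazines and E-Zines, changes which sites surface without touching the rest of the logic.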

11 – Replicate Filters in Google Sheets

This step is very simple and does not need much explanation.

I import the spreadsheet we created earlier into Google Sheets and then just duplicate the sheet and add the same filters as the ones I created in the network graphs. I usually also add an ‘Outreached?’ header so the team knows if we have an existing relationship with the site.

I recommend doing this because, while the charts look great and visualise your data in a fancy way, a plain sheet makes it much easier to track which sites you have already spoken to.

Summary

You should now know what you need to do for your site or client to drive not just more link equity into the site, but the topically relevant link equity that will benefit them the most.

While this seems like a long process, these graphs do not take that long to create – especially when you weigh the time spent against the benefit the site will receive from them.

There are many more uses for network graphs. I occasionally use them to visualise internal links on a site and find gaps in an internal linking strategy.

I would love to hear any other ideas you may have to make use of these graphs, as well as any other things you like to do when creating a link building strategy.

Sam Underwood is a Search and Data Executive at Zazzle and a contributor to Search Engine Watch. 

HTTPS websites account for 30% of all Google search results

Search Engine Watch - Thu, 07/07/2016 - 2:29am

As of late June, 32.5% of page one Google results now use the HTTPS protocol, according to a new study from Moz.

The esteemed Dr Pete published a blog post this week on the data they’ve been tracking in the two year period since Google announced HTTPS was to be a light ranking signal in August 2014.

The results are definitely enough to give SEOs pause for thought when it comes to considering whether to switch their sites to a secure protocol.

What is HTTPS?

In case you need a refresher, here is Jim Yu’s explanation of the difference between http and HTTPS:

HTTP is the standard form used when accessing websites. HTTPS adds an additional layer of security by encrypting in SSL and sharing a key with the destination server that is difficult to hack.

And here is Google’s 2014 announcement:

“We’re starting to use HTTPS as a ranking signal. For now, it’s only a very lightweight signal, affecting fewer than 1% of global queries, and carrying less weight than other signals, such as high-quality content.”

But over time, the promise that Google would strengthen the signal “to keep everyone safe on the Web” seems to be coming true…

HTTPS as a ranking signal in 2014

Searchmetrics found little difference between HTTP and HTTPS rankings in the months after the initial Google announcement. Hardly surprising as it did only affect 1% of results.

Moz also saw very little initial difference. Prior to August 2014, 7% of page one Google results used HTTPS protocol. A week after the update announcement, that number increased to 8%.

So we all went about our business, some of us implemented, some of us didn’t. No big whoop. It’s not like it’s AMP or anything! Amirite?

SMASH CUT TO:

HTTPS as a ranking signal in 2016

Moz has found that one-third of page one Google results now use HTTPS.

As Dr Pete points out, due to the gradual progression of the graph, this probably isn’t due to specific algorithm changes as you would normally see sharp jumps and plateaus. Instead it may mean that Google’s pro-HTTPS campaign has been working.

“They’ve successfully led search marketers and site owners to believe that HTTPS will be rewarded, and this has drastically sped up the shift.”

Projecting forward, it’s likely that in 16–17 months’ time HTTPS results may hit 50%, and Dr Pete predicts an algorithm change to further bolster HTTPS in about a year.
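That timescale is consistent with a simple back-of-the-envelope extrapolation of Moz’s own numbers (roughly 7% in August 2014 rising to 32.5% in late June 2016, about 23 months later), assuming the growth stays linear:

```python
# Linear extrapolation of the Moz figures quoted above.
start_pct, end_pct, months_elapsed = 7.0, 32.5, 23

rate_per_month = (end_pct - start_pct) / months_elapsed  # ~1.1 points/month
months_to_50 = (50 - end_pct) / rate_per_month           # roughly 16 months
```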

How the future of advertising is in servicing the ‘moment’

Search Engine Watch - Wed, 07/06/2016 - 8:55am

Great advertising starts when a brand delivers a service to the consumer – rather than an ad, says Forbes 30 Under 30 entrepreneur, Brian Wong.

Wong is the 25-year-old co-founder and chief executive of Kiip – an advertising tool that allows advertisers to send ‘rewards’ to mobile users during moments of online achievement. It gives the 700 brands using the platform access to more than 200 million monthly active users across 5,000 apps.

*Image: Kiip

Ad blocking has forced the advertising industry into some self-reflection, while simultaneously pushing the trend for native and platform-driven content.

Viewability has become a key metric as a result, forcing brands to be more honest about where their ad dollars are going.

“It’s always important to have these moments of responsibility and accountability in an industry. After the whistle is blown, people are more careful, more conscious, and ultimately more responsible,” says Wong.

Here are Wong’s key tips for advertising that goes beyond reach and placements, engaging connected consumers during the moments when they are most receptive to ads.

1. Native advertising

Native advertising has become a key trend this year, largely as a way to outmanoeuvre the ad blockers. It’s also what has helped drive the rise of platforms like Snapchat and Hulu.

“There are platforms where the more native you are, the harder it is to block you,” says Wong.

He says users don’t want to have to skip or block an ad, they want an advertising model that’s entertaining. And a platform like Snapchat allows brands to do just that.

*Image: Snapchat

This dynamic is forcing advertisers (in a really good way, Wong adds) to be more conscious of their content.

“In the past, a brand would use its own creative and spew it out across thousands of publishers. That no longer works, you now have to be a lot more curated and a lot more intelligent about what you’re trying to push out there,” says Wong.

2. The connected consumer

Forget millennials, the real consumer a brand should be targeting is the connected one.

“Ultimately it’s not about age groups or generations – it’s about the level of connectivity that the consumer is experiencing,” says Wong.

This level of connectivity comes from the number of devices a consumer owns, and how often they are interacting with them.

Mobile is enhancing this level of constant connectivity in both developed and emerging economies, and for brands it means developing strategies for these constantly connected consumers.

Wong uses his mum as an example. She has an Apple Watch, an iPad and an iPhone.

“Just because I’m a millennial, doesn’t mean I’m in a special generation that requires such treatment. My mum requires special treatment.”

Brands need to recognize that consumers adopting these technologies expect special types of services that come from being constantly connected, and that’s not just millennials.

3. Servicing the moment

A core component of the Kiip tool is access to user ‘moments’. Wong and his team are using these moments to build a new metric around the modern connected consumer.

A ‘moment’ is when a user is connected to a device while experiencing a period of time where they’ve just done something ‘meaningful’. For example, they have achieved a new high score in a mobile gaming app, or hit a target during a work out.

This is where the value of the moment can be detected in real time. It puts a brand in a very advantageous position if it can communicate with the consumer at the moments when they are most receptive to an advertising message, says Wong.

*Image: Kiip

“We are trying to create a model where the brand is conscious of the time component, where someone is active on mobile and being respectful of that experience in bringing the advertising in to that moment,” he says.

The best way to do this is to engage with them via a reward on mobile. This leaves the consumer with something valuable, but something they can take away and engage with at a later time.

“That’s important on mobile – at the moment I am doing something else, so I’m not going to immerse myself into your brand. However, if you give me something to take away, I will be able to spend time with it later on,” says Wong.

*Image: Kiip

Where does data and privacy come into all of this? Wong believes privacy isn’t so much of an issue when the ad stops being an ad, and starts being a service to the consumer.

For example, the more a hotel knows about the consumer, the better it can serve them, and the relationship becomes about the experience.

4. Respectful advertising

It all comes down to being conscious of the consumer and the consumer experience. In short, the consumer has to come first.

That means if a click-through rate is just 2% to 5%, advertisers need to face up to the reality that 95% or more of people don’t want to consume that ad.

“Be respectful and cognisant that you need to create something the consumer wants,” says Wong.

In addition, it’s about timing. Wong says 90% of the battle is knowing what time to be a part of a consumer’s digital habits.

“You might have the best ad ever, but if you are there to interrupt them, you will get flack,” he says.

5. Awareness and product

A common mistake advertisers make is thinking that if a product is good it will sell itself.

“Unfortunately with all the noise out there, good advertising needs to accommodate that product,” says Wong.

The product still comes first, but these products need to be integrated into the fabric of a consumer’s daily life through advertising. The product itself begins to create new ways for the brand to become relevant, he says.

Good advertising and good products have to be combined – they can’t be mutually exclusive.

6. Data

Wong believes a common mistake brands make is running an ad campaign without paying attention to the data that is being generated from it. Worse still, the company the brand has just bought the ads from is probably using that data, whether that’s for things like retargeting or profiling.

Brands should therefore take hold of their data and store it in a DMP (data management platform), knowing they have it at their disposal for future campaigns.

7. Mobile first

Mobile will continue to be the dominant focus for advertisers as they target mobile first audiences. Traditional forms of CRM and advertising via television are no longer going to cut it.

The juggernauts like Facebook, Google and Twitter already have a lot of mobile first data, and more and more brands will start to look at the consumer from this point of view. Wong predicts that by 2020 most big brands will have a mobile profile of the consumer.

As an entrepreneur, Wong has this concluding advice: Never learn the rules. “When you go to different industries, it’s kind of cool not to know the rules because you are ultimately going to take people who have been in the industry for a long time by surprise,” he says.

*Brian Wong is a keynote speaker at ClickZ Live Hong Kong on August 3-4. Join him there to learn more about successful marketing to the connected consumer and what it takes to be a Silicon Valley startup entrepreneur at 19.

Nine considerations for movie-based SEO outreach campaigns

Search Engine Watch - Wed, 07/06/2016 - 8:31am

Alien invasions have had their costs calculated by finance companies. Fashion boutiques care about superhero costumes. Travel firms researched where films were made, and retailers know which gadgets spies use…

As a blogger I receive infographics throughout the week which contain great movie related content. Only the best of them will be published and only the clever will earn a link.

Movie based outreach ideas, designed to boost SEO with some earned links, seem like a good idea but there are plenty of issues to get your head around.

Since being forewarned is forearmed, let’s take a look at some.

Ahoy! Thar be lawyers!

Most movies have no objections to free publicity but some SEO campaigns can cross the line.

Usually, if brands or agencies are making anything physical – anything with demonstrable value – of trademarked and protected intellectual property then there is a risk.

It is safer to stay digital. Better still is to talk around the movie rather than directly about it, and anything you can do to avoid yet another infographic will be worthwhile.

For example, UK finance site Compare the Market teamed up with DC Entertainment to produce an actual superhero meerkat comic book. If any similar brand had attempted to print and publish their own unofficial Batman V Superman spin-off then it is likely legal teams would have been involved.

There is an upside: if the brand does land in hot water on the back of an unofficial movie tie-in, and if its skin is thick enough, you can turn that drama into a linkbait campaign.

If you are working with a brand large enough to partner officially with a movie, it is often worth asking in quarterly reviews whether there are any partnerships coming up. This is exactly the sort of opportunity that can be missed because no one thought to tell the agencies.

Find the signal in the noise

Data is important but it is no substitute for skill. You’ll find this distinction plays an important part in movie outreach.

Movies create a lot of buzz and can involve people with significance reach. This means you have to filter through a lot of noise. Is a movie popular with bloggers just because there is media attention?

Movie mentions may come from celebs or vloggers with no interest in the movie other than to interest their audience. You might even find actors who have appeared in the movie but have little chance of engaging with your brand.

In either scenario, if you decide to approach these influencers you will want to do so individually, and in cooperation with the PR, sponsorship and branding teams.

A better approach than qualifying targets on just one or two mentions is to consider the true “Vocality” of the potential influencer – how often they are talking about the movie or subjects related to it. The more often, the better. Also consider “Visibility”, a hybrid of reach and engagement. Both are important.

The danger with the single-mention route is that an apparently large list of outreach targets is more likely to waste your time, mismanage expectations or get your content onto low-quality sites than to help you connect with the type of blogger or influencer who will ignite interest around your content.

On the flipside, working with just one fan community can supercharge your campaign.

Understand image sharing

The overwhelming majority of movie inspired outreach and engagement campaigns – either for social or for SEO – lean heavily on highly visual content. Thankfully, visual doesn’t always mean infographic.

The leading influence analyzer platforms have the ability to study pictures as part of their research. This is worth doing. Finding a blogger with a tendency to share infographics, for example, is a valuable find.

Movies based on an established franchise – either a sequel, or a movie of a comic book or game – will be accessible and researchable based on visuals long before they hit theatres. This is a tactical advantage that should not be wasted.

Build on past successes

There are few occasions when an outreach and engagement campaign needs to begin from scratch and even fewer in which it should.

In most scenarios the brand or agency should already have relationships with bloggers and a strong idea of what content types are popular. This knowledge should be baked into the campaign. There’s no reason why key bloggers cannot be approached in advance, their input sought or even co-creation concepts explored.

For large, officially backed, movie campaigns relationships should be built in advance. Ideally no brand should be in the situation of having spent a considerable amount of money on a tie-in and content before an agency sees whether or not the influential bloggers on that topic are actually interested in it. Some dominant fan sites may even be offended if they’re not given the chance to be involved.

Pick a movie that suits your brand

Some movies work better for outreach than others. Movies with a large community – geek, for example – are easier targets.

There are generally lots of clever ways to make a brand relevant to the movie’s audience too. Sell suspended ceiling panels? Look around; maybe there’s a haunted house movie coming up, and that’ll give you the chance to talk about strange noises from the attic instead.

Sometimes there is a good connection between a brand and a movie, but it won’t be one the brand enjoys. What if a smart character in the haunted house movie argues against the lunacy of suspended ceiling panels? You won’t know until after the movie is released, and most outreach projects need to begin long before then.

There are times you just have to be careful.

Hipster-macho brand Bluebeard’s Revenge partnered with the Suicide Squad movie. There are overlapping demographics between the audiences. We also have a brand that sells straight razors partnering with a movie that has ‘Suicide’ in the title (and features The Joker, no less). This certainly doesn’t mean it is a bad partnership or that there is no outreach potential, but it does mean some careful handling will be needed.

Build Anchors

An anchor is one half of a handshake in a well-designed outreach and engagement campaign. It is the reason why a blogger or other publisher would make an editorial decision to link back to your site.

Campaigns with no or weak anchors sometimes ask bloggers to link back to ‘credit the source’. This rarely works as publishers you care about know about nofollow.

Surveys inspired by movies are a good example here. They’re easy to do, haven’t entirely jumped the shark, and can be used to pitch some big media. But do they offer bloggers a reason to link back to the brand that paid for the survey? Not so much.

Your SEO mind has to come up with a reason: create something that wonderfully complements the survey data, so that writers make the decision to reference it, and therefore link to it, in their coverage.

Provide media assets

Media assets are the other half of the handshake in an SEO outreach campaign. These assets make it easier for bloggers to cover your campaign. Every movie inspired outreach and engagement campaign should have an image on offer, at least.

For example, take a look at this simple parallax from Grange on 007 cars. It was timely, released at the same time as the movie Spectre. Bloggers know to avoid thin content (writing nothing but signposts that direct readers elsewhere), and coming up with an angle to discuss an interactive like this requires time, energy and imagination.

Grange’s agency made sure a ‘full flat’ version of the infographic was available as a media asset to mitigate that problem. A sensible thing to do is to create some WordPress-safe animated GIFs (in terms of file size and common theme widths) of interactions on your interactive, and make them available as assets to accompany your anchor.

Research international distribution plans

Movies are not released on the same date across the world. For example, Disney’s Big Hero 6 was released in the United States on 7 November 2014, but not until after Christmas in the UK, on 30 January 2015.

It’s far easier to do your outreach for big movies with similar release dates in your target markets and the good news is it’s easy enough to research schedules far in advance.

It’s not unheard of for movies to change names for international releases, either. This can make building an outreach and engagement campaign around them tricky. For example, the 2012 movie The Avengers was called Avengers Assemble in the UK so as not to clash with a classic TV series. The added catch? Most UK bloggers called the movie The Avengers anyway.

Work with your Affiliates

In most cases your affiliate marketing channel can be used to assist your SEO and social media activities, as it will give you access to hundreds of publishers.

Take just a little time to see how many content affiliates you have in your program. This information may be easily discoverable if the affiliate team have done this work already.

It’ll certainly be worth emailing them directly, saving money by preventing a second agency from doing exactly the same thing. Even cashback and voucher code sites are worth tipping off; they may well want to do some marketing around the movie too, and this could be a great chance to score some extra promotion.

However, if you are experimenting with attribution models, be aware of how providing linkbait and clickbait headlines to your army of publishers will influence that study. You’ll help those sites lay claim to radically more sales touchpoints.

Andrew Girdwood is Head of Media Technology at Cello Signal and a contributor to Search Engine Watch. You can connect with Andrew on Twitter or Google+.

What do content marketers need to know about SEO?

Search Engine Watch - Wed, 07/06/2016 - 7:37am

The way that search marketing has evolved over the last few years has brought content marketing and SEO ever closer together.

Content creation and SEO used to be very separate disciplines in the past, but now it’s hard to see how either can be practiced effectively without at least some knowledge of the other.

Which brings me to the question: what do content marketers need to know about SEO (and vice versa) to achieve the best results?

In this post I’ll make an attempt to answer that question, with the help of Kevin Gibbons, MD at Blueglass, and Sticky Content’s Content Director Dan Brotzel.

What do content marketers need to know about SEO?

Here are some of the tactics and skills, mainly associated with SEO, that content marketers need to be aware of.

I’ve not mentioned technical SEO here, though it is of course important as the foundations on which effective SEO (and content marketing) is built.

Keyword research

Keyword research enables SEOs to find gaps and opportunities to help target pages rank, by understanding the popularity of keywords.

For example, a site producing content around wedding dresses needs to understand the most popular terms used, and the range of terms they need to optimise their content around.

There are lots of useful SEO tools (many are free) which can help with keyword research. Google’s Keyword Planner and Trends are obvious ones (though Keyword Planner has just become less useful), while just typing terms into Google and seeing the suggested searches is another way.

One tool I’ve found useful recently is Answer the Public, which provides some great insight into the kinds of questions people have around a particular topic.

Here are the results for ‘content marketing’. It provides some great insight into the kinds of questions people are asking around a topic, which should help content marketers to target more effectively, as well as generating some useful ideas.

According to Dan Brotzel:

“Keywords and other SEO insights are a vital tool. They provide a useful (and often surprising) index of users’ preoccupations, and the words that they actually use to phrase their searches. Anyone in the business of generating ideas for content should see keyword research/data as a rich resource for understanding user intent and interest, and make it integral to their brainstorming process.”

Knowledge of user search behaviour

This is related to keyword research, as this provides insight into how people search, the language used etc. It’s more than that though…

Insights such as the seasonality of some searches can inform content planning, as can the way people view and interact with search results.

Effective content marketing looks at the target audience’s questions and concerns and produces content to address their needs. Knowledge of how people search, and what they search for, provides plenty of insight to help with this goal.

Attracting authority links

Link building is a valuable tactic for SEOs and one which content marketers should be aware of.

They don’t necessarily have to actively build links; they can attract links by creating content that people want to link to, and promoting it to relevant publications and channels.

Indeed, a 2016 link building study found that content based link building was by far the most effective method.

Proving the value of content

There are plenty of content marketing metrics to look at, and the SEO value of content should be part of the measurement applied to content efforts.

As Kevin Gibbons says, it can help to secure budget:

“Knowing the role SEO can play helps to prove the value of content. If you can forecast and report that your content will generate a monetary value in terms of organic traffic and revenue, this is when people can start to scale their investment towards building valuable content assets.”

There are several SEO-related metrics to look at – the organic traffic (and any related revenue) delivered by the content you produce, the links it attracts, and the success in securing organic search positions.

What do SEOs need to know about content marketing?

These are the tactics and skills normally associated with content production which are becoming ever more valuable to SEOs.

Many of these tactics are interchangeable. For example, a focus on the target audience is essential for both SEO and content marketing.

The importance of storytelling

Content creation requires a degree of creativity, which can also be valuable from an SEO perspective.

As Kevin Gibbons explains:

“The biggest thing SEO can learn from content marketing, in my opinion is around the importance of storytelling.

It’s vital that you get your message across, providing the best experience in the format that resonates with your target audience. SEOs can often be guilty of sticking to the tried and trusted campaigns that have worked in the past; great content marketers realise that it’s not about what we think, it’s about your audience.

Do your persona analysis, speak to your customers, find out what they really want to see, and have a less-is-more approach towards driving engagement through doing the best job possible to tell your story.”

A focus on the audience/customer

An effective content strategy needs to address the needs of your target customer, and should also align with business goals. It’s not just about attracting traffic, rather it should aim to attract the right kind of customer.

If you address the needs of your target customer through content, the viewers of this content are likely to be your target audience.

For example, retailers can use content such as how-to guides to attract potential customers. So, Repair Clinic produces useful guides on appliance repair. This is useful content which also ties in closely with, and therefore helps to promote, its products.

It’s great for SEO too, as it helps them to target searchers with appliance-related problems, its target audience.

There are SEO tools and techniques which can help to answer these questions, but a broader understanding of the target customer can be gained by using a wide range of information.

This includes customer surveys and reviews, information from customer service interactions, and much more.

As Dan Brotzel says, this process requires creativity:

“Content marketers need to apply editorial initiative and imagination to generating ideas for content that can both address users’ needs and business requirements. They need to find ways to answer the question: What kind of content do our users care about? What counts as a good idea for them? What kind of ideas and content can we credibly produce from within our niche? How can we use content to support our goals?”

Importance of quality content

The old SEO technique of churning out content for the sake of putting target keywords on a page is no longer effective.

Algorithm updates have forced SEOs to think more about the quality of the content they produce, and this is also the focus of effective content marketing.

As Dan Brotzel says, quality is what users want:

“The great thing about the evolution of SEO is that it is pushing content ever closer to the one criterion that users are ever likely to care about: quality. Yes, you want your content to surface high in results (and to be accessible and scannable, come to that), but there’s no point being easy to find and consume if what you’re offering users isn’t on reflection actually worth finding.”

Quality is, of course, a very subjective term, and is ultimately something for the end user to judge, but the aim should be to produce content that is valuable.

This can be measured to a certain extent in the way that users interact with it (on-page behaviour, actions taken after reading etc) but also in the way that search engines rank content.

Quality can also be a factor in search rankings. For example, if the content is answering the question that the searcher typed into Google, this helps it rank higher.

This is the ‘long click’ (as explained here by Bill Slawski). It’s similar to – but not the same as – bounce rates.

It’s essentially a measure of how long a user spends on a page before returning to the search results page. If they take time, or don’t return to search results, it tells Google that the content has satisfied the searcher, and is therefore relevant to the search query.
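
The long-click idea can be sketched in code. This is a hypothetical illustration (Google’s actual thresholds and signals are not public): classify a click as ‘long’ when the user dwells past some threshold, or never returns to the results page at all.

```python
from datetime import datetime

def is_long_click(click_time, return_time, threshold_seconds=30):
    """Classify a search click as a 'long click' if the user stayed on the
    result page longer than the threshold, or never returned to the SERP.
    Timestamps are ISO-8601 strings; None means the user never came back.
    The 30-second threshold is an arbitrary assumption for illustration."""
    if return_time is None:
        return True  # never returned to the SERP: the strongest satisfaction signal
    clicked = datetime.fromisoformat(click_time)
    returned = datetime.fromisoformat(return_time)
    dwell = (returned - clicked).total_seconds()
    return dwell >= threshold_seconds

# A quick bounce back to the results page is a 'short click'
print(is_long_click("2016-07-08T10:00:00", "2016-07-08T10:00:05"))  # False
print(is_long_click("2016-07-08T10:00:00", None))                   # True
```

In aggregate, pages that keep earning long clicks for a query are the ones that satisfy it.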

Over time, quality content which achieves this will rank well. It will also attract links and social shares, all of which help in terms of SEO.

This is why a focus on evergreen content can be a great tactic for content marketers and SEOs alike.

For example, a post on SEO basics (since updated) from 2014 has delivered traffic over a long period of time. There’s an initial spike after publication, but the traffic didn’t drop off totally; it continued to deliver visitors to this site. In fact it attracted more than 25,000 pageviews last month, more than two years after publication.

The reason is that this is useful content for searchers, and this has helped the article top Google for the term ‘SEO basics’ for some time. This ranking then helps to deliver more visits, attract links, and so on. It’s a virtuous circle.

In summary

The reality is that no marketing discipline can exist in a vacuum. They all rely on skills and techniques normally associated with other disciplines.

Email marketers need some content skills to make their subject lines and email copy more effective, ecommerce sites rely on SEO techniques to attract customers, and so on.

For content marketing and SEO, it’s very important for practitioners to understand and value the tactics and skills of each discipline.

In a nutshell, SEO requires good content to be really effective, while content producers need to use SEO to help with content planning, and to ensure that the content they work hard on can be found by their target audience.

Is Google AMP a ranking signal?

Search Engine Watch - Wed, 07/06/2016 - 5:16am

So I’ve been working my backside off trying to implement Google’s Accelerated Mobile Pages, with limited success and bucket-loads of frustration, and I’ve come to the point now where I have to ask… is it all really worth it?

Before we get to the wider picture of why AMP is so important, and why ultimately you probably do need to implement the thing, let’s first answer the simple question stated in the headline…

Is Google AMP a ranking signal?

No.

Gary Illyes, Webmaster Trends Analyst at Google, stated during his SEJ Summit Chicago appearance that, “Currently, AMP is not a mobile ranking factor.”

But of course you can pull that “Currently” apart as much as you like, and read into it a high likelihood that Google will probably use it as a direct ranking factor one day soon.

As Danielle Antosz from SEJ reports, Illyes further pledged that Google will be expanding AMP verticals to include Google News, Google Now, Play Newsstand, Now On Tap, and in the near future it will likely expand to product pages for ecommerce sites like Amazon.

Which brings me to my next question…

But does it actually matter whether AMP is a ranking signal or not?

Well… (again)…  No.

As you will have read countless times in countless search engine related news sites (well, 3 or 4 anyway), the world of search is a drastically different place to what it was a couple of years ago, and it continues to evolve every day. Describing any element as a ‘traditional ranking signal’ greatly oversimplifies it.

It’s all about context, content and… stop me if you’ve heard this one before… user experience.

And that’s the key thing with AMP: it improves the user experience.

Which brings us neatly to…

Why should I implement Google AMP?

Here’s a super-quick refresher on AMP in case you’re completely lost:

Google’s Accelerated Mobile Pages are an open source initiative which aims to improve the performance of the mobile web by serving stripped-down versions of web pages.

When you search Google for an article on a mobile device, you may have spotted the ‘lightning’ symbol. It indicates that by clicking the result, you will experience a cleaner, faster, streamlined version of the article that loads instantly.

 

According to Kissmetrics, 40% of web users will abandon a page if it takes longer than three seconds to load, and that’s just on a desktop.

With mobile there are masses of other issues that can lead to abandonment – connection faults, poor formatting, non-mobile optimised content, badly placed links – so you can imagine why, if now more than 50% of searches take place on mobile, Google would want to improve the mobile web. It’s difficult to monetise a crappy experience.

According to Gary Illyes, AMP pages load four times faster than average, 90% of publishers are seeing higher CTRs, 80% of publishers are getting more views, and the majority of publishers are seeing higher eCPMs.

So clearly AMP is working. Searchers are clicking through, they’re enjoying the experience, they’re seeking more AMP content to read… Wait. Hang on…

Of course AMP traffic is growing! For almost every content-related search I do on a mobile it’s difficult to get past the attractive, image-heavy carousel of AMP posts at the top of SERPs… of course the CTR is going to be high! It’s basically a stacked deck.

Unfortunately if you’re (to continue the horrible analogy) in the game, you will need to sit at the table.

Implementing AMP is dead easy though right?

Uh…

Let’s just take a look at my Search Console AMP results for my own site right now. I have 258 pages with errors.

And this is after implementing all of Yoast’s recommendations and various plug-ins. Frustratingly I can see my AMP pages actually working, just by adding the /amp/ suffix to my URLs, yet I’m still seeing these errors.
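
As a rough illustration of that /amp/ suffix convention, here’s a small helper that derives the AMP variant of a post URL. The convention is plugin-specific and assumed here for illustration; real sites may use ?amp=1 or a separate subdomain instead.

```python
from urllib.parse import urlsplit, urlunsplit

def amp_url(canonical_url):
    """Derive the AMP variant of a post URL by appending the /amp/ suffix,
    the convention used by some WordPress plugins. Illustrative only;
    check your own site's actual AMP URL scheme."""
    scheme, netloc, path, query, fragment = urlsplit(canonical_url)
    if not path.endswith("/"):
        path += "/"
    return urlunsplit((scheme, netloc, path + "amp/", query, fragment))

print(amp_url("https://example.com/is-google-amp-a-ranking-signal"))
# https://example.com/is-google-amp-a-ranking-signal/amp/
```

Spot-checking a handful of URLs this way is a quick sanity test that the pages exist, even while Search Console is still reporting errors.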

But perhaps my ultimate frustration lies in the fact there doesn’t seem to be much point in implementing AMP if my site doesn’t show up in Google News, as it’s the News carousel that provides all of the AMP content.

We already looked at why SEOs are so slow to implement AMP just a couple of months ago, and revealed that only 23% of SEOs are currently using AMP. It’s reasons such as the above that add to this general malaise.

There’s also the potential loss of revenue to consider…

As Rebecca Sentance reported in May, “Because AMP strips out a lot of the dynamic elements that slow down page loading time, search marketers have to do away with features that they depend on for business, such as comment systems, lead capture forms and other types of pop-up.”

Implementing AMP is obviously good for user experience, but of course Google is primarily a business…

Google launched three ad formats for AMP last month, intended to win mobile display advertising share away from Facebook and other competitors.

AMP could be read as Google’s sly way of making you remove all your own ads in order to conform to its faster mobile web, then encouraging you to join its own ad exchange platform DoubleClick because you need to generate revenue somehow.

But then all of this could become horribly unstuck when every mobile web user on the planet decides they’ve truly had enough and downloads an ad-blocker.

Tl;dr: yeah you should probably implement AMP, Google has you right where it wants you anyway.

Or does it?

Neofeudal Web Publishing Best Practices Guide

SEO Book.com - Mon, 06/20/2016 - 1:44pm

At the abstract level, if many people believe in something then it will grow.

The opposite is also true.

And in a limitless, virtual world, you cannot see what is not there.

The Yahoo Directory

Before I got into search, the Yahoo! Directory was so important to the field of search that there were entire sessions at SES conferences on how to get listed, & people would even recommend using #1AAA-widgets.com styled domains to alphaspam listings to the top of the category.

The alphaspam technique was a carry-over from yellow page directories - many of which have since gone through bankruptcy as attention & advertising shifted to the web.

Go to visit the Yahoo! Directory today and you get either a server error, a security certificate warning, or a redirect to aabacosmallbusiness.com.

Poof.

It's gone.

Before the Yahoo! Directory disappeared their quality standards were vastly diminished. As a webmaster who likes to test things, I tried submitting sites of various size and quality to different places. Some sites which would get rejected by some $10 directories were approved in the Yahoo! Directory.

The Yahoo! Directory also had a somewhat weird setting where if you canceled a directory listing in the middle of the term they would often keep it listed for many years to come - for free. After many SEOs became fearful of links the directory saw vastly reduced rates of submissions & many existing listings canceled their subscriptions, thus leaving it as a service without much of a business model.

DMOZ

At one point Google's webmaster guidelines recommended submitting to DMOZ and the Yahoo! Directory, but that recommendation led to many lesser directories sprouting up & every few years Google would play a whack-a-mole game and strip PageRank or stop indexing many of them.

Many have presumed DMOZ was on its last legs many times over the past decade. But on their 18th birthday they did a spiffy new redesign.

Different sections of the site use different color coding and the design looks rather fresh and inviting.

Take a look.

However improved the design is, it is unlikely to reverse this ranking trend.

Lacking Engagement

Why did those rankings decline though? Was it because the sites suck? Or was it because the criteria to rank changed? If the sites were good for many years it is hard to believe the quality of the sites all declined drastically in parallel.

What happened is that as engagement metrics started getting folded in, sites that only point you to other sites became an unneeded step in the conversion funnel, in much the same way that Google scrubbed affiliates from the AdWords ecosystem as unneeded duplication.

What is wrong with the user experience of a general web directory? There isn't any single factor, but a combination of them...

  • The breadth of general directories means their depth must necessarily be limited.
  • General directory category pages ranking in search results is like search results inside search results; it isn't great from the user's perspective.
  • If a user already knows a category well they would likely prefer to visit a destination site rather than a category page.
  • If a user doesn't already know a category, they would prefer to use an information source which prioritizes listing the best results first. The layout of most general web directories is a list of results in alphabetical order rather than best result first.
  • In order to sound authoritative, many directories prefer to use a neutral tone.
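
The alphabetical-ordering complaint is easy to demonstrate: sorting listings by a quality signal instead of by name surfaces the best result first. The listing names and ratings below are made up for illustration.

```python
# Hypothetical directory listings with a user-rating quality signal.
listings = [
    {"name": "AAA Widgets", "rating": 2.1},        # alphaspam-styled name
    {"name": "Midtown Widget Co", "rating": 4.8},
    {"name": "Zenith Widgets", "rating": 4.5},
]

# Classic directory ordering: whoever games the alphabet wins.
alphabetical = sorted(listings, key=lambda s: s["name"].lower())

# What users actually want: the best result first.
best_first = sorted(listings, key=lambda s: s["rating"], reverse=True)

print([s["name"] for s in alphabetical])
print([s["name"] for s in best_first])
```

The alphabetical sort rewards exactly the #1AAA-style naming trick mentioned earlier, while the quality sort makes it worthless.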

If a directory mostly links to lower quality sites Google can choose to either not index it or not trust links from it. And even if a directory generally links to trustworthy sites, Google doesn't need to rank it to extract most of the value from it.

The trend of lower traffic to the top tier general directory sites has happened across the board.

Many years ago Google's remote rater guidelines cited Joeant as a trustworthy directory.

Their traffic chart looks like this.

And the same sort of trend is true for BOTW, Business.com, GoGuides.org, etc.

There is basically nothing a general web directory can do to rank well in Google on a sustainable basis, at least not in the English language.

Even if you list every school in the city of Winnipeg that page can't rank if it isn't indexed & even if it is indexed it won't rank well if your site has a Panda-related ranking issue. There are a couple other issues with such a comprehensive page:

  • each additional listing is more editorial content cost in terms of building the page AND maintaining the page
  • the bigger the page gets the more a user needs something other than alphabetical order as a sort option
  • the more listings there are in a tight category the more the likelihood there will be excessive keyword repetition on the page which could get the page flagged for algorithmic demotion, even if the publisher has no intent to spam. Simply listing things by their name will mean repeating a word like "school" over 100 times on the above linked Winnipeg schools page. If you don't consciously attempt to lower the count a page like that could have the term repeated over 300 times.
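
A quick way to spot the accidental keyword repetition described above is to count term frequencies on the rendered page. This is a rough sketch, not any particular tool's method.

```python
import re
from collections import Counter

def term_counts(page_text, top_n=3):
    """Count word frequencies on a page to flag accidental keyword
    repetition, e.g. 'school' appearing hundreds of times on a
    comprehensive listings page."""
    words = re.findall(r"[a-z']+", page_text.lower())
    return Counter(words).most_common(top_n)

# Simulate a listings page of 120 entries that all contain "School".
page = " ".join(f"Example School {i}" for i in range(120))
print(term_counts(page))  # 'example' and 'school' each appear 120 times
```

If the top term's count is far out of line with the rest of the page, that's the kind of pattern worth rewording even when there was no intent to spam.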

Knock On Effects

In addition to those web directories getting fewer paid submissions, most are likely seeing a rise in link removal requests. Google's "fear first" approach to relevancy has even led them to list DMOZ as an unnatural link source in warning emails to webmasters.

What's more, many people who use automated link clean up tools take the declining traffic charts & low rankings of the sites as proof that the links lack value or quality.

That means anyone who gets hit by a penalty & ends up in warning messages not only ends up with less traffic while penalized, but they also get extra busy work to do while trying to fix whatever the core problem is.

And in many cases fixing the core problem is simply unfeasible without a business model change.

When general web directories are defunded it not only causes many of them to go away, but it also means other related sites and services disappear.

  • Editors of those web directories who were paid to list quality sites for free.
  • Web directory review sites.
  • SEOs, internet marketers & other businesses which listed in those directories

Now perhaps general web directories no longer really add much value to the web & they are largely unneeded.

But there are other things which are disappearing in parallel which were certainly differentiated & valuable, though perhaps not profitable enough to maintain the "relevancy" footprint to compete in a brand-first search ecosystem.

Depth vs Breadth

Unless you are the default search engine (Google) or the default social network everyone is on (Facebook), you can't be all things to all people.

If you want to be differentiated in a way that turns you into a destination you can't compete on a similar feature set because it is unlikely you will be able to pay as much for traffic-driven partnerships as the biggest players can.

Can niche directories or vertical directories still rank well? Sure, why not.

Sites like Yelp & TripAdvisor have succeeded in part by adding interactive elements which turned them into sought after destinations.

Part of becoming a destination is intentionally going out of their way to *NOT* be neutral platforms. Consider how many times Yelp has been sued by businesses which claimed the sales team did or was going to manipulate the displayed reviews if the business did not buy ads. Users tend to trust those platforms precisely because other users may leave negative reviews & that (usually) offers something better than a neutral and objective editorial tone.

And that user demand for those reviews, of course, is why Google stole reviews from those sorts of sites to try to prop up the Google local places pages.

It was a point of differentiation which was strong enough that people wanted it over Google. So Google tried to neutralize the advantage.

Blogs

The above section is about general directories, but the same concept applies to almost any type of website.

Consider blogs.

A decade ago feed readers were commonplace, bloggers often cross-linked & bloggers largely drove the conversation which bubbled up through mainstream media.

Google Reader killed off RSS feed readers by creating a fast, free & ad-free competitor. Then Google abruptly shut down Google Reader.

Not only do whimsical blogs like Least Helpful or Cute Overload arbitrarily shut down, but people like Chris Pirillo who know tech well suggest blogging is (at least economically) dead.

Many of the people who are quitting are not the dumb, the lazy, and the undifferentiated. Rather many are the wise trend-aware players who are highly differentiated yet find it impossible to make the numbers work:

The conversation started when revenues were down, and I had to carry payroll for a month or two out of my personal account, which I had not had to do since shortly after we started this whole project. We tweaked some things (added an ad or two which we had stripped back for the redesign, reminded people about ad-blockers and their impact on our ability to turn a profit, etc.) and revenue went back up a bit, but for a hot minute, you’ll remember I was like: “Theoretically, if this industry went further into the ground which it most assuredly will, would we want to keep running the site as a vanity project? Probably not! We would just stop doing it.”

In the current market Google can conduct a public relations campaign on a topic like payday loans, have their PR go viral & then if you mention "oh yeah, so Google is funding the creation of doorway pages to promote payday loans" it goes absolutely nowhere, even if you do it DURING THE NEWS CYCLE.

So much of what exists is fake that anything new is evaluated from the perception of suspicion.

While the real (and important) news stories go nowhere & the PR distortions spread virally, the individual blogger ends up feeling a bit soulless if they try to make ends meet:

"The American Mama reached tens of thousands of readers monthly, and under that name I worked with hundreds of big name brands on sponsored campaigns. I am a member of virtually every ‘blog network’ and agency that “connects brands with bloggers”. ... What’s the point of having your own space to write if you’re being paid to sound like you work for a corporation? ... PR Friendly says “For the right price, I will be anyone you want me to be.” ... I’m not saying blogging is dying, but this specific little monster branch of it, sponsored content disguised as horribly written “day in the life” stories about your kids and pets? It can’t possibly last. Do you really want to be stuck on the inside when it crumbles?"

If you can't get your own site to grow enough to matter then maybe it makes sense to contribute to someone else's to get your name out there.

I recently received this unsolicited email:

"Hello! This is Theodore, a writer and chief editor at SomeSiteName.Com I noticed that you are accepting paid reviews online and you will be glad to know that now you can also publish your Sponsored content to SomeSite via me. SomeSite.Com is a leading website which deals in Technology, Social Media, Internet Stuff and Marketing. It was also tagged as Top 10 _____ websites of 2016 by [a popular magazine]. Website Stats- Alexa Rank: [below 700] Google PageRank: 6/10 Monthly Pageviews: 5+ Million Domain Authority: 85+ Price : $500 via PayPal (Once off Payment) Let me know if you are interested and want to feature your website product like nothing! This will not only increase your traffic but increase in overall SEO Score as well. Thanks"

That person was not actually a member of that site's team, but they had found a way to get their content published on it.

In part because that sort of stuff exists, Google tries to minimize the ability for reputation to flow across sites.

The large platforms are so smug, so arrogant, they actually state the following sort of crap in interviews:

"There's a space in the world for art, but that's different from trying to build products at scale. The one thing that does make me a little nervous is a lot of my designer friends are still focused building websites and I'm not sure that's a growth business anymore. If you look at people who are doing interesting work, they tend to be building inside these platforms like Facebook and finding ways to do interesting work in there. For instance, journalists. Instant Articles is a really great way for stories to be told."

Sure you can bust your ass to build up Facebook, but when their business model changes (bye social gaming companies, hello live streaming video) best of luck trying to follow them.

And if you starve during the 7 lean years in between when your business model is once again well aligned with Facebook you can't go back in time to give yourself a meal to un-starve.

Content Farms

Ehow.com has removed *MILLIONS* of pages of content since getting hit by Panda. And yet their ranking chart looks like this:

What is crazy is the above chart actually understates the actual declines, because the shift of search to mobile & increasing prevalence of ads in the search results means estimates of organic search traffic may be overstated significantly compared to a few years prior.

A half-decade ago a bootstrapped eHow competitor named ArticlesBase got some buzz in TechCrunch because they were making about $500,000 a month on about 20 million monthly unique visitors. That business was recently listed on Flippa. They are getting about a half-million unique monthly visitors (off 95%) and about $2,000 a month in revenues (off about 99.6%).

The negative karma with that site (in terms of ability to rank) is so bad that the site owner suggested on Flippa to publish any new content from new authors onto different websites: "its not going to get to 0 as most of the traffic is not google today, but we would suggest to push out the fresh daily incoming content to new sites - thats where the growth is."

Now a person could say "eHow deserves to die" and maybe they are right. BUT one could easily counter that point by noting...

  • the public who owns the shares owns the ongoing losses & many top insiders cashed out long ago
  • Google was getting a VIG on eHow on their ride up & is still collecting one on the way down (along with funding other current parallel projects from the very same people with the very same Google ad network)
  • Demand Media's partner program where they syndicate eHow-like content to newspapers like USA Today keeps growing at 15% to 20% a year (similar process, author, content, business model, etc. ... only a different URL hosting the content)
  • look at this and you'll see how many publishing networks are still building the same sort of content but are cross-marketing across networks of sites. What's more some of the same names are at the new plays. For example, Demand Media's founder was the chairman of an SEO firm bought by Hearst publishing & his wife is on the about us page of Evolve Media's ModernMom.com

The wrappers around the content & masthead logos change, but by and large the people and strategies don't change anywhere near as quickly.

Web Portals & News Sites

As the mainstream media gets more desperate, they are more willing to partner with the likes of Demand Media to get any revenue they can.

You see the reality of this desperation in the stock charts for newspaper companies.

Or how about this chart for Yahoo.com.

It doesn't look particularly bad, especially if you consider that Yahoo has shut down many of their vertical sites.

An underlying flat search traffic chart misses declining publisher CPMs and the shift in click mix away from organic toward paid search channels, as search traffic moves to mobile devices & Google relentlessly increases the size of its search ads. Yahoo may still rank #3 for keyword x, but if that #3 ranking is below the fold on both mobile and desktop devices they might need to rank #1 to get as much traffic as #3 got a couple of years ago.
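
That below-the-fold point can be made concrete with a back-of-the-envelope estimate. The CTR-by-position figures here are purely illustrative, not Google's numbers; real curves vary by query, device and SERP layout.

```python
# Illustrative organic CTR by ranking position (made-up figures).
ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_clicks(monthly_searches, position):
    """Rough monthly organic clicks for a keyword at a given position;
    positions outside the table get a token 2% CTR."""
    return round(monthly_searches * ctr_by_position.get(position, 0.02))

# If ads and SERP features push every result down the page, a #1 ranking
# under the new layout may deliver only what #3 delivered before.
print(estimated_clicks(100_000, 3))  # 10000
print(estimated_clicks(100_000, 1))  # 30000
```

So even with the same rankings, the traffic a position delivers can fall sharply as the layout changes around it.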

Yahoo! was once the leading search portal & now they are worth about 1/5th of LinkedIn (after backing out their equity stakes in Alibaba and Yahoo! Japan).

The chart is roughly flat, but the company is up for a fire sale because organic search result displacement & the declining value of traffic have moved quicker than Yahoo! can fire employees, & none of their Hail Mary passes worked.

Ms. Mayer compared the [Polyvore] deal to Google’s acquisition of YouTube in 2006, arguing that “you can never overpay” for a company with the potential to land a huge new base of users.
...
“Her core mistake was this belief that she could reinvent Yahoo,” says a former senior executive who left the company last year. “There was an element of her being a true believer when everyone else had stopped.”

The same line of thinking was used to justify the Tumblr acquisition, which has gone nowhere fast - just like their 50+ other acquisitions.

Yahoo! shut down many verticals, fired many workers, sold off some real estate & is exploring selling their patents.

Chewing Up the Value Chain

Smaller devices that are harder to use mean the gateways have to try to add more features to maintain relevance.

As they add features, publishers get displaced:

The Web will only expand into more aspects of our lives. It will continue to change every industry, every company, and every life on the planet. The Web we build today will be the foundation for generations to come. It’s crucial we get this right. Do we want the experiences of the next billion Web users to be defined by open values of transparency and choice, or the siloed and opaque convenience of the walled garden giants dominating today?

And if converting on mobile is hard or inconvenient, many people will shift to the defaults they know & trust, thus choosing to buy on Amazon rather than from a smaller ecommerce website.

One of my friends who was in ecommerce for many years stated this ultimately became the problem with his business. People would email him back and forth about the product and related questions, going all the way through the sales process, getting him to answer every concern & recommend each additional related product they needed; then at the end they would ask him to price match Amazon, & if he couldn't they would buy from Amazon. If he had more scale he might have been able to get a better price from suppliers and compete with Amazon on price, but his largest competitor, who took out warehouse space, filed for bankruptcy because they were unable to make the interest payments on their loans.

We live in a society which over-values ease-of-use & scale while under-valuing expertise.

Look at how much consolidation there has been in the travel market since Google Flights launched & Google went pay-to-play with hotel search.

Expedia owns Travelocity & Orbitz. Priceline owns Kayak. Yahoo! Travel simply disappeared. TripAdvisor is strong, but even they were once a part of Expedia.

How different are the remaining OTAs? One could easily argue they are less differentiated from each other than this article about the history of the travel industry makes Skift look against other travel-related news sites.

How many markets are strong enough to support the creation of that sort of featured editorial content?

Not many.

And most companies which can create that sort of in-depth content leverage the higher margins on shallower & cheaper content to pay for that highly differentiated featured content creation.

But if the knowledge graph and new search features simply displace the result set, the number of people who will be able to afford creating that in-depth featured content is only further diminished.

Over 5 years ago Bing's Stefan Weitz mentioned they wanted to move search from a web of nouns to a web of verbs & to "look at the web as a digital representation of the physical world." Some platforms are more inclusive than Google is & decide to partner rather than displace, but Bing's partnership with Yelp or TripAdvisor doesn't help you if you are a direct competitor of Yelp or TripAdvisor, or if your business was heavily reliant on one of these other channels & you fall out of favor with them.

Chewing Up Real Estate

There are so many enhanced result features in the search results it is hard to even attempt to make an exhaustive list.

As search portals rush to add features they also rush to grab real estate & outright displace the concept of "10 blue links."

There has perhaps been nothing which captured the sentiment better than this tweet:

.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014

The following is paraphrased, but captures the intent to displace the value chain & the role of publishers.

"the journeys of users. their desire to be taken and sort of led and encouraged to proceed, especially on mobile devices (but I wouldn't say only on mobile devices).
...
there are a lot of users who are happy to be provided with encouragement and leads to more and more interesting information and related, grouped in groups, leading lets say from food to restaurants, from restaurants to particular types of restaurants, from particular types of restaurants to locations of those types of restaurants, ordering, reservations.

I'm kind of hungry, and in a few minutes you've either ordered food or booked a table. Or I'm kind of bored, and in a few minutes you've found a book to read or a film to watch, or some other discovery you are interested in." - Andrey Lipattsev

What role do publishers have in the above process? Unpaid data sources used to train algorithms at Facebook & Google?

Individually each of these assistive search feature roll outs may sound compelling, but ultimately they defund publishing.

Looks like Symptom Cards will lead to additional, more-focused searches (& not to third party sites.) #seo pic.twitter.com/vhkz5ZflMJ— Glenn Gabe (@glenngabe) June 20, 2016

Not a "Google Only" Problem

People may think I am unnecessarily harsh toward Google in my views, but this sort of shift is not a Google-only thing. It is something all the large online platforms are doing. I simply give Google more coverage because they have a history of setting standards & moving the market, whereas a player like Yahoo! is acting out of desperation to simply try to stay alive. The market capitalization of the companies reflect this.

Google & Facebook control the ecosystem. Everyone else is just following along.

"digital is eating legacy media, mobile is eating digital, and two companies, Facebook and Google, are eating mobile. ... Since 2011, desktop advertising has fallen by about 10 percent, according to Pew. Meanwhile mobile advertising has grown by a factor of 30 ... Facebook and Google, control half of net mobile ad revenue." - Derek Thompson

The same sort of behavior is happening in China, where Google & Facebook are prohibited from competing.

As publishers get displaced and defunded online platforms can literally buy the media: “There’s very little downside. Even if we lose money it won’t be material,” Alibaba's Mr. Tsai said. “But the upside [in buying SCMP] is quite interesting.”

The above quote was on Alibaba buying the newspaper of record in Hong Kong.

As bad as entire industries becoming token purchases may sound, that is the optimistic view. :D

Facebook's Instant Articles and Google's AMP make a token purchase unnecessary: "I don't think it's any secret that you're going to see a bloodbath in the next 12 months," Vice Media's Shane Smith said, referring to digital media and broadcast TV. "Facebook has bought two-thirds of the media companies out there without spending a dime."

Those services can dictate what gets exposure, how it is monetized, and then adjust the exposure and revenue sharing over time to keep partners desperate & keep them hooked.

“If Thiel and Nick Denton were just a couple of rich guys fighting over a 1st Amendment edge case, it wouldn't be very interesting. But Silicon Valley has unprecedented, monopolistic power over the future of journalism. So much power that its moral philosophy matters.” - Nate Silver

Give them just enough (false) hope to stay partnered.

All the while track user data more granularly & run AI against it to disintermediate & devalue partners.

TV networks are aware of the risks of disintermediation and view Netflix with more suspicion than informed SEOs view Google:

for all the original shows Netflix has underwritten, it remains dependent on the very networks that fear its potential to destroy their longtime business model in the way that internet competitors undermined the newspaper and music industries. Now that so many entertainment companies see it as an existential threat, the question is whether Netflix can continue to thrive in the new TV universe that it has brought into being.
...
“ ‘Breaking Bad’ was 10 times more popular once it started streaming on Netflix.” - Michael Nathanson
...
the networks couldn’t walk away from the company either. Many of them needed licensing fees from Netflix to make up for the revenue they were losing as traditional viewership shrank.

And just like Netflix, Facebook will move into original content production.

The Wiki

Wikipedia is certainly imperfect, but it is also a large part of why other directories have gone away. It is basically a directory tied to an encyclopedia which is free and easy to syndicate.

Every large search & discovery platform has an incentive for Wikipedia to be as expansive as possible.

The bigger Wikipedia gets, the more potential answers and features can be sourced from it. More knowledge graph, more instant answers, more organic result displacement, more time on site, more ad clicks.

Even if a knowledge graph listing is wrong, the error doesn't harm the search service syndicating the content unless people make a big deal of it. But if that happens, people will give feedback on how to fix the error & that becomes a PR lead-in to the narrative of how quickly search is improving and evolving.

"Wikipedia used to instruct its authors to check if content could be dis-intermediated by a simple rewrite, as part of the criteria for whether an article should be added to wikipedia. There are many rascals on the Internets; none deserving of respect." - John Andrews

Sergey Brin donates to fund the expansion of Wikipedia. Wikipedia rewrites more webmaster content. Google has more knowledge graph grist and rich answers to further displace publishers.

I recently saw the new gray desktop search results Google is testing. When those appear, the knowledge graph appears inline with the regular search results & even on my huge monitor the organic result set is below the fold.

The problem with that is if your brand name is the same as a brand name in the knowledge graph & you are not the dominant interpretation, then you are below the fold on all devices for your core brand UNLESS you pay Google for every single click.

How much should a brand like The Book of Life pay Google for being a roadblock? What sort of tax is appropriate & reasonable? How high will you bid in a casino where the house readjusts the shuffle & deal order in the middle of the hand?

I recently did a search on Bing & inside their organic search results they linked to a Mahalo-like page called Bing Knows. I guess this is a feature in China, but it could certainly spread to other markets.

If they partnered with an eBay or Amazon.com and put a "buy now" button in the search results they'd have just about completely closed the loop there.

Broad Commodification

The reason I started this article with directories is that their role is to link to sites. They are categorized collections of links which have been heavily commodified & devalued to the point that they are rendered unnecessary and viewed adversely by much of the SEO market (even the ones with decent editorial standards).

Just like links got devalued, so did domain names.

And, as mentioned above in the parts about blogging, content farms, web portals & news sites ... the same trend is happening to almost every type of content.

Online ad revenues are still growing quickly, but they are not flowing through to old media & many former leading bloggers consider blogging dead.

Big platform players like Google and Facebook broaden cross-device user tracking to create new relevancy signals and extract most of the value created by publishers. The more information the platform owns, the more of a starving artist the partners become.

As partners become more desperate, they overvalue growth (just like Yahoo! with Polyvore):

"It's the golden age right now," [Thrillist CEO Ben Lerer] said. "If you're a digital publisher, you have every big TV company calling you. When I look at media brands, if a media brand disappeared tomorrow, would I notice?" he said. "And there are a bunch of brands that have scale, and maybe a lot of money raised, and maybe this and that, but, actually, I might not know for a year. There's so many brands like that. Like, what does it really stand for? Why does it exist?"

Disruption is not a strategy, but the whole point of accelerating it & pushing it (without an adequate plan for "what's next") is to re-establish feudal lords.

The web is a virtual land where the commodity which matters most is attention. If you go back in time, lords maintained wealth & control through extracting rents.

A few years ago a quote like the following one may have sounded bizarre or out of place:

These are the people who guard the company’s status as what ranking team head Amit Singhal often sees characterised as “the biggest kingmaker on this Earth.”

But if you view it through some historical context it isn't hard to understand:

"The nobles still had the power to write the law, and in a series of moves that took place in different countries at different times, they taxed the bazaar, broke up the guilds, outlawed local currencies, and bestowed monopoly charters on their favorite merchants. ... It was never really about efficiency anyway; industrialization was about restoring the power of those at the top by minimizing the value and price of human laborers." - Douglas Rushkoff

Google funding LendUp & ranking their doorway pages while hitting the rest of the industry is Google bestowing "monopoly charters on their favorite merchants."

Headwinds

The issue is not that the value of anything drops to zero, but rather that a combined set of factors shrinks the size of the market which can be profitably served. Each of these factors eats away at margins...

  • lower CPMs
  • the rise of ad blockers (funded largely by some big ad networks paying to allow their own ads through while blocking competing ad networks)
  • rise of programmatic ads (which shift advertiser budget away from publisher to various forms of management)
  • larger ad sizes: "Based on early testing, some advertisers have reported increases in clickthrough rates of up to 20% compared to current text ads."
  • increase of vertical search results in Google & more ads + self-hosted content in Facebook's feed
  • shift of search audience to mobile devices which have no screen real estate for organic search results and lower cost per click (there's a reason Google AdSense is publishing tips on making more from mobile)
  • increased algorithmic barrier to entry and longer delay times to rank

The least sexy consultant pitch in the world: "Sure I can probably rank your website, but it will take a year or two, cost you at least $80,000 per year, and you will still be below the fold even if we get to #1 because the paid search ads fill up the first screen of results."

That isn't going to be an appealing marketing message for a new small business with a limited budget.

The Formula

“The open web is pretty broken. ... Railroad, electricity, cable, telephone—all followed this similar pattern toward closedness and monopoly, and government regulated or not, it tends to happen because of the power of network effects and the economies of scale” - Ev Williams.

The above article profiling Ev Williams also states: "An April report from the web-analytics company Parse.ly found that Google and Facebook, just two companies, send more than 80 percent of all traffic to news sites."

The same general trend is happening to almost every form of content - video, news, social, etc.

  • a big platform over-promotes a vertical to speed up buy-in (perhaps even offering above market rates or other forms of compensation to get the flywheel started)
  • other sources join the market without that compensation & then the compensation stream gets yanked
  • displacement of the source by a watered down copy (eHow or Wikipedia styled rewrite), or some zero-cost licensing arrangement (Facebook Instant Articles, Google AMP, syndicating Wikipedia rewrites)
  • strategic defunding of the content source
  • promise of future gains causing desperate publishers to lean harder into Google or Facebook even as they squeeze more water out of the rock.

Hey, sure your traffic is declining & your revenue is declining faster. You are getting squeezed out, but if you trust the primary players responsible for the shift & rely on Instant Articles or Google's AMP, this time will be different.

...or maybe not...

Facts & Opinions

When I saw some Google shills syndicating Google's "you can't copyright facts" pitch without question I cringed, because I knew where that was immediately headed.

A year later the trend was obvious.

The Internet commoditized the distribution of facts. The "news" media responded by pivoting wholesale into opinions and entertainment.— Naval Ravikant (@naval) May 26, 2016

So now we get story pitches where the author tries to collect a few quote sources to match the narrative already in their head. Surely this has gone on for a long time, but it has rarely been so transparently obvious and cringeworthy as it is today.

How modern journalism works pic.twitter.com/i2CRnwAWZy— Nick Cohen (@NickCohen4) June 15, 2016

And if you stray too far from facts into opinions & are successful, don't be surprised if you end up on the receiving end of proxy lawsuits:

Can we talk about how strange it is for a group of Silicon Valley startup mentors to embrace secret proxy litigation as a business tactic? To suddenly get sanctimonious about what is published on the internet and called News? To shame another internet company for not following ‘the norms’ of a legacy industry? The hypocrisy is mind bending.

The desperation is so bad that news sites don't even attempt to hide it. And part of what is driving that is bot-driven content further eroding margins on legitimate publishing. Google not only ranks those advertorials, but they also promote some of the auto-generated articles, which read like:

As many as 1 analysts, the annual sales target for company name, Inc. (NYSE:ticker) stands at $45.13 and the median is $45.13 for the period closed 3.

The bearish target on sales is $45.13 and the bullish estimate is $45.13, yielding a standard deviation of 1.276%.

Not more than 1 investment entities have updated sales projections on upside over the last week while 1 have downgraded their previously provided sales targets. The estimates highlight a net change of 0% over the last 1 weeks period.

Sales estimated amount is a foremost parameter in judging a firm’s performance. Nearly 1 analysts have revised sales number on the upside in last one month and 1 have lowered their targets. It demonstrates a net cumulative change of 0% in targets against sales forecasts which were given a month ago.

In latest quarterly period, 1 have revised targeted sales on upside and 1 have decreased their projections. It demonstrates change of 4.898%.

I changed a few words in each sentence of that quote to make it harder to find the source as I wasn't trying to out them specifically. But the auto-generated content was ranked by Google & monetized via inline Google AdSense ads promoting the best marijuana stocks to invest in and warning of a pending 80% stock market crash coming soon this year.

Hey at least it isn't a TOTALLY fake story!

Publishers get the message loud and clear. Tronc wants to ramp up on AI-driven video content at scale:

"There's all these really new, fun features we're going to be able to do with artificial intelligence and content to make videos faster," Ferro told interviewer Andrew Ross Sorkin. "Right now, we're doing a couple hundred videos a day; we think we should be doing 2,000 videos a day."

All is well, news & information are just externalities to a search engine ad network.

No big deal.

"With newspapers dying, I worry about the future of the republic. We don’t know yet what’s going to replace them, but we do already know it’s going to be bad." - Charlie Munger

Build a Brand

Build a brand, that way you are protected from the rapacious tech platforms.

Or so the thinking goes.

But that leads back to the above image where The Book of Life is below the fold on their own branded search query because there is another interpretation Google feels is more dominant.

The big problem with "brand as solution" is you not only have to pay to build a brand, but then you have to pay to protect it.

And the number of search "innovations" to try to siphon off some late funnel branded traffic and move it back up the funnel to competitors (to force the brand to pay again for their own brand to try to displace the "innovations") will only continue growing.

And at any point in time, if Disney makes a movie using your brand name as the name of the movie, you are irrelevant and in need of a rebrand overnight, unless you commit to paying Google for your brand forever.

Having an offline location can be a point of strength and a point of differentiation. But it can also be a reason for Google to re-route user traffic through more Google owned & controlled pages.

Further, most large US offline retailers are doing horribly.

Almost all the offline growth is in stores selling dirt cheap unbranded imported stuff like Dollar General or Family Dollar & stores like Ross and TJ Maxx which sell branded item remainders at discount prices. And as Amazon gets more efficient by the day, other competitors with high cost structures & less efficient operations grow relatively less efficient over time.

The Wall Street Journal recently published an article about a rift between Wal-Mart & Procter & Gamble: “They sell crappy private label, so you buy Swiffer with a crappy refill,” said one of the people familiar with the product changes. “And then you don’t buy again.”

In trying to drive sales growth, P&G is resorting to some Yahoo!-like desperate measures, including meetings where "Some workers donned gladiator-like armor for the occasion."

Riding on other platforms or partners carries the same sorts of risks as trusting Google or Facebook too much.

Even owning a strong brand name and offline distribution does not guarantee success. Sears already spun out their real estate & they are looking to sell the Kenmore & Craftsman brands.

The big difference between the web and offline platforms is that the marginal cost of information is zero, so web platforms can quickly & cheaply spread to adjacent markets in ways that physically constrained offline players can not. And some of the big web platforms have far more data on people than governments do. It is worth noting one of the things that came out of the Snowden leaks is that spooks were leveraging Google's DoubleClick cookies for tracking users.

As desperate stores/platforms see slowing growth they squeeze for margins and seek to accelerate growth any way possible. Chasing growth ultimately leads to the promise of what differentiates them disappearing. I recently bought some "hand crafted" soaps on Etsy, which shipped from Shenzhen.

I am not sure how that impacts other artisanal soap sellers, but it makes me less likely to buy that sort of product from Etsy again.

And for as much as I like shopping on Amazon, I was uninspired when a seller recently sent me this.

Amazon might usually be great for buyers & great for affiliates, but hearing how they are quickly expanding their private label offerings wouldn't be welcome news for a merchant who is overly-reliant on them for sales in any of those categories.

The above sort of activity is what is going on in the real world even among brands which are not under attack.

The domestic economic landscape is getting quite ugly:

America’s economy today is in some respects more concentrated than it was during the Gilded Age, whose excesses prompted the Progressive Era reforms the FTC exemplifies. In sector after sector, from semiconductors and cable providers to eyeglass manufacturers and hotels, a handful of companies dominate. These giants use their market power to hike prices for consumers and suppress wages for workers, worsening inequality. Consolidation also appears to be driving a dramatic decline in entrepreneurship, closing off opportunity and suppressing growth. Concentration of economic power, in turn, tends to concentrate political power, which incumbents use to sway policies in their favor, further entrenching their dominance.

And the local abusive tech monopolies are now firmly promoting the TPP: "make it more difficult for TPP countries to block Internet sites" = countries should have less influence over the web than individual Facebook or Google engineers do.

In a land of algorithmic false positives that cause personal meltdowns and organizational breakdowns there is nothing wrong at all with that!

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

Luckily the world can depend on China to drive growth and it will save us.

Or maybe there is a small problem with that line of thinking...

Beijing’s intellectual property regulator has ordered Apple Inc. to stop sales of the iPhone 6 and iPhone 6 Plus in the city, ruling that the design is too similar to a Chinese phone, in another setback for the company in a key overseas market.

Can any experts chime in on this?

Let's see...

First, there is Wal-Mart selling off their Chinese e-commerce operation to the #2 Chinese ecommerce company & then there's this from the top Chinese ecommerce company:

“The problem is the fake products today are of better quality and better price than the real names. They are exactly the same factories, exactly the same raw materials but they do not use the names.” - Alibaba's Jack Ma

Reinventing SEO

SEO Book.com - Thu, 05/12/2016 - 3:44am
Back in the Day...

If you are new to SEO it is hard to appreciate how easy SEO was, say, 6 to 8 years ago.

Almost everything worked quickly, cheaply, and predictably.

Go back a few years earlier and you could rank a site without even looking at it. :D

Links, links, links.

Meritocracy to Something Different

Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.

These days most of the best minds in SEO don't blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.

Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.

Yet if you share something which causes search engineers to change their relevancy algorithms in response, the half-life of that algorithm shift can be years or maybe even decades.
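The contrast between those two half-lives can be sketched with simple exponential decay (the 2-hour share half-life and 2-year algorithm half-life below are illustrative assumptions, not measured values):

```python
import math  # not strictly needed; 0.5 ** x works with plain floats

def remaining(initial: float, half_life_hours: float, elapsed_hours: float) -> float:
    """Attention left after exponential decay with the given half-life."""
    return initial * 0.5 ** (elapsed_hours / half_life_hours)

# A tweet with a ~2 hour half-life is nearly worthless a day later...
tweet = remaining(1000, 2, 24)            # ~0.24 left of 1,000
# ...while a ~2 year half-life algorithm shift barely decays in a day.
algo = remaining(1000, 2 * 365 * 24, 24)  # ~999 left of 1,000
```

The point of the sketch is only the ratio: effort poured into a fast-decaying channel has to be repeated constantly, while a change you trigger in the algorithms keeps compounding.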

Investing Big

These days breaking in can be much harder. I see some sites, 3 or 4 months old with over 1,000 high quality links, which have clearly invested deep into 6 figures yet appear to be getting about 80 organic search visitors a month.

From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.

Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.

Most of the types of people who have the confidence and knowledge to invest deep into 6 figures on a brand new project aren't creating "how to" SEO information and giving it away free. Doing so would only harm their earnings and lower their competitive advantage.

Derivatives, Amplifications & Omissions

Most of the info created about SEO today is derivative (people who write about SEO but don't practice it) or people overstating the risks and claiming x and y and z don't work, can't work, and will never work.

And then from there you get the derivative amplifications of don't, can't, won't.

And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.

Measuring the Risks

If you are using lagging knowledge from derivative "experts" to drive strategy you are most likely going to lose money.

  • First, if you are investing in conventional wisdom then there is little competitive advantage to that investment.
  • Secondly, as techniques become more widespread and widely advocated Google is more likely to step in and punish those who use those strategies.
  • It is when the strategy is most widely used and seems safest that the risk is at its peak while the rewards are de minimis.

With all the misinformation, how do you find out what works?

Testing

You can pay for good advice. But most people don't want to do that, they'd rather lose. ;)

The other option is to do your own testing. Then when you find an area where conventional wisdom is wrong, invest aggressively.

"To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right." - Jeff Bezos

That doesn't mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn't consider doing. That is how you stand out & differentiate.

But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you're hosed.

False Positives

And, even if you do nothing wrong, if you don't build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.

Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

“How did you go bankrupt?"
Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises

True Positives

A lot of SEMrush charts look like the following

What happened there?

Well, obviously that site stopped ranking.

But why?

You can't be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.

That said, there are constant shifts in the algorithms across regions and across time.

Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested...

He also explained the hole Google has in their Arabic index, where spam is much more effective due to there being little useful content to index and rank & Google modeling their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is also less of a priority because they view evolving with mobile friendly, AMP, etc. as a higher priority. They algorithmically ignore many localized issues & try to clean up some aspects of that manually. But even whoever is winning with the spam stuff at the moment might not only lose due to an algorithm update or manual clean up; once Google has something great to rank there it will eventually win, displacing some of the older spam on a near permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.

Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways it doesn't mean that a site which once ranked

  • deserved to rank
  • will keep on ranking

In fact, sites which don't get a constant stream of effort & investment are more likely to slide than have their rankings sustained.

The above SEMrush chart is for a site which uses the following as their header graphic

When there is literally no competition and the algorithms are weak, something like that can rank.

But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.

Further, a site like that would struggle to get any quality inbound links or shares.

If nobody reads it then nobody will share it.

The content on the page could be Pulitzer prize level writing and few would take it seriously.

With that design, death is certain in many markets.

Many Ways to Become Outmoded

The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.

Excessive keyword repetition, like a footer with the target phrase repeated 100 times.

Excessive focus on monetization to where most visitors quickly bounce back to the search results to click on a different listing.

Ignoring the growing impact of mobile.

Blowing out the content footprint with pagination and tons of lower quality backfill content.

Stale content full of outdated information and broken links.

A lack of investment in new content creation AND promotion.

Aggressive link anchor text combined with low quality links.
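The first of those failure modes, excessive keyword repetition, is easy to check for mechanically. A rough sketch (the function and the idea of using phrase density as a red flag are my own illustration; there is no known Google threshold):

```python
def phrase_density(text: str, phrase: str) -> float:
    """Share of the words in `text` accounted for by repeats of `phrase`."""
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count exact occurrences of the phrase via a sliding window.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words) if words else 0.0

# The footer described above: one phrase repeated 100 times.
footer = "cheap payday loans " * 100
print(phrase_density(footer, "cheap payday loans"))  # 1.0 - pure repetition
```

Anything approaching 1.0 means the block of text is nothing but the phrase; natural prose mentioning a topic lands far lower.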

Investing in Other Channels

The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.

Why is Facebook doing so well? In part because Google did the search equivalent of what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn't want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.

Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.

Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.

People can't explicitly look for you in a differentiated way unless they are already aware you exist.

Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However, if you are selling a product the customer already bought or you are marketing to marketers, there is a good chance such investments will be money wasted while you alienate past customers.

Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads to where you can't scroll down the page enough to have their ad disappear before seeing their ad once again.

If you don't have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.

And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you'll likely stay penalized for a long, long time.

While waiting for an update, you may find you are Waiting for Godot.

Google Rethinking Payday Loans & Doorway Pages?

SEO Book.com - Wed, 05/11/2016 - 1:34pm

Nov 12, 2013 WSJ: Google Ventures Backs LendUp to Rethink Payday Loans

Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”

What sort of strategy is helping to drive that industry transformation?

How about doorway pages.

That is in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.

March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm

Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.

These sorts of doorway pages are still live to this day.

Simply look at the footer area of lendup.com/payday-loans

But the pages existing doesn't mean they rank.

For that let's head over to SEMrush and search for LendUp.com


[Screenshot: SEMrush organic keyword report for LendUp.com]

Hot damn, they rank for about 10,000 "payday" keywords.

And you know their search traffic is only going to increase now that competitors are getting scrubbed from the marketplace.

Today we get journalists acting as conduits for Google's public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.

Today those sorts of stories are literally everywhere.

Tomorrow the story will be over.

And when it is.

Precisely zero journalists will have covered the above contrasting behaviors.

As they weren't in the press release.

Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space. Google will be able to show effectively the same ads for effectively the same service, and by the time the P2P loan bubble pops some of the payday lenders will have followed LendUp's lead in re-branding their offers as being something else in name.

A user comment on Google's announcement blog post gets right to the point...

Are you disgusted by Google's backing of LendUp, which lends money at rates of ~ 395% for short periods of time? Check it out. GV (formerly known as Google Ventures) has an investment in LendUp. They currently hold that position.

Oh, the former CIO and VP of Engineering of Google is the CEO of Zest Finance and Zest Cash. Zest Cash lends at an APR of 390%.

Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.

Don't expect to see a link to this blog post on TechCrunch.

There you'll read some hard-hitting cutting edge tech news like:

Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.

#MomentOfZeroTruth #ZMOT

Update: Kudos to the Google Public Relations team, as it turns out the CFPB is clamping down on payday lenders, so all the positive PR Google got on this front was simply them front running a known regulatory issue in the near future & turning it into a public relations bonanza. Further, absolutely NOBODY (other than the above post) mentioned the doorway page issue, which remains in place to this day & is driving fantastic rankings for their LendUp investment.

The (Hollow) Soul of Technology

SEO Book.com - Mon, 05/09/2016 - 5:01pm
The Daily Obituary

As far as being an investable business goes, news is horrible.

And it is getting worse by the day.

Look at these top performers.

The above chart looks ugly, but in reality it puts an optimistic spin on things...

  • it has survivorship bias
  • the Tribune Company has already gone through bankruptcy
  • the broader stock market is up huge over the past decade after many rounds of quantitative easing and zero (or even negative) interest rate policy
  • the debt carrying costs of the news companies are also artificially low due to the central banking bond market manipulation
  • the Tribune Company recently got a pop on a buy out offer
Selling The Story

Almost all the solutions to the problems faced by the mainstream media are incomplete and ultimately will fail.

That doesn't stop the market from selling magic push button solutions. The worse the fundamentals get, the more incentive (need) there is to sell the dream.

Video

Video will save us.

No it won't.

Video is expensive to do well and almost nobody at any sort of scale on YouTube has an enviable profit margin. Even the successful individuals who are held up as examples of success are being squeezed out, and Google is pushing to make the site more like TV. As they get buy-in from big players they'll further squeeze out the indy players - just like general web search.

Even if TV shifts to the web, along with chunks of the associated ad budget, most of the profits will be kept by Google & ad tech management rather than flowing to publishers.

Some of the recent acquisitions are more about having more scale on an alternative platform or driving offline commerce rather than hoping for online ad revenue growth.

Expand Internationally

The New York Times is cutting back on their operations in Paris.

Spread Across Topics

What impact does it have on Marketwatch's brand if you go there for stocks information and they advise you on weight loss tips?

And, once again, when everyone starts doing that it is no longer a competitive advantage.

There have also been cases where newspapers like The New York Times acquired About.com only to later sell it for a loss. And now even About.com is unbundling itself.

@glenngabe They're doing good work over there. Worth noting that they 301'd all the https://t.co/sF2JMe8lU5 health content to verywell.— Jesse Semchuck (@jessesem) April 26, 2016
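The unbundling mentioned in the tweet rests on site-wide 301 redirects: every old URL on the source host is mapped permanently to the equivalent path on the new host. A minimal sketch of that kind of host-level migration follows; the hostnames and paths are illustrative stand-ins, not the actual mapping used for the verywell move.

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_for(old_url, old_host="www.about.com", new_host="www.verywell.com"):
    """Return a (status, location) pair implementing a host-level 301,
    preserving the original path and query string. Returns None for
    URLs on hosts we don't own."""
    parts = urlsplit(old_url)
    if parts.netloc != old_host:
        return None  # not ours to redirect
    location = urlunsplit(("https", new_host, parts.path, parts.query, ""))
    return (301, location)
```

For example, `redirect_for("https://www.about.com/health/flu")` yields a permanent redirect to the same path on the new host, which is what lets the old pages pass their accumulated link equity along rather than simply 404ing.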

Native Ads

The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.

When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Get Into Affiliate Marketing

It won't scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they'll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate's cookie lasts for a shorter duration.
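The cookie-crowding point can be made concrete with a little expected-value arithmetic. Under last-click attribution, your affiliate cookie only survives until the visitor clicks any competing affiliate link, whichever comes first against the nominal cookie window. If competing clicks arrive roughly at random (modeled here as a Poisson process, which is an illustrative assumption rather than anything from the post), the expected effective cookie life shrinks as the competing click rate grows:

```python
import math

def expected_cookie_life(window_days, competing_clicks_per_day):
    """Expected lifetime of a last-click affiliate cookie with a nominal
    window of `window_days`, when competing affiliate clicks arrive as a
    Poisson process at the given daily rate (so the time to the first
    overwriting click is exponential).

    E[min(X, T)] = (1 - e^(-r*T)) / r  for X ~ Exp(r), T = window."""
    r = competing_clicks_per_day
    if r == 0:
        return window_days  # no competitors: cookie lives the full window
    return (1 - math.exp(-r * window_days)) / r
```

With no competitors a 30-day cookie is worth the full 30 days; at one competing click per day its expected effective life collapses to about a single day, which is the mechanism behind "each new affiliate means every other affiliate's cookie lasts for a shorter duration."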

It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.

“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Charging People to Comment

It won't work, as it undermines the social proof of value the site would otherwise have from having many comments on it.

Meal Delivery Kits

Absurd. And a sign of extreme desperation.

Trust Tech Monopolies

Here is Doug Edwards on Larry Page:

He wondered how Google could become like a better version of the RIAA - not just a mediator of digital music licensing - but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.

If we just give Google or Facebook greater control, they will save us.

No they won't.

You are probably better off selling meal kits.

As time passes, Google and Facebook keep getting a larger share of the pie, growing their rake faster than the pie is growing.

Here is the RIAA's Cary Sherman on Google & Facebook:

Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.

Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: "bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today."

Accelerated Mobile Pages and Instant Articles?

These are not solutions. They are only a further acceleration of the problem.

How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?

If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself with an entity that can arbitrarily shift the terms on you whenever they feel like it.

“The cracks are beginning to show; the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”

Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google's redesigned image search shunted traffic away from the photographers. Google's remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished, & if you don't, good luck negotiating with a monopoly. You'll probably need the EU to see any remedy there.

When something is an embarrassment to Google & can harm their PR, fixing it becomes a priority; otherwise most of the costs of rights management fall on the creative industry & Google will go out of their way to add cost to that process. Facebook is, of course, playing the same game with video freebooting.

Algorithms are not neutral and platforms change what they promote to suit their own needs.

As the platforms aim to expand into new verticals they create new opportunities, but those opportunities are temporal.

Whatever happened to Zynga?

Even Buzzfeed, the current example of success on Facebook, missed their revenue target badly, even as they become more dependent on the Facebook feed.

"One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate." - Ben Thompson

Long after benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI-driven news rewrites in front of users?

Who then will fund journalism?

Dumb it Down

Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media's bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.

.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f— dan barker (@danbarker) February 27, 2014

Now maybe if you dumb it down with celebrity garbage you get quick clicks from other channels and longterm SEO traffic doesn't matter as much.

But if everyone is pumping the same crap into the feed it is hard to stand out. When everyone starts doing it the strategy is no longer a competitive advantage. Further, a business that is algorithmically optimized for short-term clicks is also optimizing for its own longterm irrelevancy.

Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.
...
Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed that these big names should have been able to support themselves, garner their own large audiences, and shouldn’t have relied on placement on the home page to achieve large audiences. As a result, they were expected to sink or swim on their own.
...
“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”

That is why Yahoo! ultimately had to shut down almost all their verticals. They were optimized algorithmically for short term wins rather than building things with longterm resonance.

Death by bean counter.

The above also has an incredibly damaging knock on effect on society.

People miss the key news. "What articles got the most views, and thus 'clicks'? Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite." - Karl Denninger

The other issue is that PR is outright displacing journalism. As bad as that is at creating general disinformation, it gets worse when people presume diversity of coverage means a diversity of thought process, a diversity of work, and a diversity of sources. Even people inside the current presidential administration state how horrible this trend is for society:

“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” ... “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”

That is basically the government complaining to the press about it being "too easy" to manipulate the press.

Adding Echo to the Echo

Much of what "seems" like an algorithm on the tech platforms is actually a bunch of lowly paid humans pretending to be an algorithm.

When I worked curating Facebook Paper's tech section (remember that?) they told me to just rip off Techmeme https://t.co/LJ6O3coFMy— (@kifleswing) May 3, 2016

This goes back to the problem of the limited diversity in original sources and rise of thin "take" pieces. Stories with an inconvenient truth can get suppressed, but "newsworthy" stories with multiple sources covering them may all use the same biased source.

After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. ... A topic was often blacklisted if it didn’t have at least three traditional news sources covering it

As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.

The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.

You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
- Neil Young, Stringman

Micropayments & Paywalls

It is getting cheap enough that just about anyone can run a paid membership site, but it is quite hard to create something worth paying for on a recurring basis.

There are a few big issues with paywalls:

  • If you have something unique and don't market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
  • If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
  • In most markets worth being in there is going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn't easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
  • And the last big issue is that a paywall is basically counter to all the other business models above that the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad-funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick-win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.
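The churn point in the bullets above is just arithmetic: with a monthly churn rate c, a site of N members loses N*c members a month, so a constant flow of s signups only grows the base while s > N*c, and membership plateaus at s/c. A sketch, with illustrative rates rather than figures from the post:

```python
def steady_state_members(monthly_signups, monthly_churn_rate):
    """Membership plateaus where new signups exactly offset cancellations:
    s = N * c  =>  N = s / c."""
    return monthly_signups / monthly_churn_rate

def simulate(members, monthly_signups, monthly_churn_rate, months):
    """Iterate the monthly update N -> N + s - N*c."""
    for _ in range(months):
        members = members + monthly_signups - members * monthly_churn_rate
    return members
```

At 100 signups a month and 5% monthly churn the site tops out at 2,000 members no matter how long it runs, which is why a paid site has to keep replacing departing members just to stand still.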

“It's only after we've lost everything that we're free to do anything.” ― Chuck Palahniuk, Fight Club
