the frequency a kenny chung blog

SEJ Summit NYC 2015

I was invited to the 2015 SEJSummit in NYC, hosted by Searchmetrics. It was unique in that they advertised zero vendor pitches, just content, and they definitely delivered on that promise. The day was jam-packed with eight speakers talking about a wide variety of topics, ranging from content strategy to user tracking to mobile SEO. Each presentation was built around three takeaways, presented at the beginning and recapped at the end. Below are my notes and live Tweets of the presentations, separated by presentation topic:

“Thinking Outside of the Text Box: 6 Ways to Increase the Life of Your Content” with Kelsey Jones (Search Engine Journal)

We kicked off the day with some fresh content marketing ideas. Kelsey’s presentation mainly focused on ways to leverage existing content and either spin it or repurpose it into different formats or social networks. Some of the tips were things we should all know (guest blog, syndicate content, etc.), but there were a lot of good ideas for other channels. Slideshare, for example, was mentioned a lot as a new way to present existing content to a different audience.

Podcasts surprisingly make up almost 26% of all audio consumed (that’s across all media devices including cars), which is an astounding number. Sometimes, it’s worth altering the format of your information since people prefer different consumption behaviors (e.g. the Oregon DMV offers an audio version of their driving manual). You can also turn long-form written content into short-form podcasts. One example was Frommer’s (known for their travel guides) creating short bite-sized podcast episodes focusing on very specific locations within a country.

For videos, there were also a few ideas. For example, NewEgg creates videos for breaking news stories, which is often quicker than writing them up. This allows them to be first to publish, and then they turn the content into longer, written posts.

Lastly, one metric Kelsey touched upon was that syndicating content elsewhere can lead to a 180% increase in email subscribers. The logic here is that if a new audience sees your content, they’ll seek out the author or company behind it (assuming the content is worthwhile).

“The Social Future” with Peter Shankman (Geek Factory Inc.)

Peter Shankman’s presentation was more free-form, with no deck. It was focused on customer service in the digital world, and if there was one major takeaway, it was that you don’t need to be the best at customer service; you only need to be “several levels above crap.” The point being that most companies are terrible at social media and online service, so you only need to be better than they are.

One recurring theme of the summit was about the usefulness of social media metrics, or lack thereof. Peter postulated that the idea of big brands chasing social media Follows, Friends, Fans and Likes is going away because it doesn’t follow natural human behavior, and those numbers aren’t useful to report on. People want to interact with people they know, and not necessarily with brands. He also mentioned that Yelp was dead (or soon to be) because other social networks are more useful for recommendations. I took issue with this one because, as I wrote about two years ago, Yelp is a social network.

Peter provided his four rules for better customer communications. Rule #1 was to be transparent. It’s inevitable that screwups will happen. Brands should own up to them, rather than trying to sweep them under the rug. Using a real-life example, he said we should aspire to be like Eliot Spitzer (he apologized and is now accepted back in the mainstream) and not to be Anthony Weiner (who tried desperately for damage control). [sic] in my Tweet of Weiner’s name, by the way. A great quote from Peter was that “the best lover is a former hater”, meaning if you can turn someone around, they will become your greatest brand advocate.

Rule #2 is to find out how your audience wants to receive information and deliver it to them that way. He told a story about a company that gave thank you gifts in the form of a coffee table book, but through a simple survey, he helped them figure out that the majority of customers preferred online content. So with that simple change, they not only saved money on printing books, but also increased their rate of donations.

Rule #3 is to focus on brevity. Brands have an average of 2.7 seconds to reach their audience. After that, their attention will be gone. Peter advised anyone involved with corporate communications to take an improv class, as it will positively affect the way they speak and help them get to the point.

Rule #4 is to talk to people even when you don’t need anything. Sometimes it just pays to be top of mind. Peter told us the story of Barry Diller, a former CEO of Paramount during the 70s and 80s. One of his keys to success was to call people just to speak to them and ask if they needed help with anything, rather than just calling when he needed a favor. In turn, under his watch, Paramount became the most successful motion picture house of its time.

“3 Red Hot Social Marketing Hacks To Crush in 2015” with Daniel Morrison (aimClear)

Usually at marketing conferences, only one or two presentations have clearly defined, actionable takeaways. Luckily, SEJSummit had three of them (in a row!). The first was from Daniel Morrison, who was a quant through and through. I took the most notes during his presentation, which focused on leveraging psychographics to enrich your remarketing data/lists.

Daniel outlined three steps for improving your data pool:

  • Inject psychographic data into your audience list
  • Cookie them with retargeting pixels (as first party data)
  • Nurture/convert them with remarketing/RLSA (remarketing lists for search ads)

In order to create the best personas, Daniel recommended layering in active intent filters (the types of people specifically looking to do something, which directly/indirectly ties to a conversion for your brand) as well as taking a dual root approach (creating target segments based on multiple criteria, such as behavior and financial qualifiers).

He also recommended tagging campaign URLs with UTM codes so they can feed into remarketing list rules. He provided three real-world examples of successful campaigns. The first was a tire reseller that increased sales by 22% by bidding higher for users who had previously visited the homepage. The second was a vacation tour company that increased conversion rates by 300% by bidding on broad “gift” terms for people who had previously made a purchase. And the last was a telecom company (with TV, voice and internet services) that decreased CPO by 66% by serving custom ad copy based on the services the users subscribed to.
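The UTM tagging step is mechanical enough to script. A minimal sketch in Python using the standard `utm_source`/`utm_medium`/`utm_campaign` parameter names (the example URL and values are hypothetical):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_url(url, source, medium, campaign):
    """Append standard UTM parameters to a landing-page URL,
    preserving any query string the URL already has."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return urlunparse(parts._replace(query=urlencode(query)))

print(tag_url("https://example.com/tires/", "newsletter", "email", "spring_sale"))
# → https://example.com/tires/?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

Once the tagged URLs are live, the remarketing list rule is just a URL-contains match on the UTM parameter.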

There was also talk of setting frequency caps based on the type of service and the typical life cycle, but I’ll touch upon that at the end of the next recap.

“Harnessing the Awesome Power of Identity-Based PPC Marketing” with Larry Kim (WordStream)

Larry Kim is the founder of WordStream, and I’ve had the fortune of listening in on one of his webinars before. His presentation at SEJSummit was full of hyperbole, but his conviction and expertise were something to behold. According to Larry, identity-based PPC marketing is the biggest evolution in both paid search and email marketing.

Simply put, identity-based marketing (or people-based marketing) is the ability to target specific users by some sort of unique identifier (email address, phone number, Twitter handle, etc.). By targeting groups of curated individuals, you avoid some of the pitfalls of traditional PPC and email marketing efforts. Firstly, there are no inventory constraints as there are with email blasts, and it is harder to unsubscribe because targeted paid ads are just about everywhere (not sure if this is good or bad for the consumer…). Secondly, it makes it easier to increase your potential target pools while maintaining quality, as you can clone your audience lists using tools like Facebook’s lookalike audiences.
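Mechanically, ad platforms that support this kind of targeting generally expect customer identifiers to be normalized and SHA-256 hashed before the list is uploaded. A minimal sketch in Python (the sample addresses are made up):

```python
import hashlib

def normalize_and_hash(email):
    """Trim, lowercase, and SHA-256 hash an email address —
    the preparation step ad platforms typically require before
    a customer list is uploaded for identity-based targeting."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical customer list; differences in case/whitespace
# collapse to the same hash after normalization.
audience = ["  Jane.Doe@Example.com ", "bob@example.com"]
hashed_audience = [normalize_and_hash(e) for e in audience]
```

The platform matches those hashes against its own hashed user records, so raw addresses never change hands.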

Larry also brought up some real-world examples of his. In one instance, he blogged about a topic on a Friday afternoon and didn’t expect much press coverage. So he ran a Twitter campaign aimed at target influencers, and it was picked up by a large publication shortly thereafter. Another example was for a conference at which he was presenting. He created an audience list based on people in the location of the event within the relevant industry.

During the Q&A session, an audience member asked Larry and Daniel about frequency caps. Larry advised that marketers be aggressive with caps, as they are rarely met. And in cases where ad fatigue occurs (which means fewer clicks), the corollary is that the conversion rate for users who do click through actually increases.

“Big Brands, Mobile SEO and You” with John Shehata (Conde Nast)

I briefly worked with John when he was my ABC News client toward the end of my time at Morpheus Media. He’s an opinionated and clever guy, and his presentation was focused on optimizing websites for mobile. In it, he dispelled some misconceptions about mobile best practices. The major one was that responsive design is not always the most mobile-friendly solution for large sites. Google never outright said that responsive is best; they only stated that responsive is better than a strictly desktop experience.

The most common alternative is a dedicated mobile site (e.g. an m-dot subdomain), which leads to all sorts of organizational content issues. The third option is dynamic serving, which returns different HTML/CSS based on the user agent. Its main benefit is page load speed. John stated that 80-90% of site speed issues are on the front end, and now that load time is a mobile ranking factor, it makes the most sense to start there. Another guideline he cited was to aim for a one-second page load for all mobile content above the fold. If optimized rendering can be used to load that content first, it is preferable.
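As a rough illustration, dynamic serving boils down to selecting a template from the user-agent string. A toy Python sketch (the token list and template names are made up; production setups use a maintained device-detection library and must send a `Vary: User-Agent` header so caches keep the variants separate):

```python
def select_template(user_agent):
    """Pick an HTML template via a crude user-agent sniff.
    Both variants live at the same URL, which is what makes
    this 'dynamic serving' rather than a separate m-dot site."""
    mobile_tokens = ("Mobile", "Android", "iPhone")
    if any(token in user_agent for token in mobile_tokens):
        return "mobile.html"
    return "desktop.html"

ua = "Mozilla/5.0 (iPhone; CPU iPhone OS 8_0 like Mac OS X) Mobile/12A365"
print(select_template(ua))  # → mobile.html
```

The response for either template should include `Vary: User-Agent`, which also signals to crawlers that the page changes by device.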

John also mentioned that Google will start penalizing app download interstitials (on-page popups on mobile sites that prompt users to download the site’s app instead). The best practice here is to use banners instead. For the future, it’s not outside the realm of possibility (and may actually be very likely) that Google’s stance will extend to all sorts of mobile interstitials and popups, regardless of intent.

One last useful bit of information was about the purpose of googlebot-mobile. It is not the standard mobile crawler. Instead, it is the crawler that Google uses for “feature phones”, which were the precursor to smartphones (think slide or flip phones with limited internet browsers). The standard Googlebot is the one that crawls for mobile.

“SEO Reporting in the Enterprise: Information is Power” with AJ Mihalic (Ayima)

Admittedly, this is the point of the summit where my attention was split between the presentations and some urgent client work, so my notes became a bit sparse. AJ of Ayima spoke about SEO reporting for enterprise clients. He championed graphical interpretations of data rather than tabular formats. He also recommended using some sort of “EKG” dashboard to monitor site health preemptively, instead of waiting for Google Search Console/Webmaster Tools to identify issues, because by that point the problem will have existed for days, if not weeks. AJ mentioned a few methods of early detection. One is to watch the status codes your website returns: an increase in 404s may reflect a redirect problem. Another is to review a server log of which pages Googlebot is crawling. Your tech team is your friend for all of these things.
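The status-code and crawl checks AJ described can be sketched with a few lines of Python over the web server’s access log (the log lines below are fabricated for illustration; a real script would read the live log file):

```python
import re
from collections import Counter

# Hypothetical combined-log-format lines standing in for a real access log.
LOG_LINES = [
    '66.249.66.1 - - [10/Jun/2015:10:00:01] "GET /products/tires HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Jun/2015:10:00:02] "GET /old-page HTTP/1.1" 404 312 "-" "Googlebot/2.1"',
    '203.0.113.5 - - [10/Jun/2015:10:00:03] "GET /products/tires HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

status_counts = Counter()
googlebot_hits = 0
for line in LOG_LINES:
    m = re.search(r'HTTP/1\.\d" (\d{3})', line)  # pull the status code
    if m:
        status_counts[m.group(1)] += 1
    if "Googlebot" in line:
        googlebot_hits += 1

# A 404 rate spiking above its baseline is the kind of early
# warning an "EKG" dashboard would surface.
error_rate = status_counts["404"] / sum(status_counts.values())
```

Charted over time, the 404 rate and Googlebot hit count give you the heartbeat view, days before the same issues surface in Webmaster Tools.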

AJ also echoed an approach that I’d heard before at the last SEO meetup I attended, which was to optimize for metrics that will get your clients (or their bosses) promoted. Those are likely the most effective KPIs for your agency work.

“Increasing Your Content IQ” with Jordan Koene (Searchmetrics)

Jordan spoke about how to choose content topics, how to leverage competitive insights, and the importance of content recall. On content topics: brands and agencies often have tunnel vision when it comes to keywords. Brands should embrace the image that they’ve cultivated instead of trying to redefine themselves in a way that runs counter to how searchers view them. The websites that brands view as competitors may not be the same ones that are actual competitors. One example was Toyota. Their search competitor for a lot of terms isn’t GM, it’s local Toyota dealerships. Once they added “Official” to all of their site page titles, SEO traffic increased 20%.

Content recall is also important. The topics or themes users associate with your brand are hard to shake, and may not exactly sync up with your goals. For example, eBay has tried to become a marketplace for luxury goods, but users continue to associate the site with a secondhand garage sale image. One random and interesting stat: there are more listings on eBay for used or broken units of any single iPhone model than for all new phones combined.

“Content Marketing: Success by Design” with Eric Enge (Stone Temple Consulting)

Eric’s presentation was mainly about measuring SEO success. Without quantified goals, any content creation is worthless. Eric’s recipe for content marketing success is to build your reputation, grow visibility, grow your audience, and get links.

By measuring your own successes and comparing to your competitors, it becomes easier to identify which areas need improvement in order to overtake them in organic rankings. For example, a site that recently outranked yours may have recently acquired a high authority link. Your goal should be to receive a link of similar equity, if not more. Rankings are incremental; you should be trying to leapfrog over the listing directly above you, rather than aiming for #1 every time. Eric recommended using Open Site Explorer, Majestic SEO and ahrefs for backlink reporting.

Last month, Google broke the status quo by publicly disclosing a part of its top-secret search algorithm. What change could possibly have been so significant that Google would announce it from the rooftops? As it turns out, Google now officially considers a page’s load time in its algorithm.

Matt Cutts even recorded a YouTube video confirming the weight that load time has in Google’s decision-making. Both Cutts’ video and the official blog touted the importance of User Experience: the quicker a page loads, the more usable/useful it is for searchers. It was a simple concept that made sense, and Google exhibited a rare complete transparency in disclosing it.

Fast forward one month. It’s the 30th anniversary of the classic arcade game Pac-Man. As the top search engine is wont to do when nerdy dates cross the Google Calendar, the design team created a custom logo. But Friday, May 21st went down in Google history as the first interactive logo: a fully functional two-player Pac-Man game, complete with sound effects. It added 225 KB to the page weight (a whopping 330% increase) and caused an intro sound to play automatically.

[Image: Google Pac-Man load time and file sizes, according to the Firebug plugin]
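As a quick sanity check on those quoted numbers (assuming the 330% refers to total page weight), the baseline homepage works out to roughly 68 KB:

```python
# If adding 225 KB amounts to a 330% increase, the baseline
# page must have been 225 / 3.3 ≈ 68 KB, putting the doodle-day
# total at about 293 KB.
added_kb = 225
increase = 3.30  # 330% expressed as a multiplier of the baseline
baseline_kb = added_kb / increase
total_kb = baseline_kb + added_kb
print(round(baseline_kb), round(total_kb))  # → 68 293
```

That 68 KB baseline is consistent with how famously lean the Google homepage was at the time.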

Some loved it; others hated it. I personally played it through four or five levels for some lunch break nostalgia, and Google even created a dedicated static page for the applet. However, despite the potential fun, it was very apparent that the homepage took a lot longer than usual to load.

In short, the search engine that has long positioned itself as the sleekest, quickest method to find anything on the web created a cumbersome default homepage to celebrate a video game anniversary. Was this arrogance on the part of Google? If Google didn’t have 64% market share (as of May, according to the Wall Street Journal), would it have sacrificed speed to create some social media bait? I also don’t believe it was an isolated incident. In an attempt to incorporate feature-rich functions, Google has (in the past year) included real-time social streams in its Universal Search, implemented fading effects on the homepage, and redesigned the entire SERP. Google seems to be continually willing to compromise core values (see also Google Censorship) to play the Web 2.0 game.

So was this a hypocritical move by Google? All signs point to yes, but it wasn’t that big of a jump, all things considered.

December 15th, 2009

Looks like the little search engine that could, did:
Bing Cracks 10% Search Engine Market Share

What will this mean for those of us in SEO? Well for one thing, it might be time to start tweaking our natural search optimization techniques to account for the growing presence of Bing. Here’s a good guide for optimizing for Bing.

July 13th, 2009

Link: The Time Has Come To Regulate Search Engine Marketing And SEO

Here’s a quote from the article that speaks volumes to me (with my emphasis).

“Here’s where the parallel to free trade breaks down. There are no perfect paradigms looking at free trade and import/export laws that exactly define or address this challenge.”

And now my thoughts regarding the rest of the article:

1) I believe that the article is most definitely written by an employee of a competing search engine challenging the Google model.

Google is the gold standard of online search, which means they are and have been doing something right. If users didn’t find results to be consistent and relevant, then Google would not be as dominant as it is. This may come off as a common-sense statement, but I think a lot of users just take it for granted that Google exists and is as powerful as it is.

2) It’s true that the Internet marketplace is incredibly saturated. And unlike the real world, where people choose a store based on location, personalized customer service, and visual appeal, the Internet doesn’t work that way. Google will tell you which sites are the most relevant based on what your keywords say you’re looking for. The overlap between Internet and real-world shopping is word-of-mouth. Where the Internet trumps real life is that word-of-mouth travels at lightspeed over the Net. Think about how many times customer service horror stories have made their rounds on the Web. The Internet is both the best tool for PR and its worst enemy.

3) Google is not the be-all and end-all of online commerce. Certain specialty “watchdog” sites that compare products, prices and merchants (the latter two of which Google has a market share) via user reviews are really where experienced buyers will look to first. Seasoned Internet shoppers know how to find the best prices for goods, the best sites for individual product reviews, and ratings for online stores. I feel that the author of the article underestimates the ingenuity of the Internet populace.

4) Wikipedia probably has the best model of collaborative effort on the Web. But how would you apply this paradigm to the search industry? First and foremost, you would need community moderation, the staple of Wikipedia. You would need people willing to spend their time in order to improve results, eliminate biases, and ultimately convey the “truth” behind the SEO smoke and SEM mirrors. I am certain that Google believes that those they employ can do a better job than the combined efforts of the Internet community. And who can blame them? Look how far their trust has brought the company.

5) Google pretty much singlehandedly drives the SEO/SEM industry. The ever-changing and evolving secret algorithm keeps these marketing and optimization companies in business while also helping to prevent abuse.

I interned for a successful search marketing agency, and I can tell you that results can be delivered for new companies without using black-hat tactics. It takes hard work and real insight (which is why so many companies outsource).

And here’s one of the most important lessons I learned while working there: You have to believe in your clients. If you don’t, then your chances of improving PageRank will certainly diminish. On the other hand, if your clients believe in their goods and/or service and your SEO/SEM company informs them of changes they should make to improve both content and User Experience, then you’re already many steps ahead of the competition.

6) Google does not police the Internet. Google polices its own service. The article’s main analogy is flawed in that it doesn’t consider other continents. I suppose comparing Google to a country is more apt. The country imposes its own laws on its citizens the same way Google moderates search results. And there are smaller countries with their own sets of rules for those who don’t wish to become citizens of Google. Google is not a monopoly. Of course, it would be ignorant to state that Google is not a huge factor in online business success, but there is definitely room for improvement. Do you think Microsoft would sink upward of $100 million into Bing had they not done their market research? If Google doesn’t give the people the kind of search engine they want, then there is definitely room for another company to develop one. Bing offers a somewhat fresh search model based on their own laws. But only time will tell how big and powerful Bing country becomes.

7) Google is not a malicious dictatorship. I firmly believe that both profit and user experience are equal drivers of development and innovation for their products. It’s true that Google will pull site listings that are, for lack of a better word, ‘fishy.’ SEO/SEM veterans have warned me about how to avoid angering Google moderators, but have also told me about how Google can be merciful. There exists an appeals system and Google will consider reviewing your infraction. You just have to make a case that you’re above using sneaky methods and that you really deserve a spot on Google.

That last part is so important that I’ll repeat it: You have to make a case that you deserve a spot on Google. The search spiders and Ad buying will only take you so far. User behavior will let Google know which sites people like the most and how accurate meta descriptions and keywords really are.

At the end of the day, it’s the users who have the most power, and not a single search engine.

Rebranding campaigns are a tricky beast. On the one hand, it’s easy to go in with the mindset that you can’t do any worse than your past attempts. But if you want it to really be successful, you put everything you have out there and leave it all in the ring.

There’s not much information yet about the new Microsoft search engine, codenamed Kumo, but what Microsoft has on its plate will be a heavy lift. Can anybody seriously imagine any search engine that could actually rival Google’s dominance?

For the most part, Google has made all the right decisions but it’s their philosophy that got them to where they are today. The CEO at my internship was regaling us with tales about how some people he recently met thought that Google was the Internet. That’s how successful the company has become. They’ve rolled out quality product after quality product and built up the value of their brand and name.

So really, what can Microsoft do?

Microsoft certainly can (and will) throw copious amounts of money at the problem. They’ve enlisted the help of JWT to handle the rebranding campaign of their to-be-named search engine and are going to spend $100M on it. That’s right, $100 Million Dollars.

Personally, I think Google will be Search Engine King for many, many years to come. But I don’t think it’s too far-fetched to think that Microsoft can convert some users, because Microsoft’s greatest strength is its ubiquity. Consider this: every Windows-equipped computer after the launch of Kumo would have the search engine set as the browser homepage and perhaps even have some widget embedded in its desktop or plugin for other software. As I mentioned in my last Microsoft post, new users are such a key demographic. You get computer novices to use Kumo as soon as they unbox their computer, and then suddenly Google will be foreign and Kumo will feel right.

It’ll be interesting to see what direction this rebranding campaign takes.

Link: Microsoft Looks to JWT to Market New Search Engine

Creative Commons License
This work is licensed under a Creative Commons Attribution-No Derivative Works 3.0 United States License.