Online Marketing Strategies for SMBs

Through a convergence of recent events, local small businesses (SMBs) are beginning to flock to online advertising. Businesses and consumers alike are making substantially less use of the Yellow Pages and other print directories to discover local merchants, in favor of more efficient search tools such as Google and Bing. Traditional rating and review directories such as Zagat have been bought and folded into Google's Local platform, and other sites such as Yelp, Foursquare, and mobile apps like Urbanspoon are making it even easier to find qualified leads for restaurants and entertainment venues.

Online marketing, however, is a big topic and requires an understanding of the online landscape to position one's small business appropriately. Perhaps adding to the confusion is the deluge of phone calls SMBs have been receiving from advertising agencies (reportedly 4-5 calls per week on average) offering to help with their own brand of online marketing, some even claiming to be acting on behalf of Google, an exaggeration at best. While these agencies do represent a conduit for local merchants to get online quickly, how do you know whether they are offering the right services for your business, or whether you will see a positive return on ad spend?

I took a little time to review the more prominent SMB agencies and found that, by and large, they fit into one of three categories:

Local SEO (Presence Focus)

These are companies that focus on getting a business listed in Google's and Bing's maps and local search results. You may have noticed recently that if you type in "Pasadena plumbers", you now see a map and a list of local merchants next to it. If you search simply for "plumbers" on your phone, you will see a similarly local-biased result set. This is not to be confused with traditional SEO, in which optimizers acquire links to establish relevancy for specific keywords. Rather, these companies acquire citations and reviews on behalf of the local merchant in the authoritative local directories, and register the business as a local entity directly with Google and Bing.

Search engine optimization companies are notorious for selling their services over paid advertising by claiming that their listings are permanent and free, saving a lot of money in the long run. It is sometimes true that the SEO route can be cheaper than paid advertising, but it is rarely free. A lot of time and effort can go into getting your business listed properly, and staying on top of the search results takes continued effort.

These companies also typically provide website development, design, and social presence services, since these efforts are closely related to the goal of better online visibility and reviews.

Paid Advertising (Sales Focus)

Paid advertising companies essentially buy ads and search placements (pay per click) on behalf of the local merchant, a practice sometimes referred to as lead generation. Most often these companies are purchasing text ads from the search engines, and mostly from Google. In fact, it would not be uncommon for 80% of a local merchant's ad spend to go directly to Google. There is also value to be found in Bing and Facebook, perhaps some specialty niche sites, and increasingly in banner-ad retargeting, if there is enough volume and budget.

For the local merchant, it is worth understanding the ecosystem a little before working with a lead generation company. Google AdWords can be rather complicated, and some practitioners spend years perfecting the art and nuance of managing keywords and maximizing ROI for campaigns. Google does, however, offer a Local platform for SMBs, a simplified and watered-down version of the standard AdWords interface. Local merchants can in fact run their own campaigns and save the premium being paid to agencies on their behalf (sometimes as high as 40%!), but it's not as rosy as it first appears. First, many of the larger agencies get reseller discounts, allowing them to source traffic at a lower cost right off the bat. Second, if you don't have the skills, you will likely pay more and get less qualified traffic than if you had paid the agency. And third, it can take more of your time than you'd think it should, even on the simplified Google Local platform.

These agencies often do offer a website, but it is typically a very simple 1-5 page site, focused on a sales landing page with a prominently displayed 800 number for ad-to-call tracking purposes. Some companies, such as ReachLocal, may even insist on mirroring or taking over your site in order to run their ads effectively.

Lead Incubation (Relationship Focus)

Finally, there is a third type of marketing agency, which works with longer-term relationships that may not convert into a sale immediately. For example, in real estate, B2B sales, and other high-cost transactions, the prospect may first be introduced months before buying. Compare that to the nearly instant transaction of someone looking for a plumber. In these cases, lead incubation experts may use a blend of SEO, paid search, and perhaps even social channels to generate leads, but they take the extra step of staying in contact and following up with those leads periodically, to ensure the prospect is not lost over the course of the longer sales funnel.

Lead incubation is typically achieved using a customer relationship management (CRM) system that tracks relationships and stays in touch via email, following up at strategic opportunities to check in with the prospect and potentially provide more value. In the case of real estate, the CRM may be integrated with the IDX data feed, which provides a listing of houses for sale, so that email alerts can be sent to prospects when a new listing matches their search criteria. This is typically referred to as a drip campaign. Some companies, such as Top Producer for Realtors and Eloqua for B2B sales, provide the tools for sales professionals to orchestrate and manage their own "autoresponder" follow-up emails, while others automate and manage the effort as a service for their clients.

These companies typically provide a backend CRM and/or drip campaign system. There is often a front-end website option or website integration form to facilitate lead and email capture. Frequently, additional information is provided in the form of whitepapers or deeper search query reports to incentivize completion of the form, which enables later follow-up by the CRM/drip system.


As you can see from the descriptions above, these companies and their strategies may vary significantly. How does a local business know which is right for them? Fortunately, a lot of this translates to the real world and may be self-evident with a little analysis. For local merchants considering working with an online marketing agency, be sure to understand the agency's focus and whether it is the correct fit for your business:

Companies whose product or service is generally not differentiated by quality (i.e., a commodity) will tend to be more sales-oriented. For example, if I am the local seller of bicycles or the local locksmith, people need not qualify me, so it's more of a sales game to get the business. For these companies, a paid advertising specialist (lead generation) may be the right fit.

For companies that are much more differentiated by quality and reputation, paid search may not be effective, since customers want to see ratings and reviews first. Examples might include restaurants or doctors. And finally, for real estate agents and B2B sales professionals selling expensive products that typically would not be a cash transaction for a consumer, lead incubation is a must.


At the end of the day, what you are really looking for is the best Return on Ad Spend (ROAS). Your ROAS will vary by (a) having the right strategy to match your business, (b) how much you paid for the advertising, and (c) how effectively it was executed. Fortunately, most small businesses should already have a sense of the right strategy based on what's already working for them in the offline world. It may be tempting to reduce ad spend by doing it yourself, but remember there is not only the opportunity cost of having to learn it all yourself, but also that, if you are new to online marketing, you likely will not execute as well as experts in the field. That's not to say you shouldn't learn about or dabble in it yourself, but those are two counterweights to consider.

No matter how you choose to proceed, just get started! And plan to set aside 10-20% of your marketing budget for testing and exploring new online opportunities such as social or mobile marketing. There are always new opportunities popping up online and invariably, the largest bounties go to those who are early to the party.


Goodbye Flash, Hello Edge!

It is no secret that the iPhone does not support Flash. Steve Jobs went as far as to explicitly rule out support for Flash by name in his famous 2010 open letter, "Thoughts on Flash." And now Adobe has responded by announcing it will no longer develop Flash for mobile platforms. Instead, Adobe is quietly releasing a new product called Edge, which outputs HTML5, CSS, and JavaScript, as a presumed replacement for its popular animation authoring environment.

HTML5 is very powerful and a significant milestone in the evolution of web application interface architecture. It's not just about animation and native support for audio and video. In truth, HTML5 in conjunction with JavaScript can do just about everything Flash could do. The name HTML5 is perhaps a misnomer in this regard: it embodies the standards for a collection of technologies that facilitate audio, video, real-time rendering and animation, local persistence, and much more. But because it is all inherently part of the HTML5 document model (not compiled binary code), accessibility and SEO are no longer the issues they have been in the past. Finally, the design and technology prerogatives need not be in contradiction with one another!
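For instance, embedding video natively is now just a few lines of markup (the file names here are placeholders):

```html
<!-- Native HTML5 video: no Flash plugin required; the browser plays the first source it supports -->
<video controls width="640">
  <source src="intro.mp4" type="video/mp4">
  <source src="intro.webm" type="video/webm">
  <p>Your browser does not support HTML5 video.</p>
</video>
```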

So what exactly is Adobe Edge? The basic concept for developers is quite similar to Flash. The differences are minor; for example, the timeline is now based on elapsed time rather than keyframes, similar to Adobe's After Effects video editing software. Also notable is the use of non-destructive editing: rather than overwriting the original HTML and JavaScript files you start with, Edge creates its own parallel file set to augment them. The available binding events are also a little different and better reflect their HTML5 underpinnings. Overall though, you'd be surprised just how similar the tools are.

Adobe Edge

With Adobe making this change to embrace HTML5, it seems that just about everyone is now on board with HTML5 and JavaScript as the way forward for development of rich internet applications; Microsoft even announced recently that it is deprecating Silverlight. That's on the desktop, anyway. Mobile is a more complex issue, with so much momentum still behind the development of native applications (iOS, Android, etc.). The cost of maintaining separate applications for various devices certainly isn't ideal, however, and truthfully, 80% of those apps could be replicated using HTML5 and JavaScript without much negative impact on user experience, but at a substantially lower cost. It's primarily the sophisticated game and media applications that might not port as well.

Adobe Edge seems well positioned to become the default product for interactive and animation authoring for the web 3.0 applications of the future. The product has done a good job of playing off the strengths and knowledge of the existing Flash platform and hopefully will not alienate the developer base. Adobe has satisfied UI architecture prerogatives by keeping the output artifacts aligned with HTML5 and the document object model, and Edge is going to be the first significant tool to provide an easy authoring solution for what will inevitably be a major new wave of web application innovation. I'm sure it was a painful decision for Adobe to kill its golden goose, but this move should be a positive for everyone and may actually help the company in the long run.

Web User Interface Accessibility

Perhaps we've all heard that you're supposed to avoid using tables and Flash when designing your website, but do you know why? For a good answer, you really need to look deeper than tactical benefits and consider more fundamental questions of accessibility.

A lot of times, the end consumers of our content are not consuming it the way we assume. For example, nearly a third of all visitors to some web applications are now using non-desktop (i.e., mobile) devices to consume the content. There's also the issue of search engines trying to parse out what your content is, which you need to facilitate if you want decent rankings. And there are a number of vision-impaired consumers using screen-reader tools (e.g., JAWS, NVDA, Window-Eyes) to interpret the content and translate it to audio. The US government actually mandates support for these screen readers via Section 508, and a lot of major companies follow it too.

So let's revisit that initial question with this context: can your mobile visitors, search engines, and vision-impaired users access your content if it's all wrapped up in binary Flash or Java files? Perhaps they can access content enclosed in nested HTML tables, but is it going to be a good user experience on a mobile browser?

When you begin to consider these challenges, you're beginning to think about user interface architecture. Or, put another way: how can you ensure that your interface is structured in a way that is flexible and supportive of everyone who accesses the content? Fortunately, the mechanisms by which to facilitate your audience are pretty straightforward.

To build a well architected user interface that maximizes accessibility, consider these few steps:

Document-Native – First and foremost, it is important to move away from interface structure embedded in binary Flash, Flex, Java, or Silverlight components. This is unavoidable with certain media such as video or audio, but when it comes to your navigation and content, you absolutely cannot have it embedded in a binary file, or you will have erected the single biggest barrier possible to machine facilitators such as search engines and screen readers even accessing the content, much less making sense of it. Meanwhile, iOS devices don't even support it. So if you're looking to build dynamic interactive content, you should absolutely use JavaScript and HTML to accomplish those goals, not Flash.

Semantic HTML – HTML5 introduces semantic tagging for navigation, headers, footers, asides, and articles. Proper use of these tags can considerably improve a machine's ability to parse and interpret the content. Augmenting the HTML5 with Semantic Web tags such as RDFa or Microformats further improves machine accessibility, which again is great for search engines and screen readers. It also opens the door to many other great semantic web applications of the future. So beginning to think semantically is a great habit to start today.
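A minimal sketch of what that semantic structure looks like (the content is illustrative):

```html
<header>Site banner and logo</header>
<nav>
  <!-- Machines now know these links are the site menu, not body content -->
  <a href="/">Home</a> <a href="/articles">Articles</a>
</nav>
<article>
  <h1>Article title</h1>
  <p>Main content goes here.</p>
  <aside>Related links or a pull quote</aside>
</article>
<footer>Copyright and contact information</footer>
```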

Unobtrusive JavaScript – The term "unobtrusive" means writing your JavaScript in such a way that if a machine or browser doesn't understand the script, it doesn't cause an error; ideally, you'll provide a graceful fallback for any critical-path functionality. To do this properly, you write your JavaScript a bit differently. Rather than using embedded onclick and onsubmit commands directly on the DOM elements, you write "listener" objects that sit in an external file and listen for events on the DOM that match a specified selector. The beauty of this approach is that if the event listeners are never executed because the machine doesn't understand the script, then nothing happens when the DOM event occurs, and the fallback scenario can execute instead, such as a full-page refresh to the content rather than the pretty AJAX lightbox you preferred. Either way, the user (and the machine) can still access the content.
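As a sketch of the pattern (the lightbox function is a hypothetical placeholder), the link below still works as a plain page load if the script never runs:

```html
<!-- The href is the graceful fallback: without JavaScript, this is an ordinary link -->
<a id="gallery-link" href="/gallery.html">View gallery</a>

<script>
// Unobtrusive: the handler lives apart from the markup, bound via a selector
document.getElementById('gallery-link').addEventListener('click', function (e) {
  e.preventDefault();        // cancel the full page load...
  openGalleryLightbox();     // ...and show the AJAX lightbox instead (hypothetical helper)
});
</script>
```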

Unobtrusive CSS – Similar to keeping the JavaScript external, you'll want to do the same with your CSS. That means forgetting the style attribute exists and focusing instead on class and ID attributes. The reason for doing this is similar to the reason you don't want to use nested tables: if you contain style information in your HTML document, you've mandated certain styles to be rendered regardless of the device, and that's not going to present the content well on a mobile device. What if, instead, you had a single style-free HTML output from your server and dynamically switched which external CSS file is served, based on the user agent requesting the document? You could render the same HTML to appear optimally on a desktop browser, an iPad, a BlackBerry, and so on. It further reduces code bloat and potential confusion for screen readers and search engines.
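One way to sketch this, using CSS media queries rather than the server-side user-agent switching described above (file names are placeholders):

```html
<!-- All presentation lives in external sheets; the HTML itself stays style-free -->
<link rel="stylesheet" href="desktop.css" media="screen and (min-width: 768px)">
<link rel="stylesheet" href="mobile.css" media="screen and (max-width: 767px)">

<!-- Markup carries only hooks (class/id), never inline style attributes -->
<div id="featured-product" class="product-summary">...</div>
```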

AJAX Live Regions – AJAX is the use of JavaScript to dynamically call a web service and get back new content. Users typically recognize AJAX because the page content updates, or a lightbox appears with further navigation steps, without waiting for a full page load. AJAX is a huge step forward for critical-path workflow, but it presents challenges for machine accessibility, since the subsequent content is not present in the original HTML. To overcome this, consider using ARIA (WAI-ARIA specifically) to tag live regions of content and further instruct screen readers and search engines, so they are aware of, and know how to parse, the dynamic AJAX content.
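A minimal live-region sketch might look like this (the id is illustrative):

```html
<!-- aria-live="polite" tells screen readers to announce content that AJAX injects here -->
<div id="search-results" aria-live="polite">
  <!-- Results are inserted by JavaScript after the initial page load -->
</div>
```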

And that's it! Hopefully this provides a decent overview of why it is sometimes necessary to think of the user interface beyond simply what we see on our main computer screen, and beyond the latest technical bells and whistles. By being aware of how the content is being consumed, and the challenges faced by certain users and machines attempting to access it, you can quickly see the need to think a little deeper about how the interface is structured. And by following the few guidelines above, you can significantly improve accessibility for your web application, making your content more accessible to mobile devices, screen readers, and search engines alike.

Claiming Your Online Identity

Defining and managing your identity online can be time consuming but considering how many of our social and professional relationships begin with a Google query, it probably makes sense for all of us to invest a little time pulling it all together and presenting the image to the World that we want displayed, rather than whatever just happens to show up on the Internet.

First, let me make a distinction. In SEO circles, there is a practice called 'reputation management' that generally involves creating numerous pages of external content that will rank for the given brand's related searches. All those external pages are intended to rank below the corporate website but above whatever negative reviews or remarks exist, hopefully pushing any negativity down to the 3rd or 4th page of results, where no one will notice. In other words, it is a focused spamming effort on behalf of said brand, with the goal of manipulating search results. Sometimes brands have no choice and have to engage in this level of online warfare. Pragmatism aside though, this is NOT what I am describing here. Rather, I am talking about a proactive and cooperative effort to help Google identify all of your identities and content online, in return for preferred placement of your content when people search for you.

Google implemented an important new feature, along with the release of Google+, that allows it to recognize social channels and owned content and properly attribute them to the brand or individual. The implementation takes advantage of a semantic tagging attribute (rel) in the HTML5 spec that specifies origination and authorship of content. If you create a Google profile and link your content and social identities using these tags, you can essentially let Google know which content is really you. In a recent presentation, Matt Cutts (Google) called this new ability "author rank", and if you've been following Google's recent talk about quality signals, you know that authenticity of content is a big deal in recent algorithmic updates. In fact, Google is not only giving preferred placement to authenticated content for brand and name searches; in many cases it is even displaying the author's photo, to help set this content apart as authenticated and from a reputable author.

Okay, sounds good so far, right? Now we just need to work on the confusing mess that is our social network presence and tie it all together in some meaningful way. After a bit of homework and practice on my own online identity, I distilled what I feel is a best-practice approach. First, I created a simple website for myself under my namesake URL and created links to all of my social accounts from there; that site is literally my homepage now. Next, I linked all my enlogica articles back to my homepage. And of course, enlogica has a couple of its own social accounts to facilitate outreach and easier content socialization, so it is important to keep the social accounts owned by the blog separate from those attributed to my personal identity. Whereas I link to the blog's social sites in the sidebar of the blog, I keep all my personal social media accounts separate and only link to them from my personal homepage. Once all of that is clear, I created a Google profile page for my personal identity and linked all of the personal identities together. enlogica should probably also do this for its own identity, separate from my own.

Sync Up Your Online Profile

Now let's get a little more specific about how this is actually implemented, and the use of authorship tagging. On my homepage, each of my social account links carries the new attribute rel="me". The homepage has the authority to make such a claim because you have already linked to it from your Google profile. Next, on the blog you are reading now (for example), I placed a small author block at the bottom of each article that links back to my personal homepage with the attribute rel="author". Just as rel="me" helped Google identify my social profiles, rel="author" will help it recognize and authenticate any content I have created.

Since I know this can be a little confusing, here’s the same information in enumerated form:

1. Create a Google profile
1a. Link to your website and all social accounts
1b. Link to any blogs or online magazines that you contribute to
2. On your blog, provide a link on each article page back to your home base site. Include rel="author"
3. On your homepage, link to all of your social accounts. Include rel="me"
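In markup, the steps above come down to a few attributes (the URLs are placeholders):

```html
<!-- On your homepage: claim your social profiles -->
<a rel="me" href="https://twitter.com/yourhandle">My Twitter</a>
<a rel="me" href="https://plus.google.com/your-profile-id">My Google+ profile</a>

<!-- On each blog article: point back to your home base site -->
<a rel="author" href="https://www.example.com/">About the author</a>
```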

Connecting Accounts to Google Profile

The final step is to validate that everything is set up correctly; Google provides a tool for this. Unfortunately, at the time of this writing, the tool is giving me an error for using the rel="me" attributes on the social account links from my homepage. This appears to be a bug, however; even Matt Cutts's blog gets the same error for the same reason.

Anyway, despite that hiccup, if everything was done properly, you will soon start to see your homepage, social accounts, and content begin to dominate the search results for terms related to your name or brand. You may even begin to notice your profile picture next to your content.

There you have it. My homepage, my social accounts, and my blog contributions are all accounted for, properly attributed, and validated. Now, hopefully, with a little time, Google will begin to treat my identities with a positive bias for searches related to my personal 'brand'.

The Five Pillars of SEO in 2012

Let's face it: when we say Search Engine Optimization, we really mean Google Optimization. And while we're on the topic, are we really just optimizing our page, or are we actually trying to manipulate Google? Optimization means creating a clean, accessible website; anything beyond that would not be considered "White Hat SEO". But is that really all that is required to rank well? And is that all that professional SEOs are doing?

For anyone who has been involved in SEO for any amount of time, we know that simply creating a page of great content is not going to get you any traffic from Google. Unfortunately, the relentless Coyote-vs-Roadrunner game that webmasters play with Google has escalated to such a point that it requires a pragmatic appreciation for how the system really works, not how it should work, and if you're not going to play by those rules, you might as well not even try.

So with that prelude, I'm going to take a more holistic perspective on what is required to achieve good search rankings in Google. It means spending some effort optimizing your site for crawling and indexing, but it also means spending some time promoting your site via external channels. I would describe the effort by saying there are essentially five pillars to modern SEO campaigns, as listed below. The first two (links and social) are external factors that could be described as "quality signals" to Google, instructing it how seriously to take the site. The last three are more related to true website optimization. And yes, I put these in order of what I believe to be their significance, which should hopefully illustrate just how significant external quality signals are relative to pure optimization activities.

i. Links – What really separated Google in its early years from other search engines was its use of inbound links to determine the relevancy of a page. Just as academic journals are judged more important based on the number of journal citations, the same could be said of websites. There is even a quality metric called PageRank (named after Larry Page) that expresses the amount of link power a page has on a 0-10 scale. Not all links are created equal, though: a link from a highly authoritative site is worth far more than one from an obscure blog. Each site has value based upon how much link power it has, and thus when it links to another site, it passes a fraction of its link power ("link juice") along. And so an entire cottage link-building industry now exists in the SEO community, servicing the effort to build more, and higher quality, links to establish sites. The more competitive the niche a site is competing in, the more links it requires.
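To make the "link juice" idea concrete, here is a toy power-iteration sketch of the PageRank concept; the four-page graph and the damping factor are illustrative assumptions, not Google's actual implementation:

```javascript
// Toy PageRank: each round, every page splits its rank among its outbound links,
// so pages that attract links from well-linked pages accumulate more "link juice".
function pageRank(links, damping = 0.85, iterations = 50) {
  const pages = Object.keys(links);
  const n = pages.length;
  let rank = Object.fromEntries(pages.map(p => [p, 1 / n])); // start uniform

  for (let i = 0; i < iterations; i++) {
    // Base rank every page gets regardless of links (the "random surfer" jump)
    const next = Object.fromEntries(pages.map(p => [p, (1 - damping) / n]));
    for (const p of pages) {
      for (const target of links[p]) {
        next[target] += damping * rank[p] / links[p].length; // pass a share of link juice
      }
    }
    rank = next;
  }
  return rank;
}

// Hypothetical four-page web: everyone links to "hub", so it ends up ranked highest
const ranks = pageRank({
  hub: ['a'],
  a:   ['hub'],
  b:   ['hub'],
  c:   ['hub', 'a'],
});
```

Note that "a" outranks "b" even though both link out identically: "a" receives links from well-ranked pages, while "b" receives none, which is the whole point of the metric.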

ii. Social Mentions – Social is the latest, greatest influence on search results. There have been numerous rumblings in the search communities recently that specific social signals are having an effect, not just any and all social activity: specifically, social mentions. If someone 'likes' or 'shares' your article on Facebook, or retweets a post on Twitter, these serve as social validation, similar to how inbound links work. Also similar to links, not all entities in the social sphere are equal. A retweet from someone with two non-influential followers is worth much less than one from the guru of your field. Thus social is actually shaping up to be quite a powerful quality signal.

iii. Content Is (not) King – We've probably all heard a million times that "content is king". I strongly disagree! A prince perhaps, but certainly not king! It's one of those little fantasies that sounds great at the convention, but then you go home and try it and, unless you have NO COMPETITION, it doesn't work! I'd rather use a plant as a metaphor: consider the content your seed and inbound links the water. You can plant the most virile seed, but nothing is going to happen without enough water. Anyway, assuming you apply water and have a fighting chance, great content should be driven by keyword research. Find what your users are looking for and write articles about it.

iv. Accessibility – How easy is it to access your content? Keep in mind that until very recently, the Google bot could not access any content contained within a Flash file or fetched asynchronously with an AJAX call. Those things are possible now, but with a heavy negative bias, so you'd never want to use them to create your navigation bar, for instance. But it's more than that. Having clean, self-documenting "RESTful" URLs instead of an unreadable query-string mess is important. It is also important to consider how many clicks are required to reach a particular page that might rank. If it takes four clicks from the homepage, Google will assume it is not significant content and will discount its value; the page also inherits a lot less of the link juice flowing to your homepage.

v. Usability – Recently, Google began to track user interaction with search results and factor that interaction back into the rankings, creating a sort of usability feedback loop. Let's say you are result #3 for a given search, but your listing is getting clicked more often than result #1. There is a good chance you will begin to move up in the rankings as a result. Also, how long does the user stay on your site after clicking through, before returning to the search results to view another listing? This is a good indicator of whether they found what they wanted on your site. To a lesser degree, indirect factors have begun to play into usability rankings too, such as how long your page takes to load and whether the page is dominated by advertisements above the fold. The overarching takeaway is to focus on creating high quality content that a user would want to return to, and to optimize your meta description so your listing seems interesting and is consistent with the content the user will find when they click through to your site.

In summary, don't get too fixated on content or on simply optimizing your site. It would be great to think that is all that is required to rank well, but in 2011/2012 there are just so many websites competing for rankings that, realistically, you also need to consider promoting your site by obtaining inbound links and social mentions. Without proactive effort, it could take years to build these quality signals legitimately. Thus, any serious SEO campaign must include an effort to establish strong quality signals.

‘Black Hat’ and ‘White Hat’ SEO

In 2011, Google is way, way smarter than it used to be. It is also way more competitive. As I think back to my first visit to the Pubcon forum in 2004, the conference sessions in the light of day were all about the "right way" to do SEO, such as building quality content and structuring your pages correctly. But the real money, the people in the know, were at the all-night, invitation-only poker parties back at the hotels, where webmasters would share tips and tricks about what really worked. That was my first exposure to the world of black hat.

Black Hat SEO is really more about tricking the search engines than earning credit. It's also a lot more about big short-term success than long-term brand building. Rather than building one big brand site over time, you'd cycle through a bunch of throwaway domains, knowing that what you are doing will work very well until Google catches you, at which point your domain would be banned. So you'd structure your marketing with that in mind. You would buy throwaway domains, stuff them full of the keywords you want to rank for, buy your way into a link farm to get thousands of overnight inbound links, and write cloaked pages that switch the content depending on which IP address the visitor is coming from, showing Google one thing (what you want to rank for) versus what you're really selling. The latter trick in particular was very popular among the "poker, pills, and porn" crowd and was a great way to rank for ancillary topics such as online games, diet, and sexual health, even though unrelated.

But the good ole Wild Wild West days are pretty much over in that regard.  Well, that’s not entirely true – there are still people playing these games, but the effort required versus the benefit gained has diminished considerably.  In that regard, Google has won the Coyote/Road Runner game.  You’ll notice now that Google places much less emphasis on keyword domains, on-page keyword density, and even on-page content for these reasons.  They’ve also acknowledged that they now lean toward favoring established brands in their search results.  This is in large part motivated by paying advertiser relationships, I’m sure, but it can also be traced back to their spam-fighting past.

Today, effective SEO is more similar to brand building than ever before.  It is about creating quality content that users want to read, improving user experience such as load times and navigation, and promoting the hell out of your content through social and PR channels.  What used to work for good ole reputation building for small businesses 20 years ago is once again working now at a conceptual level; only the tactics have changed.

With all of that said, Google is making it nearly impossible to get rankings or traffic for just about anything until you prove yourself, which has set up a situation where, unless you want to rank for something very non-competitive, you’re pretty much compelled to game the system a *little bit* to get off the ground.  Buying inbound links, paying a small army of people to vote for you on Google+, etc. shouldn’t be necessary in the lovely world of white hatting, but if you want any traffic, it is.   And so like anything, it probably warrants a balanced and pragmatic look, on a case-by-case basis.

I’ve provided a few resources below to learn more about these worlds:

I. White Hat
1. SitePoint
2. Webmasterworld

II. Gray Hat (on the line)
1. WarriorForum

III. Black Hat
1. Blackhatworld
2. Syndic8
3. WickedFire




Tech Innovations That Are Creating New Opportunity

There is a common expression in tech entrepreneurship – do things that weren’t possible 3 years ago.   The reason for this is the accelerated pace of innovation and thus the accelerated commoditization of technology.  A product might go from ground-breaking and unique to absolute commodity in 10 years or less.  If you accept that, then you must assume competition starts spiking by around year 5, and your ability to build a brand and break through the noise will be diminished thereafter.  You will also have less than half the window remaining to capitalize on your investment.  So if you get started around year 3, then you’ve at least got momentum and have built a brand before peak competition is reached.

So with that said, what are a few innovations that we could possibly leverage that are only 3 years old, as of this writing?

1. Cheap bandwidth – It used to be that hosting accounts metered your transfer and limited you to hundreds of megabytes per month.  Now you can get unmetered pipes (as much as 100 Mbps) for a mere $100 per month. Imagine how that might pave the way for streaming and online storage.

2. The Cloud – Everyone knows about the cloud and how it is changing business models, most visibly software’s shift to the software-as-a-service (SaaS) model, in which you pay for the service per month instead of buying it up-front.  Remote sharing of data is now easier than ever too.  What impact might this have?  Remote work, remote collaboration, and better use of corporate capital/cashflow come to mind.

3. Smart phones – Perhaps the single biggest impact of the smart phone revolution has been geo-intelligent ‘apps’.  Now you can find anything, anywhere, anytime – relative to where you are.  Imagine the future impact of this upon advertising and device intelligence.

4. Social graph – Facebook has proposed an open social graph, which is perhaps the first and most ambitious application of the semantic web. Their goal is to tie together all of your interactions, comments, locations, etc., into one single usable data stream.  This is both scary and full of possibility, especially when combined with geo-aware smart phones.

5. SaaS Business Model – I mentioned this one above, but ‘cloud computing’ has given rise to an entirely new software licensing model in which the user pays per month rather than an up-front fee.  This is innovative from a financing perspective when you consider how cash-strapped many startups and small businesses are.  But more interestingly, many SaaS companies are beginning to position themselves as hybrid *services* companies that automate a service for their customers, not simply virtual product vendors. That’s a major philosophical change in what software businesses are, and a huge benefit for small business.  This is likely the beginning of the adoption of automation tools by small businesses en masse.

6. Tech Startup Incubators (Y Combinator) – Mark Suster made an interesting point recently about how massively less capital is now required for a startup, and how this is driving the exponential rise in small business startups. We went from millions in required capital (hardware, software) in the late 1990s to tens of thousands, or even $5,000 in some cases now.  That is the lowest cost in history to start a small business, which is good given how high unemployment currently is!   It will be interesting to see how those two converging trends shape the future.  Layer on top of that all of these startup incubators – YCombinator, TechStars, The Foundry, Founders Institute, etc. – and young entrepreneurs can get access to capital and mentorship like never before!  Actually, as someone considering starting a small business, I frankly find this a real concern: an explosion of competition!

7. Information Overload – Google has succeeded in indexing billions of records and making them easily accessible to the masses.  Search engines are not new, but having ready access to all this data any time and from anywhere (smart phones), taken for granted even by high schoolers now, should surely have some near-term impact. I personally believe that this commoditization and saturation of data is what will drive the coming Semantic Web movement.  When documents are properly and semantically tagged, software and appliances will be able to consume the data automatically, and this will automate much of our lives.  So I think this secondary effect of search and information innovation will give way to an even bigger *applied* impact.

8. AdSense is Saturated – I hear there are over 400,000 websites joining the Google AdSense network every month now.  That gave rise to an explosion of ‘junk’ content, and networks that create junk content on a massive scale, which Google has recently been trying to defeat with its Panda update (aka “Farmer”).  I think this business model of creating massive content to generate ad revenue is at a near-term end. But that creates a vacuum – what fills it?  All those site owners will be looking for a solution.

9. Interest in Optimization (CRO/LPO) – Starting in 2006, it seemed like webmasters all woke up to the power of SEO at the same time, and a huge new consulting niche was born overnight.  In 2002, an explosion of PPC advertising was born, and it took 3-4 years thereafter for ad costs to catch up with their value.  Now we’re in a mature market for both of these ‘easy money’ search advertising options.  The free organic traffic people are now binging on social media as a sort of SEO 2.0, and meanwhile PPC practitioners are finally discovering landing page and conversion optimization, by which they can dramatically increase their ROI by improving how their sites receive visitors – a sort of Renaissance for them too.  The consulting world seems ripe for both social and conversion optimization at the moment.

As I look around, I see plenty of changes, but ironically I am a bit underwhelmed by the opportunity.  I feel like we are in the remnant phase of some pretty massive innovation from 5 or 6 years ago. What I’m seeing now are mostly secondary effects of innovation that first occurred a few years back. I recall a lull in innovation around 2000-2003, and this feels similar to me.  I wonder if this is a natural cycle and we’re in the lull just before the next major innovation opportunity?  Mark Suster recently suggested we’re on the precipice of the Internet television revolution, for example.  Now THAT could be interesting.

Social Media Syndication

If you’ve ever contemplated a social media marketing campaign, then one of the first questions that came up was how best to distribute your message.  There is a myriad of small social media properties out there, and you can spend a lot of time trying to manage all of them.  Many social media marketing professionals thus choose to focus purely on Facebook and Twitter for this reason.  But as you start to consider the quality signals from social and how those can benefit your SEO efforts, it seems almost a shame not to take advantage of the rest. Meanwhile, there are companies like Know’em who offer packaged services to create profiles for you on literally hundreds of social properties.  Is your head spinning yet?  :)

I decided to take a middle-ground approach and focus on 12 properties that seemed to be the more well-established ones and that I was able to stitch together a reasonable syndication scheme for.  After reviewing all of these properties, I found a class of social media service that specifically provides for syndication to other properties. is the one I particularly liked, but you’ll find that HootSuite and a few others do the same, with varying lists of properties they integrate with.

Below is a diagram of what I ultimately came up with:

Social Media Syndication

The social media properties I’m able to reach this way include:

  • StumbleUpon
  • Facebook (fan page)
  • Twitter
  • MySpace
  • Tumblr
  • Plurk
  • Jaiku
  • FriendFeed

You can see the UML actor at the top, which represents the social media manager (SMM).  Their responsibility is to originate and propagate the social media message.  Rather than logging into all 12 properties, I was able to reduce this to only 4.  First and foremost, the SMM needs to manage the frequent blog posts.  I consider this the primary broadcasting point from which they should be originating the message.  When a blog post is published, it will automatically post to 11 properties via, including Ping’s own board.  The two curious exceptions to the syndication scheme are Twitter and the Facebook fan page. I’ll explain those in a moment.

If you look at items contained within the dotted line area, those are the 4 items that the SMM must interact with directly:

  • Primary Blog – originate the message
  • HootSuite – manage the Tweet Stream
  • Facebook Fan Page – direct supplement of message
  • StumbleUpon – Important but no API for syndication

The blog, as mentioned, is the primary point from which the message originates.  HootSuite is a nice way to augment the primary message and manage the Twitter channel.  Here you can see what’s gone out, monitor Twitter responses, and pick up syndicated retweets by syndicating out from HootSuite to  As for the Facebook fan page, the primary syndication of those blog posts will make its way through, but I think it’s important to reinforce that syndication with some direct messaging on the Facebook fan page as well, which is why you’ll notice a minor stub included in that group.   And finally, StumbleUpon is included here because for many media types it has become the most effective social bookmarking utility, but it hasn’t opened up its API for syndication, so it needs to be managed directly.
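To make the blog-as-origin pattern concrete, here is a minimal sketch of what any syndication service does on your behalf: parse the blog’s RSS feed, skip anything already syndicated, and fan each new post out to per-network handlers. The channel handlers are placeholders, since every network’s real posting API differs:

```python
import xml.etree.ElementTree as ET

def new_posts(rss_xml: str, already_posted: set) -> list:
    """Return (guid, title, link) for feed items not yet syndicated."""
    root = ET.fromstring(rss_xml)
    posts = []
    for item in root.iter("item"):
        guid = item.findtext("guid") or item.findtext("link")
        if guid not in already_posted:
            posts.append((guid, item.findtext("title"), item.findtext("link")))
    return posts

def syndicate(posts, channels):
    """Fan each new post out to every channel handler.

    Each handler is a placeholder for a real network API call
    (Twitter, Facebook, etc.)."""
    for guid, title, link in posts:
        for post_to_channel in channels:
            post_to_channel(f"{title} {link}")
```

In the scheme above, the syndication service maintains the equivalent of `channels` for you; only Twitter (via HootSuite), the Facebook fan page, and StumbleUpon are handled by hand.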

Interestingly, the reason I post directly to Twitter from HootSuite and suggest reinforcing your message directly on Facebook, not just syndicating, is due to filtering issues.    Recent studies of Facebook’s EdgeRank filtering have shown that some delivery methods are far more likely to be filtered out of users’ news feeds.  Here were the bottom-line findings:

  • HootSuite – 69% reduction
  • TweetDeck – 73% reduction
  • Sendible – 75% reduction
  • Networked Blogs – 76% reduction
  • RSS Graffiti – 81% reduction
  • Twitter – 83% reduction
  • Publisher – 86% reduction
  • twitterfeed – 90% reduction
  • – 91% reduction
  • Social RSS – 94% reduction

As you can see, syndication schemes do seem to get less visibility than direct posting on Facebook, likely due to spamming issues, which is why you should reinforce your Facebook fan page message directly.  This ensures your message gets through, and it will also help improve the fan page’s overall EdgeRank.  Incidentally, the above statistics are partly why I chose to use instead of has a great reputation as a very usable syndication tool, but its filtering level is the second worst of the group studied!
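Those reduction figures are easier to reason about as expected reach. A quick back-of-envelope sketch, assuming the percentages apply uniformly to a page’s fan base and treating direct posting as the unfiltered baseline (both simplifying assumptions):

```python
# News-feed reach estimate: fans * (1 - reduction) per delivery method.
# Reduction rates come from the EdgeRank filtering study cited above.
REDUCTION = {
    "HootSuite": 0.69,
    "Twitter": 0.83,
    "": 0.91,
    "direct post": 0.0,  # assumption: direct posting as the unfiltered baseline
}

def expected_reach(fans: int, method: str) -> int:
    """Estimated number of fans whose news feed shows the post."""
    return round(fans * (1 - REDUCTION[method]))
```

For a 1,000-fan page, that is roughly 310 feeds reached via HootSuite versus about 90 via – which is the whole argument for reinforcing the message with direct posts.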

SEO Link Building Resources

As anyone who has spent any time building a search engine optimization campaign should know, on-page optimization will only get you so far; without links, your site isn’t going anywhere if you have any competition.  Let’s say your goal is to rank for “Los Angeles” instead of “Modesto”.  For the term “Modesto” you’ll probably not need much, if any, inbound linking, but to rank for “Los Angeles”, link building will be the dominant part of your SEO effort.

According to Google, link building shouldn’t even be something you do, because it is inherently a manipulation of the search results.  Instead, you should focus on creating great content and promoting your site, and links will happen on their own.  Okay, if you are creating shocking and scandalous content then yes, that can sometimes be enough, but for most it is not!  And so then what?  Well, you may decide to just write your content, post it, and see what happens, but eventually reality will set in when a few months have passed and you’ve received NO TRAFFIC – thanks Google! So then you must face reality and get serious about link building.

In SEO we talk about “black hat” (bad) and “white hat” (good) methods.  Actually, I think of it more as a spectrum of gray, from light to dark.  And yes, most off-site methods of SEO such as link building fall into that gray abyss somewhere, some darker than others, but those are often the methods that work best, at least in the short term (such as link networks).  Just be aware of the risk trade-offs and factor those into your goals, long term versus short term.    With that said, here is a comprehensive list of link building tactics and the resources I recommend for them.  I’d generally recommend a holistic strategy that incorporates all of these at some level (link networks being an exception for high-value sites).  Your goal is to create a natural pyramid of links: a lot of low-value discussion-type links at the base, with increasingly valuable links layered on top in decreasing numbers as you go up the stack.

Note – I’ve given a score (1-10) to each of these to help indicate how “white hat” it is; 1 is white hat and 10 is black hat.

1. OpenSiteExplorer (1) – OSE is a tool provided by the folks at SEOMoz that pulls together a number of internal and external criteria to estimate the real “link juice” power of sites you may target for inbound links. MozRank in particular is a proprietary algorithm used for this purpose.  You have only limited access to try it out unless you sign up for an account at SEOMoz.  In my opinion, this single tool is so valuable that it alone justifies the SEOMoz account expense.  It can be used for seeking out high-value sites that you might ask for inclusion in their reviews, or even to instigate a conversation with a reporter or writer to review your product.

2. Angela’s Packets (7) – Under the radar somewhere is a small army of outsourced link builders in the Philippines quietly building inbound links with forum and blog comment spam. We’ve all seen them – those comments on our blogs that say something generic like “good work, I like the post” while placing a profile link back to some unrelated site. Most of the big sites of course try to prevent this with nofollow attributes on their link tags, but many of the more common WordPress and Movable Type blogs, as well as many other older installs, don’t have that protection.  And so “researchers” such as Angela and Paul have scoured the Internet finding these resources with good PageRank, providing monthly reports called “link packets” – the battle plans these small armies use to find and conquer what they seek.

3. Scrapebox (7) – Okay, above when I said those small armies were fueled by “researchers”, what I really meant was resourceful individuals most likely using tools like ScrapeBox.  This tool will literally scour the Internet seeking opportunities for dofollow links with good PageRank and report back to you.  It’s a pretty sophisticated package, with link management, rotating proxies, auto-commenting, and more.
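The core of what these scrapers hunt for can be sketched in a few lines: scan a page and keep only the anchors that are not tagged rel="nofollow", since only those pass PageRank. A minimal standard-library version (the real tools add PageRank lookups, proxies, and crawling on top of this):

```python
from html.parser import HTMLParser

class DofollowScanner(HTMLParser):
    """Collect hrefs of links that are NOT marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.dofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower()
        # A nofollow link passes no PageRank, so it's not a target.
        if "nofollow" not in rel and attrs.get("href"):
            self.dofollow_links.append(attrs["href"])

def find_dofollow_links(html: str) -> list:
    scanner = DofollowScanner()
    scanner.feed(html)
    return scanner.dofollow_links
```

Run this over a blog’s comment section and an empty result tells you the site is protected; anything returned is the kind of opportunity a link packet would list.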

4. Link Networks (8) – Okay, we’re definitely getting into the darker-gray hats with link networks, but it wouldn’t be an honest review of link building resources if they weren’t covered.  Google of course hates these because they’re highly potent resources for gaming its search rankings, and it tells people they’ll get in trouble for using them, in order to cast doubt on the practice and scare most webmasters away.  But the unfortunate reality is, they work very well!  At least for now.  Granted, there is risk here, so you wouldn’t use these for a client site or an owned site with any substantial investment tied to it.  But they can be very effective for sites where you’re confident you’ll achieve positive ROI quickly, or as a minor part of your overall mix.  Just don’t let them be a dominant part of the linking mix for your brand or “money” site.

  • HighPRNetwork – The best quality links of the networks.
  • LinkVana – Well-established and well-regarded, with hundreds of sites in the network.
  • SEONitro – Smaller network.  High prices for exclusivity.

5. Article Directories  (5) – These are websites full of community-contributed content.  Often you’ll see slogans such as “you make the news”.  These sites became very popular among SEOs because you can submit content under the Creative Commons license which may contain a link or two back to your site. You publish here, and other webmasters may then borrow and republish your content, provided they don’t remove the links back to your site.  To make this cost-effective, webmasters would often source content from overseas and then put it through a content spinner to create multiple “unique” versions of the same article before submitting to these depots.  This was probably the single most effective link building strategy prior to Google’s Panda (“Farmer”) update, whose primary goal was to reduce the effectiveness of these depots and discourage the low-quality content. That’s not to say it doesn’t work at all now – it does!  It can still be an effective part of a holistic link building campaign. Here are my favorite resources:


6. LinkWheels (3) – Another popular linking strategy is to set up a series of your own micro blogs, populate them with content, and then link them back to your primary site from within those articles.  In a traditional link wheel, you’d link each of these satellite sites to each other in a circular, reinforcing way to increase the link value of each page, though more recently I’ve heard it’s more effective NOT to interlink them, and to instead form more of a hub-and-spoke pattern back to the website.  Google has apparently become a little too smart for the classic tactic now.

  • Squidoo
  • Hubpages
  • Weebly
  • Quizilla
  • LiveJournal

7. Directories (2) – Once the mainstay of initial link building (circa 2002), website directories were the original way to discover who’s who on the Internet, before search engines became so ubiquitous.  They stuck around for a while and were generally considered a good signal of quality and authority, particularly if it cost money to sign up or, in the case of DMOZ, listings were hand-selected to ensure premium quality. Over time, these have become overly manipulated, like so many other things, and their value is not what it used to be. Nonetheless, they can be an effective component of a holistic link building strategy.  The goal here is to keep costs under control and not spend money on every directory you find; there are far more directories out there than you’ll get value from.  The main directories known to provide maximum value are DMOZ (though it’s impossible to get listed anymore), Yahoo Directory (though it costs $300 per year), and (though their prices have gone up considerably in recognition of their value). Those tier 1s are probably not necessary for most SEO projects, but if you’re in a hyper-competitive space and have the budget, you may be able to justify the spend.  Here’s the short list of the ones I’d recommend, in order:

  • GoGuides
  • BlogUniverse
  • InCrawler

8. Social (2) – There’s been a lot of buzz in the past year or two that social signals such as Twitter retweets, Facebook mentions, and G+ likes may be factoring into search rankings now.  In fact, this seems like a very natural evolution in search ranking, since monitoring social conversation can do a very good job of filtering what everyone is truly excited about, compared to what’s merely being promoted online.  It can also be a much more interesting and fulfilling way to build search rankings – by promoting your product rather than gaming the search engines.  Below are a few of the better social resources to consider when building your campaigns. Yes, there are ways to game this too (there’s already a small cottage industry, with companies such as SocialKik selling retweets and the like), but these are the sites that really matter and that any campaign you design should ultimately point back to:

  • Facebook Fanpage
  • Twitter
  • Google+
  • LinkedIn
  • Yahoo Answers
  • Wiki Answers
  • StumbleUpon
  • Delicious


SEO On-Page Optimization Tools

Anyone who has spent any time doing search engine optimization knows the effort basically breaks down into on-page and external factors (such as link building).  There is probably a multi-page discussion to be had about the true significance of on-page factors compared to off-site factors, but for the purposes of this post I just want to capture my favorite tools for “bread and butter” on-page optimization:


1. SEOMoz – I really like SEOMoz, their tools, and the community. SEOMoz is basically one of the two contenders for a primary SEO toolset (SEOmoz vs RavenTools). I tried both of them out and just found myself a lot more inclined toward the tools at SEOMoz, so I stuck with it.  I think my first biasing factor was unlimited access to OpenSiteExplorer (OSE), which is the best link-building research tool I’ve found so far (an off-site factor).  But it is also just a really nice set of tools for standard SEO efforts such as periodic ranking reports, keyword competition analysis, and the like. The other stand-out feature that I really like is the support.  They have a really great community, Q&A, and blog/training materials.  The Whiteboard Friday video blog series with Rand Fishkin is invaluable!  In my mind, this combination makes it the single best tool to equip an in-house SEO team with.

2. ScreamingFrog – A free and downloadable utility for SEO audits.  I really like this tool for getting deep into the details of what’s going on when things aren’t quite clicking for a site’s rankings.  Discover HTTP header issues such as 404s and 301/302s, duplicate headers, missing alt tags, etc.  I would generally use this tool at the end of an on-site optimization effort to check my work, or later when a site just isn’t getting traction and I’m trying to figure out why.
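The HTTP-level portion of an audit like this can be approximated with the standard library. Below is a hedged sketch that flags the issues mentioned above (404s, 301/302 redirects) for a given URL; real crawlers like ScreamingFrog also follow links and check on-page elements, and some servers reject HEAD requests, so treat this as a starting point only:

```python
import urllib.error
import urllib.request

def audit_status(url: str) -> tuple:
    """Return (url, status, note) for basic HTTP-level SEO issues."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            # urlopen follows redirects; a changed final URL means a 301/302 hop
            note = "redirected" if resp.geturl() != url else "ok"
            return (url, resp.status, note)
    except urllib.error.HTTPError as e:
        return (url, e.code, "broken" if e.code == 404 else "error")

def classify(status: int) -> str:
    """Bucket a status code the way an audit report would."""
    if status == 200:
        return "ok"
    if status in (301, 302):
        return "redirect"
    if status == 404:
        return "broken"
    return "check"
```

Run `audit_status` over a list of your site’s URLs and anything not classified "ok" is a candidate for the kind of fix-up pass described above.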

3. IBP – Internet Business Professional (IBP) is a downloadable software platform that I started using back in 2004, as I felt it was the best thing out there at the time.  Honestly, SEOMoz and RavenTools are probably better for a lot of its core features now, but it’s still not bad and doesn’t incur a monthly fee!  It also has a couple of really nice features I occasionally use.  First, I like the Top 10 reviews: you type in a keyword for it to search in Google, and it will find the top 10 ranking results and analyze them, looking for common elements that may be missing from your site.  This can be a nice way to find, on a competitive basis, on-page details you may have overlooked.  Second, it is still a pretty good resource for running periodic ranking reports on your sites.  You can configure it to run in the background on a schedule and output PDF reports as needed.

4. YSlow – We’ve all heard by now that page load speed is a factor in Google rankings.  Actually, I believe this is a much less significant factor than it has been given credit for, and only comes into play in the most competitive spaces where an extra tie-breaking criterion is needed, so I wouldn’t get too hung up on it. Nonetheless, faster pages also benefit overall user experience and expose bandwidth inefficiencies, so it’s not without merit.  For this, I like Yahoo’s YSlow tool.  It will show you which specific resources are taking the longest to load, such as CSS and JavaScript files.  You’ll see opportunities to minify files, push some files to a content delivery network, compress resources with gzip, and so on. Naturally, it cannot account for server-side resource issues; it’s purely a download-centric UI tool.
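The first step in any YSlow-style analysis is enumerating what a page actually loads. Here is a minimal sketch (not YSlow’s actual implementation) that pulls external CSS, JavaScript, and image references out of the HTML, which you could then count, time, or check for minification:

```python
from html.parser import HTMLParser

class ResourceLister(HTMLParser):
    """Collect the external resources (CSS, JS, images) a page requests."""
    def __init__(self):
        super().__init__()
        self.resources = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.resources.append(("js", attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.resources.append(("css", attrs["href"]))
        elif tag == "img" and attrs.get("src"):
            self.resources.append(("img", attrs["src"]))

def page_resources(html: str) -> list:
    """Return (kind, url) for each external resource referenced in the HTML."""
    lister = ResourceLister()
    lister.feed(html)
    return lister.resources
```

Every entry in the result is a separate HTTP request the browser must make, which is exactly what recommendations like “combine files” and “use a CDN” are trying to reduce.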

5. WebConfs – A set of mini tools available online for free.  While this isn’t the shiniest tool in the toolbox, it is free and can be rather useful for sending links to others to illustrate your point. I particularly like their HTTP header check tool. It works beautifully!

Content Marketing Cost-Effectively

Have you ever wondered how content marketing works and how it is used to influence Google’s search rankings? Until very recently (the Panda update), this was the single most popular and effective way of improving a website’s rankings via inbound links.  An author would create a lot of original articles related to their site, spin them to create unique versions, and then submit them to the various article directories.  Each article might have a link back to the actual site and would be published under the Creative Commons license, enabling other sites to copy and use the articles too, thereby providing more inbound links.  But when you think about it, that is a LOT of work to do manually, so it’s critical to look at this from an automation perspective.  Here’s basically how it works:

1. Commission Content – Step one is to get the original content written for a low price. You could use Craigslist for higher-quality writing, but it’s probably going to be too slow and a little more expensive than other resources such as iWriter or someone on oDesk.  The key to success with both of these resources is to stick with the top-rated providers, even if they cost a bit more.  It will save you time, money, and a huge headache later.  If you’re going to do this en masse, finding a team on oDesk is probably a better long-run solution than iWriter. If you’re going that route, be sure to *avoid* content from places like India or Pakistan; I find the Philippines a lot better for content.

2. Article Spinners – In the case of aggressive content marketing via the article directories, you’ll need unique versions of your articles to post to all of these directories, in order to get past Google’s duplicate content filter. The way many have figured out to do this is to have cheap articles written overseas via oDesk, iWriter, etc., and then spin these to dynamically create as many “unique” articles as the different directories you’ll submit to require.  So if you are going to submit to 20 places, have one article written and spin 19 new versions. There are many popular tools, but the most popular seem to be Article Marketing Robot, The Best Spinner, and SpinChimp. I’m personally a fan of SpinChimp, as it is just as good as the others but a lot less expensive.  It also runs on your local computer, not requiring a constant online connection.

3. CopyScape – As with any content commissioning project, you’ll want to check the validity of the content to make sure it wasn’t just ripped off.  Even if you’re only spinning content, this can be a really good tool for validating your spinning quality.  Get a premium account and run the articles through it to make sure there are no, or minimal, matches.  Note – regarding the article spinners: if you’re using SpinChimp, you can even integrate the two directly to streamline the workflow.
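Under the hood, duplicate-content checks of this kind typically boil down to comparing overlapping word n-grams (“shingles”) between two texts. A simplified illustration of the idea, with no claim that this is CopyScape’s actual algorithm:

```python
def shingles(text: str, n: int = 4) -> set:
    """Overlapping n-word sequences; well-spun copies share few of these."""
    words = text.lower().split()
    if len(words) < n:
        return {" ".join(words)}
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of shingle sets: 1.0 identical, 0.0 no overlap."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)
```

A spun article that still scores high against its source by this measure has probably only swapped a few synonyms, which is exactly what you want a premium CopyScape pass to catch before you submit.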

4. Submitting – Finally, you need to submit your content.  Article Marketing Robot is very popular because of its submission engine, but there are other tools you can use for this purpose too. Many of the article directories use captchas, so you’ll need a tool with built-in captcha solving.  There are more than a few of these out there.

Note – it would be a myth to suggest this doesn’t work at all any more, but Google has certainly cracked down on the practice by punishing the article directories that publish this often thin, low-quality content.  As a result, places like EzineArticles and HubPages have increased their quality requirements.  It’s still worth exploring in a more limited capacity as part of a more holistic SEO campaign, but don’t expect the wild success this would have brought you in 2009. :)