You never know until you test it. Even with blog posts.

By some definition, my techvibes guest post “Why you should choose Canada over the valley” went viral 2 days ago.  Less than 24 hours after it went live, it was their 4th most viewed article of the year, and it currently has 10X more shares than any other article from the last 2 weeks.  It trended on the Hacker News front page all day, usually at around #5.  All in all, a very successful post that drove engagement and dialog.

I’ll share some specific data on what numbers like those “meant” in terms of conversions, as well as some other side effects of the article, but I really want to take the opportunity to emphasize that this is a good example of never knowing if something *will work* until you actually try it.

I’d talked to a few people about the topic of “false positives in the valley” before, and mentioned writing a blog post about it.  I’d usually get fairly mixed responses about its potential.  The night I submitted it I almost backed out, thinking it wasn’t engaging enough.  I wanted to incorporate a couple of Startup Genome studies from my good friends at Startup Compass.

Turns out that blogs are a lot like product.  In a lot of ways, I suppose they ARE product.  Talk to people, collect as much data as possible, run tests to validate assumptions.  But regardless of how much you’ve talked to others and tested assumptions, you don’t truly know if you’re onto something until it’s seen in the context of real consumption by real users.