Archive for May, 2009

Your Brain is Not Normal

May 27, 2009

No, you don’t have anything to be afraid of, so no need to go running to your local brain surgeon. Your brain is just fine. But it is not normal, contrary to the belief that we all hold. What I mean by that is that we have come to normalize what we see from our vantage point: “I feel this way, I think this way, my experience is XYZ; therefore it’s normal and everyone feels and thinks this way.” That couldn’t be further from the truth. Based on our experiences and education (formal and informal), we develop a certain prism that we apply to all subsequent events to help us understand them and place them within the context of what we know. It is how we process and organize information. To add a level of complexity, our prism is constantly evolving, as we add more and more experience and education to our arsenal. Because my prism is different from yours, the exact same event can make us feel quite differently about it.

The ability to understand this nugget is the key to being an effective human being, whether it’s in personal or professional interpersonal communications, or in marketing to consumers or businesses. It’s really at the foundation of all communication. Before having a discussion with your significant other, writing that intraoffice e-mail, putting together that PowerPoint deck for a presentation, writing that blogpost or sending that customer e-mail update, do a quick reality check. Try to step outside of yourself and ask, “OK, I consider this normal, but does the person receiving my communication feel the same way? What prism will he / she apply to what I am communicating?” I know this sounds beyond elementary, but this is one of the biggest failures in communication. We all do it. I, for example, get so wrapped up in the social media world that I assume everyone blogs, tweets, podcasts, and creates video content, mobile / web widgets and apps. Not so! Most people have no idea what any of those things are. However, I have come to normalize it, because those things are my reality. We are creatures of our respective environments, so please take a minute to try to extrapolate how other people’s environments and experiences have shaped their views, habits and responses to stimuli.

Attending the 140 Characters Conference

May 26, 2009

I have attended several of Jeff Pulver’s events in the past (Social Media 1 and Social Media Jungle 2), and have always been impressed with the content, as well as the amazing connections I have made. I am even more excited for the 140 Characters Conference (#140conf, as it’s known on Twitter). This is a not-to-miss event for anyone who is passionate about Twitter as the hottest emerging communication platform.

Those who read my blog know how completely and utterly enthralled I am with Twitter, above and beyond any other social network. Most of my blogposts have at least something to do with this extremely disruptive, efficient, insightful, ubiquitous and open communication platform. Hence, my interest in a Twitter conference is tremendous. Judging by the Twittersphere and the event site, some of the top voices in social media and Twitter luminaries will be attending. Networking is also a top reason why I am extremely interested in this event.

Unfortunately, due to financial constraints I am unable to pay the fee to attend, so I am applying for and hoping to win the #140conf Scholarship. If selected as a scholarship finalist, I will diligently cover the event via Twitter and blogging, adding my own insights, and I will also help promote the event via all the social media tools available to me.

Tracking Conversations

May 26, 2009

A huge benefit of Twitter, especially for brands, is the ability to track and monitor what the Twittersphere is saying about you, your competitors and just about any related topic. Because of how important search is, Twitter actually bought Summize and incorporated it as search.twitter.com. Desktop and mobile applications have search functions that vary in sophistication and ease of use. I think we will see quite a bit of innovation around Twitter insights and conversation tracking: after we have created all this content, we need to know how to extract valuable nuggets from it. Innovation will range from simple search tools to more complex and intelligent semantic search, to enterprise-wide solutions. I am excited to see what develops.
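
For the technically inclined, here is a rough sketch of the kind of query these tools run under the hood, hitting the public search endpoint that grew out of the Summize acquisition (search.twitter.com/search.json). The endpoint and parameter names are as they were documented at the time, and the brand names in the usage line are placeholders, so treat this as an illustration of the idea rather than a definitive integration:

```python
import json
import urllib.parse
import urllib.request

def search_twitter(query, pages=1):
    """Query the 2009-era public Twitter Search API and yield matching tweets.

    Endpoint and parameters (q, rpp, page) follow the Summize-based
    search.twitter.com service as documented at the time.
    """
    base = "http://search.twitter.com/search.json"
    for page in range(1, pages + 1):
        params = urllib.parse.urlencode({"q": query, "rpp": 100, "page": page})
        with urllib.request.urlopen(f"{base}?{params}") as resp:
            data = json.load(resp)
        for result in data.get("results", []):
            yield result["from_user"], result["text"]

# Hypothetical usage: monitor mentions of your brand or a competitor.
for user, text in search_twitter('"yourbrand" OR "competitorbrand"'):
    print(f"{user}: {text}")
```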

One tool that caught my eye last week was ConvoTrack. It’s a fantastic little bookmarklet that lets you track and package conversations around a URL. It’s built on the BackType API, which lets you get the full context of a URL regardless of whether it’s shortened or full, or which shortener was used (bit.ly, tinyurl, is.gd, etc.). Moreover, the URL is tracked across social sites, including Twitter, FriendFeed, Digg, Reddit, or any blog mentioning that URL. To illustrate, here are the comments around the gay marriage ban in California today – http://convotrack.com/19R. While bit.ly analytics can be useful for tracking the reach of each URL that you shorten, tools like ConvoTrack take it a step further by letting you track any URL, regardless of who originated it. Twitt(url)y is also a great tool for discovering the top trending URLs and the conversations about them; however, it’s limited to Twitter only and isn’t as useful if you want to track a less popular URL. All in all, it seems like a ton of tools come out each day. They are designed to make our lives better, but the process of discovering and trying out different tools makes my head spin sometimes. Which is not a bad problem to have. For the most complete tool list, I recommend reading the following post by Brian Solis.
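
Part of what makes BackType-style tracking work is treating a short link and its destination as the same conversation. I won’t reproduce the BackType API here, but as a hedged sketch of that one piece, you can expand any shortened link by following its redirects and then search for both the short and the canonical forms (the bit.ly link below is a placeholder, not a real one):

```python
import urllib.request

def expand_url(short_url):
    """Follow HTTP redirects to find the canonical URL behind a shortener.

    Works for any shortener that redirects (bit.ly, tinyurl, is.gd, ...);
    a tracking tool would then query each form of the URL separately.
    HEAD keeps the request lightweight; fall back to GET if a shortener
    refuses HEAD requests.
    """
    request = urllib.request.Request(short_url, method="HEAD")
    with urllib.request.urlopen(request) as resp:
        return resp.geturl()  # urllib follows the redirect chain for us

canonical = expand_url("http://bit.ly/example")  # placeholder short link
print(canonical)
```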

And… There Goes the Neighborhood…

May 25, 2009

I, like many others, have been concerned about the rise of spam on Twitter. First, spammers would book a user name, follow a bunch of users in hopes of auto-follows, then auto-DM those users with spam links (this is why I never auto-follow). Then there has been talk of spammers latching on to a trending hashtag with an irrelevant spam message containing a link to an unrelated site. I have no issue with self-promotion, as long as it’s done tactfully and is designed to add value to the conversation. By latching on to a trending hashtag, a spammer will appear in the search timeline for anyone tracking that topic, and thus gain great visibility.

Now, last night, as I was following my Twitter stream before going to bed, I came across this link to a “guru” site promising to amass tons of Twitter followers fast. It made me vomit a little. Real estate scams, now Twitter scams? This is driven by a blind race for users, fueled by people like Ashton Kutcher and others who amass followers like it’s some kind of competitive sport. Amassing followers may be fun, if that’s your type of thing: a popularity contest of sorts. But if you are looking to build value for yourself and your followers via Twitter, you would be wrong to follow this path. The Twitter community is all about building long-term relationships, listening and engaging before you speak, being authentic and being human. The 30-second spot is fading in efficacy, and brands looking to really engage their hard-to-reach customers must not use Twitter as a one-way broadcast system. Which is why I am disappointed by these developments, but I also think that spammers will soon realize that Twitter is not the right medium for amassing tons of followers non-organically and blasting them with a one-way message. As Brian Solis said in response to this development (via Twitter, of course), “Those driven by the # of followers will find themselves alone as social Darwinism ensures the survival of the loyal+helpful.” And remember that there are no shortcuts to success, only hard work and producing quality content. There is no such thing as an “automated Twitter traffic machine”.

These developments, while not surprising, disappoint me. I am not surprised, because Twitter has definitely jumped the shark, and all popular digital communication methods get invaded by spammers after they become popular. But it does make me a little sad to see this behavior in a medium that we have come to love for its community feel. I guess this is what happens when web products start to cross the chasm.

How portable is your data?

May 19, 2009

Quite an interesting development yesterday: Facebook is now implementing OpenID, allowing its users to sign up and sign in with non-Facebook credentials. I definitely did not expect this from the “walled garden” known as Facebook. Initiatives like OpenID are a fantastic move in the direction of being consumer-centric in the face of extreme web fragmentation. As the Web 2.0 bubble grew, more and more websites were created, forgetting that the user can only visit a finite number of sites. Expecting users to go to your niche social network is not a strategy any longer, especially since just about anything can be done inside of Facebook and a handful of others. It’s hard enough to get users to go to your site; it’s foolish to expect them to create a new set of credentials. Even if they do create the new login, coming back and remembering how to log in is a whole other bridge that needs to be crossed. Which is why OpenID is super important now.
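
For anyone wondering what accepting OpenID actually looks like from the website’s side, here is a minimal relying-party sketch using the python-openid library that was in common use at the time. The session dictionary, store, and URLs are placeholders, and the method names follow that library’s 2.x API as I recall it, so verify against whatever version you run:

```python
from openid.consumer import consumer
from openid.store.memstore import MemoryStore

# Placeholders: a real app needs a per-user session and a persistent store.
store = MemoryStore()
session = {}

def begin_login(openid_url, realm, return_to):
    """Step 1: start authentication and return the provider URL to redirect to."""
    oidconsumer = consumer.Consumer(session, store)
    auth_request = oidconsumer.begin(openid_url)
    return auth_request.redirectURL(realm, return_to)

def finish_login(query_params, current_url):
    """Step 2: when the provider redirects back, verify the response."""
    oidconsumer = consumer.Consumer(session, store)
    response = oidconsumer.complete(query_params, current_url)
    if response.status == consumer.SUCCESS:
        return response.getDisplayIdentifier()  # the user's verified identity URL
    return None

# Hypothetical usage inside a web framework:
# redirect_to(begin_login("https://me.example.com/",
#                         "http://yoursite.example.com/",
#                         "http://yoursite.example.com/openid/return"))
```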

Another area that’s just as important and heavily debated is: what happens to your data when you do engage with a site / social network? What about the information that you have diligently provided to Facebook about yourself? What about all of the pithy wisdom you have shared with your followers on Twitter? What about all the photographs you posted to Flickr and Facebook? What about all the diligent tagging, note writing, photo album creation, wall posts, and comments on your friends’ status updates on Facebook? And oh my, what about the e-mails? Who owns all of that? We would like to think that we do, as it’s our content. But reading many sites’ terms of service, that couldn’t be further from the truth – the site owns all of the content. Putting aside the possibility of a social network misusing our content (that’s a whole other discussion), what happens when the “new Facebook” (whoever that is) dethrones Facebook, and you want to take with you all of the content that you spent so many months, even years, creating? Do you have to start from scratch? It is my theory that this is why Facebook is so successful: we have so much skin in the game, we aren’t going anywhere, and they know it. And what about the not-so-remote possibility of a site like Facebook failing? Does all of your content die with it?

I first started thinking about this when I witnessed the exchange below, generated by my Facebook status update, and saw just how formidable the amount of user-generated content is. My friends wrote many, and quite lengthy, comments (which could’ve been blogposts in their own right). They were so engaged and free to share, and we all got so enthralled by the discussion that we forgot we may never see this content again after sharing it.

[Screenshots of the Facebook status update and the comment thread it generated]

One way to ensure that all of your comments at least get funneled into one repository that you can point to, and make part of your digital footprint, is to use a commenting system like Disqus. But that still doesn’t solve the problem of a site going out of business and taking you down with it.

How do you preserve and back up your content? I have tried tweetake.com to back up my tweets. It does a great job of throwing your tweets / direct messages / favorites / all of the above into a spreadsheet. However, it only goes back a couple of months; at least it did for me.
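
If you would rather not depend on a third-party service at all, a rough do-it-yourself alternative is to page through your own public timeline with the Twitter REST API of the day (the statuses/user_timeline endpoint) and write it to a CSV you control. The endpoint, its paging limits, and the field names below reflect how the API was documented at the time, and the handle is a placeholder, so treat this as a sketch:

```python
import csv
import json
import urllib.parse
import urllib.request

def backup_tweets(screen_name, out_path, max_pages=16):
    """Page through a user's public timeline and write it to a CSV file.

    The 2009-era REST API capped paging at roughly the 3,200 most recent
    statuses, so, like tweetake, this cannot reach arbitrarily far back.
    """
    base = "http://twitter.com/statuses/user_timeline.json"
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["created_at", "id", "text"])
        for page in range(1, max_pages + 1):
            params = urllib.parse.urlencode(
                {"screen_name": screen_name, "count": 200, "page": page})
            with urllib.request.urlopen(f"{base}?{params}") as resp:
                statuses = json.load(resp)
            if not statuses:
                break  # no more pages to fetch
            for status in statuses:
                writer.writerow([status["created_at"], status["id"], status["text"]])

backup_tweets("yourhandle", "tweet_backup.csv")  # "yourhandle" is a placeholder
```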

Importance of transparency

May 19, 2009

Companies can no longer afford not to communicate to their users the reasons why they are doing (or not doing) something, or to communicate too slowly. Not to beat a dead horse, but this was very apparent in the Twitter @ replies episode from last week. Biz Stone of Twitter took the time to explain why the changes were put in place, but unfortunately it was too late. People had started talking about it, and before too long the Twittersphere had heated up to epic temperatures. Users were angry, and since they didn’t have the information to help them understand what was happening and why, they took it upon themselves to fill in the blanks. It turns out that technical scalability issues, as well as noise reduction, were the primary reasons (you can read Biz’s view here).

Where Twitter went wrong, in my opinion, was in the lack of communication to its end users on the reasons why this was happening. Twitter has successfully created this amazing communication platform, but in its first iteration it’s very much like drinking from a fire hose. Fine-tuning to reduce the noise and increase relevance is the natural next step, and I welcome it with open arms. Twitter is fine-tuning now by giving us options for how we produce @ replies (to be seen by some or by all of our followers), as well as reducing the noise from the people we follow (by fine-tuning how much of each person’s @ reply stream we see – a feature I have wanted since day 1!). But even though all of this is done for our (users’) benefit and with the long-term vision in mind, things can go very askew if you don’t take the time to educate and communicate upfront. Because of how virally sentiments spread on Twitter (especially when they are about Twitter), preemption and anticipation, handled in a very transparent way, are key to managing sentiment and expectations.

How much should you listen to your customers?

May 14, 2009

The recent upheaval in the Twittersphere regarding the new Twitter update dealing with @ replies has got me thinking over the past couple of days (in case you are not familiar with what the new update does, please read @whitneyhess’s blogpost, which does a great job explaining it). Even though I am not going to rehash the details of the update here, I will briefly mention that as a result, Twitter no longer shows you @ replies directed at people you don’t follow, even though you follow the writer of the tweet (this only happens when the handle of the person you don’t follow is the first word of the tweet). I am not sure why Twitter did this; perhaps they were helping us reduce the noise produced by following everyone’s @ replies. However, this makes little sense, as you could already adjust this in your settings from inside the Twitter.com site. What Twitter should’ve realized is that a lot of users find @ replies beneficial for discovering new users to follow. If someone I respect and engage with replies to someone else, I will take notice and at least click through to that person, and if I like him / her, make a decision to follow. Yes, there are ways to still explore these tweets by setting up a search in TweetDeck and other tools. However, by having everything come to my main feed, it was just that much easier. By taking the option away, Twitter has taken the ability to decide away from its users. If you give me an option to adjust which @ replies I see via the settings tab, why take the decision away from me?

The above example shows that Twitter wasn’t particularly listening to its users and how they use the site. Some companies pursue their strategy without paying much attention to what their users want, and sometimes it’s for good reason. This @ replies episode reminded me of a talk by 37signals’ Jason Fried at last fall’s Web 2.0 conference in NYC. Jason talked about how each product person / company needs to be a curator, carefully reviewing user suggestions for improvements while implementing only those that make sense given the company’s strategy. This makes complete sense, especially as a company grows and acquires more users and thus more user feedback. Secondly, users don’t even know what they want most of the time. When they say they want something, it is often just a symptom of a larger problem they need to solve. As a business, you must figure out that larger problem and solve it, instead of patching symptoms with piecemeal modules.

So the question still remains… When do you listen to your customers and when does not listening translate to anger (as in the Twitter example) and possible attrition?
