Product & Startup Builder

The 5 (insignificant) ways Micro-blogging is different from Blogging: How to create an open Twitter alternative

Added on by Chris Saad.

Given the recent developments in the Twitter Developer ecosystem I think it's a good time to revisit the idea of an open web alternative to Twitter. The headline of this post hints at, in my belief, the key to making this dream a reality.

The fact is, there are only 5 insignificant differences between micro-blogging and normal blogging. I will try to detail them below. My point in doing so is to illustrate that the best way to bootstrap an open alternative to Twitter is not by inventing a bunch of new technologies or products, but rather to realize that most of the pieces already exist in the current blogging ecosystem. With a few modifications a distributed micro-blogging ecosystem can easily emerge.

When I say the differences are insignificant, I mean that while collectively they change the usage model significantly, examining them one-by-one shows how small the gap is between what we have today and what we need.

1. Length

Micro-blogs are, well, micro. They are shorter. This is not some marvelous invention; it is a simple, imposed limitation on the input field. Any publishing software today, from WordPress to Drupal, can be modified to force users to stick to 140 characters. Call it 'Micro-blogging mode'. I don't think this particular difference (or how to bridge it) warrants much more explanation.
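
As a rough sketch (the hook name and wiring here are hypothetical; a real WordPress or Drupal plugin would use that platform's own filter APIs), 'Micro-blogging mode' is little more than a length check before publish:

```python
MAX_LENGTH = 140  # the Twitter-style limit


def validate_micro_post(body: str) -> str:
    """Hypothetical pre-publish hook: reject posts over the micro limit.

    A real platform would surface this as a form validation error
    rather than an exception, but the rule itself is this simple.
    """
    if len(body) > MAX_LENGTH:
        raise ValueError(
            f"Post is {len(body)} characters; the limit is {MAX_LENGTH}."
        )
    return body
```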

2. Real-time

While blogs used to update rather slowly in a poll-based subscription model, micro-blogging has had a reputation for being 'faster' or 'real-time'. The old-school refresh rate of 15 minutes or more (the time between RSS refreshes) seems like an eternity these days.

Of course, the reality is that the Twitter API is still incapable of pushing updates to individual clients in real time, and the whole thing is far from truly real-time. Updates within seconds, however, are a key trait of micro-blogging.

The fact is, however, that blogs now have a method of pushing updates that's faster and more effective than even the Twitter API. It's an open standard called PubSubHubbub, and it's supported by Blogger, WordPress, Buzz and countless other smaller services.

Blogs are already real-time.
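
The subscription handshake is simple enough to sketch. The parameter names below follow the PubSubHubbub spec's subscribe request; the hub, feed and callback URLs are placeholders, and a real subscriber would POST this body to the hub and then answer the hub's verification callback:

```python
from urllib.parse import urlencode


def build_subscription_request(hub_url, topic_url, callback_url):
    """Build the form body a PubSubHubbub subscriber POSTs to a hub."""
    params = {
        "hub.mode": "subscribe",
        "hub.topic": topic_url,        # the feed you want pushed to you
        "hub.callback": callback_url,  # where the hub delivers updates
        "hub.verify": "async",         # hub confirms intent via callback
    }
    return hub_url, urlencode(params)
```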

3. Identified Subscriptions

One of the nice things that Twitter does that traditional blogging software does not is 'identified subscriptions'. That is, when you subscribe to (aka follow) a user, their name and face appear in your sidebar, and they get a nice little ego boost in the form of a notification email and a follower count.

Why couldn't we add a simple mechanism to PubSubHubbub so that when a client subscribes to push updates, it leaves behind some optional identifying information about the user, like their name and avatar? Or, instead of leaving the actual username/avatar, it might provide a URL to the subscribing user's own microblogging site, which has that metadata stored in its header.
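
To make the idea concrete, here's a sketch of what such a subscription might look like. The `subscriber.profile` parameter is NOT part of the real spec; it's purely illustrative of the kind of optional identity hint a subscriber could leave behind for the publisher to fetch name/avatar metadata from:

```python
from urllib.parse import urlencode


def build_identified_subscription(topic_url, callback_url, profile_url):
    """A standard PubSubHubbub subscribe request plus one hypothetical
    extension parameter pointing at the subscriber's own microblog."""
    params = {
        "hub.mode": "subscribe",
        "hub.topic": topic_url,
        "hub.callback": callback_url,
        # Not in the real spec -- an illustrative extension only.
        "subscriber.profile": profile_url,
    }
    return urlencode(params)
```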

4. Addressability

This is perhaps the most complicated difference/gap to close. With Twitter, you can easily say "Hey @chrissaad you are a crazy hippy" and I will get it in my message stream.

Blogs can't do that right?

Well, actually, blogs have been doing addressability since day 0. The same way the rest of the web does addressability - using Links. Bloggers frequently link to each other and then check their trackbacks and pingbacks for incoming references.

The only problem with this model is that it's not user-friendly enough. Mainstream users don't understand URLs, and checking pingback and referrer logs is just plain silly.

So rather than re-invent the wheel, why not just add rubber?

To make it easier for users, imagine if blogging software kept track of the users you were following (see point 3) and, when you typed '@', provided a list of suggested aliases to choose from. When you select the person you are addressing, the software could insert the alias and hyperlink the name to the URL of that user's microblogging site.
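
A sketch of how that might work (the data structure and function names are hypothetical; real software would pull the follow list from the identified subscriptions in point 3):

```python
# Followed users: alias -> URL of their microblogging site (point 3).
FOLLOWING = {
    "chrissaad": "http://chrissaad.com",
    "example": "http://example.com",
}


def suggest_aliases(prefix):
    """Aliases to offer once the user types '@' plus a few letters."""
    return sorted(a for a in FOLLOWING if a.startswith(prefix))


def insert_mention(alias):
    """Turn a chosen alias into a hyperlinked @mention."""
    url = FOLLOWING[alias]
    return f'<a href="{url}">@{alias}</a>'
```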

Clients could then subscribe to Google Blog Search (remember, blog search is essentially the blogging world's open firehose) and watch for any reference to your personal URL.
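
Mechanically, the client would just filter the search results down to posts that actually link to you. In this sketch the entries stand in for items parsed from a blog-search Atom feed; the dict shape is illustrative, not any real API's:

```python
def find_replies(entries, my_url):
    """Filter blog-search results down to posts that link to my_url.

    Each entry is assumed to be a dict with 'title' and 'content'
    (HTML) keys, as a feed parser might produce.
    """
    needle = f'href="{my_url}'
    return [e for e in entries if needle in e["content"]]
```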

The rest is just presentation tricks to show those replies mixed in with the rest of your microblogging items.

5. Clients

Why can't existing Twitter clients allow users to subscribe to PubSubHubbub-enabled RSS/Atom feeds? They could also subscribe to Google Blog Search for references to your own URL (for @replies). No need to rip out and replace Twitter; just offer an open alternative: subscribe to any site, anywhere.

Conclusion

As you can see, microblogging is (or could be) fundamentally the same as blogging in terms of the mechanics and technologies involved. The techniques used to build and improve the open blogosphere could be used to bootstrap a microblogging sphere as well.

Many have made big strides in this area, such as StatusNet. The opportunity now is for the (ex?) Twitter clients, blog publishing platforms and the standards groups to make small tweaks to extend the technology in the right way.

Missed opportunities in Publishing

Added on by Chris Saad.

MG Siegler over on TechCrunch yesterday wrote a story about how the AP is tweeting links to its stories. Those links, however, are not to its website. Instead, those Twitter links lead to Facebook copies of its stories! Here's a snippet of his post:

The AP is using their Twitter feed to tweet out their stories — nothing new there, obviously — but every single one of them links to the story on their Facebook Notes page. It’s not clear how long they’ve been doing this, but Search Engine Land’s Danny Sullivan noted the oddness of this, and how annoying it is, tonight. The AP obviously has a ton of media partners, and they could easily link to any of those, or even the story hosted on their own site. But no, instead they’re copying all these stories to their Facebook page and linking there for no apparent reason.

As Sullivan notes in a follow-up tweet, “i really miss when people had web sites they owned and pointed at. why lease your soul to facebook. or buzz. or whatever. master your domain.”

What’s really odd about this is the AP’s recent scuffle with Google over the hosting of AP content. The two sides appeared to reach some sort of deal earlier this month (after months of threats and actual pulled content), but now the AP is just hosting all this content on Facebook for the hell of it?

To me this isn't unusual at all. In fact, it's common practice amongst 'social media experts'. Many of us use/used tools like FriendFeed, Buzz, Facebook etc. not just to share links, but to actually host original content. We actively send all our traffic to these sites rather than using them to draw readers back to our own open blog/publishing platforms.

I completely agree with MG. Sending your audience to a closed destination site which provides you no brand control, monetization or cross-sell capability shows a profound misunderstanding of the economics of publishing.

Some will argue that the content should find the audience, and that readers should be free to read it wherever they like. Sure, I won't disagree with that, but actively generating it in a non-monetizable place and actively sending people there seems like a missed opportunity to me. Why not generate it on your blog and then simply share the links in other places? If those users choose to chat over there, that's fine, but the first, best place to view the content and observe the conversation should always be at the source, at YOUR source. YOUR site.

Some will argue that those platforms generate more engagement than a regular blog/site. They generate more engagement because your blog is not looked after. You're using inferior plugins and have not taken the time to consider how your blog can become a first-class social platform. You're willing to use tools that cannibalize your audience rather than attract it. You're willing to use your blog as a traffic funnel back to other destination sites by replacing big chunks of it with FriendFeed streams rather than hosting your own lifestream like Louis Gray and Leo Laporte have done.

Some will argue (or not, because they don't realize or don't want to say it out loud) that they are not journalists, they are personalities, and they go wherever their audience is. They don't monetize their content, they monetize the fact that they HAVE an audience by getting paying jobs that enable them to evangelize through any channel that they choose. Those people (and there are very few of them) have less incentive to consolidate their content sources (although there are still reasons to do so). Unfortunately, though, media properties sometimes get confused and think they can do the same thing.

The list of reasons why publishing stuff on Buzz or FriendFeed or Facebook as a source rather than an aggregator goes on and on, so I will just stop here.

I'm glad MG has picked up on it and written about it on Techcrunch.

#blogsareback

Update: Steve Rubel is agreeing with the AP's approach. Using all sorts of fancy words like Attention Spirals, Curating and Relationships, Steve is justifying the AP's ritual suicide of its destination site in favor of adding value, engagement and traffic to Facebook. Sorry Steve, but giving Facebook all your content and your traffic and getting nothing in return is called giving away the house.

Again, I'm not advocating that you lock content away behind paywalls, I'm simply saying that you need to own the source and make your site a first-class citizen on the social web. Not make Facebook the only game in town by handing it your audience.

Google Buzz = FriendFeed Reborn

Added on by Chris Saad.

FriendFeed was dead, now it is re-born as Google Buzz. I've not been able to try the product yet, but philosophically and architecturally it seems superior to FriendFeed.

Here are my observations so far:

Consumption Tools

Buzz is better than FriendFeed because Google is treating it as a consumption tool rather than a destination site (by placing it in Gmail rather than hosting it on a public page). FriendFeed should have always been treated this way. Some people got confused and started hosting public discussions on FriendFeed.

That being said, though, I've long said that news and sharing are not the same as an email inbox; those sorts of items should not be 'marked as read' but rather stream by in an ambient way.

While Buzz is in fact a stream, it is its own tab that you have to focus on rather than a sidebar you can ignore (at least as far as I can tell right now).

How it affects Publishers (and Echo)

The inevitable question of 'How does this affect Echo' has already come up on Twitter. Like FriendFeed before it, Buzz generates siloed conversations that do not get hosted at the source.

So, the publisher spends the time and money to create the content and Buzz/Google get the engagement/monetization inside Gmail.

For some reason, all these aggregators think that they need to create content to be of value. I disagree. I long for a pure aggregator that does not generate any of its own content such as comments, likes, shares etc.

That being said, however, the more places we have to engage with content the more reasons there are for Echo to exist so that publishers can re-assemble all that conversation and engagement back on their sites.

Synaptic Connections

Note that they don't have a 'Follow' button - it's using synaptic connections to determine who you care about. Very cool! I worry though that there might not be enough controls for the user to override the assumptions.

Open Standards

Already, Marshall is calling it the savior of open standards. I don't think Open Standards need to be saved - but they certainly have all the buzz words on their site so that's promising.

That's it for now, maybe more later when I've had a chance to play with it.

Update: After playing with it this morning, and reading a little more, it's clear that this is actually Jaiku reborn (not FriendFeed), because the Jaiku team were involved in building it. They deserve a lot of credit for inventing much of this stuff in the first place - long before FriendFeed.

Also, having used it for only an hour, the unread count on the Buzz tab is driving me nuts. It shouldn't be there. It's a stream, not an inbox. It also makes no sense that I can't display Buzz in a sidebar on the right side of my primary Gmail inbox view. That would be ideal.

It's also funny to me that some people have tried to give Chris Messina credit for Buzz even though he's been at Google for no more than a month. They clearly don't understand how long and hard it is to build product. Messina is good, but he ain't that good :)

Facebook and the future of News

Added on by Chris Saad.

Marshall Kirkpatrick has written a thoughtful piece over on Read/Write Web entitled 'Facebook and the future of Free Thought' in which he explains the hard facts about news consumption and the open subscription models that were supposed to create a more open playing field for niche voices. In it, he states that news consumption has barely changed in the last 10 years. RSS and Feed Readers drive very little traffic and most people still get their news from hand selected mainstream portals and destination sites (like MSN News and Yahoo news etc). In other words, mainstream users do not curate and consume niche subscriptions and are quite content to read what the mainstream sites feed them.

This is troubling news (pun intended) for those of us who believe that the democratization of publishing might open up the world to niche voices and personalized story-telling.

Marshall goes on to argue that Facebook might be our last hope. That since everyone spends all their time in Facebook already, that the service has an opportunity to popularize the notion of subscribing to news sources and thereby bring to life our collective vision of personalized news for the mainstream. Facebook already does a great deal of this with users getting large amounts of news and links from their friends as they share and comment on links.

Through my work with APML I have long dreamed of a world where users are able to view information through a highly personalized lens - a lens that allows them to see personally relevant news instead of just popular news (note that Popularity is a factor of personal relevancy, but it is not the only factor). That doesn't mean the news would be skewed to one persuasion (liberal or conservative for example) but rather to a specific topic or theme.

Could Facebook popularize personalized news? Should it? Do we really want a closed platform to dictate how the transports, formats and tools of next generation story-telling get built? If so, would we simply be moving the top-down command and control systems of network television and big media to another closed platform with its own limitations and restrictions?

Personalized news on closed platforms is almost as bad as mainstream news on closed platforms. News organizations and small niche publishers both need a way to reach their audience using open technologies or we are doomed to repeat the homogenized news environment of the last 2 decades. The one that failed to protect us from a war in Iraq, failed to innovate when it came to on-demand, and failed to allow each of us to customize and personalize our own news reading tools.

That's why technologies like RSS/Atom, PubSubHubbub and others are so important.

What's missing now is a presentation tool that makes these technologies sing for the mainstream.

So far, as an industry, we've failed to deliver on this promise. I don't have the answers for how we might succeed. But succeed we must.

Perhaps established tier 1 media sites have a role to play. Perhaps market forces that are driving them to cut costs and innovate will drive these properties to turn from purely creating mainstream news editorially toward a model where they curate and surface contributions from their readership and the wider web.

In other words, Tier 1 publishers are being transformed from content creators to content curators - and this could change the game.

In the race to open up and leverage social and real-time technologies, these media organizations are actually making way for the most effective democratization of niche news yet.

Niche, personalized news distributed by open news hubs born from the 'ashes' of old media.

Don't like the tools one hub gives you? Switch to another. The brands we all know and love have an opportunity to become powerful players in the news aggregation and consumption game. Will they respond in time?

Due to my experience working with Tier 1 publishers for Echo, I have high hopes for many of them to learn and adapt. But much more work still remains.

Learn more about how news organizations are practically turning into personalized news curation hubs over on the Echo Blog.

A call for focus from the open standards community

Added on by Chris Saad.
Time to refocus the open community

Over on the Open Web Foundation mailing list, Eran Hammer-Lahav (whom I respect greatly for his work in the development of open standards, despite his gruff and disagreeable personality) is effectively calling for a complete shakeup of the foundation and the work being poured into the 'common ground' of the standards efforts.

Let me define the 'Common Ground' as I see it.

Building strong common ground is like building strong open standards deep into the stack. Just like a software stack, our community needs a stack of organizations that are loosely coupled and open to participation. Groups like the W3C and IETF provide a rock-solid core, more agile groups focused on specific standards like OpenID and OAuth are in the middle, and a project like the DataPortability project was supposed to sit on top - a kind of user interface layer.

You see, good standards efforts are necessarily projects that work to solve one small problem well. The problems are often deep technical challenges that attract passionate and, let's face it, geeky people to hack, debate and decide on details that don't hit the radar for 99.9% of the population.

The problem, of course, is that the rest of the world has to care for a standard to matter.

Leaders and project managers need to be found, real companies need to get involved (not just their staff), collaboration platforms need to facilitate real and open discussion, calls for collaboration need to be heard, specs need to be written (and written well), libraries need to be written, governance needs to be put in place and so on.

Also, once the standard is (half) baked, less involved hackers need to participate to test the theories in the real world. Less savvy developers need to hear about the standard and understand it. Business people need to understand the value of using a standard over a proprietary solution. They also need IP protections in place to ensure that by using the standard they are not putting their company at risk. Marketing people need to know how to sell it to their customer base. Customers need to know how to look for and choose open solutions to create a market place that rewards openness.

All of this is 'Common Ground'. It is common to any standards effort, and there should - no, must - be an organization that is just as lean, mean and aggressive as Facebook in place to provide these resources if we are ever going to compete with closed solutions.

At the start of 2008 the DataPortability project became very popular. Its goal was not to build standards, but rather to promote them. To provide much of the common ground that I described above.

The DP project's particular mission, in my mind at least, was to focus on the marketing effort. To build a massive spotlight and to shine that intense light on the people, organizations and standards that were getting the job done.

Is the OWF providing a generic legal/IPR framework? Fantastic! It was the DPP's job to let everyone know - developers, business execs, media, potential editors, contributors and more. Our job was not, and should never be, to start the framework itself, but rather to advocate for, provide context around and promote the hell out of someone else's effort to do so.

Is a conference happening next year? Excellent. It was the DPP's job to get in touch with the conference organizer, organize not just a DP panel but a DP track, and to create room (and perhaps even a narrative) inside which the people doing the actual work can speak.

Has Facebook just announced a new feature that could have been achieved through a combination of existing open standards? Then it is the DPP's job to consult with each of those standards groups and create a cohesive response/set of quotes for the media to use.

What is the relationship Facebook Platform, OpenSocial, Open Standards, OpenID, OAuth, Portable Contacts and Twitter's 'Open API'? DataPortability.org should have the answer neatly described on its website.

Unfortunately, though, many in the standards community chose to fight the creation of the project for whatever reasons crossed their minds at the time. They used all sorts of methods to undermine the effort - some that would put Fox News to shame.

The result, of course, has been a diversion from the important work of providing this common ground to the standards community toward a self-protective state: creating governance and our own 'deliverables' in order to justify and protect our own existence.

I have, as a result of a series of unfortunate events, fallen out of touch with the Steering group at the DPP. Moving to the US, getting disillusioned with the community I admired (not those involved with DPP. My friends at the DPP Steering group have always performed very admirably and worked extremely hard) and ultimately shifting my world view to realize that the best contribution I can make - the best way to really move the needle - is to ship Data Portability compliant software at scale.

At this juncture, however, I think it's time for us all to refocus on our original mission for the DataPortability Project.

To restate my humble view on the matter:

  • To provide a website that explains data portability to various audiences in neat and concise ways. It is the on-ramp to the standards community: you should be able to send anyone to 'dataportability.org' and have them 'get it' and know what to do next.
  • To provide context and advocacy on news and developments from inside and outside the standards community so that media, execs and less involved developers can understand and react.
  • To build a community of interested parties so that they can swarm to the aid of standards groups or the standards effort in general.
  • To act as a market force to (yes I'm going to say it) pick winners. To highlight what works, what doesn't and what should be done next to move the whole effort forward. Nothing is as powerful as removing confusion and planting a big red flag on the answer.
  • To recognize that we have the authority to do whatever we want to do because we are an independent, private group who has chosen to create public/transparent processes. We need to believe in ourselves. If we do good work, then people will listen. If we don't, then they can listen to someone else.

This necessarily means that the only real deliverable from the project would be a small set of communication tools that build community, context and advocacy around what we believe is the 'truth' (or at least things worth paying attention to) in the broader standards community.

Many have scoffed at these goals in the past, claiming that there was no 'value'. In my book this set of goals is not only very worthy, it is increasingly critical to the success and health of the web.

Facebook privacy changes are not evil

Added on by Chris Saad.

I give Facebook a lot of crap. But I don't think their latest privacy changes are all that nefarious. It's pretty obvious what they are doing. They want search inventory to sell to Google and Microsoft. They want to be as cool as Twitter.

I think the more important story is that they are turning their square into a triangle.

A well placed friend of mine (who shall remain nameless) gave me this metaphor (I will try not to butcher it too much).

Twitter is like a triangle: a small group of people (at the top) broadcasting to a large group of people down at the bottom.

Facebook is/was more like a square. Everyone communicating more or less as equal peers (at least on their own personal profile pages).

This is very rare on the internet. It's rare anywhere really. It's unusual to have a platform that encourages so much 'public' peer-2-peer participation.

It's clear, however, that Facebook is trying to have its cake and eat it too. They want to be a triangle for those who want one, and a square for those who want one of those.

Will it work? Maybe. They are a 'Social Utility' after all. They have never thought of themselves as a vertical social network with a static social contract. As I've said before, their ability to change and evolve at scale is beyond impressive. It has never been seen before.

From College kid profile pages, to app platform, to stream platform, to stream platform with deep identity and routing. Their flexibility, rate of change and reinvention is staggering. They put Madonna and Michael Jackson to shame.

Ultimately Facebook wants to be the Microsoft Outlook and Google Adsense of the Social Web all rolled into one. Maybe throw some PayPal in for good measure.

To do this I think you will see them continue to provide square or triangle options for their users (with their own personal bias towards triangles) and deprecate legacy parts of their system like canvas pages and groups.

Ultimately, though, the real opportunity is to look beyond the public vs. private debate and observe the 'Multiple Publics' that Danah Boyd and Kevin Marks speak about. But that's a post for another day.

Is this good or bad for us? I'm not sure it matters. It's another big bet for the company though, and it was a necessary step to clean up the half steps that resulted in privacy setting hell on the service so far.

A failure of Imagination and Conviction

Added on by Chris Saad.

As you might know if you follow my work even remotely, my projects almost always come from a place of philosophical supposition. That is, I first create a model that I think matches the current and emerging state of the world, and then I create a product, project, format or other that works inside, encourages or commercializes that model. Many of my colleagues at JS-Kit do the same thing. Khris Loux and I, for example, spend hours and hours discussing our shared world views and how this translates to features, business direction and general life goals.

This methodology allows us to couch our decisions in well thought out mental models to make them more consistent, predictable and, we hope, more effective.

Over the years, and with my friends, I've proposed a number of these philosophical models including APML, DataPortability and most recently (this time working with Khris) SynapticWeb.

One of the hardest aspects of creating a philosophical model, however, is truly letting it guide you. To trust it. To take its premise to the logical conclusion. Another challenge is explaining this methodology (and the value of the resulting outcomes) to others who a) don't think this way and b) have not taken the time to examine and live the model more fully.

Many times, the choices and decisions that I/we make from these models are nuanced, but the sum of their parts, we believe, is significant.

Let me give some concrete examples.

Social Media

There is an ongoing tension between the value of social/user-generated media and the media produced by 'Journalists'. Sure, social media is amazing, some say, but bloggers will never replace the role of Journalists.

The fact of the matter is, if your philosophical world view is that Social Media is important, that it is a return to one-to-one personal storytelling and that it allows those in the know - involved in the action - to report their first-hand accounts, then you must necessarily expand your imagination and have the conviction to follow that line of logic all the way to the end.

If you do, you must necessarily discover that the distinction between Journalists and 'Us' as social media participants (all of us) is authority, perspective, distribution and an attempt at impartiality.

In the end, however, we are each human beings (yes, even the journalists). Journalists are imbued with authority because a trusted news brand vets and pays them; they are given the gift of perspective because they sit above the news and are not part of it; they have distribution because their media outlet prints millions of pieces of paper or reaches into the cable set-top boxes of millions of homes; and their impartiality is a lie.

Can't these traits be replicated in social media? Of course they can.

Reputation can be algorithmically determined or revealed through light research/aggregation; perspective can be factored in by intelligent human beings or machines that find both sides of a story; distribution is clearly a solved problem through platforms like Twitter, Digg and others; and impartiality is still a lie. At least in social media bias is revealed, and transparency is the new impartiality.

I don't mean to provide an exhaustive reasoning on why Social Media as a philosophical framework holds up as a new paradigm for news gathering and reporting here - only to give an example of how we must allow ourselves to imagine outside the box and have the conviction to fully believe in our own assumptions.

Streams

The same type of artificial mental barriers has appeared at every step of the way with each of the philosophical frameworks in which I have participated. Streams is the most recent.

When we launched Echo we proposed that any conversation anywhere, irrespective of the mode or channel in which it was taking place, had the potential to be a first class part of the canonical and re-assembled lifestream of a piece of content.

Many pushed back. "Oh a Tweet can't possibly be as valuable as a comment" they lamented. They're wrong.

A Tweet, an @ Reply, a Digg, a Digg Comment, a Facebook Status Update, a Facebook Comment, an 'on page' comment and any other form of reaction each have just as much potential for value as the other.

Some have created artificial distinctions between them. They separate the stream into 'Comments' and 'Social Reactions'. I have news for everyone: a comment is a social reaction. Thinking of it as anything less is a failure of imagination and conviction. The trick is not a brute force separation of the two, but rather a nuanced set of rules that help diminish the noise and highlight the signal - wherever it might be - from any mode or channel. We've started that process in Echo with a feature we call 'Whirlpools'.
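A nuanced, rules-based approach like the one described above might look something like this rough sketch. To be clear: the scoring rules, field names and threshold here are all illustrative, not Echo's actual 'Whirlpools' logic.

```python
# Score every reaction in one unified stream, whatever its channel,
# instead of hard-separating 'Comments' from 'Social Reactions'.
def signal_score(item):
    score = len(item["text"].split())  # longer reactions tend to carry more signal
    if item.get("in_reply_to"):
        score += 5                     # replies indicate an actual conversation
    return score

def unified_stream(items, min_score=3):
    """Keep the high-signal reactions from any mode or channel."""
    keep = [i for i in items if signal_score(i) >= min_score]
    return sorted(keep, key=signal_score, reverse=True)
```

The point is that a Tweet, a Digg and an on-page comment all flow through the same scoring rules; nothing is demoted just because of the channel it arrived on.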

Communities

Another interesting failure of imagination that I come up against a lot lately is the notion of community building.

With Echo, we have taken the philosophical position that users already have a social network - many have too many of them in fact. There is no reason for them to join yet another network just to comment. Not ours, not our publisher's.

No, instead they should be able to bring their social network with them, participate with the content on a publisher's website, share with their existing friends on existing social networks, and leave just as easily.

By using Echo, you are not joining 'our community'. You already have a community. If anything you are participating in the Publisher's community - not ours.

We don't welcome new customers to 'Our community'. Instead we help their users bring their community to a piece of content, interact, share and leave.

Publishers invest large quantities of capital in producing high quality content only to have the engagement and monetization opportunities occur on Social Networks. In these tough economic times, publishers cannot afford to bleed their audience and SEO to yet another social network just to facilitate commenting. That is the opposite of the effect they are trying to achieve by adding rich commenting in the first place.

If we use our imagination, and have the conviction to see our ideas through, we realize that publishers need tools that encourage on-site engagement and re-assemble offsite reactions as well - not bolster the branded 3rd party communities of the products they use.

Be Brave

In summation - be brave. Observe the world, define a philosophical framework, imagine the possibilities and have the conviction to follow through on your ideas. Stop being lazy. Stop stopping short of taking your impulses to their logical conclusions. I've found that when you consistently execute on your vision, it might be a little harder to sell your point of differentiation - but your contributions will ultimately be better, more consistent and longer lasting for your company, the web and the rest of the world.

Redefining Open

Added on by Chris Saad.

In my mind, there are four kinds of open.

  • Torvalds Open.
  • Zuckerberg Open.
  • Not Open but we use the word Open anyway.
  • Saad Open.

This fragmentation has diluted the word open to the point where it almost has no value.

It's time to re-define the word open. First let me explain each category.

Torvalds Open.

In Linus Torvalds' world (the guy who created Linux), Open means that the software is developed through a community process. The source code is visible and modifiable by anyone and is available for free.

This is called 'Open Source'.

Companies may package and bundle the software in new and novel ways, and provide support and services on top for a fee.

The problem with Open Source on the web is that the software itself has less value than the network effects and up-time provided by a branded, hosted experience. Running Twitter.com on open source software, for example, would have very little value because Twitter's lock-in is not their software, but rather their name space (@chrissaad) and their developer ecosystem all developing software with dependencies on their proprietary API.

Open Source is useful, interesting and important, but it is not what I mean when I talk about the Open Web. Its value is well understood, but it is no longer the first, best way of making our world (and the Internet) a better place - at least not in the way it was when client-side software was the primary way we used computers.

Zuckerberg Open.

When Mark Zuckerberg talks about open, he is not talking about Technology. He is talking about human interactions.

Ever since the popularity of Data Portability (via the DataPortability project) Facebook has gone to great lengths to redefine the word Open to mean the way people interact with each other.

In doing so, they have managed to, in large part, co-opt the word and claim their platform makes people 'more open'.

In many respects, and by their definition, they are right. Facebook has encouraged a mind bending number of people to connect and share with each other in ways that had been previously reserved for bloggers and other social media 'experts'.

Facebook deserves a lot of credit for introducing social networking to the masses.

Their definition of Open, however important, is not the kind I'm talking about either.

Not Open but we use the word Open anyway.

This is when a platform or product has an API and its owners therefore claim that they have an 'Open Platform'.

There's nothing open about having an API. It's just having an API. The platform could be closed or open depending on how the given application and API is built and what limitations are placed upon it.

In most cases, an 'Open Platform' is not actually open, it's just a platform.

Saad Open

My definition of open is very specific. In fact a better way to describe it would be Interoperable and Distributed.

To explain, let me provide some compare and contrast examples.

Twitter is closed because it owns a proprietary namespace (e.g. @chrissaad). The only way to address people is using their username system. They own those usernames and have final authority over what to do with them.

They are closed because they do not provide free and clear access to their data without rate limiting that access or cutting deals for improved quality of services.

They are also closed because they are not a federated system. You cannot start your own Twitter-style tool and communicate with users on Twitter or vice versa. The only way to message people on Twitter is to use Twitter's proprietary APIs for submitting and retrieving data.

A proprietary API is an API that is special to a company and/or produces data that is not in an open standard.

Wordpress, on the other hand (and to contrast), is an open system. Let's compare point for point.

It does not own the namespace on which it is developed. The namespaces are standard URLs. This blog, for example, is hosted at blog.areyoupayingattention.com. Wordpress does not own that domain.

Wordpress produces a single type of data - blog posts. Those blog posts are accessible using an open standard - RSS or Atom. There is no rate limit on accessing that data.

Wordpress is a federated system. While they provide a hosted solution at Wordpress.com for convenience, there is nothing stopping me from switching to Blogger or Tumblr. The tools that you would use to consume my blog would remain unchanged, and the programmers who make those tools would not need to program defensively against Wordpress' API. They simply need to be given the URL of my RSS feed and they are good to go.
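This interchangeability is easy to see in code. A minimal sketch (the feed content below is illustrative): any consumer that speaks RSS can read any blog, no matter which software produced it - swap Wordpress for Blogger or Tumblr and only the URL changes.

```python
import xml.etree.ElementTree as ET

# An illustrative RSS 2.0 document, standing in for any blog's feed.
RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First post</title><link>http://blog.example.com/1</link></item>
  <item><title>Second post</title><link>http://blog.example.com/2</link></item>
</channel></rss>"""

def post_titles(rss_xml):
    """Parse an RSS 2.0 document and return the post titles.

    RSS 2.0 puts each post in an <item> element with a <title> child,
    so this works for any compliant feed regardless of the software
    that generated it.
    """
    root = ET.fromstring(rss_xml)
    return [item.findtext("title") for item in root.iter("item")]
```

No defensive programming against a proprietary API is needed; the open standard is the contract.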

This makes Wordpress an open tool in the open blogosphere.

Blogging is open.

Microblogging should be open too.

To summarize. Open, in my definition, does not mean the software is open source or free. It means that the software receives open standards data, provides open standards data, has an interoperable API and can easily be switched out for other software.

Today I was challenged on Twitter that Echo is not 'Open' because it is proprietary code and costs money to use.

This person does not understand my definition of Open. Echo is open because it is not a destination site, it sits on any site anywhere. The owner of that site can take it off and replace it with another engagement tool at any time. The data being absorbed by Echo, for the most part, is RSS or Atom, and the data coming out of Echo is RSS.

It does not have any proprietary namespaces (except our useless legacy login system which we are trying to get rid of as quickly as possible) and does not pretend to create some amazing social network of its own. It is just a tool to communicate on the open, social web.

Is Echo perfect? No, of course not, but our intention is to make each and every aspect of the product as interoperable and distributed as possible. We will even use and contribute to open source where appropriate.

How does your product, or the tools you choose, compare? Tell me in the comments.

Next up, we should start to redefine the 'Open' community that creates open standards. Much of it is not very open.

Merry Christmas - The power of memes

Added on by Chris Saad.

Many, many of the things in our lives could be called 'Memes'.  Here's what happens when you type 'Define:meme' into Google.

Memes are everywhere. We just experienced a country wide meme here in the US called 'Thanksgiving'. We are about to hit a similar meme (except this one is global) called 'Christmas'.

Memes are fascinating things. They are almost as important as Context, Perspective and Metaphors. Together these four things compose the great majority of our thought processes.

What is this like (metaphor)? What else is going on (context)? What does everyone else think (meme)? What does my experience and current state of mind tell me (perspective)?

Some memes emerge organically over time - like folding the end of hotel toilet paper into a little triangle. Others are created through brute force by strategic construction and repetition. No one has mastered this better than the extreme right wing of the US political system. Fox News is a bright shining example of how to craft, seed, propagate and manipulate a meme.

Silicon Valley loves a meme. We live on them. In fact one could argue that the whole ecosystem would shut down without the meme of the day, week and bubble.

.Com, Web 2.0, Data Portability, Real-time web, RSS is dead, Myspace, Facebook, Twitter, Cloud, Semantic Web, Synaptic Web and so on and so forth.

Like in real life, some of these memes emerge organically, some through brute force. Some make more sense than others. Some of these memes get undue attention. Some are created to stir controversy. Others form organically to create a shorthand. Some are genuine cultural shifts that have been observed and documented.

These memes matter. They matter a lot. They dictate a large part of how people act, what they pay attention to and their assumptions about the world in which they live, and the people they encounter. In Silicon Valley they dictate who gets heard and which projects get funded. They form the basis of many of our decisions.

Some services like Techmeme do a very good job at capturing daily memes. I've yet to see a service that captures memes that span weeks, months, years or even decades though. I dream of such a service. Particularly one focused on news memes.

Imagine being able to zoom in and out of the news, and drag the timeline back and forth like some kind of Google Maps for headlines. Imagine being able to read about an IED explosion in Baghdad and quickly understand its context in the decade-long struggle for the entire region through some kind of clustered headline/topic view.

Consider the context, perspective and metaphoric power such a tool would give us. It could change our world view and help turn the temporary, vacuous nature of a microblog update into something far more substantial and impactful, with an inline summary of the rich historic narrative inside which it belongs.

The algorithm to create such correlations and the user interface to present it would challenge even the smartest mathematicians and user interaction designers, I imagine. Its commercial value is vague at best. It probably shouldn't be attached to a business at all - maybe it should be some kind of Wikipedia-style gift to the world.

Maybe the news media - Reuters, CNN and the Washington Post - might take it upon themselves to sponsor such a project in an effort to re-contextualize their news archives in the new AAADD, real-time, now, now, now, everyone-is-a-journalist media world.

I've bought some domains and done some mockups of such a service, but I probably would never have the time or the patience to build it - at least not in the foreseeable future.

Maybe I'm just dreaming. But I think it's a good dream!

Calling for open

Added on by Chris Saad.

Steve Gillmor often writes fantastic (and fantastically long) editorials on the landscape of the real-time web, but they are often very dense and sometimes fail to cover some key points. I thought I would take the liberty of translating and correcting his latest post with my own contributions.

Ever since FriendFeed was sold to Facebook, we’ve been told over and over again that the company and its community were toast. And as if to underline the fact, FriendFeed’s access to the Twitter firehose was terminated and vaguely replaced with a slow version that is currently delivering Twitter posts between 20 minutes and two hours after their appearance on Twitter. At the Realtime CrunchUp, Bret Taylor confirmed this was not a technical but rather a legal issue. Put simply, Twitter is choking FriendFeed to death.

Translation: The FriendFeed team were absorbed by way of acquisition. Twitter has terminated their priority access to Twitter data because FriendFeed is now owned by Twitter's primary competitor.

Correction: Of course Twitter turned them off. Facebook is Twitter's self-declared number one competitor. When you own the platform and the protocol you have every right to protect your own arse. In fact they have an obligation to their shareholders and investors.

What’s odd about this is that most observers consider FriendFeed a failure, too complicated and user-unfriendly to compete with Twitter or Facebook. If Twitter believed that to be the case, why would they endeavor to kill it? And if it were not a failure? Then Twitter is trying to kill it for a good reason. That reason: FriendFeed exposes the impossible task of owning all access to its user’s data. Does Microsoft or Google or IBM own your email? Does Gmail apply rate limiting to POP3 and IMAP?

Translation: Most commentators think that FriendFeed is dead because the founders have been bought by and buried inside Facebook. If FriendFeed is so dead, why is Twitter trying to choke it?

Correction: FriendFeed is clearly dead. If you have ever worked for a startup and tried to ship a running product you know that focus is the only thing that will keep you alive. Facebook is a massive platform serving a scale of social interaction that has only been previously seen by distributed systems like email. The last thing Facebook wants is for its newly acquired superstar team to waste time working on a platform that no longer matters to their commercial success or the bulk of their users (i.e. FriendFeed).

Twitter is choking FriendFeed for another reason - because its systems are now essentially just a proxy to Facebook. As stated above, Twitter cannot give its number one competitor priority access to one of its major assets (i.e. timely access to the data).

The data over which Microsoft and Google do not exercise hoarding tactics (the examples Steve gave were IMAP and POP3) flows through open standards using open protocols.

I am never sure about Steve's position on open standards; he often vacillates between championing the open cause through projects like the Attention Trust and claiming things like APML and DataPortability are bullshit - maybe he just doesn't like me (that can't be right, can it Steve?).

The fact is, however, that open standards and protocols are the basis for open systems, which is why companies like Microsoft and Google do not control your email. Twitter and Facebook are not open systems.

So the reason Twitter is killing FriendFeed is because they think they can get away with it. And they will, as far as it goes, as long as the third party vendors orbiting Twitter validate the idea that Twitter owns the data. That, of course, means Facebook has to go along with it. Playing ball with Twitter command and control doesn’t make sense unless Facebook likes the idea of doing the same thing with “their” own stream. Well, maybe so. That leaves two obvious alternatives.

The first is Google Wave, which offers much of the realtime conversational technology FriendFeed rebooted around, minus a way of deploying this stream publicly. The Wave team seems to be somewhat adrift in the conversion of private Waves to public streams, running into scaling issues with Wave bots that don’t seem to effectively handle a publishing process (if I understood the recent briefing correctly.) But if Waves can gain traction around events and become integrated with Gmail as Paul Buchheit recently predicted, then an enterprising Wave developer might write a bot that captures Tweets as they are entered or received by Twitter and siphons them into the Wave repository in near realtime.

Translation: Twitter is killing FriendFeed because they think no one will notice or care enough to stop them - Twitter has more than enough momentum and support to continue along its current path. Facebook won't cry foul because they are doing the same hoarding technique with their own data.

Maybe Google Wave might save the day, but they seem to have lost their way.

Correction: Actually the only people who can call bullshit on Twitter and Facebook are us, the media. We are all media, after all. Steve Gillmor, in fact, is one of the loudest voices - he should call bullshit on closed systems in general. Instead we all seem to be betting on one closed system to do better than another closed system.

We are like abused wives going back for more, each time pretending that our husbands love us. Guess what, they don't love us. They love their IPO.

I was the first to support Google Wave very loudly and proudly. I met with the team and was among the first to get in and play with the preview. It is a revolution in collaboration and how to launch a new open system. It is not, however, a Twitter or Facebook competitor. Especially not in its current state. It is not even a replacement to email. It is simply the best damned wiki product ever created.

Waves are the 180° opposite of FriendFeed and Facebook or even Twitter. They are open, flexible and lacking any structure whatsoever. Their current container, the Google Wave client, however, is totally sub-optimal for a messaging metaphor, much less a many-to-many passive social platform. It is a document development platform. Nothing more.

The same could be true of Microsoft’s deal for the firehose, but here, as with Google, Twitter may not want to risk flaunting ownership of a stream that can so easily be cloned for its enterprise value. And as easily as you can say RSS is dead, Salesforce Chatter enters the picture. Here’s one player Twitter can’t just laugh off. First of all, it’s not Twitter but Facebook Benioff is cloning, and a future Facebook at that, one where the Everyone status will be built out as a (pardon the expression) public option. This free cross-Web Chatter stream will challenge Facebook’s transitional issues from private to public, given that Salesforce’s cloud can immediately scale up to the allegedly onerous task of providing personalized Track on demand.

Translation: Maybe the enterprise players - specifically Salesforce's Chatter - will save the day.

Correction: Doubtful. This is just another closed system for a specific vertical. It's long overdue. It is awesome. But it is not a Facebook or Twitter competitor, much less an open alternative to the proprietary messaging systems we keep flocking to. It is simply a long overdue expansion of the simple changelog tracking feature on ERP assets. It's a simple feature that was prompted by a simple question: "Why doesn't the asset changelog include more data - including social data?" Duh. I was doing this in my own web-based CRM at the start of the decade.

It’s likely this pressure can be turned to good use by Facebook, unencumbered as they are by any licensing deal with Twitter. Instead, a Chatter alliance with the Facebook Everyone cloud puts Salesforce in the interesting position of managing a public stream with Google Apps support, which eventually could mean Wave integration. Where this might break first is in media publishing, as Benioff noted at the CrunchUp. Twitter’s leverage over its third party developers could be diluted significantly once Salesforce offers monetization paths for its Force.com developers. So much so that this may call Twitter’s bluff with FriendFeed.

Translation: No idea

But FriendFeed has always been more of a tactical takedown of Twitter than an actual competitor, a stalking horse for just the kind of attack Twitter seems most afraid of. No wonder the speed with which Twitter is introducing metadata traps to lock down the IP before a significant cloud emerges to challenge its inevitability. Lists, retweets, location — they’re all based on raising the rate limiting hammer to discourage heading for the exits. It’s not that retweets reduce the functionality of the trail of overlapping social circles, it’s that they lock them behind the Wall.

Translation: Twitter is introducing more metadata into tweets to maintain its lock in through API limits etc.

Correction: On this point Steve is partially correct. This isn't about rate limiting though - it's about turning Twitter's proprietary protocol into a real-time transport for all the data the web has to offer. It is not about API limits but rather cramming so much value into the pipe that the pipe becomes like water - you gotta drink from it or you're going to die.

I don’t expect anyone from Twitter to answer the simple question of when will Twitter give FriendFeed the same access they provide other third party client vendors. For now, it’s frustrating to not see the flow of Twitter messages in realtime, but over time we’ll build tools on top of FriendFeed to take such embargoed messages private. Once inside FriendFeed, the realtime conversations that result are just the kind of high value threads Chatter will support, Wave will accelerate, and Silverlight will transport. Keep up the good work, Twitter.

Translation: I doubt Twitter will play nice with FriendFeed and give them equal access again, because once items are inside FriendFeed they turn into rich conversations. Conversations that Chatter will support, Wave will accelerate and Silverlight will transport.

Correction: Actually Twitter does not and has never given fair and equal access to its data. FriendFeed had a moment in the sun with first class access the likes of which almost no one else has seen before or since.

I have no idea how Chatter fits into the B2C picture - it is clearly an Enterprise play for Salesforce. Wave indeed will act as a great interface through which to participate in real-time threads. The threads themselves, however, will need to be generated or framed by much more rigid systems designed for public discussion.

Silverlight is great for rich web apps. It is Microsoft's way of bringing the richness of the client into the browser. Just like .NET is to Java, Silverlight is to Flash: a way for Microsoft to leverage a key technology component without handing the crown to someone/something it doesn't control. But I'm not sure it fits into this discussion.

In the end, the only real solution for all of this, of course, is a return to the way the web has always worked (well): open systems. The transport should not be Twitter, Facebook, FriendFeed, Wave or any other nonsense. It should be RSS and Atom (ActivityStrea.ms specifically) transported over PubSubHubbub and read by open standards aggregators. The namespaces should be OpenID based and adoptable by all.
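To sketch what that open stack looks like in practice: a PubSubHubbub-capable feed simply advertises its hub and its canonical ('self') URL as ordinary Atom links, and any subscriber can discover them and ask the hub for real-time pushes. The feed and URLs below are illustrative.

```python
import xml.etree.ElementTree as ET

# An illustrative Atom feed advertising a PubSubHubbub hub.
ATOM = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Stream</title>
  <link rel="hub" href="http://hub.example.com/"/>
  <link rel="self" href="http://blog.example.com/feed.atom"/>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom"}

def discover(feed_xml):
    """Return (hub_url, topic_url) from an Atom feed's link elements.

    A subscriber POSTs a subscription request for topic_url to hub_url;
    from then on, new entries are pushed instead of polled.
    """
    feed = ET.fromstring(feed_xml)
    links = {l.get("rel"): l.get("href") for l in feed.findall("atom:link", NS)}
    return links.get("hub"), links.get("self")
```

No proprietary API is involved at any step: the feed is standard Atom, and discovery is just reading two links.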

The sooner the early adopter community realizes this, the commentators push for this and the developers code for this, the better off we will all be.

Disclosure: I work for JS-Kit, creators of Echo - one of the largest providers of Real-time streams. I also Tweet - trying to find an alternative though!

Twitter Lists and Tags

Added on by Chris Saad.

In my previous post (written 5 minutes ago) I talk about Twitter Lists in relation to shared namespaces (Hint: They are not in a shared namespace). Another under-reported fact, however, is that lists are also Tags. They are a great way for Twitter to learn how Twitter users are perceived and grouped (As a side note, they are also great for people to see how other people perceive them - one of my favorite lists in which I am listed: @chadcat/unreasonably-talented haha).

One could easily imagine an algorithm that determines accurate APML data about each user, not just by looking at their Tweet history, but also by checking their Bios and the Tweet History/Bios of the people they are listed with. The list name itself, in fact, is a very concentrated form of topic/tag data.
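The simplest version of that idea can be sketched in a few lines: treat the names of the lists a user appears on as tags, and rank them by frequency to build a rough topic profile. The list names below are illustrative, and this is a toy sketch of the concept, not a real APML generator.

```python
from collections import Counter

def topic_profile(list_names):
    """Split list names into tokens and rank them as rough topic tags."""
    tokens = []
    for name in list_names:
        # Normalize 'tech-bloggers' style names into individual words,
        # dropping very short connector words.
        tokens.extend(t for t in name.lower().replace("-", " ").split() if len(t) > 2)
    return Counter(tokens).most_common()
```

Feeding in the lists a user belongs to surfaces the concentrated topic signal the post describes: the tags other people have already chosen for that user.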

Do lists double as Twitter's user tagging feature?

Who will be the first to ship an automated user discovery directory based on analyzing the relationship between users who are on the same lists?

I hope MrTweet is already working on this!

Twitter Lists and Namespaces

Added on by Chris Saad.

A very important fact that seems to be getting little to no coverage at the moment about Twitter Lists is the issue of namespaces. Twitter's number one asset is its control and allocation of namespaces. Those little things we call 'Usernames'. @chrissaad is not just my Twitter Name, it is a short form addressable identity that concretely links to my Twitter inbox any time someone uses it in a Tweet.

Addressable, convenient namespaces that can be used in a sentence like this are so interesting and important that Facebook went to great lengths to copy them. Nothing on the open web has yet come close to this simplicity and effectiveness. Which is not to say there won't be an alternative soon.

The important fact with Twitter usernames, though, is that they are unique. There is a finite and shared 'space' in which 'names' can be allocated.

The result is that early adopters end up with all the best names and squatters rush to lock up all the best phrases. Latecomers to the system end up with names like chris2423.

Twitter Lists, however, are different. They include the list creator's username. For example, my JS-Kit list is "@ChrisSaad/jskit".

As you can see, the list 'jskit' is attached to my username. This means that each user has their own namespace.

The result: there can't be a land rush for list names because the list naming convention sits on top of the username. It also means that no one can own a definitive list on a subject, because each list is subjective.
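The design difference can be sketched in a few lines (the data here is purely illustrative): usernames live in one flat, global map where names collide, while lists are keyed by creator as well as name, so every user gets their own space.

```python
# Global username namespace: only one '@chrissaad' can ever exist,
# so early adopters and squatters win the land rush.
usernames = {"chrissaad": "user-1"}

# List namespace keyed by (creator, list name): the same list name
# can exist under every user without any collision or land rush.
lists = {
    ("chrissaad", "jskit"): ["colleague-a"],    # @ChrisSaad/jskit
    ("someoneelse", "jskit"): ["colleague-b"],  # same name, different owner
}
```

This is why a second "jskit" list costs Twitter nothing, while a second "chrissaad" username is impossible.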

This is an important design decision for Twitter. One that has both pros and cons for the community. Overall, however, I think the decision was a correct one. Lists can rise and fall organically (or at least based on the influence and popularity of their creators) without the pain and pressure (for Twitter) of maintaining yet another shared namespace.

Twitter's username namespace, however, is rife with headaches waiting to happen. I don't envy their position and I can't wait for an open alternative.

Stalqer - Viral Loops and Network Effects

Added on by Chris Saad.

Today a company I am advising has launched in the press and will soon be available in the Apple App Store. They are called Stalqer and, as TechCrunch writes, they are basically Foursquare on steroids.

I think that's a pretty good description. The fact is, however, the most impressive thing about Stalqer is not what it does but how it does it. Rather than approaching acquisition and retention of users like any typical app, it uses data portability, viral loops and network effects to on-board and engage users on an ongoing basis.

Not enough app developers consider this when engineering their user experiences and the result is usually a big 'TechCrunch' launch and a big flame-out as users flock for a 5-minute road test and never return.

Mick (CEO of Stalqer) and his team, however, have almost turned virality and network effects into a science.

Here are some of the highlights of their product decisions.

  1. Instead of building yet another registration and friending system, they simply import your Facebook Friends.
  2. Instead of being content to be confined by Facebook's data licensing limitations, they merge and mingle FB data with other data sources (in this case, your phone's address book!) to access email addresses and phone numbers.
  3. Instead of assuming that their app lives in a vacuum, they are using other data sources (Facebook, Phone Book and eventually others) to aggregate location data and make a best guess at friend locations even if they aren't using the app.
  4. Instead of being limited by their active user base, they encourage existing users to manipulate and optimize profiles of non-users - the effect being that even if you don't use Stalqer, chances are one of your friends is doing the work of checking you in. Don't like where they put you? Then sign up and take back control!
  5. Instead of letting the multitasking limitations of the iPhone limit their background tracking capabilities, they innovated their way out of the problem using amazing email tricks.
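Decision #2 above - merging Facebook's friend graph with the phone's address book - can be sketched as a simple join on contact name. All records, field names and helper names here are invented for illustration:

```python
# Hypothetical sample data: a Facebook friend list (no emails or phone
# numbers, per Facebook's licensing limits) and a phone address book.
fb_friends = [
    {"name": "Alice Example", "fb_id": "fb_1001"},
    {"name": "Bob Example", "fb_id": "fb_1002"},
]

address_book = [
    {"name": "Alice Example", "email": "alice@example.com", "phone": "555-0101"},
]

def merge_contacts(fb_friends, address_book):
    """Join the two sources on normalized name; keep fields from both."""
    by_name = {c["name"].lower(): c for c in address_book}
    merged = []
    for friend in fb_friends:
        contact = dict(friend)
        # Enrich with email/phone when the address book has a match.
        contact.update(by_name.get(friend["name"].lower(), {}))
        merged.append(contact)
    return merged

for contact in merge_contacts(fb_friends, address_book):
    print(contact)
```

Real-world matching is messier (nicknames, duplicates, multiple numbers), but the principle - one source supplies the graph, the other supplies the contact details - is the same.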

The list of innovations goes on and on.

The Stalqer team have done an amazing job of baking in the right workflows to ensure maximum adoption and engagement based on their primary use case (discovering people around you) without resorting to raw gaming tricks like points and badges.

I can't wait to see how the app performs and what they do next!

As a side note, I too have been experimenting with non-obvious network effects in my day-job. More on that later...

You get what you deserve

Added on by Chris Saad.

Lately a number of my friends seem to be having great wins and making their mark on the industry in awesome ways. When I first moved out to Silicon Valley (starting with a short trip in 2006) I already knew (by reputation) many of the names and personalities that made up the ecosystem. I read them on blogs, listened to them on podcasts and generally admired their work and learned from their ideas.

Once I came out here, I got to know many of them personally. Some let me down, others surprised me with their generosity and still others became wonderful friends.

I'd like to highlight just a couple of those today because they've been on my mind.

Jeremiah Owyang (and his new partners Deb Schultz & Charlene Li) has always struck me as one of the hardest-working and smartest people in the valley.

Most recently I've had the pleasure to get to know Jeremiah on a personal level but had never actually worked with him 1:1 on anything serious before.

That changed last week when we sat down for a real 'business meeting'. He blew my mind. That doesn't happen often. His blog posts show only a fraction of the man's thinking. Not only does he think 5 steps ahead, he manages to find a way to package it on his blog in a way that even laymen can understand.

I am so happy for his collaboration at Altimeter. Jeremiah, Debs and Charlene are the nicest people and are all wicked smart.

Those that have been around me in the last 12 months have probably heard me talk about the need for an Altimeter Group-style firm and I'm glad that they are the ones to pull it off. They've done it with grace, style and stunning execution.

Can't wait to see what they do next.

Stephanie Agresta is another of the people that I got to know as a friend once moving out here. For some reason and on some level we connected as kindred spirits who love to smile.

I've always felt like she had an undeserved level of faith and affection for me - but I accepted it gladly because it meant she wanted to hang out!

She too has recently made a move that not only befits her stature as a connector and thinker, but also rewards her kind spirit and positive attitude.

She gave me her new card at her birthday the other day - it says EVP of Social Media, Global - Porter Novelli (or something like that hah). EVP, Global, Porter Novelli. Are you serious!?

This is such wonderful news for our community because it means that someone who not only gets it, but loves it and is one of us, is in a position to help the brands we all know and love.

These are just two of my friends who have gotten what they deserve lately - in the best possible meaning of the phrase.

Congratulations peeps.

If I can help any of you reading this to achieve your goals, please let me know. This whole ecosystem, worldwide, is built on pay-it-forward. And I have a lot to pay forward.

5 Things you need to know about Social Media Marketing

Added on by Chris Saad.

Someone recently asked me to give them the top few tips I could think of about Social Media Marketing. Here are the first 5 things that came to mind.

  1. Conversation is not a buzzword. They call it a 'conversation' - the meaning is literal, not figurative. Someone speaks, you listen, and you respond appropriately. You try to add value to the dialogue rather than shout your message. The most common mistakes people make in social media are the same mistakes they make at a dinner party: they don't listen, they don't add value, they don't have something interesting to say, they are not authentic, and they are not humble. They are too busy talking to listen and learn.
  2. Have something worth saying and say it with authenticity. Talking about your product only gets you so far. You need a point of view. What is the underlying philosophy that makes you wake up in the morning? What drives you? Why do you make the decisions you make? Your audience wants to know how the sausage is made just as much as they want to have a BBQ with it.
  3. Build something worth talking about and get out of the way. The best thing you can say is nothing at all. Instead, ship something worth talking about and let others do the talking for you. That means you need to listen to what your customers want, build something they will love and facilitate their interaction with each other. Do not fear negative feedback - you cannot control your message or your brand - you can only discover, engage and learn. If and when you do, you will turn critics into brand/product evangelists.
  4. Don't build a social network. "Having a social networking strategy for marketing is like having a muscle strategy for smiling" - Tony Hsieh, Zappos. You don't need to build a social network to have a social media strategy. In fact, that's a bad move. The conversation is already happening on existing social tools - you just need to search for it and jump in (carefully).
  5. Time and ROI. If you don't think you have the time, or can't work out the ROI, then you don't understand business. Business is about people. It's about relationships. It's about creating value for others. Social Media is not something your marketing department should do. It's something your whole company should do. Just like they all answer the phone and send email, they also need to exist in the global conversation about your products and services. Get involved. Find the time. The return on your investment will be nothing short of staying relevant as the world changes around you.

What are your top 5 Social Media Marketing 'tips'?

FriendFeed is over - Time for a Blog Revolution

Added on by Chris Saad.

The blog revolution that I spoke of in my previous post 'Blogs are Back' feels to me, right now, like the Iranian revolution that almost happened a couple of months back. It is in danger of fading away as we get wrapped up in 'what will Facebook do next' mania. You see, a couple of months ago there seemed to be an awakening that blogs are the first and best social networking platforms. This realization seemed to be driven by many converging factors including...

  1. Twitter Inc decisions that have not reflected the will of the community – particularly changing the @ behavior, changing their API without informing developers, making opaque decisions with their Suggested User List and limiting access to their Firehose.
  2. Facebook’s continued resistance to true DataPortability
  3. The emergence of tools and technologies that turn blogs into real-time, first class citizens of the social web. Tools like Lijit, PubSubHubbub and of course Echo.
  4. A broader understanding that blogs are a self-owned, personalized, tool agnostic way to participate in the open social web.
  5. FriendFeed selling out to Facebook
  6. A flurry of great posts on the subject
  7. The broader themes of the Synaptic Web

Instead, though, it now seems that many bloggers are holding on desperately to the notion that FriendFeed may survive or that Facebook may get better. They continue to pour their content, conversation and influence into a platform that does not hold their brand, their ads or their control. We all seem desperate to see what next move these closed platforms make.

I have news for you - FriendFeed is dead. The team has moved on to work with the core Facebook team.

At best, FriendFeed will go the way of Del.icio.us and Flickr - stable but not innovating. At worst, it will go the way of Jaiku or even Dodgeball.

It's time we start re-investing in our own, open social platforms. Blogs. Blogs are our profile pages - social nodes - on the open, distributed social web.

Is your blog missing a feature you like from FriendFeed? Build a plugin. There's nothing Facebook or FriendFeed does that a blog can't do with enough imagination.

Our job now, as early adopters and social media addicts, should be to build the tools and technologies to educate the mainstream that blogs and blogging can be just as easy, lightweight, social and exciting as Facebook. Even more so.

All that's needed is a change in perspective and slight tweaks around the edges.

Blogs are back.

Who's with me?

Blogs are Back

Added on by Chris Saad.

When Khris and I showed Robert Scoble Echo prior to the launch at the Real-Time Crunchup, he said "Wow, Blogs are Back!". I couldn't agree more. It looks like his sentiment is starting to propagate.

When I say Blogs are Back I mean that other forms of social media (Twitter, Facebook, FriendFeed etc.) are now finding their rightful balance with the first and foremost social platform, blogging.

This is not to suggest that other forms of interaction are going away, only that there is a natural equilibrium to be struck.

There are a number of factors that are helping this trend along.

They include:

  1. Twitter Inc decisions that have not reflected the will of the community - particularly changing the @ behavior, changing their API without informing developers, making opaque decisions with their Suggested User List and limiting access to their Firehose.
  2. Facebook's continued resistance to true DataPortability
  3. The emergence of tools and technologies that turn blogs into real-time, first class citizens of the social web. Tools like Lijit, PubSubHubbub and of course Echo.
  4. A realization that blogs are a self-owned, personalized, tool agnostic way to participate in the open social web.
  5. The broader themes of the Synaptic Web

I also discussed this with Dave Winer, Doc Searls and Marshall Kirkpatrick the other day on the BadHairDay podcast.

You can also see previous references to this in my 'What is Echo' post. I've also posted a more detailed account of how Echo fits into this notion on the JS-Kit blog.

Robert Scoble and Shel Israel have also posted on this. I also registered 'BlogsAreBack.com' (what should I do with it?).

I look forward to seeing what this new trend brings!

What is Echo Comments?

Added on by Chris Saad.

On October 14, 2008 I wrote a blog post titled 'Who is JS-Kit'. In it, I explained why I was joining the JS-Kit team and how their philosophy and execution resonated so much with me. On Friday the 10th of July, 2009, the JS-Kit team launched Echo. Here's the video. It is the clearest example yet of the potential of the JS-Kit team that I spoke about back in my Who is JS-Kit post.

I wanted to take this opportunity to explain what Echo means to me personally. But first, I'd like to make something very clear. Although much of this will be about my personal opinions, feelings and philosophies on Echo and the trends and tribulations that bore it, Echo is the result of the hard work and collaboration of a stellar team of first grade entrepreneurs that I have the pleasure of working with every day (and night).

From Khris Loux, our fearless and philosophical CEO who led the charge, to Lev Walkin, our CTO who seems to know no boundaries when it comes to writing software, to Philippe Cailloux, the man who turns our raving ADD rants into actionable mingle tickets, to our developers who worked tirelessly to turn napkin sketches into reality. We all scrubbed every pixel and will continue to be at the front lines with our customers. This is the team that made it happen.

For me, Echo is the next major milestone on a journey that only properly got underway in November 2006 when I visited Silicon Valley for the first time.

I was at the Web 2.2 meetup. It was set up by one of my now friends Chris Heuer. There was a group discussion about social networking and how we, as individuals, might communicate in ways that were independent of the tools that facilitated such communication.

I was sitting in the back of the room in awe of the intellect and scope of the conversation. Could you imagine it, for the first time in a long time I (a kid from Brisbane Australia) was in a room full of people who were just as passionate about this technology thing as me - and they were actually at the center of the ecosystem that could make a real impact on the outcome of these technologies.

I shyly put my hand up at the back of the room and squeaked out (I'm paraphrasing and cleaning up for eloquence here - I'm sure I sounded far less intelligent at the time).

"Aah... excuse me... aren't blogs the ultimate tool agnostic social networking platforms?"

What I meant was that blogs use the web as the platform. They produce RSS. They have audiences. They elicit reactions. They create social conversations over large distances. They essentially create one giant implicit social network.

I got some "oh yeah he might be right" reactions and the conversation moved swiftly along to other things.

For me, a light turned on. One I've been chasing ever since in various forms and to varying degrees of success (or failure as the case may be). For me, Faraday Media, APML, DataPortability and now JS-Kit have all been an exploration of how to create a tool-agnostic, internet-scale social network that has notification, filtering, interoperability and community at its heart.

As I said at the start of this post, Echo is the next step along that journey. For me, Echo represents an opportunity not only to make blogging 'cool' again, but to make it a first-class citizen on the web-wide social network. To make all sites part of that network.

Much has been made of its real-time nature. Even more about its ability to aggregate the fragmented internet conversation back to the source. These are both critical aspects of the product. They are the most obvious and impactful changes we made. But there is much more to Echo than meets the eye. Much more in the product today and much more we hope to still add.

Our choice of comment form layout. The use of the words 'From' and 'To'. The language of 'I am... my Facebook profile'. The choice to treat the comment form as just another app (as shown by the use of the 'Via Comments' tag) and more. The choice to merge the various channels into a unified stream (comments+off-site gestures). These were all deliberate and painstaking choices that the team made together.

Echo is based on a theory we call the 'Synaptic Web'. This is the frame of reference from which all our product decisions will be made. It is an open straw man that I hope will eventually be just as exciting as any given product launch. It states in explicit terms the trends and opportunities that many of us are seeing and is designed to help foster a conversation around those observations.

In the coming hours and weeks I'm also going to record video screencasts of the specific product decisions that have already made it into Echo - hopefully these will further illustrate how each pixel brings about a subtle but important change to the space.

In the meantime, I'd like to reiterate how humbled I am by the reaction to the product and how excited I am to be working with the JS-Kit team in this space at this time in the Internet's history.

I look forward to hearing from each of you about your thoughts and feelings on our direction, and shaping our road map directly from your feedback.