10 November 2011

Infographic: Diffusion of Social Networks — Facebook, Twitter, LinkedIn and Google+

Social networking market

They say a picture is worth a thousand words, and much digital ink has been spilled recently on impressive-sounding (yet relatively unimpressive) user counts, so here's an infographic showing the diffusion of social networks as at last month to put things in perspective.

There are 7 billion people on the planet, of which 2 billion are on the Internet. Given Facebook are now starting to make inroads into the laggards (e.g. parents/grandparents) with 800 million active users already under their belt, I've assumed that the total addressable market (TAM) for social media (that is, those likely to use it in the short-medium term) is around a billion Internet users (i.e. half) and growing — both with the growth of the Internet and as a growing fraction of Internet users. That gives social media market shares of 80% for Facebook, 20% for Twitter and <5% for Google+. In other words, Twitter is 5x the size of Google+ and Facebook is 4x the size of Twitter (i.e. 20x the size of Google+).
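The arithmetic is simple enough to sanity-check; here's a quick sketch in Python using the user counts cited at the end of this post (the ~1 billion TAM is the assumption above):

```python
# Back-of-the-envelope market shares against an assumed ~1 billion TAM
# (user counts taken from the sources listed at the end of this post).
TAM = 1_000_000_000

users = {
    "Facebook": 800_000_000,  # active
    "Twitter": 200_000_000,   # active
    "Google+": 40_000_000,    # total
}

shares = {name: count / TAM for name, count in users.items()}

print(shares["Facebook"])                    # 0.8  -> 80%
print(shares["Twitter"])                     # 0.2  -> 20%
print(shares["Google+"])                     # 0.04 -> <5%
print(users["Twitter"] / users["Google+"])   # 5.0  (Twitter = 5x Google+)
print(users["Facebook"] / users["Twitter"])  # 4.0  (Facebook = 4x Twitter)
```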

It's important to note that while some report active users, Google report total (i.e. best-case) users — only a percentage of the total users are active at any one time. I'm also hesitant to make direct comparisons with LinkedIn because, while everyone is potentially interested in Facebook, Twitter and Google+, the total addressable market for a professional network is limited, by definition, to professionals — I would say around 200 million and growing fast, given the penetration I see in my own professional network. This puts them in a similar position to Facebook in this space — up in the top right chasing after the laggards rather than down in the bottom left facing the chasm.

Diffusion of innovations

The graph shows Rogers' theory on the diffusion of innovations, documented in his book Diffusion of Innovations, where diffusion is the process by which an innovation is communicated through certain channels over time among the members of a social system. There are 5 stages:
  1. Knowledge is when people are aware of the innovation but lack (and aren't yet seeking) information about it.
  2. Persuasion is when people are interested in learning more.
  3. Decision is when people decide to accept or reject it.
  4. Implementation is when people employ it to some degree for testing (e.g. create an account).
  5. Confirmation is when people finally decide to use it, possibly to its full potential.
I would suggest that the majority of the total addressable market are at stage 1 or 2 for Google+ and Twitter, and stage 4 or 5 for Facebook and LinkedIn (with its smaller TAM). Of note, users' decisions to reject an innovation at the decision or implementation phase may be semi-permanent — to quote Slate magazine's Google+ is Dead article, "by failing to offer people a reason to keep coming back to the site every day, Google+ made a bad first impression. And in the social-networking business, a bad first impression spells death." The same could be said for many users of Twitter, who sign up but fail to engage sufficiently to realise its true value. Facebook, on the other hand, often exhibits users who leave only to subsequently return due to network effects.

Social networking is also arguably a natural monopoly given, among other things, dramatically higher acquisition costs once users' changing needs have been satisfied by the first mover (e.g. Facebook). Humans have been using social networking forever, only until recently it's been manual and physiologically limited to around 150 connections (Dunbar's number, named after British anthropologist Robin Dunbar). With the advent of technology that could displace traditional systems like business cards and rolodexes came a new demand for pushing the limits for personal and professional reasons — I use Facebook and LinkedIn extensively to push Dunbar's number out an order of magnitude to ~1,500 contacts for example, and Twitter to make new contacts and communicate with thousands of people. I don't want to maintain 4 different social networks any more than I want to have to search 4 different directories to find a phone number — I already have 3 which is 2 too many!

Rogers' 5 factors

How far an innovation ultimately progresses depends on 5 factors:
  1. Relative Advantage — Does it improve substantially on the status quo (e.g. Facebook)?
  2. Compatibility — Can it be easily assimilated into an individual's life?
  3. Simplicity or Complexity — Is it too complex for your average user?
  4. Trialability — How easy is it to experiment?
  5. Observability — To what extent is it visible to others (e.g. for viral adoption)?
Facebook, which started as a closed community at Harvard and other colleges and grew from there, obviously offered significant relative advantage over MySpace. I was in California at the time and it seemed like everyone had a MySpace page while only students (and a few of us in local/company networks) had Facebook. It spread like wildfire once they solved the trialability problem by opening the floodgates, and a critical mass of users was quickly drawn in by the observability of viral email notifications, the simplicity of getting up and running and the compatibility with users' lives (features incompatible with the unwashed masses — such as the egregiously abused "how we met" form — are long gone, and complex lists/groups are there for those who need them but invisible to those who don't). Twitter is also trivial to get started with but can be difficult to extract value from initially.

Network models

Conversely, the complexity of getting started on Google+ presents a huge barrier to entry and as a result we may see the circles interface buried in favour of a flat "follower" default like that of Twitter (the "suggested user list" has already appeared), or automated. Just because our real-life social networks are complex and dynamic does not imply that your average user is willing to invest time and energy in maintaining a complex and dynamic digital model. The process of sifting through and categorising friends into circles has been likened to the arduous process of arranging tables for a wedding and for the overwhelming majority of users it simply does not offer a return on investment:
In reality we're most comfortable with concentric rings, which Facebook's hybrid model recently introduced by way of "Close Friends", "Acquaintances" and "Restricted" lists (as well as automatically maintained lists for locations and workplaces — a feature I hope gets extended to other attributes). By default Facebook is simple/flat — mutual/confirmed/2-way connections are "Friends" (though they now also support 1-way follower/subscriber relationships ala Twitter). Concentric rings then offer a greater degree of flexibility for more advanced users and the most demanding users can still model arbitrarily complex networks using lists:
In any case, if you give users the ability to restrict sharing you run the risk of their actually using it, which is a sure-fire way to kill off your social network — after all, much of the value derived from networks like Facebook is from "harmless voyeurism". That's why Google+ is worse than a ghost town for many users (including myself, though as a Google Apps user I was excluded from the landrush phase) while being too noisy for others. Furthermore, while Facebook and Twitter have a subscribe/follow ("pull") model which allows users to be selective of what they hear, when a publisher shares content with circles on Google+ other users are explicitly notified ("push") — this is important for "observability" but can be annoying for users.


The requirement to provide and/or share your real name, sex, date of birth and a photo also presents a compatibility problem with many users' expectations of privacy and security, as evidenced by the resulting protests over valid use cases for anonymity and pseudonymity. For something that was accepted largely without question on Facebook, the nymwars appear to have caused irreparable harm to Google+ in the critically important innovator and early adopter segments, for reasons that are not entirely clear to me. I presume that there is a greater expectation of privacy for Google (to whom people entrust private emails, documents, etc.) than for Facebook (which people use specifically and solely for controlled sharing).

Adopter categories

Finally, there are 5 classes of adopters (along the X axis) varying over time as the innovation attains deeper penetration:
  1. Innovators (the first 2.5%) are generally young, social, wealthy, risk tolerant individuals who adopt first.
  2. Early Adopters (the next 13.5%) are opinion leaders who adopt early enough (but not too early) to maintain a central communication position.
  3. Early Majority (the next 34%, to 50% of the population) take significantly longer to adopt innovations.
  4. Late Majority (the next 34%) adopt innovations after the average member of society and tend to be highly sceptical.
  5. Laggards (the last 16%) show little to no opinion leadership and tend to be older, more reclusive and have an aversion to change-agents.
I've ruled out wealth because while buying an iPhone is expensive (and thus a barrier to entry), signing up for a social network is free.

The peak of the bell curve is the point at which the average user (i.e. 50% of the market) has adopted the technology, and it is very difficult both to climb the curve as a new technology and to displace an existing technology that is over the hump.

The Chasm

The chasm (which exists between Early Adopters and Early Majority, i.e. at 16% penetration) refers to Moore's argument in Crossing the Chasm that there is a gap between early adopters and the mass market which must be crossed by any innovation which is to be successful. Furthermore, thanks to accelerating technological change, they must do so within an increasingly limited time for fear of being equaled by an incumbent or disrupted by another innovation. The needs of the mass market differ — often wildly — from the needs of early adopters, and innovations typically need to adapt quickly to make the transition. I would argue that MySpace, having achieved ~75 million users at peak, failed to cross the chasm and find appeal in the mass market (ironically due in no small part to their unfettered flexibility in customising profiles) and was disrupted by Facebook. Twitter on the other hand (with some 200 million active users) has crossed the chasm, as evidenced by the presence of mainstream icons like Bieber, Spears and Obama as well as their fans. LinkedIn (for reasons explained above) belongs at the top right rather than the bottom left.

Disruptive innovations

The big question today is whether Google+ can cross the chasm too and give Facebook a run for its money. Facebook, having achieved "new-market disruption" with almost a decade head start in refining the service with a largely captive audience, now exhibits extremely strong network effects. It would almost certainly take another disruptive innovation to displace them (that is, according to Clayton Christensen, one that develops in an emerging market and creates a new market and value network before going on to disrupt existing markets and value networks), in the same way that Google previously disrupted the existing search market a decade ago.

In observing that creating a link to a site is essentially a vote for that site ("PageRank"), Google implemented a higher quality search engine that was more efficient, more scalable and less susceptible to spam. In the beginning Backrub (as Google was originally known) was nothing special and the incumbents (remember Altavista?) were continuously evolving — they had little to fear from Google and Google had little to fear from them, as it simply wasn't worth their while chasing after potentially disruptive innovations like Backrub. They were so uninterested, in fact, that Yahoo! missed an opportunity to acquire Google for $3bn in the early days. Like most disruptive technologies, PageRank was technologically straightforward and far simpler than trying to determine relevance from the content itself. It was also built on a revolutionary hardware and software platform that scaled out rather than up, distributing work between many commodity PCs, thus reducing costs and causing "low-end disruption". Its initial applications were trivial, but it quickly outpaced the sustaining innovation of the incumbents and took the lead, which it has held ever since:
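The "link as a vote" idea can be sketched in a few lines of power iteration — the toy graph, damping factor and iteration count below are illustrative, not Google's actual implementation:

```python
# Minimal PageRank sketch: a link is a vote, and votes from important
# pages count for more (because rank flows along links recursively).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline rank, plus shares voted in.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

# Toy web: everyone links to "hub", so it accumulates the most rank.
graph = {"a": ["hub"], "b": ["hub"], "c": ["hub", "a"], "hub": ["a"]}
ranks = pagerank(graph)
assert ranks["hub"] == max(ranks.values())
```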

Today Facebook is looking increasingly disruptive too, only in their world it's no longer about links between pages, but links between people (which are arguably far more valuable). Last year while working at Google I actively advocated the development of a "PageRank for people" (which I referred to as "PeopleRank" or "SocialRank"), whereby a connection to a person was effectively a vote for that person and the weight of that vote would depend on the person's influence in the community, in the same way that a link from microsoft.com is worth more than one from viagra.tld (which could actually have negative value in the same way that hanging out with the wrong crowd negatively affects reputation). I'd previously built what I'd call a "social metanetwork" named "meshed" (which never saw the light of day due to cloud-related commitments) and the idea stemmed from that, but I was busy running tape backups for Google, not building social networks on the Emerald Sea team.

With the wealth of information Google has at its fingertips — including what amounts to a pen trace of users' e-mail and (courtesy Android and Google Voice) phone calls and text messages — it should have been possible for them to completely automate the process of circle creation, in the same way that LinkedIn Maps can identify clusters of contacts. But they didn't (perhaps because they got it badly wrong with Buzz), and they're now on the sustaining innovation treadmill with otherwise revolutionary differentiating features being quickly co-opted by Facebook (circles vs lists, hangouts vs Skype, etc).
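As a rough illustration of what automated circle creation could look like, here is a naive sketch that puts contacts into the same "circle" when they share enough mutual connections — the contact graph, threshold and function names are all hypothetical, not how LinkedIn Maps or Google+ actually work:

```python
# Naive auto-circles: contacts sharing at least `k` mutual connections
# are merged into the same circle (union-find over the contact graph).
# The graph and threshold here are purely illustrative.
def auto_circles(contacts, k=2):
    """contacts maps each contact to the set of people they know."""
    names = list(contacts)
    parent = {name: name for name in names}

    def find(x):
        # Follow parent pointers to the root, halving paths as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if len(contacts[a] & contacts[b]) >= k:
                parent[find(a)] = find(b)  # merge the two clusters

    circles = {}
    for name in names:
        circles.setdefault(find(name), set()).add(name)
    return list(circles.values())

# Colleagues share workmates; family members share relatives.
graph = {
    "alice": {"boss", "hr", "it"},
    "bob":   {"boss", "hr", "dev"},
    "mum":   {"dad", "gran"},
    "sis":   {"dad", "gran"},
}
circles = auto_circles(graph)
assert {"alice", "bob"} in circles and {"mum", "sis"} in circles
```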

Another factor to consider is that Google have a massive base of existing users in a number of markets that they can push Google+ to, and they're not afraid to do so (as evidenced by its appearance in other products and services including Android, AdWords, Blogger, Chrome, Picasa, Maps, News, Reader, Talk, YouTube and of course the ubiquitous sandbar and gratuitous blue arrow which appeared on Google Search). This strategy is not without risk though as, if successful, it will almost certainly attract further antitrust scrutiny, in the same way that Microsoft found itself in hot water for what was essentially putting an IE icon on the desktop. Indeed I had advocated the deployment of Google+ as a "social layer" rather than isolated product (ala the defunct Google Buzz), but stopped short of promoting an integrated product to rival Facebook — if only to maintain a separation of duties between content production/hosting and discovery.

The solution

While I'm happy to see some healthy competition in the space, I'd rather not see any of the social networks "win", as if any one of them were able to cement a monopoly then we users would ultimately suffer. At the end of the day we need to remember that for any commercial social network we're not the customer, we're the product being sold:
As such, I strongly advocate the adoption of open standards for social networking, whereby users select a service or host a product that is most suitable for their specific needs (e.g. personal, professional, branding, etc) which is interoperable with other, similar products.

What we're seeing today is similar to the early days of Internet email, where the Simple Mail Transfer Protocol (SMTP) broke down the barriers between different silos — what we need is an SMTP for social networking.

  • Facebook: 800 million users (active) [source]
  • Twitter: 200 million users (active) [source]
  • LinkedIn: 135 million users (total) [source]
  • MySpace: 75.9 million users (peak) [source]
  • Google+: 40 million users (total) [source]

09 November 2011

RIP Adobe Flash (1996-2011) - now let's bury the dead

Adobe kills mobile Flash, giving Steve Jobs the last laugh, reports The Guardian's Charles Arthur following the late Steve Jobs' epic Thoughts on Flash rant 18 months ago. It's been about 2.5 years since I too got sick of Flash bringing my powerful Mac to its knees, so I went after the underlying lie that perpetuates the problem, explaining why Adobe Flash penetration is more like 50% than 99%. I even made progress Towards a Flash free YouTube killer, only it ended up being YouTube themselves who eventually started testing a YouTube HTML5 Video Player (while you're there please do your bit for the open web by clicking "Join the HTML5 Trial" at the bottom of that page).
"I heard a sound as though a million restaurant websites cried out at once" (Charles Arthur)
You see, armed with this heavily manipulated statistic, armies of developers are to this day fraudulently duping their paying clients into deploying a platform that will invariably turn away a percentage of their business at the door, in favour of annoying flaming logos and other atrocities that blight the web:

How much business can you tolerate losing? If you've got 95% penetration then you're turning away 1 in 20 customers. At 90% you're turning away 1 in 10. At 50% half of your customers won't even get to see your product. I don't know too many businesses who can afford to turn away any customers in this economic climate.

In my opinion the only place Flash technology has in today's cloud computing environment is as a component of the AIR runtime for building (sub-par) cross-platform applications, and even then I'd argue that they should be using HTML5. As an Adobe Creative Suite Master Collection customer I'm very happy to see them dropping support for this legacy technology to focus on generating interactive HTML5 applications, and look forward to a similar announcement for desktop versions of the Flash player in the not too distant future.

In any case, with the overwhelming majority of devices being mobile today and with more and more of them including browser functionality, the days of Flash were numbered even before Adobe put the mobile version out of its misery. Let's not drag this out any longer than we have to, and bury the dead by uninstalling Flash Player. Here are instructions for Mac OS X and Windows, and if you're not ready to take the plunge into an open-standards-based HTML5 future then at least install FlashBlock for Chrome or Firefox (surely you're not still using IE?).

Update: Flash for TV is dead too, as if killing off mobile wasn't enough: Adobe Scrapping Flash for TV, Too

Update: Rich Internet Application (RIA) architectures in general are in a lot of trouble — Microsoft are killing off Silverlight as well: Mm, Silverlight, what's that smell? Yes, it's death

Update: In a surprising move that will no doubt be reversed, RIM announced it would continue developing Flash on the PlayBook (despite almost certainly lacking the ability to do so): RIM vows to keep developing Flash for BlackBerry PlayBook – no joke

How NOT to respond to vulnerability reports

Reuven Cohen and the guys at Enomaly could write the book on how NOT to respond to vulnerability reports:
  1. Don't disavow vulnerabilities in products you've previously taken credit for
  2. Don't claim issues are not valid while denying researchers a right of reply
  3. Don't claim obvious issues are "unactionably vague" and then ignore them, even after a working exploit is publicly available
  4. Don't claim trivial remote root exploits are "theoretically valid but extremely difficult to exploit"
  5. Don't claim it's ok to rely on security by obscurity or race conditions
  6. Don't turn on moderation because a researcher posts a vulnerability report to your lists
  7. Don't subsequently ban a researcher from your lists because they tried to notify your users when you failed to
  8. Don't claim that security vulnerabilities are ok because there have been "no reports of any security compromise"
  9. Don't claim "other mitigating factors that have been present in the environment from the beginning" when the vulnerability has already been demonstrated
  10. Don't ask for private notification of vulnerabilities only to then ignore/dispute them
  11. Don't publicly call researchers unethical for opting for full disclosure, especially when they do so because you have been reticent and unresponsive in the past
  12. Don't release ineffective fixes, especially when the researcher has told you exactly how to fix it
  13. Don't dispute the vulnerability when a clearinghouse like Secunia contacts you to verify it
  14. Don't criticise researchers for reviewing your product
  15. Don't shoot the messenger
  16. Don't downplay critical vulnerabilities as "relatively minor", "random" paths as "pretty hard to guess", etc.
  17. Don't send in board members to fight your battles
  18. Don't claim new products having "significant new and enhanced functionality" is a valid excuse
  19. Don't make security claims like "High Assurance" if you're not going to take security seriously
  20. Don't claim that "Enomaly shall be entitled to (i) suspend or de-activate your account without notice, and (ii) retain any remaining funds in your account", and definitely don't actually do it.
After my recent SploitCloud: exploiting cloud brokers for fun and profit article and the follow-up Retro vulnerability of the day: cleartext passwords over the wire you'd have thought the publicly demonstrated vulnerabilities would have been quietly fixed and we'd have moved on. But no — they've decided instead to suspend my Spotcloud account so I can't find any more holes, keeping funds they were holding in trust for payment to third-party providers as "compensation" — something I'm more inclined to refer to as "theft":

Enomaly have not only failed to notify Spotcloud buyers and sellers that they are vulnerable themselves, but also moderated (i.e. deleted) my notification to them and banned me from the lists in the process:

If I were one of the (apparently few) users of the Spotcloud service then I'd be extremely dissatisfied, to say the least, that this information was being actively concealed from me. At the end of the day you owe it to yourselves and your users to only ever work with providers who take security seriously.

06 November 2011

Retro vulnerability of the day: cleartext passwords over the wire

While spending my Sunday looking at what people are doing with various cloud platform services I came across these 4 case studies on the Google App Engine (GAE) pricing page:
Ignoring WebFilings (who have an Amazon EC2 backend) and gigya (who have their own platform and only use GAE for their live chat applet), Best Buy caught my eye, as I had already caught them sending employee credentials in the clear via the Twelpforce GAE app written by Enomaly a few years ago — and Giftag was also done in "partnership" with Best Buy (whatever that means): "Enomaly Launches Giftag.com for Best Buy".

I also stumbled on a cross-site request forgery vulnerability in Enomaly's own flagship SpotCloud product earlier this year, which I wrote up last week — some 6 months after the initial report: SploitCloud: exploiting cloud brokers for fun and profit.

Sure enough when you crack out Wireshark and sniff the wire you can clearly see they're sending credentials in the clear over the public Internet, both at signup:

...and for good measure, on every login:
This wouldn't be such a problem were it not for rampant password reuse — I would not be at all surprised if most of the email/password combinations captured also worked on the email account itself. That is, by sniffing Giftag signups/logins you also have a good chance of a type of privilege escalation to the email account and from there to other services like Facebook:

To their credit(!?!), the other GAE case study application (Apmasphere, a property management application by Ray White, Australia's largest real estate group) exhibits exactly the same vulnerability, both at signup:

...and at login:

The moral of the story is that it doesn't matter how trivial your app is, given enough rope users will hang themselves by re-using passwords. As developers you owe it to your employers, clients and users to protect them from themselves, in this case by requiring SSL using Google App Engine's "secure: always" configuration directive which was introduced over 3 years ago. Very soon you'll also be able to use your own domains with SSL (rather than *.appspot.com) which, due to limitations in the protocol, is technically challenging to implement for a multi-tenant service at scale.
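For reference, the directive in question is a one-line addition to a handler in app.yaml; a minimal sketch (the script name below is illustrative):

```yaml
# app.yaml (Python runtime): force HTTPS on every URL pattern.
handlers:
- url: /.*
  script: main.py
  secure: always
```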

Update: While Best Buy's Giftag IP address is owned by Google (according to whois) and runs on the Google platform (according to the Server: Google Frontend HTTP header), the IP address for Ray White's Apmasphere is owned by Primus Telecommunications (according to whois) and runs an Apache web server (according to the Server: Apache HTTP header). Does anyone know whether one of the four main GAE case studies has indeed migrated to an in-house platform and if so, when and why? More to the point, is anyone aware of anyone doing anything of any consequence on GAE? I'm still looking for decent case studies of GAE native applications.

Update: Enomaly founder Reuven Cohen disavows the vulnerability, claiming "Interestingly, the GAE version the giftag site wasn't developed by enomaly." SFAICT the "GAE version" is the only version, so in my opinion either they're responsible or they're plagiarists — taking someone else's work or ideas and passing them off as one's own. I'll let you decide for yourselves:

And 6 months later:

Update: The Giftag extension for Firefox is also vulnerable:

Update: Even the bookmarklet is vulnerable... if you add this to your toolbar and click on it then it will insecurely retrieve Javascript (gift-bookmarklet-loader.js) and execute it, even within an SSL session. That is, an attacker can trivially execute trusted code that has full access to secure pages:

javascript: (function () {
    var d = document;
    var s = d.createElement('script');
    s.id = "gt_boot";
    // Note: plain http, even from within an SSL session.
    s.setAttribute('src', 'http://www.giftag.com:80/media/js/gift-bookmarklet-loader.js');
    d.body.appendChild(s);
})();