Tag Archives: research

Nonprofit website benchmarks study released


I’m very happy to have pushed the “launch” button on Groundwire’s 2010 Website Benchmarks Study, a first-of-its-kind report (so far as I know) that takes an in-depth look at the website statistics and online behaviors of 43 small-to-midsized environmental nonprofits.

There’s a ton of useful information, not only about groups’ “raw” website statistics, but also about how much time and energy groups are investing in their web presence. There’s lots to chew on for nonprofits of any size, but I think it’s especially relevant for groups of up to about 50 staff.

One thing I’m particularly proud of is that I was able to develop a highly scalable, repeatable methodology for gathering data quickly: a simple, open-source Python script (written by my awesome colleague Matt Yoder) that pulls statistics from Google Analytics, combined with a quick-and-dirty online survey instrument.
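The script itself isn’t reproduced in the post, but to give a flavor of the approach, here’s a minimal sketch of the same idea against today’s Google Analytics Data API (GA4 has long since replaced whatever API a 2010 script would have talked to); the property ID, date range, and metric names are all placeholders:

    # Minimal sketch (not the original Groundwire script): pull a few
    # top-line metrics for one site from the Google Analytics Data API.
    # Needs `pip install google-analytics-data` and application-default
    # credentials with access to the GA4 property.
    from google.analytics.data_v1beta import BetaAnalyticsDataClient
    from google.analytics.data_v1beta.types import (
        DateRange,
        Metric,
        RunReportRequest,
    )

    PROPERTY_ID = "123456789"  # placeholder -- substitute a real GA4 property ID

    client = BetaAnalyticsDataClient()
    response = client.run_report(
        RunReportRequest(
            property=f"properties/{PROPERTY_ID}",
            metrics=[
                Metric(name="sessions"),
                Metric(name="screenPageViews"),
                Metric(name="totalUsers"),
            ],
            date_ranges=[DateRange(start_date="2024-01-01", end_date="2024-12-31")],
        )
    )

    # With no dimensions requested, the report comes back as a single row.
    if response.rows:
        for header, value in zip(response.metric_headers, response.rows[0].metric_values):
            print(header.name, value.value)

Run that once per organization’s property and you have the “raw” statistics side of the benchmarks; the survey instrument covers everything Analytics can’t see.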

Using more tools != better

I think we would all be better off without analyses like this, which inventory how many social media tools large advocacy groups are using, as if using more tools were somehow indicative of sophistication, effectiveness, or a solid strategy for achieving your organizing goals.

Sigh.  When will consultants stop promoting this kind of shallow, tool-centric thinking?  Probably never, because it’s easy, cheap marketing.

Is the Tipping Point Bullshit?

New research suggests the Malcolm Gladwell-popularized theory of “Influentials” (or gatekeepers) doesn’t hold water. There’s a really interesting article in Fast Company about research by Duncan Watts:

Watts, for one, didn’t think the gatekeeper model was true. It certainly didn’t match what he’d found studying networks. So he decided to test it in the real world by remounting the Milgram experiment on a massive scale. In 2001, Watts used a Web site to recruit about 61,000 people, then asked them to ferry messages to 18 targets worldwide. Sure enough, he found that Milgram was right: The average length of the chain was roughly six links. But when he examined these pathways, he found that “hubs”–highly connected people–weren’t crucial. Sure, they existed. But only 5% of the email messages passed through one of these superconnectors. The rest of the messages moved through society in much more democratic paths, zipping from one weakly connected individual to another, until they arrived at the target. Why did Milgram get it wrong? Watts thinks it’s simply because his sample was so small–only a few dozen letters reached their mark. The dominance of the three friends could have been a statistical accident. “And since Milgram’s finding sort of made sense, nobody even bothered to redo the experiment,” Watts shrugs. But when you perform the experiment with hundreds of successfully completed letters, a different picture emerges: Influentials don’t govern person-to-person communication. We all do.
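The small-sample point is easy to demonstrate with a toy simulation (my sketch, not Watts’s actual analysis): even when every relayer is equally likely to pass a message along, the busiest few can look like gatekeepers when only a few dozen chains complete, and that apparent dominance collapses once thousands of chains do.

    import random

    # Toy illustration of the small-sample point: every one of 1,000
    # relayers is equally likely to pass a message along, yet with only
    # a few dozen completed chains the top handful of relayers can look
    # like "hubs" by pure chance.
    random.seed(1)
    PEOPLE = 1000      # hypothetical population of potential relayers
    CHAIN_LEN = 6      # roughly six links per chain, per Milgram/Watts

    def share_through_top3(n_chains):
        """Fraction of all hops that pass through the 3 busiest relayers."""
        counts = [0] * PEOPLE
        for _ in range(n_chains):
            for _ in range(CHAIN_LEN):
                counts[random.randrange(PEOPLE)] += 1
        return sum(sorted(counts, reverse=True)[:3]) / (n_chains * CHAIN_LEN)

    print(share_through_top3(30))     # a few dozen chains: top 3 look important
    print(share_through_top3(5000))   # thousands of chains: their share collapses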

The article also has a really interesting bit about how the researchers experimented with ForwardTrack, which makes viral forwarding activity transparent to users; it massively increased pass-along traffic. I really want to start working this idea into more online activism work.
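The post doesn’t explain how ForwardTrack works under the hood, but the core idea is simple enough to sketch: give each forwarded copy of a message its own token pointing back to its parent, so the whole pass-along chain can be reconstructed and shown to participants. A toy illustration, with all names hypothetical:

    import secrets

    # Toy sketch of the idea (not ForwardTrack's actual implementation):
    # each forwarded copy of a message carries a token pointing back to
    # its parent, so the full pass-along chain can be reconstructed and
    # displayed to participants.
    forwards = {}  # token -> parent token (None for the original send)

    def new_forward(parent_token=None):
        """Mint a tracking token for a copy someone is about to forward."""
        token = secrets.token_urlsafe(8)
        forwards[token] = parent_token
        return token

    def chain(token):
        """Walk back from a token to the original campaign send."""
        path = []
        while token is not None:
            path.append(token)
            token = forwards[token]
        return list(reversed(path))

    root = new_forward()       # the original campaign email
    hop1 = new_forward(root)   # a recipient forwards it
    hop2 = new_forward(hop1)   # their friend forwards it again
    print(chain(hop2))         # [root, hop1, hop2] -- a three-link chain

Making that chain visible to the forwarders themselves is the part that apparently drives the extra pass-along traffic.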