Friday, March 6, 2009

Some 2009 Twitter Statistics

Via istrategylabs: 2009 Twitter Demographics and Statistics Report.

I was a bit surprised to find that Twitter is more heavily female than male, but not that it's popular in the 18-34 demographic.

Saturday, February 14, 2009

Our Documentation Impulse

Via Digital Natives:
Viewfinders: Digital Natives and the Documentation Impulse

[Regarding the Obama Inauguration experience ...]

But what about the digital natives who were “really” there? How did they traverse the liminal zone between their screens and the scenes around them?

Two indelible images from Inauguration Day involved young people with cameras, capturing history for instant posterity. It’s no surprise that digital natives would use technology to mediate their experiences, but seeing the mediation in action is confronting, inspiring, and somewhat curious.

As President Obama took the oath slightly after noon on January 20, cameras were evident everywhere in the audience. But right there, at the very front, Malia Obama foreshadowed what it will be like to have two tech-savvy young girls in the White House. Even as her father was making history, Malia was focused on her tiny pastel camera, documenting the moment for herself.


But the instantaneity of the Inauguration photography cycle underlined the ubiquity of the impulse, and, more importantly, its ubiquitous expression. When even the president’s daughter experiences her father’s inauguration “through the viewfinder,” this perspective transcends lark and becomes a force in its own right.

I used to do this long before our digital revolution. I still remember pointing my Kodak 100 camera at nearly everything; our school functions were rife with cameras. Is this necessarily something unique, or even necessarily more ubiquitous, in our digital era? I admit I carry my camera around much more than I did before, but I often forget to pull it out at events, only to wish later that I had. Ironically, when I was stuck with a larger film camera that required more effort to bring with me (as well as more effort and money to get images out of), I was far more likely to be looking at an event through the viewfinder.

So, have you succumbed to the urge to point your digital camera or camera phone at every memorable experience? Or do you forget to do so?

Resource: Social Media in Government

Via the Guide to Managing U.S. Government Websites:
Social Media and Web 2.0 in Government

What Is Social Media and Web 2.0?

Social Media and Web 2.0 are umbrella terms that define the various activities that integrate technology, social interaction, and content creation. Social media use the "wisdom of crowds" to connect information in a collaborative manner online. Through social media, individuals or collaborations of individuals create web content, organize content, edit or comment on content, combine content, and share content. Social media and Web 2.0 use many technologies and forms, including RSS and other syndicated web feeds, blogs, wikis, photo-sharing, video-sharing, podcasts, social networking, social bookmarking, mashups, widgets, virtual worlds, micro-blogs, and more.
This site provides several documents on social media, useful to those who build, maintain and manage federal websites.

The Myth of Concentration vs. Digital Distraction

Via Mind Hacks:
The myth of the concentration oasis:

Wired has an interview with author Maggie Jackson who's recently written a book called 'Distracted: The Erosion of Attention and the Coming Dark Age' in which she argues modern life and digital technology constantly demand our attention and are consequently damaging our ability to concentrate and be creative. The trouble is, I just don't buy it and it's easy to see why.

The 'modern technology is hurting our brain' argument is widespread but it seems so short-sighted. It's based on the idea that before digital communication technology came along, people spent their time focusing on single tasks for hours on end and were rarely distracted.

The trouble is, it's plainly rubbish, and you just have to spend time with some low tech communities to see this is the case.

In some of the poorer neighbourhoods of Medellín, my current city of residence, there is no electricity. In these barrios, computers, the internet, and even washing machines and telephones don't exist in the average home.

Pretty much everything is done manually. By the lights of the 'driven to digital distraction' argument, the residents should be able to live blissfully focused distraction-free lives, but they don't.

If you think Twitter is an attention magnet, try living with an infant. Kids are the most distracting thing there is, and when you have three or even four in the house it is both impossible to focus on one thing, and stressful, because the consequences of not keeping an eye on your kids can be frightening even to think about.

The manual nature of all the tasks means you have to watch everything. There is no timer on the cooker, so you need to watch the food. The washing has to be done, by hand, while keeping an eye on everything else.


An excellent counter-argument, this article points out many of the things that have long bothered me about the critiques of digital lives and digital distraction. After reading it, it occurs to me that, once again, class and gender have been ignored in those critiques, and that they rest on a somewhat idealized assumption about our past ways of living.

Friday, February 13, 2009

Scientific American on Twitter

Via Scientific American:
Twitter: What is it good for?

Just who's using Twitter, and to what end? We're about to tell you, but the answer takes more than 140 characters — the limit for tweets.

Some 11 percent of U.S. adults who use the Internet also send status updates on Twitter, a three-year-old "communications protocol" that allows users to blast small bursts of info to their followers and friends, according to new data by the Pew Internet & American Life Project. Status updating is most common among young adults: 20 percent of 25-to-34-year-olds use Twitter, as do slightly fewer 18-to-24-year-olds. The results are based on a telephone survey of 2,253 adults.

Twitter, Yammer, Facebook and other micro-blogging platforms might be seen as just another way to self-promote. But more recently they've become journalism tools: reporters use Twitter as a dedicated newsfeed to keep up with the competition.


That said, many Twitter users embrace the technology as a way of feeling "ambient intimacy," Fox says, just as people share the details of their lives with those far away via the telephone, email and blogging.


But if you're annoyed by the content of the tweets in your feed, don’t blame Twitter, Fox says. "New technology is often praised or blamed for human foibles that are universal, whether it is the telephone or social networking sites or now Twitter," she says. "In the hands of some, it will be a tool of self-promotion and for some, other pursuits."

That last statement is important, I think. So often I hear people criticize trending technology as a "waste of time" or especially annoying when what they really seem to mean is that they found the community or interaction with others within that technology annoying or frustrating or a waste of time. And really, haven't we all experienced that in one way or another without the aid of technology? Social technology simply allows us to behave as we already culturally do on a larger scale and with more reach.

Wednesday, February 11, 2009

Bloggies to announce global blog awards at SXSW

Besides the usual interactive panels and discussions, the 2009 Bloggies will announce their winners at this year's SXSW Interactive Festival in Austin, Texas. According to the awards website, the Bloggies are the Web's longest-running blog awards, and the nominations and finalist selections are up to blog readers.

Ndesanjo Macha over at Global Voices notes that five blogs have been nominated in the Best African Weblog category: Being Brazen, Appfrica, Glad To Be a Girl, West Africa Wins Always, and Scarlett Lion.

Other global categories include best European blog, best Asian blog, best Latin American blog, etc.

As an aside, am I the only person who can't stand the sideways scrolling of the Bloggies web site?

Tuesday, February 10, 2009

Reflecting on Cultural Changes

Actor David H. Lawrence makes his case for what he feels is the failure of SAG (the Screen Actors Guild) to understand the realities of how to make money with the Internet:
"But I thought..." Hey, you thought wrong.

In a recent article, several bloggers who, if we're to believe the writing/broadcasting of the subject in mainstream media, should be making millions, aren't really doing squat.

And they're actually giving up - quitting the blogging biz. It's not "new media," SAG's stupid "now media" or anything close. Sadly, it's "never was and never gonna be media" for them.

One guy, Dan Lyons, the writer whose "Fake Steve Jobs" column got him plenty of attention and plenty of talking head time on TV news shows, made a grand total of a little over a thousand dollars one month, his best month ever - but he needed the exposure of being outed as Fake Steve Jobs by no less than the New York Times to get anywhere near that kind of traffic, which translated into some ad dollars. He says he never made enough money to quit his day job.

And that wouldn't be enough for most high profile old media personalities to attempt to become blogging superstars - and besides, aren't the Andrew Sullivans and Robert Scobles and yes, Dan Lyonses of the world supposed to be rolling in the Internet dough?

Isn't that the line that SAG's Alan Rosenberg and Membership First is feeding us about the future of the Internet and our creative work on it?

Reading David's blog post, I'm struck by how much has changed in our culture with regards to the way we consume entertainment.

Take television, for example. Though the first generation of television sets were available before 1935, the display screens only served up a somewhat blurry reddish-orange picture about half the size of a business card. Broadcasters began appearing shortly thereafter, but it wasn't until almost 1948 that television sets began to appear in earnest in American homes.

What began as a multi-family gathering around the one set available in the neighborhood gave way to the nostalgic image of the American family gathered around the television set, now a relic of the 1950s, nearly twenty years after the technology first became available. Then, as the technology became smaller and more affordable and nearly every home had more than one television set, the nuclear family parted ways at show time. Soon broadcasters were no longer targeting the American family, but the evening-viewing dad, the stay-at-home mom who watched the daytime soaps, and children who were now being baby-sat by the boob-tube.

And now? Whoa.

Take my own household as an example of the monumental cultural change:

We own one television. It's a gorgeous bigger-than-I-am high definition screen. But it's only one. With two computers (and at one time, 4 monitors), three laptops, an iPod and two iPhones, who needs another TV?

We almost never watch something as it airs. We use a TiVo and digitally record everything, sometimes watching a show 15 minutes after it's recorded, and sometimes, um (let me check), yeah, five weeks after it's recorded. We almost always wait for a show to be 20-30 minutes into recording before we attempt to watch it. That way we can easily fast-forward through the commercials and not catch up with the live stream. We have no patience for watching television that we can't pause so that we can engage in conversation, answer the phone, or simply put off 'till the next day because we felt like it. In a situation like that we'd rather just turn off the set. We download shows and movies to our iPods or laptops to watch while traveling for business or pleasure, simply to avoid the aforementioned situation.

We rarely ever try out a new show just because the network is pumping out commercials for it. We never see the commercials, remember? Instead, we hear about new shows through our friends and family and record a couple of episodes, or download from iTunes, or rent a season from Netflix in order to check it out. If three episodes hold our interest we'll use our TiVo's subscription feature, or rent all the seasons off of Netflix, or (lastly) download all from iTunes or Hulu in order to catch up with the show.

And don't think peer recommendations will only reach folks like my husband if they come from people who watch a lot of TV. We got turned onto the show "Heroes", in which David Lawrence guest stars as a delightfully vicious villain, through a friend of ours - who doesn't even own a TV. We figured that if someone we know - who had pretty much eschewed television on a regular basis - recommended something to us, it was worth checking out. So we gave in to her insistence that we check it out, and bought three episodes off of iTunes. We liked it enough that we bought the rest of the episodes and caught up with the broadcast of the show. And then promptly invited our TV-less pusher to come watch a season finale with us on our big screen.

We also hear about new shows through online forums and email lists where we're following the sorts of shows we already love. Occasionally I'll check out an actor's IMDB entry for other things they've appeared in, and set my TiVo to catch those. If there's something I want my parents - who don't bother with cable or satellite - to see, I'll either (a) buy the DVD set or (b) download the episodes and burn them to DVDs myself and give those to them.

We drive by cars where the children in the back seat no longer listen to music on a walkman, but watch TV from tiny headrest televisions. On business trips we set the laptop on a table, slip a DVD in and enjoy room service - the hotel's nice flat screen completely abandoned.

In fact, now that I think about it, my husband and I no longer seek out the entertainment that our communities (friends, family, local and national peer groups) used to share. Instead, we seek out communities around the entertainment we've chosen, filtered through technology. The only time in the last several years that I've sat down to watch something, live as it aired, surrounded by other people and sharing in a mass cultural event was as Barack Obama was being sworn in as President of the United States. And even then I was also following it on Twitter and online.

This litany of changes doesn't even cover the kind of thing that David mentions in his post - namely, the tendency for audience members like myself to find and use free versions rather than pay for programming. So how does a studio, production company or an actor make money in an environment like this? What do the changes in our culture mean for this industry? I'm not sure, but I agree with David - what his industry, from the unions to the studios, currently thinks it should be doing is probably going to bring it a lot of heartache. And empty pockets. As David puts it:
Pay attention to the inevitable shakeup over the next couple of years, as shareholders, upper management and the bean counters will begin to weed out the programs currently touted as our future: Hulu, the CBS Audience Network, even YouTube, FunnyOrDie and the like. If they don't start making serious money soon, the corporations that would seem to be best positioned to make the journey from investment to profitability will scrap those plans. They're not making any money. And they're not sharing any of the money they're not making with the people who are making their product - you.

And you might blame them, which would be wrong. Rather blame the audience, trained to steal and feel great about it, brought up with an insane sense of entitlement, and a total lack of respect for what it takes to make great entertainment. And blame yourself if you've gone from being an arrogant Grokster user to a Internet supastah wannabe and can't figure out why you're not eFamous and eRich.
The situation is not without hope. It's just that cookie-cutter solutions and starry-eyed dreams of mass-market 1990s success are not the right attitude to make a go of this:
Not that it can't be done. Individuals who have actually figured out a niche, many of whom I share regular emails and strategies with in a private online mastermind group, happily go about their businesses, making, with very little effort, hundreds of thousands, even millions of dollars a year in creative online marketing endeavors. And they chuckle at the attempts at blatant and clumsy systemization of those strategies, cloddish attempts to re-create internet marketing success at will, in which the television networks and film studios are currently engaged.

That's the metric for success for them: can we do it? Can we make money? And most importantly, can we do it again? And again? And again?
Anyway, I thought David's post was a thought provoking perspective (and advice to his industry) from someone trying to navigate a career through the changes technology has brought to our shared culture.

Updated 2/13/2009:

David H. Lawrence makes his case to *** on YouTube:

Here's the YouTube video he's responding to:

By the way, if you're confused about what David and John are discussing, here are a few things that might help get you oriented:

SAG is the Screen Actors Guild, a union that "exists to enhance actors’ working conditions, compensation and benefits and to be a powerful, unified voice on behalf of artists’ rights."

AFTRA is "The American Federation of Television and Radio Artists", a union "representing professional actors, dancers, singers, and broadcasters."

AMPTP is the Alliance of Motion Picture and Television Producers, which represents Hollywood's major film and TV studios.

What they're debating involves the restart of contract talks between SAG and the AMPTP. A major sticking point in the negotiations has centered on issues of how much actors should get paid for work distributed over the Internet. This has led to some considerable debate among SAG members.