Today, I absolutely rejoiced when I read Phil Elliott's interview with Club Penguin co-founder Lane Merrifield on gamesindustry.biz. Someone else gets that, when a company creates an online community (and I don't care if it's a virtual world or a message board), moderation is necessary to keep the inhabitants of that space safe from stupidity. By moderation, I mean some oversight of the dialogue going on. That doesn't mean every post has to be held, but it does mean the conversation should be screened by humans in some way...
When you keep a place safe through watching the conversation, you are laying the foundation for a civil environment. When you lay the foundation for a civil environment, you do not need to ask participants to disclose sensitive information about themselves.
Now, I can hear y'all bloatedly bluster: "but Merrifield is talking about a kids' virtual world (Club Penguin)! Of course *children* need to be protected online!"
Well, don't all of us need some protection when we enter an online community??? Think about it: If you've ever participated in an online community, you know that if you don't want to get flamed right out of the gate, you have to lurk. Some people never get past the lurking stage to participate, in part because it never feels totally safe. Perhaps they never feel totally safe because they may be asked to disclose more information about themselves than feels comfortable or right.
Oh, yes, I've heard the answers to this one: "Civility is all about disclosure! We have to do away with anonymity or else these users will just continue to behave like cretins!!" says Andrew Keen's echo...
And yet, according to Merrifield, one of the ways they keep kids safe at Club Penguin is that "we encourage kids to not reveal any sort of personal identifiable information."
Gee, isn't that kind of like that old bugaboo ANONYMITY?? Oh my Lord! He's encouraging kids to be either pseudonymous or anonymous online so that they don't get hurt!
Isn't that what some of us were doing when we had screen names in those oldy-mouldy chat rooms and online communities at the dawn of the Internet? I believe so! We super-early adopters knew there were risks, and we wanted to keep ourselves safe while engaging in conversation. It's like not showing your driver's license to the guy you're chatting with at a bar...
Merrifield adds: "And we have huge filters and over 100 moderators to try and keep the world as safe as possible."
So, if you're going to have an online community where people's personal information is kept safe, then you will need at least two things: (1) filters and (2) a sufficient number of moderators to make sure that community stays safe.
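Just to make that concrete, here's a rough sketch of how filters and humans can work together (this is my own toy example in Python, with made-up word lists and patterns--not anything Club Penguin actually runs): the filter handles the obvious stuff automatically and shoves anything merely suspicious into a queue for a real live moderator to look at.

```python
import re
from collections import deque

# Hypothetical lists/patterns for illustration only -- a real community
# would tune these constantly.
BLOCKLIST = {"cretin", "idiot"}                        # obvious abuse
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),  # phone-number-ish
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),        # email-address-ish
]

moderation_queue = deque()   # messages waiting for a human moderator

def screen_message(author, text):
    """Auto-reject the obvious, hold the suspicious, publish the rest."""
    words = set(text.lower().split())
    if words & BLOCKLIST:
        return "rejected"                 # the filter makes the easy calls
    if any(p.search(text) for p in PII_PATTERNS):
        moderation_queue.append((author, text))
        return "held for human review"    # a person makes the judgment call
    return "published"

print(screen_message("penguin42", "call me at 555-123-4567"))  # held for human review
```

The point of the sketch isn't the particular patterns; it's that the technology flags and the human decides.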
This seems like a total, logical no-brainer to me--and should be even more of a no-brainer when it comes to communities frequented by grown-ups. Let's face it: some folks might be grown in body, but that's no indication of the growth of their minds...let alone of the old Id, the unbridled expression of which has been touted as "free speech" by people who just don't get that with such a big freedom comes an equally big responsibility (I mean you, Howard Stern, and all the other "shock jocks" out there. Yes, I blame shock jock-yness for the decline in civil public discourse. It started well before the Internet, and so did the discussions about that decline...)
Back to online communities: I'd like to point y'all back to Clay Shirky's wonderful essay "A Group Is Its Own Worst Enemy," where he defines three persistent patterns in long-lived online communities. These patterns continue ad nauseam within online communities, whether they are hosted by newspapers or by corporations or other concerns. They are weird little quirks of human nature that occur even when we are totally non-anonymous, in face-to-face environments (Shirky cites W.R. Bion's work from the mid-20th century...way before the Internet.)
When it comes to the human condition, online or off, the more things change, the more they stay the same. We behave in weird little ways in groups because we look to preserve the safety of the group (see Shirky). Sure, a site may be able to get adults to disclose all the pertinent information that the Powers believe is necessary to create "civility." But the direction in which an online community devolves can't necessarily be controlled by those Powers. People can be forced to surrender their full identities in order to participate in a community--but that might simply end up exposing them in ways that leave them very uncomfortable and unsafe. By exposure, I don't mean just exposure to malicious folks, but also exposure in search, which is becoming more and more common for the information and social profiles we put online. (Google yourself and see what I mean...)
There is another aspect to Merrifield's interview that's of interest: the attitude about technology and community. Perhaps it is because he's working with children that he sees the "irresponsible nature of our industry" (towards children) and the "over-reliance on technology" as obstacles that had to be overcome in order to create a safe community.
I find it sad that this sort of care isn't extended to adults as well, and that human moderation is still thrown aside in favor of automated filters (CP has filters too, but they work in concert with human beings--not as an efficient replacement for them.)
"Oh, but if we have all these moderators, won't we be treating the adults like babies??"
No--and here's why: whether you're a kid or an adult, online is a weird place. You can be chatting or exchanging messages with someone who tells you he/she is in one place, when he/she might actually be next door. You might interpret someone's message syntax (if you get this far) as female when, in fact, you're chatting with a male. Everything about a profile can be faked to look like the person is real (stock photos touched up to look candid, unverifiable addresses, and other things). There are no guarantees that the info a person submits to a site is their correct personal information. Most of the time, the only verification is an active email address.
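To illustrate how little that last bit proves, the typical email check amounts to something like this (a generic sketch of my own, not any particular site's code): the site mails out a one-time link, and clicking it confirms only that somebody controls that inbox--nothing about their name, age, gender, or whereabouts.

```python
import secrets

pending_verifications = {}   # token -> (email, whatever the person claimed)

def start_signup(email, claimed_profile):
    """Mail a one-time link; the site learns nothing beyond inbox access."""
    token = secrets.token_urlsafe(16)
    pending_verifications[token] = (email, claimed_profile)
    print(f"Emailing https://example.com/verify?token={token} to {email}")

def confirm(token):
    """Clicking the link 'verifies' the account -- not the person behind it."""
    email, claimed_profile = pending_verifications.pop(token)
    return {"email": email, **claimed_profile, "identity_checked": False}

start_signup("someone@example.com", {"name": "Pat", "location": "next door?"})
```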
Bottom line: moderation is key to civility and safety in online communities. It doesn't have to be heavy-handed, and it can be assisted by technology--but it can't be done by one guy for thousands of messages (as I understand it has been at the Hartford Courant...)
If an industry--newspaper or otherwise--is going to skimp on moderation because of cost, or hire moderators who are either very inexperienced or hostile to the mores of online communities (because they've read too much Andrew Keen and think "why can't these idiots give up this anonymity crap!"), then perhaps that industry has no business building an online community in the first place.
One addendum: if a site is developed that insists on full disclosure, there should be mechanisms in place that shield certain information from others. However, this is no guarantee that the information entrusted to you won't get into search--unless there is some mechanism to block search. What may end up happening is that the community will consist only of certain kinds of individuals who are comfortable living out in the open online. That may be fine if meeting the goals of your community requires disclosure, but remember that your participation will be limited to certain kinds of folks. If you gain a good level of participation, then great! You may, however, have to live with a small, yet beautiful, community. Still, don't think some folks aren't going to try to spoof you with pseudonyms. There are no guarantees...
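For the "block search" part, one common mechanism (my suggestion, not something from the interview) is to tell crawlers not to index member pages at all--for example, with a noindex robots directive. A tiny sketch, assuming a Flask app with a made-up profile route:

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/members/<username>")
def member_profile(username):
    # Render the profile for community members as usual...
    page = f"<html><body>Profile page for {username}</body></html>"
    resp = make_response(page)
    # ...but ask search engines not to index or follow it, so the
    # information members entrusted to the site stays out of search results.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp

if __name__ == "__main__":
    app.run()
```

Of course, that only keeps the well-behaved crawlers out--which is exactly my point about there being no guarantees.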
For further reading: Getting Commenters to Play Nice from my friends at Poynter.org. My only criticism: what about when the newspaper's staff doesn't play nice??
Chris Brogan's great On Managing a Community gives a step-by-step strategy for good community management.
And Francois Gossieaux's Understanding the power of communities--even when you do not have a critical mass of users reminds us that no matter how small your community is, there's still a whole lot you can learn from your participants!
7 comments:
Excellent post Tish. I like your point about it not being about using real names, but also not being wide open.
thank you, Jack! I wasn't quite sure if that point was clear. :-)
Hi Tish - thank you for the link. Having been knee-deep in research on communities for the last few months, it is amazing to realize what is needed to make them work...
Hi Francois! I know what you mean! It's very easy to get caught up in either the research or the theoreticals of communities and forget about the "people" part of it. hope the research is going well :-)
On the "customizing" of profiles, I asked someone the other day to clarify for me which pic of him was real, the relaxed 20something or the suit and tie stern 50+ one: he said he used photos to suit the culture of the network. No, I didn't ask him to explain further - life's short enough!
Anonymity - whether via an avatar, screen-name or withheld personal info - has been an important part of using the internet in my literacy work with adults. They're not children, but they are a vulnerable population. They're newbies with a stigma ("you can't read?!?") and little off-line social capital. That means, among other things, that they're inexperienced in web conventions - little things like using *LOL* to say "I'm just kidding" - as well as ordinary writing conventions.
I'm always worried when I watch them step out on to the unmoderated web (even though I know that's part of what becoming an independent learner looks like).
As for myself, I do it exactly like you said: lurk for awhile, and then decide if it's safe. The difference between me and my learners, I suppose, is that I have more tools for identifying unsafe environments.
hi Des... wow! I find that comment from that guy rather callous! I'd call that a kind of "sock puppetry" of the highest order. That's also very different, IMO, from anonymity. Then again, his syntax might eventually give his age away...
Hi Wendell... I'm glad you stopped by to give your perspective. You work with a unique population--one I think most people (and many in the tech industry) don't think about. It's not that they need "protection" but time to learn, and to be guided. It's good that you're around to give them guidance--and from your own experience!