Threat Modeling in Digital Communities

· 04.08.2015

It's trolls all the way down

As part of my OpenNews fellowship I've recently started working on the Coral Project, a Knight-funded joint venture between the New York Times, the Washington Post, and Mozilla which, broadly, is focused on improving the experience of digital community online. That mission is a catch-all for lots and lots of subproblems; the set I'm particularly drawn to are the issues around creating inclusive and civil spaces for discussion.

Any attempt at this must contend with a variety of problems which undermine and degrade online communities. To make the problem more explicit, it's helpful to have a taxonomy of these "threats". I'll try to avoid speculating on solutions and save that for another post.

Trolls/flamers/cyberbullying

The most visible barriers to discussion spaces are deliberately toxic actors - generally lumped together under the term "trolls"*.

I think most people are familiar with what a troll is, but for the sake of completeness: trolls are the users who go out and deliberately attack, harass, or offend individuals or groups of people.

If you're interested in hearing more about what might motivate a troll, this piece provides some insight.

Astroturfing

Any mass conglomeration of spending power or social capital soon becomes a resource to be mined by brands. Thus many companies (and other organizations) have adopted the practice of astroturfing: simulating a grassroots movement.

For instance, a company gets a lot of people to rave about its products until you too, by sheer exposure (i.e. attrition), adopt a similar attitude as your baseline. This is a much more devious form of spam because it deliberately tries to misshape our perception of reality.

This can increase the amount of noise in the network and reduce the visibility/voice of legitimate members.

Sockpuppeting/Sybil attacks

A common problem in ban-based moderation systems is that barriers-to-entry on the site may be low enough that malicious actors can create endless new accounts with which to continue their harassment. This type of attack is called a Sybil attack (named after Sybil, the famous case study of a patient with dissociative identity disorder).

Similarly, a user may preemptively create separate accounts to carry out malicious activity, keeping deplorable behavior distinct from their primary account. In this case, the non-primary accounts are sockpuppets.

The ease of account creation seems to be the root of the problem, but I don't think the solution to Sybil attacks is to raise barriers-to-entry. Rather, we should ask whether banning is the best strategy in the first place. Ideally, we'd seek to forgive and reform users rather than exclude them (I'll expand on this in another post). Of course, that approach depends on whether the user is actually trying to participate in good faith.

Witch hunts

This is the madness of crowds that social networks can spawn. An infraction, whether real or not, whether big or small, goes viral to the point that the response is disproportionate by several orders of magnitude. Gamergate, which began last year and now seems to be a permanent part of the background radiation of the internet, is an entire movement that blew up from a perceived offense - one that was non-existent, and not even particularly problematic had it been real. In these cases, the target often becomes a symbol for some broader issue, and it's too quickly forgotten that this is a person we're talking about.

Eternal September

"Eternal September" refers to September 1993, when AOL expanded access to Usenet caused a large influx of new users, not socialized to the norms of existing Usenet communities. This event is credited with the decline of the quality of those communities, and now generally refers to the anxiety of a similar event. New users who know nothing about what a group values, how they communicate, and so on come in and overwhelm the existing members.

Appeals to "Eternal September"-like problems may themselves be a problem - they can be used to rally existing community members to suppress a diversifying membership, in which case they're really no different from any other kind of status quo bias.

To me this is more a question of socialization and plasticity - that is, how should new members be integrated into the community and its norms? How does the community smoothly adapt as its membership changes?

Brigading

Brigading is the practice in which organized groups seek out targets - individuals, articles, etc. - that criticize their associated ideas, people, and so on, then go en masse to flood the comments in an incendiary way (or otherwise act harmfully).

This is similar to astroturfing, but I tend to see brigading as more of a bottom-up movement (i.e. genuinely grassroots and self-organized).

Doxxing

Doxxing - the practice of uncovering and releasing personally identifying information without consent - is by now notorious and is no less terrible than when it first became a thing. Doxxing is made possible by continuity in online identity: the attacker needs to connect one particular account to others, which can be accomplished by linking the same or similar usernames, email addresses, or even personal anecdotes posted across various locations. This is one reason why pseudonyms are so important.

Swatting

Swatting is a social engineering (i.e. manipulative) "prank" in which police are called in to investigate a possible threat where there is none. It isn't new but seems to have had a resurgence in popularity recently. What was once an act of revenge (i.e. you might "swat" someone you didn't like) now seems to be done purely for the spectacle (i.e. without consideration of who the target is, just for lulz) - for instance, someone may get swatted while streaming themselves on Twitch.tv.

The Fluff Principle

The "Fluff Principle" (as it was named by Paul Graham) is where a vote-driven social network eventually comes to be dominated by "low-investment material" (or, in Paul's own words, "the links that are easiest to judge").

The general idea is that if a piece of content takes one second to consume and judge, more people will be able to upvote it in a given amount of time. Thus knee-jerk or image macro-type content comes to dominate. Long-form essays and other content that takes time to consume, digest, and judge just can't compete.

Over time, the increased visibility of the low-investment material causes it to become the norm, so more of it is submitted, so the site's demographic comes to expect it, and thus goes the positive feedback loop.
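To make the arithmetic concrete, here's a minimal simulation in Python - the consumption times, session length, and approval rate are all made-up numbers, not measurements from any real site:

```python
import random

# hypothetical content types: (name, seconds to consume and judge)
ITEMS = [("image macro", 1), ("short post", 30), ("long-form essay", 600)]

SESSION_SECONDS = 120   # assumed time a visitor spends browsing per session
VISITORS = 10_000       # assumed audience size
APPROVAL_RATE = 0.5     # assume visitors like every content type equally

votes = {name: 0 for name, _ in ITEMS}

for _ in range(VISITORS):
    budget = SESSION_SECONDS
    while True:
        # a visitor only consumes items they still have time for
        affordable = [(n, c) for n, c in ITEMS if c <= budget]
        if not affordable:
            break
        name, cost = random.choice(affordable)
        budget -= cost
        if random.random() < APPROVAL_RATE:
            votes[name] += 1

print(votes)  # image macros rack up votes; the essay never gets a single one
```

Even though visitors approve of everything at the same rate, the one-second item accumulates vastly more votes simply because more of it fits into each visitor's attention budget - and raw vote counts are all a naive ranking algorithm sees.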

Power-user oligarchies

In order to improve the quality of content or user contributions, many sites rely on voting systems or user reputation systems (or both). Often these systems confer greater influence or additional controls in accordance with social rank, which can spiral into an oligarchy. A small number of powerful users end up controlling the majority of content and discussion on the site.
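Here's a toy sketch of that feedback loop - a preferential-attachment model with arbitrary numbers, not how any particular site actually works. Suppose posts by high-reputation users are surfaced more often, so they attract votes (and thus more reputation) in proportion to the reputation their authors already have:

```python
import random

N_USERS = 100
reputation = [1.0] * N_USERS  # everyone starts out equal

for _ in range(10_000):  # rounds of posting and voting
    # visibility is proportional to existing reputation, so each new
    # vote tends to land on an already-prominent user's post
    author = random.choices(range(N_USERS), weights=reputation)[0]
    reputation[author] += 1.0

top10 = sorted(reputation, reverse=True)[:N_USERS // 10]
print(f"top 10% of users hold {sum(top10) / sum(reputation):.0%} of all reputation")
```

Run it a few times: despite identical starting conditions and no difference in user quality, a handful of early-lucky users end up holding a disproportionate share of the reputation - the rich-get-richer dynamic at the heart of these oligarchies.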

Gaming the system

Attempts to solve any of the above typically involve creating some kind of technological system (as opposed to a social or cultural one) to muffle undesirable behavior and/or encourage positive contribution.

Especially clever users often find ways of turning these systems against their purpose. We should never underestimate the creativity of users under constrained conditions (in both bad and good ways!).


Whether or not some of these are problems really depends on the community in question. For instance, maybe a site's purpose is to deliver quick-hit content and not cerebral long-form essays. And the exact nature of these problems - their nuances and idiosyncrasies in a particular community - is critical in determining what an appropriate and effective solution might be. Unfortunately, there are no free lunches :\

(Did I miss any?)


* The term "troll" used to have a much more nuanced meaning. It once referred to subtle social manipulators, engaging in a kind of aikido in which they caused people to trip on their own words and fall by the force of their own arguments. They were adept at playing dumb to draw out our inconsistencies, hypocrisies, failures in thinking, or inappropriate emotional reactions. But you don't see that much anymore...just the real brutish, nasty stuff.