There's this thing that happens on the internet, maybe you've seen it...
Someone makes a short video, an explainer with quick, punchy sentences, or maybe remixes a meme with big, expressive hand movements, and, just like that, a complex and nuanced topic becomes more accessible to a mass audience.
The most obvious version of this phenomenon can be seen in those short sidewalk or living room dance videos aimed at going viral on TikTok, with lots of cultural appropriation involved (if you need a reference, Khalil Greene breaks it down with examples).
As an art form, dancing has a whole history, context, and set of perspectives, and street dance is a subset that tends to be more improvisational. In a lot of ways, it creates culture that more formal dance then responds to.
One thing stands out when you look at these clips, particularly the ones that go viral: a big part of these dancers' job is fitting their moves into a 9:16 frame, the standard vertical video size for TikTok and other mobile-focused video apps.
When an ecosystem can’t or won’t reject bad actors
How much of what we see online is real?
It’s a question we’re all facing, made worse by the fact that people often fail to look closely at the information they consume, and sometimes quickly fire it back into the world without looking at all.
The millions of fake accounts and bots described by the New York Times over the weekend show the problem has reached such massive levels that, if social media giants gave the same scrutiny to bots and fake audiences as they have to Russian interference in the 2016 presidential election, it’s doubtful the results would show that anyone has gone without at least one fake retweet or favorite.
In nature, a healthy ecosystem by definition rejects or minimizes bad actors to ensure variation and longevity. But in the case of social media platforms, this dynamic can be deceptive, because most tech startups are optimized for growth and growth alone.