What are we worrying about when we worry about TikTok? | Samantha Floreani

Is there any platform that creates as much collective angst as TikTok?

For some, TikTok is just a silly video app. For others, it’s a symbol of our most potent social and political fears. What are young people engaging with? Isn’t it collecting a huge amount of data? Are they being dragged down dangerous rabbit holes? And is China spying on them?

Concerns about data privacy, hyper-personalisation and exposure to potentially harmful content are all reasonable. But sensationalist headlines and reactionary calls for stricter content moderation – or for banning the app entirely – risk missing the forest for the trees.

TikTok is not some strange aberration; it’s the logical next step on the path of platform capitalism laid down by the platforms that came before it. It’s a product of a privatised internet that best serves applications designed, ultimately, not for people but for profit.

I confess: I really like TikTok. For me, it’s become a place of joy and absurdity among the rage, horrors, and tedium of its competitors. As a digital rights and privacy advocate, admitting this feels like a dirty little secret.

The thing is, it’s possible to simultaneously hate a platform but love the people on it and the things they create.

But my experience of TikTok is likely to be completely different to yours; that’s by design. TikTok’s commitment to algorithmically curated content is one of the reasons it stands out from the rest. The “For You” page is responsible for its popularity and profitability – but also its harm.

As with all social media, there are myriad horrendous marks against TikTok. From TraumaTok and content encouraging disordered eating and self-harm to influencer propaganda attempting to recruit Gen Z to the military, there is no shortage of reasons to worry.

There are also plenty of examples of TikTok being used for social good. Labourers have used it to gain visibility and criticise their working conditions; it’s the home of a growing Indigenous creator community; and many young people use it to organise and amplify their voices on critical political issues.

What are we really worrying about when we worry about TikTok? Most concerns seem to be misdirected anxieties about the broader status quo of the platform ecosystem. Almost all widely used digital platforms threaten the privacy and security of users. They share information with various governments, have the capacity for cultural and ideological influence, and exploit user data for profit.

TikTok has shifted emphasis away from mass virality and toward maximum niche-ification. Once it has determined what keeps someone on the app, it takes them deep into the obscure content trenches. Perhaps they lingered on a couple of sad heartbreak videos and now they’re being bombarded with depression content, or re-watching a controversial political video led them to conspiracy theories. Wherever they end up, once there, it can be incredibly hard to get out.

This is partially why online anonymity is so important – it gives people the grace of exploration and inquiry. It allows people to make choices, change their minds, learn, and grow. TikTok doesn’t make room for this kind of internet exploration; it makes it impossible to have curiosity without consequence.

TikTok isn’t alone in using recommender algorithms to curate personalised, engagement-driven content feeds, but it does take the approach to the extreme. This is profitable both because it keeps people scrolling and because there is very little difference between personalising content and personalising ads.

Because of its monumental success, other apps are attempting to follow in TikTok’s footsteps, giving us a glimpse into the current trajectory of social media. Instagram recently faced backlash when it started prioritising recommended short-form videos, and just last week, Twitter made the algorithmic feed the default. With a business model this lucrative, it’s not enough to fight TikTok alone.

Let’s go down our own rabbit hole: if you’re worried about algorithms showing people problematic content, you should be worried about targeted advertising. The logic of personalised engagement is the same. And if you’re worried about targeted advertising, you should be worried about the way data is collected for profit under surveillance capitalism. That’s what enables it.

And if you’re worried about surveillance capitalism, you should be worried about regular old capitalism. Profit is what drives companies toward invasive data collection and the development of algorithms that keep people on their apps for longer.

But online spaces run for profit aren’t preordained. This is a choice, and we could make a different one. What might social networking look like if the incentive to make money was removed? What might be built if it was in the hands of the people, with the motive being connection, creativity, or community, rather than market competition?

This is not a call to apathy, but rather, to think bigger. It’s an invitation to take those concerns about TikTok and reorient them. It’s time to broaden our collective political imagination of the kind of online experiences that could be possible if we break the profit-motive stranglehold and make room for publicly owned and collectively controlled social technology.

via the Guardian, 22 January 2023