Do you use Facebook? Is it time to....
Posted 2018-3-30 00:55:18 | Latest reply 2018-9-21 12:44:29 | From Singapore | Original poster
Is it time to give up on social media? My take: Don't give up using Facebook, but don't trust it at the same time...
Many people are thinking about that in the wake of revelations regarding Cambridge Analytica's questionable use of personal data from over 50 million Facebook users to support the Trump campaign. Not to mention the troubles with data theft, trolling, harassment, and the proliferation of fake news, conspiracy theories and Russian bots.
If one looks beyond the tip of the iceberg (i.e., the symptoms), one may unearth the fundamental root cause. The real problem might be Facebook's business model. Along with other social media platforms, it makes money by nudging users to provide their data (often without understanding the potential consequences), and then using that data in ways well beyond what people may expect.
As researchers who study social media and the impact of new technologies on society in both the past and the present, we share these concerns. However, we are not ready to give up on the idea of social media just yet.
A main reason is that, like all forms of once "new" media (including everything from the telegraph to the internet), social media has become an essential conduit for interacting with other people. We don't think it's reasonable for users to be told their only hope of avoiding exploitation is to isolate themselves. And for many vulnerable people, including members of impoverished, marginalized or activist communities, leaving Facebook is simply not possible anyway. As individuals, and society as a whole, come to better understand the role social media plays in life and politics, they're wondering: Is it possible – or worthwhile – to trust Facebook?
Of course, social media platforms don't exist without their users. Facebook has grown from its origins serving only college students by exploiting the network effect: If all your friends are socializing on the site, it's tempting to join yourself. Over time this network effect has made Facebook not only more valuable, but also harder to leave.
However, now that Facebook and its ilk are under fire, it's possible that those network effects might unravel the other way: Facebook's number of active users continued to rise in 2017, but in the final three months of the year, its growth showed signs of slowing. If all your friends are leaving Facebook, you might go with them.
The design of social media platforms like Facebook – and many other common apps – is intentionally engrossing. Some scholars go so far as to call it "addictive," but we're uncomfortable using the term so broadly in this context. Nevertheless, digital designers manipulate users' behavior with a wide array of interface elements and interaction strategies, such as nudges and cultivating routines and habits, to keep users' attention.
Attention is at the center of the social media business model because it's worth money: Media theorist Jonathan Beller has observed that "human attention is productive of value." To attract users, keep them engaged and ensure they want to come back, companies manipulate the details of visual interfaces and user interaction. For example, the ride-sharing app Uber shows customers phantom cars to trick them into thinking drivers are nearby. The company uses similar psychological tricks when sending drivers text messages encouraging them to stay active.
This manipulation is particularly effective when app developers set default options for users that serve the company’s needs. For example, some privacy policies make users opt out of sharing their personal data, while others allow users to opt in. This initial choice affects not only what information users end up disclosing, but also their overall trust in the online platform. Some of the measures announced by Facebook CEO Mark Zuckerberg in the wake of the Cambridge Analytica revelations – including tools showing users which third parties have access to their personal data – could further complicate the design of the site and discourage users even more.
Was users' trust in Facebook misplaced in the first place? Unfortunately, we think so. Social media companies have never been transparent about what they're up to with users' data. Without full information about what happens to their personal data once it's gathered, we recommend people default to not trusting companies until they're convinced they should. Yet neither regulations nor third-party institutions currently exist to ensure that social media companies are trustworthy.
This is not the first time new technologies created social change that disrupted established mechanisms of trust. For example, in the industrial revolution, new forms of organization like factories, and major demographic shifts from migration, increased contact among strangers and across cultures. That altered established relationships and forced people to do business with unknown merchants.
People could no longer rely on interpersonal trust. Instead, new institutions arose: Regulatory agencies like the Interstate Commerce Commission, trade associations like the American Railway Association, and other third parties like the American Medical Association’s Council on Medical Education established systematic rules for transactions, standards for product quality and professional training. They also offered accountability if something went wrong.
There are not yet similar standards and accountability requirements for 21st-century technologies like social media. There is plenty of demand for more supervision of social media platforms. Several existing proposals could regulate and support trust online.
In the U.S., the Federal Trade Commission is one of the few regulatory bodies working to hold digital platforms to account for business practices that are deceptive or potentially unfair. The FTC is now investigating Facebook over the Cambridge Analytica situation.
Other countries have rules, such as the EU's General Data Protection Regulation and Canada's Personal Information Protection and Electronic Documents Act. However, in the U.S., technology companies like Facebook have actively blocked and resisted these efforts, while policymakers and other tech gurus have convinced people they're not necessary.
The main issue is: Facebook has the technical know-how to give users more control over their private data, but has chosen not to – and that’s not surprising. No laws or other institutional rules require it, or provide necessary oversight to ensure that it does. Until a major social media platform like Facebook is required to reliably and transparently demonstrate that it is protecting the interests of its users – as distinct from its advertising customers – the calls to break the company up and start afresh are only going to grow.
What do you think?
—
Summarised & adapted from
The Conversation