The Test We Can — and Should — Run on Facebook
What we do know is that Facebook, like many social media platforms, is an experiment engine: a machine for running A/B tests and algorithmic adjustments, fueled by our every keystroke. This has been used as a justification for this study, and all studies like it: Why object to this when you are always being messed with? If there is no ‘natural’ News Feed, or search result or trending topic, what difference does it make if you experience A or B? […] There is no easy answer to this, but we could do worse than begin by asking the questions that Shils struggled with: What kinds of power are at work? What are the dynamics of trust, consent and deception? Who or what is at risk? While academic research is framed in the context of having a wider social responsibility, we can consider the ways the technology sector also has a social responsibility. To date, Silicon Valley has not done well in thinking about its own power and privilege, or what it owes to others. But this is an essential step if platforms are to understand their obligation to the communities of people who provide them with content, value and meaning.
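The "experiment engine" idea can be made concrete with a toy sketch. The code below is purely illustrative (the function name and details are my own, not any platform's actual implementation): it shows the standard hash-based bucketing trick, where every user is deterministically assigned to some variant, which is exactly why there is no 'natural' baseline experience to fall back on.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into an experiment arm.

    Toy illustration only: real platforms layer many concurrent
    experiments, but the core point stands -- every user always sees
    *some* configured variant, never an unconfigured 'natural' feed.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same arm of a given experiment,
# so the assignment is stable without storing any per-user state.
arm = assign_variant("user-42", "feed_ranking")
assert arm == assign_variant("user-42", "feed_ranking")
assert arm in ("A", "B")
```

Seeding the hash with the experiment name means a user's arm in one experiment is independent of their arm in another, which is how platforms run thousands of tests simultaneously.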
Algorithmic Accountability Reporting
We’re living in a world now where algorithms adjudicate more and more consequential decisions in our lives. It’s not just search engines either; it’s everything from online review systems to educational evaluations, the operation of markets to how political campaigns are run, and even how social services like welfare and public safety are managed. Algorithms, driven by vast troves of data, are the new power brokers in society.
Data isn’t a neutral asset. How and what data we collect is shaped by our priorities and our attention. What we do with that data is even more of a human construct. Last month, I attended a robotics conference down at SRI, where I saw a talk on autonomous technology by MIT professor David Mindell. In his presentation, he made the point (and I paraphrase) that autonomous technology isn’t actually free of human intervention. We’ve simply shifted when, and how, we interact with it. Our intentions, assumptions, biases, desires, and preferences (to name a few) are all baked into the sensors, algorithms, and other components that make these systems work.
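To see how the choice of what to collect shapes conclusions, consider a minimal sketch with hypothetical numbers (the sessions and field names below are invented for illustration). Logging only engaged users, a common instrumentation shortcut, quietly changes what the "same" metric measures:

```python
# Hypothetical session log: the *choice* of what to record is itself
# an editorial decision that biases every downstream conclusion.
sessions = [
    {"user": "a", "seconds": 5,  "clicked": False},
    {"user": "b", "seconds": 40, "clicked": True},
    {"user": "c", "seconds": 3,  "clicked": False},
    {"user": "d", "seconds": 55, "clicked": True},
]

# Pipeline 1: log every session.
avg_all = sum(s["seconds"] for s in sessions) / len(sessions)

# Pipeline 2: log only sessions that produced a click.
clicked = [s for s in sessions if s["clicked"]]
avg_clicked = sum(s["seconds"] for s in clicked) / len(clicked)

print(avg_all, avg_clicked)  # prints 25.75 47.5 -- the "same" metric diverges
```

Neither pipeline is "wrong" in isolation; the divergence comes entirely from a human decision about which events were worth recording.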