Kate Crawford

Self-tracking using a wearable device can be fascinating.

Kate Crawford

Surveillant anxiety is always a conjoined twin: The anxiety of those surveilled is deeply connected to the anxiety of the surveillers. But the anxiety of the surveillers is generally hard to see; it's hidden in classified documents and delivered in highly coded languages in front of Senate committees.

Kate Crawford

Big Data is neither color-blind nor gender-blind. We can see how it is used in marketing to segment people.

Kate Crawford

We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.

Kate Crawford

Data is something we create, but it's also something we imagine.

Kate Crawford

Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters - from who designs it to who sits on the company boards and which ethical perspectives are included.

Kate Crawford

Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited.

Kate Crawford

We need a sweeping debate about ethics, boundaries, and regulation for location data technologies.

Kate Crawford

Numbers can't speak for themselves, and data sets - no matter their scale - are still objects of human design.

Kate Crawford

While many big-data providers do their best to de-identify individuals from human-subject data sets, the risk of re-identification is very real.

Kate Crawford

If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.

Kate Crawford

Data will always bear the marks of its history. That is human history held in those data sets.

Kate Crawford

We should have equivalent due-process protections for algorithmic decisions as for human decisions.

Kate Crawford

Only by developing a deeper understanding of AI systems as they act in the world can we ensure that this new infrastructure never turns toxic.

Kate Crawford

Error-prone or biased artificial-intelligence systems have the potential to taint our social ecosystem in ways that are initially hard to detect, harmful in the long term, and expensive - or even impossible - to reverse.

Kate Crawford

If you have rooms that are very homogeneous, that have all had the same life experiences and educational backgrounds, and they're all relatively wealthy, their perspective on the world is going to mirror what they already know. That can be dangerous when we're making systems that will affect so many diverse populations.

Kate Crawford

As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.

Kate Crawford

We should always be suspicious when machine-learning systems are described as free from bias if they have been trained on human-generated data. Our biases are built into that training data.

Kate Crawford

Big data sets are never complete.

Kate Crawford

Facebook is not the world.