Kate Crawford

We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future.

Kate Crawford

Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to.

Kate Crawford

Hidden biases in both the collection and analysis stages present considerable risks and are as important to the big-data equation as the numbers themselves.

Kate Crawford

Vivametrica isn't the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money.

Kate Crawford

Algorithms learn by being fed certain images, often chosen by engineers, and the system builds a model of the world based on those images. If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces.
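
As a purely illustrative sketch of the mechanism described above (not Crawford's work, and using synthetic numbers rather than face images), the hypothetical Python example below trains a scikit-learn classifier on data dominated by one group and reports the resulting accuracy gap for the underrepresented group; every name and number in it is an assumption chosen for illustration.

```python
# Illustrative sketch only: a synthetic stand-in for a recognition task, showing how
# a model trained mostly on one group performs worse on an underrepresented group.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def make_group(n, signal_axis):
    # Each group's label signal lives on a different feature axis, standing in
    # for group-specific characteristics the model has to learn to recognize.
    y = rng.integers(0, 2, n)
    X = rng.normal(0.0, 1.0, (n, 2))
    X[:, signal_axis] += np.where(y == 1, 2.0, -2.0)
    return X, y

# Group A dominates the training set; group B is barely represented.
Xa, ya = make_group(4750, signal_axis=0)
Xb, yb = make_group(250, signal_axis=1)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate on balanced held-out sets for each group.
Xa_test, ya_test = make_group(1000, signal_axis=0)
Xb_test, yb_test = make_group(1000, signal_axis=1)
print("accuracy on group A:", accuracy_score(ya_test, model.predict(Xa_test)))
print("accuracy on group B:", accuracy_score(yb_test, model.predict(Xb_test)))
# The dominant group scores near-perfectly; the underrepresented group lands much
# closer to chance, even though it would be just as learnable with balanced data.
```

The point is the one the quote makes: whichever group dominates the training set largely determines what the resulting model can recognize.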

Kate Crawford

There's been the emergence of a philosophy that big data is all you need. We would suggest that, actually, numbers don't speak for themselves.

Kate Crawford

People think 'big data' avoids the problem of discrimination because you are dealing with big data sets, but, in fact, big data is being used for more and more precise forms of discrimination - a form of data redlining.

Kate Crawford

The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses.

Kate Crawford

While massive datasets may feel very abstract, they are intricately linked to physical place and human culture. And places, like people, have their own individual character and grain.

Kate Crawford

Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.

Kate Crawford

There is no quick technical fix for a social problem.

Kate Crawford

As we move into an era in which personal devices are seen as proxies for public needs, we run the risk that already-existing inequities will be further entrenched. Thus, with every big data set, we need to ask which people are excluded. Which places are less visible? What happens if you live in the shadow of big data sets?

Kate Crawford

Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems.

Kate Crawford

With big data come big responsibilities.

Kate Crawford

When dealing with data, scientists have often struggled to account for the risks and harms that using it might inflict. One primary concern has been privacy - the disclosure of sensitive data about individuals, either directly to the public or indirectly from anonymised data sets through computational processes of re-identification.

Kate Crawford

The fear isn't that big data discriminates. We already know that it does. It's that you don't know if you've been discriminated against.

Kate Crawford

Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.

Kate Crawford

Books about technology start-ups have a pattern. First, there's the grand vision of the founders, then the heroic journey of producing new worlds from all-night coding and caffeine abuse, and finally, the grand finale: immense wealth and secular sainthood. Let's call it the Jobs Narrative.

Kate Crawford

Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.

Kate Crawford

It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science.