Video games are a special kind of play, but at root, they're about the same things as other games: embracing particular rules and restrictions in order to develop skills and experience rewards. When a game is well-designed, it's the balance between these factors that engages people on a fundamental level.
Mass literacy is a phenomenon of the past few centuries, and one that has reached the majority of the world's adult population only within the past 75 years.
I love video games. I'm also slightly in awe of them. I'm in awe of their power in terms of imagination, in terms of technology, in terms of concept. But I think, above all, I'm in awe at their power to motivate, to compel us, to transfix us, like nothing else we've ever invented has quite done before.
In classrooms full of students who range from the brilliant to the sullenly disaffected, it's games - and often games alone - that I've seen engage every single person in the room. For some, the right kind of play can spell the difference between becoming part of something and the lifelong feeling that they're not meant to take part.
For the moment, machines able to 'think' in anything approaching a human sense remain science fiction. How we should prepare for their potential emergence, however, is a deeply unsettling question - not least because intelligent machines seem considerably more achievable than any consensus around their programming or consequences.
Forget artificial intelligence - in the brave new world of big data, it's artificial idiocy we should be looking out for.
From exam grading to health education to professional training to democratic participation, paths towards self-realization and success in the world are often daunting and obscure: journeys only the privileged feel confident setting off along.
The biggest neurological turn-on for people is other people. This is what really excites us. In reward terms, it's not money; it's not being given cash - that's nice - it's doing stuff with our peers, having them watch us and collaborate with us.
The really interesting stuff about virtuality is what you can measure with it, because what you can measure in virtuality is everything. Every single thing that every single person who's ever played a game has ever done can be measured.
The earliest known writing probably emerged in southern Mesopotamia around 5,000 years ago, but for most of recorded history, reading and writing remained among the most elite human activities: the province of monarchs, priests and nobles who reserved for themselves the privilege of lasting words.
Once the words of a book appear onscreen, they are no longer simply themselves; they have become a part of something else. They now occupy the same space, not only as every other digital text, but as every other medium, too.
The best teachers, one hopes, don't shout at their students - because they are skilled at wooing as well as demanding the best efforts of others. For the ancient Greeks and Romans, this wooing was a sufficiently fine art in itself to be the central focus of education.
Vast volumes of mixed media surround us, from music to games and videos. Yet almost all of our online actions still begin and end with writing: text messages, status updates, typed search queries, comments and responses, screens packed with verbal exchanges and, underpinning it all, countless billions of words.
Above all, the translation of books into digital formats means the destruction of boundaries. Bound, printed texts are discrete objects: immutable, individual, lendable, cut off from the world.
I spoke at TED Global 2010 about the ways that video games engage the brain, and in particular, the idea of reward structures: how a challenge or task can be broken down and presented to make it as engaging as possible.
Over tens and hundreds of thousands of years, we evolved to find certain things stimulating, and as very intelligent, civilized beings, we're enormously stimulated by problem solving and learning.
As commentators like the American psychologist Gary Marcus have noted, it's extremely difficult to teach a computer to recognise cats. And that's not for want of trying.
If computers remain far worse than us at image recognition, a certain over-confident combination of man and machine can elsewhere take inaccuracy to a whole new level.