Posting here will be light to non-existent for the next few days, as I will be traveling and doing various other kinds of labor that can only be done when away from the computer. But maybe someday I’ll be wired in such a way that I will never be away from the computer: I will be in it and it will be in me. Or at least that’s the hope of Max Levchin, as expressed in a recent lecture.
Levchin sees human beings, and especially the brains of human beings, as underexploited analog resources. All those brain cycles going to waste while we sleep or daydream! Nick Carr’s analysis of Levchin’s talk is chillingly spot-on:
No need to think of analog resources in the aggregate anymore; networked sensors allow us to monitor and rationalize the utilization of each individual resource, each person in isolation. But you can go even deeper. You can begin to rationalize each individual’s internal resources. Imagine, as Levchin does, that everyone is hooked up to physical sensors that minutely monitor their health and behavior and send the data to a centralized processing system. An insurance company “looking at someone’s heart rate monitor data could make their cardiovascular healthcare cost-free.” Of course, if you engage in risky behavior (do you really want that third slice of pizza, or that third beer?) or have some suboptimal health reading (did your heart just skip a beat?), an alert from your insurer, or maybe your employer, or maybe the government, would immediately come through your smartphone notifying you that your health care premium has just been increased. Or maybe your policy has been cancelled. Or maybe you’ve been scheduled for a brief reeducation session down at the local office of the Bureau for Internal Resource Optimization.
This is the nightmare world of Big Data, where the moment-by-moment behavior of human beings — analog resources — is tracked by sensors and engineered by central authorities to create optimal statistical outcomes. We might dismiss it as a warped science fiction fantasy if it weren’t also the utopian dream of the Max Levchins of the world. They have lots of money and they smell even more: “I believe that in the next decades we will see huge number of inherently analog processes captured digitally. Opportunities to build businesses that process this data and improve lives will abound.” It’s the ultimate win-win: you get filthy rich by purifying the tribe.
There are the hackers like Aaron Swartz who have a deep suspicion of and resistance to the ways that vast corporate and governmental entities want to use electronic technologies to discipline and control people. And then there are the technologists like Max Levchin who can’t wait to participate in the disciplining and controlling, and to profit from it.
And Levchin makes his case for these exciting new developments in a very common way: “This is going to add a huge amount of new kinds of risks. But as a species, we simply must take these risks, to continue advancing, to use all available resources to their maximum.” Levchin doesn’t really spell out what the risks are — though Nick Carr does — because he doesn’t want us to think about them. He just wants us to think about the absolutely unquestionable “must” of using “all available resources to their maximum.” That “advance” in this direction might amount to “retreat” in actual human flourishing does not cross his mind, because it cannot. Efficiency in the exploitation of “resources” is his only canon.