Human Learning: Beyond the Panopticon

In the quest to personalize your experience with the latest technologies, and as a reward for your dedicated participation in signing up for new services without reading the fine print in their consent agreements, the powers that be have a special gift for you. Yes, it’s your own personal panopticon! Previously only reserved for fiction writers, today you can have dystopia delivered — with free shipping!

They say it helps to laugh, but this isn’t really that funny anymore, with a sense of uneasiness beginning to build as the realities of Generation Tech start to set in for the long haul. The devices we perpetually use and that connect almost everything around us aren’t going anywhere, but every byte of data they collect is likely going somewhere. Is it in the cloud? A massive data storage facility? Individual dossiers?

Wherever our data goes, we pretty much know it’s being compiled. Companies use this information to build detailed profiles for marketing purposes, governments can potentially use it for enforcement, and others from insurers to employers can access much of it through garden-variety online searches. Our personal lives are increasingly being laid bare, often blithely dismissed as the cost of doing business.

Indeed, we’ve all probably had this experience by now: you search for something that you’ve never searched for before — maybe a household appliance or a healthcare provider in your area. Suddenly and seamlessly, ads for those services and related products begin popping up in banners and sidebars; you might even get solicitations on your other devices. And that’s only from conducting one simple search.

Extrapolating further, and drawing from routine revelations about hacking and backdoors, it appears that the depths of data mining are expanding all the time. With each new voice-activated ‘assistant’ or IoT-connected gadget, access to our private domains is being pried open more and more. No warrants need to be issued for doing this, nor is there any oversight committee; this all happens with consent.

And it’s just getting started in earnest. Soon enough, if not already at hand, there will be a record of every conversation you have, every keystroke you enter, every transaction you make, every person you interact with, every place you go, and everything you watch, listen to, like, and purchase. This will all be promoted as bringing greater convenience, promising security and mobility, and encouraging ‘sharing’.

Such observations are almost passé by now, seen as a downer at best or alarmist at worst. But the full implications are worth considering, even as the sense of resignation to the inevitable becomes almost palpable. As New York Times tech columnist Farhad Manjoo recently lamented, “Technology has crossed over to the dark side. It’s coming for you; it’s coming for us all, and we may not survive its advance.”

If our lives are an open book, what becomes of privacy? And perhaps more to the point: without privacy, what becomes of human development? Some may say they’re doing nothing wrong and have nothing to hide, but our rights weren’t designed to protect only the pure. A healthy society requires functional individuals; this includes spaces of autonomy, exploration, reflection, expression, and more.

More pointedly, how many of us can really say that our lives could withstand such an unprecedented level of total exposure? We spend a lot of time cultivating complex personas, engaging in “impression management,” building faces to the world that reflect our personal images and aspirations. We have ethical ideals, spiritual frameworks, and emotional cores. And we also have things we keep to ourselves.

This is natural, and it’s why privacy exists. Having unknown entities (or just anyone with a computer) peer through digital windows into our very being is a perverse form of high-tech voyeurism. The fact that access is often freely given doesn’t negate the responsibility of those collecting, storing, mining, and deploying the data being gleaned. Before considering alternatives, some implications are worth noting:

Manipulation: We already know what this looks like, since it’s often done openly. Our digital footprints are regularly used for marketing purposes, to tailor ads to our desires and information to our tastes. We’ve also seen a darker side, as with the propagation of “fake news” (the real fake news, not the fake fake news) and the deployment of targeted persuasion for political purposes. And there’s more to come.

Coercion: Maybe you’ve seen the Black Mirror episode where people with secrets and repugnant habits are blackmailed to engage in horrific behaviors? Imagine this playing out in more ordinary terms, less to make people do awful things than to lead them into deeper modes of obedience. In fact, the panopticon itself was conceived as a space of coerced conduct through constant surveillance, as the ultimate prison.

Control: And thus we reach the dystopian horizon of the panopticon, commensurate with the Orwellian tendencies already in evidence. Couched in the rhetoric of convenience and access, a web of technology that tracks our every impulse is fraught with implications for social control. Aptly, the lyric that “every step you take, I’ll be watching you” was intoned by The Police — released in 1983, but very much 1984.

There aren’t easy answers to these concerns. Perhaps if the societal ethos moved toward “watching the watchers” rather than simply yielding to total surveillance, things might improve. More oversight as to what’s collected and who has access is crucial, as are clearly marked rights and remedies. We might even demand technology that expands our privacy, rather than leveraging it for someone else’s gain.
