The Hill

th-000.jpg

Figure 1: "You ain't running things round here." – Tank Commander Roberts

'Smart' digital technology as a character

Complex ideas can sometimes be understood by personifying them. If "AI" had a character, what might it be? 1

I recently re-watched Sidney Lumet's underrated and powerful 1965 film The Hill, which stars Sean Connery. It contains timeless psychological themes of abuse, courage, ambition, compassion and duty. In one of its characters I found an unexpected insight that connects to digital technology.

Insidious projects

I've been searching for a word, and somehow an idea, within my writing on Digital Self Defence and Digital Veganism. It is something that has felt very hard to capture. Perhaps because it is so dark. It is a technological critique absent in both Neil Postman's and Jacques Ellul's observations.

Theirs remains an account rooted in Aldous Huxley's world, in which we welcome overt technological control. If that technology has a character it is as a faux friend, parent or sibling. George Orwell made "Big Brother" an attachment figure, to be loved and feared, but nonetheless a bold and overt figure.

A central character in Lumet's film has qualities of insidiousness. That quality of pernicious and subtle harm is also linked, etymologically, to that which insinuates itself in a devious manner. It sidles in, unnoticed. It makes itself "indispensable".

The Hill

In Lumet's film, technology is not the enemy. In fact there is no technology to speak of, except perhaps a wind-up field telephone. This is not a Mad Max film. Nor do modern issues of tracking, surveillance, profiling, propaganda or technologically enabled imprisonment, coercion, torture or anything else take the stage. They are entirely absent.

The film is set in the African desert, within a spartan British Army prison camp 2 for low level offenders, petty thieves and insubordinates. It is as 'low tech' as one could get. The rewards are sleep and drinking water and, save an occasional outburst of fists or verbal abuse, the main punishment is brutal physical exercise over a man-made sand hill in the camp compound.

th-001.jpg

Psychologically the film is sophisticated. There are none of the two-dimensional tough guys common to American-style prison dramas. No armed perimeter guards - just the blazing desert. Thus we find notes of other 'prison colony' stories, where the guards and prisoners are sintered into a codependent community (e.g. Alien 3). As Connery's character Roberts says, ``Everyone is doing time here, even the screws''.

Malevolence is located in the insidious character of Williams (brilliantly played by Ian Hendry), one of the prison officers. He is new, young, charming, cleverly guarded, deceptive, and psychopathically ambitious. Williams comes into a functioning community, finds weaknesses in the existing order, and utterly destabilises everything.

Metaphors and plot

The prison itself represents something complex and nuanced, something quite human, cleverly cast in the role of an inhuman spectacle. It is Kierkegaardian in its contradiction. The prison community is poised between ostensible necessity and utter absurdity. The pointlessness of fit men being squandered over procedural transgressions, while a real war rages, is patently silly 3.

Yet, with Pythonesque absurdity, the commander takes seriously the job of 'reconstructing his good soldiers'. The atmosphere is of a 'cruel but fair' assembly of flawed yet 'basically good men' doing the best they can with outdated values in a ridiculous situation. They mark their time with ritual and as much dignity as possible until their obligations to one another are discharged. It is Waiting for Godot in desert drabs.

The centrepiece of the camp, the hill represents Sisyphean make-work (Graeber). My interpretation is of industrial capitalism. It is at once an object of pride (the Protestant work ethic) and of self-torture. That is digital technology in a nutshell: as developers we build the same apps over, and over, and over. Billions of man-hours of labour go into code projects that are scrapped, or that operate for only a few months before being abandoned. If tech was once exciting and creative - and it is still portrayed as a creative profession to young programmers - it is now an intellectual labour camp under Big Tech tyranny.

Ripe for the taking

Enter Williams, who by contrast embodies hungry, systematic malintent. He hides behind a cleverly constructed, sprightly facade of competence and purpose. He is a solution looking for a problem. He seems disciplined and humorous, perhaps a little creepily 'too good' to superiors and prisoners alike, until the sadism starts to show through. Williams repurposes the hill as his instrument of torture.

Williams is able to play systems. He can infiltrate and insinuate himself into relations, appear popular and even gain nominal respect. It is only in the final moments of the film, through some brilliant writing by R.S. Allen (play) and Ray Rigby (screenplay), that Williams's real identity as a sadist, usurper and saboteur unravels.

However, he is outmanoeuvred by tank commander Roberts, who is determined to bring Williams to justice for the death of a prisoner. Roberts has Williams sussed out; he sees exactly what is going on, and tells the camp commander:

``You ain't running this place, Bert, WILLIAMS is! Look at him! He took over days ago! You STILL haven't caught on!''

th-003.jpg

Figure 2: "You still haven't caught on?"

The switch

What does any of this have to do with technology? Throughout my long career as a computer scientist, and most recently in my studies of cyber security, I have seen this same character in many metaphorical guises. The character performs the same trick of being in some ways invisible. He is able to appear useful and agreeable, while at the same time being ruthlessly expansive and infectious, building networks and lists of friends and enemies. Occasionally, when the mask drops, he shows terrifying flashes of potential malice.

Technically, the usurper may be manifest as software or hardware, as a new 'preferred' vendor, or as new policy quietly inserted in a workplace or by a government. These days he appears as an entire technologically micro-managed corporate culture, from which common sense and humanity have been driven out.

One narrative always accompanies this insinuator. The tired order of established leaders, who ought to rise to their responsibility of stewarding new blood, is simultaneously enchanted and threatened. They ignore all the warning signs. They hope the rising power will become more rounded and mature. They refuse to recognise the latent danger. Essentially they shrug, dreaming of their retirement and leaving the problem to the next generation.

(note:2023) Notice the unsettling allegory with "AI" in the present hype cycle.

As with the rise of all tyrants, for those who see the approaching danger it is difficult to warn others. The majority are quite hypnotised, reluctant to believe bad things. They are unsure whether it is prudent to challenge the rising tyrant, or hedge their bets for 'staying on his good side'. It's a tale told in many tragedies involving gifted young magicians, such as George Lucas's Star Wars saga.

Cybernetic tyranny

There is a "Williams" in every organisational bureaucracy, in the form of malevolent systems that nobody dares to challenge: cybernetic management, algorithmic decision-making, immutable processes that nobody understands.

The vital question, which Connery's character has the courage to ask, is:

"Who is in charge here?"

The answer should send shivers down the spine of every Principal, Director, Dean, CEO, Pilot, Captain or General - anybody who thinks they have control. What now defines Western society, pretty much, is that people are not in control; 'systems' are. Williams is.

We mistake uniformity for fairness, efficiency or profit for success, and we label anything more human as corruption. To simply blame Capitalism or Communism is off the mark. There are far more ancient human traits at work than these modern ideological symbols. A fusion of Maoist ideologies with surveillance capitalism, to create consumer-communism at the level of data rather than politics, is just the latest twist. In search of new gods, servants of technology happily live with the delusion that people still run the show.

th-004.jpg

Figure 3: "The twenty-first century is, fundamentally, a lack of integrity"

(Added 2024): Whether it is perfectly serviceable planes dropping from the sky because pilots cannot override the controls, self-driving cars veering into oncoming traffic, parking tickets issued by faulty visual recognition systems to drivers who were provably thousands of miles away at the time, or even 'officially dead' patients being shipped to the morgue despite doctors seeing them breathing, we are entering a new era of extraordinary absurdity, in which humans are nothing but vestigial tokens of authority.

And this is not really to do with 'AI' (though these are the necessary conditions for AI really to take over). There are many ways that automation, sensor technology, machine learning and common information systems can hugely benefit individuals and society. One problem is that they are used as crutches rather than tools, to replace rather than augment skills. AI (artificial intelligence) is the toxic cousin of IA (intelligence amplification).

A second, and more serious, problem lies in our psychological and cultural position: we defer to technological tools. Instead of treating them as subordinate extensions of our own will, we raise them to authorities and kneel before them. More than a few times in the last year I have heard CEOs and general managers bemoan that ``the system won't let us do that''. The policies and directions of huge companies are dictated by 20-year-old database schemas and poorly designed, inflexible software systems.

They no longer say: "I'm in charge here, and I want you to fix this". They lack the courage that comes with even the most basic knowledge of technology. Instead they meekly ask some mid-level administrator for permission, who in turn tells them what the "system will allow".

(Note 2026:) The plan now is to fire the CTO, sysadmin or CISO and have us negotiate directly with the system itself - a system presumably programmed and updated from Redmond. Cue the dialogue between Doolittle and the bomb in Dark Star.

It is just easier to 'obey' systems than to change them. CEOs defer to IT people instead of telling them what to do. IT people defer to giant monopoly vendors. Nobody wants to step up and take responsibility. The result is an insidious orthogonal power hierarchy that bleeds control and responsibility away from the human agents with whom they ought to rest. It renders critical national infrastructure vassal to giant foreign tech monopolies.

The triumph of systems is also, always, their triumph over their creators. In pursuit of cost cutting we have driven the people who really understand technology out of our businesses. Indeed 'techies', if they cannot be contained within ICT, are seen as irritants to modern management, since they cannot easily be brought under the heel of punitive technological micro-management 4.

Corporate control is now the art of personally avoiding it. I believe this rampant abdication of human power and responsibility is an important aspect of (2026:) current "AI" automation ambitions, and the Digital Self Defence needed to counter it.

The seeds for this desire to abdicate were sown in the highest operational levels of organisations long ago. During systems analysis it is not uncommon to chase around inside an organisation for months or years looking for a person who owns a process or dares to give a definitive answer. Everyone knows what their job isn't.

When systems collapse

Spoiler: at the end of the film the camp descends into madness. Williams explodes in a homicidal (perhaps suicidal) rage, having lost his power game to Roberts. A note of tragedy is that he manages to take the others down with him once things spiral into violence.

We are already approaching a state where our technological systems are our masters, and they have the power to take us all down if we challenge them in the wrong way. (2026:) What will happen after the "AI" bubble bursts? What will be salvageable from a ruined tech industry? How will we meet even basic IT needs?

Real authority, which must be granted, not assumed, is closely related to trust. Ross Anderson offers an elegant definition of extrinsic trust (one commonly attributed to the NSA's culture): trust is a quality invested in another, and therefore:

``Trust is the ability to do harm''.

Great harm is often done by trusted systems simply when they fail. Disaster does not require that the trusted cybernetic systems we build be the sentient, malevolent AIs of science fiction. We need only depend on them helplessly and never question them.

Neil Postman repeatedly asks why we choose to externalise trust in systems, and hints at a misanthropic principle. I have heard Silicon Valley techno-utopianism called out as 'species treachery'. There is a self-loathing at the bottom of many people's deification of technology.

Our narrative, as technologists, is that the road to hell is paved with good intentions and we are just innocent travellers. At worst we are handing the Behemoth machinery the keys to our lives, out of tiredness, weakness and fear in the face of overpopulation and climate threats.

But as Postman points out, the solution to all these challenges lies not with technology but with human endeavour, education, creativity and community. Technology may help, but only if the necessary human conditions are the foundation. If we ignore Nietzsche's warning and go down the path of raising technology to our new god, we must accept that, like the West's Christian God, technology will only save us if we are worthy of saving. Some will be chosen, some will not.

What Postman somewhat misses, though Wiener and Meadows come closer, is that systems have a life of their own. Every corporation doing evil and trashing the planet is made up of ordinary men (Browning). Good family men go along to get along. They can easily believe that somewhere, someone is in charge, some beneficent authority who approves and underwrites their actions.

th-005.jpg

Figure 4: "Good, ordinary men"

We are all good soldiers in a crazy war. Digital technology, with its many dark sides, is more than the sum of the good scientists and entrepreneurs who want to make the world a better place. It has its own character, which is insidious regardless of our intentions as engineers. It is expansive, self-justifying, and naturally displaces much of what we consider human virtue. Meanwhile, when we extend the leash of moral relativism, it is happy to disguise itself as 'beyond good and evil'. Part of us wants that destruction, because it is part of each of us.

Sometimes it tries to kid me
That it's just a teddy bear
And even somehow manage to vanish in the air
And that is when I must beware
Of the beast in me
- Nick Lowe

Technology as an imaginary friend

All technology arises in a context and culture with particular aims and taboos. It is loaded with overt and hidden values from its creators.

We are at a juncture in history where we should be starting to see digital technology with the same caveats as tobacco, automobiles and pesticides. We need to actively build measures, and set limits, to discipline that character. "AI" is not a cute Tamagotchi.

(Note 2026:) As kids we create imaginary robot superhero friends. They give us superpowers. The crew of Blake's 7 had Zen. Gene Roddenberry created Commander Data to explore human-cyborg relations. Star Wars Trilogy One is really a robot love affair. HAL in 2001: A Space Odyssey and the computer in Dark Star are the odd exceptions in their psychopathic logic. Today what people call "AI" is really a fantasy projection of middle-aged investors and Silicon Valley men who never grew out of those fantasies.

Most technology needs "selling" because it is just a tool. Beyond things like hammers and bicycles, it is rare to encounter any technology so obviously useful and intuitive that a user immediately recognises its affordances and takes to it naturally. Users need training at the least, and in most cases convincing, before they take it up.

Digital "smart" technology is a solution looking for a problem, a way to sustain growth in a tech industry that would otherwise collapse, since we have passed three sigma of utility and hit fundamental resource limits around rare metals (and some would argue we have passed "peak technology" overall). Instead of perfecting the technology we have, we are pouring money into rich kids' sci-fi toys.

So the "markets" need bullying, through peer pressure and threats of being "left behind". While technology advances in a technical sense, a parallel social project advances technology uptake as a fantasy. It is, as Chomsky would say, simply manufacturing consent (frivolous demand). Science fiction films are useful for creating a benign Star Trek universe. When even that fails, because people are broadly satisfied, another method is not to sell or promote the technology at all, but to insinuate it quietly, poisonously, into our lives as "necessity".

Attachment valuation of technology

Having made the transition, users will often say "I cannot imagine my life without this now". That may be true, but it is not to be taken as an endorsement. If you are in a relationship with someone, then break it off to date a new partner, the feeling that you "cannot imagine going back to your ex" is common enough. That does not make your current partner objectively "better" than the last. For the most part we all make complex rationalisations that leave us feeling happiest where we are, and for mentally healthy people at least, relationships are not a ladder to climb through successively more desirable partners.

This is equally true of technology. CPUs may get faster, memory larger, devices smaller and battery life longer, but there is no objective and inarguable sense in which technology gets "better". Better for what? Most of the time, efficiency is overrated.

For example, consider the "entertainment value" derived from computer games. Most gamers will tell you about a glorious experience from 10 or 20 years ago and how "games have gone downhill since then". Many people still use old applications or hardware because they have a significant investment of time and mind in them. People form attachments and relations to technology because it has become part of their mind-body system (Heidegger); it sets their routines and structures their world.

(Note 2026:) "AI" companies are trying to recreate that as a more "intimate" relationship between chat-bot technologies and users. It fails because the bots are either clownish, incompetent, treacherous, or oily sycophants hiding psychotic malintent - like Williams! It may be that relational connection with "tools" is simply a psychological impossibility for humans. The only available psychological model is slavery (domination) - which should be troubling enough to contemplate. Though Roddenberry's creation of Commander Data is entertaining and thought-provoking, it remains Shakespearean theatre. Shelley (Prometheus) is probably closer to the mark.

But on the whole, people are less happy with new technology than they were, even when it is technically better. One factor is the extraordinary velocity of development, which leaves us feeling dissatisfied no matter what we buy. Another is plummeting quality under conditions of end-stage capitalism. From an attachment perspective, a further reason is that people feel "out of control". We feel that technology is forced upon us, and that we do not make genuine relational choices about it.

Footnotes:

1

Originally written January 2018 for a book of essays that was never published. At the time I considered it too negative about technology. I had not yet finished reading Mumford, or even discovered Ursula Franklin. Back then, pre-pandemic, the term "AI" was used almost exclusively in academic circles (where it actually had some real substance for computer scientists, and one would never have cheaply misused it as we do today) and in science fiction. Since then the character of Williams has come to represent not so much "AI" as the megalomaniac "tech bro" culture that drives it. January 2026: I have added comments addressing the current "AI" bubble and the deterioration of US leadership into fascism. One reason for reanimating this monster is that Kate, Helen and I will soon be looking at more themes connecting attachment theory and tech.

2

In British military culture of the 1950s and 60s these were called the 'Glasshouse'. Rather few of the prisoners were actually there on charges of violence or thievery, as both were normalised and generally ignored in the army. The camps were used to contain a large number of what would now be called PTSD cases. These patients were still being labelled for cowardice and desertion even into the Korean War, despite the earlier work of Wilfred Bion and John Rickman, who pioneered the Northfield projects to approach combat trauma on a more mature footing.

3

Today, as developers with valuable, hard-earned skills, we toil pointlessly making better ways to sell shampoo or trick people into clicking on malware. Meanwhile the climate catastrophe and huge social problems that we could solve offer no incentive, and our stupid governments pour money into "AI".

4

As digital technology is used more to enforce policy than to facilitate earnest activity, technically adept employees capable of electronic self-sufficiency are increasingly seen as 'insider threats'. The main threat they pose is actually doing their jobs (just not as abject slaves of undignified micro-management). Employees with knowledge that allows them to work around disabling centralised ICT are marginalised by middle management as 'Shadow IT'. A survival strategy for smart industrious people in large organisations is to feign ignorance, pretend to go along with broken internal systems, while hiding evidence of how they successfully meet their objectives.

Date: 2018-03-20 Tue 00:00

Author: Dr. Andy Farnell

Created: 2026-01-13 Tue 09:04
