The Engine of Technological Acceleration is You


Control over Thinking with Technology: But surely at least our devices are our own, to do with as we want? I turn on the TV and choose a movie I want to watch in the privacy of my own home: can one be any more the master of their domain than that?

Except that the options available to you are a cultural and technical byproduct.  Your Netflix feed is the result of algorithms that balance the needs of platforms like Netflix (the company and its technologies) against your watching history and that of millions of others.

Your TT is controlled by numerous corporate interests and technological features that filter and impose their needs on you, barely moderated by a thin layer of regulation or your user preferences. These interests profess to have your needs at heart, but at the end of the day their one mission is to create a profit for their shareholders, and they achieve that by manufacturing market demand.

Let’s return to our discussion of the ubiquitous iPhone, and the average of 90 apps on each. In 2019, NY Times reporter Kevin Litman-Navarro reviewed the privacy policies of 150 common apps. He found that on average they took 15-20 minutes to read, and, as in our earlier Nest example, many were so complex that they exceeded the reading ability of Ph.D.s and lawyers. Facebook’s policy, for example, was near the average length, but rivaled Kant’s Critique of Pure Reason in the complexity of its vocabulary and phrasing.
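Readability comparisons like this are usually made with a grade-level formula; a common choice is the Flesch-Kincaid grade level, sketched below as an illustration (the syllable counter is a naive vowel-group heuristic, so scores are approximate, and this is a stand-in for the specific scoring method the Times analysis used):

```python
def count_syllables(word: str) -> int:
    # Naive heuristic: count groups of consecutive vowels.
    vowels = "aeiouy"
    word = word.lower()
    groups = 0
    prev_vowel = False
    for ch in word:
        is_vowel = ch in vowels
        if is_vowel and not prev_vowel:
            groups += 1
        prev_vowel = is_vowel
    return max(groups, 1)

def flesch_kincaid_grade(text: str) -> float:
    # FK grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59
    sentences = max(text.count(".") + text.count("!") + text.count("?"), 1)
    words = text.split()
    syllables = sum(count_syllables(w.strip(".,!?;:")) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59
```

Run against a plain sentence and a stretch of legalese, the formula makes the gap concrete: long sentences packed with polysyllabic words push the "grade" required to parse them far past ordinary reading levels.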

No wonder, then, that Pew Research found that 36% of adults never read privacy policies, and another 38% read them only occasionally. Other research suggests these figures might be conservative. A 2017 Deloitte survey of 2,000 U.S. consumers found that 91% of people consent to terms of service without reading them; the rates were highest among younger adults, 97% of whom accepted without reading. In 2017, Purple, a Wi-Fi provider, tried to draw attention to this trend by adding a Terms and Conditions clause that required users of the free service to commit to 1,000 hours of community service – cleaning toilets, scraping gum off the sidewalk, and “manually relieving sewer blockages”. 22,000 people signed up during the two-week experiment. Only one person read the Terms and Conditions closely enough to win a prize embedded within them.
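The scale of the burden follows directly from the numbers already cited (an average of 90 apps per phone, 15-20 minutes per policy). A quick back-of-the-envelope calculation:

```python
apps_per_phone = 90            # average number of apps on an iPhone (from the text)
minutes_per_policy = (15, 20)  # Litman-Navarro's observed reading-time range

# Hours needed to read every policy on one phone, at each end of the range.
low = apps_per_phone * minutes_per_policy[0] / 60
high = apps_per_phone * minutes_per_policy[1] / 60
print(f"Reading every policy: {low:.1f}-{high:.1f} hours")  # 22.5-30.0 hours
```

Roughly a month of lunch breaks just to read the agreements you have already accepted, which is exactly why almost nobody does.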

So – what’s the big deal? 

Let’s remember what this phone is: it’s as powerful as the world’s best supercomputer in 1998. It can track anything you enter into it or do with its apps, and it knows its own exact location, altitude, and movements. On board are microphones and cameras that can record everything within range, including the presence, name, and other details of nearby networked devices. Behind these sensors sit cloud-based AI and machine-learning systems, connected to the phone by 4G/5G and Wi-Fi networks, which can increasingly monitor and analyze your written, spoken, and visual signals, and accurately infer everything from emotions and lying to the diagnosis of key diseases.

So again – what’s the big deal? Not a bad deal for 50 bucks a month, right?

In 2019, the Washington Post tracked the activity of an average phone and found 5,400 hidden app trackers using the phone’s connection to send data about its owner out to third parties. These trackers sent out a total of 1.5 gigabytes of data in a month, roughly the amount a standard AT&T user would consume themselves by using the device. One example was DoorDash, which used at least nine trackers to record your device name, model, ad identifiers, memory size, accelerometer readings, and even your delivery address, name, email, and cell carrier to rate advertising effectiveness. It shares this data with partners like Facebook and Google Ad Services, and states clearly (deep within its Privacy Policy) that “DoorDash is not responsible for their privacy policies or practices.”

And your phone is just one of many connected devices in your world, each of which is trying to understand you better so that it can serve (and shape) your needs. Companies use this information, along with other information about you that they farm from around the world (in a process known as data aggregation, using tactics like de-anonymization and re-identification), to shape the experience they offer you, and to inform the advertising they present to you, which is itself designed to influence and control your behavior.

The 2020 docudrama The Social Dilemma describes how online platforms closely watch, track, measure, and monitor users, leveraging this data to increase engagement, growth, and advertising revenue through various manipulation techniques. These tactics draw on sophisticated psychological research. The platforms are highly effective at shaping your behavior, and when your behaviors conform to their needs, the result is that you behave very much like an addict. The technology crafts ever more narrowly targeted ads, highly personalized for effectiveness (i.e., influence on your behavior), that constantly evolve in response to your actions and non-actions. And remember, the technology you interact with is often free because it is ad-supported; you are exposed to advertising throughout your day, to the tune of 5,000-10,000 times a day. As the technology platforms impose their needs on you, so too do the various advertisers who pay for the platform. The old saying holds: “If the platform is free, then you are the product.”
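To put the 5,000-10,000 figure in perspective, a rough calculation (assuming 16 waking hours per day, an assumption not from the original estimate) gives the per-minute exposure rate:

```python
ads_per_day = (5_000, 10_000)  # estimated daily ad exposures (from the text)
waking_minutes = 16 * 60       # assumed 16 waking hours per day

# Ad exposures per waking minute at each end of the estimate.
low = ads_per_day[0] / waking_minutes
high = ads_per_day[1] / waking_minutes
print(f"Roughly {low:.0f}-{high:.0f} ads per waking minute")
```

Even at the low end, that is an ad every ten to twelve seconds of waking life, on average.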

Shoshana Zuboff coined the term ‘surveillance capitalism’ to describe this dynamic in her powerful 2019 book The Age of Surveillance Capitalism. Surveillance capitalism, according to Zuboff, harnesses human experience as the raw material from which new precision products are made, in anticipation of what you will need or ask for in the future. As she puts it, “it is no longer enough to automate information flows about us; the goal now is to automate us.”

Marshall McLuhan, writing in 1964, saw this coming. He said, “Archimedes once said, ‘Give me a place to stand and I will move the world.’ Today he would have pointed to our electric media and said, ‘I will stand on your eyes, your ears, your nerves, and your brain, and the world will move in any tempo or pattern I choose.’ We have leased these ‘places to stand’ to a myriad of private corporations.”

It should be no surprise, when you consider this, that the inventor of the Internet’s pop-up ad regrets it. Ethan Zuckerman says, “I wrote the code to launch the window and run an ad in it. I’m sorry. Our intentions were good,” and then adds:

“I have come to believe that advertising is the original sin of the web. The fallen state of our internet is a direct, if unintentional, consequence of choosing advertising as the default model to support online content and services.”

Perhaps technology even controls itself? 

In his book What Technology Wants, Kevin Kelly paints a picture of technology as the seventh kingdom of life on earth. He says the technium is “the organism of ideas”, and “it’s the greater, global massively interconnected system of technology vibrating around us”.

Just as Donald saw consciousness as a tight coupling between our brains and our culture, Kelly’s technium blends technology and culture; it is “an outgrowth of the human mind” that “extends beyond hardware and incorporates culture, art, social institutions, and intellectual creations of all types.”

Kelly ascribes autonomy and intent to his technium: “The technium also wants what every living system wants: to perpetuate itself, to keep itself going. And as it grows, those inherent wants are gaining in complexity and force.” “We don’t have everything that the technium demands, but we can learn to work with this force rather than against it.”

Kelly is not alone. Katz, White, and Thompson published Controlling Technology, containing 34 essays debating ‘Do we control technology or does technology control us?’ They cite Heidegger’s and Ellul’s arguments that “modern technology, although it may seem to be just a more efficient means of doing what humans have always done, confronts humanity with issues that go to the very core of who we are and how we live.”

Ellul uses the much narrower term ‘technique’, and has argued that technique is autonomous with respect to economics and politics, and that it shapes social, political, and economic change. Joseph Pitt cautions us that treating technology as a “thing”, “attributing causal powers to it and endowing it with a mind and intentions of its own”, will lead us “down blind alleys.” He concludes that it depends on our definitions: if autonomous means “free from influence in both development and use”, then “technology can never be autonomous, because it is inherently something used to accomplish specific goals.” He appears to reluctantly side with Charlton Heston: “Guns don’t kill people, people do”. We shouldn’t claim technology is the culprit, but ourselves – we have met the enemy, and they are us. Pitt says simply, “Tools by themselves do nothing”.

Does technology control us, or do we control it?  I tend to agree with Pitt, and cite an economist, Carl Menger, to make my case. 

Menger was famous in part for clarifying the role of the human actor in economic markets. Prices are a reflection of human wants and of causal processes purposefully designed to satisfy those wants; the complex economy can be wholly explained as the result of the driving force of human needs. Kelly, in What Technology Wants, quotes Gordon Moore himself as saying “Moore’s Law is really about economics”, and iconic American engineer Carver Mead, who said that Moore’s Law “is really about people’s belief system; it’s not a law of physics, it’s about human belief, and when people believe in something, they’ll put energy behind it to make it come to pass.” Academically, this approach – that people drive the evolution and adoption of technology – is further developed within an area of scholarship known as the Social Construction of Technology (SCOT), itself a branch of the sociology field of Science and Technology Studies (STS).

So – we control technology, and it isn’t controlling us?  

Not so fast. Just because technology in its various forms and features is designed to meet human needs and wants doesn’t mean it is suited to meet your specific needs, or that you control the technology in your life, including most of the technology you are outsourcing your thought to.

It might be democratic control, as discussed by Bertrand Russell: “Democracy, or the rule of the many over themselves... may easily slip into popular apathy which allows for corrupt politicians to go unchecked.” Has the aggregate cultural influence of what the rest of society desires created such demand for technology that you have no influence, and a dour future is foreordained? Have we slipped into a popular apathy? If so, can it be reversed?

Consider gluten-free bread for a minute: 30 years ago (and maybe even 10 years ago), a person who wanted to buy gluten-free products at their store would be treated like an alien from another planet. There was no gluten-free bread to be had, because not enough people were asking for it. But now even the average store has many offerings: enough people demanded something healthier for them, and the markets evolved to meet those demands. Alone we may not control the new innovations of technology, but together we can. Can this apply to thought?

Let’s adopt Metzinger’s definition of mental autonomy, “the specific ability to control one’s own mental functions” including attention, memory, planning, rational thought, and decision making.

In a provocative and important 2019 paper, The Autonomous Mind: The Right to Freedom of Thought in the Twenty-First Century, Simon McCarthy-Jones argues that twenty-first-century technological advances pose new threats to our mental autonomy – our freedom of thought. He notes research showing that the data stored at technology companies like Facebook and Google includes “facial expressions, actions, possessions, purchases, musical preferences, websites visited, words used in Facebook posts, and ‘likes’ registered on social media”. These companies leverage deep learning neural networks, which are now better at detecting people’s sexuality from their faces than other humans are, posing, for example, a “threat to the privacy and safety of gay men and women”.

McCarthy-Jones then draws on the work of others like Zuboff to show how the technology systems you use are monetized when this data is used to micro-target people for maximum effectiveness of behavioral change (e.g., getting you to buy things). And tech is not just being used to get you to buy stuff. Some governments, McCarthy-Jones says, use technology platforms to “nudge” citizens “into joining organ donor registers, increasing fruit and vegetable consumption, and increasing tax collection rates.” Elsewhere, we see state actors engaging social media and other technologies to shift how people vote. And in China, the social credit system is used to control most citizens’ behaviors.

All the while, the technology studies its own effectiveness, in a meta-optimization, to achieve its ultimate goals. Facebook’s first president, Sean Parker, explains what they were after: it was all about “how do we consume as much of your time and conscious attention as possible?” – achieved “by exploiting a vulnerability in human psychology. . . The inventors, creators, it’s me, it’s Mark [Zuckerberg]. . . understood this consciously. And we did it anyway.”

My take – Does technology control us, or do we control it?  

Technology is a direct function of human needs, as expressed in concert with the market through prices, purchases, and attention. To this extent, technology is controlled by us, through our collective actions. However, these are collective needs; technologies are often adapted to mass markets, and so are geared to the common denominator. These needs are increasingly derived or synthetic, in that they are a function of powerful technology that learns and adapts to become a powerful influence on people’s needs and behaviors, artificially shaping some needs and amplifying others. They are often the result of manufactured demand. And technologies are geared to the needs that drive monetization, without respect to how adaptive they are for a given person or population.

So, with respect to any one person’s conscious and unconscious needs – those aligned with adaptive goals and values that leave a person better off, on net – technology is consciously controlled only within limited circumstances.

Scaled up to the collective level, technology is currently not controlled by human societies, because it gives us what we want, and we unconsciously want the wrong things, because our values are outdated. This makes us subject to manipulation by elites, such as sophisticated marketers or political actors seeking to influence our beliefs, as well as by the foibles of the rest of society. And our own.

We don’t take the time to do the hard work of understanding and consciously controlling these wants and needs, in large part because of how complex it all is. This book set out to map these dynamics, in part to make that control more feasible.