EdTech, BigData, Surveillance, and the Enclosure of Interaction

This essay is part of a series entitled “From the Schoolhouse to the Field: Abolition Toward an Education for Liberation”.

One of the cornerstones of neoliberal education reform is the premium placed on “data”, that which informs the almighty drive toward “efficiency” and “progress”. From how federal funds are allocated to schools, to how student “growth” and “teacher quality” are measured, all assessments are expected to be “data-driven”. Within the past decade, there has been a shift from “accountability”, which reacts to variable outcomes with carrots and sticks, to “control”, which aims to use data to predict and proactively steer behaviors toward more profitable outcomes.

The academic tracking of the 20th century saw young people assessed from a very young age, through all manner of tests and observations, formal and informal, and categorized (e.g. gifted, on-level, deficient). The “data” collected here was both qualitative and quantitative, and it was highly subjective, susceptible to the racial, gender, and other biases of teachers, administrators, or testing companies. In the 21st century, “tracking” has given way to a more granular form of data collection, mediated by the use of educational technology, or “EdTech”. These products and services, emerging from the intrusion of Silicon Valley into education, are sold as solutions to persistent problems, which EdTech companies commonly call “pain points”. In their view, the massive profits they reap are but a convenient side effect of an otherwise benevolent intervention on behalf of children.

EdTech takes numerous forms, from “learning management systems” (LMS), to the “gamification” of learning, to student information systems (SIS) and student longitudinal data systems (SLDS), to any number of other “apps” offered as solutions that precede, if not invent, the problems they are meant to solve. The other pretext, and pretense, upon which EdTech stakes its relevance is the need for every student to be individualized, creating the illusion of autonomy and choice, even as all paths lead to the same or at least very similar outcomes.

This model is insidiously akin to students being mice within a Skinnerian lab’s maze, forced to find their own way to one predetermined exit, while being monitored and evaluated the entire way.[1]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.

Student individuality is the pretext for “personalized learning”, employing the granular and systematic collection of enormous amounts of student data. It is common wisdom that teachers should “get to know” their students, but this refers to the importance of building relationships, a reciprocal exchange between teacher and student. By contrast, neoliberal data collection amounts to an invasion of privacy, a nonconsensual extraction of “digital labor”, and the mining of personal information for profit. Apropos of the sheer volume of data that is extracted, the term used to describe this phenomenon is “Big Data”. 

In this chapter, I will discuss how the neoliberal project participates both in what social psychologist Shoshana Zuboff calls “surveillance capitalism”[2]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89. and what Black Studies scholar Jackie Wang calls “carceral capitalism”[3]Wang, J. (2018). Carceral capitalism. MIT Press., within the context of education reform. EdTech (both the products and the industry) is central to this mission, using Big Data and personalized learning for the parallel purposes of profit and control.

Big Data

I define Big Data as “computer-mediated transactions”[4]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89., interactions, and actions, which are in turn “harvested”: digitized, aggregated, stored, analyzed, interpreted, exchanged, traded, and sold through a vast network of channels mostly hidden from the public. Zuboff refers to this network as “Big Other” (seemingly a play on Orwell’s concept of “Big Brother”), which as she explains:

…is constituted by unexpected and often illegible mechanisms of extraction, commodification, and control that effectively exile persons from their own behavior while producing new markets of behavioral prediction and modification.[5]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.

There is practically nothing in the world, as mediated by various computer technologies, that does not fall beneath the “Big Data” umbrella. Communications (emails, text messages, phone calls, comments), consumption (videos, music, purchases), engagements (websites visited, links clicked, “likes”, “dislikes”), expressions (social media, blog posts, performances, selfies), movement (location history, routes taken, time spent), relationships (“friends”, family, contacts), and personal characteristics (facial recognition, fingerprints) — all of these are potentially collected by mobile phones alone, before we include traffic cameras, CCTV, security systems, “smart” devices, AI personal assistants, and the “internet of things”, which incorporates even the most benign objects into a massive surveillance network.

The minutiae of what data is collected can be rather shocking: even the things we ultimately choose not to post or engage with (draft emails, text messages reconsidered, comments deleted, videos highlighted but not watched on Netflix), our physical orientation in space, including body position and gait, how we move our heads in Virtual Reality helmets[6]Koster, R. (2014, March 26). Musings on the Oculus sale. Raph’s Website., and our mouse movement patterns are all captured.[7]Wesolowski, T., Palys, M., & Kudlacik, P. (2015). Computer user verification based on mouse activity analysis. In New Trends in Intelligent Information and Database Systems (pp. 61-70). Springer.

Analytics service[s] allows Web developers the ability to replay cursor movements from a user session, or generate heatmaps with aggregated cursor positions.[8]Leiva, L. A., & Huang, J. (2015). Building a better mousetrap: Compressing mouse cursor activity for web analytics. Information Processing & Management, 51(2), 114-129.
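
To make concrete how little machinery such tracking requires, here is a minimal sketch of how raw cursor samples might be binned into an aggregate heatmap. The event format, viewport size, and grid resolution are assumptions for illustration; commercial analytics services are considerably more elaborate.

```python
# A minimal sketch of building an aggregated cursor "heatmap" from raw
# mouse events. The event format and grid size are assumptions; real
# analytics services differ in detail.
from collections import Counter

GRID_W, GRID_H = 32, 18          # coarse grid over an assumed 1920x1080 viewport
VIEW_W, VIEW_H = 1920, 1080

def bin_events(events):
    """events: iterable of (x_px, y_px) cursor samples, possibly from many
    users and sessions. Returns a Counter mapping (col, row) grid cells to
    how often the cursor was observed there."""
    counts = Counter()
    for x, y in events:
        col = min(int(x * GRID_W / VIEW_W), GRID_W - 1)
        row = min(int(y * GRID_H / VIEW_H), GRID_H - 1)
        counts[(col, row)] += 1
    return counts

# Hypothetical samples: several users hovering near the same button.
samples = [(960, 540), (955, 542), (948, 560), (100, 80), (962, 538)]
heat = bin_events(samples)
hottest = max(heat, key=heat.get)
print(f"hottest cell: {hottest}, seen {heat[hottest]} times")
```

Replaying an individual session, as the quote describes, requires only retaining the per-user sample stream instead of (or alongside) the aggregate.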

The depth and breadth of the intrusion becomes more ominous when we factor in the “data from billions of sensors embedded in a widening range of objects, bodies, and places”.[9]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.  These data, in turn, can be used in health diagnoses and intervention, or to relate behavior to performance at work or in school.[10]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. Bill Gates, who has invested billions of dollars into education, supported a “$1.1 million project to fit middle-school students with biometric sensors to monitor their response and engagement levels during lessons”[11]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance., while Google has invested in “nano particles that ‘patrol’ the body for signs of disease”.[12]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.  All of this data is collected without informed consent, processed and exploited behind a mostly opaque infrastructure. 

the typical user has little or no knowledge of Google’s business operations, the full range of personal data that they contribute to Google’s servers, the retention of those data, or how those data are instrumentalized and monetized.[13]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.

Silicon Valley goes out of its way to conceal the worst aspects of this surveillance, while blaming people for their own ignorance, and wrongfully equating a click of “I agree” at the bottom of an arcane-by-design privacy policy with absolute consent. Most adults in the U.S. understand that we have lost control of our personal data, yet little more than half are still willing to share “some personal information” in exchange for free services. This digital colonization — a term I use in recognition of its inherent and systematic violence — is predicated upon most people not truly understanding the far-reaching and deeply penetrating ramifications of the devil’s bargain.

We have been manipulated into complicity in our own surveillance, trading not just our own safety, security, and sovereignty, but that of nearly everyone else in the world, for conveniences we have otherwise never known. Most of us barely grasp the unprecedented power we have vested in this network, which places these companies and those who run them in a near god-like position: both omniscient (knowing all about us) and omnipotent (having complete power over us).

The problem is even more severe for young people, who offer up far more of their personal information and, in the case of the most recent generations, have done so for the majority of their lives. A large study of young people’s attitudes toward privacy revealed that while large percentages shared the concerns of adults, “a gap in privacy knowledge provides one explanation for the apparent license with which the young behave online”[14]Hoofnagle, C., King, J., Li, S., & Turow, J. (2010). How Different are Young Adults From Older Adults When it Comes to Information Privacy Attitudes & Policies?. It is because of “a lack of knowledge rather than a cavalier attitude toward privacy [that] large numbers of youth engage with the digital world in a seemingly unconcerned manner”.[15]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.

Furthermore, people — especially young people — feel a strong need to live an “effective life”, which itself is founded upon the “evolution of luxuries into necessities”, manifest in the obligation to own the latest and trendiest products. Internet access is now seen by many as a “utility” or even a “right”, and expensive “designer” clothing and technology are regarded by the poor and working class as “necessities” for the returns they provide in cultural capital. This conflict, between the needs for privacy and for an effective life…

produces a kind of psychic numbing that inures people to the realities of being tracked, parsed, mined, and modified – or disposes them to rationalize the situation in resigned cynicism.[16]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.

What’s important to understand is that Big Data contributes to a “surveillance infrastructure”,[17]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. which can “precisely map the behavior of large numbers of people”[18]Pentland, A. (2009). Reality mining of mobile communications: Toward a new deal on data. The Global Information Technology Report 2008–2009, 1981. [PDF], and be used to “modify human behavior”[19]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.. Again, toward the parallel purposes of profit and control. 

Big Data starts with the mass collection of information, through all of the aforementioned interfaces and channels, which is then used to create an “algorithmic identity”[20]Lindh, M., & Nolin, J. (2016). Information we collect: Surveillance and privacy in the implementation of Google Apps for Education. European Educational Research Journal, 15(6), 644-663. or “digital voodoo doll”[21]Harris, T. [Milken Institute]. (2019, May 7). Town Hall | Will Technology Save or Subvert Civility and Society? [Video]., in order to understand and predict individual behavior. The data is then “aggregated and decontextualized”[22]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89. to identify patterns amongst targeted populations. It is processed by “artificially-intelligent” machines, which analyze the relationships within the data, identify patterns, and from these develop algorithms that can “learn” from the data and predict human behavior.[23]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.[24]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89. This is known as “data analytics”, of which there are four types: descriptive, diagnostic, predictive, and prescriptive. Descriptive analytics identify what happened, or what is happening in real time, while diagnostic analytics make meaning of what happened. Predictive analytics determine what might happen, based on historical patterns and “sentiment analysis”[25]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. — drawn from the various ways consumers engage with these technologies (clicks, likes, time spent), and the ways they express themselves, to determine how they feel about particular topics or events.
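
As a rough illustration of the first three layers, here is a toy sketch over an invented engagement log; the field names and events are assumptions, standing in for the billions of records a real pipeline would process (the prescriptive step is sketched after the next paragraph).

```python
# A toy illustration of descriptive, diagnostic, and predictive analytics
# over an invented engagement log.
from collections import Counter

log = [  # (student_id, action, topic): all hypothetical
    ("s1", "click", "math"), ("s1", "like", "math"),
    ("s2", "click", "history"), ("s1", "click", "math"),
    ("s2", "skip", "math"), ("s1", "like", "math"),
]

# Descriptive: what happened (or is happening)?
print("engagement by topic:", Counter(t for _, _, t in log))

# Diagnostic: why did it happen? (naive: break one topic's count down by action)
print("math engagement by action:", Counter(a for _, a, t in log if t == "math"))

# Predictive: what might happen next? (a naive frequency model per student,
# standing in for the far richer models built on "sentiment" signals)
def predict_next_topic(student):
    history = Counter(t for s, _, t in log if s == student)
    return history.most_common(1)[0][0] if history else None

print("s1 will most likely engage with:", predict_next_topic("s1"))
```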

The line between predictive and prescriptive analytics is blurry; the latter attempts to steer behavior toward desired outcomes. While these machines cannot predict the future in absolute terms, they identify multiple possible futures, organized both by likelihood and with respect to a person’s “preferred actions”. As data scientist Michael Wu says, “since a prescriptive model is able to predict the possible consequences based on different choice of action, it can also recommend the best course of action for any pre-specified outcome.”[26]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. For me, this raises the question of who evaluates the consequences of a given action. That is to say, what values — or whose values — inform how a future consequence is evaluated? Given that the tech industry is run almost entirely by upper-class and wealthy white men, one could reasonably assume that their biases, and their value systems, have disproportionate influence over consequence evaluation.
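
In the same toy register, here is what Wu’s description amounts to: predict consequences under each candidate action, then recommend the action that maximizes a pre-specified outcome. The actions, probabilities, and weights below are invented; the point is that whoever sets the weights decides which futures count as “best”.

```python
# A hypothetical prescriptive step: score each candidate action by its
# predicted consequences, weighted by a pre-specified notion of "best".
predicted = {
    # action: predicted probability of each outcome under that action
    "show_more_ads": {"revenue": 0.9, "engagement": 0.4, "wellbeing": 0.2},
    "suggest_break": {"revenue": 0.1, "engagement": 0.3, "wellbeing": 0.9},
    "autoplay_next": {"revenue": 0.6, "engagement": 0.8, "wellbeing": 0.3},
}

# Whoever sets these weights encodes their values into the "pre-specified outcome".
value_weights = {"revenue": 1.0, "engagement": 0.5, "wellbeing": 0.0}

def recommend(predictions, weights):
    """Return the action whose weighted predicted consequences score highest."""
    def score(outcomes):
        return sum(weights[name] * p for name, p in outcomes.items())
    return max(predictions, key=lambda action: score(predictions[action]))

print(recommend(predicted, value_weights))  # -> show_more_ads
```

Set the wellbeing weight to 1.0 and the recommendation flips to suggest_break; nothing about the data changes, only the values of whoever tuned the model.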

It feels important to recognize the mentality of the people behind these products and services, such as former Google CEO Eric Schmidt, who publicly said that “if you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place”.[27]Esguerra, R. (2009, December 10). Google CEO Eric Schmidt Dismisses the Importance of Privacy. Electronic Frontier Foundation. What his statement fails to acknowledge (or otherwise attempts to obfuscate) is how a person’s data can be used in pernicious ways to compromise their safety, security, and sovereignty, particularly as it applies to certain communities’ relationships with police. Indeed, Schmidt would go on to say, “It is possible that that information could be made available to the authorities.”[28]Dvorak, J. C. (2009, December 11). Eric Schmidt, Google and Privacy. MarketWatch.

Perhaps due to his positionality as a wealthy white man, Schmidt seems either incapable of understanding, or unwilling to understand, how merely being in the world as it is presents a number of different dangers. Consider the woman who must always be on guard while in public for fear of stalking, sexual harassment, or assault; the black man engaging in the most benign activity, only to be accosted and potentially brutalized by police; the indigenous and trans women who are kidnapped and/or murdered; the activist who is arrested and/or killed for daring to have a voice.

Schmidt’s worldview places him in an explicit defense of hegemony, such that he cannot even comprehend why one might go against it. Among the ways one might “go against it”, I include simply living as a woman, a person of color, queer, unhoused, or as the member of any marginalized group and intersections thereof, with the expectation of safety from violence and repression by individuals or the state. He seems to conflate the entire range of human activity that many of us might want to keep private, for reasons of safety or security, with his own “indiscretions”, such as his extramarital affairs, or political donations[29]Tate, R. (2014, December 4). Google CEO: Secrets are for filthy people. Gawker., accounts of which he tried to have stricken from Google’s blogging and search platforms.[30]Biddle, S. (2014). Google’s mega-hypocrite says his privacy is more important than yours. Gawker. Eric Schmidt is only one man, but in many ways, he represents the prevailing logic of Silicon Valley. These companies encroach upon our private lives with no restraint, “until resistance is encountered”[31]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89., usually in the form of public outcry, followed by some attempt at government regulation, which mostly fails to materialize.

This is likely because state control and financialization work hand in hand. Where people’s behavior is predictable, or better yet where they can be controlled, outcomes can be steered in directions that are both profitable and suppressive of any resistance to the existing order. 

As has already been established, the primary purpose of schooling, historically and at present, has been as an engine of social control. The relationship between schooling and hegemony only becomes more complex upon the incorporation of EdTech into the Big Data infrastructure.[32]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. The collaboration between the tech industry and the state manifests in many ways, such as the introduction of student longitudinal data systems (SLDS) into schools, after a push from the Gates Foundation and other stakeholders.[33]Strauss, V. (2015, November 12). The astonishing amount of data being collected about your children. Washington Post.  

Since 2005, the federal government has mandated that schools maintain databases of personal information on every student[34]Strauss, V. (2015, November 12). The astonishing amount of data being collected about your children. Washington Post. — including medical data, survey data, interfaces with health departments, child services, and the criminal justice system, academic performance, and more, gathered from birth through college. In what would seem to be a complete abrogation of FERPA laws, these data are “shared with vendors, other governmental agencies, across states, and with organizations or individuals engaged in education-related ’research’ or evaluation”.[35]Strauss, V. (2015, November 12). The astonishing amount of data being collected about your children. Washington Post. As is the case with most of what falls under the Big Data umbrella, this information is collected, exchanged, and used in various ways without the child’s or their parents’ consent.

[Race to the Top]’s digitized Common Core curriculum and its associated online tests are well known for accumulating huge amounts of personal student data across state borders and sharing it with third parties, including the financial industry.[36]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.

The stated purpose of all this data collection is to “improve student achievement”, which is dubious in itself, but one must also consider the very meaning of “achievement” within the context of school as a mechanism of state control. The Orwellian design of EdTech becomes even more clear when we look at the relatively new phenomenon of “personalized learning”.

EdTech and Personalized Learning

EdTech advertises itself as the cure for “the many, many societal ills facing public education” through the use of “artificial intelligence, machine learning, data mining, and other technological advancements”.[37]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance. Personalized learning purports to tailor educational experiences down to the individual, under the pretense that it provides them with agency and choice. 

EdTech companies provide students with a “recommended learning path”[38]Kohn, A. (2015). Four Reasons to Worry About “Personalized Learning”. Technology & Learning, 35, 9, 14-16. [PDF], derived from the use of Big Data to monitor student performance and behavior, and guided by “strategic” interventions. For EdTech companies, “technology is not just a way for students to pursue their interests; it is a way for them to discover their interests”.[39]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. But this “discovery” is steered by algorithms, in the same way the “discover” pages (or their equivalents) of Instagram, TikTok, or Facebook use data collection to “recommend” topics of engagement.

Teachers — who themselves are also surveilled and evaluated — serve as “highly disciplined data technicians”[40]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. or “data-enabled detectives”[41]Mead, R. (2016, February 28). Learn Different., working to identify where, how, and why students veer off acceptable paths, in order to steer them back into compliance. As one EdTech CEO put it, his product is like “a robot tutor in the sky that can semi-read your mind and figure out what your strengths and weaknesses are, down to the percentile”.[42]Mead, R. (2016, February 28). Learn Different.

The use of data in this way is in effect the automation of symbolic violence, imposing the hegemonic perspective on students under the pretense of “choice”, or even “freedom”. There can be no freedom, much less liberation, where hegemony is arrayed so heavily against the majority of people — darker, poorer, queer, and disproportionately women — and toward the expansion of wealth and power for a much smaller group: whiter, richer, and male.

Alfie Kohn discusses the problem of “progress”, and how it is measured: via data points on the acquisition of specific skills of predetermined value. He argues that progress should instead be measured by more qualitative means, such as the cultivation of curiosity.[43]Kohn, A. (2015). Four Reasons to Worry About “Personalized Learning”. Technology & Learning, 35, 9, 14-16. [PDF] While I agree with this in principle, this is also where I struggle as an educator: with the idea that education should be based on student “interest”, rather than on what will ultimately improve students’ circumstances (socially, politically, economically), so they are truly free to pursue those interests.

I recognize that this is also where I need to be careful, so that I am not the one defining what students need. At the same time, hegemony has been at work from young people’s earliest years, influencing their interests at any given time. Interests, after all, are merely a subset of those experiences to which young people have ever been exposed, and there is a gross lack of equity between the dominant and subaltern classes as it applies not just to material wealth, but to wealth of experiences.

Personalized learning relies heavily on quantitative data, which is “mined for patterns and insights to improve performance”.[44]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance. This kind of data collection is more “efficient”, as the data can be extracted through an automated process. At the same time, this modality devalues qualitative data, such as the more careful, considered, personal input of a human teacher, caregiver, parent, or peer — or better yet, feedback from the community. The quantitative approach also aligns neatly with the outsize focus on “STEM”, where the S always refers to so-called hard sciences, which “align schools and classrooms with corporate manpower needs”.[45]Lipman, P. Urban Education Policy under Obama. Journal of Urban Affairs, 37, 1, 57-61.

The acquisition of skills “useful in the workplace of the future”[46]Mead, R. (2016, Feburary 28). Learn Different. supersedes any learning of social sciences, knowledge of which may better position students to question and challenge an inequitable status quo. It is no surprise that computer science is so well-represented, given that many initiatives are created and sponsored by members of the tech industry, like Google, Facebook, or the Gates Foundation — the would-be saviors of public education.

One of the most extreme incarnations of the EdTech paradigm is AltSchool, a company based in San Francisco that runs multiple schools across the country, founded by a software engineer and businessman named Max Ventilla, who had zero experience as an educator or school administrator. AltSchool overtly collects as much data on its students (and teachers) as possible, through a combination of software and an audio/video surveillance system, the stated goal of which is to “capture every word, action, and interaction, for potential analysis”. Teachers can “bookmark” any given video moment for data analysts (usually located off-site and therefore out of sight) to review and “draw inferences” from.[47]Mead, R. (2016, February 28). Learn Different.

In what must be a bid for plausible deniability, or a remarkable feat of self-deception, Ventilla views AltSchool as enabling students to “pursue…rewarding nonconformity”. It is no secret that AltSchool’s true purpose is to generate profit for its investors, who collectively provided over one hundred million dollars in venture capital, “among the largest investments ever made in education technology”.[48]Mead, R. (2016, February 28). Learn Different.

Yet, the people behind AltSchool claim they are attempting to “reinvent” education, through innovation, iteration, and software proliferation. According to Mead (2016), the teachers at AltSchool, mostly forged from the “Teach for America” mold, were “drawn to the startup because of its ambition to make systemic change”. In the words of one teacher:

“It became clear to me, teaching in those neighborhoods, that by looking for standards to pull everyone up we are forgetting to address the individual needs. We are forgetting to think about how kids learn and what they need to be successful in life”.[49]Mead, R. (2016, February 28). Learn Different.

While she decries the use of standards to “pull everyone up” (note the implicit bootstraps narrative), she takes for granted that there is a single, narrow definition of success. And as is common in the neoliberal worldview, her focus is on the individual, rather than the community, failing to understand that without changing the fundamental conditions in which a child lives and learns, there will be no substantial impact, either on that child in the long term, or on any project that truly strives for equity, let alone “systemic change”. It is also important to note that AltSchool teachers have a financial incentive to see that the model is “successful”, as they are given equity in the company as part of their compensation.

AltSchool parents seem to be on board with the project — they are willing to pay the hefty tuition of $30,000 a year — but they also seem to firmly believe in the company ethos. In the words of one parent:

We are very comfortable with our kids being guinea pigs. I do buy into the AltSchool mission. I believe education needs to change, not just in our little micro-school here but all over the world. We are raising a generation that will have the sum of human knowledge at their fingertips, for every minute of their life, so clearly education needs to change to accommodate that.[50]Mead, R. (2016, February 28). Learn Different.

The problem with this characterization is that it presumes some neutrality or objectivity of knowledge — that students are or should be free to nibble as they please from some infinite buffet, without being taught to think critically, to scrutinize sources, or to recognize the biases that privilege certain types of knowledge above others.

While AltSchool claims to be opposed to standardization, its curriculum is broken down into discrete units, which are measured for “mastery”. Even though there are no tests, the school still measures “achievement” or “success” via metrics aligned to the Common Core standards.

Canadian educator Phillip McRae says that “At its most sinister, [personalized learning] establishes children as measurable commodities to be cataloged and capitalized upon by corporations.”[51]Kohn, A. (2015). Four Reasons to Worry About “Personalized Learning”. Technology & Learning, 35, 9, 14-16. [PDF] These companies profit both from for-profit education products and from financial speculation through educational social impact bonds, or “Pay For Success” initiatives, which measure “progress” by various quantitative metrics such as test scores, attendance percentages, or “mastery of competencies”.

EdTech companies seek to digitize learning by reducing it to “a series of tasks, and further distilling those tasks down to a series of clicks that can be measured and analyzed”.[52]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance. Seemingly everything is broken down into discrete, categorized parts, removed from their context.

Knowledge deconstructed, categorized, and decontextualized in this way defies the natural way humans learn, and encloses the continuum of knowledge. This is a natural extension of individualism, the logical progression of the siloing of school subjects, and their further breakdown into standards.

[Competency-based education] plays on the fundamental American values of individualism, meritocracy and grit, while offering hope of providing greater opportunities for employment and freedom.[53]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.

Students are being assessed on their ability to master discrete “skills”, without any application to the real world. And the real world, which kids of color and other marginalized groups have to grapple with immediately, cannot be so neatly atomized. 

Few things reveal how out of touch the EdTech reformers are with reality better than the example given by Mead:

One middle-school class undertook a lengthy study of the Iliad by focusing on the theme of “rage” and designing a spreadsheet that logged instances of it. They then used data-visualization techniques to show their findings, and wrote persuasive essays based on their results.[54]Mead, R. (2016, February 28). Learn Different.

What is this, other than a contrivance? Perhaps it is also a mechanism by which neoliberal ideology can be propagated into the next generation. If a piece of literature — via the theme of rage — can be systematically broken down into discrete data points, then why can’t we do the same with a real-life person’s experience? In the same way that a student might use the data to extrapolate a logical “sequel” to the Iliad, Big Data, from this perspective, can be used to predict and steer the future.

A common saying in the tech industry is that “you’ve got to build the plane while flying”, which refers to the practice of innovation, experimentation, and iteration. But following this analogy to its logical endpoint means willfully plunging toward death and hoping that something will stop the descent. “Death” here might mean the failure of an app or device, or even the collapse of a company, neither of which tends to have severe or lasting impacts. But education is not some frivolous, arbitrary thing, some niche problem to be solved. It is, in the most optimistic view, a means toward social or economic mobility, or far more often, as I have argued ad nauseam, an engine of social control and state repression. Either way, education is far too serious a matter to casually “iterate” upon, “moving fast and breaking things” when people’s lives, livelihoods, and life trajectories are at stake.

School districts facing budget cuts due to austerity have taken to using “blended learning”: another EdTech buzzword that means much the same as “personalized learning”, and involves students cycling between teacher-led instruction and independent work on computers or other technology. If a teacher only has to engage with, say, 15 students at any given time, they could feasibly have 45 in the class, if the other 30 are busy with independent or group tasks. If this is deemed “effective”, it becomes a way to increase class sizes and lay off teachers. Companies like Google and non-profits like the Gates Foundation are eager to provide such tech solutions at low cost, in exchange for access to the treasure trove of student data.

In the case of AltSchool, they’ve learned that while technology can be effective at gauging the mastery of discrete skills (e.g. Common Core standards), it is incapable of evaluating qualitative outcomes, such as creativity, flexibility, kindness, or resilience, or “less easily definable aspects of a humanistic education, such as literary appreciation or artistic sensibility or the development of empathy”.[55]Mead, R. (2016, February 28). Learn Different.

This is a key point. If schools exist merely for the acquisition of arbitrary skills, then we might as well do everything with machines. But if they’re actually about preparing people to participate in the world, as citizens, as neighbors, as community members, as environmental stewards, then what does an algorithm know about that? In Mead’s observations, AltSchool “failed fast and failed forward”, but “tomorrow, they would iterate”.[56]Mead, R. (2016, February 28). Learn Different.

What happens to the kids in the meantime?

The idea that EdTech can adequately stand in for teachers rests on the notion that teachers’ roles are merely to deliver content. These tools cannot, however, cultivate student inquiry or build intrinsic motivation. This may be what the AltSchool technologies claim to do — learning enough about students through Big Data for “machine learning” to guide students along their chosen path.

While this modality seems to prize students as individuals, students as agents, as actual people, are inconsequential; their value, rather, is as containers for data. As Zuboff writes, these companies are “formally indifferent to what [their] users say or do”, as long as it can be “capture[d] and convert[ed] into data”.

The way EdTech companies use data can be rather nuanced, and seems to operate at cross-purposes, between the objectives of profit and control. On the one hand, the data that Google collects from users, what they call “your data” in their privacy policy, is not directly monetized, but is rather used to compile “algorithmic identities”, which thereafter become the company’s intellectual property, and are not for sale. On the other hand, the data is aggregated, decontextualized, and to some degree anonymized, to identify overall behavior patterns in target populations, categorized by various demographics. This processed data is sold to Google’s customers at a high profit with near zero marginal cost. This same information is also fed back into individual users’ algorithmic identities, and further cycled into a massive “archive of ‘collected’ and ‘personal’ information that can be commercially exploited”.[57]Lindh, M., & Nolin, J. (2016). Information we collect: Surveillance and privacy in the implementation of Google Apps for Education. European Educational Research Journal, 15(6), 644-663.
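
A minimal sketch of that aggregate-and-decontextualize step, with invented field names: identifiers are dropped, demographic keys are kept, and individual behavior is rolled up into cohort statistics, the form in which patterns can be packaged and sold.

```python
# A toy sketch of aggregation and decontextualization: individual records
# become demographic cohort statistics, and the identifier never survives.
# All field names and values are hypothetical.
from collections import defaultdict

records = [
    {"user_id": "u1", "age_band": "13-17", "zip3": "100", "minutes": 212},
    {"user_id": "u2", "age_band": "13-17", "zip3": "100", "minutes": 187},
    {"user_id": "u3", "age_band": "18-24", "zip3": "941", "minutes": 95},
]

def aggregate(rows):
    """Average behavior per (age_band, zip3) cohort; user_id is discarded."""
    cohorts = defaultdict(list)
    for r in rows:
        cohorts[(r["age_band"], r["zip3"])].append(r["minutes"])
    return {cohort: sum(mins) / len(mins) for cohort, mins in cohorts.items()}

print(aggregate(records))
# {('13-17', '100'): 199.5, ('18-24', '941'): 95.0}
```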

Another way to understand it is to consider the relationship between “survey” and “surveillance”, which share etymological roots to do with “watching from above”. While surveys do not tell us much about individuals, they do reveal the behavior patterns of demographic groups; surveillance, meanwhile, can be more focused, targeting specific people. Both can be — and are — used to manipulate people’s thoughts, feelings, and actions. In the same way, while the “algorithmic identities” of users are not sold, the “archive” of information derived from these identities — behavior patterns, overall trends, etc. — is for sale. While individuals may be safe from their personal data or digital doubles being traded — and only because Google deems them too valuable to sell — the behavior of communities at large can be exploited for profit.

With EdTech, as with all Big Data technologies, this extraction takes place at the intersection of ignorance and convenience, with students using “free” services at the cost of their algorithmic identities being “utilized as a commodity”.[58]Lindh, M., & Nolin, J. (2016). Information we collect: Surveillance and privacy in the implementation of Google Apps for Education. European Educational Research Journal, 15(6), 644-663. Where critical concerns have been raised, the process by which Google collects, processes, aggregates, packages, decontextualizes, repackages, and sells data is so circuitous as to allow the company to redirect criticism toward a straw man — a confusion over which data is actually sold. The trick lies in how Google distinguishes between “your data”, a person’s individual identifying information (i.e. the surveillance), which they do not sell, and “collected and personal information” (i.e. the “survey”), which they do. All of this is to say nothing about how Google uses “your data” and the “algorithmic identity” for its own proprietary purposes, which have more to do with control.

As Google built up its surveillance apparatus, it saw a limitation in the fact that users had to ask questions, realizing the system would be even more effective if it could anticipate one’s inquiry and “respond” even before the question was asked.[59]Varian, H. R. (2014). Beyond big data. Business Economics, 49(1), 27-31. This “problem” was addressed by an app called Google Now, since evolved and rebranded as Google Assistant. Years before I started this research, I was, as Varian says, “completely freaked out” by a personal experience with Google Now. It led me to extrapolate the “trajectory of Google’s business model”, occurring in three phases: learn, predict, and steer. They would learn as much as possible about users (everyone in the world), predict their behavior, steer them in certain directions, and ultimately control them, all without people being aware of the manipulation. A leaked video called the “Selfish Ledger” revealed Google’s overwhelming cynicism — their utter lack of faith in humans’ ability to act in our own best interests — and their stunning hubris: the conviction that they, Google, were better suited to guide us into the future.

Google was proposing they act as the custodians, the mediators, the purveyors of human culture — stored, analyzed, and transmitted as “data” — and conceivably modifying it along the way. Not just on the broad, collective level, but on a deep, hyper-personal, individual level. Where humans have long sought a total understanding and thereby a mastery over nature, Google would be the stewards of our nurture.  Analogous to GMOs, perhaps we would become “CMOs” (culturally modified organisms), or maybe MMOs (memetically modified).

I have focused on Google quite a bit because one of the most widespread EdTech platforms is Google Apps for Education (GAFE), which the company offers to K-12 school districts and other educational institutions for “free”. Data extraction in general takes place without users’ informed consent, and young people in particular know even less than adults about what’s at stake[60]Hoofnagle, C., King, J., Li, S., & Turow, J. (2010). How Different are Young Adults From Older Adults When it Comes to Information Privacy Attitudes & Policies?., so the mass adoption of GAFE across the United States and throughout the world muddies the water even further. Because school is compulsory and GAFE is the CMS solution chosen by districts, the data-mining is practically unavoidable for students. As Lindh and Nolin ask, “Why should the public school system force pupils to participate in the commodification of their digital labour and algorithmic identities?”[61]Lindh, M., & Nolin, J. (2016). Information we collect: Surveillance and privacy in the implementation of Google Apps for Education. European Educational Research Journal, 15(6), 644-663.

Carceral Implications and Applications

There is a natural progression from education as an engine of social control to education as a thoroughfare into the carceral system — commonly known as the “school to prison pipeline”. Given that the mechanisms at work here are so numerous, varied, and complex, I would say “pipeline” is a misnomer; “network”, or even “infrastructure”, might be more appropriate. My specific interest here is to discuss the relationship between Big Data — mediated by EdTech — and the carceral state, as interlocking mechanisms of social control.

There can be no discussion of the relationship between schools, surveillance, social control, and incarceration in the United States without acknowledging their racialized history. As I have discussed at length, schooling for Black, Brown, and Indigenous students has always been designed to dominate and control their lives — up to and including when, where, and how those lives will end. 

Chattel slavery, Jim Crow, Native Reservations, laws governing Mexican Americans and the schooling of Black, Brown and Indigenous people have always served as massive structures of surveillance and control.[62]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.

While these oppressive systems may have originally been premised on the perceived inhumanity of Indigenous, Black, and Brown (IBB) people, they have been maintained by white fear of our potential for resistance. Throughout history, this fear has fueled countless campaigns of violence against IBB people: the forbidding of literacy for the enslaved; the violent quelling of slave revolts; a military intelligence division created specifically “to surveil the activities of Black Americans in both civilian and military life”[63]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. after they returned from World War I; the FBI’s COINTELPRO surveillance program, which targeted black, brown, and indigenous liberation organizations; the PRISM mass surveillance program; the designation of “black identity extremist”; and the more recent labeling of the Black Lives Matter movement as a “terrorist organization”. The goal is to either control us physically through enslavement, policing, surveillance, and incarceration, or socially through schools, Big Data, and prescriptive analytics. Any revolutionary spirit must be quelled.

Otherwise, the public/private State “predict[s] and integrate[s] resistance into [their] risk speculations”, which they use to “determine preemptive or subsequent interventions and disciplining actions”.[64]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy. It is my sense that this sort of calculation is behind corporations today placing their “bets” on standing symbolically with Black Lives Matter and related movements. Perhaps their mathematical models determined that they mitigate the most risk, and/or otherwise accumulate the most profit by siding with black people in this political moment. This reflects their dual recognition of our value as both “subjugated and rigidly controlled sources of predictable cash flows” and “risky assets” which they can “liquidate” for market speculation through the use of Big Data.[65]Scott, T. (2016, October 29). Education Technology, Surveillance, and America’s Authoritarian Democracy.

The twin purposes of Big Data collection are profit and control. These purposes are not only parallel, but analogous, as evidenced in the stock market. The wealthiest investors, unwilling to risk their predictions being wrong, engage in “insider trading” by accessing and maintaining control of proprietary information. 

If we consider that enormous amounts of data are being collected on us from birth to death, it is conceivable that someone might use predictive analytics to “calculate the likelihood that someone will engage in criminal activity before they are born”.[66]Wang, J. (2018). Carceral capitalism. MIT Press. If the neoliberal worldview relies upon the accuracy of predictive analytics, then beyond the practices of tracking and prenatal policing, the logical next step is “steering”. Black and brown people can be guided into the cycles of production and consumption, with a special off-ramp to shunt a significant percentage of them — those who can’t be “risk managed” — into the carceral system.

In thinking about how this applies to young people, I have little doubt the mass data collection in schools (e.g. SLDS, GAFE, and other EdTech “solutions”) factors into this project. As old-school “tracking” is phased out for more granular data collection, students can be “steered” not only between grades, but on a daily basis, if not in real-time. Given that tracking has always been assessment followed by steering, the implication is that certain people will be tracked right into “criminality”. 

The pervasiveness of this system has as much to do with hegemony as it does with the sheer number of constituent mechanisms. Consider the federal government’s “E-Rate” program, which provides money to schools to pay for student internet access. In order to qualify, schools must demonstrate compliance with regulations that “ban access to websites displaying pornography, graphic material, or any other that could otherwise be judged as immoral, improper or lewd”. Strangely, there are no specific federal guidelines for what should be banned, leaving the matter to the discretion of school administrators. More often than not, they conform to white middle class (Protestant) morality, meaning that “these technologies extend current practices and prejudices that perpetuate injustices against marginalized groups”.[67]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance.

Black and Brown students are the most likely to be perceived as “insubordinate” or “disrespectful”, rather than as resisting indignity and repression. It is no coincidence that IBB youth are disproportionately subjected to harsh discipline, just as IBB adults are disproportionately represented within the carceral system. Predictive analytics don’t just predict, they “enact the future”.[68]Wang, J. (2018). Carceral capitalism. MIT Press.

While crime rates are at historic lows, crime reporting (and other [mis]representations of crime) is up, by way of the 24-hour news cycle, the sensationalism of violent crime on local TV news, and the memetic cesspool that is social media. This climate of hyper-activation and fear-mongering only makes people more inclined to accept ever more elaborate systems of control. In the same way that EdTech solutions “[prey] on the absolute worst fears of administrators”[69]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance., predictive policing sells itself on its ability to “remove the existential terror of not knowing…where and when crime will occur”.[70]Wang, J. (2018). Carceral capitalism. MIT Press.

Yet if “criminality” is merely a response to needs and desires, then the data must also show that simply providing for those needs would “steer” people away from crime. In the same way, “deviant” student behavior is almost always a response to an unmet need. We don’t need algorithms to identify these needs, as they haven’t changed much in millions of years. We all have at least a basic understanding of an “ethics of care”, and through praxes such as restorative justice and trauma-informed care, or providing for the basic material needs of families, we can take preventative and palliative approaches alike. That this obvious “low-tech” solution is rarely considered reminds us that there is profit in suffering.

The widespread surveillance of students has a “panopticon effect”, wherein “students recognize that they are being watched, [and] begin to act differently” — and from that very moment they begin to cede one small bit of freedom at a time.[71]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance. As Wang writes, “once the ‘digital carceral infrastructure’ is built up, it will be nearly impossible to undo”, making privacy and personal sovereignty “unattainable for future generations”.[72]Irwin, J. (2017, October 7). Grooming Students for A Lifetime of Surveillance.

The challenge lies in the massive knowledge gap between tech companies and consumers, and in the amount of violation the latter are willing to accept in exchange for the convenience of these technologies.[73]Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30(1), 75-89.[74]Lindh, M., & Nolin, J. (2016). Information we collect: Surveillance and privacy in the implementation of Google Apps for Education. European Educational Research Journal, 15(6), 644-663. This was evidenced by the response to the Cambridge Analytica scandal. In spite of extensive media coverage and massive public outcry, including the #DeleteFacebook campaign, the use of that platform and the time people spend on it have only increased dramatically in the years since.[75]Kanter, J. (2018, May 20). The backlash that never happened: New data shows people actually increased their Facebook usage after the Cambridge Analytica scandal. And this is before we consider Instagram, owned and developed by Facebook, whose user base increased by around 200 million (25%) from September 2017 to June 2018, and hasn’t budged since.[76]Clement, J. (2019, December 3). Number of monthly active Instagram users from January 2013 to June 2018.

Conclusion

The neoliberal establishment, in its push for increasing profit and social control, has leveraged Big Data — an extraction of personal information of unprecedented volume and depth — processed to learn, predict, and steer behavior, in order to corral people within acceptable boundaries as compliant subjects. These data are used both to create digital identities for nearly everyone in the world, tracked across the internet and in physical space through a variety of surveillance technologies, and to identify patterns of behavior across whole populations. The inevitable resistance to these artificially-imposed guard rails is factored in as “risk”, informing the redirection of a certain percentage of the population — disproportionately black, brown, and indigenous — into the carceral system. Otherwise, resistance is placated through the appointment of malleable politicians, often capitalizing on the shallow politics of representation, or co-opted through symbolic gestures and strategic investment.

This agenda finds one of its most odious incarnations in the education sector, where student data is extracted from birth through college and beyond, and used to reduce individual students to gauges of “success” or “progress”. Their performance and behaviors are monitored, tracked, predicted, and guided, and further atomized into discrete units of information used to determine the success — and thereby the return on investment, or the need to “iterate” — of EdTech solutions.

Humans are alienated from nature and the rest of the animal kingdom. We are alienated from each other along lines of gender, race, class, culture, etc., reinforced by national boundaries and enforced by violence. A culture of individualism isolates us further still, even within our own identity groups, communities, and families. Within the individual, the relations of production reinforce the false dichotomy between mind and body, such that “the work” crowds out our own vital needs. Enclosures of space restrict our mobility and access to fundamental needs, while enclosures of time, manifest in the logic of “efficiency”, manufacture urgency and privilege the moment over the freedom and possibility of reflecting on the past or contemplating the future.

Computer technology confines thinking within the human-machine interface, mediated by algorithms designed to capture our attention and drive our behaviors. Our interactions with these technologies — scrolls, clicks, swipes — are further enclosed into discrete data points, decontextualized and aggregated with trillions of others as Big Data, organized into a schema of external control and the accumulation of profit. 

Within this hyper-interior space, people become detached from the needs of their own bodies, the concerns of other people, and their impact on other living things and the environment. The boundaries of each of these nested enclosures are mediated and reinforced by the state, corporations and other powerful institutions, the dominant culture, and computer algorithms, so normalized as to be impervious, if not altogether invisible, to scrutiny and critique.

Schools have always primarily acted as engines of social control, and the shift to this “data-driven” paradigm signals an evolution in complexity, pervasiveness, and penetration. Contingent upon our ignorance, or compliance gained through distraction or fear, this Orwell-Machiavellian project sends us hurtling toward a convergence point, a singularity, a reckoning, where and when either “we” or “they” — Big Other — will determine our future. 

