6 Sexism is a feature, not a bug
As Safiya Noble has shown, existing problems become entrenched and magnified by profit-seeking technologies masquerading as neutral public resources.
This is not new: as something that grew out of the Second World War and the Cold War that followed, electronic computing technology has long been an abstraction of political power into machine form.
Techno-optimist narratives surrounding high technology and the public good—ones that assume technology is somehow inherently progressive—rely on historical fictions and blind spots that tend to overlook how large technological systems perpetuate structures of dominance and power already in place.
- UK:
- the fiction of meritocracy can scuttle an industry
- computing has long been aligned with neocolonial projects that present fantasies of control
- 1943: Britain led the world in electronic computing
- 1974: the British computing industry was basically extinct
- early computer work was seen as akin to factory work and was denigrated for its association with machinery ⇒ women became the first computer operators and programmers
- the feminization of computer work continued after the war
- women, despite having more experience, were never promoted to managerial positions and were always paid less
- a trans man had his pay raised immediately after his transition
- a trans woman was told to downplay/hide her transition so her pay wouldn’t go down
- computer systems were expanding to take over more aspects of government
- while the complexity of the work didn’t change, the perception of it did
- women computer workers who possessed the skills to perform the job were squeezed out by men trainees with no technical skills
- women from the Machine Grades were forbidden from applying for the newly created management-aligned computer jobs
- government and industry began a major push to recruit men into technical positions while training them for management positions
- standards of technical proficiency were lowered to create an elite class above the Machine Grades in name and power but beneath them in technical skill
- this resulted in a labor shortage: men didn’t want to get stuck in the largely feminized "backwater" of computer work
- many still saw machine work as unintellectual and working class
- the demand for programming, systems analysis, and computer operation in government and industry went unmet
- the government began outsourcing ⇒ software companies were set up, one of them (not by choice) by Stephanie "Steve" Shirley
- After being passed over multiple times for a promotion she had earned, she found out that the men assigned to her promotions case were repeatedly resigning from the committee when her case came up, rather than risk having to give a woman a promotion. Her ambition was seen as a liability, even though in a man it would have been rewarded.
- Shirley left her job and formed Freelance Programmers, one of the first companies to recognize software as a standalone product
- it allowed women to work from home ⇒ tapped into a deep well of discarded expertise
- The programming for the Concorde was managed and completed entirely by a remote workforce of nearly all women, programming with pencil and paper from home, before testing their software on rented mainframe time.
- the UK started to view the computer as a powerful tool in its international political arsenal
- the UK insisted on British computers to run all UK government work ⇒ computers could be a back door into the highest levels of the state
- mid-1960s: women programmers were able to get jobs in the new higher-level technical grades, but not if they came from the "pink collar" machine grades
- this didn’t last, and by mid-1967 hiring had gone back to the previous gendered practices
- the interweaving of computing processes with all functions of the state ⇒ the power technical workers held was becoming indispensable
- solutions considered:
- training more young men was a waste because many left
- outsourcing was temporary
- employing women was a nonstarter because of the power and prestige of these jobs
- so managers at the top of government decided to re-engineer computing systems to function with a smaller labor force ⇒ ever more massive mainframes, centralizing to the greatest extent possible
- the British government merged all the major British computing companies into ICL, but by the mid-1970s massive, expensive mainframes were no longer desirable, so the company sank
- IBM made smaller mainframes and more flexible, decentralized systems; ICL could not compete with IBM
- since the majority of the British computer industry had been merged together and ICL had neglected the development of smaller mainframes, the entire British computer industry went down with it
- discrimination against women was a highly constructed and artificial feature
The manner in which government leaders and industry officials worked together to standardize and codify a gendered underclass of tech workers, and then to later upskill that work once the managerial power of computers became clear, was not evolutionary or accidental. It was an intentional set of systems design parameters intended to ensure that those who held the most power in predigital society, government, and industry continued to hold that power after the “computer revolution.”
As Margot Lee Shetterly points out in Hidden Figures, Black women workers were only brought into critical jobs in NASA when Cold War tensions made their labor too valuable to ignore—too important to continue to exclude on the basis of their Blackness.
9 Your Robot Isn’t Neutral
It’s no surprise that we see a host of emergent robotic designs that are pointed toward women’s labor: from doing the work of being sexy and having sex, to robots that clean or provide emotional companionship. Robots are the dreams of their designers, catering to the imaginaries we hold about who should do what in our societies.
Now more than ever it is crucial to interrogate the premise of anthropomorphization as a design strategy as one that relies on gender and race as foundational, infrastructural components. The ways in which gender and race are operationalized in the interface continue to reinforce the binaries and hierarchies that maintain power and privilege. While customization may offer some individual relief to problematic representations in the interface, particularly for marginalized users, sexism and racism persist at structural levels and, as such, demand a shifted industry approach to design on a broad level.
Instead, we need to think about how robots fit into structural inequality and oppression, to what degree capital will benefit from the displacement of women through automation, and how the reconstruction of stereotypical notions of gender will be encoded in gender-assigned tasks, free from other dimensions of women’s intellectual and creative contributions.
Crawford and Shultz warn that the use of predictive modeling through gathering data on the public also poses a serious threat to privacy; they argue for new frameworks of “data due process” that would allow individuals a right to appeal the use of their data profiles.
Moreover, the predictions that these policing algorithms make—that particular geographic areas are more likely to have crime—will surely produce more arrests in those areas by directing police to patrol them. This, in turn, will generate more “historical crime data” for those areas and increase the likelihood of patrols. For those who live there, these “hot spots” may well become as much PII [personally identifiable information] as other demographic information.1
12 Coding is not empowerment
- Code.org: blames the "education pipeline"
- Paul Graham: change the middle school computer science curriculum
- implicit bias training programs: no research demonstrating their effectiveness
- sometimes they made matters worse
- the pipeline argument puts it on underrepresented groups to solve their own exclusion by learning to code at an early age
- Hadi Partovi: belief that machines are objective and socially neutral
- Jacob Kaplan-Moss: "Programmers like to think they work in a field that is logical and analytical, but the truth is that there is no way to even talk about programming ability in a systematic way. When humans don’t have any data, they make up stories, but those stories are simplistic and stereotyped."
- it is harmful to believe the tech industry is meritocratic
If the exclusion of minorities is naturalized as reflecting their lack of merit, rather than a moral failing within the industry, then diversity initiatives can only be justified in economic terms, as a strategy to improve products or make companies more competitive.
- Ellen Berrey: celebration of cultural differences as a competitive advantage; diversity as an end goal with instrumental pay-offs
- programs teaching coding: framed as a business case rather than a fairness issue
- explicitly "meritocratic" settings: managers favor a male employee over an equally qualified female employee
The myth of the superstar coder encourages managers to reward men’s "heroic" last-minute problem-solving over women’s proactive efforts to prevent crises from occurring in the first place.
- macho heroics
- encourage minorities to solve problems in their own communities; affluent white men don’t understand the problem and context and make products full of blind spots
14 Skills will not set you free
However, like the majority of skills training programs directed at marginalized youth in contemporary India, the Seelampur program produced precarious and low-paid workers at the fringes of the information economy.
- the focus on the entrepreneurial individual deflected attention from the responsibility of the government and an unprotected labor market
- new risks of exploitation that functioned through deception and opportunism
- If leisure, creativity, and complex human emotions are intermeshed with work, then it becomes increasingly difficult for individuals to discern exploitative risks of labor and to practice resistance or moments of refusal to work.
- technical training does not erase race-, class-, and gender-based assumptions about what technically trained people look like.
Digital inclusion—or fixing the "bug" in the form of technology access and skills—was a celebrated goal for policy makers and elite IT professionals.
- skills programs mainly produced employment at the lower rungs of the information economy: temporary, gendered, and vulnerable to exploitation
13 Source code isn’t
- Thompson hack: a quasi-mathematical proof of the impossibility of completely verifying the security of any system
As Edwards elaborates, “higher-level applications are built on top of lower-level software such as networking, data transport, and operating systems. Each level of the stack requires the capabilities of those below it, yet each appears to its programmers as an independent, self-contained system.”
Edwards’s account stresses the rapid pace at which platforms can be developed, rolled out, see widespread use, and then fade away as they are replaced by new platforms. Software platforms are flickering, evanescent flames, burning on top of the old slow infrastructure while allowing users and developers to pretend that the infrastructure isn’t even there. The transition between low-level and high-level languages is the first flicker of this process, the first moment that software developers can begin to treat the machines that subtend software as irrelevant.
- quines: programs that print their own source (the same text functioning both as string data and as C commands); a minimal sketch follows
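For concreteness, here is a minimal C quine along those lines (an illustrative sketch, not code from the chapter): the program’s entire text appears once as a passive string literal and once as active commands, and printf fills the string’s placeholders with the string itself to reproduce the whole file.
#include <stdio.h>
int main(void) {
    char *s = "#include <stdio.h>%cint main(void) {%c    char *s = %c%s%c;%c    printf(s, 10, 10, 34, s, 34, 10, 10, 10, 10);%c    return 0;%c}%c";
    printf(s, 10, 10, 34, s, 34, 10, 10, 10, 10);
    return 0;
}
Here 10 and 34 are the ASCII codes for newline and the double-quote character, the same kind of "magic number" move as the 11 for vertical tab discussed below. (The quine carries no comments because any comment would also have to be reproduced inside the string.)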
- bootstrapping: adding features to a compiler by repeatedly using it to compile extended versions of itself
- compiler: a chicken-and-egg problem
- Because the compiler determines what \v means, the compiler cannot translate \v until \v has first been specified in a machine code version of the compiler. However, producing that machine code version of the compiler requires first having a working specification of \v. The solution to this problem is mundane: the code must be changed to define \v in reference to a lower-level standard:
if (c == 'v') return(11);
Why 11? This is the number (arbitrarily) given to vertical tab in ASCII, the encoding used on most computers for representing the Latin character set.17 The compiler produced using this code now correctly parses \v as vertical tab, at least so long as it is run on a system that uses the ASCII character set. Note that the compiler produced through this process now accepts the original definition of \v, the one that the previous version of the compiler flagged as an error. Once you have compiled one version of the source code containing the “magic number” 11 for vertical tab, you can change the code back to:
if (c == 'v') return('\v');
and the now-educated compiler will compile it without complaint. You can forget the number 11 and the entire ASCII standard altogether; \v now means “vertical tab.”
It is inserted because at some point in the past, on some machine we have no knowledge of, source code existed that said that the pattern of characters associated with the code of login should be interpreted in this unexpected way. The sinister code haunts the workings of the machine without ever revealing itself in human-readable text. Our space oddity has completed its third-stage burn, and no one on the ground will ever know for sure where it’s gone.
Once we have the logic of the Thompson hack implemented, it is no longer possible to fully verify that any machine is uncompromised, because every single piece of software that itself generates software is a vector for diligent attackers to exploit. Every layer in every nth-order platform is suspect. We are reduced to having to simply trust that no such diligent attacker has targeted our machine—no matter how skilled we are at detecting attacks, and no matter how much time we have to analyze our machines for bugs. All our stable platforms are potentially riddled with invisible trapdoors.
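The two-stage logic reads roughly as follows. This is a toy sketch: the function names, the substring tests, and the printouts are all invented for exposition, and a real trojaned compiler would hide the injections inside ordinary code generation rather than announce them.
#include <stdio.h>
#include <string.h>

/* Toy model of a trojaned compiler's entry point. */
static void compile(const char *src) {
    if (strstr(src, "login")) {
        /* Stage 1: when compiling login, also emit code that
           accepts the attacker's master password. */
        printf("[trojan] injecting login backdoor\n");
    }
    if (strstr(src, "compile")) {
        /* Stage 2: when compiling the compiler itself, reinsert
           both of these tests into the new binary, so the trojan
           survives recompilation from clean source. */
        printf("[trojan] injecting self-reproducing trojan\n");
    }
    printf("compiling normally: %.30s...\n", src);
}

int main(void) {
    compile("int login(const char *user, const char *pw) { /* ... */ }");
    compile("static void compile(const char *src) { /* ... */ }");
    return 0;
}
Running it shows stage 1 firing on login-like source and stage 2 firing when the compiler is fed its own source; stage 2 is the self-reproducing step that lets the trojan disappear from every human-readable source file.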
In the early 1970s, Ken Thompson, esteemed Bell Labs employee, was empowered to play with his company’s machines in order first to implement an esoteric but extraordinarily effective Trojan horse development methodology that he had read about in a US Air Force paper. Further, he was empowered to use social-engineering techniques to get this Trojan installed on unauthorized machines. When less esteemed playful programmers—the “Dalton gang,” the “414 gang”—used similar technical and social-engineering techniques to break into systems without authorization, they were equivalent to drunk drivers and burglars.
This clever and playful use of computers resulted in chastisement from the instructors. Who gets to creatively play with computer technology depends less on creativity and more on identity categories.
If, to use Lawrence Lessig’s famous analogy, “code is law,” Ken Thompson had the power to write and alter the digital constitution by personal fiat.33 Thompson’s possession of power gave him the authorization to play—and to play irresponsibly—that the marginal Dalton gang lacked.
Moreover, programmers who already occupy privileged positions—already-esteemed software developers, college-educated computer science students from First World nations, and so forth—are the ones most empowered to play around with techniques like the Thompson hack. The perceived acceptability of using this technique depends less on the hacker’s skill and more on who the hacker is.
This paper describes a practical technique, termed diverse double-compiling (DDC), that detects this attack and some unintended compiler defects as well. Simply recompile the purported source code twice: once with a second (trusted) compiler, and again using the result of the first compilation. If the result is bit-for-bit identical with the untrusted binary, then the source code accurately represents the binary.
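Sketched as a build check in C (staying in the chapter’s language; the compiler names, flags, and file paths here are assumptions for illustration, not from Wheeler’s paper):
#include <stdlib.h>

/* Sketch of diverse double-compiling (DDC): check an untrusted
   compiler binary ("untrusted-cc") against its purported source
   ("compiler.c") using an independent, trusted compiler. */
int main(void) {
    /* Step 1: build the source with the second, trusted compiler. */
    if (system("trusted-cc -o stage1 compiler.c") != 0) return 1;
    /* Step 2: rebuild the same source with the stage-1 result. */
    if (system("./stage1 -o stage2 compiler.c") != 0) return 1;
    /* Step 3: compare bit for bit; cmp exits 0 only if identical,
       in which case the source accurately represents the binary. */
    return system("cmp stage2 untrusted-cc");
}
If cmp reports a difference, then either the untrusted binary was not built from that source, the build is nondeterministic, or something like the Thompson hack is in play.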