Security Systems

With regard to computer abuse, the term "malicious insider" tends to be associated with male employees, likely because men commit more crimes than women do. We draw on the chivalry hypothesis to inform our study and explore whether managers demonstrate gender bias when making decisions about insider threats posed by subordinate employees. We recruited managers as participants and randomly assigned them to an "employee gender" condition: half read a scenario featuring a female offender, and half read a scenario featuring a male offender.
With female names, voices, and characters, artificially intelligent Virtual Personal Assistants (VPAs) such as Alexa, Cortana, and Siri appear to be decisively gendered female. Through an exploration of the various facets of gendering at play in the design of Siri, Alexa, and Cortana, we argue that gendering VPAs as female may pose a societal harm, insofar as it reproduces normative assumptions about the role of women as submissive and secondary to men. In response, this article examines the potential role and scope of data protection law as one possible solution to this problem.