Commercial enterprises are less interested in the foolproof registration of the inhabitants of a territory. Their focus is on obtaining relevant data on as many customers and potential customers as possible, as part of their marketing and sales strategies. With customer loyalty no longer a given, companies develop customer relationship management (CRM) in the hope of surviving the competition of neoliberal market economies. At the same time, they try to find out which consumers could become new customers, and under what conditions. They appear to be less interested in uniquely identifying a particular client than in a sophisticated type of categorization that allows them to offer targeted services at the right time and place. Context is not only the central message of the adherents of cultural theory: companies are not merely interested in the attributes of predefined categories of customers and potential customers, but rather invest in the question of which classes they should discriminate in the first place. This is where profiling comes in.
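By way of illustration, the sketch below shows how such categories can be induced from behavioral data rather than defined in advance: a clustering algorithm is left to discover the customer segments itself. The features, numbers and segment structure are invented for the purpose of the example, and k-means merely stands in for whatever data-mining technique a company might actually deploy.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-customer features: monthly spend (EUR),
# store visits per month, and sensitivity to discounts (0..1).
rng = np.random.default_rng(seed=0)
customers = np.vstack([
    rng.normal([60, 4, 0.20], [15, 1.0, 0.05], size=(100, 3)),   # regulars
    rng.normal([250, 12, 0.05], [40, 2.0, 0.02], size=(100, 3)),  # big spenders
    rng.normal([20, 1, 0.80], [8, 0.5, 0.10], size=(100, 3)),     # bargain hunters
])

# Nobody tells the algorithm what the segments are: it induces the
# categories from the data, which is exactly the point made above.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)

for label in range(3):
    segment = customers[model.labels_ == label]
    print(f"segment {label}: n={len(segment)}, mean features={segment.mean(axis=0).round(2)}")

The categories that come out of such a procedure need not correspond to any classification a marketer would have thought of beforehand; they are artefacts of the data and the algorithm.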
Both corporate and global governance seem to require increasingly sophisticated means of identification. Supposedly justified by appeals to security threats, fraud and abuse, citizens are filtered, located and detected, and their data are stored, aggregated and analyzed. At the same time, potential customers are profiled to detect their habits and preferences in order to provide targeted services. Both industry and the European Commission are investing huge sums in what they call ambient intelligence (AmI) and the creation of an "Internet of Things". Such smart, networked environments depend entirely on real-time monitoring and profiling, resulting in the real-time personalization of the environment. In this article the author will assess the risks and opportunities of such autonomic profiling in terms of its impact on individual autonomy and refined discrimination, and examine how effective traditional data protection is with regard to profiling. Profiling, then, is not typically human, even if we have developed our own type of profiling, which cognitive psychologists call stereotyping (Spears et al. 1997) and which Schauer (2003) analyzes in his Profiles, Probabilities, and Stereotypes. What is special about humans is their ability – located by brain researchers in the prefrontal cortex – to reflect on the profiles they generate.
This is a rare ability, closely related to consciousness and language, and we will not explore the field much further here, leaving it at the intersection of neuroscience and the philosophy of mind (Haggard and Libet 2001; Overgaard 2001). What matters is our ability to consciously reflect on the profiles we have generated unconsciously, because this gives us the freedom to consider them, to reject or reinforce them, and to apply them deliberately. As Rouvroy (2008) aptly describes, this is what makes our self-education possible. It is the precondition for qualifying our actions as stemming from freedom of action: we can become aware of the patterns that govern our behavior and review them in order to change our habits. While most of our interactions run automatically, steered by habits inscribed in our bodies and brains, we can recall those habits and check their relevance, validity, fairness and justice. This makes us autonomous agents, capable of consciously deciding on a course of action and of deciding by which law we want to live: "autonomous" derives from the Greek autos (self) and nomos (law). We can live by a law of our own and can therefore be held accountable for our own actions (Hildebrandt 2008a).
At first sight this seems to offer relevant protection against the automated application of profiles, but the safeguard suffers from several pitfalls. The third pitfall is that the article ceases to apply once the decision is no longer taken in a solely automated manner, due to (routine) human intervention. In the case of autonomic profiling in an AmI environment this is not much of an option, since the seamless, real-time adaptation of the environment precludes such human intervention.
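To make concrete why such an environment leaves no room for the human intervention that this third pitfall turns on, consider a toy sketch of the adaptation loop an AmI environment runs: the profile is applied and the environment adjusted in the same instant the sensor reading arrives, so there is no moment at which a human decision could be interposed. All names, attributes and thresholds below are invented for illustration.

from dataclasses import dataclass

@dataclass
class InferredProfile:
    # Attributes inferred from past behavior, never explicitly confirmed
    # by the subject of the profile.
    preferred_temp_c: float
    receptive_to_ads: bool

def adapt(reading: dict, profile: InferredProfile) -> list[str]:
    """Apply the profile to a live sensor reading and adapt the environment at once."""
    actions = []
    if reading["room_temp_c"] < profile.preferred_temp_c:
        actions.append(f"raise heating to {profile.preferred_temp_c} C")
    if profile.receptive_to_ads and reading["near_display"]:
        actions.append("show targeted offer on nearest display")
    return actions  # executed immediately: no human checkpoint anywhere in the loop

profile = InferredProfile(preferred_temp_c=21.0, receptive_to_ads=True)
for reading in [{"room_temp_c": 18.5, "near_display": True},
                {"room_temp_c": 21.5, "near_display": False}]:
    print(adapt(reading, profile))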
This brings us to the fourth and final pitfall: if you are not aware that you are subject to such decisions, you cannot exercise this right. The fact that Article 12 grants the right "to know the logic involved in any automatic processing of data concerning him", at least in the case of the automated decisions referred to in Article 15, does not really help if one knows nothing at all about automated decisions being taken. This holds even though the profile, once applied, may in fact constitute personal data and thus fall within the scope of Articles 11 and 12, which oblige the data controller to inform the data subject and to grant access. In other words, the current technological and organizational infrastructure makes it almost impossible to seriously assess whether and when the Directive is violated, creating the illusion of adequate data protection. As Hosein (2005) has argued, the U.S. approach may actually yield better results because of the constitutional protections available and the more vigilant nature of civil society in the United States. Data have a legal status: personal data, at least, are protected. Europe tends to understand this protection in terms of personality rights, which opens up the possibility of declaring certain data inalienable.
In practice, however, leaving trails of personal data is taken to imply consent to their storage and use. Whatever written guarantees we find in privacy policies, in practice most people, most of the time, have not the slightest idea of what happens to which data, leading to the application of which profiles. Some American scholars, notably Lessig (1999), advocate commodification to facilitate trade in personal data; in their eyes this would at least provide citizens with some measure of control. However, as discussed above in relation to Schwartz (2000), market failure can be expected, in the sense that due to grotesque asymmetries in knowledge, implied consent is based on ignorance – just as it is today. In both cases one of the problems is that we have no access to the group profiles derived from masses of aggregated data, and no idea of how these profiles affect our life chances. It may be time to rethink the legal focus on the protection of personal data, as well as privacy advocates' focus on privacy-enhancing technologies. What we need is additional attention to dynamically generated group profiles, which need not be derived from one's own personal data at all, but may still contain knowledge about one's likely (un)healthy habits, earning capacity, willingness to take risks, lifestyle preferences, consumption habits, political affiliations, and so on. Such refined categorization can also shade into outright discriminatory profiling. In Ortega-Melendres v. Arpaio, a federal district court recently certified a class action on behalf of Latinos in Maricopa County, Arizona, finding that the plaintiffs had provided sufficient evidence that the Maricopa County Sheriff's Office intentionally engaged in racial profiling during traffic stops. Among the evidence cited by the court were statements by the sheriff suggesting that his officers were both authorized and encouraged to stop people based on their appearance, with specific reference to racial characteristics that, in his words, mark individuals as having "the appearance of the Mexican illegal". Accordingly, the court upheld the plaintiffs' Fourth and Fourteenth Amendment claims and issued an injunction prohibiting the office from detaining any person based only on knowledge or reasonable belief, without more, that the person is unlawfully present in the United States. (On whether Arizona laws and regulations contribute to racial profiling, see CRS Report R41221, State Efforts to Deter Unauthorized Aliens: Legal Analysis of Arizona's S.B. 1070.)
To address these potential threats, we need to take a closer look at the asymmetries between citizens, on the one hand, and the large organizations that have access to their profiles, on the other. What is at stake is not so much the asymmetry of effective access to personal data as the asymmetry of effective access to knowledge. In particular, to the extent that such knowledge is protected as a trade secret or by intellectual property rights, the citizens to whom it may be applied have no access to it. Zarsky (2002–2003) has used a number of examples to show how this lack of access can lead to what he calls the "autonomy trap": precisely because a person is not aware of the profiles applied to him, he can be seduced into acting in ways he would not otherwise have chosen. Imagine that my online behavior is profiled and matched with a group profile which predicts a probability of 67% that I am a smoker about to quit smoking.
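A minimal sketch of how such a group profile might be applied to me is given below. The behavioral markers and the 67% figure carry over from the example; the matching rule and all names are invented, and real profiling systems would of course use far subtler statistical machinery.

# A group profile mined from *other people's* aggregated data: people who
# visit cessation forums and buy nicotine gum turned out, in the mined
# data, to be smokers about to quit with probability 0.67.
GROUP_PROFILE = {
    frozenset({"visits_cessation_forums", "buys_nicotine_gum"}): 0.67,
}

def apply_profile(observed_behavior: set) -> float:
    """Return the predicted probability for the first matching group profile."""
    for markers, probability in GROUP_PROFILE.items():
        if markers <= observed_behavior:  # all markers present in my clickstream
            return probability
    return 0.0

# My clickstream matches the profile even though none of the data used to
# build it was mine -- which is what makes the trap invisible to me.
my_clickstream = {"visits_cessation_forums", "buys_nicotine_gum", "reads_news"}
p = apply_profile(my_clickstream)
if p > 0.5:
    print(f"P(smoker about to quit) = {p:.0%}: serve offers designed to keep me smoking")

The point is not the arithmetic but the one-way mirror: the organization running such code knows the profile and the threshold, while I know neither.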