Abstract
Engineers have long demonstrated that anonymizing bulk data is not a sufficient condition for ensuring that specific individuals cannot be isolated and re-identified from larger bodies of data and metadata. However, this approach still focuses on the most traditional conception of privacy, rooted in “the individual” as “sovereign” over their own personal sphere, whereby an individual’s privacy is to be protected as such. Instead, and just as worryingly, algorithms may seriously impair the right to privacy held by groups of individuals (and thus, more subtly and relationally, individuals themselves). Indeed, the first rudimentary approaches to these issues concluded that a functional equivalence does exist between the way personal data are gathered and aggregated and the patterns of data analysis concerning groups of individuals, with the difference that whilst the former is (unevenly) regulated nationally and internationally, the latter is almost consistently disregarded. Consequently, one should resolve to recognize full-standing “rights to group privacy” as a legal rather than merely moral obligation. It is thus advisable to assess whether individual and group rights to privacy should be enforced simultaneously, or whether algorithms that protect singular identities ought instead to sacrifice the group dimension. Moreover, policymakers should establish the (expectedly sophisticated) criteria for deciding which groups are not to be treated by algorithms’ owners, programmers, and end-users simply as “aggregated bundles of individual privacy rights” (defined here as “collective rights”, as in a standard “class action”). The main contribution of the present study is to further the theoretical understanding of the relevance of this distinction (collective vs. group rights) by drawing normative inferences from the modern psychological theory of Gestalt, eventually encoding a wider range of legal-philosophical shades and their plausible implications for increasingly nuanced privacy entitlements in everyday life.
| Original language | English |
|---|---|
| Pages (from-to) | 55-114 |
| Number of pages | 60 |
| Journal | Loyola University Chicago Journal of Regulatory Compliance |
| Issue number | VIII: Spring |
| Publication status | Published - 2022 |