Add basic blog post pattern
All checks were successful
continuous-integration/drone/push Build is passing
This commit is contained in:
parent
36f600fc01
commit
c5742161e0
67
patterns/blog-post-content.php
Normal file
@@ -0,0 +1,67 @@
<?php

/**
 * Title: Blog Post content
 * Slug: jett/blog-content
 * Categories: jett_full_page_layouts, jett
 * Post Types: post
 */
?>
<!-- wp:paragraph -->
<p><strong>Contemporary discussion about</strong> automated computer systems is feeding into a moral panic for which technology is the savior. As mentioned in various (and usually United States-based) news stories and popular discourse, systems powered by bad data, bad algorithmic models, or both lead to ‘high-tech’ discrimination – misclassifications, over targeting, disqualifications, and flawed predictions that affect some groups, such as historically marginalized ones, more than others. To remedy this problem, many argue that the introduction of fair, accountable, and transparent machine learning will thwart biased, racist, or sexist automated systems, or so the story goes.</p>
<!-- /wp:paragraph -->
<!-- wp:paragraph -->
<p>But what computer scientists, engineers, and industry evangelists of fair machine learning get wrong is the sufficiency of technical tweaks to prevent or avoid discriminatory outcomes. This weakness stems not only from the fact that fairness, the counterpart to discrimination, means many different things depending on one’s normative understanding of equality. It also derives from the fact that these competing frameworks marshal different resources and remedies that variously involve laws, institutional policies, and procedures, as well as require cultural transformation to shift people’s behaviors, norms, and practices towards individuals and groups that differ from the status quo. Moreover, as Young (Citation1990) explains, discrimination ties to larger processes of oppression, which leave socially different groups susceptible to processes of violence, marginalization, exploitation, cultural imperialism, and powerlessness.</p>
<!-- /wp:paragraph -->
<!-- wp:paragraph -->
<p></p>
<!-- /wp:paragraph -->
<!-- wp:heading {"style":{"typography":{"fontSize":"42px"}}} -->
<h2 class="wp-block-heading" id="our-aims" style="font-size:42px">1. Our Aims</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>In this article, we grapple with the insufficiency of a techno-centric focus on data and discrimination by decentering debates on algorithmic bias and data injustices and connecting them to ongoing and often entrenched debates about traditional discrimination and injustice, which is not technologically mediated. This reflexive turn requires acknowledgment not only of the growing threats of surveillance capitalism (Zuboff, Citation2019), but also other social institutions or practices which have contributed to differential treatment of social groups.</p>
<!-- /wp:paragraph -->
<!-- wp:quote -->
<blockquote class="wp-block-quote"><!-- wp:paragraph {"fontFamily":"instrument-sans-semicondensed"} -->
<p class="has-instrument-sans-semicondensed-font-family">But what computer scientists, engineers, and industry evangelists of fair machine learning get wrong is the sufficiency of technical tweaks to prevent or avoid discriminatory outcomes.</p>
<!-- /wp:paragraph --></blockquote>
<!-- /wp:quote -->
<!-- wp:paragraph -->
<p>To accomplish this aim, we briefly review the ‘techno-centricity’ of fairness, accountability, and transparency studies, as well as data justice studies, which adopt a more sociotechnical approach but which nonetheless privilege technology. We then develop a normative ‘decentered’ framework that relies on Fraser’s (Citation2010) recent theory of social justice. We use this framework to analyze how European civil society groups make sense of data and discrimination. Attending to ideas of maldistribution, misrecognition, and misrepresentation, we present a thematic analysis of interviews with 30 civil society representatives in Europe’s human rights sector. We show how many groups prioritize the specific experiences of marginalized groups and ‘see through’ technology, acknowledging its connection to larger systems of institutionalized oppression. This decentered approach contrasts with the process-oriented perspective of tech-savvy civil society groups that shy away from an analysis of systematic forms of injustice. We conclude by arguing for a plurality of approaches that challenges both discriminatory processes (technological or otherwise) and discriminatory outcomes and that reflects the interconnected nature of injustice today.</p>
<!-- /wp:paragraph -->
<!-- wp:heading {"level":3,"style":{"typography":{"fontSize":"32px"}}} -->
<h3 class="wp-block-heading" id="technologically-mediated-discrimination" style="font-size:32px">Technologically mediated discrimination</h3>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>To appreciate the relevance of Fraser’s theory of justice, it is helpful to understand differences in how technology has been centered in discussion about discrimination. A comparison between the emergent fields of fairness, accountability, and transparency in machine learning, on the one hand, and data justice, on the other, also reveals how marginalization or systems of oppression do – and do not – feature alongside discussions of technology.</p>
<!-- /wp:paragraph -->
<!-- wp:paragraph -->
<p>A highly influential field focuses on engineering and technical choices to deal with problematic automated systems that risk harming specific groups. This field, known as fairness, accountability, and transparency studies, concentrates on various ethical dilemmas related to automated computer systems (Barocas, Citation2015).</p>
<!-- /wp:paragraph -->
<!-- wp:heading {"style":{"typography":{"fontSize":"42px"}}} -->
<h2 class="wp-block-heading" id="encryption-as-contemporary-resistance" style="font-size:42px">2. Encryption as contemporary resistance</h2>
<!-- /wp:heading -->
<!-- wp:paragraph -->
<p>In this article, we grapple with the insufficiency of a techno-centric focus on data and discrimination by decentering debates on algorithmic bias and data injustices and connecting them to ongoing and often entrenched debates about traditional discrimination and injustice, which is not technologically mediated. This reflexive turn requires acknowledgment not only of the growing threats of surveillance capitalism (Zuboff, Citation2019), but also other social institutions or practices which have contributed to differential treatment of social groups.</p>
<!-- /wp:paragraph -->
<!-- wp:paragraph -->
<p>To accomplish this aim, we briefly review the ‘techno-centricity’ of fairness, accountability, and transparency studies, as well as data justice studies, which adopt a more sociotechnical approach but which nonetheless privilege technology. We then develop a normative ‘decentered’ framework that relies on Fraser’s (Citation2010) recent theory of social justice. We use this framework to analyze how European civil society groups make sense of data and discrimination. Attending to ideas of maldistribution, misrecognition, and misrepresentation, we present a thematic analysis of interviews with 30 civil society representatives in Europe’s human rights sector. We show how many groups prioritize the specific experiences of marginalized groups and ‘see through’ technology, acknowledging its connection to larger systems of institutionalized oppression. This decentered approach contrasts with the process-oriented perspective of tech-savvy civil society groups that shy away from an analysis of systematic forms of injustice. We conclude by arguing for a plurality of approaches that challenges both discriminatory processes (technological or otherwise) and discriminatory outcomes and that reflects the interconnected nature of injustice today. ■</p>
<!-- /wp:paragraph -->
<!-- wp:paragraph -->
<p></p>
<!-- /wp:paragraph -->
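Because this file lives in the theme's patterns/ directory, WordPress 6.0+ registers the pattern automatically from the Title, Slug, Categories, and Post Types fields in the header comment; no explicit register_block_pattern() call is required. The two categories it references, jett_full_page_layouts and jett, still need to be registered so the pattern is grouped under readable labels in the inserter. Below is a minimal sketch of that registration; the function name, hook placement, labels, and 'jett' text domain are assumptions and are not part of this commit.

<?php
/**
 * Sketch: register the pattern categories used by patterns/blog-post-content.php.
 * Would typically live in the theme's functions.php; adjust names to the theme.
 */
function jett_register_pattern_categories() {
	// Category slugs must match the "Categories" header field of the pattern.
	register_block_pattern_category(
		'jett',
		array( 'label' => __( 'Jett', 'jett' ) )
	);
	register_block_pattern_category(
		'jett_full_page_layouts',
		array( 'label' => __( 'Full Page Layouts', 'jett' ) )
	);
}
add_action( 'init', 'jett_register_pattern_categories' );

The "Post Types: post" header limits the pattern to the post editor, which fits its role as starter blog content; patterns without that field are offered for every post type.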