IASSIST Quarterly 2022-12-29T06:56:38-07:00 Karsten Boye Rasmussen Open Journal Systems <p class="p1">The <strong>IASSIST Quarterly</strong> is an international, peer-reviewed, indexed, open access quarterly publication of articles dealing with social science information and data services, including relevant societal, legal, and ethical issues.</p> <p class="p1">The <strong>IASSIST Quarterly</strong> represents an international cooperative effort on the part of individuals managing, operating, or using machine-readable data archives, data libraries, and data services. The <strong>IASSIST Quarterly </strong>reports on activities related to the production, acquisition, preservation, processing, distribution, and use of machine-readable data carried out by its members and others in the international social science community. </p> The work continues 2022-12-15T13:10:49-07:00 Michele Hayslett <p>Welcome to the final issue of the <em>IASSIST Quarterly</em> for the year 2022 – <em>IQ</em> volume 46(4), our eagerly awaited special issue on <strong>Systemic Racism in Data Practices</strong>.</p> <p>This issue represents more than you might think: the culmination of more than two years of the intellectual hard work of writing, of course, but that in itself is not unusual for any journal issue. However, the global pandemic exploded just after the conception of this special issue and hit all of us hard, not only wreaking physical destruction of lives but also unleashing social upheaval, job insecurity, housing insecurity, and major mental health challenges. Social injustice erupted during the pandemic, shocking and enraging many of us with its violence and disregard for human dignity. I was privileged to witness the genesis of this issue, and I helped recruit our guest editors, Trevor Watkins and Jonathan Cain. I salute their perseverance, patience, and courage, and that of the article authors, in bringing this content to fruition. 
Many involved in this issue faced multiple personal challenges, from the loss of family members to repeated moves, job changes, and more, in the process of trying to get this work done. Some were unable to surmount the many obstacles and were forced to withdraw their proposals. So I do not think it is hyperbole to say this is the hardest issue we have ever produced. Trevor and Jonathan, thank you again for spearheading this important work.</p> <p>Some good things have come to IASSIST from the societal call for racial justice, including this issue of the <em>IQ</em>. IASSIST has initiated several new ventures to advocate for diversity and equity, both within our organization and among researchers generally: We restructured our membership fees to allow half price for people joining from lower-income countries. IASSIST also sponsored diversity scholarships for members to attend the American Library Association conference and the ICPSR Summer Program in Quantitative Methods in 2022. A new <a href="">Anti-racism Resources Interest Group</a>, which focuses on compiling anti-racism resources, has been working for more than two years and recently collaborated with the Professional Development Committee to present a webinar on varying national approaches to collecting (or not collecting) data about race and ethnicity (see <a href="">this page</a> for the webinar recording as well as the essays members have written). The group welcomes contributions of essays for additional countries and suggestions of other webinar topics. Looking ahead, the 2023 conference theme is Diversity in Research: Social Justice from Data, which is sure to result in some fascinating presentations (and future <em>IQ</em> papers!). And here at the <em>IQ</em>, we’re already contemplating a second special issue in this area around the role of social justice in data services. We invite volunteers who would like to serve as guest editors to contact us. 
And so the work continues.</p> <p>The <em>IQ</em> editorial team is happy to welcome a new volunteer, Phillip Ndhlovu, as our Managing Editor with this issue. Phillip is the Deputy Librarian at the Gwanda State University Library in Filabusi, Zimbabwe. We thank him profusely—his role is key to producing every issue, and his participation enables Ofira and me to focus on learning the editor’s role. We welcome suggestions for new features or columns, and encourage you to reach out if you are interested in becoming involved.</p> <p>From all of us on the <em>IQ</em> editorial team, we wish you a much better year in 2023. And meanwhile, enjoy the hard work of your colleagues herein. Read on for Trevor and Jonathan’s guest editors’ notes describing the enclosed articles.</p> <p>For the <em>IQ</em> Editorial Team,</p> <p> </p> <p><strong>Michele Hayslett</strong> – December 2022</p> <p>Karsten Boye Rasmussen</p> <p>Ofira Schwartz-Soicher</p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Michele Hayslett Systemic racism in data practices 2022-12-23T15:07:36-07:00 Trevor Watkins Jonathan O. Cain <p><strong>Positionality statement</strong></p> <p>As we begin to discuss this issue, its origins, and its importance in contemporary society, I want to acknowledge my positionality and the role that it may play in the formation of this issue. Jonathan O. Cain is an African American male working in the LIS field. Before moving into administration, I taught data and digital literacy and worked on developing programs that focused on improving access to these critical skills at zero cost to learners.</p> <p>It is important to acknowledge my positionality and the lens through which I see the data science field. Trevor Watkins is an African American male working in the LIS field in an academic library. 
I teach critical data literacy workshops and engage in diversity and BIPOC-related digital projects with faculty, students, and the broader academic community across the country. I am also a researcher and practitioner in artificial intelligence (AI) and data science.</p> <h2>The global pandemic, its impacts, and why it matters</h2> <p>We first met in August 2020, about five months into the pandemic, to discuss the possibilities of this special issue. We spent a good chunk of that meeting getting to know each other and, most importantly, discussed the toll the pandemic took on our communities and on us. It is probably safe to say that many of you, at some point, were uncertain of the future. Like most people worldwide, we lost family and friends or knew of people who succumbed to Covid-19 and other illnesses that weren't treated because the focus shifted to Covid-19. We get it. At one point, Covid-19 killed over three thousand people per day (Centers for Disease Control and Prevention (CDC), 2022). According to data from the CDC, 90% of the 385,676 people who died between March and December 2020 had Covid-19 listed as the underlying cause of death on their death certificate. The murders of Ahmaud Arbery in February, Breonna Taylor in March, and George Floyd in May 2020 sparked civic unrest across the United States (US) and protests across the globe in solidarity against racial injustice. When we announced this special issue and initiated a call for papers, we didn't get much of a response initially. We expected and acknowledged that it would probably take some time before we received inquiries or proposals about the issue, the intent to submit, or any submissions.</p> <p>Like many of you, we are still picking up the pieces from 2020 and dealing with the aftermath of Covid-19. The pandemic may be over now, depending on whom you ask, but the emotional scars are still there and may remain so for quite some time. 
Patience was the one quality we all had throughout this process, which is why we can present this publication today.</p> <h2>Data and liberatory technology</h2> <p>Liberatory technology. This is a concept that invited contemplation as we sat down to record our reflections on this special issue. In drawing together scholars, educators, and practitioners to address the issue of data and its relationship to race, ethnicity, and representation, we, as coeditors, were making a statement about the importance of data, the material impact that this seemingly abstract and ethereal object can and does have on individual and community lives. And thinking about that impact brought liberatory technology to the front of our minds. The definition of liberatory technology offered by the Ida B. Wells Just Data Lab intrigues us and invites us to grapple with that topic. They defined liberatory as something that "supports the increased freedom and wellbeing of marginalized people, especially black people outside of capitalism and settler colonial power structures" and technology as "a tool used to accomplish a task" (LIBERATORY TECHNOLOGY AND DIGITAL MARRONAGE, n.d.). And as we contemplate this set of definitions, we are left to question whether data can be a liberatory technology or not.</p> <p>In Liberation Technology: Black Printed Protest in the Age of Franklin, Richard S. Newman draws parallels between the assertion of ownership and mastery of new communication technologies and black liberation activities. Reflecting on the transformative nature of print technology, he writes, "If the Marquis de Condorcet was right in 1793 that print had unshackled Europe from medieval modes of thought and action, then it is also true that print was perhaps the first technology to liberate blacks from the servile images that had long haunted their existence in Western culture." He then draws a 19th-century example of how it expressly connects to black lives post-emancipation, noting, "W. E. B. 
Du Bois certainly thought that black history and print history worked in tandem. Wherever one found newspapers in the post-Civil War South, he observed, one found some form of black freedom" (Newman, 2009, p. 175). He even notes how scholars have shown that black activists embraced other communication technologies like photography "to reshape the image of African Americans in nineteenth-century culture" (Newman, 2009, p. 175).</p> <p>We have no shortage of examples of how data and data-driven technologies fail to support the "increased freedom and wellbeing of marginalized people outside of capitalism and settler colonial power structures." In 2016, ProPublica published Machine Bias, a report that looks at risk assessment technologies used in arraignment and sentencing. They report that "The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants" and that "white defendants were mislabeled as low risk more often than black defendants" (Angwin et al., 2016). A 2021 article, Fairness in Criminal Justice Risk Assessments: The State of the Art, noted in its analysis, "The false negative rate is much higher for whites so that violent white offenders are more likely than violent black offenders to be incorrectly classified as nonviolent. The false positive rate is much higher for blacks so that nonviolent black offenders are more likely than nonviolent white offenders to be incorrectly classified as violent. Both error rates mistakenly inflate the relative representation of blacks predicted to be violent. Such differences can support claims of racial injustice. In this application, the trade-off between two different kinds of fairness has real bite." (Berk et al., 2021, p. 
33)</p> <p>These are just a few examples of how these technological developments, on their own merits, fail to meet the definition offered by the authors of the "Liberatory Technology and Digital Marronage" zine from the Ida B. Wells Just Data Lab. Reflecting on the technological path illustrated by Newman, the work of ownership and mastery of the tool provides the potential for it to be liberatory. Through this lens, the work of the Just Data Lab is exemplary for this meditation; it draws a direct line from technology, education, and mastery to liberatory technology.</p> <h2>Data in higher education</h2> <p>Data literacy education has been a focus of our careers in librarianship. It's a space where we saw libraries' ability to make a meaningful impact. Data has had a tremendous impact on college campuses, from how research is conducted to the pressure colleges feel from stakeholder groups (students, governments, funders, donors, and employers) to prepare students with the data and technology skills needed to gain employment in the knowledge economy.</p> <p>As colleges and universities have turned (with varying degrees of success) to meet the needs of these communities, a myriad of explorations has emerged on the importance of representing these marginalized communities in these systems, in order to combat and dismantle the harmful practices embedded in the systems that drive society and the potentially debilitating consequences they produce. That is partly why the works in this special issue are so important at this moment in time. These scholars and scholar-practitioners are engaging with the issues that drive the opaque structures surrounding us. And hopefully, their work can give us another perspective on how to engage with these structures and transform them to support liberatory practices.</p> <h2>The entries in this issue</h2> <p>We have some fantastic articles for you to read in this issue. 
We open with an article by Kevin Manuel, Rosa Orlandini, and Alexandra Cooper, who discuss how the collection of racial, ethnic, and Indigenous data has evolved in the Canadian Census since 1871, the erasure of minorities and Indigenous citizens from those censuses, and the work to restore and accurately identify and categorize racialized groups.</p> <p>In the next article, Leigh Phan, Stephanie Labou, Erin Foster, and Ibraheem Ali present a model for data ethics instruction for non-experts by designing and implementing two data ethics workshops. They make important points about the failure of academia to incorporate the ethical use of data in course curricula and digital literacy training and demonstrate how academic libraries have become an essential resource for the academic community. Their workshop structure can serve as a model for any academic library that endeavors to provide a similar service to its community.</p> <p>In the third article, Nastasha Johnson, Megan Sapp Nelson, and Katherine Yngve interrogate the collective and local purposes of institutional data collection and its impact on student belongingness and propose a framework based on data feminism that centers the student as a person rather than a commodity.</p> <p>Finally, our closing article from Thema Monroe-White focuses on marginalized and underrepresented people in the data science field. The author proposes that racially relevant and responsive teaching is necessary to recruit more people from these groups and diversify the field. She discusses how the Ladson-Billings model of culturally relevant pedagogy has been applied to and benefits STEM curricula, and how a liberatory data science curriculum could promote a student's voice and sense of belonging.</p> <h2>Conclusion</h2> <p>We want to thank all those involved in producing this special issue. We want to thank the authors first. Their patience, dedication, and perseverance throughout this process were much appreciated. 
The reviewers provided timely, detailed, and thorough feedback. We would be remiss if we didn't acknowledge their hard work and labor. We would like to thank the IQ Editorial Team, Michele Hayslett and Karsten Boye Rasmussen, for working with us over the last two years, and Ofira Schwartz-Soicher, for helping us get to the finish line.</p> <p>Trevor Watkins</p> <p>Jonathan O. Cain</p> <h2>References</h2> <p>Angwin, J., Larson, J., Mattu, S., &amp; Kirchner, L. (2016). Machine Bias. <em>ProPublica</em>. Retrieved December 17, 2022, from <a href=""></a></p> <p>Berk, R., Heidari, H., Jabbari, S., Kearns, M., &amp; Roth, A. (2021). Fairness in Criminal Justice Risk Assessments: The State of the Art. <em>Sociological Methods &amp; Research</em>, <em>50</em>(1), 3–44. <a href=""></a></p> <p>Flipsnack. (n.d.). <em>Liberatory Technology Zine</em>. Retrieved December 17, 2022, from <a href=""></a></p> <p><em>LIBERATORY TECHNOLOGY AND DIGITAL MARRONAGE</em>. (n.d.). Ida B. Wells Just Data Lab. Retrieved December 17, 2022, from <a href=""></a></p> <p>Newman, R. S. (2009). Liberation Technology: Black Printed Protest in the Age of Franklin. <em>Early American Studies: An Interdisciplinary Journal</em>, <em>8</em>(1), 173–198. <a href=""></a></p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Trevor Watkins, Jonathan O. Cain Who is counted? Ethno-racial and Indigenous identities in the Census of Canada, 1871-2021 2022-06-13T10:11:18-06:00 Kevin Manuel Rosa Orlandini Alexandra Cooper <p>Finding data on race, racialized populations, and anti-racism in Canada can be a complex process when conducting research. One source of data is the Census of Canada, which has been collecting socio-demographic data since 1871. However, the collection of racial, ethnic, or Indigenous data has changed throughout the years and from Census to Census. 
In response to the need for more support in finding ethno-racial and Indigenous data, the Ontario Council of University Libraries’ Ontario Data Community has created an online guide to provide guidance, in part, about the terminology used for Indigenous and racialized identities over time in the Census. This article reviews the modifications to how ethno-racial origin questions have been asked and the ongoing changes to sociocultural perceptions impacting the Census.</p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Kevin Manuel A model for data ethics instruction for non-experts 2022-06-23T05:06:23-06:00 Leigh Phan Ibraheem Ali Stephanie Labou Erin Foster <p>The dramatic increase in the use of technological and algorithm-based solutions for research, economic, and policy decisions has led to a number of high-profile ethical and privacy violations in the last decade. Current disparities in academic curricula for data and computational science result in significant gaps regarding ethics training in the next generation of data-intensive researchers. Libraries are often called to fill the curricular gaps in data science training for non-data science disciplines, including within the University of California (UC) system. We found that in addition to incomplete computational training, ethics training is almost completely absent in the standard course curricula. In this report, we highlight the experiences of library data services providers in attempting to meet the need for additional training by designing and running two workshops: Ethical Considerations in Data (2021) and its sequel Data Ethics &amp; Justice (2022). We discuss our interdisciplinary workshop approach and our efforts to highlight resources that can be used by non-experts to engage productively with these topics. 
Finally, we report a set of recommendations for librarians and data science instructors to more easily incorporate data ethics concepts into curricular instruction.</p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Leigh Phan, Ibraheem Ali, Stephanie Labou, Erin Foster Deficit, asset, or whole person? Institutional data practices that impact belongingness 2022-07-01T14:20:00-06:00 Nastasha Johnson Megan Sapp Nelson Katherine Yngve <p>Given the capitalist model of higher education that has developed since the 1980s, the data collected by institutions of higher education on students is based on micro-targeting to understand students as consumers and to retain that customer base (i.e., to prevent attrition/dropouts). Institutional data has long been collected, but the authors question how, why, and for whom the data is collected in the current higher education model. The authors then turn to the current higher education focus on equity, diversity, and inclusion, and particularly on the concept of belongingness in higher education. The authors question the collective and local purposes of institutional data collection and the fallout of current practices, and argue that using existing institutional data to facilitate student belongingness is impossible with current practices. We propose a new framework of asset-minded institutional data practices that centers the student as a whole person and recenters data collection away from the concept of students as commodities. 
We propose a new framework based on data feminism that intends to elevate qualitative data and all persons/experiences along the bell-shaped curve, not just the middle two quadrants.</p> <p> </p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Nastasha Johnson, Megan Sapp Nelson, Katherine Yngve Emancipating data science for Black and Indigenous students via liberatory datasets and curricula 2022-06-13T10:15:33-06:00 Thema Monroe-White <p>Despite findings highlighting the severe underrepresentation of women and minoritized groups in data science, most scholarly research has focused on new methodologies, tools, and algorithms as opposed to <em>who</em> data scientists are or <em>how</em> they learn their craft. This paper proposes that increased representation in data science can be achieved via advancing the curation of datasets and pedagogies that empower Black, Indigenous, and other minoritized people of color to enter the field. This work contributes to our understanding of the obstacles facing minoritized students in the classroom and solutions to mitigate their marginalization.</p> 2022-12-28T00:00:00-07:00 Copyright (c) 2022 Thema Monroe-White