This Is How Search Engines Reinforce Racism

Have you used a search engine today? Chances are high that you have already run three or four Google searches today, as you do every day, just like many other people: around 3.5 billion Google searches are made every day. Why do you use a search engine? Chances are, you are looking for accurate information, correct?

What if search engines make you more racist, though?

We turn to technology for innovation, for solutions to our problems. We look at it as a means to an end: technology as the application of the sciences to the useful arts, as Jacob Bigelow once famously put it. Social norms, values, and structures existed before advances in technology. Tech fixes are an indicator of social issues presumed worthy of fixing. According to Safiya Umoja Noble, “there is a missing social context in commercial digital media platforms, and it matters, particularly for marginalized groups that are problematically represented in stereotypical or pornographic ways, for those who are bullied, and for those who are consistently targeted.”

The Algorithms of Oppression

What are the so-called algorithms of oppression? To understand search engines and the results they spit out when we look for a solution to a problem, we must understand why specific results appear. The search engine results you see are the most popular ones. Thus, the algorithms make us believe that we have objectively gotten the best results, although they are probably the best-paid-for results. Why is it problematic to have the most popular results turn up first when we look something up? After all, if a result is popular, it must have helped many people with a similar problem.

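To make that concern concrete, here is a minimal, purely hypothetical ranking sketch in Python. The weights, fields, and example pages are invented for illustration and are not Google's actual algorithm; the point is only that when popularity and paid promotion carry more weight than relevance, the top result is not necessarily the best answer.

```python
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    relevance: float   # how well the page actually answers the query (0..1)
    popularity: float  # existing clicks/links the page has accumulated (0..1)
    ad_spend: float    # paid promotion behind the page (0..1)

def score(r: Result) -> float:
    # Hypothetical weighting in which popularity and payment dominate relevance.
    return 0.2 * r.relevance + 0.5 * r.popularity + 0.3 * r.ad_spend

results = [
    Result("Accurate but little-known explainer", relevance=0.9, popularity=0.2, ad_spend=0.0),
    Result("Popular, heavily promoted page", relevance=0.4, popularity=0.9, ad_spend=0.8),
]

for r in sorted(results, key=score, reverse=True):
    print(f"{score(r):.2f}  {r.title}")
# 0.77  Popular, heavily promoted page
# 0.28  Accurate but little-known explainer
```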

Therefore, we can trust the search results. No, we cannot. While we trust the results to be objective and accurate, we tend to forget that “multinational advertising platforms online are not trusted, credible public information portals. Most people think of Google and search engines in particular as a public library, or as a trusted place where they can get accurate information about the world.” Noble goes on to say:

“They’re designing technologies for society, and they know nothing about society.”

Another way that racist slurs, bigotry, and anti-Semitic sentiments are further promoted lies in the power of keyword targeting. Since search engines make money with this service, the chances are that we see paid results. In 2020, Leon Yin and Aaron Sankin published their article Google Ad Portal Equated “Black Girls” with Porn, an investigation of Google’s Keyword Planner. It showed that “Google’s systems contained a racial bias that equated people of color with objectified sexualization while exempting White people from any associations whatsoever. In addition, by not offering a significant number of non-pornographic suggestions, this system made it more difficult for marketers attempting to reach young Black, Latinx, and Asian people with products and services relating to other aspects of their lives.”

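An audit in the spirit of Yin and Sankin's investigation comes down to comparing the keyword suggestions returned for different demographic seed phrases. The sketch below assumes the suggestions have already been exported and flagged for adult content; the seed phrases, keywords, and flags are placeholders, not real Keyword Planner output.

```python
# Placeholder data: (suggested keyword, flagged_as_adult) per seed phrase.
# The export from Keyword Planner and the content flagging are assumed here.
suggestions = {
    "seed phrase A": [("keyword 1", True), ("keyword 2", True), ("keyword 3", False)],
    "seed phrase B": [("keyword 4", False), ("keyword 5", False), ("keyword 6", False)],
}

def adult_share(rows):
    """Fraction of suggested keywords flagged as adult content."""
    if not rows:
        return 0.0
    return sum(1 for _, flagged in rows if flagged) / len(rows)

for seed, rows in suggestions.items():
    print(f"{seed}: {adult_share(rows):.0%} of suggestions flagged as adult")
```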

Consequently, Google’s Keyword Planner, an essential part of its online advertising ecosystem, is very likely to privilege those with the most resources. Therefore, segments of society with the least resources, like children, will never be able to fully control how they’re represented. Or, as Hans Rollmann summarized the findings of Safiya Umoja Noble’s book Algorithms of Oppression:

“It demonstrates that search engines, and in particular Google, are not simply imperfect machines, but systems designed by humans in ways that replicate the power structures of the western countries where they are built, complete with all the sexism and racism that are built into those structures.”

This raises the question of how we should evaluate the results we are given. How are these algorithms selecting the search results we’re exposed to? Who codes these algorithms?

Algorithms Are Opinions Expressed in Code

People make these applications. However, as data science systems become more common, interpreting their results and governing and managing them becomes challenging. How do we reliably assess the trustworthiness of data, algorithms, and models?

It is undisputed that this view has been overwhelmingly white, male, and (maybe a little too) tech-enthusiastic in the past. There is a need to challenge hierarchical (and empirically wrong) classification systems in data science. Why? Well, the overwhelming majority of humanity is not white and male.

This problem is not new. According to Charlton McIlwain, “the question we have to confront is whether we will continue to design and deploy tools that serve the interests of racism and white supremacy.” In his article, he looks at the history of data collection and how technology has perpetuated racism from the very beginning.

Let’s face it. Google is no stranger to a long history of racism embedded in its search engine results. In her study from 2013, Harvard professor Latanya Sweeney conducted searches of 2,184 racially associated personal names across two websites to investigate ad delivery by Google AdSense. She found that “searching traditionally Black names on Google was far more likely to display ads for arrest records associated with those names than searches for traditionally White names.” Latanya Sweeney concludes that there “is discrimination in the delivery of these ads.” MIT Technology Review picked this up in an article, stating that “clearly Sweeney has discovered a serious problem here given the impact online presence can have [on] an individual’s employment prospects,” and that her idea of technology offering a solution to this problem, by implementing algorithms that can “reason about the legal and social consequences of certain patterns of click-throughs,” is very interesting and “one that Google, www.instantcheckmate and society, in general, ought to consider in more detail.”

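The statistical core of an audit like Sweeney's is an independence question: does the type of ad served depend on whether the searched name is Black-identifying or White-identifying? Here is a minimal sketch with placeholder counts, not Sweeney's actual figures.

```python
from scipy.stats import chi2_contingency

# Rows: name group; columns: [arrest-record ad shown, neutral ad shown].
# The counts are placeholders for illustration only.
observed = [
    [60, 40],  # Black-identifying names
    [25, 75],  # White-identifying names
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4f}")
# A small p-value indicates the ad type is not independent of the name group,
# i.e. evidence of differential ad delivery.
```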

In 2015, Google faced another controversy when its photo service labeled pictures of Black people as gorillas. How did Google fix this problem? They didn’t, really. Instead, they blocked all images tagged as “gorillas,” according to a Wired report in 2018.

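Blocking a label is a patch, not a fix: the classifier still makes the offensive misclassification; the output is simply hidden. A hypothetical sketch of what such a suppression layer amounts to (the blocklist and predictions below are illustrative, not Google Photos internals):

```python
# Illustrative blocklist; the Wired report mentions "gorillas", the other entries are assumptions.
BLOCKED_LABELS = {"gorilla", "chimpanzee", "monkey"}

def visible_labels(predictions):
    """Hide blocked labels from a classifier's output instead of retraining the model."""
    return [(label, score) for label, score in predictions
            if label.lower() not in BLOCKED_LABELS]

print(visible_labels([("gorilla", 0.92), ("person", 0.88)]))
# -> [('person', 0.88)]  (the misclassification is hidden, not corrected)
```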

In 2019, researchers in Brazil discovered that “searching Google for pictures of ‘beautiful woman’ was far more likely to return images of White people than Black and Asian people, and searching for pictures of ‘ugly woman’ was more likely to return images of Black and Asian people than White people.”

We Must Reframe the Problem

The focus on how tech companies have perpetuated problematic power structures is essential, urgent, and cannot be stressed enough. Or, as Charlton McIlwain puts it:

“If we don’t want our technology to be used to perpetuate racism, then we must make sure that we don’t conflate social problems like crime or violence or disease with black and brown people. When we do that, we risk turning those people into the questions we deploy our technology to solve, the threat we design to eradicate.”

Algorithms are rule-based; they are not rule-bound, leaving room for interpretation, creativity, and discovery. We need to look at these systems with a data feminist stance. Why? Because data feminism tries to identify oppressive power structures and flawed data by posing questions. Questions we have to ask ourselves the next time we google something. Ask yourself: data science by whom? Data science for whom? Data science with whose interests in mind?

I think it’s about time, don’t you?

Translated from: https://medium/swlh/this-is-how-search-engines-reinforce-racism-43471ef85b46
