2020
DOI: 10.17576/gema-2020-2004-06
Google Autocomplete Search Algorithms and the Arabs' Perspectives on Gender: A Case Study of Google Egypt

Abstract: Search engines have become an essential part of everyone's life, with Google being the most popular. Google Search provides the autocomplete feature for faster and easier search results, offering 10 top suggestions at a time, and these may influence how users view different social groups. Different scholars have explored online discourse to reveal stereotypes about certain groups. However, little or no attention has been paid to technological affordances to reveal broader gender biases and stereotypes in the A…
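Studies of this kind typically collect autocomplete outputs by issuing partial queries and recording the suggestion list returned. As a minimal sketch of what that data handling might look like: Google's public suggest endpoint (e.g. `https://suggestqueries.google.com/complete/search?client=firefox&q=...`) returns a two-element JSON array of the query and its suggestions. The payload below is illustrative only, not data from the study, and the parsing helper is a hypothetical name, not the authors' code.

```python
import json

# Illustrative example of the JSON shape returned by a suggest-style
# endpoint: a two-element array of [query, [suggestion, ...]].
# This sample payload is invented for demonstration; it is NOT data
# collected in the article.
sample_response = '["women are", ["women are strong", "women are equal"]]'

def parse_suggestions(raw: str) -> list[str]:
    """Extract the suggestion list from a suggest-endpoint response body."""
    _query, suggestions = json.loads(raw)
    return suggestions

print(parse_suggestions(sample_response))
```

Researchers would then repeat such queries across gendered prompt stems (and across locales such as Google Egypt) and code the returned suggestions for stereotype content.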

Cited by 6 publications (3 citation statements)
References 36 publications (40 reference statements)
“…Previous work on the moderation of especially Google autocompletion has concerned itself with how it was once prone to outputting derogatory content such as 'are Jews [evil]', where the autocompletion part is in brackets [10]. Indeed, journalists and scholars alike have reported particularly shocking outputs for queries of 'women' [72], 'old men' and 'old women' [55], religions [10], sexual orientation [4], gender identity [2] and others. Generally speaking, up until 2016, Google product outputs, from web search to autocompletion, were described as 'organic' by the company, or reflections, however unpleasant, of 'what was happening on the web' [10].…”
Section: Related Work, 2.1 Content Moderation of Search Engine Autocomp... (mentioning)
Confidence: 99%
“…We decided to follow the categorisation of Choenni et al. [13] in this case. We could have also created more intersectional categories (e.g., queer Indian men), but left the broader terms (with up to one qualifier) so that it would allow for comparison of moderation attention (across engines) in the categories demarcated by Google. The source code developed by [13] is available here: https://github.com/RochelleChoenni/stereotypes_in_lms. We found this prompt to be particularly effective in returning the maximal number of results for non-marginalised or non-politicised groups during an initial data exploration, compared to four others in the original research by Choenni et al. [13].…”
(mentioning)
Confidence: 99%
“…Recent research shows that search suggestions–the short lists of words and phrases users are shown as they type characters into the search bar–can also shift thinking and behavior [15, 49, cf. 50–57]. Because negative (or “low-valence”) words draw far more attention and clicks than neutral or positive words [58, 59], one of the simplest ways to shift opinions to favor one candidate or cause is to suppress negative search terms for that candidate or cause.…”
Section: Introduction (mentioning)
Confidence: 99%