Algorithms are calculation engines that sort information to produce a single output pertinent to the user (Crawford 2016, p. 79), and this process is determined by the programmers who design and modify the detailed instructions the software follows. Algorithms are not autonomous: they are “rule-based mechanisms” (Crawford 2016, p. 85), and it is these opaque rules that produce the ‘winners’ of information contests (Crawford 2016, p. 86). Studies have concluded that users rarely look beyond the first page of results and, within that page, often disregard results ranked below third (Bar-Ilan 2007, p. 156); improving the rank, and therefore the visibility, of a website is consequently critical for site owners.

To secure this improved placement, site owners “invest great efforts and money in pleasing the search engines [rather than] trying to please the users” (Bar-Ilan 2007, p. 156), which has prompted attempts to ‘game’ search algorithms (Crawford 2016, p. 82) for commercial or ideological advantage. Site owners can employ Search Engine Optimisation services (consequently favouring those with greater financial resources), or groups of people can construct Google bombs (Bar-Ilan 2007, p. 156). Google bombing is defined as “the activity of designing Internet links that will bias search engine results so as to create an inaccurate impression of the search target” (Price 2002 cited in Bar-Ilan 2007, p. 156), and this manipulation of the algorithm compromises the validity and equity of information contests. For example, when ‘Jew’ was typed into the search engine, the algorithm directed users to a highly anti-Semitic website (Bar-Ilan 2007, p. 161). Google argued that the “objectivity of [their] ranking system [prevented them] from making any changes” (Google 2007 cited in Bar-Ilan 2007, p. 162); however, the programmers of the search engine should bear responsibility for countering algorithmic ‘gaming’ that promotes offensive, subjective content.

The problem is further complicated on social media platforms, where content moderation depends on moderators’ interpretation of the terms of service (Matamoros-Fernandez 2017, p. 931) and their understanding of what constitutes hate speech (Matamoros-Fernandez 2017, p. 936). Facebook permits controversial content framed as humour or satire; however, this may be problematic because “defences of satire and irony to disguise racist and sexist commentary are a common practice online … that fosters discrimination and harm” (Matamoros-Fernandez 2017, p. 936). It is therefore important to establish clear, thorough guidelines that reduce subjectivity in content moderation (Matamoros-Fernandez 2017, p. 937) while also discouraging unjustified censorship of legitimate posts.
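To make the mechanics of this ‘gaming’ concrete, the toy Python sketch below shows how a ranking rule that rewards inbound anchor text (a simplified stand-in for the kind of link analysis early search engines relied on, and emphatically not Google’s actual algorithm) can be bombed: many colluding links carrying the same anchor text push a page to the top for a query term the page itself never mentions. The `score` function, its weights, and the example pages are all hypothetical illustrations.

```python
# A deliberately simplified, hypothetical ranking sketch -- NOT Google's
# actual algorithm -- illustrating why coordinated anchor text can
# "bomb" a result. A page earns relevance for a query from (a) its own
# text and (b) the anchor text of links pointing at it, so thousands of
# links repeating one phrase can outrank pages that genuinely match.

from collections import Counter

def score(query: str, page_text: str, inbound_anchors: list[str]) -> int:
    """Toy relevance score: on-page term matches plus anchor-text matches."""
    terms = query.lower().split()
    on_page = sum(page_text.lower().count(t) for t in terms)
    anchors = Counter(w for a in inbound_anchors for w in a.lower().split())
    linked = sum(anchors[t] for t in terms)
    return on_page + 2 * linked  # anchor text weighted higher (assumed here)

# The target page never mentions the query term...
biography = "Official biography of the President of the United States."
# ...but thousands of colluding pages link to it with the same anchor text.
bomb_anchors = ["miserable failure"] * 5000

print(score("failure", biography, bomb_anchors))            # 10000: bombed to the top
print(score("failure", "An essay about failure.", []))      # 1: genuine match loses
```

The point of the sketch is only that any fixed, opaque rule creates a target to optimise against; real ranking systems are far more elaborate, but the incentive structure described above is the same.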

An example of a Google bomb. 'Failure' is written in the search engine and the official biography of George W. Bush is the fifth result
An example of a Google bomb

Bar-Ilan, J. 2007, ‘Manipulating search engine algorithms: the case of Google’, Journal of Information, Communication & Ethics in Society, vol. 5, no. 2, pp. 155-166.

Crawford, K. 2016, ‘Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics’, Science, Technology, & Human Values, vol. 41, no. 1, pp. 77-92.

Matamoros-Fernandez, A. 2017, ‘Platformed racism: the mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube’, Information, Communication & Society, vol. 20, no. 6, pp. 930-946.

SearchEnginePeople 2010, The 10 Most Incredible Google Bombs, SEO, viewed 23 August 2019, <https://www.searchenginepeople.com/blog/incredible-google-bombs.html>.
