Update Seven

Image Three. This image conveys how difficult digital dark paths are to navigate. It is still being edited, as I intend to add text that will further emphasise the message I want to communicate.

Update Six

Image Two. This image conveys that there are actions users can take to protect the privacy of their data (for example, limiting ad tracking). I took the photograph of the mailboxes to symbolise the collection of one’s personal information, and the pixelated mailboxes communicate that advertisers cannot use (or see) your data if you do not give them permission. Users are therefore not victims (as portrayed in the previous image) but are active in securing their own digital privacy.

I think I need to link the text directly to data to emphasise the digital literacy topic to the audience.

Update Five

This image is still being edited.

Big Data is Watching You = Big Brother is Watching You

Eye = surveillance

Windows = privacy of users and their details

The logo of Facebook in pupil = Facebook is the source of surveillance. Should I make this more distinct or is the subtlety more effective?

Update Four

Researching ‘privacy Zuckering’ as a form of digital dark path led me to further understand the extent of the data Facebook holds on its users – and how it profits from this data. The accumulation of data that Facebook stores reminded me of Foucault’s Panopticon, or rather the Data Panopticon (a product of digitalisation).

From this, I started brainstorming how I could construct an image that explores the relationship between Facebook and the Data Panopticon, and how the use of this data contributes to surveillance capitalism. Surveillance capitalism can help advertisers tailor their campaigns to their publics; however, the data collected can also be exploited to manipulate those publics (as seen in political campaigns).

It was difficult to find an image of the panopticon that I could ethically use, so I instead looked for an image that represented the concept behind the panopticon: surveillance. I found this image and know I can edit it further to illustrate my argument:

[Image: worm’s-eye view of a grey building]

One signifier of windows is that people can look outside; it follows that people can also look inside and invade the privacy of the inhabitant. This can work well with my image.

[Image: selective-focus photograph of a camera lens]

This image could be placed in the circle of the first image; however, I think it would be better to use an image of my own instead. I want to take a photo of my iris (so that it is circular in shape) and increase its transparency so that the Facebook logo can be seen behind it.

I want to connect the eye to the window somehow but I am still brainstorming how I can do this in a way that contributes to the concept I want to convey.

Update Three

There has been a change in organisations. I am no longer representing TED, nor am I using the BCII course, because I could not effectively justify my image drafts or explain how they were of particular importance or relevance to the organisation. I will instead be using the Public Communication course and constructing a communications campaign for Insil, as its use of technological developments in advertising (and marketing) will allow me to discuss its relationship with digital literacy and data privacy.

“Insil is recognised as a leading global digital marketing agency. We offer the agility and dedication of an independent agency with the backing of considerable resources, technology and expertise, both in Australia and internationally, bringing our collective experience with some of the world’s biggest brands to every brief or challenge.” (<https://www.insil.com.au/>)

The analysis of data and the application of algorithms have improved the personalisation of social media and out-of-home advertising while simultaneously creating new issues in data privacy; this makes Insil pertinent to this assessment task.

Update Two

The week 10 discussion on ‘digital dark paths’ concerns the privacy of data and the strategic use of interface design to manipulate online navigation. For example, LinkedIn included a dark pattern on its ‘Add Contacts’ page that nudged users into granting access to their email contacts, which allowed LinkedIn to message (and spam) those contacts.

The image will inform the receiver, and it may itself use the dark pattern techniques designed to mislead, such as colour, size, placement and complex phrasing. I am also considering editing an image of a maze / labyrinth to metaphorically convey the excessive, deliberately confusing process of a dark pattern.
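To make the mechanism concrete for myself, here is a minimal Python sketch of the logic behind a pre-checked, asymmetrically styled consent step like the LinkedIn example above. All option labels, styling descriptions and function names are invented for illustration; the point is that most users accept the defaults, so the design, rather than an informed choice, decides whether contacts are shared.

```python
from dataclasses import dataclass


@dataclass
class ConsentOption:
    label: str
    pre_checked: bool  # whether the option is selected before the user acts
    prominence: str    # how visually prominent the control is


def add_contacts_step() -> list[ConsentOption]:
    # Hypothetical example: the misleading choices live in the defaults and
    # styling. Sharing is pre-checked and prominent, while declining is
    # small, grey and worded as a loss.
    return [
        ConsentOption("Continue and invite your email contacts",
                      pre_checked=True, prominence="large blue button"),
        ConsentOption("Skip this step (you may miss out on connections)",
                      pre_checked=False, prominence="small grey text link"),
    ]


def contacts_shared(options: list[ConsentOption],
                    user_changed_defaults: bool) -> bool:
    # Most users click through without altering anything, so the pre-checked
    # default, not an informed decision, determines the outcome.
    if user_changed_defaults:
        return False
    return options[0].pre_checked


print(contacts_shared(add_contacts_step(), user_changed_defaults=False))  # True
```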


This research has introduced me to the term ‘privacy Zuckering’, which I intend to investigate further for the next update. I may use this term to address the use (and exploitation) of users’ private data on Facebook, as this is prevalent in contemporary media discussions.

Update One

I am completing a double degree in Public Communication and Creative Intelligence and Innovation, and the organisation I will use is associated with the latter course. TED provides a platform for individuals to share ‘ideas worth spreading’ and therefore often informs and stimulates its audience members. For example, I enjoyed watching the TED Talk ‘Facebook’s role in Brexit – and the threat to democracy’ by Carole Cadwalladr (and the documentary ‘The Great Hack’, in which she features), and it is this discussion of data privacy that has shaped the campaign I am developing for digital literacy. The campaign could be used at a TEDx program or conference to further educate attendees. I will specify this audience further as I progress.


Bachelor of Creative Intelligence and Innovation: Problems to Possibilities

The Problems to Possibilities course is associated with the campaign because the images will communicate the contemporary (and diverse) problems in data privacy, and convey that an understanding of data privacy improves users’ digital literacy and consequently gives them greater control of their personal information and online navigation.

I would like the images to convey both the problems of data privacy when it is exploited and the possibilities of data privacy when it is understood and respected by content producers and consumers alike.

Post Four: Algorithms

Algorithms are calculation engines that sort information to form a single output that is pertinent to the user (Crawford 2016, p. 79), and this process is determined by the programmers who design and modify the detailed instructions the software adheres to. Algorithms are not autonomous; they are “rule-based mechanisms” (Crawford 2016, p. 85), and it is these unknown rules that produce the ‘winners’ of information contests (Crawford 2016, p. 86).

Studies have concluded that users rarely look beyond the first page of results and, within that first page, often do not consider results ranked below third (Bar-Ilan 2007, p. 156). It is therefore significant for site owners to improve the rank, and subsequent visibility, of their websites. To secure this placement, site owners “invest great efforts and money in pleasing the search engines [rather than] trying to please the users” (Bar-Ilan 2007, p. 156), and this has initiated attempts to ‘game’ search algorithms (Crawford 2016, p. 82) for their own advantage. Site owners can employ Search Engine Optimisation services (consequently favouring those with greater financial resources), or groups of people can develop Google bombs (Bar-Ilan 2007, p. 156). Google bombing is defined as “the activity of designing Internet links that will bias search engine results so as to create an inaccurate impression of the search target” (Price 2002, cited in Bar-Ilan 2007, p. 156), and this manipulation of the algorithm compromises the validity and equity of information contests. For example, when ‘Jew’ was typed into the search engine, the algorithm directed users to a highly anti-Semitic website (Bar-Ilan 2007, p. 161). Google argued that the “objectivity of [their] ranking system [prevented them] from making any changes” (Google 2007, cited in Bar-Ilan 2007, p. 162); however, the programmers of the search engine should have a responsibility to counter algorithmic ‘gaming’ that promotes offensive, subjective content.

This is further complicated on social media platforms because content moderation depends on one’s interpretation of the terms of service (Matamoros-Fernandez 2017, p. 931) and one’s understanding of what constitutes hate speech (Matamoros-Fernandez 2017, p. 936). Facebook allows humour and satire to produce controversial content; however, this may be problematic as “defences of satire and irony to disguise racist and sexist commentary are a common practice online … that fosters discrimination and harm” (Matamoros-Fernandez 2017, p. 936). It is therefore important to establish clear, thorough guidelines that reduce subjectivity in content moderation (Matamoros-Fernandez 2017, p. 937) and discourage unjustified censorship of legitimate posts.

An example of a Google bomb: ‘failure’ is typed into the search engine and the official biography of George W. Bush is the fifth result.
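This is not how Google’s actual ranking algorithm works, but a toy rule-based ranker makes the mechanism behind Google bombing visible: if inbound anchor text feeds a ranking rule, a coordinated group of linkers can push an unrelated page to the top. All links and pages below are invented for this sketch.

```python
from collections import Counter

# (anchor_text, target_page) pairs standing in for hyperlinks across the web;
# the first three are the coordinated 'bomb'.
links = [
    ("failure", "example.gov/official-biography"),
    ("failure", "example.gov/official-biography"),
    ("failure", "example.gov/official-biography"),
    ("failure", "dictionary.example/failure"),
    ("meaning of failure", "dictionary.example/failure"),
]


def rank(query: str) -> list[tuple[str, int]]:
    # Rule-based scoring: a page earns one point per inbound link whose
    # anchor text contains the query. Anchor text is written by the linker,
    # not the page owner, which is exactly what a Google bomb exploits.
    scores = Counter(target for anchor, target in links if query in anchor)
    return scores.most_common()


print(rank("failure"))
# [('example.gov/official-biography', 3), ('dictionary.example/failure', 2)]
```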

Bar-Ilan, J. 2007, ‘Manipulating search engine algorithms: the case of Google’, Journal of Information, Communication & Ethics in Society, vol. 5, no. 2, pp. 155-166.

Crawford, K. 2016, ‘Can an Algorithm be Agonistic? Ten Scenes from Life in Calculated Publics’, Science, Technology, & Human Values, vol. 41, no. 1, pp. 77-92.

Matamoros-Fernandez, A. 2017, ‘Platformed racism: the mediation and circulation of an Australian race-based controversy on Twitter, Facebook and YouTube’, Information, Communication & Society, vol. 20, no. 6, pp. 930-946.

SearchEnginePeople 2010, The 10 Most Incredible Google Bombs, SEO, viewed 23 August 2019, <https://www.searchenginepeople.com/blog/incredible-google-bombs.html>.

Post Three: Access and Participatory Media

It is significant to differentiate disability from impairment, as the former is socially constructed to justify the unnecessary barriers placed before, and the exclusion of, those who have impairments (Ellis & Goggin 2015, p. 78). The fluidity of these terms emphasises that disability occurs when infrastructure does not accommodate an impairment and thus does not enable the user (Ellis & Goggin 2015, p. 78). This ableism should be corrected by the technology corporations that developed the infrastructure; however, it is often those with disabilities who ensure greater accessibility for themselves by crowdfunding improvements (Ellis & Goggin 2015, p. 85). It is proposed that crowdfunding can afford participation opportunities for people with disabilities (Ellis & Goggin 2015, p. 84), but it should not be the responsibility of subaltern counterpublics (Sommerfeldt 2013, p. 282) to ensure their own inclusion by providing the “expertise, labour and funds for technology creation and redesign efforts” (Ellis & Goggin 2015, p. 85). Technology corporations, such as Twitter, must understand that contemporary users expect content that is “designed to be accessible without individuals needing to identify their particular needs” (Arthur 2017, para. 4); therefore, features that increase accessibility (such as alternative text) should become a necessity of web design.

[Image: a drawing of a piece of paper with writing on it in the bottom-right corner, linked by three dotted lines to drawings of an eye, a hand and an ear, representing the senses through which people interact with the communication on the paper.]
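Alternative text is also easy to audit programmatically, which suggests it could be treated as a default of web design rather than an afterthought. Below is a minimal sketch using only Python’s standard library; it flags <img> tags that lack an alt attribute, and the sample markup is invented for illustration.

```python
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Collects the sources of <img> tags that lack an alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt: list[str] = []

    def handle_starttag(self, tag, attrs):
        # attrs arrives as (name, value) pairs; flag images with no alt at all.
        if tag == "img" and "alt" not in dict(attrs):
            self.missing_alt.append(dict(attrs).get("src", "<unknown source>"))


# Invented sample markup: one image described for screen readers, one not.
sample_html = """
<img src="campaign-poster.jpg" alt="Poster reading 'Big Data is Watching You'">
<img src="decorative-divider.png">
"""

checker = AltTextChecker()
checker.feed(sample_html)
print(checker.missing_alt)  # ['decorative-divider.png']
```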

Additionally, participatory culture has allowed greater acknowledgement of disability to occur in digital media. A participatory culture is defined as an environment in which “members believe their contributions matter, and feel some degree of social connection with one another” (Jenkins et al. 2007, cited in Jenkins 2015, p. 4), and this has allowed people with disabilities to consume, exchange and produce digital content meaningfully. It has been proposed that this participatory digital media is an extension of the public sphere (Forde et al. 2002, cited in Ellis & Goggin 2015, p. 79), but as such media is often excluded from the dominant discourse, it may be more accurate to say that these new spaces (Ellis & Goggin 2015, p. 80) contribute to the formation of smaller public sphericules of “parallel discursive arenas” (Sommerfeldt 2013, p. 282). For example, Ouch! has been criticised for marginalising the expert perspective of writers with disability, as the BBC does not feature their articles in its mainstream discourse (Ellis & Goggin 2015, p. 81). It is thus evident that disability communities are often excluded from the public sphere and that this “barrier to participation is not [necessarily because of] the technology but the kinds of privilege that are often ignored in meritocratic discourse” (Jenkins 2015, p. 22).


Arthur, L. 2017, Why alt text matters, UTS, Sydney, viewed 16 August 2019, <https://futures.uts.edu.au/blog/2017/06/01/alt-text-matters/>.

Caralyn, G. 2018, Writing Accessible Web Content, Sparkbox, viewed 16 August 2019, <https://seesparkbox.com/foundry/writing_accessible_web_content>.

Ellis, K. & Goggin, G. 2015, ‘Disability Media Participation: Opportunities, Obstacles and Politics’, Media International Australia, vol. 151, no. 1, pp. 78-88.

Jenkins, H. 2015, ‘Defining Participatory Culture’, in Jenkins, H., Ito, M. & Boyd, D. (eds) Participatory culture in a networked era: a conversation on youth, learning, commerce, and politics, Polity Press, Cambridge, pp. 1-31.

Sommerfeldt, E.J. 2013, ‘The civility of social capital: Public relations in the public sphere, civil society, and democracy’, Public Relations Review, vol. 39, no. 4, pp. 280-289.