Aleph of Emotions is a project created during my studies at LASALLE College of the Arts, Singapore. The project explores emotions and the patterns that may be observable in emotions worldwide.
The Aleph, according to the author Jorge Luis Borges, is a point in the universe where all other points exist; anyone looking into the Aleph can therefore see everything in the universe at once. In this project, I use the Aleph as a metaphor for an archive; Aleph of Emotions refers to an archive of emotions. This archive is produced from data collected from Twitter, based on keywords that define certain emotions. The results are finally presented through an interactive object.
This work uses Robert Plutchik's list of emotions as a starting point. This list proposes that there are eight basic emotions. Based on this list, keywords that relate directly to these emotions were chosen. Custom software was then written to collect tweets containing these keywords, using the Twitter API. Tests were conducted to ensure the keywords suited their particular emotions, and once the test results were satisfactory, data mining began. The data mining process went on for a month. Meanwhile, an interactive tangible interface was devised to read the collected data and visualize it. The camera-like interface reacts to the user's actions and illustrates the relevant data accordingly.
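The keyword-matching step might look like the following sketch. The keyword lists and function name here are illustrative assumptions, not the ones actually used in the project.

```python
# Hypothetical keyword lists per emotion; the project's real lists are not
# published here, so these entries are placeholders for illustration.
EMOTION_KEYWORDS = {
    "joy": ["happy", "joyful", "delighted"],
    "sadness": ["sad", "unhappy", "miserable"],
    # ...the remaining six emotions would follow the same pattern
}

def tag_emotions(tweet_text):
    """Return the set of emotions whose keywords appear in the tweet."""
    words = tweet_text.lower().split()
    return {
        emotion
        for emotion, keywords in EMOTION_KEYWORDS.items()
        if any(keyword in words for keyword in keywords)
    }
```

A tweet can match more than one emotion, which is why the function returns a set rather than a single label.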
Twitter was mined for data over 35 days. A Python program was written to filter and save the tweets along with some relevant metadata such as location, timezone, etc. Since the geo-locations stored by users were inconsistent, timezones were used to determine the locations of tweets. The complete dataset exceeded 25 gigabytes, and this data had to be compressed into a much smaller file for quick visualization. Therefore, another program was written to calculate and store the number of tweets for each emotion on each day. This new file worked out to be less than a megabyte and could easily be stored and accessed. An Android phone is used as the display. The Android device runs an app that merely communicates with an Arduino board, which is programmed and set up to read human interaction. Human interaction here refers to pointing the 'camera' in a particular direction and focusing the lens on a particular city.
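The compression step described above amounts to a simple aggregation: counting tweets per (day, emotion) pair. A minimal sketch, assuming the raw dump has already been parsed into (day, emotion) records (the record shape is an assumption for illustration):

```python
from collections import defaultdict

def aggregate(records):
    """Count tweets per (day, emotion) pair.

    records: an iterable of (day, emotion) tuples parsed from the raw dump.
    Returns a dict mapping (day, emotion) -> tweet count, which is small
    enough to serialize to a file of well under a megabyte.
    """
    counts = defaultdict(int)
    for day, emotion in records:
        counts[(day, emotion)] += 1
    return dict(counts)
```

Reducing 35 days x 8 emotions to at most a few hundred counters is what lets the 25 GB dump shrink to a sub-megabyte summary file.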
Emotions are categorized, following Plutchik's theory, into joy, sadness, trust, disgust, fear, anger, surprise and anticipation. Plutchik also created a wheel-like diagram to explain his theory. In most illustrations, each emotion has a particular colour (see links below), and these exact colours were used to visualize each emotion in this project.
Yellow represents joy, light green represents trust, dark green represents fear, light blue represents surprise, dark blue represents sadness, pink represents disgust, red represents anger and orange represents anticipation.
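The colour assignments above can be captured in a simple lookup table. The colour names below come directly from the description; the exact colour values used in the project's renderer are not specified here, so named colours stand in for them.

```python
# Emotion-to-colour mapping, following Plutchik's wheel as described above.
# Named colours are used as stand-ins for the project's exact values.
EMOTION_COLOURS = {
    "joy": "yellow",
    "trust": "lightgreen",
    "fear": "darkgreen",
    "surprise": "lightblue",
    "sadness": "darkblue",
    "disgust": "pink",
    "anger": "red",
    "anticipation": "orange",
}
```

Keeping the mapping in one table means the visualization code never hard-codes a colour next to a drawing call.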
Data is visualized for each day of the week, as well as overall. The topmost bar shows the overall distribution of emotions for the place; each bar that follows represents a day of the week, starting from Sunday.
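One way to derive a single bar's segments from the per-day counts is to normalize each emotion's count against the day's total. This is a hypothetical sketch of that step; the function name and data shape are assumptions, not the project's actual code.

```python
EMOTIONS = ("joy", "trust", "fear", "surprise",
            "sadness", "disgust", "anger", "anticipation")

def bar_segments(day_counts):
    """Convert one day's raw counts into fractional segment widths.

    day_counts: dict mapping emotion name -> tweet count for that day.
    Returns a dict mapping each emotion to its share of the bar (0.0-1.0).
    """
    total = sum(day_counts.get(emotion, 0) for emotion in EMOTIONS)
    if total == 0:
        return {emotion: 0.0 for emotion in EMOTIONS}
    return {emotion: day_counts.get(emotion, 0) / total
            for emotion in EMOTIONS}
```

The same function serves the topmost overall bar by passing in the summed counts across all days.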