Misplaced Trust (2019)

Project Outline

I was inspired to make this piece by the events surrounding the referendum on the UK’s membership of the EU, in which AggregateIQ (IQ) used targeted adverts to sway voters. The adverts were designed to blend into the normal background noise of social media, playing on people’s insecurities. In contrast to this, I decided to play with the concept of targeting and make it a central part of my project. I set the work in a voting booth, drawing on the idea that the end result of IQ’s activities is a change in voting habits. A user’s political leaning is estimated from their reactions to a selection of articles. This data is then used to decide whether to target them: users who appear to sit near the centre of the scale, and are therefore susceptible to an advert, are singled out. If a user is targeted they are blasted with air and the booth strobes white light, the direct opposite of the low-profile adverts used by IQ and others. The user also receives a printout of a graph of their political leaning, along with an explanation of the project.

Intent

The intended audience for this piece is anyone who has previously used social media, and has therefore potentially been exposed to the actions of AggregateIQ. I have built it in the hope of educating people on how even a small selection of online interactions can be used to target ads that may sway you one way or another. I started thinking about how machine anthropomorphism is blurring the lines between what is human and what is machine. Specifically, I thought a lot about how devices such as Alexa creep into our social spaces and in doing so lure us into revealing more information about ourselves to the networks we connect with. These ideas solidified around the revelations that AggregateIQ had used Facebook to target hard-to-reach voters using personal details they had lured them into providing through cleverly crafted advertising [1],[2]. This was captured well in Channel 4’s Brexit: The Uncivil War, which explains the tactics and shows the opposition of established MPs who did not believe in them. As with all my pieces, I have endeavoured to convey these ideas through the interactivity of the work. In this case I use a combination of the work’s presence as an art piece and inviting, Facebook-style reactions to gauge people’s opinions on a set of topics introduced through newspaper headlines. I make these less obvious by interspersing them with other, more mundane posts. In all, the program presents 15 pieces of content. Given the potency of the content, these few reactions are enough to give a ballpark estimate of a person’s position on a political spectrum. While this is nothing like the depth of data available to IQ, it is enough to surprise people when it is handed to them. The data gathered also feeds the basic algorithm which sorts those who might be susceptible to political advertising from those who would not be.

Production

I designed the project with many independent parts that would all be controlled by a core program. This way, even if I ran into problems with any of the external systems I would still have a working piece. The base program was ‘The Network’ interface, which could display content it loaded from a folder, making sure it would be visible on the screen and correctly scaled. It would also display responses which could be clicked to progress through the content while indicating your feelings on that topic. This system could stand on its own and work quite nicely, I thought.

To elaborate on this, however, and make it more of a memorable experience, I turned the idea of AggregateIQ’s hidden adverts on its head and decided to create its opposite: the most obvious targeting system possible. I took an idea from an earlier iteration of the project, an automatic water cannon that would be lethal to a machine but not a human. In this application I simplified it to just shoot air, both to make it easier to build and to avoid the need for lengthy warning messages which would put some people off. The cannon consisted simply of an air compressor that could hold a volume of air and cut off automatically, an Arduino with a relay, and a solenoid valve. The compressor and the valves were both the 1/8” size commonly used for airbrushing, as these were readily available and operated at a safe, relatively low pressure. The Arduino controller started out as a custom-built piece of stripboard containing an ESP8266 (a WiFi-enabled Arduino-compatible board) and the relay integrated circuit. This was later simplified when I found a relay module that attached to an ESP-01 (a cut-down version of the ESP8266), which was more reliable. I found a library for these ESP modules that allows control via the Art-Net protocol, which I am familiar with from previous lighting work. This meant I could control the air blast using a DMX channel.
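To illustrate the idea, here is a minimal sketch of such a relay controller, assuming the ArtnetWifi Arduino library (the code actually used lives in the ESPArtNetRelay repository linked under Code below); the relay pin, DMX channel and WiFi credentials are placeholders.

```cpp
// Minimal ESP8266 Art-Net relay sketch (illustrative, assumes the ArtnetWifi library).
#include <ESP8266WiFi.h>
#include <ArtnetWifi.h>

const char* ssid     = "network-ssid";   // placeholder credentials
const char* password = "network-pass";

const int RELAY_PIN   = 0;   // many ESP-01 relay boards switch via GPIO0 - check your module
const int DMX_CHANNEL = 1;   // 1-based DMX channel assigned to this valve

ArtnetWifi artnet;

// Called by the library for every ArtDmx packet received.
void onDmxFrame(uint16_t universe, uint16_t length, uint8_t sequence, uint8_t* data) {
  if (DMX_CHANNEL <= length) {
    // Treat the channel as a simple switch: values above 127 open the valve.
    digitalWrite(RELAY_PIN, data[DMX_CHANNEL - 1] > 127 ? HIGH : LOW);
  }
}

void setup() {
  pinMode(RELAY_PIN, OUTPUT);
  digitalWrite(RELAY_PIN, LOW);

  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) {
    delay(250);
  }

  artnet.begin();
  artnet.setArtDmxCallback(onDmxFrame);
}

void loop() {
  artnet.read();  // parse incoming Art-Net packets and fire the callback
}
```

With the valve sitting on a DMX channel, the air blast can be cued from anything that speaks Art-Net, which is what made the protocol so convenient here.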

The lights came together similarly. I had planned to use small LED PAR stage lights I had available, creating custom Art-Net to DMX boards, as commercial versions of these systems cost in the hundreds of pounds. I had this working with sACN, a protocol similar to Art-Net but with a much less well documented openFrameworks addon, so I switched the lighting to a better-documented system of LED tape. This allowed me to pre-program various scenes on the Arduino beforehand and use a single DMX channel to switch between them. As a bonus, it also took the burden of heavy DMX generation away from the main openFrameworks program, since the lighting data no longer needed to be generated by openFrameworks. This was accomplished with the same Art-Net Arduino library as the relay, but this time driving a string of WS2812B LEDs through FastLED.
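The scene-switching approach can be sketched in the same way, again assuming the ArtnetWifi library plus FastLED (the real code is in the ESPArtNet-FastLED repository below); the data pin, LED count, channel and the scenes themselves are placeholders.

```cpp
// Illustrative ESP8266 sketch: one DMX channel selects a pre-programmed FastLED scene.
#include <ESP8266WiFi.h>
#include <ArtnetWifi.h>
#include <FastLED.h>

const char* ssid     = "network-ssid";   // placeholder credentials
const char* password = "network-pass";

const int DATA_PIN    = 2;    // placeholder WS2812B data pin
const int NUM_LEDS    = 60;   // placeholder strip length
const int DMX_CHANNEL = 2;    // 1-based channel carrying the scene number

CRGB leds[NUM_LEDS];
ArtnetWifi artnet;
uint8_t scene = 0;

void onDmxFrame(uint16_t universe, uint16_t length, uint8_t sequence, uint8_t* data) {
  if (DMX_CHANNEL <= length) {
    scene = data[DMX_CHANNEL - 1];  // remember the most recently requested scene
  }
}

void showScene(uint8_t s) {
  // Example pre-programmed looks; the value ranges are arbitrary.
  if (s < 85) {
    fill_solid(leds, NUM_LEDS, CRGB::Black);                   // idle
  } else if (s < 170) {
    fill_solid(leds, NUM_LEDS, CRGB(255, 40, 0));              // warm wash
  } else {
    bool on = (millis() / 50) % 2;                             // crude white strobe
    fill_solid(leds, NUM_LEDS, on ? CRGB::White : CRGB::Black);
  }
  FastLED.show();
}

void setup() {
  FastLED.addLeds<WS2812B, DATA_PIN, GRB>(leds, NUM_LEDS);
  WiFi.begin(ssid, password);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  artnet.begin();
  artnet.setArtDmxCallback(onDmxFrame);
}

void loop() {
  artnet.read();
  showScene(scene);
}
```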

Finally, the receipt printers. The major leap here was building an image in openFrameworks and having it print out to a networked printer. I had used a guide on Adafruit to network my receipt printer for a previous project. The solution ended up being slightly odd in that it involved drawing the contents of the receipt to the screen for one second while it was screen-grabbed. The screengrab was then saved as a PNG file in the sketch directory. Lastly, I wrote a batch script which used the command-line printing utility lpr to print that file to the networked printer. Perhaps not the most elegant solution, but a rather interesting one, I thought.
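Roughly, the mechanism looks like the snippet below. This is an illustration rather than the exact code: the queue name and file name are placeholders, and in the project the lpr call lived in a separate script rather than being invoked directly.

```cpp
// Illustrative openFrameworks snippet: grab the drawn receipt, save a PNG,
// then hand the file to lpr for the networked CUPS printer.
#include "ofMain.h"

void printReceipt() {
    // Assumes the receipt layout has just been drawn to the screen.
    ofImage grab;
    grab.grabScreen(0, 0, ofGetWidth(), ofGetHeight());  // copy the frame buffer
    grab.save("receipt.png");                            // saved into bin/data by default

    // "thermal" is a placeholder CUPS queue name for the networked receipt printer.
    ofSystem("lpr -P thermal " + ofToDataPath("receipt.png", true));
}
```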

Once I had all the extra components built, I added printReceipt() and setLighting() functions to the code to control these systems. printReceipt() worked as described above, drawing then screen-grabbing the receipt before running the script via the system() command. setLighting() was a further simplification of the already simple Art-Net addon I was using, meaning I could litter the code with calls to setLighting() to make sure the lighting always reflected the state the program was in.
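I won't reproduce the addon's API here, but the sending side of setLighting() boils down to something like the hypothetical sketch below, which hand-assembles an ArtDmx packet and sends it with ofxNetwork's ofxUDPManager; the class, channel mapping and addressing are illustrative only.

```cpp
// Hypothetical Art-Net sender: one small ArtDmx frame per state change.
#include "ofMain.h"
#include "ofxUDPManager.h"

class LightingSender {
public:
    void setup(const std::string& espIp) {
        udp.Create();
        udp.Connect(espIp.c_str(), 6454);  // Art-Net's fixed UDP port
        udp.SetNonBlocking(true);
    }

    // scene: value for the LED scene channel, blast: value for the relay channel.
    void setLighting(uint8_t scene, uint8_t blast) {
        uint8_t dmx[4] = { blast, scene, 0, 0 };         // placeholder channel mapping
        std::vector<uint8_t> pkt;
        const char id[8] = {'A','r','t','-','N','e','t','\0'};
        pkt.insert(pkt.end(), id, id + 8);               // packet ID
        pkt.push_back(0x00); pkt.push_back(0x50);        // OpCode ArtDmx (0x5000, low byte first)
        pkt.push_back(0);    pkt.push_back(14);          // protocol version 14
        pkt.push_back(seq++); pkt.push_back(0);          // sequence, physical
        pkt.push_back(0);    pkt.push_back(0);           // universe 0 (SubUni, Net)
        pkt.push_back(0);    pkt.push_back(sizeof(dmx)); // data length hi/lo
        pkt.insert(pkt.end(), dmx, dmx + sizeof(dmx));
        udp.Send(reinterpret_cast<const char*>(pkt.data()), static_cast<int>(pkt.size()));
    }

private:
    ofxUDPManager udp;
    uint8_t seq = 1;
};
```

Being able to call something this small from anywhere in the state machine is what kept the booth's lighting and air cannon in step with the on-screen interface.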

One unforeseen issue I encountered near the end of the process was Art-Net senders clashing. As I doubled up on everything to increase throughput, I had two sets of relays and lighting to control. I assigned them all separate channels and this worked, although any time one machine updated the DMX universe, the other booth's lighting and relay would return to default, as each ESP listened to whichever machine had most recently sent packets. I ended up fixing this by hard-coding the IP addresses of the ESPs into each program, so each ESP only heard from the machine it was assigned to. Through the troubleshooting I learned plenty about the relative merits of unicast, multicast and broadcast addressing.
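In code terms the fix is simply unicasting: continuing the hypothetical LightingSender above, each booth PC connects only to the static IPs of its own ESPs (the addresses shown are placeholders) instead of broadcasting to the whole subnet.

```cpp
// Booth A's setup: unicast only to its own ESPs (placeholder addresses).
LightingSender relaySender, ledSender;
relaySender.setup("192.168.1.21");  // booth A's air-cannon relay ESP
ledSender.setup("192.168.1.22");    // booth A's LED-tape ESP
// Broadcasting to 192.168.1.255 would also reach booth B's ESPs,
// which is exactly the clash described above.
```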

If I had to add anything, high on the list would be a hidden reset button somewhere in the program or the booth that would clear abandoned sessions without me having to play out the whole procession of events, which sometimes spoiled the surprise for the next person.

Outcome

Overall I was incredibly pleased with the reception of my work; at the private view and on subsequent days there was frequently a queue to enter one of the two booths. After people exited, it was a common sight to see them comparing notes on how they had answered and where the algorithm had placed them on the spectrum. Some even repeated the experience aiming for different outcomes. One thing I thought would be more prevalent, given the simplicity of the algorithm, was people disputing the result they were given. Astonishingly, most people, if not pleased with the outcome, reluctantly agreed that deep down the algorithm was pretty much correct. If anything, this has made me even more uneasy about the potency of the predictions that must be possible with data on the scale available to AggregateIQ or Cambridge Analytica.

Code

For this project I wrote C++ openFrameworks code and C/C++ code for the Arduino boards.

The Github Repositories are linked below.

Frontend interface running on the touch-screen PCs : https://github.com/RobHallArt/FMPFrontend

Backend software used to classify content manually : https://github.com/RobHallArt/FMPContentClassifier

Arduino code to control the relay using Art-Net : https://github.com/RobHallArt/ESPArtNetRelay

Arduino code to control the LEDs using Art-Net : https://github.com/RobHallArt/ESPArtNet-FastLED

Printer code using a CUPS print server, set up following this guide : https://learn.adafruit.com/networked-thermal-printer-using-cups-and-raspberry-pi/network-printing