For my cloud text experiment, I decided to create a Twitter bot using the class tutorial. Though I had a small issue at the beginning, I found it relatively easy to pull together using the template scripts, and you can see it running on my Twitter account at https://twitter.com/AndrewDunnLIS (this is an old account I created back during library school).
For this experiment, I decided to focus on the #NotNormal hashtag, used on Twitter and Facebook in connection with racist incidents and other examples of extreme behavior and opinions by Trump and his associates. The point of the hashtag is to prevent him, his policies, and his administration from becoming normalized by the media. I got the idea from a strategy guide published by Congressman Jerry Nadler (http://www.jerrynadler.com/news-clips/how-we-resist-trump-and-his-extreme-agenda). The guide is fascinating and very helpful, and so I encourage everyone to check it out.
To construct the Twitter bot, I spent two days “listening” to and scraping tweets from the #NotNormal feed. In the end, I had about 260 non-duplicate tweets, which I collected in a text file that serves as the bot's primary corpus. Most of the tweets have multiple hashtags, and most link to news articles or infographics that explain what the authors find so outrageous and not normal. In building this library, I decided to delete the handles of specific people (@_____); I'd rather not get embroiled in specific feuds, if that's possible, and instead just amplify political statements. In essence, then, the bot is retweeting things from the last 48 hours. (A rough sketch of the collection and cleanup step is at the end of this post.)

While completing this project, by the way, I was struck by how frequently people simply retweet things in their feed without taking what I think is the necessary time to reformulate thoughts in their own words, or to really process, digest, and reflect critically on the arguments they're promoting. To a large degree, these are structural limitations built into the interface: most importantly, the character limit, which people can only bypass by including links. Still, now that I'm doing the same thing through automation, it strikes me as strange that so much of our current model for public discourse is automatic, and literally equivalent to something a program could do. (On top of that, I have some reservations about a project in which I'm just retweeting links to news sources and stories that I have not vetted, so I doubt I'll leave this running for long. And because the links concern breaking news, they'll fall out of relevance shortly.)

I'm starting to think that it would be really interesting to map how a single tweet spreads through the Twittersphere. I know there are programs that do that, and if anyone has recommendations I would love to see them, especially now that I have an API access key.
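For anyone curious about the mechanics, here is a minimal sketch in Python of the cleanup-and-posting cycle I described above. The file names (raw_tweets.txt, corpus.txt), the @-handle regex, and the use of the tweepy library are my own illustrative assumptions rather than the actual class template scripts, and the credentials are placeholders for your own API keys.

    import random
    import re

    import tweepy  # assumption: the class template may use a different library

    # Placeholder credentials -- substitute your own API access keys.
    CONSUMER_KEY = "YOUR_CONSUMER_KEY"
    CONSUMER_SECRET = "YOUR_CONSUMER_SECRET"
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
    ACCESS_TOKEN_SECRET = "YOUR_ACCESS_TOKEN_SECRET"

    # Step 1: clean the scraped tweets (assumed to be one per line in raw_tweets.txt).
    with open("raw_tweets.txt", encoding="utf-8") as f:
        raw = [line.strip() for line in f if line.strip()]

    seen = set()
    cleaned = []
    for tweet in raw:
        # Strip @-handles so the bot amplifies statements without
        # dragging specific people into its feed.
        text = re.sub(r"@\w+", "", tweet)
        # Collapse the whitespace left behind and skip duplicates.
        text = re.sub(r"\s+", " ", text).strip()
        if text and text not in seen:
            seen.add(text)
            cleaned.append(text)

    # Step 2: save the de-duplicated corpus the bot draws from.
    with open("corpus.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(cleaned))

    # Step 3: post one randomly chosen line, trimmed to the character limit.
    auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
    auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
    api = tweepy.API(auth)
    api.update_status(random.choice(cleaned)[:140])

In practice, a scheduler (cron, or whatever loop the template provides) would run the posting step on a regular interval; I've left that out to keep the sketch short.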