I am starting a new job in November. This is not a prank like last time. But before the grand reveal of where, I’ll first subject you to a lengthy blog post about the how and why. Hopefully this provides an additional perspective to the excellent posts by Lana Yarosh and Jason Yip about their experiences on the computer/information science academic job market. But those of you who know the rhythms of the academic job market have already realized (spoiler alert) that I’m not starting a tenure-track faculty role. Instead, I’m going to spend the next few years being a data scientist. But I definitely promise not to be this guy:
Last week, the Proceedings of the National Academy of Sciences (PNAS) published a study reporting a large-scale experiment on Facebook. The study’s authors included an industry researcher from Facebook as well as academics at the University of California, San Francisco and Cornell University. The experimental design reduced the amount of positive or negative emotional content in 689,000 Facebook users’ news feeds to test whether emotions are contagious. The study has since spawned substantial controversy about the methods used, the extent of its oversight by academic institutions’ review boards, the nature of participants’ informed consent, the ethics of the research design itself, and the need for more explicit opt-in procedures.
What makes misinformation spread? It’s a topic of vital importance, with empirical scholarship going back to the 1940s on how wartime rumors spread. Rumors, gossip, and misinformation are pernicious for many reasons, but because they often reflect deeply-held desires or are reasonably plausible, they are hard to stay ahead of or rebut. I have an interest in the spread of misinformation on social media and have published some preliminary research on the topic. So it was fascinating to witness misinformation spread like wildfire through my own academic community, because it speaks to our data-driven anxieties and dreams.