Detection of Crowd Manipulation in Social Media
Navy STTR 2019.A - Topic N19A-T024
ONR - Mr. Steve Sullivan - [email protected]
Opens: January 8, 2019 - Closes: February 6, 2019 (8:00 PM ET)
TECHNOLOGY AREA(S): Information Systems

ACQUISITION PROGRAM: Distributed Common Ground/Surface System-Marine Corps (DCGS-MC)

OBJECTIVE:
Develop information stream analysis models and analytic tools to detect, characterize, and visualize computational propaganda, exposing influence campaigns that target the emotions of anger, hate, fear, and disgust. Ensure that the proposed capability can indicate and analyze influence campaigns in progress and evaluate their potential impact on target audiences.
DESCRIPTION: Operating in the information environment today is highly challenging for Navy, Marine Corps, and other military warfighters. The information environment includes multiple platforms, social communities, and topic areas that are polluted with disinformation and with attempts to manipulate crowds, spread rumor, and instigate social hysteria. Polarization of crowds is a significant problem, with nation-state actors conducting malicious campaigns to spread and amplify civil discontent and chaotic social dynamics, usually by manipulating the emotional mood of crowds. Hate, anger, disgust, fear, and social anxiety are heightened using computational propaganda. Current "sentiment" models are poorly suited to measuring emotional content in online media, and their measures are not well synchronized with measurements of manipulation by information actors intent on subverting civil discourse and discrediting the messages of civil authorities.
PHASE I: Develop prototype algorithms, models, and tools that use Government-supplied synthetic data, supplemented by case studies, to demonstrate a proof of concept: identify computational propaganda content and the emotional valences of messages on Twitter, including indicators of manipulation, and provide the capability to segment actor communities (i.e., botnet, bot-enhanced, human), as in the sketch below.
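For illustration only, the following is a minimal sketch of one way actor communities might be segmented from simple per-account features (posting cadence, duplicate-content ratio, coordination overlap). The features, thresholds, and class boundaries are assumptions for demonstration, not topic requirements.

# Illustrative sketch only: a simple heuristic for segmenting actor
# communities into botnet / bot-enhanced / human classes. The features
# and thresholds below are assumptions, not topic requirements.
from dataclasses import dataclass

@dataclass
class AccountStats:
    posts_per_hour: float      # mean posting cadence
    duplicate_ratio: float     # fraction of near-duplicate messages
    coordination_score: float  # 0..1 overlap with other accounts' timing/content

def segment_actor(stats: AccountStats) -> str:
    """Classify an account as 'botnet', 'bot-enhanced', or 'human'."""
    if stats.duplicate_ratio > 0.8 and stats.coordination_score > 0.7:
        return "botnet"        # heavily automated, coordinated amplification
    if stats.posts_per_hour > 10 or stats.duplicate_ratio > 0.4:
        return "bot-enhanced"  # human-run but amplified by automation
    return "human"

print(segment_actor(AccountStats(posts_per_hour=25.0,
                                 duplicate_ratio=0.9,
                                 coordination_score=0.8)))  # -> botnet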
Integrate simple models of emotions (such as Ekman's model) and consider using more sophisticated, finer-grained models (such as Russell's model with Scherer's updates). Note: these models are illustrative; developers are free to use other models of emotions. A minimal sketch combining a categorical and a dimensional model follows.
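For illustration only, a minimal sketch of combining a categorical model (Ekman's six basic emotions) with a dimensional one (valence/arousal placements in the spirit of Russell's circumplex). The keyword lexicon and coordinates are assumed values for demonstration; a fielded system would learn them from data.

# Illustrative sketch: score a message against Ekman's six basic emotions
# with a toy keyword lexicon, then map the top category onto approximate
# valence/arousal coordinates in the spirit of Russell's circumplex.
# Lexicon entries and coordinates are assumptions for demonstration only.
EKMAN_LEXICON = {
    "anger":     {"outrage", "furious", "traitor"},
    "disgust":   {"vile", "sickening", "corrupt"},
    "fear":      {"threat", "panic", "danger"},
    "sadness":   {"grief", "loss", "mourn"},
    "happiness": {"joy", "celebrate", "win"},
    "surprise":  {"shocking", "unbelievable", "sudden"},
}

# Rough (valence, arousal) placements; real systems would fit these to data.
CIRCUMPLEX = {
    "anger": (-0.6, 0.8), "disgust": (-0.7, 0.4), "fear": (-0.8, 0.7),
    "sadness": (-0.7, -0.4), "happiness": (0.8, 0.5), "surprise": (0.2, 0.9),
}

def score_message(text: str):
    tokens = set(text.lower().split())
    counts = {emo: len(tokens & words) for emo, words in EKMAN_LEXICON.items()}
    top = max(counts, key=counts.get)
    if counts[top] == 0:
        return None  # no emotional signal detected by this toy lexicon
    valence, arousal = CIRCUMPLEX[top]
    return {"emotion": top, "valence": valence, "arousal": arousal}

print(score_message("This is a sickening threat and panic is everywhere"))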
Ensure that the prototype successfully identifies sets of messages, gists, and stories; determines their emotional content in a general sense; estimates whether these sets are likely to represent manipulated discourse; and visualizes the discourse by gist (topic) and story (such as URL), as in the grouping sketch below.
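For illustration only, a minimal sketch of grouping labeled messages into stories by shared URL and summarizing each story's emotion distribution, one possible input to the gist/story visualization described above. The record fields are assumptions.

# Illustrative sketch: group messages into "stories" by shared URL and
# summarize the emotion labels seen in each group. Record fields are
# assumptions for demonstration only.
from collections import Counter, defaultdict

messages = [
    {"text": "...", "url": "http://example.com/a", "emotion": "anger"},
    {"text": "...", "url": "http://example.com/a", "emotion": "anger"},
    {"text": "...", "url": "http://example.com/b", "emotion": "fear"},
]

stories = defaultdict(list)
for msg in messages:
    stories[msg["url"]].append(msg["emotion"])

for url, emotions in stories.items():
    dist = Counter(emotions)
    # A skewed, single-emotion distribution over many near-identical posts
    # could be one weak indicator of manipulated discourse.
    print(url, dict(dist))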
Develop a Phase II plan.

PHASE II: Develop the models of emotion and propaganda so that they can identify computational propaganda along with its emotional valence and arousal state. Estimate the degree of artificial manipulation present in gists and stories found in live information streams from Twitter, websites, and blogs. Ensure that model results are exportable to other tools (such as social network tools, visualization tools, databases, and dashboards); a minimal export sketch follows this paragraph.
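For illustration only, a minimal sketch of exporting scored results as JSON Lines, a line-delimited format that databases, dashboards, and network-analysis tools commonly ingest. Field names are assumptions for demonstration.

# Illustrative sketch: export scored messages as JSON Lines so downstream
# tools (network analysis, dashboards, databases) can ingest the results.
# Field names are assumptions, not a required schema.
import json

scored = [
    {"id": "1", "story_url": "http://example.com/a", "emotion": "anger",
     "valence": -0.6, "arousal": 0.8, "manipulation_score": 0.72},
]

with open("scored_messages.jsonl", "w", encoding="utf-8") as f:
    for record in scored:
        f.write(json.dumps(record) + "\n")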
Make available to the Navy a user-friendly, working prototype with built-in help capabilities for testing and evaluation in a cloud-based environment by multiple users, in the context of an online military virtual tabletop, as the final technical demonstration of this project. Conduct and complete model development and validation prior to Phase III.
PHASE III DUAL USE APPLICATIONS: Apply the knowledge gained in Phase II to further develop the interface, capabilities, and training components needed to transition the technologies to military customers. Make the technologies available on an existing cloud platform of the customer's choosing (e.g., SUNNET, Navy Tactical Cloud, Amazon Cloud), working with the cloud owners to deliver a subscription-based tool interoperable with other tools in enclave settings. Expand and develop the model to cope with real-time information flows and evolving information tactics.

REFERENCES:
1. Langroudi, George, Jordanous, Anna, and Li, Ling. "Music Emotion Capture: Sonifying Emotions in EEG Data." Symposium on Emotion Modeling and Detection in Social Media and Online Interaction, 5 April 2018, University of Liverpool. https://www.emotiv.com/independent-studies/music-emotion-capture-sonifying-emotions-in-eeg-data/

2. Harvey, Robert, Muncey, Andrew, and Vaughan, Neil. "Associating Colors with Emotions Detected in Social Media Tweets." Symposium on Emotion Modeling and Detection in Social Media and Online Interaction, 5 April 2018, University of Liverpool. https://docplayer.net/82902361-Symposium-on-emotion-modelling-and-detection-in-social-media-and-online-interaction.html

3. D'Errico, Francesca and Poggi, Isabella. "The Lexicon of Being Offended." Symposium on Emotion Modeling and Detection in Social Media and Online Interaction, 5 April 2018, University of Liverpool. https://www.researchgate.net/publication/326096901_The_lexicon_of_feeling_offended

4. Badugu, Srinivasu and Suhasini, Matla. "Emotion Detection on Twitter Data Using Knowledge Base Approach." International Journal of Computer Applications, Vol. 162, No. 10, March 2017. https://pdfs.semanticscholar.org/6698/5a996eab1e680ffdd88a4e92964ac4e7dd56.pdf

5. Agarwal, Nitin, Al-Khateeb, Samer, et al. "Examining the Use of Botnets and Their Evolution in Propaganda Dissemination." Defence Strategic Communications, Vol. 2, Spring 2017. https://www.stratcomcoe.org/nitin-agarwal-etal-examining-use-botnets-and-their-evolution-propaganda-dissemination

6. van Dijck, José and Poell, Thomas. "Understanding Social Media Logic." Media and Communication, Vol. 1, Issue 1, August 2013, pp. 2-14. https://www.cogitatiopress.com/mediaandcommunication/article/view/70/60

KEYWORDS: Social Media, Computational Propaganda, Crowd Manipulation, Social Hysteria, Rumor