Leveraging Propaganda as Social Engineering

Mark Honeycutt Compliance, Governance, Investigation, Management, Network, Risk, Security, Strategy

Editorial preface

The article below was written by Mark Honeycutt, and is the first in a series that promises to be an excellent read.

Mark is the owner of Shark Cybersecurity and a Social Engineering expert who specialises in Social Engineering penetration testing engagements, followed up with employee training.  He has a Master’s Degree in Rhetoric, where he studied Propaganda Theory and Persuasive Discourse.  During his doctoral work, he shifted his focus to Social Engineering, where he explored “Hacking Humans.”  He also maintains keen interests in algorithm design, deep learning, and human-machine interaction.

The comments section at the foot of this article will be open for any comments or questions you may have.


When President Trump visited Poland in July, he said that the West must “meet new forms of aggression, including propaganda, in new ways and on all new battlefields”. This was no more than six weeks before the DoD initiated the elevation of U.S. Cyber Command to a Unified Combatant Command.  There is no doubt that we are in the beginning stages of a potential global cyber war with the likes of Russia, Iran, North Korea, or China.  It’s just a matter of time before someone launches an attack that makes Stuxnet look like a tiny experiment.

I’m not going to write about the network side of this coming conflict.  There are far too many people much more qualified than I am to address the intricacies of that scenario.  I am, however, highly qualified to talk about another form of hacking that has only recently come to public attention following the U.S. and French elections – state-sponsored fake news and troll-brigade propaganda.

Before I go further, let me take a second to explain why I am in a unique position to discuss this issue.  Twenty-five years ago, I started graduate school to study Rhetoric and Communication.  By the time I finished my Master’s Degree, I’d found a speciality in Propaganda Theory and what the Ancient Greeks called “The Art of Persuasion.”  After some time, I began to work on my PhD in Rhetoric and Communication.  Enough time had gone by that I was able to branch off into a new field called Media Ecology, where I studied the role of persuasive discourse in an online environment.  Of course, that eventually morphed into Social Engineering in a practical setting.  It is due to this background that I feel uniquely qualified to discuss such issues.

It is also important that I clarify a few concepts that are not currently connected in tech circles’ discussions of Social Engineering.  I recently had a good chat with someone who was equally interested in Social Engineering.  He was full of questions, and my answers led him to further questions which I was more than happy to discuss.  Before we had our talk, I’m pretty sure that he held the same view that has been preached for a decade in tech circles regarding Social Engineering – that it’s limited to a few people who try to fool others into opening an attachment or giving them access to the server room.  In this regard, it’s akin to the con artist who’s lying his way into a widow’s bank account.

This narrow definition is largely confined to the tech community.  For those of us who have studied its 2,000-year-old history, we know that Social Engineering has a far more expansive definition that covers not only one-on-one discourse but also mass discourse.  Until twenty years ago, mass discourse was more limited in scope, but thanks to studies in Media Ecology, it now includes the Internet in all of its powerful glory.

Without going into a long-winded definition of what Social Engineering actually means, I’m going to keep it simple:  Social Engineering is any attempt to persuade an individual or a large group of individuals toward another truth.  To be successful, the Social Engineer must establish belief, trust, and authority on the subject or situation, and must not only change the other party’s truths but embed the new ones deeply as beliefs.  To do this, an entire host of powerful rhetorical devices is at the Social Engineer’s disposal.  These rhetorical devices have been used since Gorgias discussed them in the 5th century BCE and can be traced through the years.  Most people think of them when they talk about Adolf Hitler and Josef Goebbels.  But the most powerful examples of this type of manipulation have had their roots in Western culture since the turn of the 20th Century.  Therefore, we can look at Social Engineering as an umbrella term covering both Propaganda and the standard tech definition of Social Engineering.

The heart of this discussion isn’t in the techniques though.  I would be remiss to let this audience think that.  No, the blood of Social Engineering is pumped through the concept of Truth vs. truths.  Although there is a very interesting field devoted to this very concept as it applies to culture, society, and communication (Social Semiotics), for this discussion, it’s only necessary to understand the difference between the two terms.  

Truth (with a capital ‘T’) implies that there is only one Truth, and that every human who has ever lived either accepts this Truth or rejects it.  Please take all religious belief-systems out of this discussion; in this sense, we are referring not to God or gods but to perceptions of reality.  The notion of truths (with a small ‘t’) holds that there is no one central truth among individuals, family units, neighborhoods, communities, etc.  It holds that each individual perceives his/her own truths, with the understanding that other people have their own truths as well.  Groups of people, therefore, may share values and beliefs (truths), but they will never have a Truth, because our perceptions are based upon many things, including individual life experiences.

Why is this important for a Social Engineering discussion?  It’s important because propaganda and local Social Engineering are both built upon this very concept.  The rhetorician (propagandist, social engineer, con artist, etc.) must understand that truths are hackable.  They are extremely hackable.  In fact, they are the easiest things in the world to hack if you have a good understanding of certain principles and techniques.

I’m not here to talk about 20th Century propagandists.  The Nazis were empowered by propaganda.  US corporations and the US government used it effectively (and still do).  If you really get into the dirty details, it’s clear that we make few decisions today that we are not led to make.  It’s almost impossible to have a defined set of truths because we are constantly pushed and pulled one way or another to believe something different.  Much of this is because of blind faith.  More of it is emotion.  And a whole lot of it is because we’re so busy that we don’t want to think anymore.

Russia has long been a culture that values the study of Rhetoric.  This has put the Russians on the leading edge of a new type of propaganda and Social Engineering.  Their “Web Brigade,” or, as the Western media have called it, “The Russian Troll Army,” set up shop on the Internet using sockpuppets for both fake news media and aggressive or persuasive trolling.  This is a new layer of propaganda because truths have been further displaced by false narratives and “facts” that make it difficult for anyone to develop their own personal truths.  It is wholly an emotional argument that preys on everyone because, as we know, our emotions create strength in our ideals.

But it doesn’t stop there.  To counter this propaganda, the Western allies have initiated counter-propaganda techniques aimed at the same audience the Russian fake-news propagandists are targeting, and this has created mass confusion and distrust.  Thus, this large-scale Social Engineering experiment is beginning to implode, and that is what we should be most concerned about.

The Internet began as a haven for sharing information, and Google has made this information available to everyone.  However, we live in a time when we cannot discern a perceivable truth from an outright lie.  The extremes on every side are growing stronger because they have accepted that the only way to fight this lack of “knowing truths” is to form groups who accept a singular Truth by which all things are perceived.  Much of our social unrest stems from these movements by way of an ideology that can be shared.  They are all emotional ideologies, and that is counter to what a healthy society should be.

In my next article, I’m going to discuss in specifics how all of this impacts the tech culture directly.  From the “Information Highway” to privacy and anonymity, tech is slowly finding itself at the heart of a major war that won’t just define a nation-state.  It’s a war that will define the future of humanity.

About the Author
Mark Honeycutt

Mark Honeycutt is the owner of Shark Cybersecurity and a Social Engineering expert who specializes in Social Engineering penetration testing engagements, followed up with employee training. He has a Master’s Degree in Rhetoric, where he studied Propaganda Theory and Persuasive Discourse. During his doctoral work, he shifted his focus to Social Engineering, where he explored “Hacking Humans.” He also maintains keen interests in algorithm design, deep learning, and human-machine interaction.

Comments
Mark Cutting

Great article Mark, and some excellent points. I particularly find the discussion around the “rhetorician” very interesting – ultimately, everything is “hackable”, and by definition, that includes the truth. By simply changing the shape of this truth, a cyber criminal is able to extort information that he or she may not necessarily have access to without investing time and effort. We as humans need to learn that not everything is as it seems – far from questioning literally every decision we take, it certainly yields positive results to question why you’re receiving a Facebook invite to an address that isn’t associated with your account. Similarly, blindly clicking on links is no longer acceptable – we are all accountable.

Mark Cutting

When are you planning to release the second article in the series?


Mark Cutting

This is a really good point – education here is key if security in any form is to succeed.