Transcript of Currents 044: Zak Stein on Propaganda and the Information War

The following is a rough transcript which has not been revised by The Jim Rutt Show or by Zak Stein. Please check with us before using any quotations from this transcript. Thank you.

Jim: Today’s guest is Zak Stein. Zak’s been on the show a couple, three, four times before, I think. Some very interesting episodes. On what was it? Hierarchical complexity, I think, was the last one, which was very interesting. And then we did three episodes on his very interesting book called Education in a Time Between Worlds.

Jim: Zak’s a writer, an educator, and a futurist working to bring a greater sense of sanity and justice to education. Now that is a good concept. That’s a good idea, right? Not too many other people seem to be doing that. He’s also one of the key folks at the Consilience Project, which is focused on research, publication, and the building of a decentralized movement towards enhanced collective intelligence. You can find more about the Consilience Project at consilienceproject.org and, of course, as always, those links and anything else that we reference in the podcast will be available at the episode page at jimrutt.com.

Jim: In the interest of full disclosure, I should say that I am an advisor to the Consilience Project and I’ve had a relationship with the project in one form or another since before it started. So if I’m not as mean and nasty as I usually am, you’ll know why. So, welcome, Zak.

Zak: It’s great to be here, Jim. It’s great to be back.

Jim: Yes, I think I’m looking forward to this conversation. This is some important stuff.

Zak: Yeah.

Jim: But today we’re going to talk about information wars and propaganda as discussed in two recent Consilience Project essays.

Jim: And I decided after reading them both, I thought about how they’re related. So I’m going to spend maybe a little less time than I might otherwise do on the information wars part. And then maybe a little more time than I might otherwise do on the propaganda part, on the theory that learning the nature of propaganda may be something our listeners can actually put to use in helping us create a better information ecosystem. So with that little preamble, let’s jump in.

Jim: The first essay is called It’s a Mad Information War. I mean, the way you guys described this, it’s not so good. And I start off by saying that the article paints a picture of massive and sophisticated information warfare, including between nation states, between political parties. And you give the example that any citizen living in a swing state is constantly barraged by people trying to manipulate us. What else can you tell us about the scale and scope of the current information war?

Zak: Yeah, so I mean, it’s important to understand that there’s this long, complex history of informational warfare, right? Sun Tzu wrote about it. You have examples of it in ancient Egypt. It’s something humans have always done as a supplement to physical warfare: they’ve waged information war. And information war was always bound by the communication media, right? So Ramesses II built huge obelisks on which he basically carved examples of his brutality and of his victories and placed them all over the place, basically saying, “I’m frightening you before you ever encounter me.” And similarly, with D-Day, when we began our entry into that world war, we dropped hundreds of thousands of pamphlets across the landscape so that the opposing forces knew we were coming and were afraid that we were coming.

Zak: So information warfare has always been a part of and an accompaniment to physical warfare. But there was a flip, and I talk about this in the paper explicitly, with Eisenhower and psychological warfare as instituted during the Cold War, and the mobilization not just of the printing press, and not just of radio, but also of television, a large academic apparatus, and the entertainment industry. This huge multi-pronged approach, which people like Edward Bernays and others started to rebrand as public relations. And what happened was we got the wide, large-scale institutionalization of these habits of manipulative communication. And again, it was one thing when they accompanied physical warfare.

Zak: But when they were instituted in their own right and actually came to be more important than the things taking place in physical warfare, and this is a complicated argument I lay out, you ended up getting the supremacy of the behavioral sciences as an aspect of the military-industrial complex, and large, very widespread efforts to build increasingly sophisticated mechanisms for manipulative communication.

Zak: And so we’ve crossed the threshold now. And this is the argument in the paper: all sides in the information war have such powerful informational weapons that they should be thought of almost as weapons of mass destruction. And we’re reaching a point where these propaganda campaigns competing with one another are destroying the landscape to such an extent that no one can possibly win the culture war. The culture war is a self-terminating war, a war of mutually assured destruction, because we’ve escalated the informational warfare to such a level of sophistication.

Zak: And let me pause and say one thing, which is that I’m a psychologist and an educator by training. And so it was actually interesting for me to get deeply engaged in the research around propaganda, because it actually was in my wheelhouse: I’m studying basically applied psychology as it could be institutionalized, right? Which is education and psychology coming together. And so I realized that, yeah, you know, propaganda is kind of the evil twin of education. In many ways, it suffuses the environment just like education does.

Zak: And it, in many ways, can masquerade as education, but in fact is different, and we can identify those structures. And we’ll do that in a minute. But the point about the information war is that we’ve crossed the threshold where we can’t just keep doing this the way we used to, because the weapons are too big. Just like with kinetic warfare, we eventually reached a point where we said, whoa, we can’t drop bombs anymore, guys, because literally the whole planet will be destroyed. We’re making a similar effort at the Consilience Project to make that point clear: this is a key theoretical thing to get, that the same is true in an information war. You can reach thresholds of sophistication of informational weaponry where you’re destroying a whole population’s capacity to collaborate and make choices. And then you’ve crossed thresholds where, if you’re both doing that to each other, you’re assuring the totalized destruction of culture, and nobody wins that; everybody loses.

Jim: And as you know, I tend to look at things from an evolutionary and game-theoretic perspective, and we should not expect anything else, right? Essentially one side ratchets it up, the other side responds. I mean, this is the nature of arms races, call it a race to the bottom or a race to the top, whether you’re an arms merchant or whether you’re a victim. And of course we’ve had some trends that have accelerated that, actually two that came on more or less simultaneously. One is the ever-increasing reach of mass media, right? And the second one, perhaps not so well known to the public, is the very massive increase in real knowledge of psychology and cognitive science. Psychology kind of got off into a weird, strange land of behaviorism for a long time, but starting, I don’t know, in the late fifties, early sixties, we started to get real information again about the mind. And combine the two.

Jim: And then combine that with arms-race dynamics, and it’s no surprise that weapons that are the psychological equivalent of Hiroshima bombs can be created. And you make a very interesting point in the paper: maybe we have Hiroshima bombs today on our networks. But imagine we’re in a world of pervasive virtual reality and augmented reality. In fact, my most recent podcast is on the metaverse with Rob Tercek, where we talk a little bit about what the world might look like when the biggest companies on earth are trying to drive people to spend as much time as possible in a very high-fidelity virtual world. We may go from Hiroshima-level information warfare to hydrogen-bomb-level information warfare. And you know, the world would actually have survived a war fought with a couple of hundred Hiroshima bombs, but the world would not survive, or at least advanced world civilization would not survive, a war fought with hydrogen bombs.

Jim: And we may be at that stage right now, maybe like 1950 in terms of kinetic warfare, and that’s worth considering. The other thing that I thought was very interesting, and this is a point that most people forget: you guys say the idea that any group of leaders is immune to the cognitive and emotional distortions that they inflict upon the masses is misleading, right? Once this stuff is let loose in the world, the whole world is constantly polluted by it, and that includes the elites that let it loose. Could you say a little bit more about that? Because that’s a hugely important point.

Zak: Yeah. What’s important to understand is that in prior environments of warfare, you wouldn’t have population-centric warfare, and you wouldn’t have population-centric information warfare, which is basically where you’re trying to drive not just the enemy soldiers insane, you’re trying to drive the entire population of the other country insane, or completely misinform them, or disrupt them. And when you reach that scale of saturation of propaganda, you have to create agencies within your own government that specialize in deceptive communication. And then you expect them to tell you how they’re spending your money without deceiving you, right? So you literally incentivize the creation of experts in deception and manipulation and propaganda, and they’re part of a government bureaucracy, which depends upon trust between government agencies. Right?

Jim: A famous example of that is how the CIA hacked into the computer systems on Capitol Hill to figure out, frankly, how to promote more money for CIA budgets.

Zak: Yeah. And there are tons of examples, and of course also in the Soviet Union, and the paranoia that ends up ensuing from an insulated political class that believes they’re somehow not themselves susceptible to it. And the other thing to remember is that you have to refresh that political class as the generations turn over, which means you have to pull younger people into the mix. Have they been socialized in a completely propagandized environment? How do you then, as a strategic leader, respect the opinions that they give you, when you know what the sources of their information are? So that’s one element. The other element, which makes the situation even more complex, is that it’s social media, micro-targeted attention-capture technologies, that are delivering the propaganda, which means that even if you’re not buying the propaganda you make, which is to say you know you’re manipulating people and that’s fine.

Zak: Whatever your position is, you think you’re immune; you’re not immune if you are on social media for any amount of time, and this is important to get. So people hearing this are thinking, oh yeah, they’re right, the anti-vaxxers and the Trumpists, they’re all manipulated by propaganda, but me over here with the New York Times, I’m not, right? Then you definitely are, if you’re not reflective about the information environment in which you’re engaging and realizing that places like the New York Times and Fox and all those places have been deep in the public relations infrastructure built by the federal government for a very long time. And this is so well documented that, yeah, you have to kind of pop your head above water and realize that it’s a totalized propaganda environment.

Zak: Now, once you make the distinction between education and propaganda, you can start to see that education also exists and is out there. And so it’s about making judicious choices between them. But yeah, there’s no insulating yourself from it. And in many cases, the deeper you’re in the core of power, in something like Game A, the greater the chances that you are being manipulated, and that your best ideas and your most treasured facts and values have been engineered to be useful for you to think.

Jim: And it’s important for people to remember that there’s really good psychology research supporting this. You know, some of our listeners, who are mostly smart, well-educated folks, will say, “oh, well, you know, maybe for them numbskulls in Arkansas, they’re going to fall for all this shit.” Guess what, folks! The research shows that the more educated you are and the more intelligent you are, the more susceptible you are to confirmation bias, which is to believe things that reinforce the beliefs you already have, as opposed to being willing to look objectively at news stories. So those of you who think, well, I’m real smart, I’m real educated, this stuff doesn’t apply to me: you are dead wrong. It applies to you.

Zak: And that goes for us too, Jim.

Jim: Oh, not me. Maybe you.

Zak: But there’s a weird trick, and there’s a problem with doing propaganda research at a high level, actually. And I fell down the rabbit hole myself, and I mention this in the first paper: once you start to see information this way, and you start to realize the decades of history of sophisticated, psychologically manipulative information warfare, you start to be potentially suspicious of everything. Like everything. Including me; like maybe I’ve been paid by the CIA to say certain things about propaganda, right? So it’s like, whoa, wait, where do we stop? And can I get reliable information about past government propaganda activities, when we know that government agencies write books to dissuade people from understanding what has actually occurred? So this becomes interesting, and this is why the difference between education and propaganda is so important, and why certain forms of self-reflection and discourse and conversation need to be understood as antidotes to forms of propaganda that are otherwise very difficult to resist.

Zak: And so we can talk about that later. That’s the way one could stand forward and say, hey, that’s propaganda. You can only do that if you’re standing on some footing that is non-propagandistic, instead of creating what is happening now, just counter-propaganda, right? Which is just competing propagandas, where you know you’re not standing on anything that’s valid, but you know what you’re saying is very useful in the context of information war. And so you stand for values and facts which you don’t quite believe in, but that are useful politically, strategically. And so you get widespread strategic, manipulative communication replacing educational communication aimed at mutual understanding.

Jim: Yeah. And you know, you give some examples of how this arms race plays out, for instance, in presidential politics. Obama in 2008, and then even more so in 2012, developed quite a large amount of expertise in social media, micro-targeted manipulation of voters, but the Trumpsters took that to a whole new level. And I’m sure then, in the next cycle, it’ll be an even higher level of manipulation. So whoever moves first might have a first-mover advantage one time, but the other side will ratchet it up. That’s very important to keep in mind when we’re thinking about the dynamics here. Another point that you guys make, and this is a direct quote from the article: “society has been left teetering on the edge of mass insanity, caught up in the dynamics of mutual assured destruction, reaching the final limit of military strategies of total war.” Do you really believe that? You think we’re on the verge of mass insanity?

Zak: So it sounds like a strong claim, but it’s kind of a technical claim. When you say mass insanity, you think of people running through the streets like mad people and stuff. But that’s not what I’m describing. I’m describing something like ubiquitous, low-grade psychopathology. And when you look at the mechanisms of propaganda, and this goes way back to the twenties and thirties, when it began with the synthesis of Freud and political science, you see that the degree of pathology and neurosis in a culture makes the culture susceptible to propaganda and manipulative communication. And then that communication itself induces more neurosis and psychopathology. So it becomes a bad feedback loop.

Zak: And so, yes, I do think we are already witnessing widespread, ubiquitous psychopathology, if you look just at the mental health statistics. So that’s one simple objective fact. And during the pandemic, when we’ve seen a tremendous amount of propaganda, we’ve seen a tremendous increase in mental health problems. And these things are related. There are studies being done specifically on how the media, and propagandistic media, which is to say public health campaigns and other things that are battering people with guilt and shame and fear, induce mental illness. So, yeah, I think it’s a fair claim, but it doesn’t look like people running around like mad people. Though I’m not ruling that out.

Jim: It happens from time to time, and at an increasing rate. But yeah, so I think you have the idea of madness as a sort of dysfunction in the information sphere.

Zak: If you define insanity specifically as absence of contact with reality, then it gets even more interesting epistemologically. Which is to say, and this gets into Baudrillardian simulation and hypotheses about civilizations creating illusions that sustain them, so that their populations are drawn into fictitious understandings of their actual situation at a very broad, deep, almost systemic level. So that really crazy shit can exist and no one notices it, because it’s not part of the simulation.

Jim: And then at some point it becomes illegitimate to critique it.

Zak: Correct. Yeah. And then, so-

Jim: You must be crazy, right?

Zak: Right. But the Baudrillardian simulation model is thus an existential risk or civilizational collapse model, because it’s basically saying we get far enough away from reality to the point where we don’t even care about reality, and that’s where-

Jim: We’ll talk about that in the next section, information nihilism.

Zak: Right. Information nihilism. And again, I would define that as insane. I would define not caring whether what you say is true or false as insane. Now, if you know you’re doing that, because you’re trying to say things that are kind of true enough to get you a political advantage in a particular conversational context, and you’re leveraging the default norms set by the propagandized media to get your own advantage through the strategic use of language, that’s one thing, and it’s not quite insane. That’s more like sociopathy.

Jim: Yeah, sociopathy. So we got sociopaths, a country of sociopaths leading an army of insane people.

Zak: Of insane people. It’s possible. But the sociopathic move becomes increasingly possible and easy to make when all the language has been weaponized, and the culture has been seeded with conflict by the way the language is loaded.

Jim: You mentioned, essentially, not caring whether it’s true or not, that the truth doesn’t even matter, and that reminds me of a book I re-read last week in preparation for a series of podcasts I’m doing with John Vervaeke: Harry Frankfurt’s book “On Bullshit”. He makes this very interesting distinction between lies and bullshit. You know, when you’re a liar, you’re depending on people thinking you’re telling the truth. When you’re a bullshitter, you don’t care about truth or lies. You just say anything that is useful, right?

Zak: Yeah, totally. And what’s funny is that I’m feeling into the listener here. If you strongly identify with the left, you believe this is what the right’s doing: that they’re stuck in an illusion, that they’ll believe anything, that they’ll say anything. Right? But if you strongly identify with the right, you think this is what the left’s doing: that they’re so strategic with their communication, and the politically correct language, and the propaganda from the pandemic. And so it’s just interesting to note that both sides think the other side is in bad faith, incorrigibly, irrevocably. Right? And that’s a sign that we’re in a deeply propagandized environment, where we’re given the language to dehumanize the people we disagree with.

Jim: That’ll be a big theme in our next section. Before we move on to the second paper, you make at least one suggestion, well, one and a half suggestions, on how to start to address the problem. One, we need to agree to put aside our most powerful weapons, right? And this is the equivalent of saying we’re not going to use nukes, which is sort of the default these days. Maybe that could happen, but I don’t see it happening any time soon. I mean, again, we’re still in the game-theoretic arms race, and people don’t yet perceive that we’re in the Hiroshima era, on the verge of the hydrogen bomb here. They don’t know it yet. So I do see you guys doing good work in trying to educate people about that, but I don’t get any real sense yet that very many people believe it.

Jim: But your second one is something that may be more doable, which is the idea of setting up the equivalent of demilitarized zones. That’s actually your language: society must begin to rebuild cultural areas in which education can take place rather than information warfare. That, I suppose, could be possible, but it’s certainly not happening today, at least not at the macro scale in our big institutions. If anything, they continue to get more and more politicized, more and more captured by, quite literally, bullshit. But I suppose we could build these at the local level or the self-organizing network level.

Zak: Yeah. I mean, resilience to the propaganda does take place at the level of family socialization and community cultural resilience, sometimes religious communities specifically, in terms of the resilience that’s possible psychologically against profound propaganda, which is what I believe we’re facing now. And so, yeah, it’s not the schools that I’m talking about when I talk about demilitarized zones. The schools are literally in the crosshairs of the culture wars, and they always have been. And this is one of the saddest things, I think, about the state of information war, and it began in the Cold War: it really is the children who are placed in the crossfire, because they’re the ones being socialized. And so, like I said, propaganda is the evil twin of education, which means that, from the propagandist’s point of view, ideally you’d be propagandizing the young, and of course, Hitler nailed it.

Zak: Hitler set up the youth camps and the Hitler Youth. So there’s a long history of using schools and other institutions to capture the minds of the youth, et cetera. So I’m saying no, no, no, pull back into contexts where you can know for sure that the people you’re with are not acting in bad faith, and then begin to expand those contexts. And so it’s very similar to the education hub networks that I recommended in my book, where we’re looking at a decentralization and relocalization of educational authority, but made non-parochial through a digitally networked, distributed system of educational relationships with other such local hubs.

Zak: And so I’m recommending that kind of resilience to the propaganda begin there, but that would require tremendous discipline, especially with engagements around social media and with considerations of curriculum and other things. And it’s very easy to think a group that is completely homogeneous and has no dissenting voices is therefore doing fine and is not propagandized. So the idea here is not to insulate ourselves, but rather to create new conditions of socialization, and we’ve long had these.

Zak: And that’s the thing: there’s a tremendous amount of educational material on the internet as well. And there’s a tremendous amount of educational material and relationship possible in community still. And people can tell the difference between those conversations which are captured by the informational warfare, and those conversations which are not.

Jim: At least some people can, and that’s going to be the point of this second essay: being able to discern what is propaganda and what is education is, at this point, one of the most valuable skills a human could have, right? And I’d argue, at least based on what we see running around loose in the world, that most people don’t seem to have it. As you point out, it’s the right and it’s the left; this is not by any means one side or the other that’s guilty here. And in fact, the title of the second essay makes that point: We Don’t Make Propaganda! They Do! Right?

Zak: Exactly, and that’s how it often appears: that the side we agree with and the political views we agree with can basically use any means necessary to communicate those correct views to others, and that’s education. Getting the correct view into the mind of other people is education, irrespective of the method. Make it catchy, right? Make it sticky, right? The slogans and the provocative videos with the great soundtracks, that’s all permissible. Even micro-targeted advertisements, all of that is permissible because the view is correct, right? The view that’s incorrect is propaganda, irrespective of how they’re communicating it, because they’re communicating something that we all know is wrong, right?

Zak: So that analysis is basically deeply, deeply flawed. Making the distinction between education and propaganda in terms of the correctness or appropriateness of the content being transmitted is the worst way you can make it. But that’s what everyone does, and so that leads people to say, “no, they make propaganda, we don’t,” by definition. And that’s what’s wrong with defining it that way, right? It becomes circular. It becomes a definition whereby anything I agree with is not propaganda and anything I disagree with is propaganda. And it doesn’t help people to see that sometimes their own communication practices, not the content of what they’re saying but the way they’re saying it, are actually backfiring. They’re creating counter-propaganda and resistance by not realizing that they’re not educating but trying to propagandize. And they’re tricked into thinking that, because they’re saying the thing they believe is correct, they couldn’t possibly be doing propaganda. And so there are these deep confusions.

Zak: Another confusion is that there is no difference, that there’s actually no such thing as education, and this gets into something like an epistemological nihilism, right, where basically we’re saying, “nope, it’s all power.” We’re like radical followers of Foucault, and we believe that no one has access to reality, everyone pretends to have access to reality, and we’re just strategically manipulating one another with claims to have access to reality, when in fact that is all a ruse, right? So the idea that there is no difference, how dare you even try to make a distinction, it’s all manipulative, it’s all strategic communication: that’s another deep, deep confusion.

Zak: As an epistemologist, philosopher, and psychologist, when you look at how humans develop and you look at how socialization works, even scientifically, let’s say from the perspective of a developmental psychologist, you realize that manipulative communication is parasitic on non-manipulative communication. You can’t get the game of psychological development and socialization going without at least a mom or somebody who is teaching the kid what it means to have an honest conversation about objects in the world together and about our own interior states together. And so there’s a performative contradiction in denying the educational relationship wholesale and replacing it with sheer manipulation; that would lead to a society that couldn’t work. You couldn’t even have the next generation.

Zak: So with those set aside, we’re left with saying, okay, it’s not whether it’s true or not that makes the difference, and it’s not that there is no difference. There is a difference: there’s something where you can have a positive, communicative, educational relationship based on mutual understanding and respect, and there’s something where it’s manipulative. So then you start looking at the structure of the relationship, the structure of the communication patterns, and the kinds of things that are indicators of propaganda, as opposed to things that are indicators that education is taking place. Not in terms of the content of what’s being transmitted, but the way people are speaking and the structure of the relationship they’re establishing through their communication and speech.

Zak: So I identify a couple of these in the paper. I’ll move through two; there are actually more than what I identified in the paper, but the paper covered the lowest-hanging fruit. One has to do with the nature of the differences in knowledge between the two parties and how those two parties relate to that difference, right? So often it’s the case that both the propagandist and the educator know more than the person they’re educating or propagandizing, right? There’s what I would call an epistemic asymmetry: a real difference in knowing, and often a real difference in power, and both parties know it; the student knows it and the person doing the propagandizing knows it, right? So the difference is that in a propaganda campaign, there’s no intention for you to graduate from the propaganda campaign. There’s no intention for the communication of the propagandist to lead you up and into their epistemic position of knowing what they know.

Zak: Whereas for the educator, that’s the whole point of their communication. The whole point of the educator is to obsolete the epistemic asymmetry, right? The whole point of the educator is to have their communication with you bring you up and into their position, where you know exactly what they know, and then more, because the educator is intending to transmit intergenerationally; they’re intending to bring you up and into responsibility for knowing and handling society. The propagandizer is not trying to do that. The intention there is to control your behavior through the manipulation of information, so there will be an unbridgeable epistemic asymmetry, guarded against breach, put in place by the propagandist, and that’s important to get. So that’s one very strong indicator of whether it’s educational, which means we both know that you’re bringing me up to a position of being able to graduate and then improve upon the work, whereas with the propaganda, we actually kind of both know that I’ll never know as much as you do, because you’ve put in place things that will make it impossible for me to know. And so that’s an important difference.

Zak: And then another difference is the nature and style of the communication itself. So when you’re looking at contexts where things like brainwashing occur, or contexts where things like mind control can be exercised, you’re looking at very specific applications of behavioral, psychological insight to the manipulation of stimulation and communication, and at the messages that are receivable by the human nervous system under certain forms of duress: sensory overwhelm, phobia indoctrination, conceptual double binds and confusions, fatigue. So it’s a whole bunch of factors, which I believe many forms of digital media actually induce in viewers, that make it so that it’s not a fair match between your mind and what you’re trying to be convinced of.

Zak: And so that’s another sign of the difference between education and propaganda: the educator is super concerned that you’re in exactly the right state to be able to be reflective and to take this up into your existing knowledge system in a non-disruptive way. The educator wants you to fully have your wits about you so that you can fully work on the problems at hand. The propagandizer would actually prefer you to be malleable and susceptible and not thinking clearly and emotionally manipulable. And so again, when you’re looking even at the level of digital media platforms, and you look at the amount of time people will spend, for example, engaging with TikTok or Facebook, it’s the psychological equivalent in some ways of going to something like a Tony Robbins concert, or being held in captivity by the cops for a bunch of hours, right? Under a light, without food, right?

Zak: So what I’m saying is that, and behavioral psychologists know this, this is part of the attention-capture economy, where you’re in an addictive feedback loop with the screen and you’re trancing out. And you’re watching videos whose speed of cut, which is to say the editing, makes them so dense with information that you’re put into a state of sensory overwhelm.

Jim: Yeah. It makes you more susceptible to the advertising messages, which is the whole point. Now, it’s interesting: for reasons I won’t go into, I finally broke down and tried TikTok starting about two weeks ago. I’ve used it three times, and that’ll be all I ever use it, but man, that is more addictive than crack cocaine. This thing is fucking brilliant in its evilness. And folks that know me know that I am a very, very punctual person and believe punctuality is a very important virtue. And the second time I used it, and this is really scary to me, how it got a person who is very disciplined about such things. I used it week one; week two, I had about 10 minutes before I was going to do a Zoom conversation with somebody, a very important conversation. I wanted to kill 10 minutes and try this TikTok thing again. Fuck, I look up and 25 minutes has gone by, and I had left this person hanging for 15 minutes.

Jim: And I felt like total shit. I never do that. And that is the power of the intentional addictiveness of a platform like TikTok, really quite remarkable. And subsequently I’ve talked to some experts in this domain and they say, “oh yeah, well in some ways, TikTok may not have some of the bad side effects, like the body image issues, that Instagram might. But in terms of addiction, this is a whole new level. This is the fentanyl of social media.” And then it makes sense. We’re in an arms race. If I can put you in this trance, I can sell my ads for more, because you are more susceptible to them. How fucked up is that?

Zak: Yeah. It’s crazy. I mean this is, again, it sounds like extreme language, but it’s kind of like a brainwashing machine.

Jim: I mean, that’s what it is. I mean, it overcame me. I’m an old cynical fucker who’s been around through 40 years of development of online media. I know better. And it got me, right? What’s it going to do to an 11-year-old kid, right?

Zak: Yeah. Well, it’s important to get that it got into you to capture your attention, which is one of the things it’s trying to do. But then within that, there are all these things happening that actually are influencing thought and behavior, even if it’s just influencing your behavior to stay on the site and to click around in certain directions. But it’s obviously deeper than that, because it’s saturated with advertisement.

Jim: Okay. But think about TikTok, which is so amazingly pernicious: you don’t click around, right? It’s just a straight feed, and the only signal is, do I watch this little video, do I flick it off in one second, do I watch it through, do I watch it twice, or do I like it? And so it appears, oh my God, this is the most clever evil thing I have ever seen. So people out there, if you’re of strong will, look at TikTok once and then delete it from your phone. If you’re not of strong will, don’t look at it at all. This is the fentanyl of social media. It’s quite remarkable. I’m going to toss out something else for your consideration to think about when we’re trying to distinguish what is propaganda. One of the things I always look at, and you don’t mention this in the essay, but I do want to throw it out there, because one of my favorite tells that somebody is creating propaganda is a system of non-falsifiability, right?

Jim: So that you can’t challenge what they say, and that’s built into the structure of what they’re saying. Think of medieval Catholicism: to even question the doctrine is to be a heretic and to be burned at the stake. To challenge that capitalism is the greatest system of all time in 1952, that makes you a traitor, to-

Zak: This is very true.

Jim: Yeah, Marxism-Leninism: if you question anything significant, you’re a class enemy. If you question critical race theory, which is our current non-falsifiable bit of memetics, you’re a racist, right? Even if you challenge it with objective facts. And so I always use that as a strong tell that propaganda is going on: when something is, within its own terms, non-falsifiable.

Zak: Yeah. That’s an interesting point. And some subtlety there would be that I would make a distinction here between something like ideology and propaganda. Propaganda is the pieces of media that get created, often in the interest of ideologies. And in many cases, with the ideology that drives propaganda, the stronger the propaganda, the less coherent the ideology has to be. If you’re driving a good propaganda campaign, people will put up with a whole bunch of internal contradictions in the propaganda campaign, and you don’t really need to hold your ideology together very well conceptually, basically because you’re using it manipulatively. So that’s the distinction. And what that means is that if you’re doing education, you actually have to have a really coherent ideology, because you’re inviting everybody up and into it. But if you’re doing propaganda, you can basically think whatever you want ideologically and use thought-terminating cliches as an aspect of your propaganda to just shut off further advancement deeper into the ideology.

Zak: And so a thought-terminating cliche is an example of where ideology and propaganda come together. My favorite one is “the science is settled.” That’s my favorite thought-terminating cliche. Well, it’s a very problematic phrase, right? It’s also kind of a strange, confusing phrase, because anyone who knows anything about scientific method knows that the last thing that’s settled is science. But in a discursive situation, which is to say in a conversational context, it basically says, “I’m stopping the conversation.” That’s what the phrase says, and there are many of these in the culture.

Jim: Oh, that’s, frankly, another example of non-falsifiability, right? It’s an assertion of it. And those of us who actually do science, and know some of the most senior scientists in the world, know that, as you say, it’s the antithesis of a scientific view. No good scientist thinks that the science is settled, period, forever. Right?

Zak: Exactly. So the use of that phrase by scientists who know better is indicative, right? And the use of that phrase by people who don’t know better is also indicative. But in both contexts, it’s basically saying we’re stopping the conversation because you’re getting too deep into the ideology, which is actually incoherent, at least in part. Why else would they protect you from adventuring further into the ideology, right? And there’s a whole bunch of things which can be used rhetorically to the same effect. The term thought-terminating cliche is Robert Lifton’s term. He studied brainwashing in Communist China and the Maoist revolutionary techniques of indoctrination.

Zak: And yeah, so the thought-terminating cliche is one that I use, and it’s tied into your point about, okay, yeah, some of this stuff just does not hold together, and as soon as you start to question it, you get shut down. That’s the interface of ideology with propaganda. So another sign of propaganda is that it is driven by incoherent ideology. And there are a couple of philosophers who say that’s one of the diagnostic signs: when you examine it philosophically, logically, not so much empirically, because you can string together a bunch of facts if you want to, it’s about the broader frame that holds the facts together. When you examine that logically, it just doesn’t hold together. And yet everyone believes it.

Jim: Exactly. Now I’m going to get down into the weeds just a bit, because I found this very useful, again, for people who are, as we are, trying to build an immune system against propaganda here, and that is what I’d call your typology of different kinds of propaganda. And it’s not all what we think it is, right? For instance, “oh, propaganda is always a lie,” right? Not necessarily, folks. And so I’m going to lead you through five of your typologies and let you rattle on at some length about them, or at least to the point that you can communicate the concept. The first of these typologies is overt versus covert.

Zak: Yeah. So for the propaganda typology, I was inspired by the work of Jacques Ellul, who was a French theologian and sociologist. His book, just called Propaganda, is one of the best on the topic. And he makes this distinction between overt and covert propaganda. So when we think propaganda, if you were to purchase a book on propaganda, you’d see this thing: it would be Uncle Sam pointing his finger, saying, “I want you, join the army,” right? That’s overt propaganda, right? Everyone knows it’s propaganda. Everyone knows who made it. Everyone knows why it’s there, but it’s still propaganda, right? So a Nazi rally, with the banners and the marching and all that, is overt propaganda. Everybody knows what the hell is happening there.

Jim: Or, you know, the national anthem to start a football game.

Zak: The national anthem to start a football game, right? The presidential inauguration, with the pomp and the circumstance; they’re tremendously moving, but everyone knows what’s happening. The state has marshaled a tremendous amount of resources, both symbolically and financially, to create the spectacle, which is to grab you and make you think certain things. So everyone knows that that’s happening. So overt propaganda is in some ways the best kind, because everyone sees it coming. Everyone knows it’s coming. And this is true even in pandemic-related things: the CDC, other places, you know that these people are propagandizing you, or at least you should. So there’s overt propaganda.

Zak: And then there’s covert propaganda, where basically it’s coming at you, and it’s propaganda, and you don’t really know that it’s propaganda. The example I use there is, in the sixties, the CIA’s involvement with the student protest groups. Many of the protests around feminism and the war and other things that were fomented on college campuses in the sixties were actually supported financially, and sometimes operationally, by CIA operatives who were interested in having it appear that the United States had a protest culture, because the Soviets did not, right? We were claiming to be democracies, and we wanted college campuses where student uprisings could occur and we would actually respond to them and change things, whereas in the Soviet Union that could never happen, right? And that’s an example where it wasn’t revealed until the seventies, and then it was, “holy cow,” so you didn’t even know.

Jim: That’s a pretty deep strategy actually, right?

Zak: It’s an incredible strategy, and the funny thing is that it’s not all bad; many of these things had ramifications, probably what we would consider to be very positive ramifications, on the culture. And yet they were fomented secretly by the CIA, and almost nobody involved knew there were two or three points of connection. So that’s an example of covert propaganda. And then of course you also get covert non-domestic propaganda. So on Facebook in the 2016 election, famously, right, people were looking at Russian propaganda and not knowing that it was Russian propaganda. That’s crazy. We had this machine that was displaying for us stuff that was built by Russian intelligence operatives, on our screen while we’re taking a shit in the morning, and we’re looking at this thing, and it’s propaganda. We don’t know it’s propaganda. We think it’s shared by someone in our group, in our Facebook group, right? And that dude’s outside of St. Petersburg working at [Hahn 00:44:00] a massive bank of screens and running bot control. So that kind of stuff, that’s covert propaganda, as opposed to overt propaganda.

Jim: And obviously harder to distinguish. It’s one of the skills that the 21st-century person has to get, either personally, or collectively in groups of people, or through regulation of our communication platforms that we have to enforce. And I continue to advocate for, in most cases, requiring real-name identity online, for instance, as an argument against that sort of thing. And then I would allow some exceptions, narrowly, for instance for, say, domestic violence support groups, or support groups for a specific illness, or something like that, but only for cause, kind of like the way the right to carry a pistol used to be granted for good reason only. But it strikes me that when you open the door to anonymity or pseudonymity, you’re just asking for that kind of problem, right? You don’t know who people are, they’re masquerading as somebody they’re not, and it’s just a giant hole in our system. Let’s go on to the next item in your typology, which is deceitful versus truthful-but-misleading. And I think this is really important for people to get their heads around.

Zak: Yeah. It’s very important. When we think of propaganda, again, we often think of deceitful propaganda; we think of crazy disinformation campaigns where foreign governments are lying to us, basically. That’s what we often think of, and that totally happens. There is deceitful propaganda, and there’s a lot of misinformation which is created for political ends, both by our own domestic political parties and by outside ones. But deceitful propaganda is kind of a short-term strategy. The great propaganda campaigns and the great propagandists always use as much truthful information as possible, like Goebbels, right, Hitler’s main propaganda guy. He was always saying, “we have to tell the truth, but we don’t tell the whole truth. We select very specific truths from all the things that we could say to create a very specific picture.” And actually, one of the things he would do is point out to the British people how their own government had lied to them during World War I with its propaganda, right?

Zak: So there’s always a risk, when you lie with your propaganda, of a Boy Who Cried Wolf effect kicking in when it gets discovered what happened, right? So, in the lead-up to World War I, the United States, Wilson in particular, built the Committee on Public Information, which propagandized the American people to unite them for the world war effort, basically. In that context, stories were made up, atrocity stories that German soldiers allegedly committed, right? They didn’t commit those atrocities. When our boys returned home and revealed the fact that none of that stuff had actually happened, there was a major outcry among the American people. And then in World War II, when the Germans actually started to do that stuff, there was resistance to believing it, because that kind of stuff had been made up before to galvanize people to come to war, right? So that backfire effect of deceitful propaganda is also a known thing among propagandists who are looking to wage multi-decade propaganda campaigns to change, I guess, whole societies.

Zak: So they use very strategic propaganda that is truthful but misleading. And so this is where you learn to lie with facts extremely well, and you learn to use Russell conjugations to put a bright side on things and a negative side on things that actually aren’t that bad, right? And so there are all of these techniques to never get yourself into a situation where you have to tell very complex lies, to always have a kind of way out, a plausible deniability, where you’re saying, “no, I’ve been saying truths all along.”

Zak: And so this is very common, and we’re seeing this constantly. And so again, I’m seeing a lot of misunderstanding about manipulative communication in the misinformation, disinformation, fake news conversation, as if the problem is people saying things that aren’t true, when there’s a tremendous amount of propaganda that can be run right past your best fact checkers. And that’s important to get.

Zak: So again, the New York Times guy who believes that it’s the right making up all these lies that are propagandizing everyone, and that all the stuff in the New York Times has been fact-checked by Politico or whatever, and therefore it’s true… No. You can have an article that’s completely correct in all the facts it cites, but that leaves out so much other stuff and so much other context that you’re getting a completely deceptive understanding of what’s occurring, even though nothing has been said that’s factually incorrect, and none of the fact checkers will have any problems with it. So that’s very important to get.

Zak: That’s your modern propaganda. Very sophisticated. And then you’re manipulating whole academic fields. So you have an opportunity to begin to define what actually counts as true in whole contested areas of discourse, setting the measures, setting the standards, setting the professional criteria.

Zak: And again, that sounds like a conspiracy theory, but these are things that have been going on since the World War I mobilization effort in the United States. The Committee on Public Information set up this kind of relationship between the government, the academy, and the press, to make propaganda campaigns that were true but manipulative. And then deception gets in there as well. But again, deception always backfires and is super risky.

Jim: Yeah. A couple of points. Kind of an industrialized version of true-but-misleading is the ideologically motivated think tank, right? These are scholars, but if X, the big chemical czar, is funding you at a very well-known, motivated, ideological think tank, there’s a whole bunch of questions you do not ask.

Jim: And if you’re at think tank Y, funded by a famous billionaire, a very progressive hedge funder, you don’t ask other questions, right? So one of the versions of the truthful-but-misleading is the ideologically motivated think tank.

Jim: I think people underestimate the influence they’ve had on our society over the last 60 or 70 years, since they started to be invented in the sixties.

Zak: And you’ve got to understand, they’re not lying to people. They’re just pursuing particular research agendas and not other research agendas, and taking particular meta-theoretical orientations to how they present their material, but they’re doing good work insofar as they’re doing that work.

Zak: And so the belief that this can all be solved by fact-checking, or the belief that we can somehow get the liars out of the room, completely misunderstands the situation, and is actually an example of a kind of propagandized way of thinking, because you’ve created a class of people who are just liars and bad, when we’re saying it’s much more complex: there’s manipulative information on all sides, and there can actually be conflicting truth claims about the same situation, truths which are very hard to reconcile.

Jim: And if you have a motivation to only present one side of the story, that’s what you’ll do.

Zak: Correct.

Jim: And then you’ll be able to say, “This is true.” All right, let’s go on to the next item, which I thought was quite interesting, which is the distinction between vertical and horizontal propaganda.

Zak: Yeah, so vertical is your classic propaganda, what we think of as propaganda. It’s a centralized, usually government-run, intelligence-agency-aided information campaign. Centralized. Vertical, meaning top-down. And those run both domestically and towards our enemies, or from our enemies towards us.

Zak: So when we think of propaganda, we also often think of deceitful vertical propaganda, perpetrated against us by our enemies. And so that would be an example of you sitting there looking at your Facebook and seeing Russian propaganda that you don’t know is propaganda.

Zak: That’s the result of a vertical propaganda campaign, from the top down, through the Internet Research Agency and the Russian trolls, to your phone, courtesy of the Kremlin. And so that’s important to get; vertical propaganda, we understand that.

Zak: Most of that propaganda, especially the Russian propaganda in particular, looks like it was intended to put memes in place which we would then propagate horizontally of our own accord.

Zak: And so horizontal propaganda has no centralized authority dictating it and is created by the very people who are the target of the propaganda. They come to embrace the propaganda, and actually improve upon it and spread it of their own volition because they’re so convinced by it.

Zak: And so propaganda taps into psychological motivations that have to do with the superego, and religiosity, and symbolism, and other things. So you get people who, once a propaganda campaign gets started, get really excited and lit up, stop thinking, because they believe the thought-terminating cliches, and begin to propagate the thought-terminating cliches and be true believers of the ideology espoused by the propaganda. They create horizontal propaganda.

Zak: And I give the example of Jimi Hendrix playing the Star-Spangled Banner at Woodstock. Amazing. Talk about the CIA being like, “Take that, KGB.” When that went around the world and everyone saw that, and the freedom that was expressed at Woodstock, and the complete absence of the counter-narratives that were possible to critique Woodstock… Just this blaring American freedom.

Zak: You couldn’t design propaganda as powerful as that, and yet people believe things like blue jeans and rock and roll were really the thing that took the Soviet Union down, that it was actually the success of our psychological warfare. The Beatles, Hendrix, et cetera.

Zak: But all that was horizontal. And so that is important to get: a lot of the propaganda we’re exposed to now, on TikTok for example, or Facebook, is people, of their own volition, repackaging and repurposing, and then spreading, basically, propaganda slogans, and ideas, and values, and other things.

Jim: Also, you didn’t point it out in the article, and it may not quite officially fit your definition of propaganda, but I think typologically it’s very similar: social media has really accelerated this horizontal propagation.

Zak: Yes.

Jim: And as our mutual friend Jordan Hall has pointed out in some of his situational assessment papers, it’s also set up a situation where there can be emergent propaganda that’s not necessarily top down at all.

Jim: So it’s not necessarily top-down, then spread. It’s just mind viruses that spin up on their own, out in the Petri dish of horizontal culture, and somehow get enough coherence in their memetics to essentially become self-generating engines.

Zak: That’s very important to get, and that’s one of the things where the situation has changed fundamentally on the ground with regard to the communication media. It used to be, you could do vertical propaganda, predict a horizontal campaign that spirals out from it, and that would be it.

Zak: Now you do vertical propaganda and you cannot predict the degree and intensity of the counter-propaganda that will be leveled at you, because the barrier of access to the information war has been so drastically lowered. This is one of the reasons we’re in a spiraling information-war arms race: because we’ve got people making memetic propaganda, incredible propaganda, in their mom’s basement.

Jim: QAnon, for example. Now, there is one theory that this is actually a state actor. I think there’s a good probability that it’s not. It’s just a very talented, crazy person. Or not really a crazy person, but someone who, for whatever reason, has let this thing loose into the world. One, two, or three people.

Jim: And it’s spread, and it’s got its own engine now, and tens of thousands of people creating additions and accretions to it, almost like fan fiction. And it’s got a life of its own. And it’s this quite amazing thing.

Zak: It is. And the capacity for that to become disruptive of state run vertical information campaigns, domestically within the United States, that’s the sign that we’re in a mutually assured destruction information war, basically.

Zak: It used to be, like in the lead up to World War I, when the propaganda campaigns rolled out, or like with the polio vaccine, when those propaganda… There was a little bit of resistance, but forget about it, man. The vertical propaganda won.

Zak: And right now that’s not the case. We actually have these machines that allow for people to be propagandized, and they can run anyone’s propaganda. And that means that basically, you can create propaganda as powerful as the government propaganda. Small little unit, working out of your dorm room with some radical views, knowing what to do.

Zak: And so that changes the game fundamentally. It means that many of the vertical strategies that used to succeed will no longer succeed, and will create problems worse than the ones they were intended to solve. Every time an expert speaks in the old-school propagandistic way, where they speak from statistics that they actually can’t give you, there’ll be a counter-expert discourse that emerges, which creates more confusion than the expert resolved through his speech.

Zak: And that’s the situation I believe we’re in now with the pandemic, where actually, the vertical public relations campaigns are failing, and creating more confusion than they are resolving.

Jim: Yeah, that is quite interesting. Let’s switch over and talk about the pandemic a little bit, because this is a very curious one, I found. Maybe we’ll come back to the other two parts of the typology, or maybe not. I think we hit the three most important ones.

Jim: Talk about the pandemic, and propaganda, and the horizontal backlash. A real interesting example was the CDC’s early claim that masks were not effective. And what they said was actually a lie, right? The reason was that there weren’t very many masks available, and they didn’t want to cause a panic.

Jim: But that then seeded a whole culture of backlash against masks, which we’re still living with today. A very interesting example of a maybe noble-lie attempt at vertical propaganda.

Zak: And that’s the thing, is that that’s an example of how propaganda used to work. So that would’ve been a good idea to do in the eighties, when you could basically say something like that, and there would be a few people who knew that this was not correct, but they wouldn’t be able to mobilize enough communication resources to get themselves on one of the few broadcast network channels that could actually do it.

Zak: And if you weren’t an expert, you wouldn’t have access to the journals in the libraries. None of the journals were online. And so there was just this limited bandwidth of broadcast propaganda possibility.

Zak: And that no longer exists, but they still try to run the central messages through these simplistic broadcast channels, not realizing that they’re creating these eddies of counter-propaganda that spiral out.

Zak: And again, the boy who cried wolf thing happened with the masks, because it’s like you begin this whole thing with a series of lies to cover your ass about the fact you took away pandemic preparedness procedures that had been in place for decades, in order to save money in the hospitals.

Zak: And so you’re trying to save your ass, but now you’ve actually undermined your teacherly authority. You’ve undermined the legitimacy of your saying and being believed. And it’s either incompetence or arrogance.

Zak: And one of the things you realize with the study of propaganda is that the belief that you can’t reason with people has been ingrained in the governing classes for so long. Behavioral psychology, Freud, evolutionary psychology. All of that stuff.

Zak: And so there are actually many, many people who hold a rather disparaging view, who really believe that there’s no difference between education and propaganda, so kind of get over it. “We’re manipulating the people, man. Because what are you supposed to do? They’re crazy monkeys; they can’t take the truth.”

Jim: And maybe there is some truth to that. I don’t know.

Zak: That’s the thing. It’s complex. My argument would be that under conditions where you can actually effectively implement propaganda, sometimes it’s good to do so in order to get certain things to happen at scale that are difficult for people to do psychologically.

Zak: World War I and World War II are examples of this. I’m not saying we should go to war, but I’m saying, when there’s an imminent threat, sometimes you do need to mobilize the people through means of communication where you don’t have the time to be fully educational, even though you’d like to be. So that’s possible.

Zak: But if you want to do that and you’re actually not capable of doing it, don’t try to do it; do something else. So my argument is not that people should never use propaganda, actually. It’s much more subtle than that. What I’m saying is that our traditional modes of propaganda are in a mutually assured destruction fail state, and that if we continue to try to do propaganda the way we traditionally have, we will continue to fail.

Zak: And that, as a mode of social control, has been obsoleted, which means a new method of informationally based social cooperation needs to come online. And this gets to the consilience hypothesis, which is that we actually need fundamentally different civic infrastructure, which is like an educational infrastructure, which would allow us to release very complex claims into the public in a non-propagandistic way, and actually galvanize people into large-scale cooperation through education, as opposed to forcing them into cooperation through propaganda.

Zak: And again, now we’re getting a smaller and smaller percentage of the people who buy it. And many of the people who behave as if they buy it are actually deeply concerned about the irrationality of the ideology that they’re being brought into.

Jim: And if there’s nothing to replace the top-down vertical, then we’re approaching information nihilism, where I don’t believe anything. Or my idiot uncle: whatever he says is every bit as likely to be true as what a scholar who has studied X for his whole lifetime says.

Zak: Exactly.

Jim: And at that point, the signal to noise ratio just goes out of control. And as you say, society drives itself crazy.

Zak: And that’s kind of like the background radiation or the nuclear fallout from the use of these information weapons, which is this ambient cynicism about the possibility of actually being told the truth by anyone.

Jim: Yeah. And we’ll get to that in just a second. I just wanted to add one other little thing, which is a very subtle form, and you do mention it in passing. And this is the Nudge idea that was popularized by Thaler and Sunstein. And I think Sunstein was appointed to the Obama administration to try to get people to do the right thing by subtly nudging them.

Zak: Yes.

Jim: That’s kind of a new class of propaganda. May be good. I don’t know. What do you think?

Zak: No, I don’t think so. The next propaganda paper in the series is going to be addressing nudging pretty specifically. What’s interesting is that the nudge units in the UK and the United States are massively influential, and employ behavioral scientists, and collaborate in depth with digital media companies.

Zak: And the thing about nudging is that the intended effect… Nudging is actually a form of covert propaganda. Nudging is attempting to set a choice architecture that has you choosing certain things without knowing that your choice has been affected. Yeah.

Jim: I’ll give the classic example, and the one that Sunstein advocated for, was to make signing up for the retirement plan, the IRA retirement plan at your company… The default is yes rather than no. And that did make a gigantic change in the percentage of people that signed up for these things.

Jim: And these plans, by any rational analysis, make a tremendous amount of sense for people. The tax advantages are stupendous. Companies often have matches. I remember one of my companies, I think the effective rate of return for being in the plan was like a hundred percent, per year, per annum, at least for the first few years.

Jim: And yet we still had 30 or 40% of people who didn’t sign up. I go, “What the fuck?” And it’s just human inertia, essentially. And Sunstein’s insight was, “Well, if inertia is a key human characteristic, which it is, as is procrastination, which it is, then if you just reverse the polarity of whether you’re signed up or not for the IRA, it’ll go from 60% to 85 or 90%. And maybe that’s a good thing.”

Zak: And he uses that example because it’s a benign example, and I’m not sure it’s a great example of nudging because it’s really, to me, something about consumer protection and contract law, not about behavioral science, engineering of choice architectures. Those occur on social media platforms.

Jim: Well, let’s leave that one for today, and I’ll look forward to your next paper on this. Let’s come back and talk about COVID a little bit, because I think this is really quite interesting. The issue of, how does someone know? You point out in the paper that the raw information that the vaccine companies gather during their mandated trials is not available to the public.

Jim: And that then puts an epistemic gap between what we could know and what we’re told to believe. On the other hand, the truth is, if you dumped all that data on Joe Random, their chance of being able to make any sense out of it is mostly zippo.

Jim: But in between, we do have, allegedly, our socially appointed experts, being the FDA, who do have access to this data. At least I believe they do. And so we have essentially built a proxy for access to the data through the FDA. What’s wrong with that? And why does it not seem to be working?

Zak: Well, there’s two things I mentioned. I mentioned the unbridgeable epistemic asymmetry that’s guarded against the breach, that’s put in place between the vaccine manufacturers and the general population, a large percentage of whom could make sense of the data.

Jim: Raw data? I don’t know. But anyway, neither here nor there. But let’s say they don’t allow it, for business reasons probably [crosstalk 01:06:57] but they do allow the FDA to have access to it.

Zak: But I’m not sure about that last bit. So first of all, we were told we would have full access. Well, let me take a step back to way before we even started this conversation.

Zak: I note that it is difficult to have this conversation. So that would be the first thing that I would note, which is that the fact that we’re discussing this, some people would think is problematic. Because it sounds like I’m saying something conspiratorial about vaccine manufacturers.

Zak: Because I’m also going to say that they have no accountability for the effects their vaccines have on the health of the population. This is also true.

Zak: So those are two things which are very important to get. But what I want to note before we get to those things is that it’s hard to have these conversations because of the emotional charge around the issue. And the emotional charge around the issue has to do with the nature of the propaganda that’s been perpetrated, which characterizes anyone who questions the idea that the vaccine is the only and best way to go as being a particular kind of person.

Zak: And so that’s just worth noting, that the evidence that we’ve been deeply propagandized is right now in our nervous systems, and bodies, and hormonal systems, as we react to the turn that the conversation is taking. I’m noticing in my own body that it feels risky for me to talk about this. And why is that?

Jim: Good point.

Zak: This is a very important public issue.

Jim: Part of the information ecosystem is creating this, the preparatory aspect of the propaganda.

Zak: Right, exactly. So I just want to note that, but then I will say that there’s an unbridgeable epistemic asymmetry that’s put in place, and there is zero accountability for negative and positive things that occur as a result of the vaccine. Well actually, not positive because they’re taking credit for success.

Zak: So that’s important to get. And that sets up a situation where you can’t claim to be in an educational relationship. You can only claim to be in a persuasive or coercive one.

Zak: Like if you’re a teacher and you’re trying to teach me something about my health, something that I should do for my health, and you’re telling me to do this thing, and I say, “Okay, show me the data.” And you say, “Well, I can’t show you all the data.” And I say, “Okay, if something bad happens to me, are you accountable?” And you say, “No, I’m not accountable,” why would I trust you? Doesn’t make sense.

Jim: What about having the FDA in there as a proxy for ourselves?

Zak: I do not believe that the FDA solves this problem, because the FDA is in a similar situation. Now, the FDA has set up an unbridgeable epistemic asymmetry between me and the data. And now the FDA is also not accountable for what happens to me if I take the vaccine. Nobody is.

Jim: Well, indirectly. Because let’s say it turns out 10% of the people who take the vaccine die. I guarantee there would be a revolution in American politics, and the Democratic and Republican parties would not survive it. So at least in extremis there’s accountability if the FDA has blown the call on this one.

Zak: Well, the point is that that’s a strange counterfactual to have to bring into a conversation when it should be like, “No, you’re accountable for what you do, and there’s no reason to not give the data. If you can’t understand the data, you can’t understand the data. If you can, you can.”

Jim: So let’s talk about a possible solution. Would having well documented public data repositories for evidence be something that really ought to become general practice, and which we should demand if we’re not going to declare pronouncements to be propaganda?

Zak: Correct. Exactly. In today’s day and age, with the state of technologies for the display and storage of public data, civic data, there’s really no excuse to set up unbridgeable epistemic asymmetries around public data.

Zak: And that didn’t used to be the case. It used to be very hard to house, and store, and display public data. We had libraries where the journals were stored. So things have so fundamentally changed that now, the refusal to do that is what’s driving the counter propaganda. The refusal to do that and to take responsibility.

Zak: Those two refusals are the things that are driving the counter-propaganda, because they set up a situation in which, by no light, can you say that one feels they’re being educated. One feels that they’re being manipulated. I mean, it’s just bizarre to me. But anyway, like I said, it’s dangerous to have this conversation.

Jim: When I was reading that piece, that section in your essay, I said, “Hmm, let me think of an analogy from my own life.” And I thought about when I was in college and learning about quantum mechanics. Quantum mechanics is famously the most accurate known theory in science. It can be demonstrated to 14 decimal points. And even though I was a physics student at MIT and knew the science and the math pretty well, in 1973 I didn’t really have access to the raw data, but rather I relied on the inter-subjective confirmation of the inter-objective from the scientific community. I would have to assume they’re all fucking lying if somebody just made this up and it ain’t true. And truthfully, even as a sophomore physics major, especially then, when things weren’t very computerized, if someone gave me a magnetic tape with the data from particle collider experiments, I wouldn’t have had clue one what to do with it. And so why not encourage this inter-subjective, inter-objective validation as a way to build social coherence and trust around pronouncements like the vaccine?

Zak: Totally. I mean, we do need to build social coherence and trust around these things, and the nature of the incentive structures for the knowledge production processes and the institutions that create the data to which I don’t have access are very important. So the question of what kinds of incentives drive investigations into quantum mechanics and the professors who speak about quantum mechanics and what would be their motivations for lying to their students about quantum mechanics, very different from the incentives that drive pharmaceutical drug development and the interests of the pharmaceutical companies making profits off the drugs that they develop in terms of… So I’m just saying that that’s a strange example because the whole issue here is propaganda. No one’s propagandizing anyone about quantum mechanics.

Jim: Well, actually they do. There’s all kinds of propaganda wars about the interpretations of quantum mechanics, but not [crosstalk 01:13:56]-

Zak: Those are academic. Those are academic debates. That’s not a small political class trying to manipulate a large group of people into behaving a certain way with regards to quantum mechanics. But this is what we’re having with regards to pharmaceutical drug development, where… And the FDA is a great example. I mean, we’ve known for decades that the FDA is not to be trusted. I mean, look at how they’ve handled the opioid epidemic. And so again, we’re in a situation of institutional decay where there’s widespread simulation and dysfunction deep within these institutions. And we see it in obvious places, but it’s seeping into all of them. And so to hold, for example, biomedical industrial complexes as exempt from the same dynamics of civilizational collapse that are characterizing our cultural production complexes or fossil fuel production complexes would be foolish.

Zak: It’s remarkable that the medical system is trusted more than other major systems in the country when in fact, we pay more and get less for our medical services in the United States. We’re sicker than any other country and consume more biomedical products. And yet we trust this system more than other systems.

Zak: And actually, people I know who are radically, radically skeptical of big oil and big agriculture and the government and a whole bunch of stuff, do whatever their doctors tell them. And so it’s interesting to note that the pharmaceutical companies spend more money on advertising than they do on research and development. It’s been like that for decades. And so, yeah, I wish we were in a situation where there wasn’t an unbreachable epistemic asymmetry and an absence of accountability on the part of those people who are waging one of the most unprecedented propaganda campaigns in history. But we are. And again, because my saying that is dangerous, just note that because we’re talking about propaganda, I’m saying all this, just as an example, to get a rise out of people to show-

Jim: Make them think.

Zak: Make them think about the fact that they’re ready to wield thought-terminating cliches against the arguments as rapidly as they can.

Jim: Well, I should also point out to folks that even if it is propaganda, it doesn’t mean it’s not true. It may well be that the vaccines are safe and the smart thing to do. But I think what you’re exposing here is that we don’t have the inter-objective, inter-subjective verification that would lead us to necessarily feel confident about that proclamation. I think that’s the real issue.

Zak: There’s that, and there’s the evidence in the informational field of overt propaganda to create scapegoat populations and other things. So that’s also true too, that the nature of the propaganda is heavily politicized and the creation of a subclass of people is also being done. And again, just by virtue of me speaking this way, people seek to place me in that subclass.

Jim: I will say that it’s very disturbing. The war against anybody who disagrees via the social media platforms is very disturbing. I can remember when, “It escaped from a lab in China,” was totally suppressed on the social media platforms. And now, well, if I had to put money down, it’d be 50-50. I don’t know that it was, but I don’t know that it wasn’t. The evidence is not yet clear and convincing, but nor is the evidence in the other direction. And so this idea of suppressing the minority view or the heterodox view is one of the worst things going on in our society.

Zak: And more specifically, the creation of language that promotes the hate of that class. And so I’m seeing hateful, dehumanizing language against fellow citizens being used and promoted among people who identify as progressive and liberal. And so that to me is a sign that the propaganda has gotten a little bit out of hand. Anytime there’s a class of people where the media is suggesting that one can legitimately hate them and rejoice in their problems, be happy when they get sick, feel righteous when they get sick, when the media’s promoting that kind of stuff, we’re in a bad spot. We’re in a bad spot. And that’s where the discourse has gone, in part because of the inability to organize an educational campaign and the inability to execute an old-school propaganda campaign, which left us in a no-man’s land of complete absence of ideological leadership in the context of a very complex and morally dangerous time.

Jim: Yeah. And in a context where the self-generated horizontal counterflows to top-down propaganda were likely to occur, and indeed did.

Zak: Yeah, exactly.

Jim: In the few minutes we have left… but take as long as you want… you highlight that the fundamental move is overcoming asymmetries of knowledge and power via education, rightly considered. Why don’t you just tell us, at as much length as you want, what you think that might look like?

Zak: Yeah. So it comes to this core of the consilience hypothesis, which is that exponential digital technologies have made obsolete the ways we used to run states and economies and cultures, that the government, the economy, and the culture are all so fundamentally transformed by the presence of digital that we need to rethink most of them. It’s possible to, as China is doing, turn the digital to something that allows for basically the broadcast propaganda mechanisms to still work. It’s possible to use the internet to create one centralized view of the world and one really well-organized social control system. So that’s one way to go. Order. Basically, authoritarian order.

Jim: Make vertical propaganda work again.

Zak: Make vertical propaganda work again. And many people right now in the United States are craving that, just to note.

Jim: Well, certainly the lefties are. I mean, the wokies. That’s what they love. They’re trying to get top-down propaganda to become the law of the land, right?

Zak: Right. So there’s the opportunity. There’s that opportunity. We solve the digital by collapsing it down into some ordered thing. And that has all of these problems with it, which we could go into, but it’s a self-terminating pattern that will not last. It will break. It’s a complicated solution to a complex problem. And then we’ve got what we’re doing with the internet, which is basically running into escalating mutually-reinforcing feedback dynamics between various polarized groups and creating complete cultural chaos, economic chaos, and political chaos, basically, as a result of the intervention of the digital. And so there’s chaos and there’s order. Chaos could turn into something like some strange new form of techno-feudalism eventually, which would be equally self-terminating to the authoritarian pattern. And so then the question becomes, is there some third attractor by which we can use the affordances of the digital to create a fundamentally different kind of civic architecture that would be basically a broadly distributed educational and communications architecture that would allow for a different form of deliberative democracy to take place.

Zak: And so one way to think about the nature of the solution space is to think about what the characteristics of an open society are that need to be preserved, that give it its strategic advantage, and make it a complex solution to the problem. So this is basically asking: how do we assure that the digital mediation context… When I’m talking to you, we’re talking together; the digital’s mediating our relationship. How does that become a structure that enables freedom, a structure like a grammatical structure? I’m obeying the laws of grammar when I speak to you, but by obeying the laws of grammar, I can do infinite things. So I’m talking about the design of an open structure for the digital, which is an ordered open structure. This is complexity talk.

Zak: So one of the things we’re thinking about here is how much noise is too much noise in a digital information ecosystem. The argument would be that if you lock down too hard, you get none of the benefits of digital. You might as well just have TVs and radio and stuff. If you open it up too wide, then you have the information chaos. So there’s some modulation. One solution is something you’re talking about. It’s a clean space. Everyone gets one ID, no one has anonymity, which means there are no bots, no artificial-intelligence-produced texts, no advertisers coming from places where you don’t know where the advertisements come from. So there are technical solutions that make the space clean.

Jim: At least cleaner.

Zak: Cleaner. The kind of stuff Tristan talks about, like humane technology. So that’s one aspect. But the other aspect is using that technology to create new forms of embodied communication between people because one of the issues… Like I said, the digitally mediated relationship, it’s not just that it’s anonymous. It’s also that it’s asynchronous text-based communication or it’s video feeds that have bugs in them and you lose communication. And so there’s a premium that actually needs to return to civic discourse. And so as an educator, I believe that it’s not just the technical solutions. It’s also using technology to bring people together in new ways.

Zak: This is one of the principles of the education hub network, that the educational technology should never keep you on your screen. It should be putting you in touch with people who will benefit from being in communication with you. So the possibility of using the network power of the digital to galvanize what I would call noetic polities, which are groups of people who are committed to certain values, who can come together and build education and build community and build conversation. Right now, we think we have platforms that do that, but we actually don’t. We have platforms that create market demographic groups and basically propagandized subclasses that unite under banners that are not of their own creation.

Jim: Or even if they are, they’re very flat, simple-minded, and don’t have enough institutional structure on them to make them work very well.

Zak: Yeah. So yeah. So I’m listing big design principles and not specifics.

Jim: Let me jump in here with one that ties back to this COVID thing and to numerous other ones, which is… At least it strikes me that in what comes next, whether we call it Game B or something else, one of the problems that we have to solve is how somebody chooses, in an intelligent way, the experts they’re going to listen to, and how those experts actually have standing to be experts. For instance, I don’t know that much about vaccine science. Even if you threw all the data at me, I could probably put some pretty graphs on the screen, et cetera, but could I actually tell you whether the trade-off is worth it? No. On the other hand, I can know people who know that, or I could have access to candidate experts in that. And I could essentially give my proxy for whether I should take the vaccine to somebody, and maybe they give their proxy to somebody else.

Jim: So we could have a recursive chain of proxies, which I could always see. It would always be transparent. And so on the question of whether I should take the vaccine or not, I would get the yes or the no from some expert, either through direct delegation or through a multi-step proxy to somebody who was actually qualified to make that decision and had access to the data. That strikes me as something that we absolutely need in our society, where right now you either have to accept what the powers that be say, or you have information nihilism, where somebody’s drunken uncle spouting on Facebook is what you choose to follow to make a relatively important decision. And I mean, neither of those is correct.
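One way to make the transparent proxy-chain idea concrete, purely as an illustration and not anything proposed in the episode or the essays, is a minimal sketch in which each person either answers a question directly or delegates it to someone they trust, and anyone can inspect the full chain of delegation. Everything below (the Person class, the resolve method, the example names) is hypothetical.

```python
# Minimal sketch of transparent, recursive proxy delegation (illustrative only).
# Each person either holds a direct answer or delegates to a proxy; resolving
# a question follows the chain and returns both the answer and the visible chain.

class Person:
    def __init__(self, name, answer=None, proxy=None):
        self.name = name        # who this person is
        self.answer = answer    # direct answer ("yes"/"no") if they have one
        self.proxy = proxy      # the Person they delegate to, if any

    def resolve(self):
        """Follow the delegation chain; return (answer, chain of names)."""
        chain, current, seen = [self.name], self, {self.name}
        while current.answer is None and current.proxy is not None:
            current = current.proxy
            if current.name in seen:      # guard against circular delegation
                return None, chain + ["<cycle>"]
            seen.add(current.name)
            chain.append(current.name)
        return current.answer, chain


# Usage (hypothetical): an immunologist answers directly; a physician defers
# to her; a citizen defers to his physician. The citizen can always see the chain.
immunologist = Person("immunologist", answer="yes")
physician = Person("family physician", proxy=immunologist)
citizen = Person("citizen", proxy=physician)

answer, chain = citizen.resolve()
print(answer)                 # yes
print(" -> ".join(chain))     # citizen -> family physician -> immunologist
```

The design point is only that delegation is both recursive and inspectable: the answer arrives through the chain, and the chain itself stays visible to the person who delegated.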

Zak: Yeah. There’s a concept I like. I sometimes call it educational democracy or qualified democracy. And the idea being that many of the things we vote on, we have no idea about… I don’t know how the electrical grid works in Vermont, but I’m expected to vote on the electrical grid in Vermont. But there are some issues that I know a lot about like education, for example, which I feel very comfortable voting about how education could work in Vermont. And so everyone is in similar relationship to similar questions. And so one view is that, well, therefore, if you’re not an expert in that field, you allow other experts to make the decision. Another view is that in fact, there’s a responsibility to put in place the educational resources that would allow someone to come into a position to understand well enough what affects them if they want to. You don’t have to, if you’re comfortable enough saying, “Hey, that expert can make a decision for me. I let them do it.”

Zak: But if you’re like, “No, I got to really get in there. You got to explain this to me. And you got to put it in terms that are relatable and convincing enough to me that I will actually believe you, not just take you on your word and not be propagandized by you.” And so the idea that we can give over our choice-making power to experts seems to be completely necessary given how complex society is. But that works only if there’s a qualification that the educational on-ramp to expertise and full information is not guarded against the breach. So for example, right now, there are big problems with tractors because John Deere won’t let you repair your own tractor, and they won’t teach you enough about your tractor to repair it yourself even if you want to, even if you want to learn the computer engineering expertise [crosstalk 01:28:14]-

Jim: And in fact, for at least John Deere, for some of their high-end tractors, if you do it yourself, you’re in breach of contract.

Zak: Exactly.

Jim: You can’t do it yourself.

Zak: So that’s another perfect example of this weird epistemic asymmetry that’s put in place, guarded against the breach, which is obviously not in the interest of the people who are most affected by it. And so similarly for the vaccine issue, it’s like if some people want to give over their authority to experts, they should be allowed to do that, to not take the full cognitive burden of that. But if other people want to know, then those experts need to be educators and not just experts. And so with those stipulations in place, then yeah, a lot of this looks like reconfiguring communication relations between different parts of the social system, which is a lot of what public relations as a field emerged to do in these complex urban societies: to explain the perspective of this part of the city to that part of the city, and to explain what these experts thought about transportation to the people who’d be driving the cars.

Zak: So that need to orchestrate those kinds of communication across epistemic asymmetries, and then delegation of choice-making power with possibility for educational advancement along certain vectors, that needs to be in place. So the education democracy equation is tight. Do we have that correct? You don’t get open societies without a lot of educational resources. Right now, we have a ton of informational resources, but not all of them are educational resources. And so using all of the techniques of psychometric profiling and behavior monitoring and all of that stuff, we could actually make algorithms that are educational. This is where it ends, which is that right now, all of our technology development is driven to make algorithms that keep you addicted to certain sites.

Zak: Imagine all that same prowess and technological expertise was actually devoted to showing you that sequence of things you should look at that would most bring your mind into a healthy, mature, and capable state. Not making your mind like everyone else’s mind, your mind, your unique mind, specifically micro-targeted for educational advancement as opposed to attention capture and knowledge implanting. So that’s basically what it comes down to. If you want to think about the new civic infrastructure, you have to think about what do social media technologies look like that are built to be educational as opposed to exploitative, and what do algorithms look like that can curate the web in educational ways that can actually take the full architecture of what’s available and know exactly what the most appropriate thing would be given the state of your mind and your knowledge.

Jim: That would be amazing. And that would be also unbelievably radical.

Zak: It would be completely radical. Yeah.

Jim: As I always say, the center of the beast today is that everything is driven by short-term money on money return. That is the cycle that crushes everything. And to change that inner cycle to… I might even go a step further. Education’s a big part [inaudible 01:31:32]. Maybe you construe education to include this. But I would call it human development, broadly construed. I mean, for instance, the topic we talked about last time, hierarchical complexity, is that actually education or is that some higher-order thing of human development? Even if you think they’re two different things, they’re very, very closely related, and education is a key component of developing one’s hierarchical complexity and growing into being a true sovereign adult at the right age and with the right experiences. If that were the metric for our society. Holy fucking shit, Batman.

Zak: And that’s the thing: it’s a completely solvable problem. We’ve never tried to solve it. We’ve put all of our resources into solving the problem of how to get people addicted to sites that can sell them advertisements.

Jim: Money on money return.

Zak: Money on money return. If we put all those same resources towards answering this question of what do the social technologies look like that actually benefit people educationally in terms of their human development, hierarchical complexity, personality, maturity, reflective, metacognitive awareness, a whole bunch of stuff.

Jim: Non-reactivity.

Zak: Non-reactivity. Exactly. And so that same architecture that right now is creating these dystopian chaotic possibilities could be repurposed to create the most profound educational infrastructure that has ever existed. In fact, I would argue that the reason we came so hard with broadcast propaganda was because we were building these massive cities. The urban environments became so huge and complex the need to regulate behavior in them became pressing. And if we could have, we would’ve loved to… this is my nice argument… we would’ve loved to create an educational complex that could have handled the same problem, but we didn’t have the technologies. And so I’m saying now is the time to begin to exercise a form of social control that is actually educational as opposed to propagandistic. We need high-order forms of social communication that allow for cooperation and collaboration, which is to say, we need mechanisms of social control.

Zak: People need to cooperate, especially in times of crisis. But how they’re brought into those forms of cooperation can be more or less coercive. And I’m saying we have the technologies now to bring people into alignment in non-coercive ways. It appears that the technologies will inevitably divide us, but that’s because of how we’ve been using them. As an educator, I have faith that there’s a road to truth for each person and that all the roads lead up the one mountain, which is to say that if we do the education thing right, there will be less conflict and less division. This was Dewey’s hypothesis as well. Create the melting pot. Get them all in there and it’ll work well.

Jim: Let’s end it right there on that very hopeful vision. I hope you’re right.

Zak: Me too.