Transcript of EP 239 – Alex Fink on Improving Information Quality

The following is a rough transcript which has not been revised by The Jim Rutt Show or Alex Fink. Please check with us before using any quotations from this transcript. Thank you.

Jim: Today’s guest is Alex Fink. Alex is a Silicon Valley expat and the founder and CEO of Otherweb, an AI-based platform that helps people consume higher quality information online. Formerly he was the co-founder of Swarmer, an AI company that helps drones do things humans want done. That sounds both good and scary, so I don’t know if we’ll talk about that, but that’s interesting too. Welcome, Alex.

Alex: Thank you so much, Jim.

Jim: Yeah. Now having spent 15 years in various engineering and executive roles in tech, Alex has decided to dedicate himself to solving one of the biggest problems facing humanity today: digital junk. As regular listeners to The Jim Rutt Show know, this is one of my pet fucking peeves. In fact, I often talk about the AI-accelerated flood of sludge. And by the day you can see the quality of Google search going down, down, down.

And in fact, I made a bit of noise late last week, or was it maybe over the weekend, where I declared I was fucking done with Google except for finding the nearest Mexican restaurant, and that I had switched my flag to Perplexity, goddammit. And I will say I found a few limits to Perplexity too, but it’s way better than Google, and we’re being done in by the flood of sludge. And also, regular listeners know that I regularly rant: anybody who wants to make a trillion dollars, solve this problem in a comprehensive info agent fashion, which we’ll no doubt talk about further. But to get there, people have to build the various piece parts. And Alex has been doing his part with his Otherweb. So tell us a little bit how you got to the point of disgust with our digital infosphere.

Alex: Gradually and then suddenly, as most of these things go. But the reality is that I’m kind of primed to care about this problem because I was born in the Soviet Union, and I saw what disinformation does to a society, when all the adults around you, who are supposed to know what’s going on in the world, don’t know anything. So I saw this, I saw the great lengths to which my parents had to go to find out something useful about the world. They would lock themselves in the closet or the bathroom at 4:00 AM and listen to Voice of America on the radio. That was the only source of information that they could find. Obviously that in itself was foreign propaganda from the US aimed at the Soviet Union, but it was better than what everybody else had.

So I’m primed to care that you’re supposed to know what’s going on around you. You can’t just ignore the news; eventually it’ll come back to bite you. And then for decades, I’ve been watching the quality of the news in the West, where I live now, go down and become worse and worse and worse and worse. I mean, Google is just a facade for showing you what’s out there, but what’s out there is terrible all the way down or all the way up. You go to places that are supposed to be good, that used to be good 20 years ago. And I mean, when CNN has headlines like “Stop what you’re doing and watch the elephant play with bubbles,” or when the New York Times has headlines like “This most horrible animal is now the star of YouTube,” and you’re supposed to click through to find out which animal it is, I guess. But that’s in the news section and it’s not news.

Jim: I ranted about this on Twitter about three months ago. Most of the news sites now are proliferating their agony aunt sections, the dating advice and, oh, blah blah, blah, blah, “My neighbor pissed on my fucking hedge and what should I do about it?” kind of stuff. I think The Telegraph, which I read fairly regularly because it’s a different perspective, the UK Telegraph, a different perspective than the mainstream media in the US, it’s got like eight of these agony aunts, and usually four of them are on the front page every day. What the hell? And of course, as you say, it’s all about hijacking our attention.

Alex: And specifically, I kind of date the beginning of this trend to way before AI. Roughly 20 years ago, we became really, really good at tracking how ads are doing. Basically being able to track every click and every view. And from the moment we developed this ability, we started optimizing all content to maximize how well the ads on that content are doing. And so there’s the single selective pressure on all content on the internet to become clickbait. And so it does.

Jim: Absolutely. And you’re just about right. And there’s another thing that happened at about the same time, about four years earlier. I’m an old timer. I’ve been building online systems since 1980, when I worked for The Source, which was the world’s first consumer online system, back when it was 300 baud at 10 bucks an hour, if you can believe that. But it was the only place in the world you could get email, chat, bulletin boards, stock prices, all that stuff. And so I’ve been doing this for a long time, and much of it, actually now it would be a bit less than half of my life doing online products, was in the epoch before ads were a significant source of revenue relative to costs.

Until about 2000, the costs of building, hosting and running any kind of online property were greater than what you could monetize via advertising. Around 2000 that switched: networks and computers, operating systems, all the tools, the whole stack, got inexpensive enough that you could just barely, if you’re good at it, monetize an online site strictly with advertising. And Chris Anderson wrote a famous book around that time called Free, where he argued that the price of free would drive all charged services out of business in any domain where it can be free. And by that he meant ad supported. And so I think that’s the other side. It became technically and economically possible, and then soon thereafter we got fucking good at it. And that was the first step on the road to hell.

Alex: I think realistically it was possible already with radio and TV, but it was so hard to track the results, the way people calculated ratings to figure out which show you’re supposed to advertise on, that there was no feedback loop that actually made you turn the show into the thing that attracts the most advertising. You had to just try to put on a good show. But online you don’t.

Jim: It’s interesting. That actually shows that sometimes less information is better, which is kind of paradoxical. I also did business in the direct mail days, where we used to do millions of dollars a year worth of direct mail, and direct mail was better than space advertising. We did that too. And about space advertising we’d say, “Yeah, probably half of it works, but you don’t know which half.” And interestingly and curiously, because that feedback loop did not really exist, the pressure was nowhere near so extreme as it is now to just follow the fucking clicks.

Alex: Yeah, it’s a process we have to break somehow because the incentives are going to drive us in that direction. AI just makes it faster. It’s basically like we just added nitrous to the car, but we’re still driving off a cliff.

Jim: Yeah, absolutely. Which is why I say trillion-dollar opportunity, young man. I’m too old, I’m too rich and I’m too lazy to do it myself. But if I was 45, this is what I’d be doing: figuring out how to use LLMs and other related technologies like RAG and latent semantic networks and human curation, and using the network to curate itself, and other things, to build a comprehensive infosphere smart agent, so nobody has to deal with these motherfuckers trying to hijack our attention, period. And I think it’s possible. I think it’s doable now.

And just like the curve crossed around 2000, where the costs of running online internet platforms became low enough that you could do this ad-supported, I would now stipulate that we’re at, or damn close to, the point where we’ll be able to use AI and closely related technologies to build the counter weapon, the info agent. And then we get into this very interesting arms race between the info agent and the attention hijackers, and I think it’s going to be very interesting, and I would suggest the work that you’re doing is at least the first steps in that direction.

Alex: So right now we are in this exact arms race, where we are the spam filter and people are developing the spam, and we’re kind of competing. Because what we’ve developed initially, again using the same AI tools you just mentioned, is a curation engine. Essentially we aggregate stuff, we curate the best stuff, we give users the most tools we can give them to try to customize things to their liking, but that is an arms race. The next step has to be not just to curate from whatever is out there, but to also generate better things in parallel, and then you can win this war.

Jim: Well, what you need to do then is build an ecosystem that incents people to create for the ecosystem. But having, as I said, been doing this for 40-some goddamn years, pressurizing a two-sided market is very difficult. Because to get creators, you need readers; to get readers, you need creators. And so I think you are taking the right path, which is to parasitize the existing ecosystem until your readership is big enough that it makes sense to incent people to create content for your ecosystem, or for the ecosystem. I’d say what you’re doing is a piece of this bigger ecosystem that’s trying to give birth to itself in the world.

Alex: And I would actually say what we’re doing is limited in that the tool that we’re giving people is binary. They either filter something out or not. And so yes, that creates an incentive for the creators to create better stuff that will not get filtered out by our readers. And when we’re large enough, that incentive is great. But I want to get us to the point where it doesn’t have to be binary anymore, where we can go to an ad network and tell them, “Here’s the algorithm that’s going to evaluate content. Why don’t you use this algorithm as a data stream to your advertisers?” So they can decide how much to pay for an ad on this content.

Jim: Don’t do that. Don’t do that. I’m going to say that the vision I have, now, I may be a dreamer, I’ve always been a dreamer, and a commercial enough dreamer that I’ve made piles of dough over the years, which is I don’t want no fucking ads, period. Now, I might have an inbox called offers or something like that. So if somebody wants to send me offers to my inbox, I can put whatever price I want on my inbox, and my current thinking on price is $10. You want to reach me? Pay me fucking $10, and you can send my demographics out to whoever the hell you want. And you’ve learned a lot about me by watching my online behavior here now for the last three years. And you can tell the advertisers, “All right, this guy’s actually a pretty good candidate for poison arrowheads for crossbows,” and it’s probably worth 10 bucks to send him an ad for a hundred dollars worth of poisoned arrowheads.

That kind of advertising I would tolerate. But I think if I were the fucking dictator of the world, which is not a bad idea actually, I would ban advertising online for the reason that you said: unlike old school advertising, which doesn’t produce this convergence to shit, the fundamental nature of online advertising produces enshittification no matter what you do.

Alex: It’s possible. I tend to be more of a realist. I’m not trying to be dictator of the world. So I’m looking at this and I’m thinking, where can we innovate with the tools that we have, with the stuff that my team and I know how to build? And innovating the business model of the internet is probably not within my capacity for now. So we’re trying to innovate in consumer tools for how to filter stuff out. That’s step one. Then we can innovate in producer tools, let’s call them that, to create better content. And then you just bridge those two with customization tools that basically create this end-to-end system that connects facts that are documented by somebody somewhere to the reader, and you don’t need feeds and search engines or anything else in between. All of this we can do with the tools we have. Trying to get the internet unhooked from advertising is probably something where we need your help, because I can’t do that.

Jim: Well, I do know an answer and I was musing on this with somebody else the other day about two weeks ago, and if we look back at the history of technology, often the most innovative solutions start out in very exclusive elite markets. And I gave the example that when we did an extension on our house back in 1999 and finally finished it in 2000, I went out and sprung for a 73 inch TV. And you know how much a 73 inch TV cost in 2000?

Alex: Probably $5,000, $6,000?

Jim: $10,000. And this was not for the top of the line. This was for the silver, not the gold. And three years before that it had been $30,000. And just for fun, when I was talking to the fellow, I looked it up on Amazon: you can get a perfectly reasonable 75 inch, not a stellar one, but a decent one, for 600 bucks. And so that product is just going… The same’s true for almost everything; category after category, you’ve seen that trend.

I’m thinking that the super info agent service starts as a very elite product, maybe $500 a month per user, and for $500 a month, we’ll go and negotiate on your behalf and get a hook right into Twitter and Facebook and Instagram and Reddit, and we’ll have access to the fucking graph, because we can pay them much more than the person is worth for advertising, and we can just start out in that brute force fashion. As it grows and we have economies of scale, things come down; once we have millions of customers, we keep reducing the price we offer to Facebook until they eventually say, “Go away.” We go, “Okay, that’s fine.” Now we have our own ecosystem, and that’s at least one possible way to get there. Now you start with 10,000 users paying you $500 a month, or $6,000 a year, on average. So that’s a $60 million business, not bad with 10,000 customers. And I guarantee you there’s 10,000 people in the world for whom $500 a month would be cheap if you could really deliver the really good information agent.

Alex: It could be a possible way though. You are banking here on the fact that you will go to Twitter and you will tell them, “Give me access to your graph and to everything you’re producing because I intend to replace you in two years.” And they will say, “Yeah, but you won’t replace us, so just give us the money.”

Jim: No, I’m not going to tell them that. I’m just going to go talk to Elon. The only way I would do this is sit down personally with Elon and pitch him and say, “Hey Elon, you hate advertising. You hate those cocksuckers. Those are the most miserable sons of bitches in the world.” He hates advertisers by all accounts. He does no advertising for his other businesses. What’s your economics? Well, I looked it up. Turns out the economics of Twitter is about 25 cents per user per month in advertising. It was a dollar before he kind of pissed everybody off, so let’s say it’s a dollar. Facebook, it’s $2 a month in advertising. I’ll pay you 10 bucks. Now what kind of person would not take, in his case, 10x the revenue density for a user? It would just be a stupid decision tactically not to take it, even if I am going to skin him later. And I’m going to predict that he’ll say yes.

Alex: It’s possible. It’s an interesting idea. So there’s a professor at UMass Amherst, Ethan Zuckerman, whose team is actually working on an adversarial interoperability approach: basically trying to get hooks into these platforms without asking them, trying to get the information out, and then allowing the user to have a client that aggregates all of their social media in one place. Now obviously that works for some services and doesn’t work for others. For Twitter, it actually worked until last February and then it stopped working, and we know that because we also had our own bots on Twitter at the time, and now we don’t.

Jim: Yeah, screen scrapers work till they don’t work, and the platforms are getting better and better at detecting them. Someone suggested, “Well, why don’t you just do screen scraping and just have an army of Nigerians tweaking the code every hour to keep your screen scrapers working?” I go, “I’m afraid we’ll lose that arms race.” You can automate the other side of it. It’s easier to automate the scraper detector than it is to fix the scraper every time the site changes.

Alex: And it’s not even just that; you also have a resource discrepancy. So I remember the fight that happened, what is it, four, five years ago, between Adblock Plus and Facebook?

Jim: Oh, yeah.

Alex: Where they were fighting over whether or not the ads were going to get blocked, and Facebook just kept changing the code so Adblock Plus wouldn’t work, and eventually they had to make a deal. Because there’s no way that a non-profit organization, I think they have some sort of a hybrid structure now, but they were a non-profit at the time, could compete with Facebook’s army of engineers who just randomly change the code every once in a while and the ad blocker stops working.

Jim: Oh yeah, that was my point, why screen scraping is not actually a scalable idea. You can do it at a small scale, but if you get big enough to be economically interesting, they have the horsepower and the expertise to make it exceedingly difficult. So I would say, at least for the elite product, just go straight to them and make them an offer they can’t refuse.

Alex: It’s an interesting approach. I’ll consider it, run it by the guys, and we’ll see what happens. For now, we started off as more of a product for all the common folks, where it’s free, it’s ad free. Our investors are basically footing the bill for now, and we’re just trying to give everybody, hopefully as many people as possible, access to good information and see how we can leverage that later. Maybe later on we leverage that for B2B products, where we go to journalists and tell them, “Hey, do you want to write articles that our algorithms actually approve of? Here’s a way that you can check it while you’re writing.” So that’s a part of the direction, but it’s an interesting idea to essentially make interoperability paid.

Jim: Now, one quick question. Maybe I didn’t go far enough, but everything I clicked on, I actually got the full article. Historically, the bane of these kinds of curation sites has been paywalls. How do you deal with paywalls? Do you only deal with non-paywall sites?

Alex: Yeah, we blacklisted every website with a paywall from the system.

Jim: Which is essentially more sites every day.

Alex: Again, eventually when we’re big enough, we can probably make a deal with those guys, but for now we are not big enough to come there and negotiate with them, at least in any kind of balanced way. And so our approach has been: if you have a paywall, that tells us you don’t like to share. We respect that; we will not touch your content.

Jim: Every day the ecosystem’s evolving the other direction. At least that’s what it seems like to me. Even sites that used to give you three free articles a month, now they’re not giving you anything, goddammit. Of course, they probably figured out that those of us who have things like VPNs could keep getting as many free articles as we want. Not that many people are that clever to game the system.

Alex: I think for most of these companies, they’re essentially committing suicide by doing that. So they’re trying to milk the cow faster, but the cow is going to die earlier. So when they block things using a paywall, over time they train people not to go there. And so fewer and fewer people will go there. Maybe their subscriptions will go up temporarily for now, but ultimately that’s a way for these companies to kill themselves.

Jim: Certainly that would be my thought. They’re short-term optimizers. Back to a point about offering them 10 bucks a user per month. The perverse thing about late stage hyperfinancialized capitalism is it does not know how to think any farther out than three years and mostly about 90 days.

Alex: But here, I think even if they were registered somehow differently, and if they weren’t thinking quarter to quarter, I have to empathize with them and understand that if you’re in the position of somebody, let’s say, Gannett, and your revenue has been going down something like 7% or 8% per year for the past, at least five years-

Jim: More like 20 years.

Alex: Well, maybe, but I’m looking at, let’s say, the past five years: they’ve gone from more than $3.5 billion to $2.6 billion, and now you’re still making $2.6 billion, but your market cap is $300 million, which basically tells you what the market thinks is about to happen. So what do you do at that point? You try to squeeze every penny you have. You can’t afford not to, because the market is already pressuring you. So I don’t hold it against them that they do it; they have to do it. But yeah, it’s probably just accelerating the demise of that model.

Jim: I got to tell you a funny story. I was a senior executive at Thomson, now Thomson Reuters, and one of the things we owned was 150 newspapers. This was back in the late ’90s. And I sat on the executive committee where we questioned the business heads every year, and even in 1997, ’98, I was asking them every year, “What’s your actual growth in subscribers and what’s the trend over the last 10 years?” “Oh, it’s been almost flat,” growing very, very slowly, like 1% a year. Okay, now here’s the key question. I said, “What has been the change in the marginal cost to acquire a new subscriber?” It started out being really cheap, and it was like 20x after 10 years. And then the third one was, “What is the median age of your subscribers?” And over 10 years, it had risen about seven years.

And I said, “Do you realize you’re on a going-out-of-business curve?” And they said, “What? What do you mean? This is a $2 billion business,” blah, blah, blah, blah. And I will say the top brass listened to me and one other guy with our annoying questions enough that we sold our newspapers in 1999, right at the top of the dot-com bubble, when the help wanted ads were blowing out the revenue and it was an extremely profitable business, and we sold them for a really good price. One of the better timing things I’ve ever seen in my business experience. Because today, the newspaper business is basically gone, unless you’re the New York Times or the Wall Street Journal.

Alex: And that’s the top line of that business. But if you look at the bottom line and what they’re spending the money on, I’m looking at it as an engineer primarily. Any event happens, I go to Bing News, let’s say, and I search for the keywords of that event. I get at least 50 articles, all written within the same hour, all describing the same event in different words. And I’m thinking, okay, I get the first two. Why are the other 48 working on it? It seems like duplication of effort by definition.

Jim: Yeah. Why do they exist? And I’ve got another thing that’s very, very similar that I’ve wondered about ever since I was a kid. When you see the president give a speech, there are like 50 photographers there taking pictures of the dude standing at the lectern giving a speech about the most uninteresting topic imaginable. Why the fuck do you need 50 photographers to take that picture? The only interesting picture would be if he happened to have a stroke and keel over dead or something; most of the time it’s just the most stupid-ass picture, Joe Biden going… at the lectern. Why would there be 50 photographers? It’s nuts. There’s something seriously maladjusted about how this whole thing works.

Alex: Well, the irony is, at some point, when the boats were going to Europe and back and they figured there were too many boats doing the same trip, they started the AP as a way to kind of unify all the boats into one. But now Gannett, and I think McClatchy as well, broke their contracts with AP, so they’re not going to work with them anymore. So we’re kind of going in the opposite direction, perhaps. But I don’t know, I think everybody is just trying to survive and trying to find whatever happens to work to get them to the next quarter. So again, I empathize with them. I don’t think that somebody is doing anything maliciously. It’s hard to survive in this market. It’s a tough one.

Jim: Yeah. Late stage hyperfinancialized capitalism is a very rough and rugged landscape, and I think for those guys, about the only play they really have is to go private; that’s what they should do.

Alex: From my vantage point, I think the best play they have is to embrace AI and actually automate away everything that is redundant. Because investigative journalism is hard; you still need humans to go to a garage somewhere, talk to a whistleblower, things like that. But writing is kind of a well-defined skill that can be automated away pretty well right now. And so right now they’re fighting back against it. But the reality is the media industry is the first one that should be embracing it, because this might be the only way it can survive in its current form.

Jim: It’s funny you mentioned that. I have a little startup where we’re using LLMs to write movie screenplays. Something called… I’m not even going to say what, it’s still in stealth mode. But it’s amazing what it will do. And the recent Writers Guild contract makes it very difficult for Hollywood producers to use AI. They basically have to pay the guys twice. It’s all fucked up. Anyway, we were talking to a very big producer last week, and he basically says, “Fuck them writers. If they’re not going to use AI, I ain’t hiring them.” Because he says, “It’s just ridiculous how much more productive they are. I don’t give a shit what the contract says.” A classic crusty old Hollywood dude, and of course they’re going to be using AI; they’re going to have to use it for all aspects of movie production, and probably the cost of movie production will fall by a factor of 10 over the next five years. And anybody that doesn’t go along will basically just follow Gannett on the going-out-of-business curve.

Alex: And realistically, look, AI is at different stages of development in different kind of subclasses of it. But the large language model is the one that’s most advanced. And what is it good at?

Jim: Language?

Alex: It’s good at guessing the next word. Basically just writing without knowing what it’s writing about. So if you’re using it to try to figure out what to write about, you’re going to get hallucinations. That’s a bad idea. But if you already determined what you’re writing about and all the facts are laid out and you just need something written up well describing those facts, it’s just as good as a human. That’s how it was trained by definition.

Jim: And then, in particular, there’s the art of being able to give it the right prompts. So instead of being a writer, your whole writing room is prompt engineers, prompt engineers with some sense about what’s newsworthy and what’s important. You could probably reduce your staffing by a factor of five, at least, and it’s getting better every day.
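[Editor’s note: a minimal sketch of the “facts in, copy out” workflow being discussed here. The facts, the template, and the generate() stub are illustrative placeholders only, not Otherweb’s actual pipeline; any LLM API could sit behind generate().]

```python
# Illustrative sketch: turning a list of pre-verified facts into a writing prompt
# for an LLM. The facts below are invented examples; generate() is a placeholder
# for whatever model or API you choose to call.

FACTS = [
    "City council voted 5-2 on Tuesday to approve the new transit budget.",
    "The budget allocates $12 million to bus route expansion.",
    "Opponents cited concerns about property tax increases.",
]

PROMPT_TEMPLATE = """You are writing a short, neutral news item.
Use ONLY the facts listed below. Do not add, infer, or embellish anything.

Facts:
{facts}

Write three plain, non-clickbait paragraphs summarizing these facts."""


def build_prompt(facts: list[str]) -> str:
    """Assemble the writing prompt from facts a human has already verified."""
    bullet_list = "\n".join(f"- {fact}" for fact in facts)
    return PROMPT_TEMPLATE.format(facts=bullet_list)


def generate(prompt: str) -> str:
    """Placeholder for an LLM call (hosted API or a fine-tuned local model)."""
    raise NotImplementedError("plug in your model of choice here")


if __name__ == "__main__":
    print(build_prompt(FACTS))
```

The constraint in the template is the point Alex makes above: the model is asked to phrase facts it was handed, not to decide what the facts are, which is where hallucination creeps in.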

Alex: Well, I’m in the privileged position where it’s not prompt engineers, it’s data scientists and software engineers on my side, and we’re just fine-tuning the model. So we have a model that doesn’t need to be prompted in some fancy way. It just does things right from the first time.

Jim: Yeah, yeah. That’s more difficult though when you’re trying to figure out how to do the news, but that will happen on the other side of it. So here we are in a world being bombarded with all kinds of shit. It’s interesting, there are different kinds of shit. One is what I call truly malicious stuff. Let’s say the GRU: they have a whole unit trying to subvert countries via the internet. I’m sure China has the same. And so that’s malicious content, but then probably the bigger part is the part driven by this money-on-money return loop.

For instance, the analysis I’ve seen about the famous 2016 election, where supposedly the Russians influenced the election: the Russians were actually only semi-competent, and they didn’t hit it all that hard. It had a little effect, but the ones that had the big effect were some teenagers in Macedonia who were doing it strictly for the money.

Alex: I mean, 2016 is a good example: the most widely shared article on Facebook that year was “Pope endorses Donald Trump.” I guarantee you the GRU was not smart enough to come up with that.

Jim: That was probably the kids in Macedonia or their equivalent someplace else.

Alex: Exactly. But that was more than 800,000 shares.

Jim: Yep. It’s quite amazing. Then the other degradation is clickbait. And again, that is the one where even the best, like we were talking about, the New York Times, the UK Telegraph, they’re all in the clickbait business. And if you get down below the very top 20 or so news sources and look at their fucking ads, I mean, the ads are the most ridiculous things I’ve ever seen in my life; it makes the old National Enquirer look respectable.

Alex: But the thing is, the ads, I’m kind of okay with the ads being bad. I know you don’t like them. But I think, okay, good content, bad ads, I can just stomach the bad ads and switch to the good content. So I get angry when I see the content becoming worse to accommodate the ads. And this is what we’re seeing everywhere. And yes, you go to the Daily Mail, you go to their science section, and there is no science there. I guarantee you. It’s a ghost town. There’s no science to be seen. It’s all clickbait. And that’s relatively good? That’s a legitimate outlet? What happens when you go to all the websites that just exist for SEO?

Actually, I had this weird situation a couple of weeks ago where I was going through one of these marketplaces that basically sell you backlinks, because I wanted to understand how that market works. I think it was called the No BS Marketplace or something like that, ironically. And suddenly I see a domain name I recognize, Aura.co, which was a camera company I used to be the VP of engineering of. And so I go there, and I see 300 articles, all written by the same guy, published on the same day. Obviously AI-written. But no, I remember this domain being my camera company that I was building the hardware for.

So I go to the Wayback Machine, archive.org, and I see that four years ago, Aura.co was showing our VR camera. Then the company probably went out of business about a year after I left, and they sold the domain as part of the bankruptcy, I guess, and now it’s this kind of SEO mill.

Jim: Oh, yeah. Again, horrible shit. And again, this is what you get when you allow advertising, because you’ve set the incentive for the world to do this kind of crap. And naturally, that’s what the signal of money-on-money return in our world produces: that horseshit. So has there ever been a historical parallel to this? Has there been another technology change that essentially flooded the world in horseshit?

Alex: Well, I mean there’s been quite a few. I mean, it’s different levels of horseshit, but probably the earliest one we can see is the printing press itself.

Jim: Exactly.

Alex: Where the moment the printing press was invented, we all remember this as this great event because Europe became literate and we got the Enlightenment, but we kind of forget that we got the Enlightenment 200 years later. What we got for 200 years was 52 different religious wars. We got witch hunts where 80,000 women died, and we got inquisitions all over Europe. That was the initial result of the printing press, because now anybody could suddenly write anything, but people trusted what was written. They assumed that if it’s written, it must be true. And we have the same effect on the internet now. My parents send me something and I tell them, “Nope, this looks fake.” They say, “No, but it’s written on a website.”

Jim: It’s on the internet, it must be true.

Alex: And I say, yes, but it’s fake. So I think the same situation happened probably in the 15th century when somebody wrote The Hammer of Witches, that book that launched the witch hunts essentially. I’m pretty sure people read it, or heard from somebody who read it because most people couldn’t read, and they thought, well, it’s written in a book, it must be true, therefore let’s go burn some witches.

Jim: Yeah, the analogy is not bad. Prior to the printing press, it was books copied by monks, very expensive to make a copy of a book. And so the incentives were such that one did not duplicate a book unless it was pretty high quality. If it’s going to cost the equivalent of a thousand dollars to make a book, it had better be a pretty good book. Once you got the printing press, you knocked the price down to the equivalent of $3, but the written word still had the carryover prestige of the past, when the curation was still working. That is actually very interesting.

Alex: So it takes the ecosystem a while to adjust to realizing that something is not trustworthy anymore. We’re seeing this process right now as well. I have an engineer in Pakistan, and they had an election recently. Right before the election, on the day of, he got a robocall from somebody who sounded like Imran Khan telling him to boycott the election. Now, he’s an engineer; he realized Imran Khan is probably not calling him personally to tell him to boycott an election in which he’s going to vote for him, because everybody knows that from the previous election’s voter rolls. But I’m pretty sure a lot of people just didn’t show up because Imran Khan told them not to. And so this is another example of this kind of carryover trust that people have, because previously an audio recording was proof of something in court. It was incontrovertible, and now it’s the easiest thing to fake.

Jim: And of course, earlier, I remember in the ’80s when Photoshop first came out, there was all kinds of concern that the world was going to melt down because you couldn’t trust an image anymore. Though interestingly, the world did not end over that one. I wonder why.

Alex: Well, I think though now you really can’t trust an image anymore. With Photoshop, it was crude enough or it still is crude enough that I can just zoom in and see the problem in the editing process.

Jim: You can see that little boundary between the zones typically, right?

Alex: Well, that’s one. Usually the thing that people mess up the most is lighting. So they would Photoshop a face onto where there used to be a different face, but the lighting on this face is not the lighting in the rest of the image, or the shadow is missing underneath something, or the boundaries are over-compressed, or things like that. But with the way that people use AI right now, the generative adversarial networks, now you’re creating images that really cannot be distinguished anymore.

So I think, coming from the camera space, maybe I have my biases, but I think that the only way right now to tell the difference between a real image and a fake image is if we can convince all camera manufacturers to watermark the images with their own kind of encrypted CRC, and then you can see, basically, okay, this came signed from Canon and wasn’t changed. So I can at least trust that it came from a camera. But unless you do that, then you just have to assume all images are fake by definition.

Jim: Yeah, it’s interesting you mentioned watermarking. People always talk about that as a panacea. I ran into some online tool I wanted to use that could detect watermarks, and it wouldn’t let me do something because of the watermarking; I didn’t own it, blah, blah. So I just took it down, put it in Photoshop, and added a tiny amount of Gaussian noise to the image, and guess what? It couldn’t detect the watermark. And from a human view, it was exactly the same image.
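[Editor’s note: for readers curious what Jim’s Photoshop trick amounts to, here is a minimal sketch assuming NumPy and Pillow. A small amount of Gaussian noise is visually negligible but can disturb watermarks that depend on exact pixel values; whether it defeats any particular detector depends on how robust that watermarking scheme is.]

```python
# Minimal sketch: add visually negligible Gaussian noise to an image.
# Fragile watermarks that rely on exact pixel values can be disturbed this way;
# robust watermarking schemes are specifically designed to survive it.
import numpy as np
from PIL import Image


def add_gaussian_noise(in_path: str, out_path: str, sigma: float = 2.0) -> None:
    pixels = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.float32)
    noise = np.random.normal(loc=0.0, scale=sigma, size=pixels.shape)
    noisy = np.clip(pixels + noise, 0, 255).astype(np.uint8)
    Image.fromarray(noisy).save(out_path)


# Example (paths are placeholders):
# add_gaussian_noise("photo_in.png", "photo_out.png")
```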

Alex: But I mean, look, watermarking can be done in the visual domain the way that you described, I don’t know if you want to get this technical, right?

Jim: Yeah, let’s do it.

Alex: But in general, we can calculate a checksum of the image itself. Then we can encrypt it using a private key that only the manufacturer of the camera has, and we can put that in the EXIF data, not in the visual domain, basically in the metadata that gets attached to the visual part of the JPEG image. And then you and I can all look at it with a public key and verify whether that image has been altered in any way or not. Because if the image was altered, then you decrypt that thing, you check that checksum or CRC, and it doesn’t match the visual contents of the image; therefore somebody altered it. You don’t know how they altered it, but you know this is not what came out of the camera. So that’s probably a solution we’re going to see; my guess is somebody has to convince all camera manufacturers to do it, but it’s doable. It’s much easier than convincing all software manufacturers to do something.
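[Editor’s note: a minimal sketch of the sign-in-camera, verify-anywhere scheme Alex describes, using Ed25519 signatures from the Python cryptography library as a stand-in for whatever a manufacturer would actually ship. Key handling, certificates, and the metadata container are all simplified away here.]

```python
# Sketch of the idea: the camera signs a hash of the pixel data with a private key
# held by the manufacturer; the signature travels in the image metadata; anyone
# with the manufacturer's public key can check that the pixels are unaltered.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- Inside the camera (private key stays with the manufacturer/device) ---
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()          # published for verifiers

pixel_data = b"...raw pixel bytes..."          # placeholder for the image contents
digest = hashlib.sha256(pixel_data).digest()
signature = private_key.sign(digest)           # stored alongside the image (e.g., metadata)

# --- Anywhere else (verifier has only the public key) ---
def image_is_unaltered(pixels: bytes, sig: bytes) -> bool:
    """Recompute the hash and check the signature; False means the contents changed."""
    try:
        public_key.verify(sig, hashlib.sha256(pixels).digest())
        return True
    except InvalidSignature:
        return False


print(image_is_unaltered(pixel_data, signature))            # True
print(image_is_unaltered(pixel_data + b"edit", signature))  # False: contents altered
```

Note that this only proves the bytes match what the camera signed; it says nothing about whether the scene itself was staged.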

Jim: Now, I came up with another hack solution for this. Did you ever read the book Stranger in a Strange Land by Robert Heinlein?

Alex: Yep.

Jim: And remember the concept of the fair witness? The fair witnesses were this guild of essentially priests who would appear at any event, and they would memorize everything that was said, that part is not relevant to this, but they would also testify to the fact that it actually happened. One could imagine a simplified version of fair witnesses who would attend anything that was being filmed or photographed or recorded orally; they would also take a recording of it, and they would post it under their own name, and they would certify that this event actually happened, and they would provide a link to the fair witness site that the publisher or the owner of the artifact could circulate with their artifact. Of course, they’d pay the fair witness five bucks or 10 bucks or something, and there would be a cross-check, so you could then see that this actually happened the way that it occurred, presuming that you could trust the fair witnesses. And in the book, the fair witnesses were this cult that nobody questioned.

Alex: Two caveats that you have to add with that. First of all, in many cases, the videos that we are trying to verify whether they are real or not are the ones that were leaked against the interest of the person who was filmed. So if you remember, let’s say the Mitt Romney 47% video, Mitt Romney wouldn’t have invited a fair witness to that event to attest to the video being true. It was in his interest to be able to claim that the video is fake. But the video was leaked and it basically tanked his election chances. The second problem with this entire story is that we now have a lot of research showing that two people watching the exact same event don’t remember the same thing about it. So even if you can trust the fair witnesses to do their best, to be fair, they’re not fair.

Jim: Well, what I would suggest is the fair witness would also have their own camera, the footage from which they would post. It would say, this is the fair witness version of this event, either audio or video, and it won’t be identical to the one that the publisher published, but it’ll be close enough that you can see. Anyway, it’s just a hack. It’s obviously not the long-term solution, but it’s a hack that you could do today. And I suspect if someone put the fair witness business out there, there’d be at least a small business for people doing that. You’d have to think through the details a little bit.

Alex: I think it’s a reasonable one, because some of the stuff that you see published is adversarial news, something that somebody else doesn’t want published. But a lot of the things that get published are press releases: the company wants the world to know X. So for that kind of use case, they would absolutely also invite somebody to document the fact that they did something.

Jim: Yeah. Of course, the whole industry of trade press is pretty hilarious. Having been on both sides of that, having worked for a company that had a bunch of trade press and being a startup guy who issued a lot of press releases, an awful lot of the trade press is just very lightly edited or truncated press releases. And I imagine the quality of trade press has probably gone up considerably with LLMs because you can now take a press release and give some pretty nice editorial prompt to Opus 3 and say, “Rewrite this, but in a somewhat elegant fashion,” rather than the tortured language of press releases, and I imagine it wouldn’t cost hardly anything to do. So I imagine the literary quality of trade press has probably gone up.

Alex: I actually had a call with a fellow startup founder that is doing exactly that a couple of weeks ago. They’re still in stealth, so I’m not going to say the name of the company or anything like that, but yes, it’s improving a lot.

Jim: Yeah, that’s kind of interesting. So we talked about the printing press. And of course, the other thing famously about the printing press, I don’t know if this is true or not, but this is what I read on the internet, so it’s probably true, is that the financier who backed Gutenberg wanted the printing press in order to mass-produce indulgences. These were documents by which Catholic priests could give you time off from your time in purgatory in return for a contribution to the church, but it had to have paperwork with it. And so to reduce the cost of the paperwork, the investor paid Gutenberg to invent the printing press and just knock out tens of thousands of the forms for the indulgences, so they didn’t have to be written by hand. So there was another goofy-ass use of a technology.

Alex: I’ll give you a parallel back to our times. Almost every innovation you see on the internet was developed for porn.

Jim: Absolutely. Yeah, absolutely.

Alex: So things don’t change that much.

Jim: When I was CEO of Network Solutions, we had a pretty good idea what the traffic on the internet was. It was never less than 25% porn, let me put it that way. At some times of day, it was higher than that, and that goes back at least to 1992. And of course, it was certainly true of the videotape machine as well: when the very first video stores opened, I would say half their inventory was probably porn, as I recall, in 1987.

Alex: Yeah. So indulgence drives innovation then and now.

Jim: Exactly, exactly, in some sense. And again, this is how money will find a way to subvert the potential good of any channel. I mean, money also does other good things: it routes investments efficiently, it compensates people for innovation. But it also incents foul behavior.

Alex: I tend to be kind of an optimist and think that it is a good denoter of the value that you provided to others in some kind of measurable way. But yes, the problem is that people get addicted to stuff, and then if you provide to them what they want, you’ve actually done something not good, and you get money for it. And I’ve heard a lot of people say that the best way to make money as a startup founder is to build companies that cater to one of the seven deadly sins.

Jim: In fact, I had a little venture capital firm once, three partners, and we called ourselves FLG Partners, and FLG stood for fear, lust or greed. We put this right in our prospectus. If your business plan didn’t appear to appeal to either fear, lust, or greed, we weren’t interested.

Alex: But look, it doesn’t have to be fear, lust, or greed. Twitter is rage, and that works.

Jim: We didn’t claim to be an across the board venture firm. We were specialists in fear, lust, and greed.

Alex: I mean, companies that cater to gluttony do pretty well as well.

Jim: Absolutely. The seven deadly sins are not… We’ll have to get all the initials out there, and that’ll be the next VC firm. But unfortunately, you want a VC firm to be somewhat specialized. You don’t want to try to boil the ocean. The gluttony firm. I like that.

Alex: And this reminds me, in Thank You for Smoking, the movie, there were the angels of death or the merchants of death, I think they were called.

Jim: It’s funny you should mention that, I just sent that book to a relative of mine who recently went to work for Philip Morris. Oh dear. Let’s change directions a little bit and talk about addiction. And this is actually, if we read people like Jonathan Haidt: social media addiction, well, there are some good things that come out of it, but it’s also doing some serious harm to people in terms of mental health measures, particularly for young people and particularly for young women. What do you know about social media addiction?

Alex: Well, I think the reason that social media affects young women more is that young men are affected by porn at the same time. And so it is just two different addictions affecting those two different populations. But I think Jonathan Haidt is actually a great source; he’s much better than me at articulating the data that we have on how it affects people. But I just see what the process is like and how it works from the side of the industry. I see that things are developed in order to be as addictive as possible. Because ultimately, the goal of almost every algorithm that social media companies develop is to maximize engagement. And engagement is getting something out of you. It’s not giving you something. It’s getting your attention, getting your shares, getting your clicks, getting anything.

And the way that you get a person to give you something over and over and over again is to create compulsive behavior. So the industry is becoming really good at it, and it’s almost hard to imagine anybody being better than TikTok at it, but I’m willing to bet money that somebody will be better than TikTok at some point, because TikTok was better than Instagram, Instagram was better than Facebook, Facebook was better than MySpace. It just keeps improving from the industry’s perspective, but becoming worse and worse from ours.

Jim: It’s funny you should mention that; we seem to have an awful lot of very similar thoughts. I kept hearing about this goddamn TikTok, and I said, I’m going to have to go check this out. And this was in February 2022. And as I said, I’ve got 40-plus years of designing, investing in, and running online products. And within five minutes, I said, “This is the most brilliant product ever created.” And I immediately said to somebody, “This is online fentanyl.” And this thing is so brilliant that it blows away anything I’d ever seen before or even imagined. And here’s another thing, very scary, this is how addictive this thing is. I’m a person who very highly values punctuality, being on time. I’m almost never late for anything. A family value taught to me by my parents. I taught it to our children; they’re very conscientious, punctual.

Third time I used TikTok, I had a Zoom call set up, pretty important Zoom call, and I had 10 minutes to kill before the Zoom call so I picked up TikTok and said, “Oh, let me do…” This is my third session on TikTok. Let me do a little bit more TikToking here. Well, guess what? When I looked up, it was 10 minutes after the hour, and I was 10 minutes late for my Zoom call. Something I never ever do, at least not without warning the person, “I’m running 10 minutes behind.” And that’s essentially the equivalent of being a degenerate drunk in my book that Jim Rutt would be late and not even know it. And ever since then, I’ve been saying, “You’re better off giving your ten-year-old kid cigarettes than you are giving your ten-year-old kid TikTok.”

Alex: No, cigarettes are probably easier to quit, especially now. So yeah, it absolutely is the case. And I remember when Facebook just crossed the 20-minute barrier, they were able to keep people on Facebook for 20 minutes. They were celebrating this, it was public. Everybody in the industry knew a milestone has been reached. I think TikTok is at 90 minutes a day now.

Jim: After four sessions, I deleted TikTok from my fucking phone. And I will just say that listeners of the show have heard this a million times, but if you’re a parent, get TikTok off your kid’s phone, period. For the other things, the net pluses and minuses are arguable, but TikTok is just a foul fentanyl machine, and no responsible parent should ever let their kids touch TikTok.

Alex: But I think in general, if you look at Jonathan Haidt’s research, you’ll see that also the age at which a kid gets first exposed to it matters quite a bit. So if you’re able to postpone the moment at which they will get exposed to these addictive things, then you’ve probably eliminated most of the risk.

Jim: At least a lot.

Alex: Not that adults don’t have any risk, but at least the mental health effects. Adults at least know there are other ways to communicate and then they get addicted to this one. But if being addicted to this one is the first and only way that you can communicate with the world, then you’re kind of screwed.

Jim: And of course, that’s also true of nicotine. And the cigarette companies know that. When I was a kid, a lot of people started smoking when they were 13 or 14. I would say by the time they were 15, maybe 30%, 40% of the kids in my fairly rough hometown were smoking ciggies. And a lot of those people that started at that young age were never able to kick. Those who started later when they were in college or even after college, most of them eventually quit smoking cigarettes. And the research does show the earlier you start smoking cigarettes, the deeper groove it wears into your brain and the harder it is to quit.

Alex: Yep. And I think my brother started smoking when he was 22 and quit when he was something like 32. So yes.

Jim: Yep. So that fits. Yep.

Alex: Within my family it all fits. And I personally have never smoked anything other than the occasional cigar and maybe a hookah water pipe.

Jim: I probably smoked two packs of cigarettes in my whole life, so I never got… I used to smoke a bunch of cigars, but I stopped that about 10 years ago. So what do you think? Does a decent society say no TikTok, even though the money-on-money return loop, that signal, says, “Profitable as fuck”?

Alex: Yeah, I don’t know that society is able to say no TikTok, at least when we have the First Amendment. But we can restrict TikTok for people who are under a certain age; that would certainly fix a part of the problem. And that seems like something we should be allowed to do within the existing Constitution.

And we might be able to put some other limitations on. I don’t know if we want to go as extreme as to say, basically, you have to put everything in reverse chronological order and you cannot order things in a way that gets people more addicted. That would be extreme. But in theory, we should be able to do that, at least legally. So those are the options we have. I don’t know which one is right. I would generally be very cautious when it comes to government interventions because, again, born in the Soviet Union, I’m not a big fan of the government trying to intervene. But in this particular case, saying kids under 18 shouldn’t use Instagram or TikTok or anything like that, I’d be all for it.

Jim: And of course, since we have the analogy with tobacco and alcohol, you could make the age 21. Both of those are 21 now, which I think is kind of stupid. When I was young, the drinking age was 18, and we started drinking when we were 16, typically. Didn’t do us any harm, did it?

Alex: My father poured me my first glass of wine when I was in the fifth grade.

Jim: Well, that was Russia. We know they’re a bunch of fucking drunks over there, right?

Alex: I was in Israel at the time, but yeah, so it was still assumed that wine for a celebration, that’s fine.

Jim: Yeah, exactly. Yeah. The Jews and the Italians got that one. They know how to drink for celebration with food, not just to get drunk.

Alex: I mean, drinking has to be a social process, otherwise you’re just an alcoholic.

Jim: That’s the Russian thing. You and your buddy and a bottle of vodka.

Alex: Well, not all Russians. I’ve seen Russians that don’t even need a buddy, and it’s just bottle of vodka after work, and then just lie on your bed for the next six hours staring at the ceiling, and then change clothes and go to bed, and that’s the entire day. Repeat over and over and over. So once you’re in middle Russia, the place that I didn’t even get to visit until I was 25. But in those places, it’s pretty rough. You have the alcohol and therefore you don’t need a buddy. I guess, kind of like TikTok.

Jim: Yeah, exactly. I said, “TikTok, oh my. It just drives me crazy.” Just a total aside, totally apropos of nothing we’ve talked about, but I find it very interesting that Putin is a teetotaler, has never drank at all.

Alex: So is Trump.

Jim: And so is Biden. Which is extremely unusual for an Irish Catholic. I come from an Irish Catholic background, and they’re almost as bad of drunks as Russians, let me tell you. But Biden has never had a drink in his life. Trump has never had a drink in his life. And I don’t know if Putin’s never had a drink in his life, but he hasn’t drunk at least since he was a young man. See what happens when people don’t drink?

Alex: Well, I mean, if you look at every study on alcohol, the mechanistic studies show it does damage to your body at any quantity. So even one drink per week does some damage to your body. But when you look clinically at basically how the entire population is doing, then you see that the people who live the longest are the ones who actually have about one drink a day on average. Because yes, maybe mechanistically it’s better not to drink, but if you don’t drink, then either that ruins your social life and then you become a psychopath. Or maybe there’s something wrong with you to begin with.

Jim: Yeah. Like I say, Putin, Trump and Biden. There we got three.

Alex: Those are three pretty strong exhibits.

Jim: Pretty strong exhibits. Kids, drink, but don’t look at fucking TikTok. That’s worse than cigarettes.

Alex: Drink, but only good wine, one glass at a time, please.

Jim: My father did teach me some good drinking rules. Never drink alone; never drink before noon, which I later amended to 5:00 PM; never drink to stave off a hangover, which I have never done in my life because of how I was raised. I’m absolutely convinced if you do it even once, you’ll instantly become an alcoholic. And several other really good rules on how to drink. I wish more people taught their kids how to drink intelligently. Anyway, back onto our topic. We’ve talked about a number of negative and bad things. Maybe you could talk a little bit about your visions of the future that might be more positive.

Alex: Well, I think in general, we discussed already that AI, it’s basically a tool. It can be used for good things and for bad things. Just like a kitchen knife can be used to make a salad or to stab somebody. So I think that AI opens a lot of possibilities that are really great. I mean, when we look at this entire ecosystem of news, for instance, a lot of it is junk. A lot of it is biased. A lot of it is created intentionally to manipulate populations. You mentioned GRU agents, et cetera. Wouldn’t it be nice if you had an entire ecosystem that basically connects facts to humans with everything in the middle being open source.

So you can inspect the code and see that nobody injected any bias or any manipulation in there, and you don’t have to wonder whether I can trust this person or not. The code is there. You can just look at it. So that’s a possibility. That’s probably best case scenario. I’m not sure we can get there, but that just shows you technology could be used for bad, but it could also fix an existing problem that we’ve been unable to fix otherwise.

Jim: I’m still pretty convinced if you run the industry on advertising, that’s never going to happen because they need to manipulate you and they need for you to not know how you’re being manipulated.

Alex: But think about this as a way to bind your future self. Most companies, when they start, they start out with good intentions, and then eventually the basic C Corp structure kind of catches up to them and they have to maximize shareholder value. But at the phase at which they have good intentions, they can bind their future selves.

So we, for example, chose to register as a public benefit corporation. Why? Because then we can put a mission in our incorporation papers. In our case, it’s improving the quality of information people consume. And that mission is binding. So now I cannot just ruin the quality of information for the purpose of making money, that would essentially allow my shareholders to sue me.

Another thing we did is we opened up a lot of our models, not as open source, but as source-available. So people can look at them, audit them, test them; they just can’t use them in their own products. But that also is a way to bind our future selves. If somebody who isn’t me in the future decides to add a left-wing or right-wing bias to some model, they’re either going to have to close the source of the model so people don’t see it, and people would notice that the source has suddenly been closed even though it was open before, or they would need to change it for the world to see, and everybody will see that now there’s this bias in the model or in the data set. So that’s another way to bind our future selves.

If we create kind of an ethos, at least within small companies, within startups, of doing things that don’t just sound good now but that kind of force you to be good later, because otherwise people will notice that you’ve changed, I think it would be good, because those companies eventually will become the next Google and Facebook, et cetera. But they will be bound by what they believed when they were young and didn’t have investors yet.

Jim: You’re like OpenAI, or Google’s “don’t be evil.” So are you guys actually a B Corp, or did you do another form of benefit corporation?

Alex: A public benefit corporation. So B Corp is essentially a private certification after you’ve already registered as a public benefit corporation, where you submit paperwork and basically become, like, certified organic. But in our case, we are a public benefit corporation, so the state of Delaware knows what our mission is and it’s in all the public papers, but we didn’t have the funds to get the third-party certification.

Jim: Now, the other area in this kind of nuevo governance is what happens if you get acquired; then your corporation goes away. Do you have a stewardship capital model, where you issue a golden share that must approve any acquisition and give it to a trustworthy third party?

Alex: No, we didn’t do that. First time I hear about it. Sounds interesting, but I didn’t know about it until now.

Jim: It’s called stewardship capital. It’s actually very important. Because otherwise, somebody acquires your assets. That’s typically how acquisitions… Not you, you’re a nice fellow, but the person that takes over from you, via an internal coup d’etat or something, which happens all the time, sells out to Meta Corporation, and the next thing you know, all your shit is corrupted, because you no longer have a corporate existence and they haven’t violated any rules.

Yeah, look up stewardship capital. Here’s the essence. There’s one golden share of essentially a new class of stock, let’s call it S class for stewardship, and the bylaws of the corporation are written such that no acquisition may be done without the approval of the S share, period. And the S share is given to a very trustworthy person, whoever you think is trustworthy, and they are able to then transfer the S share to whoever they want, who they believe to be trustworthy as they age out, et cetera. It’s very important to close the loop with that.

Alex: Yeah, I mean, obviously you still have the problem of who’s watching the watchers. So if you can be bought and maybe this very trusted person that you gave the share to can be bought in the same way.

Jim: Of course.

Alex: So my only hope here is that, like Epictetus said, “Don’t sell your character, but if you do by God, don’t sell it cheap.”

Jim: Exactly. At least it makes it more expensive to make the corruption happen.

Alex: So ultimately everything can be corrupted with enough money. You can just add a bit more difficulty to that process.

Jim: Exactly. Exactly. I guess a final question. When you do a thing like a benefit corporation, and particularly if you do something like a stewardship equity rule, that’s a pretty big turnoff for investors. How have you been able to round up investors for this project with the benefit corporation in place, and have you found that to be a resistance from investors?

Alex: I’ll find out soon because what we’ve done until now is instead of going to regular VCs, we actually did a crowdfunding route, and so we have more than 2000 investors who each invested on average 800, 900 bucks. That’s been most of our funding so far. So we didn’t do that for the money. We mostly did that because we kind of suck at building a community around stuff. Because we’re all engineers and barely talk to people. And so I figured it would be great if our investors were also our community, and just by raising money from 2,000 regular people and not one or two people in Silicon Valley, we created a community that now gives us feedback on the product, the roadmap, things like that. But we’ll start to talk to regular VCs probably in the near future at some point. I mean, right now we’re still pretty well funded by the crowdfunding process, but if we want to accelerate, and we probably do given how fast things are moving, then I’ll find out what VCs think about the public benefit corps soon.

Jim: It’s funny, I’ve never known anyone that did a B Corp to ever go out and raise VC. So when this happens, I’d love to hear about the response.

Alex: I know a few companies that did that, including a few here in Austin, so I know it exists. I know right now amongst startups, B Corps are about 8%. So it’s not huge, but it’s not just-

Jim: Okay, well, that’s good.

Alex: … Yeah, one in the entire country. We’re not the only ones. So they raised VC money. Technically, it is the same from the VC’s perspective because the paperwork is identical. There’s basically just two extra lines in the incorporation certificate. Everything else is about the same. But I agree. We’ll find out if that’s a turn off to them because they think you care about something other than money, therefore you’re univestable. I hope not.

Jim: Or you’re crazy or something. Anyway, I want to thank Alex Fink for an extremely interesting conversation. Check out his Otherweb at Otherweb.com. I’ve been playing with it for a couple of days. I tentatively like it. He also has a podcast, The Otherweb and I actually listened to an episode. So Alex, thank you very much for being a guest on the Jim Rutt Show.

Alex: Thank you so much, Jim.

Jim: Alrighty.