Democracy in the Age of the Algorithm

[Graph at the top of the post omitted. Credit: Guillaume Chaslot]

There’s something special about the matrix of lies that determined the 2016 election. It’s normal for politicians and their advocates to obscure the truth. Our system evolved to absorb much of the impact of these distortions. What made 2016 unique wasn’t dishonesty; it was a new infrastructure of reality distortion unlike anything in our prior experience.

Data analytics, AI, and concentrated interference by wealthy interests and foreign governments produced a propaganda juggernaut the Soviets would have envied. As our computing technology grows more powerful, there may be nothing we can do to limit the influence of “reality enhancement” on our politics. It is unclear how democracy, and the humanistic values on which it’s based, will weather this storm.

Lies, conspiracy theories and simple ignorance have always pulled at the fringes of democracy. Facing the Catholic Democrat Al Smith in 1928, Herbert Hoover’s campaign fanned one of America’s all-time great conspiracy theories. They convinced many fearful Protestants that Smith would build a tunnel from Washington to the Vatican, enabling a Catholic takeover of America.

Stories like this could be viewed as a form of folklore. The story was organic. It took no special brilliance to identify the power of anti-Catholic bias to mobilize American voters. In a simpler civilization with less concentrated power, we could survive the relatively weaker falsehoods and myths of hand-made, artisanal propaganda. That tunnel story had little if any influence on the 1928 election.

Even 20th century totalitarian states lacked the lie-power that we face. Their weakness was their dependence on centralization and control. A single rogue radio signal could pose an existential threat. Forced to choose a singular narrative and impose it uniformly across an entire population, their engines of disinformation were relatively brittle. They could only preserve their power by staving off complexity, an exercise that left them, in relative terms, artificially poor, artificially weak, and eventually vulnerable to systemic collapse.

What distinguishes our emerging ecosystem of lies from the past is their volume, sophistication and reach. AI-driven engines of misinformation turned loose in the past few years are more than a mere megaphone. A new generation of data-driven propaganda tools can evolve on their own, learning from our feedback which messages are most successful, refining and amplifying those lies. Using AI platforms, even simple ones available for free download, I don’t need to know anything about American racism to exploit American racism. These engines can mine our biases, even the ones we hide, and construct lies to exploit them.
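To make that feedback loop concrete, here is a minimal sketch, in Python, of the kind of selection logic described above: an epsilon-greedy loop that serves competing messages, measures engagement, and shifts its volume toward whatever resonates. Every variant name, engagement rate, and function in it is invented for illustration; this is not the code any actual campaign used, only a sketch of the general technique.

```python
import random

# Hypothetical illustration only: a bare-bones epsilon-greedy loop that
# "learns from feedback which messages are most successful." The variants,
# engagement rates, and function names are invented for this sketch and
# do not reflect any specific platform's actual tooling.

variants = ["message_a", "message_b", "message_c"]  # competing pieces of content
shows = {v: 0 for v in variants}                    # times each variant was served
clicks = {v: 0 for v in variants}                   # engagement each variant earned
EPSILON = 0.1                                       # how often to explore a random variant

def simulated_engagement(variant: str) -> bool:
    """Stand-in for real audience feedback (likes, shares, clicks)."""
    base_rates = {"message_a": 0.02, "message_b": 0.05, "message_c": 0.11}
    return random.random() < base_rates[variant]

def pick_variant() -> str:
    """Mostly amplify the best performer so far; occasionally explore."""
    if random.random() < EPSILON or all(n == 0 for n in shows.values()):
        return random.choice(variants)
    return max(variants, key=lambda v: clicks[v] / shows[v] if shows[v] else 0.0)

for _ in range(10_000):  # each pass = one impression served
    v = pick_variant()
    shows[v] += 1
    if simulated_engagement(v):
        clicks[v] += 1

# After enough impressions, the loop concentrates on whichever message resonates,
# with no need to understand *why* it resonates.
for v in variants:
    rate = clicks[v] / shows[v] if shows[v] else 0.0
    print(f"{v}: served {shows[v]:>5} times, engagement rate {rate:.3f}")
```

Nothing in that loop knows anything about racism, fear, or grievance; it simply amplifies whatever the audience rewards, which is the sense in which these engines can mine our biases without understanding them.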

Thanks to obstruction by Republicans, we’ve been slow to gain insight into the 2016 disinformation campaign, but some details have emerged. Disinformation engines under development on the right since the Bush II years explored new techniques. An existing right-wing infrastructure of fake news factories like Breitbart, phony think tanks, and old-fashioned propaganda tools like Fox News was paired with new, far more sophisticated tactics.

Organizations like Cambridge Analytica explored the use of social media algorithms to manufacture content tailored to individual biases. They don’t appear to have been successful, but the Russians were. With an entire state security project dedicated to the American election, a foreign propaganda infrastructure including outlets like RT and Sputnik, and a long-established hacking infrastructure already prepared to leverage common AI and big data tools, they were a powerhouse.

Strengthened by intelligence gleaned from their American collaborators and data stolen via hacking, the Russians were able to craft particularly potent lies. They could spread those lies through a powerful propaganda engine, leveraging AI tools like their Twitter bots and phony Facebook accounts to hone the lies, pouring energy into the most successful conspiracy theories and distortions.

The computerized firepower directed at our collective limbic systems was so powerful that they got a guy elected president whom you wouldn’t trust to walk your dog. Sure, blame the Russians and the unpatriotic Republicans who collaborated with them, but don’t miss the bigger picture. Breaking up the Russian/Republican project might end that particular campaign, but it won’t disarm this cheap, readily available weapon, just waiting to be used again.

We live in a world that’s growing ever more hostile to unaided, unenhanced human biology. Our machines operate at a speed and scale not merely beyond our capacity, but in some ways hazardous to our health. To date, we’ve leveraged evolutionary tools like Valium and Prozac to help us perform under accelerating demands. We’ve engineered social tools like mass public education to manufacture a human population better adapted to an emerging economy. We’ve created social democracy to buy ourselves time to adapt to declining human value. We’re learning to use chemicals like Ritalin to enhance our performance. New genetic tools like CRISPR offer the potential to modify our core biology, deliberately engineering more adaptable traits. Despite these adaptations, the median human being is steadily losing ground as the capabilities of our synthetic environment grow more powerful by the day.

What happens to democracy when no human can beat a machine at chess, or when wars are fought without soldiers? What does democracy mean when the material success of a civilization depends more on healthy computers than on healthy people?

China is giving us a peek at a potential future, in which algorithms do much of the work once performed by bureaucrats, police, and politicians. Credit ratings are a new development so powerful and sweeping that we forget we once lived without them. Credit already dictates much of American life without any form of central control. Introduce tiny tweaks from political leaders, and a credit rating becomes a new form of government, independent of most human intervention.

“Social credit ratings” now determine a Chinese citizen’s capacity to engage in almost any common activity, including travel. Imagine Yelp for people, where your compliance with the law is a component of your rating, and that’s a social credit rating.

This may sound extreme, but the only real difference between the Chinese system and ours is that our system is lagging behind. Cameras now monitor almost everything anyone does in a major American city. Law enforcement would be practically inconceivable without mass surveillance and mass data collection. Between your credit rating, your feedback on dating sites, your LinkedIn account, and your social media history, algorithms already determine much of your access to the glories of a modern economy and a modern social fabric. Adding government compliance is just a single, perhaps unnecessary layer of data on an already thick digital cake. The Chinese are learning to govern without governors, leveraging some of the same tools used to wreck our democracy to promote order and stability, however Orwellian (or Huxleyan) that order might be.
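To illustrate the “Yelp for people” idea in code: below is a purely hypothetical sketch of a composite score in which legal compliance is just one weighted layer among the data sources mentioned above. The component names, weights, and threshold are invented for illustration and describe neither the actual Chinese system nor any real credit model.

```python
# Purely hypothetical sketch of a composite "social credit"-style score, in the
# "Yelp for people" sense described above. Component names, weights, and the
# threshold are invented for illustration; they describe neither the actual
# Chinese system nor any real credit model.

WEIGHTS = {
    "financial_credit": 0.40,  # conventional creditworthiness
    "legal_compliance": 0.35,  # fines, citations, court judgments
    "peer_ratings":     0.25,  # reviews from other people: "Yelp for people"
}

TRAVEL_THRESHOLD = 0.6         # below this, the system might refuse a ticket sale

def social_score(components: dict) -> float:
    """Weighted average of component scores, each normalized to the 0..1 range."""
    return sum(WEIGHTS[name] * components[name] for name in WEIGHTS)

def may_buy_ticket(components: dict) -> bool:
    """Gate an everyday activity on the composite score."""
    return social_score(components) >= TRAVEL_THRESHOLD

citizen = {"financial_credit": 0.8, "legal_compliance": 0.4, "peer_ratings": 0.7}
print(social_score(citizen))    # ~0.635
print(may_buy_ticket(citizen))  # True -- but one more citation could flip it
```

The only point of the sketch is that once such a score exists, gating an everyday activity becomes a one-line rule; the politics lives entirely in the chosen weights and threshold.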

When humans invented and spread the scientific method, they unleashed a force that would destroy civilizations premised on god-based religions. Humanism evolved out of hundreds of years of mayhem to rule a long era. One could argue, as Yuval Noah Harari has in his book Homo Deus, that the digital age has birthed a new evolutionary development displacing Humanism as our highest, most adaptive value.

I’m not convinced that Harari has the right answers, but in light of democracy’s growing dysfunction, he is certainly asking the essential questions. Election 2016 taught us the superior power of the algorithm over reason. Finding a way to preserve human meaning and value in a data state is the most consequential dilemma we face.

In other words, what comes after Humanism, and the political order it spawned? Does Humanism survive in a world of algorithms, and in what forms? How do we preserve an order premised on human value when the economic importance of each ordinary human being wanes beyond a critical mass? The Chinese have programmed their values into the algorithms governing their social order. Can the same be done with humanistic values?

Algorithms defeated democracy in 2016. Will we be ready for the next round of this fight? Should we, in fact, be fighting this trend at all, or should we be looking for ways to leverage this power toward the preservation of human values?

19 Comments

  1. I think the point is really that all this “information”, absent the skills and discipline of critical thinking, is far worse than useless – it’s potentially dangerous. Fact is, it’s not that far away from your earlier gun analogy.

    BTW – this is no call for ‘information control’, as our friend Dins seems to advocate. The problem there is just who gets to decide? Pravda?

  2. Off topic but related to this general theme.

    I found the “patriotism” display at the White House yesterday totally nauseating. It was actually pure jingoism at its worst. It is probably a winning strategy with the portion of the population that does not understand the difference between patriotism and jingoism. But nevertheless, this is from a man who arranged to get a doctor’s letter stating that he had bone spurs to avoid the Vietnam draft, and then those spurs miraculously went away. And who attacked a Gold Star family.

    I saw a sticker the other day on a steel pole with a quote that perfectly explains the difference between jingoism and patriotism. I quote it below. Unfortunately I do not know the source and at the moment do not have time to research it. Anyway, here goes:

    “Jingoism is to patriotism what idol worship is to a religious practice. Jingoism is all about worshipping lifeless symbols while patriotism is practicing in every moment what those symbols are supposed to represent. Being a patriot isn’t about symbols, flags or anything one could see or touch. It’s on the inside. It’s what you do. It’s how you treat your fellow Americans, no matter how they differ from you. Patriotism is about cherishing where you live and the people you share it with.”

    If any of you recognize this, please reply with the source.

  3. Seems to me, while awash in information (both actual and counterfactual), we’ve become no better at discerning the difference. This is the problem.

    Algos and bots can serve up as much as our collective confirmation bias can consume. While the raving of Trilateral Commissioners and John Birchers has always been with us, its sheer density can be daunting today. Much, but far from all, of this is driven by social media. About the only measure a free society can take to counter this is education. We do a monumentally shitty job of teaching clear thinking and objectivity. Those are the only tools to separate truth from fiction.

    Extraordinary claims require extraordinary evidence, as Sagan told us. Bots, algos, retweets, and Facebook forwards provide none of the latter. Teach your children well.

    1. In my day, we called this kind of learning “critical thinking skills.” I agree with you, Fifty, in your assessment. The abundance of information is useless for those who do not make an effort or know how to assess what they are reading and hearing….parsing opinion from fact and making informed, independent decisions from the aggregate process.

  4. Dins, isn’t the real question how we mortals have used the tools of social media rather than any inherent harm from its existence? Sort of like guns – in the hands of responsible people, guns are an important tool for safety and food. Conversely, in the hands of those who use them for nefarious reasons, guns are dangerous devices. I personally think discourse has expanded through social media – although the interpersonal nature has certainly changed.

    A tell on myself – I see so many teens hunched over their phones – in a group – not conversing with one another but totally absorbed in their electronic devices. It falsely leads one to think the ability of these teens to speak intelligently is diminished through lack of practice. With the Parkland massacre, we clearly saw teens who don’t lack for speaking skills even though their communication of choice may be electronic.

    Now, I realize that your deeper point is that the ready availability of social media has encouraged people to abuse it, and this abuse has carried over into how we interact with one another. There is truth in that. Undoubtedly, electronic communication necessitates great vigilance to keep personal information private and a greater emphasis on civility, which is lost in this less personal communication. But we are in an information age and there is no going back, only forward. There will always be those who abuse technology, but it also provides the opportunity for engagement and rebuttal…except when it is used to inflict deliberate harm. Vigilance is key. Someone needs to wake up in White House circles to the fact that with technology there will always be those who will try to use it for harm….We have to guard and protect. Listening, John Bolton? Donald Trump? (Don’t complain of leaks when your phone is not secure.)

    https://www.politico.com/story/2018/06/04/trump-cybersecurity-leader-vulnerable-622813

    1. Mary, I think you have made my point when comparing social media platforms to guns. Sane people want guns heavily regulated, and many people banned from owning/using them.

      Yes, social media is one of the tools of the Net, and yes, in and of itself, neutral. But once humans get involved using it for evil, that is when a hands-off approach must end.

      1. Yes, I am one of those who want sensible regulations while respecting rights of responsible gun owners. That is a different scenario from eliminating social media. Each of us can “limit” our use of media, but I don’t want someone else making the decision for me.

  5. Try to imagine a world without Facebook, Instagram, Twitter, and the various other social media platforms that for the most part did not exist 15 years ago. Outside of the typical “OMG, you can’t stop free enterprise, that is communist!”, the planet would be far far better off if all these media platforms were abolished (Zuckerberg owns at least 3 of them).

    For those old enough to remember, which is pretty much anyone over 30, politics was imbued with something closer to the truth than today. It was still rife with lies, always has been, but total lies were much more difficult to maintain and propagate.

    I for one would be happy to lose my Facebook account if it meant we would have something like what we had even 15 years ago. Of course, hand in hand with that would be re-establishing legislation severely curtailing the concentration of media outlets in the hands of so few.

    Ban all social media. It is one of the horrible, horrible side uses of the Internet.

      1. Chris, do you seriously believe that election cycles were not better 10, 20 years ago? By better, I mean more substantive.

        Do you actually think that social media, in general, has had a net benefit on society? Do you think the puppet tyrant would have been elected, or managed to divert by chaos, as he does every day, if there was no social media?

      2. If your response to an evolutionary challenge is to curl up in a ball and scream “NOOO, I want to go back in time…!” worse things will follow. Yes, sometimes change is intimidating. Survivors look closely at those challenges and identify adaptations which will be successful.

        Otherwise, at least there’s Valium. And frankly, that’s an adaptation that helps a lot of people.

      3. EJ

        As a child of the social media age, I feel that it’s been enormously beneficial for closeted communities everywhere. It’s hard to imagine the modern LGBTQA movement without it, for one example; a lot of forms of domestic abuse and exploitation are far harder when people are less isolated, and the reporting of police misconduct is unimaginably easier.

        This is not to say that valium is a bad idea, of course.

      4. Everybody agrees, though, that there are some communities which deserve to be closeted. (By the way, how was that graph at the top generated?) The democratization and chaos of the Internet form a many-edged sword.

      5. If you are speaking of inappropriate speech, we’d all like the luxury of closeting “certain” groups/individuals (Trump?) but in a free speech society, democracy demands their right to express their views “within limits per their tolerance for being sued for defamation, etc”. It is very difficult for me to listen to some of the ugly, racist, ignorant remarks people make, so I usually elect not to be around people like that. In the public sphere, public sentiment usually prevails (Roseanne)…but then there’s Trump with seeming impunity from crudeness, lying and outright meanness. Slippery slope, this business of containing free speech.

      6. I’m quite for free speech, both legally and socially; while I could in principle be persuaded by arguments to shut out certain voices altogether, I have found that in practice this tends to be executed poorly and that tribal groupthink rather than reasoning has sought to determine the boundaries of acceptability.
        But I had a much broader concept in mind. Conspiracy theorists and cultists have an easier time finding each other and creating a niche online than what they possibly could have had before the net. It’s not only a question of speech; I was musing about how these new technologies aid with a fragmentation of society. It could even be something completely innocuous like a fan base for an obscure band forming in cyberspace when it never could have before. The Internet makes it very easy to form a subculture.
        While I’m at it I’ll clarify my earlier question. How was it decided which videos were represented in the graph?

      7. And Chris, I don’t think you want to phrase your argument against me in terms of an evolutionary challenge. Evolution is the differentiation of species when faced with a changing environment, where essentially mutations turn out to be best adapted to the new environment, and those beings with that mutation thrive. The rest, they die.

        When the environment changes so rapidly, so dramatically, in less than one generation, there is no time for mutations. Everything dies. We could very well be witnessing the end of democracy in the next 20 years.

        When the U.S. adopts the Chinese model, which you have already pointed out the U.S. is careening towards, then the last bastions of democracy will be Germany, Japan, and India. (The U.K. is cruising into an economic disaster with Brexit.) The irony is so rich. 75 years ago one was a military dictatorship, one a military oligarchy, and the third a colony.

        You ask in your article how democracy and humanism will defeat the algorithms in 2020 and beyond. I state one valid way is to do something undemocratic and abolish Twitter, Facebook et al, and to reinstate legislation that bans companies like Sinclair from owning the number of radio and TV stations that they do today.

        Of course, to reinstate such legislation means that the Democrats have to regain power, and have a really strong leader. I see neither of those things happening.

      8. Hmm, we have a chance if the DNC doesn’t screw it up. Howard Schultz is retiring from Starbucks and is showing some interest in politics. He’s young, smart, progressive, and rich. He’s also a very good man and well regarded. Running someone like this against T or whoever the Repub nominee might be makes tremendous sense to me. None of the other Dem candidates I’ve heard mentioned do.
