# Three Very Modern Uses For A Nineteenth-Century Text Generator

Andrey Markov was trying to understand poetry with math when he opened up a whole new field of probability theory

Some of the algorithms that underlie commonplace technology today have their roots in the nineteenth century, like the Markov chain.

The brainchild of Andrey Markov, who was born on this day in 1856, Markov chains are a way of calculating the probability of a sequence of events. As an example, consider how your iPhone predicts what you’re going to type next. The phone knows what you just typed and makes an educated guess about the next word based on the probability of certain words appearing next to each other.

Although the algorithm behind cell-phone predictive text draws on the ideas underlying Markov chains, it’s more complex than what’s being discussed here. That’s partly because the user, not the algorithm, picks the next step in the chain.

A “true” Markov chain would calculate what you are going to type next based solely on the last thing you typed, with no human input (much like the “middle-button game,” in which you tap the next suggested word over and over until the phone generates a “sentence” of sorts).
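That “true” chain is simple enough to sketch in code. The following is a minimal, hypothetical Python sketch (not the algorithm any phone actually ships): it tabulates which words follow which in a sample text, then walks the table at random, so each next word depends only on the current one.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    words = text.split()
    chain = defaultdict(list)
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=10):
    """Walk the chain: each next word depends only on the current word."""
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: no word ever followed this one
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

# A toy corpus; repeated words give the chain real choices to make.
sample = "the cat sat on the mat and the cat ran"
chain = build_chain(sample)
print(generate(chain, "the"))
```

Because duplicates are kept in the follower lists, common transitions are picked proportionally more often, which is all the probability bookkeeping a basic chain needs.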

“Markov chains are everywhere in the sciences today,” writes Brian Hayes for American Scientist. They “help identify genes in DNA and power algorithms for voice recognition and web search,” he writes. For instance, Google’s PageRank algorithm relies on a very large Markov chain over web pages, according to Hayes.

But Markov chains aren’t just essential to the internet: they’re on the internet for entertainment purposes as well. Although it’s uncertain how Markov himself would have felt about these uses of his idea, take the Markov chain for a spin and see what you come up with.

## Write a poem

Write like any author you admire with Markomposition, a Markov poem generator. Input some text: the samples provided by creator Marie Chatfield include non-copyrighted works such as the Declaration of Independence and Grimm’s Fairy Tales, but you can use whatever you want. Chatfield suggests that more text produces better poems, as does text with plenty of word repetition.

## Compose some fanfiction

Markov chains can help write prose as well as poetry. Jamie Brew, a writer for the parody site ClickHole, has created a predictive-text generator that works on Markov-like principles to write fanfiction and other things. Like cell-phone predictive text, it’s not a proper Markov chain, since the user selects each word, writes Carli Velocci for Gizmodo.

“[It’s] like a choose your own adventure book that’s running on autopilot,” Brew told Velocci. Take a look at his classic “Batman Loves Him a Criminal” and try it yourself using the source code (or, for that matter, your phone’s predictive-text interface).