This Neural Network Can (Maybe) Start a Novel Better Than You

As the end of NaNoWriMo draws near, take a look at one researcher’s effort to help find that perfect first line

Janelle Shane's neural network needs a lot of first lines before it can teach itself to write good ones. Pexels

One month. 50,000 words. A global community. Hashtags on countless social networks. That’s NaNoWriMo.

National Novel Writing Month is a struggle for most participants: After all, cranking out an entire novel in just thirty short days in the dark of the year is quite the task. One expert in machine learning set out to help, and her project has kicked up some interesting results.

Janelle Shane was hoping to give NaNo participants a start by training a neural network to write the first sentence for a novel. “All I have to do is give the neural network a long list of examples and it will try its best to teach itself to generate more like them,” she writes on her blog. She incorporated “a couple hundred of the most famous first lines,” but it wasn’t enough for the network to learn from. The results of this early attempt were pretty absurd, like this example:

The snow in the story of the best of times, it was the season of Darkness, it was the season of Light, it was the epoch of belief, it was the worst of times, it was the season of Light, it was the season of Darkness, it was the season of Light, it was the season of Light, it was the season of Darkness, it was the season of exploding past of Eller, and Junner, a long sunset side of the World.
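Shane doesn't spell out her exact tools in the blog post, but the workflow she describes (hand the network a long list of example first lines and let it learn to produce more like them) is classic character-level text generation. The sketch below shows roughly what that looks like in PyTorch; the tiny corpus, the `CharLSTM` model, and the training settings are all illustrative assumptions rather than her actual setup.

```python
# Minimal character-level text generator: learn to predict the next character
# from a list of example first lines, then sample a new "first line".
# Corpus, model size, and training loop are illustrative, not Shane's setup.
import torch
import torch.nn as nn

# Toy training data; in practice this would be thousands of submitted first lines.
lines = [
    "It was a dark and stormy night.",
    "Call me Ishmael.",
    "It was the best of times, it was the worst of times.",
]
text = "\n".join(lines)
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

class CharLSTM(nn.Module):
    def __init__(self, vocab_size, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, 32)
        self.lstm = nn.LSTM(32, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, x, state=None):
        x = self.embed(x)
        x, state = self.lstm(x, state)
        return self.out(x), state

model = CharLSTM(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train: at every position, predict the next character from the ones before it.
data = torch.tensor([stoi[c] for c in text]).unsqueeze(0)
for step in range(200):
    logits, _ = model(data[:, :-1])
    loss = loss_fn(logits.reshape(-1, len(chars)), data[:, 1:].reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sample a new line one character at a time from the model's predictions.
idx = torch.tensor([[stoi[text[0]]]])
state, out = None, [text[0]]
for _ in range(80):
    logits, state = model(idx, state)
    probs = torch.softmax(logits[0, -1], dim=-1)
    idx = torch.multinomial(probs, 1).unsqueeze(0)
    out.append(itos[idx.item()])
print("".join(out))
```

With only a handful of lines to learn from, a toy model like this mostly regurgitates and garbles its training text, which is essentially the problem Shane ran into before she started collecting submissions.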

The more input you give a neural network, the better its output tends to be, so Shane solicited help from the NaNo crowd, asking for more first lines. With just two days to go, she's received a total of 11,135 submissions. The results so far include original first lines from would-be novelists alongside openers from the likes of Terry Pratchett, Diana Wynne Jones and Stephen King, as well as infamous online smut author Chuck Tingle. She's planning to publish the results after NaNo ends but has kept a running series of comments on Twitter about the project's progress.

Yesterday, she announced that she had downloaded the data set and started training her AI. The early results are... mixed: 

I was the bad door thereby the edge had a can. 

Her emlage were playing the three of the fible of the Sinnia Ously of St, in the hole is his life in a moist king.

This isn’t Shane’s first foray into training neural networks to produce new ideas. Check out her recipe experiments or her sometimes-sweet neural network-generated pickup lines (“You are so beautiful that it makes me feel better to see you.”).

“Shane–an industrial research scientist with a background in laser science, electrical engineering and physics–describes herself as a hobbyist when it comes to machine learning,” writes Jacob Brogan for Slate. “She thinks of her work as a form of ‘art and writing.’ Nevertheless, the output of her networks is typically silly and charming in equal measure, partly because it often fails spectacularly.” Hopefully, this new project will produce some successes; after all, plenty of would-be novelists are probably already planning for next year.
