
This Neural Network Learned The Art Of Storytelling From Taylor Swift

Artificial neural networks, a form of artificial intelligence loosely modeled on the way neurons work in the human brain, have made some impressive advances lately. Brought to mainstream attention in part by Facebook’s use of them to identify faces in photos, neural nets can now do everything from writing by “hand” to mimicking the style of Picasso paintings to giving a crash course in how to take the perfect selfie.

Now, two machine learning researchers, Samim Winiger and Ryan Kiros, have trained a neural network to analyze images and generate little stories to serve as captions. And who better to teach a machine the art of storytelling than Taylor Swift and a handful of romance novelists?

To test the neural storyteller, Winiger ran 5,000 randomly selected images from the web through two pretrained models: one trained on the sugary lyrics of T. Swift, and the other trained on 14 million seductive passages from a diverse array of romance novels. The results, as you might imagine, are hilarious.
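For the curious, the core trick described in the project’s write-ups is, roughly, simple vector arithmetic: encode an image’s caption into a sentence vector, subtract the average “caption style” vector, add the average “romance novel” (or Taylor Swift) vector, and decode the result back into text. The snippet below is a minimal sketch of that idea only; the encode() and decode() helpers are hypothetical stand-ins for the pretrained encoder and decoder models, not the project’s actual API.

import numpy as np

# Minimal sketch of the "style shift" idea: nudge a caption's sentence vector
# away from caption-style and toward romance-novel (or Taylor Swift) style,
# then decode. encode()/decode() are hypothetical stand-ins for real models.

def encode(text):
    """Hypothetical sentence encoder: returns a fixed-size vector for a sentence."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=2400)  # toy fixed-size sentence vector

def decode(vector):
    """Hypothetical decoder: would generate a passage from a sentence vector."""
    return "<generated passage in the target style>"

def style_shift(caption_vec, caption_mean, style_mean):
    """Subtract the average caption 'style', add the average target-corpus 'style'."""
    return caption_vec - caption_mean + style_mean

# Averages over a few toy sentences stand in for the real corpus statistics.
caption_mean = np.mean([encode(s) for s in ["a man in a suit", "two dogs on a beach"]], axis=0)
romance_mean = np.mean([encode(s) for s in ["His heart raced.", "She turned away in tears."]], axis=0)

image_caption = encode("a man in a suit smiling at the camera")  # would come from an image-caption model
print(decode(style_shift(image_caption, caption_mean, romance_mean)))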

Here’s Lloyd Blankfein, CEO of Goldman Sachs, with a few touching thoughts about a business partner:

An alternate history of the Beatles’ Abbey Road crossing, as told by a brokenhearted 22-year-old:

Then there’s the story that manages to make the famous pottery wheel scene from Ghost very unsexy:

While most of these are more nonsensical than they are romantic, Winiger mentions in his Medium piece that this is just a harbinger of things to come:

Neural-storyteller gives us a fascinating glimpse into the future of storytelling. Even though these technologies are not fully mature yet, the art of storytelling is bound to change. In the near future, authors will be training custom models, combining styles across genres and generating text with images & sounds.

In the meantime, enjoy more comedic image-generated stories here.

[via Prosthetic Knowledge]

MM