Can't wait for Martin's next book? An AI has written a continuation of A Song of Ice and Fire, and you can read it here
To give long-suffering fans something to enjoy while they wait, software engineer Zack Thoutt fed a recurrent neural network the first five books of A Song of Ice and Fire, the novels the show is based on, and had it generate five new chapters. Some of the AI-written plot points match long-standing fan theories: Jaime ends up killing Cersei, Jon Snow becomes a dragon rider, and Varys poisons Daenerys. Here is how the AI did it:

After feeding a type of AI known as a recurrent neural network the roughly 5,000 pages of Martin's five previous books, software engineer Zack Thoutt has used the algorithm to predict what will happen next. According to the AI's predictions, some long-held fan theories do play out - in the five chapters generated by the algorithm so far, Jaime ends up killing Cersei, Jon rides a dragon, and Varys poisons Daenerys.

If you're interested, you can read all of the generated chapters on GitHub: https://github.com/zackthoutt/got-book-6/tree/master/generated-book-v1

Each chapter starts with a character's name, just like Martin's actual books. But in addition to backing up what many of us already suspect will happen, the AI also introduces some fairly unexpected plot turns that we're pretty sure aren't going to be mirrored in either the TV show or Martin's books, so we wouldn't get too excited just yet. For example, in the algorithm's first chapter, written from Tyrion's perspective, Sansa turns out to be a Baratheon.
"It's obviously not perfect," Thoutt told Sam Hill over at Motherboard. "It isn't building a long-term story and the grammar isn't perfect. But the network is able to learn the basics of the English language and structure of George R.R. Martin's style on its own." Neural networks are a type of machine learning algorithm that are inspired by the human brain's ability to not just memorize and follow instructions, but actually learn from past experiences. A recurrent neural network is a specific subclass, which works best when it comes to processing long sequences of data, such as lengthy text from five previous books. In theory, Thoutt's algorithm should be able to create a true sequel to Martin's existing work, based off things that have already happened in the novels. But in practice, the writing is clumsy and, most of the time, nonsensical. And it also references characters that have already died.
"Arya saw Jon holding spears. Your grace," he said to an urgent maid, afraid. "The crow's eye would join you. "A perfect model would take everything that has happened in the books into account and not write about characters being alive when they died two books ago," Thoutt told Motherboard. "The reality, though, is that the model isn't good enough to do that. If the model were that good authors might be in trouble ... but it makes a lot of mistakes because the technology to train a perfect text generator that can remember complex plots over millions of words doesn't exist yet." One of the main limitations here is the fact that the books just don't contain enough data for an algorithm. Although anyone who's read them will testify that they're pretty damn long, they actually represent quite a small data set for a neural network to learn from. But at the same time they contain a whole lot of unique words, nouns, and adjectives which aren't reused, which makes it very hard for the neural network to learn patterns.| Thoutt told Hill that a better source would be a book 100 times longer, but with the level of vocabulary of a children's book. |