Artificial Intelligence and Writing for Video Games

After prying myself out of my turkey coma this morning, cup of coffee with Cinnabon-flavored creamer in hand, I lay in bed with my tablet tuned to the second season of Star Trek: The Next Generation. Episode 9, “The Measure of a Man” – a formal hearing is convened to determine whether Data is property without rights or a sentient being. He is defined as an android, yet he also possesses artificial intelligence. We are still a long way off from creating any Datas, or true AI at all, yet it’s on the list of things science fiction has promised us. Star Trek is only one example of how artificial intelligence could benefit humankind while raising the ethical and moral questions that come along with it. More immediately, there is talk of AI shaping the future of open-world games, what that means for developers and publishers, and what that means for writers. Will AI make game writers obsolete at some point in the future?

My question stems from Keith Stuart’s article on eurogamer.net, and I have to say it scares me to think about even the slightest possibility that humanity would render one of its own creative skills obsolete. For procedurally generated quests and environments, it’s already happening, even if only on a small and unsophisticated level. The technology isn’t impressive at the moment, but humans have always favored invention and innovation, finding ways to produce things more efficiently, even at the cost of people’s jobs. I can see the allure of AI for creating mundane, straightforward quests; it would free up writers to focus on storytelling, narrative, and character development, and provide a cost-effective alternative for studios.

However, the best open-world games have a human element, a soul, in every bit of the story, even the mundane quests. Sometimes the choices you make decide whether your character ends up good or evil. AI may mean intelligence, but it doesn’t mean emotional intelligence. A real person needs to write the quests that break convention, that stray from overused patterns. A real person needs to be there to elicit an emotional response from the player. And there need to be real people to figure out how to fit story into procedurally generated quests. So we’re always going to need people to craft missions and write narratives. Even when Data finally received his emotion chip, he had a learning curve to figure out which emotions were which, and he had to learn to control them. Do companies in the distant future really want to deal with a computer that acts like a teenager?

On the other side of the coin, there is player preference. For me, there is nothing more tedious than grinding for XP through repetitive missions. I need story. I need drama. I need an emotional connection. The only way I can handle grinding is if I am on a team and we are trying to reach a common goal or help each other – the human element will always win, whether it’s in the game or not. But let’s say AI is able to replicate emotion and human experience one day. What then? Stuart’s article states that it’s something already being explored by GiantOtter, a company that has spent the last few years developing a crowd-sourcing AI platform called GroupPlay. Basically, they recruited testers to improvise their own dialogue and interactions while playing a short RPG mini-game. Everything was recorded and added to a database of conversations and actions that could then be made available to NPCs. Stuart also notes that Mark Riedl, head of the Entertainment Intelligence Lab at the Georgia Tech School of Interactive Computing, has been working on a crowd-powered automated story generation system; it creates its own interactive fiction by studying groups of plot lines developed by humans. If these methods are used to develop open-world games in the near or distant future, who does the creative property belong to? The studio? The individuals who improvised their own dialogue? Is this what the writers’ room of the future will look like? What are the legal implications of crowd-sourced dialogue and narrative? Who gets paid? Who gets their name in the credits? And who’s to say a similar program can’t be designed to parse text from novels or other games to build a database of options – would that be plagiarism?
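To make the idea concrete, here is a minimal, hypothetical sketch of the kind of crowd-sourced dialogue store described above: testers’ improvised lines get logged against a game situation, and an NPC later pulls a human-authored line that matches its current context. Every name in it is my own invention for illustration; none of it reflects how GiantOtter’s GroupPlay or Riedl’s system actually works.

```python
# Hypothetical crowd-sourced dialogue bank: record lines improvised by human
# playtesters, then let NPCs draw on them at runtime. Illustration only.

import random
from collections import defaultdict


class CrowdDialogueBank:
    def __init__(self):
        # situation tag (e.g. "greet_stranger") -> list of (speaker, line) pairs
        self._lines = defaultdict(list)

    def record(self, situation: str, speaker: str, line: str) -> None:
        """Store a line a human tester improvised in a given situation."""
        self._lines[situation].append((speaker, line))

    def line_for_npc(self, situation: str) -> str:
        """Return a human-written line for the NPC's current situation, if any."""
        candidates = self._lines.get(situation)
        if not candidates:
            return "..."  # fall back to silence when no human data exists
        _, line = random.choice(candidates)
        return line


# Usage: seed the bank with playtest data, then query it in-game.
bank = CrowdDialogueBank()
bank.record("greet_stranger", "tester_01", "You're not from around here, are you?")
bank.record("greet_stranger", "tester_02", "Keep your hands where I can see them.")
print(bank.line_for_npc("greet_stranger"))
```

Even in this toy form, the questions from above apply: every line in that database was written by a person, which is exactly why the ownership and credit issues feel so murky.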

These things scare me. If these systems can be applied to an open-world format, what’s to say they can’t also be used to develop other genres of games? It seems that these tools are being created to make non-linear storytelling easier for writers, yet I would imagine some companies will try to use or refine them to eliminate writing jobs, or other jobs, to save money. Since the industrial revolution, this has happened time and time again – automation leaves less need for human workers. I might be a bit of a doomsayer, but with new technology comes great responsibility; it is up to us to think about how it will affect everyone in every way, from an economic to a legal standpoint. Of course, the AI technology we are talking about is nowhere near that of the fictional Data, but we must all put on our science fiction hats and think about what implications new technology can have for the future. So, in a sense, we must all think like writers, and we must not write ourselves out of relevancy. We need writers. We need humans.

I mean…we’ve all seen The Matrix, right?
