When you’re dealing with software such as Google Search or virtual assistants like Apple’s Siri, there’s not an awful lot of friendliness or emotion in the exchange – and any warmth you can find has probably been deliberately programmed in by human software engineers (like Siri’s famous Easter eggs).
But now Google is working on a solution, and it’s looking for love in perhaps the most obvious of places: romance novels. The company is feeding romance novels – thousands of them – to one of its artificial intelligence (AI) engines, in a bid to enhance the software’s personality and people skills.
“In the Google app, the responses are very factual,” Google software engineer Andrew Dai told Alex Kantrowitz at BuzzFeed News. “Hopefully with this work, and future work, it can be more conversational, or can have a more varied tone, or style, or register.”
The project has been running for the last few months, with the team giving the neural network some 2,865 romance novels to plough through. The software learns as it goes, detecting more meanings and subtleties the more it reads.
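The report doesn't detail how Google's neural network actually works, but the general idea behind learning language from a corpus can be loosely illustrated with a toy next-word predictor. The sketch below is purely hypothetical – a simple bigram counter, nothing like a real neural network – with a three-line stand-in corpus instead of 2,865 novels:

```python
from collections import defaultdict, Counter

def train_bigram_model(corpus):
    """Count how often each word follows each other word in the corpus."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def most_likely_next(model, word):
    """Return the word most often observed after `word`, or None."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

# Illustrative mini-corpus standing in for the romance novels.
corpus = [
    "her heart raced as he smiled",
    "her heart ached with longing",
    "her heart raced with joy",
]
model = train_bigram_model(corpus)
print(most_likely_next(model, "heart"))  # "raced" (seen twice, vs "ached" once)
```

The more text such a model reads, the richer its statistics become – a crude analogue of the network "detecting more meanings and subtleties" as it works through the novels.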
According to Kantrowitz, romance fiction serves as a better educational resource for the software than more basic literature like children’s learn-to-read books, since the novels offer a broader range of linguistic examples for the network to learn from.
Learning how to banter well with people might represent an even greater challenge than besting them at a game, but Google’s team is optimistic. They’ve reportedly trained the engine to write sentences that resemble the kind of language you’d find in romance novels, and the next step is to try to use its new diction in Google products.
One such product could be the Google app – the mobile counterpart to Google Search on the desktop, but packed with assistant-like features and voice control – and another could be the company’s ‘Smart Reply’ feature for Google Inbox, which could deliver more intelligent automatic responses to messages thanks to its new and improved English skills.
“It would be much more satisfying to ask Google questions if it really understood the nuances of what you were asking for, and could reply in a more natural and familiar way,” Dai explained to Lindsey J. Smith at The Verge. “It’s like how you’d rather ask a friend about what to do in a vacation spot instead of calling their visitor centre.”
Let’s just hope Google doesn’t end up with a situation like Tay – an AI chatbot designed by Microsoft that was taken offline after just 16 hours, when its innocent mimicking of real-world Twitter users saw it rapidly transform into a racist, xenophobic troll.
To avoid a similar fiasco, Google will take a more conservative approach with its own experiment, seeking to make sure humanity’s own bad habits won’t surface in the AI.
“It’s quite sexy. It’s very imaginative,” Dai told BuzzFeed News. “We work directly with the product folks on how to develop this with minimal risk of it doing bad things, things that we don’t expect.”