With the tremendous success of deep learning techniques, many interesting studies on dialogue systems, so-called chat-bots, based on deep neural networks have been attempted. In general, sequence-to-sequence models with recurrent neural networks (RNNs) are conventional dialogue systems that process one or two sentences at a time and reply to a user's utterance without considering the conversation history. To overcome this problem, we propose a consecutive Seq-DNC-seq structure for context understanding in dialogue. For this, we adopt the differentiable neural computer (DNC), which has an external memory that stores contextual information about the conversation. We show that this memory can retain the previous conversation history, so that the chat-bot can reply after understanding the context. We implemented consecutive Seq-DNC-seq structures such that the first layer processes the first turn, the second layer processes the second turn, and so on. Since each layer has access to the previous conversation's information, our chat-bot system can generate varied replies even when the input is the same. The proposed model can be a starting point for more intelligent, human-like chat-bot systems.
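The turn-by-turn flow described above can be sketched in miniature. The snippet below is a toy illustration, not the paper's model: `ExternalMemory` stands in for the DNC's memory matrix with content-based (cosine-similarity) reads, and `encode` is a hypothetical bag-of-words hashing encoder replacing a real RNN encoder. Each call to `turn` plays the role of one Seq-DNC-seq layer: it reads context from memory, then writes the current turn back so later layers see the history, which is why the same input yields a different state on the second turn.

```python
import numpy as np

class ExternalMemory:
    """Toy stand-in for the DNC's external memory: a fixed-size slot
    matrix with content-based read and sequential write."""
    def __init__(self, slots=8, width=16):
        self.M = np.zeros((slots, width))
        self.ptr = 0

    def write(self, v):
        # write the turn vector into the next slot (round-robin)
        self.M[self.ptr % len(self.M)] = v
        self.ptr += 1

    def read(self, key):
        # cosine-similarity content addressing over memory slots
        norms = np.linalg.norm(self.M, axis=1) * np.linalg.norm(key) + 1e-8
        w = (self.M @ key) / norms
        w = np.exp(w) / np.exp(w).sum()  # softmax attention weights
        return w @ self.M                # weighted read vector

def encode(utterance, width=16):
    """Hypothetical encoder: hash words into a fixed-size vector
    (a real system would use an RNN/seq2seq encoder)."""
    v = np.zeros(width)
    for tok in utterance.split():
        v[hash(tok) % width] += 1.0
    return v

def turn(memory, utterance):
    """One Seq-DNC-seq layer: encode the turn, read stored context,
    then write the new turn back for the next layer."""
    enc = memory_read = encode(utterance)
    context = memory.read(enc)
    memory.write(enc)
    return enc + context  # decoder input conditioned on history

mem = ExternalMemory()
r1 = turn(mem, "hello there")
r2 = turn(mem, "hello there")  # same input, but memory has changed
print(np.allclose(r1, r2))     # False: history alters the reply state
```

Because the memory state differs between the two calls, identical inputs produce different conditioning vectors, mirroring the claim that each layer's access to earlier turns lets the system vary its replies.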