
# Textual Entailment

Building machines capable of commonsense reasoning is an important open research problem. Given two sentences, textual entailment is the task of deciding whether the first sentence entails the second, contradicts it, or is neutral (unrelated) to it. This ability is useful in many applications, including question answering and text summarization.
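
To make the three labels concrete, here are a few illustrative (premise, hypothesis, label) pairs in the standard format; these examples are ours, not drawn from this repo's data:

```python
# Illustrative entailment examples (not from this repo's dataset).
examples = [
    ("A man is playing a guitar on stage.",
     "A man is performing music.", "entailment"),
    ("A man is playing a guitar on stage.",
     "The stage is empty.", "contradiction"),
    ("A man is playing a guitar on stage.",
     "The concert is sold out.", "neutral"),
]
```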

One interesting question is whether an agent can learn to make inferences and deductions from textual stories. Suppose we are given a short narrative whose sentences have been jumbled, that is, randomly shuffled. Can we build a learning algorithm that correctly reorders them? Doing so requires both commonsense knowledge and temporal understanding.
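
As a minimal sketch of this setup (a toy illustration, not code from this repo), we can shuffle a short story and treat the hidden permutation as the prediction target:

```python
# Toy illustration of the jumbled-story task: shuffle a narrative and
# record the gold permutation the model must recover.
import random

story = [
    "Anna planted a seed in the garden.",
    "She watered it every morning.",
    "Weeks later, a small sprout appeared.",
    "By summer, it had grown into a sunflower.",
]

order = list(range(len(story)))
random.shuffle(order)                  # hidden gold permutation
jumbled = [story[i] for i in order]    # input presented to the model
# Learning task: predict `order` (or its inverse) from `jumbled` alone.
```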

This repo contains one solution to this problem using recurrent neural networks. At a high level, the architecture stacks LSTM layers with a differentiable attention mechanism based on this paper. The files `rnn_attention.py` and `rnn_seq_attention.py` implement two variants of this architecture.
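
To give a rough sense of the LSTM-plus-attention pattern, here is a minimal PyTorch sketch. This is our own illustration, not the repo's actual code: the layer sizes, module names, and the simple additive scoring scheme are all assumptions, and the real variants in `rnn_attention.py` and `rnn_seq_attention.py` may differ.

```python
# Minimal sketch (not this repo's code) of an LSTM encoder with a
# differentiable attention mechanism over its hidden states.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMWithAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)   # scores each timestep

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        h, _ = self.lstm(self.embed(tokens))   # h: (batch, seq_len, hidden)
        scores = self.attn(h).squeeze(-1)      # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)    # attention distribution
        context = (weights.unsqueeze(-1) * h).sum(dim=1)  # weighted sum
        return context                         # (batch, hidden_dim)
```

Because the softmax weights are differentiable, the attention distribution over hidden states is learned end-to-end along with the LSTM parameters.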