An RNN That Generates Text

Activations in a Neural Network

Feeding

Return

Recurrent Neural Networks

  • x, the output of the first network.
  • “def”, the second piece of text that we feed into this multi-network structure.
  • x: x is influenced by nothing other than “abc”. The only thing the first network takes in is “abc”, making it the only determining factor in what the value of x will be.
  • y: y is influenced by two things. First, “def”: just like with x, we fed “def” into the neural network, which makes it a factor in what y will be. But we also feed x into the network that produces y. Every input affects the output, making x another determining factor for y. Now remember: what determined x? The string “abc”. Exactly. y was determined by “def” and x, while x was derived from “abc”. By transitivity (the chain rule, as some like to call it), y is influenced indirectly by “abc” and directly by “def”.
  • z: You get the point. z is influenced by “ghi” and y; y was influenced by “def” and x; x was influenced by “abc”. So z is influenced by “abc”, “def”, and “ghi”. Pretty cool, right?
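The chain of influence above can be sketched in a few lines of NumPy. This is a minimal, hypothetical recurrent step (randomly initialized weights, made-up sizes), not the article's trained network; the point is only that each output is computed from the current input plus the previous output, so earlier inputs leak forward:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 3-dimensional inputs, 4-dimensional state.
W_xh = rng.normal(size=(4, 3))   # input -> state weights
W_hh = rng.normal(size=(4, 4))   # state -> state weights

def step(prev, inp):
    # One recurrent step: the new value depends on the current input
    # AND the previous value, which itself encodes earlier inputs.
    return np.tanh(W_xh @ inp + W_hh @ prev)

inputs = [rng.normal(size=3) for _ in range(3)]  # stand-ins for "abc", "def", "ghi"

h0 = np.zeros(4)          # nothing has been seen yet
x = step(h0, inputs[0])   # x depends only on "abc"
y = step(x, inputs[1])    # y depends on "def" and x (so, indirectly, "abc")
z = step(y, inputs[2])    # z depends on "ghi", "def", and "abc"
print(z)
```

Change `inputs[0]` and you will see x, y, and z all change; change `inputs[2]` and only z moves, exactly as the bullets describe.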

Restricting Factors

Same Network

Decoy Activation

RNNs In Text

Explaining code

  • char_indices = {'a': 0, 'b': 1, … }
  • indices_char = {0: 'a', 1: 'b', … }
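These two lookup tables are mirror images of each other: one maps each character to an index, the other maps each index back to its character. A minimal sketch of how they might be built (the corpus string here is a placeholder, not the article's training text):

```python
# Hypothetical corpus; in practice this would be the full training text.
text = "the quick brown fox"

# Sorted, de-duplicated character set of the corpus.
chars = sorted(set(text))

# Character -> index, e.g. {'a': 0, 'b': 1, ...} for a corpus containing those letters.
char_indices = {c: i for i, c in enumerate(chars)}
# Index -> character, the exact inverse mapping.
indices_char = {i: c for i, c in enumerate(chars)}

print(char_indices)
print(indices_char)
```

The network only ever sees the indices; `indices_char` is what turns its predictions back into readable text.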

Results

Lines 1 and 2 are the input, the rest is RNN generated.
Line 1 is the input, the rest is RNN generated.
Line 1 is the input, the rest is RNN generated.
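The generation step behind results like these follows a simple loop: take a seed string, slide a window over the text generated so far, ask the model for the next character, append it, and repeat. A sketch of that loop, with a random stub standing in for the trained RNN (the `maxlen` window size and `predict_next` function are assumptions, not the article's code):

```python
import random

random.seed(0)

chars = sorted(set("hello world"))
indices_char = {i: c for i, c in enumerate(chars)}
maxlen = 5  # window length the (hypothetical) model was trained on

def predict_next(window):
    # Stand-in for the trained RNN: a real model would return the index
    # of the most likely (or sampled) next character given the window.
    return random.randrange(len(chars))

seed = "hello"           # "Line 1 is the input"
generated = seed
for _ in range(20):      # "the rest is RNN generated"
    window = generated[-maxlen:]   # only the last maxlen chars are fed in
    generated += indices_char[predict_next(window)]

print(generated)
```

With a real model in place of `predict_next`, this is the whole text-generation procedure: the output grows one character at a time, each one conditioned on what came before.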
