(7 pts) Complete the function in sentence_generation.py to load your trained model and generate new sentences from it.
Once a language model is trained, it can predict the next character after a sequence, and this process can be repeated: each predicted character becomes part of the history for predicting the next one.
More specifically, your model should predict a probability distribution over the vocabulary for the next character; we have implemented a sampler, sample_next_char_id,
which draws a character according to that distribution. By repeating this process, your model can write arbitrarily long paragraphs.
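The generation loop described above can be sketched as follows. This is a minimal illustration, not the actual starter code: the uniform toy model, the vocabulary, and this stand-in for sample_next_char_id are all assumptions made here for a self-contained example; your real model produces the distribution from its trained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_next_char_id(probs):
    # Stand-in for the provided sampler (assumed behavior): draw a
    # character id according to the predicted probability distribution.
    return int(rng.choice(len(probs), p=probs))

def generate(model, char_to_id, id_to_char, history, length):
    # Autoregressive generation: encode the history, then repeatedly
    # predict a distribution, sample a character, and append it.
    ids = [char_to_id[c] for c in history]
    for _ in range(length):
        probs = model(ids)                 # distribution over the vocabulary
        ids.append(sample_next_char_id(probs))
    return "".join(id_to_char[i] for i in ids)

# Toy "model" for illustration only: always predicts a uniform distribution.
vocab = "abc "
char_to_id = {c: i for i, c in enumerate(vocab)}
id_to_char = dict(enumerate(vocab))
uniform_model = lambda ids: np.full(len(vocab), 1.0 / len(vocab))

text = generate(uniform_model, char_to_id, id_to_char, "a", 10)
```

In your solution, the model call would run your trained network on the current history (or its hidden state) instead of the uniform toy distribution.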
For example, the following passage was written by a GRU trained on Shakespeare:
ROMEO:
Will't Marcius Coriolanus and envy of smelling!
DUKE VINCENTIO:
He seems muster in the shepherd's bloody winds;
Which any hand and my folder sea fast,
Last vantage doth be willing forth to have.
Sirraher comest that opposite too?
JULIET:
Are there incensed to my feet relation!
Down with mad appelate bargage! troubled
My brains loved and swifter than edwards:
Or, hency, thy fair bridging courseconce,
Or else had slept a traitors in mine own.
Look, Which canst thou have no thought appear.
ROMEO:
Give me them: for that I put us empty.
RIVERS:
The shadow doth not live: and he would not
From thee for his that office past confusion
Is their great expecteth on the wheek;
But not the noble fathom were an poison
Here come to make a dukedom: therefore--
But O, God grant! for Signior HERY
VI:
Soft love, that Lord Angelo: then blaze me all;
And slept not without a Calivan Us.
Note that the model learns how to spell each word and to write sentence-like paragraphs entirely on its own, including the punctuation and line breaks.
Please use ROMEO and JULIET as the starting histories, generate 1000 characters from each, and attach the generated text in your report.