
Project Creation: Part Three

Explore how to generate text by sampling from a Markov chain model with a fixed-length context. Understand the roles of context length and probability sampling in predicting the next character in a sequence. Gain practical experience implementing text generation and recognize the limitations of such simple statistical models compared to advanced methods like LSTMs.


Sampling the text

We created our Markov model in the previous lesson. Now we need to sample from it to generate new text.

Python 3.5
import numpy as np

def sample_next(ctx, model, k):
    ctx = ctx[-k:]                    # only the last k characters of the context matter
    if model.get(ctx) is None:        # unseen context: fall back to a space
        return " "
    possible_chars = list(model[ctx].keys())
    possible_values = list(model[ctx].values())
    print(possible_chars)
    print(possible_values)
    return np.random.choice(possible_chars, p=possible_values)

sample_next("commo", model, 4)
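As a quick sanity check, here is a minimal, self-contained sketch of the same idea using a hypothetical two-entry model (an assumption for illustration, not the model built in the previous lesson). It uses the standard library's random.choices in place of np.random.choice, assuming each k-character context maps to a dict of next-character probabilities that sum to 1:

```python
import random

# Hypothetical toy model: each 4-character context maps to
# a dict of next-character probabilities (assumed normalized).
model = {
    "ommo": {"n": 1.0},            # "ommo" is always followed by "n"
    "mmon": {" ": 0.5, "l": 0.5},  # "mmon" is followed by " " or "l"
}

def sample_next(ctx, model, k):
    ctx = ctx[-k:]                 # keep only the last k characters
    if model.get(ctx) is None:     # unseen context: fall back to a space
        return " "
    chars = list(model[ctx].keys())
    probs = list(model[ctx].values())
    # Draw one character weighted by its probability,
    # equivalent to np.random.choice(chars, p=probs).
    return random.choices(chars, weights=probs)[0]

print(sample_next("commo", model, 4))  # context "ommo" -> always "n"
```

Because "ommo" has only one possible successor in this toy model, the call above is deterministic; a richer context like "mmon" would return " " or "l" at random according to their weights.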

Explanation:

  • The function sample_next(ctx, model, k) accepts three parameters: the context, the model, and the value of k.

  • The context is the text from which new text will be generated. However, only the last k characters of the context are used by the model to predict the next character in the sequence.

  • For example, we passed the value of context ...