NLP-Text-Generation-Project

Using a Markov Model and LSTM Neural Network to generate text

Step 1:

Choose an NLP problem; it is not restricted to the following:

  • Reconstruction
  • Document Classification
  • Token Classification
  • Language Modeling
  • Machine Translation

Step 2:

Identify or construct a solution based on a generative probabilistic language model. Describe the model in detail and develop a solution using parameter inference (and/or decoding).
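The repository's description indicates that the generative probabilistic model used here is a Markov model. As a minimal sketch of that kind of model (not necessarily the exact implementation in this repository), an order-1 word-level Markov chain can be fit by counting bigram transitions and then sampled to generate text:

```python
import random
from collections import defaultdict, Counter

def fit_markov(tokens):
    """Count bigram transitions to estimate P(next word | current word)."""
    transitions = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        transitions[current][nxt] += 1
    return transitions

def generate(transitions, start, length=20):
    """Sample a word sequence by repeatedly drawing from the fitted transition counts."""
    word, output = start, [start]
    for _ in range(length - 1):
        counts = transitions.get(word)
        if not counts:
            break  # dead end: no observed successor for this word
        words, weights = zip(*counts.items())
        word = random.choices(words, weights=weights)[0]
        output.append(word)
    return " ".join(output)

# Toy usage; a real run would tokenize the project's corpus instead.
corpus = "the cat sat on the mat and the cat slept".split()
model = fit_markov(corpus)
print(generate(model, start="the"))
```

Higher-order chains follow the same pattern with tuples of the previous n words as keys, trading sparser counts for more coherent output.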

Step 3:

Identify or construct a solution based on a discriminative neural network. Describe the network structure in detail and develop a solution using parameter inference (and/or decoding).
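The repository's description names an LSTM neural network for this step. A minimal sketch of a next-token prediction model in Keras is shown below; the layer sizes, sequence length, and training data here are placeholders, and the repository's actual architecture and hyperparameters may differ.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Hypothetical sizes; the project's actual hyperparameters may differ.
vocab_size, seq_len, embed_dim, hidden_dim = 5000, 20, 64, 128

model = Sequential([
    Embedding(vocab_size, embed_dim),            # token ids -> dense vectors
    LSTM(hidden_dim),                            # summarize the context window
    Dense(vocab_size, activation="softmax"),     # distribution over the next token
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# X: (num_examples, seq_len) integer-encoded context windows
# y: (num_examples,) id of the token that follows each window
X = np.random.randint(0, vocab_size, size=(256, seq_len))   # placeholder data
y = np.random.randint(0, vocab_size, size=(256,))
model.fit(X, y, epochs=1, batch_size=32)
```

Text is then generated by repeatedly feeding the model's own predictions back in as the next context window (greedy or sampled decoding).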

Step 4:

Train and apply both approaches to real data acquired legally. Evaluate the results qualitatively and quantitatively. Highlight situations where each approach performs well or poorly. Any unusual/unexpected results require explanation.
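One standard quantitative measure for comparing the two models on held-out text is perplexity. The sketch below computes it from per-token probabilities; prob_next is a placeholder that would be backed by either the Markov model or the LSTM.

```python
import math

def perplexity(tokens, prob_next):
    """Perplexity of a held-out token sequence under a model.

    prob_next(context, token) must return P(token | context) for the model
    being evaluated; a small floor avoids log(0) for unseen events.
    """
    log_prob_sum = 0.0
    for i in range(1, len(tokens)):
        p = max(prob_next(tokens[:i], tokens[i]), 1e-12)
        log_prob_sum += math.log(p)
    return math.exp(-log_prob_sum / (len(tokens) - 1))

# Example with a hypothetical uniform model over a 10-word vocabulary:
uniform = lambda context, token: 0.1
print(perplexity("the cat sat on the mat".split(), uniform))  # ~10.0
```

Lower perplexity means the model assigns higher probability to the held-out text; qualitative evaluation (reading generated samples) complements this number.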

Step 5:

Train and apply both approaches to synthetic data (constructed, or acquired legally). Evaluate the results qualitatively and quantitatively. Highlight situations where each approach performs well or poorly. Any unusual/unexpected results require explanation.
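For the synthetic setting, one convenient option is to generate the corpus from a known process, for example a hand-specified Markov chain, so the ground-truth distribution is available when judging how well each model recovers it. A minimal sketch, assuming that approach (the transition table is illustrative only):

```python
import random

# Hand-specified transition probabilities define the ground-truth process.
TRUE_CHAIN = {
    "sun":   [("rises", 0.7), ("sets", 0.3)],
    "rises": [("sun", 0.5), ("moon", 0.5)],
    "sets":  [("sun", 0.6), ("moon", 0.4)],
    "moon":  [("rises", 0.4), ("sets", 0.6)],
}

def sample_synthetic(length=1000, start="sun", seed=0):
    """Generate a synthetic corpus whose true distribution is known exactly."""
    rng = random.Random(seed)
    word, tokens = start, [start]
    for _ in range(length - 1):
        nxt, probs = zip(*TRUE_CHAIN[word])
        word = rng.choices(nxt, weights=probs)[0]
        tokens.append(word)
    return tokens

corpus = sample_synthetic()
print(corpus[:10])
```

Because the true transition probabilities are known, the fitted Markov model's estimates and the LSTM's predicted distributions can both be compared directly against them.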

Step 6:

Discuss the pros and cons of the two approaches. Consider:

  • quality/correctness
  • data, time, and computational requirements
  • interpretability
