Text prediction is a task we use so often that we take it for granted. From the auto-fill feature in messaging apps to search engines predicting search terms, text prediction saves time and makes our lives easier. It also links into other tasks such as text generation, which can eventually be used to write stories or longer passages. In this notebook, I use the Brown corpus to create and train an LSTM model to predict the next word in a sentence.

The goal is to design and develop a word-predictor application for English that suggests likely words and thus saves keystrokes. The program uses a probabilistic language model built as a deep recurrent neural network (an LSTM) implemented with TensorFlow and Keras. Because the approach is largely language-agnostic, the program could be adapted to languages other than English in the future. It also learns from the user's typing history, ranking candidate words by frequency and recency of use; this self-learning function sharpens predictions over time and increases typing speed.
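The frequency-and-recency idea described above can be sketched without any deep-learning machinery. Below is a minimal pure-Python bigram predictor (all names are hypothetical and not taken from this repository) that learns from typed text and ranks candidate next words by how often, and then how recently, they followed the current word:

```python
from collections import defaultdict

class BigramPredictor:
    """Toy next-word predictor: ranks candidates by frequency,
    breaking ties by recency (the most recently seen pair wins)."""

    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))  # prev -> {next: count}
        self.last_seen = {}  # (prev, next) -> logical timestamp
        self.clock = 0

    def learn(self, text):
        """Update counts and timestamps from a string of typed text."""
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.clock += 1
            self.counts[prev][nxt] += 1
            self.last_seen[(prev, nxt)] = self.clock

    def predict(self, word, k=3):
        """Return up to k candidate next words for `word`."""
        word = word.lower()
        cands = self.counts.get(word, {})
        ranked = sorted(
            cands,
            key=lambda w: (cands[w], self.last_seen[(word, w)]),
            reverse=True,
        )
        return ranked[:k]

p = BigramPredictor()
p.learn("i want to go home i want to eat i want to sleep")
print(p.predict("want"))  # "to" always follows "want", so it ranks first
```

An LSTM replaces these raw bigram counts with a learned representation of longer contexts, but the interface (train on text, return ranked candidates for the current prefix) stays the same.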
node9909 / -text-prediction-using-recurrent-neural-networks
This project is forked from angarak/-text-prediction-using-recurrent-neural-networks.
License: MIT License