In this article, I will use the YouTube trending videos dataset and the Python programming language to train a text generation language model with machine learning, which can then be used as a title generator for YouTube videos or even for your blogs.
Title generation is a natural language processing task and is central to several machine learning applications, including text synthesis, speech-to-text, and conversational systems.
To build a model for the task of title generation, or text generation in general, the model must be trained to learn the probability of a word occurring given the words that have already appeared in the sequence as context.
Title Generator with Machine Learning
I will start this task of building a title generator with Python and machine learning by importing the libraries and reading the datasets. The datasets that I am using in this task can be downloaded from here:
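Here is a minimal setup sketch, assuming TensorFlow/Keras is installed and that you downloaded the US file (USvideos.csv); adjust the file name to whichever country files you chose from the dataset:

```python
import string

import numpy as np
import pandas as pd
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense, Dropout
from tensorflow.keras.utils import to_categorical

# Read the trending videos data (file name assumed; use your own files)
df = pd.read_csv("USvideos.csv")
print(df.shape)
```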
Now we need to process our data so that we can use it to train our machine learning model for the title generator task. Here are the data cleaning and processing steps we need to follow:
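Here is one plausible version of these steps, continuing from the code above: keep the titles column, drop duplicates, lowercase, and strip punctuation and non-ASCII characters. Your exact pipeline may differ depending on the files you use:

```python
# Keep only the video titles and drop duplicate entries
titles = df["title"].drop_duplicates().tolist()

def clean_text(text):
    # Lowercase, strip punctuation, and drop non-ASCII characters
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    return text.encode("utf8").decode("ascii", "ignore")

corpus = [clean_text(title) for title in titles]
print(corpus[:3])
```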
Generating Sequences
Natural language processing tasks require input data in the form of a sequence of tokens. The first step after data cleaning is to generate a sequence of n-gram tokens.
An n-gram is a contiguous sequence of n elements from a given sample of text or speech. Elements can be words, syllables, phonemes, letters, or base pairs. In this case, the n-grams are sequences of words in a corpus of titles. Tokenization is the process of extracting tokens from the corpus:
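Continuing with the cleaned corpus from the previous step, a sketch of this step with the Keras Tokenizer could look like this:

```python
# Fit the tokenizer on the titles, then turn every title into all of
# its n-gram prefixes: [w1 w2], [w1 w2 w3], and so on.
tokenizer = Tokenizer()
tokenizer.fit_on_texts(corpus)
total_words = len(tokenizer.word_index) + 1  # +1 for the padding token

input_sequences = []
for line in corpus:
    token_list = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(token_list)):
        input_sequences.append(token_list[: i + 1])
```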
Padding the sequences
Since the sequences can be of variable length, we have to make their lengths equal. When using neural networks, we usually feed an input into the network and wait for an output, but in practice it is more efficient to process data in batches rather than one example at a time.
This is done using matrices of shape [batch size x sequence length], where the sequence length corresponds to the longest sequence in the batch. Shorter sequences are filled with a token (usually 0) to fit the size of the matrix. This process of filling sequences with tokens is called padding. To feed the data into a training model, I also need to create predictors and labels.
I will create the n-gram sequences as predictors, with the word that follows each n-gram as its label:
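Here is a sketch of the padding and the predictor/label split, using the pad_sequences and to_categorical helpers imported earlier:

```python
# Pre-pad every sequence with zeros up to the longest sequence, then
# split each one into predictors (all words but the last) and a
# one-hot encoded label (the last word).
max_sequence_len = max(len(seq) for seq in input_sequences)
input_sequences = np.array(
    pad_sequences(input_sequences, maxlen=max_sequence_len, padding="pre")
)

predictors, label = input_sequences[:, :-1], input_sequences[:, -1]
label = to_categorical(label, num_classes=total_words)
```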
LSTM Model

In recurrent neural networks, activation outputs are propagated in both directions, i.e. from inputs to outputs and from outputs back to inputs, unlike feed-forward neural networks where activation outputs are propagated in only one direction. This creates loops in the neural network architecture which act as a “memory state” for the neurons.
As a result, an RNN preserves state across time steps, or “remembers” what has been learned over time. This memory state has its advantages, but it also has its disadvantages. The vanishing gradient is one of them.
With this problem, when learning with a large number of layers, it becomes really difficult for the network to learn and adjust the parameters of the earlier layers. To solve this problem, a new type of RNN was developed: the LSTM (Long Short-Term Memory) network.
Title Generator with LSTM Model
The LSTM model contains an additional state (the cell state) which essentially allows the network to learn what to store in the long-term state, what to delete, and what to read. The LSTM model I will build contains four layers:
- Input layer: takes the sequence of words as input
- LSTM layer: computes the output using LSTM units
- Dropout layer: a regularization layer to avoid overfitting
- Output layer: computes the probability of the next possible word
Now I will use this LSTM architecture to build a model for the task of the title generator with machine learning:
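Here is a sketch of that architecture in Keras; the embedding size, number of LSTM units, dropout rate, and epoch count are illustrative choices, not tuned values:

```python
# Input -> Embedding -> LSTM -> Dropout -> softmax output
model = Sequential()
model.add(Input(shape=(max_sequence_len - 1,)))
model.add(Embedding(total_words, 10))
model.add(LSTM(100))
model.add(Dropout(0.1))
model.add(Dense(total_words, activation="softmax"))

model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(predictors, label, epochs=100, verbose=1)
```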
Title Generator with Machine Learning: Testing The Model
Now that our machine learning model for the title generator is ready and has been trained on the data, it’s time to predict a title based on an input word. The input word is first tokenized, then the sequence is padded before being passed into the trained model, which returns the predicted sequence:
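Here is a sketch of that prediction loop; greedy argmax decoding (always taking the most likely next word) is one simple choice here:

```python
def generate_text(seed_text, next_words, model, max_sequence_len):
    for _ in range(next_words):
        # Tokenize and pre-pad the current text to the model's input length
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences(
            [token_list], maxlen=max_sequence_len - 1, padding="pre"
        )
        # Pick the most likely next word and append it to the seed text
        predicted = int(np.argmax(model.predict(token_list, verbose=0), axis=-1)[0])
        for word, index in tokenizer.word_index.items():
            if index == predicted:
                seed_text += " " + word
                break
    return seed_text.title()
```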
Generating Titles:
Now that we have created a function to generate titles, let’s test our title generator model:
print(generate_text("spiderman", 5, model, max_sequence_len))
Output: Spiderman The Voice 2018 Blind Audition
I hope you liked this article on how to build a title generator model with machine learning and the Python programming language. Feel free to ask your valuable questions in the comments section below.