#!/usr/bin/env python
# coding: utf-8

# # Exercise 09
#
# ## Sequence Classification using LSTM
#
# Sequence classification is a predictive modeling problem where you have a sequence of inputs over space or time and the task is to predict a category for the sequence.
#
# What makes this problem difficult is that the sequences can vary in length, may be drawn from a very large vocabulary of input symbols, and may require the model to learn long-term context or dependencies between symbols in the input sequence.

# [The Large Movie Review Dataset](http://ai.stanford.edu/~amaas/data/sentiment/) (often referred to as the IMDB dataset) contains 25,000 highly polar movie reviews (good or bad) for training and the same amount again for testing. The problem is to determine whether a given movie review has a positive or negative sentiment.

# In[1]:

import numpy as np
from keras.datasets import imdb
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import LSTM
from keras.layers import Embedding
from keras.preprocessing import sequence
from keras.callbacks import History

np.random.seed(7)

# In[2]:

from livelossplot import PlotLossesKeras
get_ipython().run_line_magic('matplotlib', 'inline')

# In[29]:

# load the dataset but only keep the top n words, zero the rest
top_words = 5000
index_from = 3
(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=top_words, index_from=index_from)

# In[4]:

y_train[0]

# In[49]:

np.min([np.min(x) for x in X_train]), np.max([np.max(x) for x in X_train])

# The words have been replaced by integers that indicate each word's frequency rank in the dataset (offset by index_from). The sentences in each review are therefore sequences of integers.

# In[5]:

X_train.shape

# In[50]:

print(X_train[0])

# Next, we need to truncate and pad the input sequences so that they are all the same length for modeling. The model will learn that the zero values carry no information, so although the sequences are not the same length in terms of content, same-length vectors are required to perform the computation in Keras.

# In[54]:

# truncate and pad input sequences
max_review_length = 500
X_train_pad = sequence.pad_sequences(X_train, maxlen=max_review_length)
X_test_pad = sequence.pad_sequences(X_test, maxlen=max_review_length)

# In[8]:

X_train_pad.shape

# ### Word Embedding
#
# We will map each movie review into a real vector domain, using a popular technique for working with text called word embedding. Words are encoded as real-valued vectors in a high-dimensional space, where similarity between words in terms of meaning translates to closeness in the vector space.
#
# Keras provides a convenient way to convert positive integer representations of words into a word embedding via an Embedding layer.
#
# We will map each word onto a 32-dimensional real-valued vector. We will also limit the total number of words that we are interested in modeling to the 5,000 most frequent words, and zero out the rest. Finally, the sequence length (number of words) in each review varies, so we will constrain each review to be 500 words, truncating longer reviews and padding shorter ones with zero values.
#
# Now that we have defined our problem and how the data will be prepared and modeled, we are ready to develop an LSTM model to classify the sentiment of movie reviews.
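# As a quick illustration of what the Embedding layer does (this cell is an addition, not part of the original notebook): a standalone Embedding layer maps each padded review of max_review_length integer word indices to a max_review_length x 32 matrix of real-valued vectors.

# In[ ]:

# minimal sketch: inspect the output shape of a standalone Embedding layer
emb_demo = Sequential()
emb_demo.add(Embedding(top_words, 32, input_length=max_review_length))
emb_demo.compile(optimizer='rmsprop', loss='mse')  # compiled only so predict() runs on any Keras version
print(emb_demo.predict(X_train_pad[:1]).shape)  # expected: (1, 500, 32)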
# # Exercise 09.1
#
# Train a Deep Neural Network with the following architecture:
#
# - Input = pad_sequences (input_length=max_review_length)
# - Embedding(top_words, 32, input_length=max_review_length)
# - LSTM(100)
# - Dense(1, sigmoid)
#
# Optimize with adam, using binary_crossentropy as the loss.
#
# Hints:
# - Test with two epochs first, then try more.
# - The learning rate can be adjusted.
#
# Evaluate the performance using the testing set (approx. 87% accuracy with 10 epochs). One possible solution is sketched below.

# In[ ]:
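# A minimal sketch of one possible solution, assuming the variables defined above (top_words, max_review_length, X_train_pad, X_test_pad); the batch size and starting epoch count are assumptions to tune, not prescribed by the exercise.

# In[ ]:

# build the requested architecture: Embedding -> LSTM(100) -> Dense(1, sigmoid)
model = Sequential()
model.add(Embedding(top_words, 32, input_length=max_review_length))
model.add(LSTM(100))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()

# start with 2 epochs as the hint suggests, then try ~10;
# callbacks=[PlotLossesKeras()] can be added to watch the loss live
model.fit(X_train_pad, y_train, validation_data=(X_test_pad, y_test),
          epochs=2, batch_size=64)

# In[ ]:

scores = model.evaluate(X_test_pad, y_test, verbose=0)
print("Test accuracy: %.2f%%" % (scores[1] * 100))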

# # Exercise 09.2
#
# Predict the sentiment of the following reviews.

# In[51]:

reviews = [
    "I was fortunate enough to see this movie on pre-release last night and, though I wasn't expecting to, actually really enjoyed the movie for the most part. The rescues and sea effects were amazing to watch and definitely provided edge of the seat tense moments, probably all the more so knowing that there are guys who do this for a living. The weaker parts of the movie revolve largely around using stereotypical set scenes. I'm not going to spoil the movie but this really follows along the lines of An Officer and a Gentleman and those moments give it a little bit of a cheesy aftertaste. Like I said over all this movie is pretty good and worth checking out as long as you can get past the clichés.",
    '"The Dresser" is perhaps the most refined of backstage films. The film is brimming with wit and spirit, for the most part provided by the "energetic" character of Norman (Tom Courtenay). Although his character is clearly gay, and certainly has an attraction for the lead performer (Albert Finney) that he assists, the film never dwells on it or makes it more than it is. The gritty style of Peter Yates that worked so well in "Bullitt" is again on display, and gives the film a sense of realism and coherence. This is much appreciated in a story that could so easily have become tedious. In the end, "The Dresser" will bore many people silly, but it will truly be a delight to those who love British cinema. 7.7 out of 10',
    "So real and surreal, all in one. I remember feeling like Tessa. Heck, I remember being Tessa. This was a beautiful vignette of a relationship ending. I especially liked the protesters tangent. It is nice to see symbolism in a movie without being smacked over the head with it. If you get the chance to see this, take it. It is well worth the 30 minutes.",
    "This is a pale imitation of 'Officer and a Gentleman.' There is NO chemistry between Kutcher and the unknown woman who plays his love interest. The dialog is wooden, the situations hackneyed. It's too long and the climax is anti-climactic(!). I love the USCG, its men and women are fearless and tough. The action scenes are awesome, but this movie doesn't do much for recruiting, I fear. The script is formulaic, but confusing. Kutcher's character is trying to redeem himself for an accident that wasn't his fault? Costner's is raging against the dying of the light, but why? His 'conflict' with his wife is about as deep as a mud puddle. I saw this sneak preview for free and certainly felt I got my money's worth.",
    "I was at Wrestlemania VI in Toronto as a 10 year old, and the event I saw then was pretty different from what I saw on the Wrestlemania Collection DVD I just watched. I don't understand how the wwE doesn't have the rights to some of the old music, since most of those songs were created by the WWF they shouldn't have to worry about the licensing and royalty fees that prevent shows like SNL from releasing season sets. Its pretty stupid to whine about, but for me hearing Demolition come out to their theme music at a Wrestlemania in person was a memory that I never forgot, and it didn't exist on this DVD. What is the point of them even owning the rights to this huge library of video if they have to edit it so drastically to use it?",
    "Wow! What a movie if you want to blow your budget on the title and have it look real bad ask the guys that made this movie on how to do that. They could have spent the money on a good rewrite or something else. Or they could have spent it on beer when they made this movie at least it would have come out better.",
]

# Reviews must be preprocessed to match the integer encoding of the training data.

# In[25]:

from keras.datasets.imdb import get_word_index

vocab = get_word_index()
vocab = {k: (v + index_from) for k, v in vocab.items()}
# special tokens occupying indices 0-2 when index_from=3
vocab["<PAD>"] = 0
vocab["<START>"] = 1
vocab["<UNK>"] = 2

# In[32]:

{k: vocab[k] for k in list(vocab.keys())[:10]}

# Let's see how X_train is encoded.

# In[56]:

X_train_pad[0]

# In[52]:

print(X_train[0])

# Let's get the text back.

# In[53]:

id_to_word = {value: key for key, value in vocab.items()}
print(' '.join(id_to_word[id] for id in X_train[0]))

# Compare with the original review in lowercase:
#
# "this film was just brilliant casting location scenery story direction everyone's really suited the part they played and you could just imagine being there robert redford's is an amazing actor and now the same being director norman's father came from the same scottish island as myself so i loved the fact there was a real connection with this film the witty remarks throughout the film were great it was just brilliant so much that i bought the film as soon as it was released for retail and would recommend it to everyone to watch and the fly fishing was amazing really cried at the end it was so sad and you know what they say if you cry at a film it must have been good and this definitely was also congratulations to the two little boy's that played the part's of norman and paul they were just brilliant children are often left out of the praising list i think because the stars that play them all grown up are such a big profile for the whole film but these children are amazing and should be praised for what they have done don't you think the whole story was so lovely because it was true and was someone's life after all that was shared with us all"

# In[ ]:
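# For Exercise 09.2, a hedged sketch of one way to preprocess the raw reviews: the regex tokenization below is an assumption standing in for the tokenizer used to build the IMDB word index, and the cell assumes the `model` trained in Exercise 09.1 is available.

# In[ ]:

import re

def encode_review(text):
    # lowercase and split on anything that is not a letter, digit, or apostrophe
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    # encode like the training data: <START>, then word indices,
    # with unknown or beyond-top_words words mapped to <UNK>
    ids = [vocab["<START>"]]
    for w in tokens:
        i = vocab.get(w, vocab["<UNK>"])
        ids.append(i if i < top_words else vocab["<UNK>"])
    return ids

# pad exactly like the training data, then predict sentiment probabilities
X_reviews = sequence.pad_sequences([encode_review(r) for r in reviews],
                                   maxlen=max_review_length)
probs = model.predict(X_reviews)
for p, r in zip(probs, reviews):
    print('%.3f  %s...' % (p[0], r[:60]))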