# Questions tagged [sequence-to-sequence]

This tag is used for Google's now-deprecated seq2seq framework, an encoder-decoder library for TensorFlow (its revamped successor is the Neural Machine Translation (NMT) tutorial).

**0** votes · **0** answers · 33 views

### How to build and train a sequence-to-sequence model in tensorflow.js

Hi, I am trying to build a text summarizer using a sequence-to-sequence model in tensorflow.js. My dataset (example): {Text: i want to return this product because it was broken when i received it....

**0** votes · **0** answers · 15 views

### What is the purpose of Wa vector in the general approach of score function in the Luong Attention?

In the score function of TensorFlow's implementation of Luong Attention there is the code (GitHub link): # batched matmul on: # [batch_size, 1, depth] . [batch_size, depth, max_time] # resulting in ...
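
The "general" Luong score is score(h_t, h_s) = h_t^T W_a h_s: W_a is a learned matrix that projects the source states so the dot product with the decoder query is meaningful even when the two state spaces differ. A minimal numpy sketch of the batched matmul the snippet describes (shapes taken from the comments, variable names are mine):

```python
import numpy as np

batch, depth, max_time = 2, 4, 5
h_t = np.random.randn(batch, 1, depth)         # decoder query state
h_s = np.random.randn(batch, max_time, depth)  # encoder source states
W_a = np.random.randn(depth, depth)            # learned weight of the "general" score

# score(h_t, h_s) = h_t . W_a . h_s^T, batched over the first axis
keys = h_s @ W_a.T                              # project source states: [batch, max_time, depth]
scores = np.einsum('bid,btd->bit', h_t, keys)   # [batch, 1, max_time]

# softmax over source positions gives the attention weights
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
```

Without W_a this reduces to the simpler "dot" score, which requires encoder and decoder states to share a dimension.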

**0** votes · **0** answers · 20 views

### Long repetitive output after changing vocabulary in seq2seq model

I trained a neural question generation model, which produces sensible questions for the vocabulary that they distributed with the paper. I wanted to run the model on a different set of word embeddings ...

**0** votes · **2** answers · 42 views

### Variable Input for Sequence to Sequence Autoencoder

I implemented a sequence-to-sequence encoder-decoder, but I am having problems varying the target length in the prediction. It works for the same length as the training sequences but not if it ...

**1** vote · **1** answer · 43 views

### Difference between two sequence-to-sequence models in Keras (with and without RepeatVector)

I am trying to understand the difference between the model described here and the following one: from keras.layers import Input, LSTM, RepeatVector; from keras.models import Model; inputs=Input(shape=(...

**0** votes · **0** answers · 54 views

### seq2seq with variable output length

I implemented the Keras seq2seq model for time-series data; it works fine when I test it with the same sequence length. But when I want to use target_seqs > input_seqs (length=10000) for the ...

**0** votes · **1** answer · 67 views

### Creating a custom metric in Keras for sequence to sequence learning

I want to write a custom metric in Keras (python) to evaluate the performance of my sequence to sequence model as I train. Sequences are one-hot encoded and the tokens are words instead of characters. ...
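
One common choice for such a metric is per-token accuracy over the non-padding timesteps of the one-hot targets. A numpy sketch of the quantity such a metric would compute (function name and `pad_id` convention are my assumptions, not from the question):

```python
import numpy as np

def seq_token_accuracy(y_true, y_pred, pad_id=0):
    """Fraction of non-padding timesteps where the argmax prediction
    matches the one-hot target. Shapes: [batch, time, vocab]."""
    true_ids = y_true.argmax(axis=-1)
    pred_ids = y_pred.argmax(axis=-1)
    mask = true_ids != pad_id          # ignore padding positions
    return (true_ids == pred_ids)[mask].mean()

# toy check: 3 timesteps, one is padding, one of the other two is correct
y_true = np.eye(3)[[1, 2, 0]][None]
y_pred = np.eye(3)[[1, 0, 0]][None]
acc = seq_token_accuracy(y_true, y_pred)   # 1 of 2 unmasked tokens correct
```

In Keras the same logic would be expressed with backend ops (`K.argmax`, `K.cast`) so it can run on tensors during training.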

**0** votes · **1** answer · 90 views

### sequence tagging task in tensorflow using bidirectional lstm

I am interested in sequence tagging for NER. I followed the code at "https://github.com/monikkinom/ner-lstm/blob/master/model.py" to build my model as below: X=tf.placeholder(tf.float32, shape=[...

**0** votes · **0** answers · 170 views

### Keras timestep-wise concatenation with variable length of input sequence

I'm working on implementing a seq2seq-inspired model that uses GRU cells instead of LSTM. Here's my current code: import constants; from keras.layers import Input, GRU, Dense, TimeDistributed, Dropout...

**0** votes · **0** answers · 67 views

### Keras binary crossentropy on sequence to sequence prediction turns negative

I am trying to predict whether an appliance is turned on from the power signal of the whole household. I built a 1D CNN with 1440 timestamps in and 1440 timestamps out. It is compiled with a ...

**1** vote · **0** answers · 75 views

### Why is the prediction shape of my Keras model inconsistent with the label shape during training?

I have built a Keras ConvLSTM neural network, and I want to predict one frame ahead based on a sequence of 10 time steps. Model: from keras.models import Sequential; from keras.layers.convolutional ...

**1** vote · **1** answer · 364 views

### Concept of Bucketing in Seq2Seq model

To handle sequences of different lengths we use bucketing and padding. In bucketing we create a separate bucket for each of several max_len values, which reduces the amount of padding; after making different ...
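
The idea can be sketched in a few lines: put each sequence into the smallest bucket that fits it, then pad only up to that bucket's length instead of the global maximum (a hypothetical helper, not from any particular framework):

```python
import numpy as np

def bucket_and_pad(seqs, buckets=(5, 10, 20), pad_id=0):
    """Assign each sequence to the smallest bucket that fits it and pad
    to that bucket's max_len, rather than padding everything to the
    length of the longest sequence in the dataset."""
    out = {b: [] for b in buckets}
    for s in seqs:
        for b in buckets:
            if len(s) <= b:
                out[b].append(s + [pad_id] * (b - len(s)))
                break
        # sequences longer than the largest bucket are silently dropped here
    return {b: np.array(rows) for b, rows in out.items() if rows}

batches = bucket_and_pad([[1, 2], [3, 4, 5, 6, 7, 8], [9]])
```

Each bucket then yields uniformly shaped batches, so a short sequence never pays the padding cost of the longest one.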

**-1** votes · **1** answer · 124 views

### In word embedding, how to map the vector to word?

I checked all the APIs and couldn't find a way to map a vector back to a word, whether in word2vec or GloVe. Google doesn't help much. Does anybody know how to do this? Background: I'm training a chatbot by ...
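
Neither word2vec nor GloVe stores an exact reverse map; the usual approach is nearest-neighbour search against the embedding matrix, e.g. by cosine similarity. A numpy sketch with toy vectors (the function and vocabulary are illustrative):

```python
import numpy as np

def nearest_word(vec, embeddings, vocab):
    """Map a vector back to the vocabulary word whose embedding has
    the highest cosine similarity with it."""
    emb = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    v = vec / np.linalg.norm(vec)
    return vocab[int(np.argmax(emb @ v))]

vocab = ["hello", "world", "bye"]
embeddings = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
word = nearest_word(np.array([0.9, 0.1]), embeddings, vocab)
```

Gensim's word2vec wrapper exposes the same idea as `similar_by_vector`; for large vocabularies an approximate index (e.g. Annoy or FAISS) replaces the brute-force matmul.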

**1** vote · **0** answers · 97 views

### I would like to have an example of using Tensorflow ConvLSTMCell

I would like to have a small example of building an encoder-decoder network using TensorFlow's ConvLSTMCell. Thanks!

**2** votes · **1** answer · 487 views

### TensorFlow sequence_loss with label_smoothing

Would it be possible to use the label_smoothing feature from tf.losses.softmax_cross_entropy with tf.contrib.seq2seq.sequence_loss? I can see that sequence_loss optionally takes a ...

**0** votes · **0** answers · 171 views

### How to build an encoder-decoder model with Tensorflow ConvLSTMCell?

I would be really thankful if someone could explain how I can build an encoder-decoder model with TensorFlow's ConvLSTMCell(), tf.nn.dynamic_rnn(), and tf.contrib.legacy_seq2seq.rnn_decoder(). ...

**2** votes · **1** answer · 63 views

### Tensorflow seq2seq: 'Tensor' object is not iterable

I am using the seq2seq code below and got the following error: cell=tf.nn.rnn_cell.BasicLSTMCell(size); a, b=tf.nn.dynamic_rnn(cell, seq_input, dtype=tf.float32); cell_a=tf.contrib.rnn....

**1** vote · **1** answer · 172 views

### How to build a decoder using dynamic rnn in Tensorflow?

I know how to build an encoder using dynamic RNN in TensorFlow, but my question is: how can we use it for the decoder? Because in the decoder, at each time step we should feed in the prediction of the previous time ...
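
At inference time the decoder is essentially a loop: start from a GO token, run one cell step, take the argmax of the logits, and feed that token back in as the next input until EOS. A framework-free numpy sketch with a dummy step function standing in for the RNN cell (all names and the toy dynamics are mine):

```python
import numpy as np

GO, EOS, VOCAB = 0, 1, 5

def decoder_step(state, token):
    """Stand-in for one RNN cell step: returns (new_state, logits).
    A real decoder would run the cell plus an output projection here."""
    state = state + token
    logits = -np.abs(np.arange(VOCAB) - (state % VOCAB))  # toy logits
    return state, logits

def greedy_decode(init_state, max_len=10):
    state, token, out = init_state, GO, []
    for _ in range(max_len):
        state, logits = decoder_step(state, token)
        token = int(np.argmax(logits))   # this step's prediction...
        if token == EOS:
            break
        out.append(token)                # ...becomes the next step's input
    return out

seq = greedy_decode(init_state=2)
```

In TF 1.x this loop is what `tf.contrib.seq2seq` packages up: a `GreedyEmbeddingHelper` plus `dynamic_decode` replaces the hand-written feedback, since `dynamic_rnn` alone only consumes precomputed inputs.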

**0** votes · **0** answers · 41 views

### What are the correct training batches, given a sequence length of RNNs?

My question comes from this tutorial about RNNs, but it could be a general RNN implementation question. Suppose we want to develop a model to predict the next character using an RNN, and we have the ...

**4** votes · **1** answer · 85 views

### AttentionDecoderRNN without MAX_LENGTH

From the PyTorch Seq2Seq tutorial, http://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html#attention-decoder we see that the attention mechanism is heavily reliant on the ...

**1** vote · **0** answers · 392 views

### TensorFlow BeamSearchDecoder outputs the sample_id as (actual sample_id+1)

Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes. Based on the NMT tutorial, I am writing customized code for my own task. OS Platform and ...

**0** votes · **0** answers · 26 views

### Including context information in Sequence2Sequence (Keras)

I have been following the standard Keras tutorial for sequence-to-sequence machine translation at this site. For solving my concrete problem, I need to somehow extend this "standard" approach by ...

**3** votes · **1** answer · 214 views

### What does the “source hidden state” refer to in the Attention Mechanism?

The attention weights are computed as shown [formula image]. I want to know what h_s refers to. In the TensorFlow code, the encoder RNN returns a tuple: encoder_outputs, encoder_state=tf.nn.dynamic_rnn(...). As I ...

**0** votes · **0** answers · 261 views

### How to use LSTM Autoencoder output, trained on variant-length sequences, for unsupervised clustering with DBSCAN?

I am a medical doctor and new to machine learning. I am using a Time-aware LSTM to model patients' trajectories (disease progression) based on time-stamped diagnosis (sequential time-series) data ...

**2** votes · **1** answer · 201 views

### Is tensorflow embedding_lookup differentiable?

Some of the tutorials I came across described using a randomly initialized embedding matrix and then using the tf.nn.embedding_lookup function to obtain the embeddings for the integer sequences. I am ...
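
It is differentiable with respect to the embedding matrix: a lookup is a row gather, and its gradient is a scatter-add of the upstream gradient into exactly the rows that were gathered (rows never looked up receive zero gradient). A numpy sketch of both directions:

```python
import numpy as np

vocab, dim = 4, 3
E = np.random.randn(vocab, dim)   # embedding matrix (a trainable variable)
ids = np.array([2, 0, 2])         # integer token sequence

# forward: embedding_lookup is just row gathering
out = E[ids]                      # shape [3, dim]

# backward: gradient w.r.t. E scatter-adds the upstream gradient into
# the gathered rows; row 2 was looked up twice, so its gradients sum
g_out = np.ones_like(out)         # pretend upstream gradient of ones
g_E = np.zeros_like(E)
np.add.at(g_E, ids, g_out)
```

This is why TF represents the result as an `IndexedSlices` gradient: only the touched rows are updated, which is what makes randomly initialized embeddings trainable end-to-end.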

**1** vote · **1** answer · 295 views

### Seq2Seq Models for Chatbots

I am building a chatbot with a sequence-to-sequence encoder-decoder model as in NMT. From the data given I can understand that, when training, they feed the decoder outputs into the decoder inputs ...

**2** votes · **1** answer · 136 views

### Tensorflow seq2seq error

I'm trying the seq2seq TensorFlow RNN tutorial from here: https://github.com/tensorflow/models/tree/master/tutorials/rnn. Here's how I create the model: model=seq2seq_model.Seq2SeqModel(...

**0** votes · **0** answers · 43 views

### Loading a model in sequence to sequence for inference

I am using my own TensorFlow code. I have created three separate graphs: a train graph, an eval graph, and an infer graph. After training, I save my model. Now when I try to load this model during inference I ...

**3** votes · **2** answers · 652 views

### Tensorflow RNN: how to infer a sequence without duplicates?

I'm working on a seq2seq RNN generating an output sequence of labels given a seed label. During the inference step I'd like to generate sequences containing only unique labels (i.e. skip labels that ...
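
One common way to enforce this is to mask the logits of already-emitted labels with `-inf` before the argmax at each decoding step. A numpy sketch with toy per-step logits (the helper is illustrative, not the asker's code):

```python
import numpy as np

def decode_unique(step_logits):
    """Greedy decoding that never repeats a label: before each argmax,
    the logits of labels already emitted are set to -inf."""
    emitted, out = set(), []
    for logits in step_logits:
        masked = logits.copy()
        masked[list(emitted)] = -np.inf   # forbid duplicate labels
        label = int(np.argmax(masked))
        emitted.add(label)
        out.append(label)
    return out

# toy: every step prefers label 1, so later steps fall back to runners-up
logits = np.array([[0.1, 0.9, 0.5],
                   [0.2, 0.8, 0.4],
                   [0.3, 0.7, 0.1]])
labels = decode_unique(logits)
```

Inside a TF graph the same effect can be had by adding a large negative penalty to the logits of previously sampled ids before the helper's `sample` step.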

**0** votes · **0** answers · 27 views

### Training of sequence to sequence models

I am wondering: when training a sequence-to-sequence model, should we feed in the ground truth? I mean, for the prediction at time step (t+1), should we feed the decoder with the prediction of ...
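
This is the teacher-forcing question: during training the decoder input at step t+1 is usually the ground-truth token at step t, while at inference it must be the model's own prediction; scheduled sampling mixes the two by flipping a coin per timestep. A small sketch of that per-step choice (function name and the coin-flip scheme are illustrative):

```python
import random

def decoder_inputs(ground_truth, predictions, teacher_forcing_ratio=0.5, seed=0):
    """Choose, per timestep, whether the decoder sees the ground-truth
    token (teacher forcing) or its own previous prediction (free
    running). ratio=1.0 is pure teacher forcing, 0.0 pure free running."""
    rng = random.Random(seed)
    return [gt if rng.random() < teacher_forcing_ratio else pred
            for gt, pred in zip(ground_truth, predictions)]

forced = decoder_inputs([1, 2, 3, 4], [9, 9, 9, 9], teacher_forcing_ratio=1.0)
free = decoder_inputs([1, 2, 3, 4], [9, 9, 9, 9], teacher_forcing_ratio=0.0)
```

Pure teacher forcing trains faster but causes exposure bias (the model never sees its own mistakes); annealing the ratio toward 0 over training is the usual compromise.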