This tag is for Google's deprecated seq2seq framework, an encoder-decoder framework for TensorFlow (its revamped successor is called Neural Machine Translation).


0 votes · 0 answers · 25 views

What are the correct training batches for an RNN, given a sequence length?

My question comes from this tutorial about RNNs, but it could be a general RNN implementation question. Suppose we want to develop a model to predict the next character using an RNN, and we have the ...
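
For this kind of setup, a minimal sketch of the usual batching scheme (the toy corpus and sizes below are made up): inputs are fixed-length character windows, and targets are the same windows shifted one step to the right.

    import numpy as np

    text = "hello world, hello rnn"          # toy stand-in corpus
    vocab = sorted(set(text))
    char_to_ix = {c: i for i, c in enumerate(vocab)}
    data = np.array([char_to_ix[c] for c in text])

    seq_length = 5
    # Each input window is paired with the same window shifted one
    # character right, so the model learns next-character prediction.
    inputs = [data[i:i + seq_length] for i in range(len(data) - seq_length)]
    targets = [data[i + 1:i + seq_length + 1] for i in range(len(data) - seq_length)]
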
0 votes · 1 answer · 31 views · +50 bounty

AttentionDecoderRNN without MAX_LENGTH

From the PyTorch Seq2Seq tutorial (http://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html#attention-decoder), we see that the attention mechanism is heavily reliant on the ...
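
One common way around the fixed MAX_LENGTH is to score the decoder state against the actual encoder outputs, whatever their length. A minimal PyTorch sketch of dot-product attention (not the tutorial's exact layer):

    import torch
    import torch.nn.functional as F

    def attend(decoder_hidden, encoder_outputs):
        # decoder_hidden: (batch, hidden); encoder_outputs: (batch, src_len, hidden).
        # Dot-product scores work for any src_len, so no MAX_LENGTH is needed.
        scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)
        weights = F.softmax(scores, dim=1)             # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)
        return context, weights
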
1 vote · 0 answers · 22 views

TensorFlow BeamSearchDecoder outputs the sample_id as (actual sample_id+1)

Have I written custom code (as opposed to using a stock example script provided in TensorFlow): Yes. Based on the NMT tutorial, I am writing customized code for my own task. OS Platform and ...
0 votes · 0 answers · 12 views

Including context information in Sequence2Sequence (Keras)

I have been following the standard Keras tutorial for Sequence2Sequence machine translation at this site. To solve my concrete problem, I need to somehow extend this "standard" approach by ...
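
One plausible extension, sketched below with made-up sizes, is to concatenate a fixed-size context vector onto the encoder states and project back down before handing them to the decoder as its initial state (a sketch, not the tutorial's code):

    from tensorflow.keras.layers import Input, LSTM, Dense, Concatenate
    from tensorflow.keras.models import Model

    latent_dim, num_tokens, ctx_dim = 256, 100, 8   # made-up sizes

    encoder_inputs = Input(shape=(None, num_tokens))
    _, state_h, state_c = LSTM(latent_dim, return_state=True)(encoder_inputs)

    # Hypothetical extension: fold a fixed-size context vector into the
    # decoder's initial state by concatenating and projecting back down.
    context = Input(shape=(ctx_dim,))
    state_h = Dense(latent_dim)(Concatenate()([state_h, context]))
    state_c = Dense(latent_dim)(Concatenate()([state_c, context]))

    decoder_inputs = Input(shape=(None, num_tokens))
    decoder_out = LSTM(latent_dim, return_sequences=True)(
        decoder_inputs, initial_state=[state_h, state_c])
    outputs = Dense(num_tokens, activation="softmax")(decoder_out)
    model = Model([encoder_inputs, decoder_inputs, context], outputs)
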
3 votes · 1 answer · 44 views

What does the “source hidden state” refer to in the Attention Mechanism?

The attention weights are computed as [formula image not shown]. I want to know what h_s refers to. In the TensorFlow code, the encoder RNN returns a tuple: encoder_outputs, encoder_state = tf.nn.dynamic_rnn(...). As I ...
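
In the usual reading of that code, h_s means the per-position encoder hidden states, i.e. encoder_outputs, not the final encoder_state. A small sketch of Luong-style "general" scoring over those states (shapes made up, TF 1.x style):

    import tensorflow as tf

    batch, src_len, num_units = 4, 7, 16
    src = tf.random_normal([batch, src_len, num_units])
    cell = tf.nn.rnn_cell.GRUCell(num_units)

    # encoder_outputs stacks the hidden state at every source position;
    # these per-position states are the h_s in the score. encoder_state
    # is only the final state (often used to initialize the decoder).
    encoder_outputs, encoder_state = tf.nn.dynamic_rnn(cell, src, dtype=tf.float32)

    h_t = tf.random_normal([batch, num_units])     # current decoder state
    W_a = tf.get_variable("W_a", [num_units, num_units])
    proj = tf.matmul(h_t, W_a)                     # (batch, num_units)
    scores = tf.reduce_sum(tf.expand_dims(proj, 1) * encoder_outputs, axis=2)
    alignments = tf.nn.softmax(scores)             # weights over source positions
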
0 votes · 0 answers · 35 views

How to use LSTM Autoencoder output, trained on variable-length sequences, for unsupervised clustering with DBSCAN?

I am a medical doctor and new to machine learning. I am using a Time-aware LSTM to model patients' trajectories (disease progression) based on time-stamped diagnosis (sequential time-series) data ...
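
A rough sketch of the clustering half, assuming each patient has already been reduced to one fixed-length vector (e.g. the encoder's final hidden state); the array below is a random stand-in:

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import DBSCAN

    embeddings = np.random.rand(100, 64)   # stand-in for per-patient encoder vectors

    X = StandardScaler().fit_transform(embeddings)
    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)   # -1 marks noise points
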
-1 vote · 0 answers · 10 views

Save seq2seq predictions to a file

In https://github.com/google/seq2seq, the predictions are currently printed to the console and piped to a text file as below:

    python -m bin.infer \
      --tasks "
        - class: DecodeText
        - class: ...
2 votes · 1 answer · 42 views

Is TensorFlow's embedding_lookup differentiable?

Some of the tutorials I came across described using a randomly initialized embedding matrix and then using the tf.nn.embedding_lookup function to obtain the embeddings for the integer sequences. I am ...
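
It is: the lookup is differentiable with respect to the embedding matrix, and only the gathered rows receive gradient. A small check, sketched in TF 1.x style:

    import tensorflow as tf

    embeddings = tf.get_variable("embeddings", [100, 8])   # randomly initialized
    ids = tf.constant([3, 1, 4])
    vectors = tf.nn.embedding_lookup(embeddings, ids)
    loss = tf.reduce_sum(vectors ** 2)

    # The gradient exists and is sparse: only rows 3, 1 and 4 get updates.
    grad = tf.gradients(loss, embeddings)[0]   # an IndexedSlices
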
1 vote · 1 answer · 62 views

Seq2Seq Models for Chatbots

I am building a chatbot with a sequence-to-sequence encoder-decoder model, as in NMT. From the data given, I understand that during training they feed the decoder outputs into the decoder inputs ...
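
That training trick is teacher forcing: the decoder input at each step is the ground-truth previous token, i.e. the target sequence shifted right behind a start token. A toy sketch (token ids invented):

    GO, EOS = 1, 2                       # hypothetical special token ids
    target = [5, 8, 9, EOS]              # desired decoder output

    # Teacher forcing: feed the ground-truth previous token as input.
    decoder_inputs = [GO] + target[:-1]  # [GO, 5, 8, 9]
    decoder_targets = target             # [5, 8, 9, EOS]
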
2 votes · 1 answer · 46 views

TensorFlow seq2seq error

I'm trying the TensorFlow seq2seq RNN tutorial from here: https://github.com/tensorflow/models/tree/master/tutorials/rnn. Here's how I create a model: model = seq2seq_model.Seq2SeqModel(...
0 votes · 0 answers · 35 views

Loading a sequence-to-sequence model for inference

I am using my own TensorFlow code. I have created three separate graphs: a train graph, an eval graph, and an infer graph. After training, I save my model. Now, when I try to load this model during inference, I ...
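
With separate graphs, the usual pattern is to build the infer graph first and then restore the checkpoint into a session bound to that graph; variable names must match the train graph. A stripped-down sketch (the single variable and checkpoint path are stand-ins):

    import tensorflow as tf

    infer_graph = tf.Graph()
    with infer_graph.as_default():
        w = tf.get_variable("w", [4])          # stand-in for the infer model
        saver = tf.train.Saver()

    with tf.Session(graph=infer_graph) as sess:
        saver.restore(sess, tf.train.latest_checkpoint("./checkpoints"))
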
3 votes · 2 answers · 183 views

Tensorflow RNN: how to infer a sequence without duplicates?

I'm working on a seq2seq RNN generating an output sequence of labels given a seed label. During the inference step I'd like to generate sequences containing only unique labels (i.e. skip labels that ...
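
One way to do this, sketched here post hoc over per-step logits rather than inside the real decode loop, is to mask out every label that has already been emitted before taking the argmax:

    import numpy as np

    def decode_unique(step_logits):
        # step_logits: (time, vocab) array of decoder scores per step.
        used, output = set(), []
        for logits in step_logits:
            masked = logits.copy()
            masked[list(used)] = -np.inf   # forbid already-emitted labels
            label = int(np.argmax(masked))
            used.add(label)
            output.append(label)
        return output
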
0 votes · 0 answers · 20 views

Training of sequence-to-sequence models

I am wondering: when training a sequence-to-sequence model, should we feed in the ground truth? I mean, for the prediction at time step (t+1), should we feed the decoder with the prediction of ...
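
Both options exist: pure teacher forcing feeds the ground truth, while scheduled sampling mixes in the model's own predictions with some probability. A one-function sketch of the per-step choice:

    import random

    def next_decoder_input(ground_truth_token, predicted_token, sampling_prob):
        # Scheduled sampling: with probability sampling_prob feed the model's
        # own prediction; sampling_prob = 0 is pure teacher forcing.
        return predicted_token if random.random() < sampling_prob else ground_truth_token
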
0 votes · 1 answer · 55 views

Optimizing the neural network after each output (in sequence-to-sequence learning)

In sequence-to-sequence learning, when we are predicting more than one step ahead, should we optimize the neural network after each output, or should we optimize over the outputs of the whole sequence together? ...
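
The common choice is to sum the per-step losses over the whole sequence and take one optimizer step, rather than one update per output. A PyTorch sketch with stand-in tensors:

    import torch
    import torch.nn as nn

    criterion = nn.CrossEntropyLoss()
    logits = torch.randn(4, 2, 10, requires_grad=True)   # (time, batch, vocab)
    targets = torch.randint(0, 10, (4, 2))               # (time, batch)

    # Accumulate the loss over all time steps, then one backward/step.
    loss = sum(criterion(logits[t], targets[t]) for t in range(logits.size(0)))
    loss.backward()
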
0 votes · 1 answer · 35 views

Training a trained seq2seq model on additional training data

I have trained a seq2seq model with 1M samples and saved the latest checkpoint. Now, I have some additional training data of 50K sentence pairs that was not seen in the previous training data. How ...
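
The usual approach is to restore the saved checkpoint and simply keep running the train op on the new pairs (often mixed with some old data to limit forgetting). A stripped-down TF 1.x sketch, with a single variable and a dummy op standing in for the real model:

    import tensorflow as tf

    w = tf.get_variable("w", [4])               # stand-in for model weights
    train_op = tf.assign_add(w, tf.ones([4]))   # stand-in for the real train op

    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, tf.train.latest_checkpoint("./checkpoints"))
        sess.run(train_op)                      # continue training on new data
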
