Commit Graph

259 Commits

Author SHA1 Message Date
Eren Golge 9d5a5b0764 linter 2019-10-24 14:34:31 +02:00
Eren Golge ea32f2368d linter fix 2019-10-24 14:11:07 +02:00
Eren Golge 77f5fd0584 compute and config update with new attention entropy loss 2019-10-15 15:05:42 +02:00
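An attention entropy loss penalizes diffuse alignments by rewarding sharp attention distributions. A minimal sketch of what such a term might look like in PyTorch; the function name and shapes are illustrative, not the repo's actual API:

```python
import torch

def attention_entropy_loss(alignments: torch.Tensor) -> torch.Tensor:
    """Mean entropy of the attention distributions (sketch).

    alignments: (batch, decoder_steps, encoder_steps); each row sums to 1.
    Lower entropy means sharper, more confident alignments.
    """
    eps = 1e-8
    entropy = -(alignments * torch.log(alignments + eps)).sum(dim=-1)
    return entropy.mean()
```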
Eren Golge 53ec066733 replace zeros() with a better alternative 2019-10-12 18:34:12 +02:00
Eren Golge 98af061d2e formatting, merge GST model with Tacotron 2019-09-24 16:18:48 +02:00
Eren Golge e8d29613f1 fix stop condition 2019-09-24 15:38:28 +02:00
Eren Golge 9a2bd7f9af fix for 2 dim memory tensor 2019-09-23 10:25:51 +02:00
Eren Golge e085c4757d bug fix 2019-09-23 10:25:51 +02:00
Eren Golge 8d3775a7d6 Update tacotron2 for gradual training and change the indexing of prenet inputs to pick the last frame 2019-09-23 10:25:51 +02:00
Eren Golge a1322530df integrate concatenative speaker embedding into tacotron 2019-09-12 10:39:15 +02:00
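Gradual training steps the decoder's reduction factor r down as training progresses. A hypothetical schedule helper, assuming a list of (start_step, r) pairs; the format is illustrative, not the repo's config schema:

```python
# Hypothetical schedule: r frames per decoder step, keyed by global step.
GRADUAL_SCHEDULE = [(0, 7), (10_000, 5), (50_000, 3), (130_000, 2), (290_000, 1)]

def current_r(global_step, schedule=GRADUAL_SCHEDULE):
    """Return the reduction factor active at this training step."""
    r = schedule[0][1]
    for start_step, value in schedule:
        if global_step >= start_step:
            r = value
    return r
```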
Reuben Morais 3c5aeb5e22 Fix installation by using an explicit symlink 2019-08-29 11:49:53 +02:00
Eren Golge 72ad58d893 change the bitwise operation for masking and small fixes 2019-08-19 16:24:28 +02:00
Eren Golge c637aa04a2 pylint 2019-08-16 15:49:12 +02:00
Eren Golge 5629292bde bug fixes 2019-08-16 15:08:04 +02:00
Eren Golge 8fde0ac00e pylint 2019-08-16 14:23:26 +02:00
Eren Golge b22c7d4a29 Merge branch 'dev-gradual-queue' into dev 2019-08-16 13:20:17 +02:00
Eren Golge 64f2b95c31 update regarding torch 1.2 2019-08-13 12:14:34 +02:00
Thomas Werkmeister ab42396fbf undo loc attn after fwd attn 2019-07-25 13:04:41 +02:00
Thomas Werkmeister f3dac0aa84 updating location attn after calculating fwd attention 2019-07-24 11:49:07 +02:00
Thomas Werkmeister 40f56f9b00 simplified code for fwd attn 2019-07-24 11:47:06 +02:00
Thomas Werkmeister a6118564d5 renamed query_rnn back to attention_rnn 2019-07-24 11:46:34 +02:00
Thomas Werkmeister fb7c5b1996 unused instance vars 2019-07-23 20:02:31 +02:00
Thomas Werkmeister 82db35530f unused var 2019-07-23 19:33:56 +02:00
Thomas Werkmeister 98edb7a4f8 renamed attention_rnn to query_rnn 2019-07-23 18:38:09 +02:00
Eren Golge 2bbb3f7a40 don't use sigmoid output for tacotron, fix bug for memory queue handling, remove maxout 2019-07-22 15:09:05 +02:00
Eren Golge f038b1aa3f new way of handling the memory queue; enable/disable queuing under the correct conditions 2019-07-22 02:10:21 +02:00
Eren Golge d47ba5d310 gradual training with memory queue 2019-07-20 12:33:21 +02:00
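For context on the memory-queue commits above: instead of feeding the prenet only the r frames produced in the current step, the decoder can keep a FIFO of the most recent output frames. A minimal sketch, assuming PyTorch; the class and method names are hypothetical:

```python
import torch

class MemoryQueue:
    """FIFO over the most recent decoder output frames (sketch)."""

    def __init__(self, batch_size: int, frame_dim: int, queue_size: int):
        # Flattened queue of `queue_size` frames per batch item.
        self.queue = torch.zeros(batch_size, queue_size * frame_dim)

    def push(self, new_frames: torch.Tensor) -> torch.Tensor:
        # new_frames: (batch, r * frame_dim). Drop the oldest frames,
        # append the newest, and return the prenet input.
        self.queue = torch.cat(
            [self.queue[:, new_frames.shape[1]:], new_frames], dim=1)
        return self.queue
```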
Reuben Morais 11e7895329 Fix Pylint issues 2019-07-19 09:08:51 +02:00
Eren Golge 0f0ec679ec small refactoring 2019-07-16 21:15:24 +02:00
Eren Golge 72c5062c02 use memory queue if r is smaller than queue size 2019-07-16 15:18:47 +02:00
Eren Golge f8195834ee TODO added 2019-07-16 15:18:00 +02:00
Eren Golge c72470bcfc update forward attention 2019-06-24 16:57:29 +02:00
Eren Golge 88575edd5a gst_layers 2019-06-05 18:34:48 +02:00
Eren Golge d7e0f828cf remove print 2019-06-04 00:40:03 +02:00
Eren Golge 4678c66599 forward_attn_mask and config update 2019-06-04 00:39:29 +02:00
Eren Golge f774db0241 bug fix #207 2019-05-29 00:37:41 +02:00
Eren Golge 013ec2f168 bug fix for tacotron adapting it to new common layers 2019-05-28 14:29:17 +02:00
Eren Golge 0b5a00d29e enforce monotonic attention in forward attention for batches 2019-05-28 14:28:32 +02:00
Eren Golge 35b76556e4 Use Attention and Prenet from common file 2019-05-27 15:30:57 +02:00
Eren Golge 59ba37904d enforce monotonic attention for forward attention in eval time 2019-05-27 14:41:30 +02:00
Eren Golge ba492f43be Set tacotron model parameters to adapt to common_layers.py - Prenet and Attention 2019-05-27 14:40:28 +02:00
Eren Golge 2586be7d33 batch-wise operation 2019-05-24 13:40:56 +02:00
Eren Golge 3a4a3e571a Force alignment of forward attention 2019-05-24 13:18:18 +02:00
Eren Golge e62659da94 update separate stopnet flow to make it faster. 2019-05-17 16:15:43 +02:00
Eren Golge bb2b705e01 small bug fixes 2019-05-14 13:53:26 +02:00
Eren Golge 2b60f9a731 Fix trans agent implementation in relation to the paper. Use query vector instead of processed_query 2019-05-12 17:39:43 +02:00
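The forward-attention commits follow Zhang et al. (2018), where the alignment either stays at the current encoder position or advances by one, gated by a transition agent u computed from the context, query, and previous output. A sketch of one update step, with illustrative names:

```python
import torch

def forward_attention_step(alignment, prev_alpha, u):
    """One forward-attention update (sketch of Zhang et al., 2018).

    alignment:  (batch, T_enc) attention weights for the current step.
    prev_alpha: (batch, T_enc) forward variables from the previous step.
    u:          (batch, 1) transition agent output (sigmoid, in [0, 1]).
    """
    # Shifting prev_alpha right by one position models a monotonic move.
    shifted = torch.cat(
        [torch.zeros_like(prev_alpha[:, :1]), prev_alpha[:, :-1]], dim=1)
    alpha = ((1 - u) * prev_alpha + u * shifted + 1e-8) * alignment
    return alpha / alpha.sum(dim=1, keepdim=True)
```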
Eren Golge 6331bccefc make dropout optional #2 2019-05-12 17:35:31 +02:00
Eren Golge 820d18c922 make dropout at prenet optional 2019-05-12 17:34:57 +02:00
Eren Golge afb5a17221 bug fix 2019-04-30 10:59:29 +02:00
Eren Golge e2439fde9a make location attention optional and keep all attention weights in attention class 2019-04-29 11:37:01 +02:00
Eren Golge 38213dffe9 bug fix #2 2019-04-18 18:55:37 +02:00
Eren Golge 9ba13b2d2f fix forward attention 2019-04-18 18:36:01 +02:00
Eren Golge f450fe3571 use stop token again 2019-04-18 15:20:19 +02:00
Eren Golge 3c2d500f53 Changes to windowing and some comments 2019-04-12 16:13:40 +02:00
Eren Golge 312a539a0e Enable optional forward attention with transition agent 2019-04-10 16:41:30 +02:00
Eren Golge e3647fa7b3 bug fix for prenet setup 2019-04-08 18:28:19 +02:00
Eren Golge 961af0f5cd setup_model externally based on model selection. Make forward attention and prenet type configurable in config.json 2019-04-05 17:49:18 +02:00
Eren Golge 043e49f367 active windowing 2019-04-05 08:40:09 +02:00
Eren Golge 6e8b66aa89 smaller windowing range 2019-04-02 10:35:13 +02:00
Eren Golge 68f8ef730d stop conditioning with padding for inference_truncated 2019-04-01 14:10:38 +02:00
Eren Golge 5212a11836 longer stop_token padding 2019-04-01 12:07:24 +02:00
Eren Golge e1cd253d65 change stop conditioning 2019-03-31 16:44:17 +02:00
Eren Golge fdca8402c7 config updates 2019-03-26 15:46:26 +01:00
Eren Golge 0a92c6d5a7 Set attention norm method by config.json 2019-03-26 00:48:12 +01:00
Eren Golge 786510cd6a loss functions updates 2019-03-23 17:33:47 +01:00
Eren Golge 82cde95cfa add bias to attention v 2019-03-19 12:21:36 +01:00
Eren Golge 1b68d3cb4e control synthesis length as an additional stop condition 2019-03-15 14:01:43 +01:00
Eren Golge 4f89029577 Merge branch 'state-pass' into dev-tacotron2 2019-03-12 09:52:15 +01:00
Eren Golge 3128378bdf bug fix for stop token prediction 2019-03-12 00:20:57 +01:00
Eren Golge 527567d7ce renaming 2019-03-12 00:20:43 +01:00
Eren Golge b9b79fcf0f inference truncated NEEDS TO BE TESTED 2019-03-11 17:40:09 +01:00
Eren Golge 4ffda89c42 reshape input vectors before and after bn layer 2019-03-11 13:03:43 +01:00
Eren Golge a144acf466 use BN for prenet 2019-03-07 16:28:50 +01:00
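The two prenet commits above swap dropout for BatchNorm; since BatchNorm1d expects (batch, features), time-major decoder inputs must be reshaped around it. A hypothetical sketch:

```python
import torch
from torch import nn

class BNPrenet(nn.Module):
    """Prenet variant with BatchNorm instead of dropout (sketch)."""

    def __init__(self, in_dim: int, sizes=(256, 128)):
        super().__init__()
        dims = (in_dim,) + tuple(sizes)
        self.layers = nn.ModuleList(
            nn.Sequential(nn.Linear(d_in, d_out),
                          nn.BatchNorm1d(d_out),
                          nn.ReLU())
            for d_in, d_out in zip(dims[:-1], dims[1:]))

    def forward(self, x):
        # x: (batch, T, in_dim) -> flatten time into batch for BN,
        # then restore the original shape afterwards.
        b, t, _ = x.shape
        x = x.reshape(b * t, -1)
        for layer in self.layers:
            x = layer(x)
        return x.reshape(b, t, -1)
```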
Eren Golge 4b116a2a88 Look for the last two attention values for stop condition and attend to the first encoder vector if it is the first decoder iteration 2019-03-06 23:46:02 +01:00
Eren Golge b031a65677 compute sequence mask in model, add tacotron2 related files 2019-03-06 13:14:58 +01:00
Eren Golge a4474abd83 tacotron parse output bug fix 2019-03-06 13:10:54 +01:00
Eren Golge 4326582bb1 TTSDataset formatting and batch sorting to use pytorch pack for rnns 2019-03-06 13:10:05 +01:00
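The batch sorting mentioned here is what pack_padded_sequence needed in this era of PyTorch: lengths had to be in descending order so the RNN could skip padded frames. A self-contained sketch:

```python
import torch
from torch import nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

rnn = nn.LSTM(input_size=80, hidden_size=256, batch_first=True)
inputs = torch.randn(4, 100, 80)           # (batch, max_len, features)
lengths = torch.tensor([100, 80, 60, 30])  # sorted descending by the dataset
packed = pack_padded_sequence(inputs, lengths, batch_first=True)
outputs, _ = rnn(packed)                   # padded steps are never computed
outputs, _ = pad_packed_sequence(outputs, batch_first=True)
```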
Eren Golge 44c66c6e3e remove comments 2019-03-05 13:34:33 +01:00
Eren Golge 1e8fdec084 Modularize functions in Tacotron 2019-03-05 13:25:50 +01:00
Eren Golge 1c99be2ffd Change window size for attention 2019-02-18 13:06:26 +01:00
Eren Golge 6ea31e47df Constant queue size for autoregression window 2019-02-16 03:18:49 +01:00
Eren Golge 90f0cd640b memory queueing 2019-02-12 15:27:42 +01:00
Eren Golge c5b6227848 init with embedding layers 2019-02-06 16:54:33 +01:00
Eren Golge d28bbe09fb Revert attention bias setting to old 2019-02-06 16:23:01 +01:00
Eren Golge e12bbc2a5c init decoder states with a function 2019-01-22 18:25:55 +01:00
Eren Golge 66f8d0e260 Attention bias changed 2019-01-22 18:18:21 +01:00
Eren Golge 562d73d3d1 Some bug fixes 2019-01-17 15:48:22 +01:00
Eren Golge 4431e04b48 use sigmoid for attention 2019-01-16 16:26:05 +01:00
Eren Golge 7e020d4084 Bug fixes 2019-01-16 16:23:04 +01:00
Eren Golge af22bed149 set bias 2019-01-16 15:53:24 +01:00
Eren Golge f4fa155cd3 Make attn windowing optional 2019-01-16 12:33:07 +01:00
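Attention windowing constrains each decoder step to attend near the previous step's peak, which helps inference stability. A minimal sketch of the masking over pre-softmax energies; parameter names are illustrative:

```python
import torch

def apply_windowing(energies, prev_peak, back=1, ahead=3):
    """Mask attention energies outside a window around the last peak.

    energies:  (batch, T_enc) pre-softmax attention scores.
    prev_peak: (batch,) argmax of the previous step's alignment.
    """
    positions = torch.arange(energies.shape[1],
                             device=energies.device).unsqueeze(0)
    outside = (positions < (prev_peak - back).unsqueeze(1)) | \
              (positions > (prev_peak + ahead).unsqueeze(1))
    return energies.masked_fill(outside, float('-inf'))
```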
Eren Golge 8969d59902 Use the last attention value as a threshold to stop decoding, since stop token prediction is not precise enough to stop synthesis at the right time. 2019-01-16 12:32:40 +01:00
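The idea in this commit: stop decoding once attention mass reaches the end of the encoder sequence rather than trusting the stopnet alone. A hypothetical combined condition (the later commit above also looks at the last two attention values):

```python
def should_stop(alignment, stop_prob, step, max_steps,
                attn_threshold=0.8, stop_threshold=0.5):
    """Hypothetical combined stop condition (not the repo's exact code).

    alignment: (batch, T_enc) attention weights of the current step.
    stop_prob: (batch,) sigmoid output of the stopnet.
    """
    # Attention mass concentrated on the last two encoder steps.
    attn_at_end = alignment[:, -2:].sum(dim=1) > attn_threshold
    stop_fired = stop_prob > stop_threshold
    return bool((attn_at_end & stop_fired).all()) or step >= max_steps
```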
Eren Golge ed1f648b83 Enable attention windowing and make it configurable at the model level. 2019-01-16 12:32:40 +01:00
Eren Golge 916f5df5f9 config update and increase dropout p to 0.5 2019-01-16 12:14:58 +01:00
Eren Golge 84814db73f reduce dropout ratio 2019-01-02 12:52:17 +01:00
Eren Golge 4826e7db9c remove intermediate tensor asap 2018-12-28 14:22:41 +01:00
Eren Golge 3a72d75ecd Merge branch 'master' of github.com:mozilla/TTS 2018-12-12 17:08:39 +01:00
Eren Golge 8d865629a0 partial model initialization 2018-12-12 15:43:58 +01:00
Eren Golge 703be04993 bug fix 2018-12-12 12:02:10 +01:00
Eren Golge dc3d09304e Cache attention annotation vectors for the whole sequence. 2018-12-11 16:06:02 +01:00