Commit Graph

193 Commits

Author SHA1 Message Date
Eren Gölge cd69da4868 linter fixes #2 2021-04-08 16:57:46 +02:00
Eren Gölge 4d3e1e9d9a linter fix 2021-04-08 14:57:46 +02:00
Eren Gölge 53f54898bc small fixes 2021-04-08 14:22:47 +02:00
Eren Gölge 006b1d3aaa bug fix 2021-04-08 13:17:45 +02:00
Eren Gölge 3f0993aebe remove junk 2021-04-08 12:17:02 +02:00
Eren Gölge 773f1db6fa refactor HifiGAN discriminator 2021-04-08 11:28:30 +02:00
Eren Gölge 15f362d5b1 formatting 2021-04-08 11:28:30 +02:00
Eren Gölge aee24b0704 set a different seed in gan_dataset when using multiple workers 2021-04-08 11:28:30 +02:00
Eren Gölge 7cecd2fb2e add hifigan D 2021-04-08 11:27:40 +02:00
Eren Gölge 13dca6e6b6 revert some of Hifigan generator updates 2021-04-08 11:27:40 +02:00
Eren Gölge 02bc776c35 prevent grad in TorchSTFT 2021-04-08 11:27:40 +02:00
Eren Gölge cf44624df8 more docstring 2021-04-08 11:27:40 +02:00
Eren Gölge d95b1458e8 Linter fixes and docstrings for HiFiGAN 2021-04-08 11:27:40 +02:00
Eren Gölge bd7a1c177b fix #419 2021-04-08 11:26:41 +02:00
Eren Gölge 57f6bd1afa make using different samples for G and D networks optional 2021-04-08 11:26:01 +02:00
Eren Gölge 241e968df1 load_checkpoint for hifigan and no_grad for inference 2021-04-08 11:25:29 +02:00
Eren Gölge de3a04f104 some commenting for Generator loss and check if the argument is defined in the config file 2021-04-08 11:25:29 +02:00
Eren Gölge ff07c5f5e3 update TorchSTFT to enable melspec 2021-04-08 11:25:29 +02:00
Eren Gölge 4a5b1d4ac2 update hifigan config 2021-04-08 11:24:21 +02:00
Eren Gölge d57f416957 small fixes 2021-04-08 11:22:30 +02:00
Eren Gölge 8c9e1c9e58 hifigan implementation update 2021-04-08 11:21:43 +02:00
Eren Gölge a14d7bc5db hifigan config update 2021-04-08 11:20:33 +02:00
Eren Gölge 8d4fd79cd7 update hifigan config 2021-04-08 11:20:33 +02:00
rishikksh20 b533474e3b Remove minor bugs and make code trainable 2021-04-08 11:20:33 +02:00
rishikksh20 1535777f64 1) Add ExponentialLR 2021-04-08 11:18:36 +02:00
rishikksh20 c20a6b1185 Format the model definition; update code and integrate training code 2021-04-08 11:18:36 +02:00
rishikksh20 39b5845810 1) Add hifigan json files 2) Rename MPD disc 3) Re-format remove weight norm generator 2021-04-08 11:14:39 +02:00
rishikksh20 7b7c5d635f 1) Combine MSD with Multi-Period disc 2) Add remove weight norm layer on Generator 2021-04-08 11:14:39 +02:00
rishikksh20 4493feb95c Add HiFi-GAN v1 generator and discriminator classes 2021-04-08 11:14:39 +02:00
Eren Gölge c86c559349 docstring and optional padding in TorchSTFT 2021-04-07 12:36:15 +02:00
Eren Gölge f890454de3 linter fixes 2021-04-07 12:36:03 +02:00
Eren Gölge b86e7fb2e8 pad short samples when loading precomputed features in vocoder training 2021-04-06 16:24:50 +02:00
Eren Gölge 4337e9ff87 pad_mode in torch_stft 2021-03-10 14:41:00 +01:00
Eren Gölge 594d8d8f09 linter fixes 2021-03-08 11:22:59 +01:00
Eren Gölge 00b5090974 linter fix 2021-03-08 11:05:30 +01:00
Eren Gölge e15734c3fc linter fix 2021-03-08 05:29:43 +01:00
Eren Gölge 9a48ba3821 a ton of linter updates 2021-03-08 05:06:54 +01:00
gerazov 2451a813a2 refactored keep_all_best 2021-03-08 02:57:11 +01:00
gerazov f2e474cd37 loading last checkpoint/best_model works; added option for deleting last best models; added loading last best_loss 2021-03-08 02:56:36 +01:00
Branislav Gerazov ed56944c4a improve robustness of defining wavernn in config file 2021-03-08 02:54:21 +01:00
Branislav Gerazov 5e2bc8c99f update wavernn test config, delete cap=True 2021-03-08 02:54:21 +01:00
Branislav Gerazov b1e3160884 waveRNN fix 2021-03-08 02:54:21 +01:00
Eren Gölge 5ee73c2bae Merge branch 'dev' of https://github.com/mozilla/TTS into dev 2021-01-22 13:26:27 +01:00
Eren Gölge c990b3a59c linter fixes and test fixes 2021-01-22 02:32:35 +01:00
Eren Gölge 9addfabc43 wavernn load_checkpoint function 2021-01-21 15:31:13 +01:00
root 1bc8fbbd3c set eval mode when loading models 2021-01-20 02:14:18 +00:00
root 5bd7238153 interpolate spectrogram in vocoder generic utils for matching sample rates 2021-01-20 02:13:01 +00:00
root ca3743539a load_checkpoint func for vocoder models 2021-01-20 02:12:29 +00:00
gerazov b2b4828f17 set requires_grad=False 2021-01-16 19:46:04 +01:00
gerazov c96f7a2614 TorchSTFT to device fix 2021-01-16 12:21:16 +01:00
Alexander Korolev f42ca2b73f Update wavegrad.py; this should fix https://github.com/mozilla/TTS/issues/581 2020-12-04 16:43:39 +01:00
erogol e3eda159d1 wavegrad_dataset update 2020-11-25 14:50:50 +01:00
erogol 4b92ac0f92 tune_wavegrad update 2020-11-25 14:49:48 +01:00
erogol c65712426a change noise scheduling for wavegrad. Compute beta values externally to enable better flexibility 2020-11-14 13:01:10 +01:00
erogol 6cc464ead6 fix a ton of testing bugs 2020-11-12 16:33:29 +01:00
erogol 25551c4634 change wavernn generate to inference 2020-11-12 12:52:52 +01:00
erogol c76a617072 linter updates 2020-11-09 13:18:35 +01:00
erogol c80225544e tune wavegrad to find the best noise schedule for inference 2020-11-06 13:04:46 +01:00
erogol a44ef58aea wavegrad weight norm refactoring 2020-10-30 13:23:24 +01:00
erogol 183fe56d95 Merge branch 'ssim_loss' into dev 2020-10-29 23:49:09 +01:00
erogol 39c71ee8a9 wavegrad refactoring, fixing tests for glow-tts and wavegrad 2020-10-29 15:47:15 +01:00
erogol 946a0c0fb9 bug fixes for single speaker glow-tts, enable torch based amp. Make amp optional for wavegrad. Bug fixes for synthesis setup for glow-tts 2020-10-29 15:45:50 +01:00
erogol 14c2381207 weight norm and torch based amp training for wavegrad 2020-10-29 12:31:43 +01:00
erogol b76a0be97a wavegrad model and layers refactoring 2020-10-29 12:31:43 +01:00
erogol dc2825dfb2 wavegrad dataset update 2020-10-29 12:31:43 +01:00
erogol 5b5b9fcfdd wavegrad config updates 2020-10-29 12:31:43 +01:00
erogol 7bcdb7ac35 wavegrad updates 2020-10-29 12:31:43 +01:00
erogol a1582a0e12 fix distributed training for train_* scripts 2020-10-29 12:31:43 +01:00
erogol 193b81b273 add universal_fullband_melgan config 2020-10-29 12:30:37 +01:00
erogol e02cd6a220 initial wavegrad layers, model and training script 2020-10-29 12:30:37 +01:00
erogol ac57eea928 add wavegrad to vocoder generators 2020-10-29 12:30:37 +01:00
erogol e723b99888 handle distributed model when saving 2020-10-29 12:30:37 +01:00
erogol 9d0ae2bfb4 wavernn dataloader handling for short samples and mixed precision training 2020-10-28 12:31:01 +01:00
erogol a6f564c8c8 pylint fixes 2020-10-27 12:35:10 +01:00
erogol 0becef4b58 small updates 2020-10-27 12:17:38 +01:00
sanjaesc 2ee47e9568 fix pylint once again 2020-10-27 12:17:38 +01:00
sanjaesc 1e646135ca add model params to config 2020-10-27 12:17:38 +01:00
sanjaesc bef3f2020b compute audio feat on dataload 2020-10-27 12:17:38 +01:00
sanjaesc 7c72562fe7 fix travis + pylint tests 2020-10-27 12:17:38 +01:00
sanjaesc 91e5f8b63d added to device cpu/gpu + formatting 2020-10-27 12:17:38 +01:00
sanjaesc 016a77fcf2 fix formatting + pylint 2020-10-27 12:17:38 +01:00
sanjaesc e8294cb9db fixing pylint errors 2020-10-27 12:17:38 +01:00
sanjaesc 878b7c373e added feature preprocessing if not set in config 2020-10-27 12:17:38 +01:00
sanjaesc e495e03ea1 some minor changes to wavernn 2020-10-27 12:17:38 +01:00
Alex K 9c3c7ce2f8 wavernn stuff... 2020-10-27 12:17:38 +01:00
Alex K 6378fa2b07 add initial wavernn support 2020-10-27 12:17:38 +01:00
erogol 154f90bc44 format speaker encoder imports 2020-09-28 11:19:19 +02:00
erogol 665f7ca714 linter fix 2020-09-24 12:57:54 +02:00
erogol c008003506 do not check sample rate when loading the stats file for normalization, to enable interpolation for a vocoder with a different sample rate 2020-09-18 12:52:19 +02:00
erogol 3660c57f1e time separable convolution encoder, Huber loss for duration predictor 2020-09-17 03:10:58 +02:00
erogol 7c2c4d6f27 pass x_mask to layer norm 2020-09-12 03:41:37 +02:00
erogol 540d811dd5 solve pickling models after module name change 2020-09-11 12:03:39 +02:00
erogol df19428ec6 rename the project to old TTS 2020-09-09 12:27:23 +02:00