Bug #1277
Open: stg cannot run training
Description
container "voice-clone-api-tts-service-dev" in pod "voice-clone-api-tts-service-dev-6bf8fb4f8d-8xxqc" is not available 2025-01-12 14:02:51,838 INFO MainProcess generated new fontManager 2025-01-12 14:03:03,104 INFO MainProcess Start 2025-01-12 14:03:03,105 INFO MainProcess Started the training --- 2025-01-12 14:03:24,625 INFO MainProcess Ended the training --- 2025-01-12 14:03:24,628 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:03:24,629 INFO MainProcess Polling for infer queue 2025-01-12 14:03:46,228 INFO MainProcess Ended the infer --- 2025-01-12 14:03:46,231 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:03:46,232 INFO MainProcess Polling for infer queue 2025-01-12 14:04:07,863 INFO MainProcess Ended the infer --- 2025-01-12 14:04:07,866 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:04:07,867 INFO MainProcess Polling for infer queue 2025-01-12 14:04:29,519 INFO MainProcess Ended the infer --- 2025-01-12 14:04:29,522 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:04:29,522 INFO MainProcess Polling for infer queue 2025-01-12 14:04:51,122 INFO MainProcess Ended the infer --- 2025-01-12 14:04:51,125 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:04:51,126 INFO MainProcess Polling for infer queue 2025-01-12 14:05:12,772 INFO MainProcess Ended the infer --- 2025-01-12 14:05:12,775 INFO MainProcess Started the training --- 2025-01-12 14:05:34,748 INFO MainProcess Ended the training --- 2025-01-12 14:05:34,752 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:05:34,753 INFO MainProcess Polling for infer queue 2025-01-12 14:05:56,356 INFO MainProcess Ended the infer --- 2025-01-12 14:05:56,360 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:05:56,361 INFO MainProcess Polling for infer queue 2025-01-12 14:06:17,961 INFO MainProcess Ended the infer --- 2025-01-12 14:06:17,965 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:06:17,966 INFO MainProcess Polling for infer queue 2025-01-12 14:06:39,575 INFO MainProcess Ended the infer --- 2025-01-12 14:06:39,578 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:06:39,579 INFO MainProcess Polling for infer queue 2025-01-12 14:07:01,187 INFO MainProcess Ended the infer --- 2025-01-12 14:07:01,190 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:07:01,191 INFO MainProcess Polling for infer queue 2025-01-12 14:07:22,791 INFO MainProcess Ended the infer --- 2025-01-12 14:07:22,794 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:07:22,795 INFO MainProcess Polling for infer queue 2025-01-12 14:07:44,400 INFO MainProcess Ended the infer --- 2025-01-12 14:07:44,403 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:07:44,404 INFO MainProcess Polling for infer queue 2025-01-12 14:08:06,019 INFO MainProcess Ended the infer --- 2025-01-12 14:08:06,022 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:08:06,023 INFO MainProcess Polling for infer queue 2025-01-12 14:08:27,626 INFO MainProcess Ended the infer --- 2025-01-12 14:08:27,629 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:08:27,630 INFO MainProcess Polling for infer queue 2025-01-12 14:08:49,229 INFO MainProcess Ended the infer --- 2025-01-12 14:08:49,233 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:08:49,234 INFO MainProcess Polling for infer queue 2025-01-12 14:09:10,824 INFO MainProcess Ended 
the infer --- 2025-01-12 14:09:10,828 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:09:10,829 INFO MainProcess Polling for infer queue 2025-01-12 14:09:32,466 INFO MainProcess Ended the infer --- 2025-01-12 14:09:32,470 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:09:32,471 INFO MainProcess Polling for infer queue 2025-01-12 14:09:54,735 INFO MainProcess Ended the infer --- 2025-01-12 14:09:54,739 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:09:54,739 INFO MainProcess Polling for infer queue 2025-01-12 14:10:16,342 INFO MainProcess Ended the infer --- 2025-01-12 14:10:16,346 INFO MainProcess Started the training --- 2025-01-12 14:10:38,005 INFO MainProcess Ended the training --- 2025-01-12 14:10:38,008 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:10:38,009 INFO MainProcess Polling for infer queue 2025-01-12 14:10:59,626 INFO MainProcess Ended the infer --- 2025-01-12 14:10:59,630 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:10:59,631 INFO MainProcess Polling for infer queue 2025-01-12 14:11:21,206 INFO MainProcess Ended the infer --- 2025-01-12 14:11:21,209 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:11:21,209 INFO MainProcess Polling for infer queue 2025-01-12 14:11:42,813 INFO MainProcess Ended the infer --- 2025-01-12 14:11:42,818 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:11:42,819 INFO MainProcess Polling for infer queue 2025-01-12 14:12:04,414 INFO MainProcess Ended the infer --- 2025-01-12 14:12:04,418 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:12:04,419 INFO MainProcess Polling for infer queue 2025-01-12 14:12:26,024 INFO MainProcess Ended the infer --- 2025-01-12 14:12:26,028 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:12:26,028 INFO MainProcess Polling for infer queue 2025-01-12 14:12:47,614 INFO MainProcess Ended the infer --- 2025-01-12 14:12:47,619 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:12:47,620 INFO MainProcess Polling for infer queue 2025-01-12 14:13:09,271 INFO MainProcess Ended the infer --- 2025-01-12 14:13:09,274 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:13:09,275 INFO MainProcess Polling for infer queue 2025-01-12 14:13:30,873 INFO MainProcess Ended the infer --- 2025-01-12 14:13:30,877 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:13:30,878 INFO MainProcess Polling for infer queue 2025-01-12 14:13:53,075 INFO MainProcess Ended the infer --- 2025-01-12 14:13:53,078 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:13:53,078 INFO MainProcess Polling for infer queue 2025-01-12 14:14:14,692 INFO MainProcess Ended the infer --- 2025-01-12 14:14:14,696 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:14:14,696 INFO MainProcess Polling for infer queue 2025-01-12 14:14:36,324 INFO MainProcess Ended the infer --- 2025-01-12 14:14:36,329 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:14:36,329 INFO MainProcess Polling for infer queue 2025-01-12 14:14:57,962 INFO MainProcess Ended the infer --- 2025-01-12 14:14:57,965 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:14:57,965 INFO MainProcess Polling for infer queue 2025-01-12 14:15:19,556 INFO MainProcess Ended the infer --- 2025-01-12 14:15:19,560 INFO MainProcess Started the training --- 2025-01-12 14:15:41,096 INFO MainProcess Ended the training --- 
2025-01-12 14:15:41,099 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:15:41,099 INFO MainProcess Polling for infer queue 2025-01-12 14:16:02,707 INFO MainProcess Ended the infer --- 2025-01-12 14:16:02,710 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:16:02,711 INFO MainProcess Polling for infer queue 2025-01-12 14:16:24,325 INFO MainProcess Ended the infer --- 2025-01-12 14:16:24,328 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:16:24,329 INFO MainProcess Polling for infer queue 2025-01-12 14:16:45,948 INFO MainProcess Ended the infer --- 2025-01-12 14:16:45,952 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:16:45,953 INFO MainProcess Polling for infer queue 2025-01-12 14:17:07,640 INFO MainProcess Ended the infer --- 2025-01-12 14:17:07,644 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:17:07,645 INFO MainProcess Polling for infer queue 2025-01-12 14:17:29,848 INFO MainProcess Ended the infer --- 2025-01-12 14:17:29,852 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:17:29,852 INFO MainProcess Polling for infer queue 2025-01-12 14:17:51,435 INFO MainProcess Ended the infer --- 2025-01-12 14:17:51,439 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:17:51,439 INFO MainProcess Polling for infer queue 2025-01-12 14:18:13,067 INFO MainProcess Ended the infer --- 2025-01-12 14:18:13,072 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:18:13,072 INFO MainProcess Polling for infer queue 2025-01-12 14:18:34,708 INFO MainProcess Ended the infer --- 2025-01-12 14:18:34,712 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:18:34,713 INFO MainProcess Polling for infer queue 2025-01-12 14:18:56,322 INFO MainProcess Ended the infer --- 2025-01-12 14:18:56,325 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:18:56,326 INFO MainProcess Polling for infer queue 2025-01-12 14:19:17,946 INFO MainProcess Ended the infer --- 2025-01-12 14:19:17,949 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:19:17,950 INFO MainProcess Polling for infer queue 2025-01-12 14:19:39,566 INFO MainProcess Ended the infer --- 2025-01-12 14:19:39,569 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:19:39,570 INFO MainProcess Polling for infer queue 2025-01-12 14:20:01,189 INFO MainProcess Ended the infer --- 2025-01-12 14:20:01,193 INFO MainProcess Started the training --- 2025-01-12 14:20:28,943 INFO Train-20250112T142021Z Start training 1 items 2025-01-12 14:20:28,944 INFO Train-20250112T142021Z Validation training request with message id 085ae046-2181-4820-b712-d3d80befc90b 2025-01-12 14:20:29,840 INFO Train-20250112T142021Z Has 1/1 validated items 2025-01-12 14:20:29,840 INFO Train-20250112T142021Z Downloading for trace_id:a1124856-f19c-4461-b7df-61f64c2bc038 s3: models_instant/82/20250112_142005_900658/Voice Việt Cường.mp3 2025-01-12 14:20:29,841 INFO Train-20250112T142021Z Will be using path : cdk-ttsopenai-gpt-stg-upload-bucket 2025-01-12 14:20:31,899 INFO MainProcess Started the infer with dlq: False --- 2025-01-12 14:20:31,900 INFO MainProcess Polling for infer queue Splitting: 0%| | 0/1 [00:00<?, ?it/s] Writing /app/tmp/training-jobs/20250112142028/outputs/dataset_raw_raw/607/607.mp3: 0%| | 0/51 [00:00<?, ?it/s] Writing /app/tmp/training-jobs/20250112142028/outputs/dataset_raw_raw/607/607.mp3: 12%|█▏ | 6/51 [00:00<00:00, 59.93it/s] Writing 
Writing /app/tmp/training-jobs/20250112142028/outputs/dataset_raw_raw/607/607.mp3: 100%|██████████| 51/51 [00:01<00:00, 28.86it/s]
Splitting: 100%|██████████| 1/1 [00:07<00:00, 7.74s/it]
[14:20:43] INFO Skip /app/tmp/training-jobs/20250112142028/outputs/dataset_raw/607/607_274.100_274.200.wav because it is too short.    preprocess_resample.py:71
Preprocessing: 100%|██████████| 86/86 [00:02<00:00, 30.98it/s]
2025-01-12 14:20:44,549 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/preprocessing/preprocess_flist_config.py:41: FutureWarning: get_duration() keyword argument 'filename' has been renamed to 'path' in version 0.10.0. This alias will be removed in version 1.0.
  if get_duration(filename=path) < 0.3:
100%|██████████| 85/85 [00:00<00:00, 97.49it/s]
2025-01-12 14:20:45,311 INFO Train-20250112T142021Z Writing /app/tmp/training-jobs/20250112142028/outputs/filelists/44100/train.txt
2025-01-12 14:20:45,311 INFO Train-20250112T142021Z Writing /app/tmp/training-jobs/20250112142028/outputs/filelists/44100/val.txt
2025-01-12 14:20:45,311 INFO Train-20250112T142021Z Writing /app/tmp/training-jobs/20250112142028/outputs/filelists/44100/test.txt
2025-01-12 14:20:45,312 INFO Train-20250112T142021Z Writing /app/tmp/training-jobs/20250112142028/outputs/configs/44100/config.json
2025-01-12 14:20:45,340 INFO Train-20250112T142021Z n_jobs automatically set to 2, memory: 16376 MiB
Some weights of HubertModel were not initialized from the model checkpoint at lengyue233/content-vec-best and are newly initialized: ['encoder.pos_conv_embed.conv.parametrizations.weight.original0', 'encoder.pos_conv_embed.conv.parametrizations.weight.original1'] You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference. (logged twice, once per worker)
[14:20:52] INFO F0 inference time: 0.068s, RTF: 0.028    f0.py:214
           INFO HuBERT inference time: 0.118s, RTF: 0.049    utils.py:234
           WARNING /opt/conda/lib/python3.10/site-packages/torch/functional.py:660: UserWarning: stft with return_complex=False is deprecated. In a future pytorch release, stft will return complex tensors for all inputs, and return_complex=False will raise an error. Note: you can still call torch.view_as_real on the complex output to recover the old return format. (Triggered internally at /opt/conda/conda-bld/pytorch_1711403380909/work/aten/src/ATen/native/SpectralOps.cpp:874.)    warnings.py:109
  return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined]
2025-01-12 14:20:53,510 INFO MainProcess Ended the infer ---
2025-01-12 14:20:53,513 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:20:53,517 INFO MainProcess Polling for infer queue
[... 14:20:52 to 14:21:05: two preprocessing workers log per-file "F0 inference time" and "HuBERT inference time" entries (RTF <= 0.07 and <= 0.17 respectively) while their progress bars run to 42/42 [00:13<00:00, 3.21it/s] and 43/43 [00:15<00:00, 2.78it/s]; the torch stft deprecation warning above is repeated once per worker ...]
2025-01-12 14:21:09,006 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/joblib/externals/loky/process_executor.py:752: UserWarning: A worker stopped while some jobs were given to the executor. This can be caused by a too short worker timeout or by a memory leak.
  warnings.warn(
Downloading D_0.pth: 100%|██████████| 178M/178M [00:01<00:00, 98.5MiB/s]
Downloading G_0.pth: 100%|██████████| 200M/200M [00:02<00:00, 98.4MiB/s]
2025-01-12 14:21:14,243 INFO Train-20250112T142021Z Using strategy: auto
2025-01-12 14:21:14,267 INFO Train-20250112T142021Z GPU available: True (cuda), used: True
2025-01-12 14:21:14,267 INFO Train-20250112T142021Z TPU available: False, using: 0 TPU cores
2025-01-12 14:21:14,268 INFO Train-20250112T142021Z HPU available: False, using: 0 HPUs
2025-01-12 14:21:14,280 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/modules/synthesizers.py:81: UserWarning: Unused arguments: {'n_layers_q': 3, 'use_spectral_norm': False, 'pretrained': {'D_0.pth': 'https://huggingface.co/datasets/ms903/sovits4.0-768vec-layer12/resolve/main/sovits_768l12_pre_large_320k/clean_D_320000.pth', 'G_0.pth': 'https://huggingface.co/datasets/ms903/sovits4.0-768vec-layer12/resolve/main/sovits_768l12_pre_large_320k/clean_G_320000.pth'}}
  warnings.warn(f"Unused arguments: {kwargs}")
2025-01-12 14:21:14,363 INFO Train-20250112T142021Z Decoder type: hifi-gan
2025-01-12 14:21:14,371 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/torch/nn/utils/weight_norm.py:28: UserWarning: torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.
  warnings.warn("torch.nn.utils.weight_norm is deprecated in favor of torch.nn.utils.parametrizations.weight_norm.")
2025-01-12 14:21:15,178 INFO MainProcess Ended the infer ---
2025-01-12 14:21:15,185 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:21:15,186 INFO MainProcess Polling for infer queue
2025-01-12 14:21:15,935 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/utils.py:246: UserWarning: Keys not found in checkpoint state dict:['emb_g.weight']
  warnings.warn(f"Keys not found in checkpoint state dict:" f"{not_in_from}")
2025-01-12 14:21:15,937 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/utils.py:264: UserWarning: Shape mismatch: ['dec.cond.weight: torch.Size([512, 256, 1]) -> torch.Size([512, 768, 1])', 'enc_q.enc.cond_layer.weight_v: torch.Size([6144, 256, 1]) -> torch.Size([6144, 768, 1])', 'flow.flows.0.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.2.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.4.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'flow.flows.6.enc.cond_layer.weight_v: torch.Size([1536, 256, 1]) -> torch.Size([1536, 768, 1])', 'f0_decoder.cond.weight: torch.Size([192, 256, 1]) -> torch.Size([192, 768, 1])']
  warnings.warn(
2025-01-12 14:21:15,996 INFO Train-20250112T142021Z Loaded checkpoint '/app/tmp/training-jobs/20250112142028/outputs/models/44100/G_0.pth' (epoch 0)
2025-01-12 14:21:16,162 INFO Train-20250112T142021Z Loaded checkpoint '/app/tmp/training-jobs/20250112142028/outputs/models/44100/D_0.pth' (epoch 0)
2025-01-12 14:21:16,179 INFO Train-20250112T142021Z LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
┏━━━┳━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━┳━━━━━━━┓
┃   ┃ Name  ┃ Type                     ┃ Params ┃ Mode  ┃
┡━━━╇━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━┩
│ 0 │ net_g │ SynthesizerTrn           │ 45.6 M │ train │
│ 1 │ net_d │ MultiPeriodDiscriminator │ 46.7 M │ train │
└───┴───────┴──────────────────────────┴────────┴───────┘
Trainable params: 92.3 M
Non-trainable params: 0
Total params: 92.3 M
Total estimated model params size (MB): 369
Modules in train mode: 486
Modules in eval mode: 0
2025-01-12 14:21:16,634 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/fit_loop.py:310: The number of training batches (6) is smaller than the logging interval Trainer(log_every_n_steps=50). Set a lower value for log_every_n_steps if you want to see logs for the training epoch.
2025-01-12 14:21:16,635 INFO Train-20250112T142021Z Setting current epoch to 0
2025-01-12 14:21:16,635 INFO Train-20250112T142021Z Setting total batch idx to 0
2025-01-12 14:21:16,635 INFO Train-20250112T142021Z Setting global step to 0
2025-01-12 14:21:26,241 WARNING Train-20250112T142021Z /opt/conda/lib/python3.10/site-packages/torch/functional.py:660: UserWarning: stft with return_complex=False is deprecated. In a future pytorch release, stft will return complex tensors for all inputs, and return_complex=False will raise an error. Note: you can still call torch.view_as_real on the complex output to recover the old return format. (Triggered internally at /opt/conda/conda-bld/pytorch_1711403380909/work/aten/src/ATen/native/SpectralOps.cpp:874.)
  return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined]
2025-01-12 14:21:36,766 INFO MainProcess Ended the infer ---
2025-01-12 14:21:36,770 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:21:36,777 INFO MainProcess Polling for infer queue
2025-01-12 14:21:59,076 INFO MainProcess Ended the infer ---
2025-01-12 14:21:59,096 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:21:59,099 INFO MainProcess Polling for infer queue
2025-01-12 14:22:20,761 INFO MainProcess Ended the infer ---
2025-01-12 14:22:20,765 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:22:20,767 INFO MainProcess Polling for infer queue
2025-01-12 14:22:42,330 INFO MainProcess Ended the infer ---
2025-01-12 14:22:42,340 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:22:42,343 INFO MainProcess Polling for infer queue
Epoch 16/99 ━━━━━━ 2/6 0:00:02 • 0:00:04 1.09it/s v_num: 0.000 loss/g/total: 36.571 loss/g/fm: 9.479 loss/g/mel: 22.632 loss/g/kl: 1.684 loss/g/lf0: 0.001 loss/d/total: 2.311
2025-01-12 14:23:03,938 INFO MainProcess Ended the infer ---
2025-01-12 14:23:03,945 INFO MainProcess Started the infer with dlq: False ---
2025-01-12 14:23:03,947 INFO MainProcess Polling for infer queue
2025-01-12 14:23:05,488 ERROR Train-20250112T142021Z 'FigureCanvasAgg' object has no attribute 'tostring_rgb'
Traceback (most recent call last):
  File "/app/main.py", line 49, in __internal_process_train
    training_service.process()
  File "/app/app/services/sovit_training_service.py", line 60, in process
    trainer.train(speaker_list=speaker_list, n_epochs=100)
  File "/app/app/services/voice_processor/models/sovits_trainer.py", line 85, in train
    train(
  File "/opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/train.py", line 149, in train
    trainer.fit(model, datamodule=datamodule)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 539, in fit
    call._call_and_handle_interrupt(
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 47, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 575, in _fit_impl
    self._run(model, ckpt_path=ckpt_path)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 982, in _run
    results = self._run_stage()
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/trainer.py", line 1026, in _run_stage
    self.fit_loop.run()
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/fit_loop.py", line 216, in run
    self.advance()
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/fit_loop.py", line 455, in advance
    self.epoch_loop.run(self._data_fetcher)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/training_epoch_loop.py", line 150, in run
    self.advance(data_fetcher)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/training_epoch_loop.py", line 322, in advance
    batch_output = self.manual_optimization.run(kwargs)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/optimization/manual.py", line 94, in run
    self.advance(kwargs)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/loops/optimization/manual.py", line 114, in advance
    training_step_output = call._call_strategy_hook(trainer, "training_step", *kwargs.values())
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/trainer/call.py", line 323, in _call_strategy_hook
    output = fn(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/lightning/pytorch/strategies/strategy.py", line 391, in training_step
    return self.lightning_module.training_step(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/train.py", line 483, in training_step
    "slice/mel_org": utils.plot_spectrogram_to_numpy(
  File "/opt/conda/lib/python3.10/site-packages/so_vits_svc_fork/utils.py", line 403, in plot_spectrogram_to_numpy
    data = np.fromstring(fig.canvas.tostring_rgb(), dtype=np.uint8, sep="")
AttributeError: 'FigureCanvasAgg' object has no attribute 'tostring_rgb'. Did you mean: 'tostring_argb'?
2025-01-12 14:23:05,493 ERROR Train-20250112T142021Z Shutdown by error ---
2025-01-12 14:23:05,494 INFO Train-20250112T142021Z Received signal 2 and send to shutdown signal
2025-01-12 14:23:05,500 INFO Train-20250112T142021Z Ended the training ---