
FATAL error showing sm80 and sm370 during inference #2766

@deadend521

Description


Hi,

I'm having trouble starting inference.
I already tried the steps from #2026, but the problem persists. I'm using the Windows standalone version.

When I add this part:

  1. Project code changes
    Files to modify: GPT_SoVITS/AR/modules/patched_mha_with_cache.py and GPT_SoVITS/inference_webui.py

3.1 Fix the Tuple type annotation issue

Add at the top of patched_mha_with_cache.py:

from typing import Tuple
3.2 Resolve the PyTorch model-loading compatibility issue
In inference_webui.py:

Add the necessary imports:

from torch.serialization import add_safe_globals
from utils import HParams
add_safe_globals([HParams])

Modify the model-loading section:

dict_s2 = torch.load(sovits_path, map_location="cuda", weights_only=False)
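Before applying these steps, it may be worth checking whether the torch bundled with the standalone runtime actually provides add_safe_globals at all. A small check like this (a sketch, not part of the project code) reports it without crashing even when torch is missing:

```python
import importlib.util

def has_add_safe_globals() -> bool:
    """Report whether the installed torch exposes add_safe_globals."""
    if importlib.util.find_spec("torch") is None:
        return False  # torch is not installed in this environment
    from torch import serialization
    return hasattr(serialization, "add_safe_globals")

print(has_add_safe_globals())
```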

I get this output when trying to launch the inference webui:

Traceback (most recent call last):
File "D:\GPT-SoVITS\GPT_SoVITS\inference_webui.py", line 29, in
from torch.serialization import add_safe_globals
ImportError: cannot import name 'add_safe_globals' from 'torch.serialization' (D:\GPT-SoVITS\runtime\lib\site-packages\torch\serialization.py)
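For what it's worth, add_safe_globals seems to exist only in newer PyTorch releases, and the torch in my runtime folder appears to predate it. A guarded import like the following (a sketch, assuming a no-op fallback is safe on older torch, which does not enforce a weights_only allow-list) at least avoids the crash at import time:

```python
try:
    from torch.serialization import add_safe_globals  # newer PyTorch only
except ImportError:
    # Older torch (or no torch at all) lacks add_safe_globals; on those
    # builds torch.load does not enforce an allow-list of safe globals,
    # so registering them is unnecessary and a no-op suffices.
    def add_safe_globals(safe_globals):
        pass

add_safe_globals([])  # works on both old and new torch
```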
