python -m unittest discover ./tests

(pure_hanlp) C:\Users\chentao\clone_dir\HanLP>python -m unittest discover ./tests
Failed to load https://file.hankcs.com/hanlp/mtl/close_tok_pos_ner_srl_dep_sdp_con_electra_small_20210111_124159.zip. See traceback below:
================================ERROR LOG BEGINS================================
Traceback (most recent call last):
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\utils\component_util.py", line 81, in load_from_meta_file
    obj.load(save_dir, verbose=verbose, **kwargs)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\common\torch_component.py", line 173, in load
    self.load_config(save_dir, **kwargs)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\common\torch_component.py", line 125, in load_config
    self.config[k] = Configurable.from_config(v)
  File "c:\users\chentao\clone_dir\hanlp\plugins\hanlp_common\hanlp_common\configurable.py", line 30, in from_config
    return cls(**deserialized_config)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\layers\embeddings\contextual_word_embedding.py", line 141, in __init__
    self.transformer_tokenizer = AutoTokenizer.from_pretrained(self.transformer,
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\layers\transformers\pt_imports.py", line 65, in from_pretrained
    tokenizer = cls.from_pretrained(get_mirror(transformer), use_fast=use_fast, do_basic_tokenize=do_basic_tokenize,
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 523, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 416, in get_tokenizer_config
    resolved_config_file = cached_path(
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\file_utils.py", line 1347, in cached_path
    raise ValueError(f"unable to parse {url_or_filename} as a URL or as a local path")
ValueError: unable to parse C:\Users\chentao\AppData\Roaming\hanlp\hanlp\transformers\electra_zh_small_20210520_124451\tokenizer_config.json as a URL or as a local path
=================================ERROR LOG ENDS=================================
If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG above and the system info below.
OS: Windows-10-10.0.19042-SP0
Python: 3.8.10
PyTorch: 1.9.0+cpu
HanLP: 2.1.0-alpha.52
.E

======================================================================
ERROR: test_mtl (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: test_mtl
Traceback (most recent call last):
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\utils\component_util.py", line 81, in load_from_meta_file
    obj.load(save_dir, verbose=verbose, **kwargs)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\common\torch_component.py", line 173, in load
    self.load_config(save_dir, **kwargs)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\common\torch_component.py", line 125, in load_config
    self.config[k] = Configurable.from_config(v)
  File "c:\users\chentao\clone_dir\hanlp\plugins\hanlp_common\hanlp_common\configurable.py", line 30, in from_config
    return cls(**deserialized_config)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\layers\embeddings\contextual_word_embedding.py", line 141, in __init__
    self.transformer_tokenizer = AutoTokenizer.from_pretrained(self.transformer,
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\layers\transformers\pt_imports.py", line 65, in from_pretrained
    tokenizer = cls.from_pretrained(get_mirror(transformer), use_fast=use_fast, do_basic_tokenize=do_basic_tokenize,
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 523, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 416, in get_tokenizer_config
    resolved_config_file = cached_path(
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\site-packages\transformers\file_utils.py", line 1347, in cached_path
    raise ValueError(f"unable to parse {url_or_filename} as a URL or as a local path")
ValueError: unable to parse C:\Users\chentao\AppData\Roaming\hanlp\hanlp\transformers\electra_zh_small_20210520_124451\tokenizer_config.json as a URL or as a local path

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\unittest\loader.py", line 436, in _find_test_path
    module = self._get_module_from_name(name)
  File "C:\Users\chentao\.conda\envs\pure_hanlp\lib\unittest\loader.py", line 377, in _get_module_from_name
    __import__(name)
  File "C:\Users\chentao\clone_dir\HanLP\tests\test_mtl.py", line 6, in <module>
    mtl = hanlp.load(hanlp.pretrained.mtl.CLOSE_TOK_POS_NER_SRL_DEP_SDP_CON_ELECTRA_SMALL_ZH, devices=-1)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\__init__.py", line 43, in load
    return load_from_meta_file(save_dir, 'meta.json', verbose=verbose, **kwargs)
  File "C:\Users\chentao\clone_dir\HanLP\hanlp\utils\component_util.py", line 121, in load_from_meta_file
    exit(1)
SystemExit: 1

----------------------------------------------------------------------
Ran 2 tests in 0.000s

FAILED (errors=1)
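
For reference, the failing load in tests/test_mtl.py (line 6, per the traceback above) boils down to the snippet below; running just these lines outside unittest should presumably hit the same ValueError, since the error is raised while loading the pretrained model, before any test code runs:

import hanlp

# Same pretrained constant and CPU-only setting (devices=-1) as tests/test_mtl.py line 6.
mtl = hanlp.load(hanlp.pretrained.mtl.CLOSE_TOK_POS_NER_SRL_DEP_SDP_CON_ELECTRA_SMALL_ZH, devices=-1)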

3 posts were merged into an existing topic: Failed to load https://file.hankcs.com/hanlp/mtl/ud_ontonotes_tok_pos_lem_fea_ner_srl_dep_sdp_con_xlm_base_20210602_211620.zip