HanLP load MSRA_NER_ALBERT_BASE_ZH error

recognizer = hanlp.load(hanlp.pretrained.ner.MSRA_NER_ALBERT_BASE_ZH)
Traceback (most recent call last):
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 1148, in get_records
    return _fixed_getinnerframes(etb, number_of_lines_of_context, tb_offset)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 316, in wrapped
    return f(*args, **kwargs)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 350, in _fixed_getinnerframes
    records = fix_frame_records_filenames(inspect.getinnerframes(etb, context))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/inspect.py", line 1502, in getinnerframes
    frameinfo = (tb.tb_frame,) + getframeinfo(tb, context)
AttributeError: 'tuple' object has no attribute 'tb_frame'

The error above occurs when loading the NER model. Environment: Ubuntu 16.04, Python 3.7.6, hanlp 2.0.0a46. What could be the cause?

Your traceback shows no connection to hanlp. Are you sure it is complete?

Failed to load https://file.hankcs.com/hanlp/ner/ner_bert_base_msra_20200104_185735.zip. See stack trace below
Traceback (most recent call last):
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/utils/component_util.py", line 48, in load_from_meta_file
    obj.load(save_dir, **load_kwargs)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/common/component.py", line 244, in load
    self.build(**merge_dict(self.config, training=False, logger=logger, **kwargs, overwrite=True, inplace=True))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/common/component.py", line 255, in build
    loss=kwargs.get('loss', None)))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/components/taggers/transformers/transformer_tagger.py", line 36, in build_model
    model, tokenizer = build_transformer(transformer, max_seq_length, len(self.transform.tag_vocab), tagging=True)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/layers/transformers/loader.py", line 101, in build_transformer
    assert len(vocab) == 1, 'No vocab found or unambiguous vocabs found'
AssertionError: No vocab found or unambiguous vocabs found
ERROR:root:Internal Python error in the inspect module.
Below is the traceback from this internal error.

Traceback (most recent call last):
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/utils/component_util.py", line 48, in load_from_meta_file
    obj.load(save_dir, **load_kwargs)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/common/component.py", line 244, in load
    self.build(**merge_dict(self.config, training=False, logger=logger, **kwargs, overwrite=True, inplace=True))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/common/component.py", line 255, in build
    loss=kwargs.get('loss', None)))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/components/taggers/transformers/transformer_tagger.py", line 36, in build_model
    model, tokenizer = build_transformer(transformer, max_seq_length, len(self.transform.tag_vocab), tagging=True)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/layers/transformers/loader.py", line 101, in build_transformer
    assert len(vocab) == 1, 'No vocab found or unambiguous vocabs found'
AssertionError: No vocab found or unambiguous vocabs found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/interactiveshell.py", line 3331, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 4, in
    recognizer = hanlp.load(hanlp.pretrained.ner.MSRA_NER_BERT_BASE_ZH)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/__init__.py", line 51, in load
    return load_from_meta_file(save_dir, meta_filename, transform_only=transform_only, load_kwargs=load_kwargs, **kwargs)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/hanlp/utils/component_util.py", line 64, in load_from_meta_file
    exit(1)
SystemExit: 1

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 1148, in get_records
    return _fixed_getinnerframes(etb, number_of_lines_of_context, tb_offset)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 316, in wrapped
    return f(*args, **kwargs)
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/ultratb.py", line 350, in _fixed_getinnerframes
    records = fix_frame_records_filenames(inspect.getinnerframes(etb, context))
  File "/home/xxx/.conda/envs/xxx/lib/python3.7/inspect.py", line 1502, in getinnerframes
    frameinfo = (tb.tb_frame,) + getframeinfo(tb, context)
AttributeError: 'tuple' object has no attribute 'tb_frame'
An exception has occurred, use %tb to see the full traceback.


During handling of the above exception, another exception occurred:

SystemExit: 1
/home/xxx/.conda/envs/xxx/lib/python3.7/site-packages/IPython/core/interactiveshell.py:3339: UserWarning: To exit: use 'exit', 'quit', or Ctrl-D.
  warn("To exit: use 'exit', 'quit', or Ctrl-D.", stacklevel=1)

The model was probably not downloaded correctly.
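If that is the case, deleting the cached copy and loading again forces a fresh download. A minimal sketch, assuming the default cache location ~/.hanlp and an ner/ subdirectory mirroring the download URL (both are assumptions; the path may differ if the HANLP_HOME environment variable is set):

import shutil
from pathlib import Path
import hanlp

# Assumed default cache directory; adjust if HANLP_HOME points elsewhere.
ner_cache = Path.home() / '.hanlp' / 'ner'
if ner_cache.exists():
    shutil.rmtree(ner_cache)  # remove the possibly corrupted model files
# Loading again should trigger a fresh download of the model zip.
recognizer = hanlp.load(hanlp.pretrained.ner.MSRA_NER_BERT_BASE_ZH)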


@Hang14 Has this been resolved? I ran into the same problem; could you share your solution?
Python 3.8.0 (default, May 7 2020, 02:49:39)
[GCC 8.3.1 20191121 (Red Hat 8.3.1-5)] on linux
Type "help", "copyright", "credits" or "license" for more information.

import hanlp
recognizer = hanlp.load(hanlp.pretrained.ner.MSRA_NER_BERT_BASE_ZH)
Failed to load https://file.hankcs.com/hanlp/ner/ner_bert_base_msra_20200104_185735.zip. See traceback below:
================================ERROR LOG BEGINS================================
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/hanlp/utils/component_util.py", line 48, in load_from_meta_file
    obj.load(save_dir, **load_kwargs)
  File "/usr/local/lib/python3.8/site-packages/hanlp/common/component.py", line 244, in load
    self.build(**merge_dict(self.config, training=False, logger=logger, **kwargs, overwrite=True, inplace=True))
  File "/usr/local/lib/python3.8/site-packages/hanlp/common/component.py", line 254, in build
    self.model = self.build_model(**merge_dict(self.config, training=kwargs.get('training', None),
  File "/usr/local/lib/python3.8/site-packages/hanlp/components/taggers/transformers/transformer_tagger.py", line 36, in build_model
    model, tokenizer = build_transformer(transformer, max_seq_length, len(self.transform.tag_vocab), tagging=True)
  File "/usr/local/lib/python3.8/site-packages/hanlp/layers/transformers/loader.py", line 44, in build_transformer
    assert len(vocab) == 1, 'No vocab found or unambiguous vocabs found'
AssertionError: No vocab found or unambiguous vocabs found
=================================ERROR LOG ENDS=================================
Please upgrade hanlp with:
pip install --upgrade hanlp

If the problem still persists, please submit an issue to https://github.com/hankcs/HanLP/issues
When reporting an issue, make sure to paste the FULL ERROR LOG above.

Please ignore this issue. It works after re-downloading chinese_L-12_H-768_A-12.zip.
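For anyone hitting the same assertion, a quick way to tell whether the cached BERT archive is intact is to check it with Python's zipfile module. A minimal sketch, assuming the archive sits somewhere under ~/.hanlp (an assumption; the actual location depends on HANLP_HOME):

import zipfile
from pathlib import Path

# Search the assumed HanLP cache for the BERT archive and verify it is a valid zip.
for path in (Path.home() / '.hanlp').rglob('chinese_L-12_H-768_A-12.zip'):
    status = 'OK' if zipfile.is_zipfile(path) else 'corrupted - delete and re-download'
    print(path, status)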