Runtime error in hanlp 2.1 with hanlp.pretrained.mtl.UD_ONTONOTES_TOK_POS_LEM_FEA_NER_SRL_DEP_SDP_CON_XLMR_BASE

Code example:

import hanlp
print(hanlp.__version__)
text = 'No Apagues La Luz - Enrique Iglesias, 歌曲大合集,新整理欧美,No Apagues La Luz - Enrique Iglesias'

Sentence = hanlp.load(hanlp.pretrained.eos.UD_CTB_EOS_MUL)
sentences = Sentence(text)
print(sentences)

Tok = hanlp.load(hanlp.pretrained.mtl.UD_ONTONOTES_TOK_POS_LEM_FEA_NER_SRL_DEP_SDP_CON_XLMR_BASE) 
toks = Tok(sentences)

print(toks)

Output:

2.1.0-alpha.10
['No Apagues La Luz - Enrique Iglesias, 歌曲大合集,新整理欧美,No Apagues La Luz - Enrique Iglesias']
Traceback (most recent call last):
  File "test.py", line 29, in <module>
    toks = Tok(sentences)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/mtl/multi_task_learning.py", line 766, in __call__
    return super().__call__(data, batch_size, **kwargs)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/torch/autograd/grad_mode.py", line 26, in decorate_context
    return func(*args, **kwargs)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/common/torch_component.py", line 631, in __call__
    **kwargs))
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/common/component.py", line 36, in __call__
    return self.predict(data, **kwargs)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/mtl/multi_task_learning.py", line 511, in predict
    run_transform=True, cls_is_bos=cls_is_bos, sep_is_eos=sep_is_eos)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/mtl/multi_task_learning.py", line 591, in predict_task
    results[output_key].extend(task.prediction_to_result(output_dict[output_key]['prediction'], batch))
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/mtl/tasks/ud.py", line 134, in prediction_to_result
    yield from UniversalDependenciesParser.prediction_to_human(self, prediction, batch)
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/parsers/ud/ud_parser.py", line 339, in prediction_to_human
    lemma = [apply_lemma_rule(t, lem_vocab[r]) for t, r in zip(form, lemma)]
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/parsers/ud/ud_parser.py", line 339, in <listcomp>
    lemma = [apply_lemma_rule(t, lem_vocab[r]) for t, r in zip(form, lemma)]
  File "/Users/pnchen/workspace/projects/cnooc/awat/hanlp2.1/env/lib/python3.6/site-packages/hanlp/components/parsers/ud/lemma_edit.py", line 88, in apply_lemma_rule
    casing, rule = lemma_rule.split(";", 1)
ValueError: not enough values to unpack (expected 2, got 1)
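The traceback points at apply_lemma_rule in lemma_edit.py, which assumes every predicted lemma rule string contains a ";" separating the casing part from the edit part. A minimal sketch of that failure mode (the rule strings below are hypothetical examples, not taken from the model's actual vocabulary):

```python
# Sketch of the assumption inside apply_lemma_rule: a lemma rule is
# "<casing>;<edit rule>", so splitting on ";" must yield two parts.
def split_lemma_rule(lemma_rule):
    casing, rule = lemma_rule.split(";", 1)  # raises ValueError if ";" is absent
    return casing, rule

# A well-formed rule splits cleanly.
print(split_lemma_rule("keep;d|e"))

# A rule with no ";" (e.g. some placeholder label) reproduces the crash.
try:
    split_lemma_rule("_")
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)
```

So any vocabulary entry without a ";" (such as a padding or unknown-token label being decoded for this mixed Spanish/Chinese input) would trigger exactly this ValueError.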

Environment:
hanlp 2.1.0-alpha.10
Python 3.6.8 (v3.6.8:3c6b436a57, Dec 24 2018, 02:04:31)
macOS


Hi, this has been fixed:

I also saw in the backend logs that the issue you filed on GitHub was automatically removed by the spam-filtering script. Please fill it in according to the issue template. Thanks for your cooperation.