Is it normal to see these messages when loading a model?

Input: hanlp.load('LARGE_ALBERT_BASE')
Output:
loader: No value for:[albert/embeddings/word_embeddings_projector/projector:0], i.e.:[bert/embeddings/word_embeddings_2] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/embeddings/word_embeddings_projector/bias:0], i.e.:[bert/embeddings/word_embeddings_2/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/query/kernel:0], i.e.:[bert/encoder/layer_shared/attention/self/query/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/query/bias:0], i.e.:[bert/encoder/layer_shared/attention/self/query/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/key/kernel:0], i.e.:[bert/encoder/layer_shared/attention/self/key/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/key/bias:0], i.e.:[bert/encoder/layer_shared/attention/self/key/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/value/kernel:0], i.e.:[bert/encoder/layer_shared/attention/self/value/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/self/value/bias:0], i.e.:[bert/encoder/layer_shared/attention/self/value/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/output/dense/kernel:0], i.e.:[bert/encoder/layer_shared/attention/output/dense/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/output/dense/bias:0], i.e.:[bert/encoder/layer_shared/attention/output/dense/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/output/LayerNorm/gamma:0], i.e.:[bert/encoder/layer_shared/attention/output/LayerNorm/gamma] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/attention/output/LayerNorm/beta:0], i.e.:[bert/encoder/layer_shared/attention/output/LayerNorm/beta] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/intermediate/kernel:0], i.e.:[bert/encoder/layer_shared/intermediate/dense/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/intermediate/bias:0], i.e.:[bert/encoder/layer_shared/intermediate/dense/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/output/dense/kernel:0], i.e.:[bert/encoder/layer_shared/output/dense/kernel] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/output/dense/bias:0], i.e.:[bert/encoder/layer_shared/output/dense/bias] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/output/LayerNorm/gamma:0], i.e.:[bert/encoder/layer_shared/output/LayerNorm/gamma] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
loader: No value for:[albert/encoder/layer_shared/output/LayerNorm/beta:0], i.e.:[bert/encoder/layer_shared/output/LayerNorm/beta] in:[/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best]
Done loading 5 BERT weights from: /Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best into <bert.model.BertModelLayer object at 0x7fd6c181cd10> (prefix:albert). Count of weights not found in the checkpoint was: [18]. Count of weights with mismatched shape: [0]
Unused weights from checkpoint:
bert/encoder/embedding_hidden_mapping_in/bias
bert/encoder/embedding_hidden_mapping_in/kernel
bert/encoder/transformer/group_0/inner_group_0/LayerNorm/beta
bert/encoder/transformer/group_0/inner_group_0/LayerNorm/gamma
bert/encoder/transformer/group_0/inner_group_0/LayerNorm_1/beta
bert/encoder/transformer/group_0/inner_group_0/LayerNorm_1/gamma
bert/encoder/transformer/group_0/inner_group_0/attention_1/output/dense/bias
bert/encoder/transformer/group_0/inner_group_0/attention_1/output/dense/kernel
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/key/bias
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/key/kernel
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/query/bias
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/query/kernel
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/value/bias
bert/encoder/transformer/group_0/inner_group_0/attention_1/self/value/kernel
bert/encoder/transformer/group_0/inner_group_0/ffn_1/intermediate/dense/bias
bert/encoder/transformer/group_0/inner_group_0/ffn_1/intermediate/dense/kernel
bert/encoder/transformer/group_0/inner_group_0/ffn_1/intermediate/output/dense/bias
bert/encoder/transformer/group_0/inner_group_0/ffn_1/intermediate/output/dense/kernel
bert/pooler/dense/bias
bert/pooler/dense/kernel
cls/predictions/output_bias
cls/predictions/transform/LayerNorm/beta
cls/predictions/transform/LayerNorm/gamma
cls/predictions/transform/dense/bias
cls/predictions/transform/dense/kernel
cls/seq_relationship/output_bias
cls/seq_relationship/output_weights
global_step
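
For what it's worth, the variables actually stored in the downloaded checkpoint can be listed and compared against the names the loader reports as missing. This is just a debugging sketch using TensorFlow's standard checkpoint utility; the path is the one from the log above:

```python
import tensorflow as tf

# Path taken from the loader messages above.
ckpt = '/Users/yan/.hanlp/embeddings/albert_base_zh/model.ckpt-best'

# Print every variable name and shape stored in the checkpoint,
# to compare against the [albert/...] names the loader expected.
for name, shape in tf.train.list_variables(ckpt):
    print(name, shape)
```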

Hoping for a reply; I've been stuck on this and can't move forward.
I also couldn't find anyone else who has reported this issue.

Installed with: pip install hanlp -U
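
For completeness, a minimal script that reproduces the messages above (a sketch assuming HanLP 2.x with its TensorFlow backend; the identifier string is copied verbatim from the call above):

```python
import hanlp

# On first use this downloads albert_base_zh into ~/.hanlp/embeddings/
# and then restores the checkpoint; the "loader: No value for:..."
# lines above are printed during that restore step.
albert = hanlp.load('LARGE_ALBERT_BASE')
```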