spaCy model from CLI debug-data

Problem description

I am trying to use the spaCy converter to turn an annotated JSON file into spaCy's format so that I can train from the CLI.

The converter executes successfully and produces an output JSON file, but the generated JSON appears to contain blank labels. I have searched every way I can think of and cannot find any such blank labels.
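The conversion step was a command of roughly this shape (a sketch, assuming spaCy v2.x, where spacy convert accepts a jsonl converter via -c and a language via -l; annotations.jsonl and output_dir are placeholder names):

  python3 -m spacy convert annotations.jsonl output_dir -c jsonl -l en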

This is what I ran for debug-data:

python3 -m spacy debug-data en For_Spacy_convert.json For_Spacy_convert.json -p ner



=========================== Data format validation ===========================
✔ Corpus is loadable

=============================== Training stats ===============================
Training pipeline: ner
Starting with blank model 'en'
60 training docs
60 evaluation docs
⚠ 60 training examples also in evaluation data
✘ Low number of examples to train from a blank model (60)

============================== Vocab & Vectors ==============================
ℹ 165939 total words in the data (13224 unique)
ℹ No word vectors present in the model

========================== Named Entity Recognition ==========================
ℹ 5 new labels, 0 existing labels
0 missing values (tokens with '-' label)
✘ Empty label found in new labels
⚠ 4 entity span(s) with punctuation
⚠ Low number of examples for new label '' (11)
✔ Examples without occurrences available for all labels
✔ No entities consisting of or starting/ending with whitespace
Entity spans consisting of or starting/ending with punctuation can not be
trained with a noise level > 0.

================================== Summary ==================================
✔ 3 checks passed
⚠ 3 warnings
✘ 2 errors

Below is a sample of the input used with spacy convert.

It is a JSONL file with one dictionary per line, like this:

  {'id': 194,
   'text': 'lots and lots of text',
   'Meta': {'file_name': 'file_name.docx'},
   'entities': [{'start': 1520, 'end': 1544, 'label': 'Client'},
                {'start': 1553, 'end': 1557, 'label': 'ent1'},
                {'start': 1560, 'end': 1567, 'label': 'ent2'},
                {'start': 1730, 'end': 1763, 'label': 'ent3'},
                {'start': 3570, 'end': 3621, {'start': 1764, 'end': 1768, ...]}
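Since debug-data counts 11 spans under the empty label '', the annotations must contain spans with a missing or blank 'label' somewhere. Here is a minimal sketch to hunt for them, assuming each line parses as a JSON object shaped like the sample above (annotations.jsonl is again a placeholder name):

  import json

  # Print every entity span whose 'label' is missing or blank, assuming
  # one JSON object per line with an 'entities' list as shown above.
  with open("annotations.jsonl", encoding="utf-8") as f:
      for line_no, line in enumerate(f, start=1):
          record = json.loads(line)
          for ent in record.get("entities", []):
              if not (ent.get("label") or "").strip():
                  print(f"line {line_no} (id={record.get('id')}): "
                        f"blank label in span {ent['start']}-{ent['end']}")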

If I ignore spacy debug-data and go ahead with training anyway, since I am certain I have no blank labels, I get TypeError: unsupported operand type(s) for /: 'NoneType' and 'str'.
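For what it is worth, that exact error shape is what Python raises when the pathlib-style '/' join operator is applied to a value that is None, i.e. some path ended up unset. A minimal reproduction (my assumption about the mechanism, not a confirmed spaCy traceback):

  # Reproduces the same TypeError: joining a path segment onto None.
  output_dir = None     # stands in for a path argument that was never resolved
  output_dir / "model"  # TypeError: unsupported operand type(s) for /: 'NoneType' and 'str'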

I have been at this for 4 days now and am still stuck!

Any help is appreciated; thanks in advance.
