allenai / kb

KnowBert -- Knowledge Enhanced Contextual Word Representations

License: Apache License 2.0


kb's Introduction

KnowBert

KnowBert is a general method to embed multiple knowledge bases into BERT. This repository contains pretrained models, evaluation and training scripts for KnowBert with Wikipedia and WordNet.

Citation:

@inproceedings{Peters2019KnowledgeEC,
  author={Matthew E. Peters and Mark Neumann and Robert L Logan and Roy Schwartz and Vidur Joshi and Sameer Singh and Noah A. Smith},
  title={Knowledge Enhanced Contextual Word Representations},
  booktitle={EMNLP},
  year={2019}
}

Getting started

git clone [email protected]:allenai/kb.git
cd kb
conda create -n knowbert python=3.6.7
source activate knowbert
pip install torch==1.2.0
pip install -r requirements.txt
python -c "import nltk; nltk.download('wordnet')"
python -m spacy download en_core_web_sm
pip install --editable .

Then make sure the tests pass:

pytest -v tests

Pretrained Models

How to embed sentences or sentence pairs programmatically

from kb.include_all import ModelArchiveFromParams
from kb.knowbert_utils import KnowBertBatchifier
from allennlp.common import Params

import torch

# a pretrained model, e.g. for Wordnet+Wikipedia
archive_file = 'https://allennlp.s3-us-west-2.amazonaws.com/knowbert/models/knowbert_wiki_wordnet_model.tar.gz'

# load model and batcher
params = Params({"archive_file": archive_file})
model = ModelArchiveFromParams.from_params(params=params)
batcher = KnowBertBatchifier(archive_file)

sentences = ["Paris is located in France.", "KnowBert is a knowledge enhanced BERT"]

# batcher takes raw untokenized sentences
# and yields batches of tensors needed to run KnowBert
for batch in batcher.iter_batches(sentences, verbose=True):
    # model_output['contextual_embeddings'] is (batch_size, seq_len, embed_dim) tensor of top layer activations
    model_output = model(**batch)
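
If you need one vector per sentence rather than per-token activations, you can pool the contextual embeddings. The snippet below continues the example above and is only a sketch: it assumes padding tokens have id 0 (the BERT convention), which should be verified against your vocabulary.

import torch

# Hedged sketch: mean-pool KnowBert's top-layer activations into one
# vector per sentence, masking out padding positions (assumed id 0).
for batch in batcher.iter_batches(sentences, verbose=True):
    model_output = model(**batch)
    embeddings = model_output['contextual_embeddings']  # (batch, seq_len, dim)
    mask = (batch['tokens']['tokens'] > 0).float().unsqueeze(-1)
    sentence_vectors = (embeddings * mask).sum(1) / mask.sum(1)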

How to run intrinsic evaluation

First download one of the pretrained models from the previous section.

Heldout perplexity (Table 1)

Download the heldout data. Then run:

MODEL_ARCHIVE=..location of model
HELDOUT_FILE=wikipedia_bookscorpus_knowbert_heldout.txt
python bin/evaluate_perplexity.py -m $MODEL_ARCHIVE -e $HELDOUT_FILE

The heldout perplexity is exp(lm_loss_wgt), where lm_loss_wgt is one of the keys in the reported metrics.
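
A minimal sketch, with a hypothetical value for the reported loss:

import math

# lm_loss_wgt is a hypothetical value read from the printed metrics;
# the heldout perplexity is its exponential.
lm_loss_wgt = 2.31
perplexity = math.exp(lm_loss_wgt)  # about 10.1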

Wikidata KG probe (Table 1)

Run:

MODEL_ARCHIVE=..location of model

mkdir -p kg_probe
cd kg_probe
curl https://allennlp.s3-us-west-2.amazonaws.com/knowbert/data/kg_probe.zip > kg_probe.zip
unzip kg_probe.zip

cd ..
python bin/evaluate_mrr.py \
    --model_archive $MODEL_ARCHIVE \
    --datadir kg_probe \
    --cuda_device 0

The results are in key 'mrr'.
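
For reference, mean reciprocal rank averages the reciprocal rank of the first correct entity over all probe queries; a minimal sketch with hypothetical ranks:

# MRR over hypothetical ranks of the first correct answer per query.
ranks = [1, 3, 2, 10]
mrr = sum(1.0 / r for r in ranks) / len(ranks)  # about 0.483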

Word-sense disambiguation

To evaluate the internal WordNet linker on the ALL task evaluation from Raganato et al. (2017) follow these steps (Table 2). First download the Java scorer and evaluation file.

Then run this command to generate predictions from KnowBert:

EVALUATION_FILE=semeval2007_semeval2013_semeval2015_senseval2_senseval3_all.json
KNOWBERT_PREDICTIONS=knowbert_wordnet_predicted.txt
MODEL_ARCHIVE=..location of model

python bin/evaluate_wsd_official.py \
    --evaluation_file $EVALUATION_FILE \
    --output_file $KNOWBERT_PREDICTIONS \
    --model_archive $MODEL_ARCHIVE \
    --cuda_device 0

To evaluate predictions, decompress the Java scorer, navigate to the directory WSD_Evaluation_Framework/Evaluation_Datasets and run

java Scorer ALL/ALL.gold.key.txt $KNOWBERT_PREDICTIONS

AIDA Entity linking

To reproduce the results in Table 3 for KnowBert-W+W:

# or aida_test.txt
EVALUATION_FILE=aida_dev.txt
MODEL_ARCHIVE=..location of model

curl https://allennlp.s3-us-west-2.amazonaws.com/knowbert/wiki_entity_linking/$EVALUATION_FILE > $EVALUATION_FILE

python bin/evaluate_wiki_linking.py \
    --model_archive $MODEL_ARCHIVE \
    --evaluation_file $EVALUATION_FILE \
    --wiki_and_wordnet

Results are in key wiki_el_f1.
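
The linking F1 is the usual harmonic mean of precision and recall over predicted mention-entity pairs; a minimal sketch with hypothetical counts:

# F1 from hypothetical counts of correct, predicted, and gold links.
true_positives, num_predicted, num_gold = 850, 1000, 950
precision = true_positives / num_predicted
recall = true_positives / num_gold
f1 = 2 * precision * recall / (precision + recall)  # about 0.87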

Fine tuning KnowBert for downstream tasks

Fine tuning KnowBert is similar to fine tuning BERT for a downstream task. We provide configuration and model files for the following tasks:

  • Relation extraction: TACRED and SemEval 2010 Task 8
  • Entity typing (Choi et al 2018)
  • Binary sentence classification: Words-in-Context

To reproduce our results for the following tasks, find the appropriate config file in training_config/downstream/, edit the location of the training and dev data files, then run (example provided for TACRED):

allennlp train --file-friendly-logging --include-package kb.include_all \
        training_config/downstream/tacred.jsonnet -s OUTPUT_DIRECTORY

Similar to BERT, for some tasks performance can vary significantly with hyperparameter choices and the random seed. We used the script bin/run_hyperparameter_seeds.sh to perform a small grid search over learning rate, number of epochs and the random seed, choosing the best model based on the validation set.
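
The script holds the exact grid; the loop below is only an illustrative sketch of that kind of search, with hypothetical value ranges, override key paths, and output directories:

import itertools
import subprocess

# Illustrative grid search over learning rate, epochs, and seed; the
# values and override key paths are assumptions, not the contents of
# bin/run_hyperparameter_seeds.sh.
for lr, epochs, seed in itertools.product([2e-5, 3e-5, 5e-5], [3, 4], [1, 2, 3]):
    overrides = ('{"trainer": {"optimizer": {"lr": %g}, "num_epochs": %d}, '
                 '"random_seed": %d}' % (lr, epochs, seed))
    subprocess.run(["allennlp", "train", "--file-friendly-logging",
                    "--include-package", "kb.include_all",
                    "--overrides", overrides,
                    "training_config/downstream/tacred.jsonnet",
                    "-s", "output_lr%g_ep%d_seed%d" % (lr, epochs, seed)],
                   check=True)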

Evaluating fine tuned models

Fine-tuned KnowBert-Wiki+Wordnet models are available.

To evaluate a model, first download the model archive and run:

allennlp evaluate --include-package kb.include_all \
    --cuda-device 0 \
    model_archive_here \
    dev_or_test_filename_here

TACRED

To evaluate a model with the official scorer, run:

python bin/write_tacred_for_official_scorer.py \
    --model_archive model_archive_here \
    --evaluation_file tacred_dev_or_test.json \
    --output_file knowbert_predictions_tacred_dev_or_test.txt

python bin/tacred_scorer.py tacred_dev_or_test.gold knowbert_predictions_tacred_dev_or_test.txt

SemEval 2010 Task 8

To evaluate a model with the official scorer, first download the testing gold keys and run:

curl https://allennlp.s3-us-west-2.amazonaws.com/knowbert/data/semeval2010_task8/test.json > semeval2010_task8_test.json

python bin/write_semeval2010_task8_for_official_eval.py \
    --model_archive model_archive_here \
    --evaluation_file semeval2010_task8_test.json \
    --output_file knowbert_predictions_semeval2010_task8_test.txt

perl -w bin/semeval2010_task8_scorer-v1.2.pl knowbert_predictions_semeval2010_task8_test.txt semeval2010_task8_testing_keys.txt

WiC

Use bin/write_wic_for_codalab.py to write a file for submission to the CodaLab evaluation server.

How to pretrain KnowBert

Roughly speaking, the process to fine tune BERT into KnowBert is:

  1. Prepare your corpus.
  2. Prepare the knowledge bases (not necessary if you are using Wikipedia or WordNet as we have already prepared these).
  3. For each knowledge base:
    1. Pretrain the entity linker while freezing everything else.
    2. Fine tune all parameters (except entity embeddings).

Prepare your corpus.

  1. Sentence tokenize your training corpus using spacy, and prepare input files for next-sentence-prediction sampling. Each file contains one sentence per line, with consecutive sentences on subsequent lines and blank lines separating documents (see the sketch after this list).
  2. Run bin/create_pretraining_data_for_bert.py to group the sentences by length, do the NSP sampling, and write out files for training.
  3. Reserve one or more of the training files for heldout evaluation.
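
A minimal sketch of step 1, assuming plain-text input with one document per line (file names are placeholders):

import spacy

# Split each document into sentences: one sentence per line, with a
# blank line between documents, as required for NSP sampling.
nlp = spacy.load("en_core_web_sm")
with open("raw_documents.txt") as fin, open("corpus_sentences.txt", "w") as fout:
    for document in fin:
        for sent in nlp(document.strip()).sents:
            fout.write(sent.text.strip() + "\n")
        fout.write("\n")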

Prepare the input knowledge bases.

  1. We have already prepared the knowledge bases for Wikipedia and WordNet. The necessary files will be automatically downloaded as needed when running evaluations or fine tuning KnowBert.

  2. If you would like to add an additional knowledge source to KnowBert, these are roughly the steps to follow:

    1. Compute entity embeddings for each entity in your knowledge base.
    2. Write a candidate generator for the entity linkers. Use the existing WordNet or Wikipedia generators as templates (a schematic sketch follows at the end of this list).
  3. Our Wikipedia candidate dictionary list and embeddings were extracted from End-to-End Neural Entity Linking, Kolitsas et al 2018 via a manual process.

  4. Our WordNet candidate generator is rule based (see code). The embeddings were computed via a multistep process that combines TuckER and GenSen embeddings. The prepared files contain everything needed to run KnowBert and include:

    1. entities.jsonl - metadata about WordNet synsets.
    2. wordnet_synsets_mask_null_vocab.txt and wordnet_synsets_mask_null_vocab_embeddings_tucker_gensen.hdf5 - vocabulary file and embedding file for WordNet synsets.
    3. semcor_and_wordnet_examples.json annotated training data combining SemCor and WordNet examples for supervising the WordNet linker.
  5. If you would like to generate these files yourself from scratch, follow these steps.

    1. Extract the WordNet metadata and relationship graph.
      python bin/extract_wordnet.py --extract_graph --entity_file $WORKDIR/entities.jsonl --relationship_file $WORKDIR/relations.txt
      
    2. Download the Words-in-Context dataset to exclude from the extracted WordNet example usages.
      WORKDIR=.
      cd $WORKDIR
      wget https://pilehvar.github.io/wic/package/WiC_dataset.zip
      unzip WiC_dataset.zip
      
    3. Download the word sense disambiguation data:
      cd $WORKDIR
      wget http://lcl.uniroma1.it/wsdeval/data/WSD_Evaluation_Framework.zip
      unzip WSD_Evaluation_Framework.zip
      
    4. Convert the WSD data from XML to jsonl, and concatenate all evaluation files for easy evaluation:
      mkdir $WORKDIR/wsd_jsonl
      python bin/preprocess_wsd.py --wsd_framework_root $WORKDIR/WSD_Evaluation_Framework  --outdir $WORKDIR/wsd_jsonl
      cat $WORKDIR/wsd_jsonl/semeval* $WORKDIR/wsd_jsonl/senseval* > $WORKDIR/semeval2007_semeval2013_semeval2015_senseval2_senseval3.json
      
    5. Extract all the synset example usages from WordNet (after removing sentences from WiC heldout sets):
      python bin/extract_wordnet.py --extract_examples_wordnet --entity_file $WORKDIR/entities.jsonl --wic_root_dir $WORKDIR --wordnet_example_file $WORKDIR/wordnet_examples_remove_wic_devtest.json
      
    6. Combine WordNet examples and definitions with SemCor for training KnowBert:
      cat $WORKDIR/wordnet_examples_remove_wic_devtest.json $WORKDIR/wsd_jsonl/semcor.json > $WORKDIR/semcor_and_wordnet_examples.json
      
    7. Create training and test splits of the relationship graph.
      python bin/extract_wordnet.py --split_wordnet --relationship_file $WORKDIR/relations.txt --relationship_train_file $WORKDIR/relations_train99.txt --relationship_dev_file $WORKDIR/relations_dev01.txt
      
    8. Train TuckER embeddings on the extracted graph. The configuration file uses relationship graph files on S3; you can substitute the files generated in the previous step by modifying the configuration file.
      allennlp train -s $WORKDIR/wordnet_tucker --include-package kb.kg_embedding --file-friendly-logging training_config/wordnet_tucker.json
      
    9. Generate a vocabulary file for the WordNet synsets, including the special tokens:
      python bin/combine_wordnet_embeddings.py --generate_wordnet_synset_vocab --entity_file $WORKDIR/entities.jsonl --vocab_file $WORKDIR/wordnet_synsets_mask_null_vocab.txt
      
    10. Get the GenSen embeddings from each synset definition. First install the code from this link. Then run
      python bin/combine_wordnet_embeddings.py --generate_gensen_embeddings --entity_file $WORKDIR/entities.jsonl --vocab_file $WORKDIR/wordnet_synsets_mask_null_vocab.txt --gensen_file $WORKDIR/gensen_synsets.hdf5
      
    11. Extract the TuckER embeddings for the synsets from the trained model
      python bin/combine_wordnet_embeddings.py --extract_tucker --tucker_archive_file $WORKDIR/wordnet_tucker/model.tar.gz --vocab_file $WORKDIR/wordnet_synsets_mask_null_vocab.txt --tucker_hdf5_file $WORKDIR/tucker_embeddings.hdf5
      
    12. Finally combine the TuckER and GenSen embeddings into one file
      python bin/combine_wordnet_embeddings.py --combine_tucker_gensen --tucker_hdf5_file $WORKDIR/tucker_embeddings.hdf5 --gensen_file $WORKDIR/gensen_synsets.hdf5 --all_embeddings_file $WORKDIR/wordnet_synsets_mask_null_vocab_embeddings_tucker_gensen.hdf5
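
To make step 2 of Prepare the input knowledge bases concrete, here is a schematic sketch of the shape a candidate generator might take. The method and field names are assumptions modeled on the existing WordNet and Wikipedia generators; consult those classes for the exact contract.

class MyKGMentionGenerator:
    """Hedged sketch of a candidate generator for a new knowledge base.

    The field names in the returned dict ('candidate_spans',
    'candidate_entities', 'candidate_entity_priors') are assumptions
    modeled on the existing WordNet/Wikipedia generators.
    """
    def __init__(self, alias_to_entities):
        # alias_to_entities: surface string -> list of
        # (entity_id, prior_probability) pairs from your knowledge base
        self.alias_to_entities = alias_to_entities

    def get_mentions_raw_text(self, text):
        tokens = text.split()
        spans, entities, priors = [], [], []
        for i, token in enumerate(tokens):
            candidates = self.alias_to_entities.get(token.lower())
            if candidates:
                spans.append([i, i])  # inclusive token span
                entities.append([eid for eid, _ in candidates])
                priors.append([p for _, p in candidates])
        return {
            'tokenized_text': tokens,
            'candidate_spans': spans,
            'candidate_entities': entities,
            'candidate_entity_priors': priors,
        }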
      

Pretraining the entity linkers

This step pretrains the entity linker while freezing the rest of the network using only supervised data.

Config files are in training_config/pretraining/knowbert_wiki_linker.jsonnet and training_config/pretraining/knowbert_wordnet_linker.jsonnet.

To train the Wikipedia linker for KnowBert-Wiki run:

allennlp train -s OUTPUT_DIRECTORY --file-friendly-logging --include-package kb.include_all training_config/pretraining/knowbert_wiki_linker.jsonnet

The command is similar for WordNet.

Fine tuning BERT

After pre-training the entity linkers from the step above, fine tune BERT. The pretrained models in our paper were trained on a single GPU with 24GB of RAM. For multiple GPU training, change cuda_device to a list of device IDs.

Config files are in training_config/pretraining/knowbert_wiki.jsonnet and training_config/pretraining/knowbert_wordnet.jsonnet.

Before training, modify the following keys in the config file (or use the --overrides flag to allennlp train; an example follows the list):

  • "language_modeling"
  • "model_archive" to point to the model.tar.gz from the previous linker pretraining step.

KnowBert Wordnet + Wiki

First train KnowBert-Wiki. Then pretrain the WordNet linker and finally fine tune the entire network.

Config file to pretrain the WordNet linker from KnowBert-Wiki is in training_config/pretraining/knowbert_wordnet_wiki_linker.jsonnet and config to train KnowBert-W+W is in training_config/pretraining/knowbert_wordnet_wiki.jsonnet.

kb's People

Contributors

matt-peters


kb's Issues

Error: unable to locate credentials

Hi,

I'm trying to embed sentences programmatically following the guide in the readme (thanks for it)

The command model = ModelArchiveFromParams.from_params(params=params) returns the error Error: unable to locate credentials after downloading the model.

How can I fix this error?

P.S. Happy new year

self-supervised entity linker

Thanks for making the code available online.
I want to insert a new knowledge graph (such as ConceptNet) into KnowBert. To do that I need to train an entity linker for it. However, I don't have labelled data for the entity linker, and as I understood from the paper, it is possible to train the linker in a self-supervised way.
How can I adapt the configuration files to do this?

Thanks in advance

Training configs missing?

Hi Dr. Peters, thank you for making the code available.

However, I noticed that the training_config seems to be missing from the repo, and I did not find it in the custom AllenNLP repo either. I would greatly appreciate it if you could make those configs available!

Having trouble running the "Getting started" section (even the tests do not run)

Hi,

I am trying to run the KnowBert code but keep getting stuck with the following error when trying to run the tests or the "How to embed sentences or sentence pairs programmatically" snippet:

tests/evaluation/test_wic_reader.py:3: in <module>
    from kb.include_all import  WicDatasetReader
kb/include_all.py:2: in <module>
    from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:8: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
/opt/conda/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
/opt/conda/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:88: in overrides
    return _overrides(method, check_signature, check_at_runtime)
/opt/conda/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:114: in _overrides
    _validate_method(method, super_class, check_signature)
/opt/conda/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:135: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
/opt/conda/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:93: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
/opt/conda/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:288: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.

Steps to reproduce

  1. Enter a clean anaconda docker container docker run -it continuumio/anaconda3
  2. apt-get update && apt install -y g++ make
  3. git clone https://github.com/allenai/kb.git
  4. conda create -n knowbert python=3.6.7
  5. source activate knowbert
  6. pip install torch==1.2.0
  7. pip install -r requirements.txt
  8. python -c "import nltk; nltk.download('wordnet')"
  9. python -m spacy download en_core_web_sm
  10. pip install --editable .
  11. pytest -v tests

Evaluate Perplexity: Issue when constructing DatasetReader from parameters (AWS API)

Dear @matt-peters et al.,
First of all: Impressive work with KnowBert!

I'd like to replicate the results based on the pretrained model and as a start: evaluate the perplexity of KnowBert-wiki.

I have followed the Getting Started setup, downloaded the heldout wiki book corpus, and set up everything in a submit script almost identical to:

MODEL_ARCHIVE=..location of model
HELDOUT_FILE=wikipedia_bookscorpus_knowbert_heldout.txt
python bin/evaluate_perplexity.py -m $MODEL_ARCHIVE -e $HELDOUT_FILE

Having done that, I run the file, but experience a bug when I try to construct the DatasetReader from parameters after having loaded the model from the pretrained model archive knowbert_wiki_model.tar.gz. See error message below:

File "bin/evaluate_perplexity.py", line 72, in <module>
    random_candidates=False)
  File "bin/evaluate_perplexity.py", line 38, in run_evaluation
    reader = DatasetReader.from_params(Params(reader_params))
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 300, in from_params
    kwargs = create_kwargs(cls, params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 159, in create_kwargs
    kwargs[name] = annotation.from_params(params=subparams, **subextras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 300, in from_params
    kwargs = create_kwargs(cls, params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 159, in create_kwargs
    kwargs[name] = annotation.from_params(params=subparams, **subextras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 300, in from_params
    kwargs = create_kwargs(cls, params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 194, in create_kwargs
    value_dict[key] = value_cls.from_params(params=value_params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 302, in from_params
    return cls(**kwargs)  # type: ignore
  File "/zhome/9e/7/97809/thesis/final-project-02456/kb-master/kb/wiki_linking_util.py", line 152, in __init__
    entity_world_path = cached_path(entity_world_path or self.defaults["entity_world_path"])
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/file_utils.py", line 98, in cached_path
    return get_from_cache(url_or_filename, cache_dir)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/file_utils.py", line 194, in get_from_cache
    etag = s3_etag(url)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/file_utils.py", line 142, in wrapper
    return func(url, *args, **kwargs)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/file_utils.py", line 158, in s3_etag
    return s3_object.e_tag
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/boto3/resources/factory.py", line 339, in property_loader
    self.load()
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/boto3/resources/factory.py", line 505, in do_action
    response = action(self, *args, **kwargs)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/boto3/resources/action.py", line 83, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/zhome/9e/7/97809/miniconda3/envs/knowbert/lib/python3.6/site-packages/botocore/client.py", line 661, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

Before I experienced this error, I got a botocore.exceptions.NoCredentialsError: Unable to locate credentials error similar to the issue presented here: getmoto/moto#1941.
But after I created an AWS configuration, I got:

botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

Have you experienced this before? Is the issue that I do not have access to the data requested in the queried S3 AWS bucket?

I thank you for your time!

Best regards,
Victor

Unable to pass the pytests

Hi,

I tried to set up the environment following the guidance in the readme, but after running all the commands the tests still fail. This is the error I got; I wonder how to solve this problem. Thanks!

========================================== ERRORS ==========================================
__________________ ERROR collecting tests/test_bert_pretraining_reader.py __________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_bert_pretraining_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_bert_pretraining_reader.py:2: in
from kb.bert_pretraining_reader import BertPreTrainingReader,
kb/bert_pretraining_reader.py:15: in
from kb.bert_tokenizer_and_candidate_generator import TokenizerAndCandidateGenerator, start_token, sep_token
kb/bert_tokenizer_and_candidate_generator.py:14: in
from kb.common import MentionGenerator, get_empty_candidates
kb/common.py:11: in
from allennlp.training.metrics.metric import Metric
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/training/init.py:1: in
from allennlp.training.trainer import Trainer
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/training/trainer.py:22: in
from allennlp.models.model import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
__________ ERROR collecting tests/test_bert_tokenizer_and_candidate_generator.py ___________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_bert_tokenizer_and_candidate_generator.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_bert_tokenizer_and_candidate_generator.py:4: in
from kb.bert_tokenizer_and_candidate_generator import BertTokenizerAndCandidateGenerator
kb/bert_tokenizer_and_candidate_generator.py:14: in
from kb.common import MentionGenerator, get_empty_candidates
kb/common.py:11: in
from allennlp.training.metrics.metric import Metric
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/training/metrics/init.py:12: in
from allennlp.training.metrics.conll_coref_scores import ConllCorefScores
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/training/metrics/conll_coref_scores.py:5: in
from sklearn.utils.linear_assignment_ import linear_assignment
E ModuleNotFoundError: No module named 'sklearn.utils.linear_assignment_'
________________________ ERROR collecting tests/test_dict_field.py _________________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_dict_field.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_dict_field.py:12: in
from kb.entity_linking import TokenCharactersIndexerTokenizer
kb/entity_linking.py:77: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
______________________ ERROR collecting tests/test_entity_linking.py _______________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_entity_linking.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_entity_linking.py:9: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_______________________ ERROR collecting tests/test_kg_embedding.py ________________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_kg_embedding.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_kg_embedding.py:5: in
from kb.kg_embedding import KGTupleReader, get_labels_tensor_from_indices,
kb/kg_embedding.py:13: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
______________________ ERROR collecting tests/test_kg_probe_reader.py ______________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_kg_probe_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_kg_probe_reader.py:10: in
from kb.wordnet import WordNetCandidateMentionGenerator
kb/wordnet.py:25: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_________________________ ERROR collecting tests/test_knowbert.py __________________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_knowbert.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_knowbert.py:10: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
________________________ ERROR collecting tests/test_wiki_reader.py ________________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_wiki_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_wiki_reader.py:4: in
from allennlp.common.testing.test_case import AllenNlpTestCase
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/common/testing/init.py:5: in
from allennlp.common.testing.model_test_case import ModelTestCase
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/common/testing/model_test_case.py:7: in
from allennlp.commands.train import train_model_from_file
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/commands/init.py:8: in
from allennlp.commands.configure import Configure
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/commands/configure.py:27: in
from allennlp.service.config_explorer import make_app
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/service/config_explorer.py:24: in
from allennlp.common.configuration import configure, choices
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/common/configuration.py:21: in
from allennlp.modules.seq2seq_encoders import _Seq2SeqWrapper
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
__________________________ ERROR collecting tests/test_wordnet.py __________________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/test_wordnet.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_wordnet.py:7: in
from kb.wordnet import WordNetFineGrainedSenseDisambiguationReader
kb/wordnet.py:25: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_______________ ERROR collecting tests/evaluation/test_semeval2010_task8.py ________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/evaluation/test_semeval2010_task8.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/evaluation/test_semeval2010_task8.py:3: in
from kb.include_all import SemEval2010Task8Reader, SemEval2010Task8Metric
kb/include_all.py:2: in
from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:13: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_______________ ERROR collecting tests/evaluation/test_simple_classifier.py ________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/evaluation/test_simple_classifier.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/evaluation/test_simple_classifier.py:3: in
from kb.include_all import SimpleClassifier, F1Metric
kb/include_all.py:2: in
from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:13: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_________________ ERROR collecting tests/evaluation/test_tacred_reader.py __________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/evaluation/test_tacred_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/evaluation/test_tacred_reader.py:10: in
from kb.wordnet import WordNetCandidateMentionGenerator
kb/wordnet.py:25: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
_______________ ERROR collecting tests/evaluation/test_ultra_fine_reader.py ________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/evaluation/test_ultra_fine_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/evaluation/test_ultra_fine_reader.py:7: in
from kb.include_all import UltraFineReader
kb/include_all.py:2: in
from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:13: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
___________________ ERROR collecting tests/evaluation/test_wic_reader.py ___________________
ImportError while importing test module '/work/08582/yqing/maverick2/kb/tests/evaluation/test_wic_reader.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../miniconda3/envs/knowbert2/lib/python3.6/importlib/init.py:126: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/evaluation/test_wic_reader.py:3: in
from kb.include_all import WicDatasetReader
kb/include_all.py:2: in
from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:13: in
from allennlp.models import Model
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/init.py:8: in
from allennlp.models.biattentive_classification_network import BiattentiveClassificationNetwork
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/models/biattentive_classification_network.py:12: in
from allennlp.modules import Elmo, FeedForward, Maxout, Seq2SeqEncoder, TextFieldEmbedder
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/init.py:9: in
from allennlp.modules.elmo import Elmo
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/allennlp/modules/elmo.py:12: in
import h5py
/opt/apps/intel18/impi18_0/python2/2.7.16/lib/python2.7/site-packages/h5py/init.py:26: in
from . import _errors
E ImportError: libhdf5.so.103: cannot open shared object file: No such file or directory
===================================== warnings summary =====================================
../miniconda3/envs/knowbert2/lib/python3.6/site-packages/plac_ext.py:6
/work/08582/yqing/maverick2/miniconda3/envs/knowbert2/lib/python3.6/site-packages/plac_ext.py:6: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
import imp

../miniconda3/envs/knowbert2/lib/python3.6/site-packages/google/auth/crypt/_cryptography_rsa.py:22
/work/08582/yqing/maverick2/miniconda3/envs/knowbert2/lib/python3.6/site-packages/google/auth/crypt/_cryptography_rsa.py:22: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography and will be removed in a future release.
import cryptography.exceptions

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
================================= short test summary info ==================================
ERROR tests/test_bert_pretraining_reader.py
ERROR tests/test_bert_tokenizer_and_candidate_generator.py
ERROR tests/test_dict_field.py
ERROR tests/test_entity_linking.py
ERROR tests/test_kg_embedding.py
ERROR tests/test_kg_probe_reader.py
ERROR tests/test_knowbert.py
ERROR tests/test_wiki_reader.py
ERROR tests/test_wordnet.py
ERROR tests/evaluation/test_semeval2010_task8.py
ERROR tests/evaluation/test_simple_classifier.py
ERROR tests/evaluation/test_tacred_reader.py
ERROR tests/evaluation/test_ultra_fine_reader.py
ERROR tests/evaluation/test_wic_reader.py
!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 14 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!
============================= 2 warnings, 14 errors in 24.48s ==============================

Cannot download s3://allennlp/knowbert/models/vocabulary_wordnet.tar.gz file

Hi, I am encountering errors when trying to download s3://allennlp/knowbert/models/vocabulary_wordnet.tar.gz as specified in training_config/pretraining/knowbert_wordnet_linker.jsonnet.

I've had no issues when downloading the other s3 files in the config file (and even in the models folder), but vocabulary_wordnet.tar.gz gives me the error botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

Full traceback:

Traceback (most recent call last):
  File "/home/livia/.local/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 3319, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-33-161fe1979b1e>", line 1, in <module>
    s3.download_file('allennlp', 'knowbert/models/vocabulary_wordnet.tar.gz', 'vocabulary.tar.gz')
  File "/home/livia/.local/lib/python3.6/site-packages/boto3/s3/inject.py", line 172, in download_file
    extra_args=ExtraArgs, callback=Callback)
  File "/home/livia/.local/lib/python3.6/site-packages/boto3/s3/transfer.py", line 307, in download_file
    future.result()
  File "/home/livia/.local/lib/python3.6/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/home/livia/.local/lib/python3.6/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/home/livia/.local/lib/python3.6/site-packages/s3transfer/tasks.py", line 255, in _main
    self._submit(transfer_future=transfer_future, **kwargs)
  File "/home/livia/.local/lib/python3.6/site-packages/s3transfer/download.py", line 343, in _submit
    **transfer_future.meta.call_args.extra_args
  File "/home/livia/.local/lib/python3.6/site-packages/botocore/client.py", line 276, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/livia/.local/lib/python3.6/site-packages/botocore/client.py", line 586, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

I have upgraded allennlp using pip install git+git://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba as in #14 but to no avail.
If I try a simple curl https://allennlp.s3.amazonaws.com/knowbert/models/vocabulary_wordnet.tar.gz from the command line I get

<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>EE39315B861F9F02</RequestId><HostId>H/WFn0ut8JZiSGgaaxeR1iffK/mGhZwD8vEk9A5Wgvw9Ac9/LKPz+2TECiEpr0KswVbFDtu907M=</HostId></Error>

Do you know how I could fix this?

Use KnowBert to predict missing words

Hi Matthew,

Thanks a bunch for the documentation on embedding sentences programmatically. It saves me a lot of time! I did a little bit of modification so that I can use KnowBert to predict the missing word (i.e., [MASK]) in a sentence, but found the results are unexpected. I am not sure if my implementation is correct; here is the code snippet:

from kb.include_all import ModelArchiveFromParams
from kb.knowbert_utils import KnowBertBatchifier
from allennlp.common import Params
import torch
import torch.nn.functional as F

archive_file = 'https://allennlp.s3-us-west-2.amazonaws.com/knowbert/models/knowbert_wiki_wordnet_model.tar.gz'

# load model and batcher
params = Params({'archive_file': archive_file})
model = ModelArchiveFromParams.from_params(params=params)
model.eval()
batcher = KnowBertBatchifier(archive_file)

# get bert vocab
vocab = list(batcher.tokenizer_and_candidate_generator.bert_tokenizer.ids_to_tokens.values())

sentences = ['Paris is located in [MASK].']
mask_ind = 5

for batch in batcher.iter_batches(sentences, verbose=False):
    model_output = model(**batch)
    # the tokenized sentence, where the 6-th token is [MASK]
    print([vocab[w] for w in batch['tokens']['tokens'][0].numpy()])
    logits, _ = model.pretraining_heads(model_output['contextual_embeddings'], model_output['pooled_output'])
    log_probs = F.log_softmax(logits, dim=-1)
    topk = torch.topk(log_probs[0, mask_ind], 10, 0)[1]
    # print the top 10 predictions
    print([vocab[t.item()] for t in topk])

The top 10 predictions are [UNK], the, itself, its, and, marne, to, them, first, lissa, while the top 10 predictions of BERT-uncased-base are france, paris, europe, italy, belgium, algeria, germany, russia, haiti, canada, which seems a little bit weird. Is my implementation correct, or do you have any suggestions? Thanks in advance!

Pre-trained model

Hi! I'd like to know how to replace bert-base-uncased with the pretrained KnowBert downloaded from the link given. It seems that knowbert_wordnet_model does not contain the relevant tokenizer files.

Expecting property name enclosed in double quotes

When I run the example from How to embed sentences or sentence pairs programmatically, I get an error: json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 242 column 5 (char 8926) when loading config.json. Loading config.json by itself gives the same error.

Fine tuning KnowBert for downstream tasks

Hi, I am trying to run the code below to fine-tune KnowBert on the TACRED dataset:

allennlp train --file-friendly-logging --include-package kb.include_all \
        training_config/downstream/tacred.jsonnet -s OUTPUT_DIRECTORY

I met several errors and was able to resolve some of them, but I am not sure how to deal with the error below:

Traceback (most recent call last):
File "/home/xu_lu/.local/bin/allennlp", line 10, in <module>
sys.exit(run())
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/run.py", line 18, in run
main(prog="allennlp")
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/commands/__init__.py", line 102, in main
args.func(args)
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/commands/train.py", line 116, in train_model_from_args
args.cache_prefix)
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/commands/train.py", line 160, in train_model_from_file
cache_directory, cache_prefix)
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/commands/train.py", line 226, in train_model
validation_iterator=pieces.validation_iterator)
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/training/trainer.py", line 725, in from_params
params.assert_empty(cls.__name__)
File "/home/xu_lu/.local/lib/python3.7/site-packages/allennlp/common/params.py", line 405, in assert_empty
raise ConfigurationError("Extra parameters passed to {}: {}".format(class_name, self.params))
allennlp.common.checks.ConfigurationError: "Extra parameters passed to Trainer: {'gradient_accumulation_batch_size': 32}"
2020-08-20 15:59:20,126 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmprz5qhl43

My current environment has passed the tests and I am able to run the example to embed the sentences.

Getting hundreds of Warning messages

Hi, when I run the following command:
allennlp train --file-friendly-logging --include-package kb.include_all \
        training_config/downstream/wic.jsonnet -s OUTPUT_DIRECTORY

I get hundreds of Warning messages like:

/opt/conda/conda-bld/pytorch_1565287148058/work/aten/src/ATen/native/cuda/LegacyDefinitions.cpp:14: UserWarning: masked_fill_ received a mask with dtype torch.uint8, this behavior is now deprecated,please use a mask with dtype torch.bool instead.

This is annoying; any insights on how to avoid this problem?

RuntimeError: received 0 items of ancdata

Hello,

Could you please tell me the reason for this error and how to solve it?
It appears shortly after starting to pre-train a language model, then training stops.

Number of workers: 8
Number of corpus files: 8
torch version: 1.2.0

Thanks for the help.

allennlp fine-tune doesn't work on tacred dataset

OUTPUT_DIRECTORY=output_tacred
DATA=training_config/downstream/tacred.jsonnet
MODEL=knowbert_wiki_model

allennlp fine-tune --file-friendly-logging --include-package kb.include_all \
    -m $MODEL \
    -c $DATA \
    -s $OUTPUT_DIRECTORY

=======================================================
I have a question about fine-tuning KnowBert-Wiki on TACRED (I tried the above). I didn't modify anything in OUTPUT_DIRECTORY, DATA, or MODEL (already unzipped). However, it always returns the error below: "/knowbert/lib/python3.6/site-packages/pytorch_pretrained_bert/modeling.py", line 592, in from_pretrained archive.extractall(tempdir)".
Should I modify something when I fine-tune KnowBert-Wiki?

Wrong citation

Navigli et al. (2017), in both the paper and the repository, should be Raganato et al. (2017)!

Error when trying to run the 'train code'

I am trying to run the code of a paper, KnowledgeMiningWithSceneText by Leojc, which uses KnowBert, and faced an error while executing the training command.

The error I got is as follows:

(vit_kb) tih_isi_1@tih:~/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText$ python main.py -c configs/train_knowbert_attention_activity.toml
[2024-01-08 16:35:30,109][RANK=00][I]: unknown_args=[] 	[main.py:114]
Traceback (most recent call last):
  File "main.py", line 121, in <module>
    main()
  File "main.py", line 116, in main
    import train_knowbert
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/train_knowbert.py", line 17, in <module>
    from model.vit_knowbert_interaction_timm import Net as NetWithAttention
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/model/vit_knowbert_interaction_timm.py", line 14, in <module>
    from kb.include_all import ModelArchiveFromParams
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/kb/kb/include_all.py", line 3, in <module>
    from kb.entity_linking import TokenCharactersIndexerTokenizer
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/kb/kb/entity_linking.py", line 72, in <module>
    from allennlp.data.dataset import Batch
ModuleNotFoundError: No module named 'allennlp.data.dataset'

I solved this by replacing "from allennlp.data.dataset import Batch" with "from allennlp.data import Batch".

Then I faced:

(vit_kb) tih_isi_1@tih:~/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText$ python main.py -c configs/train_knowbert_attention_activity.toml
[2024-01-09 12:14:17,782][RANK=00][I]: unknown_args=[] 	[main.py:114]
Traceback (most recent call last):
  File "main.py", line 121, in <module>
    main()
  File "main.py", line 116, in main
    import train_knowbert
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/train_knowbert.py", line 17, in <module>
    from model.vit_knowbert_interaction_timm import Net as NetWithAttention
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/model/vit_knowbert_interaction_timm.py", line 14, in <module>
    from kb.include_all import ModelArchiveFromParams
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/kb/kb/include_all.py", line 3, in <module>
    from kb.entity_linking import TokenCharactersIndexerTokenizer
  File "/home/tih_isi_1/Archisman/Fine-Grained Recognition/KnowledgeMiningWithSceneText/kb/kb/entity_linking.py", line 82, in <module>
    from allennlp.data.iterators import DataIterator
ModuleNotFoundError: No module named 'allennlp.data.iterators'

But now the issue is that allennlp.data.iterators has been removed entirely in recent allennlp versions. Do you know how I can solve this error?

I appreciate any help you can provide.
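There is no one-line substitution this time: kb was written against the pinned pre-1.0 allennlp fork, and allennlp >= 1.0 removed DataIterator in favour of PyTorch DataLoaders, so the reliable path is installing the pinned fork from requirements.txt. A guarded import makes the situation explicit (a probe, not a port):

# probe which allennlp generation is installed
try:
    from allennlp.data.iterators import DataIterator  # allennlp < 1.0 (the pinned fork)
except ModuleNotFoundError:
    DataIterator = None  # removed in allennlp >= 1.0; kb's iterator code predates this

Porting kb to a modern allennlp would mean rewriting every iterator use around DataLoader, well beyond the Batch import fix above.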

json.decoder.JSONDecodeError

Hi, I am trying to use your code and receive the following error:

/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/sklearn/utils/linear_assignment_.py:22: FutureWarning: The linear_assignment_ module is deprecated in 0.21 and will be removed from 0.23. Use scipy.optimize.linear_sum_assignment instead.
FutureWarning)
/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/token_indexers/token_characters_indexer.py:51: UserWarning: You are using the default value (0) of min_padding_length, which can cause some subtle bugs (more info see allenai/allennlp#1954). Strongly recommend to set a value, usually the maximum size of the convolutional layer size when using CnnEncoder.
UserWarning)
Traceback (most recent call last):
  File "KnowBERT_run.py", line 13, in <module>
    batcher = KnowBertBatchifier(archive_file)
  File "/home/cheshani/kb-master/kb/knowbert_utils.py", line 61, in __init__
    from_params(Params(candidate_generator_params))
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 300, in from_params
    kwargs = create_kwargs(cls, params, **extras)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 194, in create_kwargs
    value_dict[key] = value_cls.from_params(params=value_params, **extras)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/from_params.py", line 302, in from_params
    return cls(**kwargs)  # type: ignore
  File "/home/cheshani/kb-master/kb/wiki_linking_util.py", line 153, in __init__
    self.entity_world = json.load(open(entity_world_path))
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/json/__init__.py", line 299, in load
    parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/json/__init__.py", line 354, in loads
    return _default_decoder.decode(s)
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/json/decoder.py", line 339, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/home/cheshani/anaconda3/envs/knowbert/lib/python3.6/json/decoder.py", line 355, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Unterminated string starting at: line 13438 column 5 (char 522455)
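An "Unterminated string" at a fixed character offset usually points to a truncated download of the candidate-entity JSON rather than a code bug (an assumption). Re-parsing the cached file in isolation confirms whether the file itself is damaged; the filename below is illustrative:

# raises the same JSONDecodeError if the cached file is truncated
import json

with open("entity_world.json") as f:  # illustrative path to the cached file
    json.load(f)

If that fails, delete the cached copy so it is fetched again on the next run.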

Simple Classification Predictor

I trained a model for simple text classification, but I can't run prediction with it: there is no text-classification predictor, and the built-in one can't be used because of conflicts between allennlp package versions.

Any ideas?

What is entity_world_path used for ?

Hello,

I would like to know what entity_world_path in the class WikiCandidateMentionGenerator is used for.

I don't have labelled entity-linking data, so I'm training the entity linker jointly with the language model, providing the candidate entity data through the candidates_file argument of the same class.
Is that correct?

Thanks in advance.

unable to install the environment

Hello, could you give the versions of these Python packages?
Especially "git+git://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba", which I can't install directly.
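A likely fix: GitHub disabled the unauthenticated git:// protocol in 2022, so the git+git:// form of that pin no longer works anywhere. Installing the same commit over https, i.e. git+https://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba, either directly with pip or by editing the URL scheme in requirements.txt, should resolve it.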

Loading the AIDA dataset

First, thank you for making your code available!

When running bin/evaluate_wiki_linking.py, I got the error

allennlp.common.checks.ConfigurationError: "aida_wiki_linking not in acceptable choices for type: ['ccgbank', 'conll2003', 'conll2000', 'ontonotes_ner', 'coref', 'winobias', 'event2mind', 'interleaving', 'language_modeling', 'multiprocess', 'ptb_trees', 'drop', 'squad', 'quac', 'triviaqa', 'qangaroo', 'srl', 'semantic_dependencies', 'seq2seq', 'sequence_tagging', 'snli', 'universal_dependencies', 'sst_tokens', 'quora_paraphrase', 'atis', 'nlvr', 'wikitables', 'template_text2sql', 'grammar_based_text2sql', 'quarel', 'simple_language_modeling', 'babi', 'copynet_seq2seq', 'text_classification_json']"

The tests pass:

pytest -v tests

Did I miss anything? Thank you very much!
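A guess at the cause: aida_wiki_linking is one of kb's custom dataset readers, and those register with allennlp only when the package is imported, which is why the training commands all pass --include-package kb.include_all. Make sure the script runs in the environment where pip install --editable . succeeded, and that the readers are registered before the config is resolved:

# the import alone registers kb's readers with allennlp's registry
import kb.include_all  # noqa: F401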

Unable to install environment

Hello! I am having trouble installing the environment for KnowBert.

I am running this on a MacBook Air M1, macOS Ventura 13.2.1.

I cloned the allenai/kb repository, then ran the following steps:

cd kb
conda create -n knowbert
conda activate knowbert
conda config --env --set subdir osx-64
conda install -c ehmoussi python
pip install torch==1.2.0
pip install -r requirements.txt

I had to create the conda environment without specifying the Python version, since 3.6.7 is not available with the current version of conda. The two subsequent lines then allowed Python 3.6.7 to be installed, with the correct version downloaded from anaconda.org.

At the last step above (installing the requirements), I encountered the following error:

ERROR: Package 'protobuf' requires a different Python: 3.6.7 not in '>=3.7'

In an attempt to resolve this, I recreated the environment using the steps above, but added export MACOSX_DEPLOYMENT_TARGET=10.14 before installing the requirements and pinned a protobuf version in requirements.txt. The file now looks like this:

# this is the fp16_e_s3 branch
git+https://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba
pytest==6.2.2
nltk==3.5
ipython==7.16.1
spacy==2.0.18
torch==1.2.0
#added
protobuf==4.0.0rc1

I pinned the spacy version per issue #39, as not doing so yields ERROR: allennlp 0.8.3-unreleased has requirement spacy<2.1,>=2.0, but you'll have spacy 3.6.1 which is incompatible.

However, I then got the following error: ERROR: spacy 2.0.18 has requirement regex==2018.01.10, but you'll have regex 2023.8.8 which is incompatible. If I then add regex==2018.01.10 to requirements.txt, I get: ERROR: parsimonious 0.10.0 has requirement regex>=2022.3.15, but you'll have regex 2018.1.10 which is incompatible.

In both cases, the error is accompanied by the message Successfully built allennlp.

If I run the following commands, they complete successfully and I can proceed to testing:

python -c "import nltk; nltk.download('wordnet')"
python -m spacy download en_core_web_sm
pip install --editable .

However, the tests fail with the following errors:

========================================================= ERRORS ==========================================================
_________________________________ ERROR collecting tests/test_bert_pretraining_reader.py __________________________________
tests/test_bert_pretraining_reader.py:2: in <module>
    from kb.bert_pretraining_reader import BertPreTrainingReader, \
kb/bert_pretraining_reader.py:10: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:3: in <module>
    from allennlp.data.fields.field import DataArray, Field
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
__________________________ ERROR collecting tests/test_bert_tokenizer_and_candidate_generator.py __________________________
tests/test_bert_tokenizer_and_candidate_generator.py:4: in <module>
    from kb.bert_tokenizer_and_candidate_generator import BertTokenizerAndCandidateGenerator
kb/bert_tokenizer_and_candidate_generator.py:6: in <module>
    from allennlp.data.fields import Field, TextField, ListField, SpanField, ArrayField, LabelField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
__________________________________________ ERROR collecting tests/test_common.py __________________________________________
tests/test_common.py:6: in <module>
    from kb.common import F1Metric
kb/common.py:11: in <module>
    from allennlp.training.metrics.metric import Metric
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/__init__.py:1: in <module>
    from allennlp.training.trainer import Trainer
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/trainer.py:19: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
________________________________________ ERROR collecting tests/test_dict_field.py ________________________________________
tests/test_dict_field.py:4: in <module>
    from allennlp.data import Token, Vocabulary
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
______________________________________ ERROR collecting tests/test_entity_linking.py ______________________________________
tests/test_entity_linking.py:7: in <module>
    from allennlp.data import TokenIndexer, Vocabulary, Token
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_______________________________________ ERROR collecting tests/test_kg_embedding.py _______________________________________
tests/test_kg_embedding.py:5: in <module>
    from kb.kg_embedding import KGTupleReader, get_labels_tensor_from_indices, \
kb/kg_embedding.py:8: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_____________________________________ ERROR collecting tests/test_kg_probe_reader.py ______________________________________
tests/test_kg_probe_reader.py:5: in <module>
    from allennlp.data import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_________________________________________ ERROR collecting tests/test_knowbert.py _________________________________________
tests/test_knowbert.py:8: in <module>
    from allennlp.data import Vocabulary, DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_________________________________________ ERROR collecting tests/test_metrics.py __________________________________________
tests/test_metrics.py:5: in <module>
    from kb.metrics import MeanReciprocalRank, MicroF1
kb/metrics.py:4: in <module>
    from allennlp.training.metrics.metric import Metric
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/__init__.py:1: in <module>
    from allennlp.training.trainer import Trainer
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/trainer.py:19: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
________________________________________ ERROR collecting tests/test_multitask.py _________________________________________
tests/test_multitask.py:5: in <module>
    from allennlp.data.dataset_readers import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
____________________________________ ERROR collecting tests/test_self_attn_iterator.py ____________________________________
tests/test_self_attn_iterator.py:6: in <module>
    from kb.self_attn_bucket_iterator import SelfAttnBucketIterator
kb/self_attn_bucket_iterator.py:11: in <module>
    from allennlp.data.dataset import Batch
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
___________________________________ ERROR collecting tests/test_span_attention_layer.py ___________________________________
tests/test_span_attention_layer.py:5: in <module>
    from kb.span_attention_layer import SpanAttentionLayer, SpanWordAttention
kb/span_attention_layer.py:7: in <module>
    from kb.common import get_dtype_for_module, extend_attention_mask_for_bert, get_linear_layer_init_identity, init_bert_weights
kb/common.py:11: in <module>
    from allennlp.training.metrics.metric import Metric
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/__init__.py:1: in <module>
    from allennlp.training.trainer import Trainer
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/training/trainer.py:19: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_______________________________________ ERROR collecting tests/test_wiki_reader.py ________________________________________
tests/test_wiki_reader.py:4: in <module>
    from allennlp.common.testing.test_case import AllenNlpTestCase
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/testing/__init__.py:5: in <module>
    from allennlp.common.testing.model_test_case import ModelTestCase
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/testing/model_test_case.py:7: in <module>
    from allennlp.commands.train import train_model_from_file
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/commands/__init__.py:8: in <module>
    from allennlp.commands.configure import Configure
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/commands/configure.py:27: in <module>
    from allennlp.service.config_explorer import make_app
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/service/config_explorer.py:24: in <module>
    from allennlp.common.configuration import configure, choices
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/common/configuration.py:17: in <module>
    from allennlp.data.dataset_readers import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_________________________________________ ERROR collecting tests/test_wordnet.py __________________________________________
tests/test_wordnet.py:7: in <module>
    from kb.wordnet import WordNetFineGrainedSenseDisambiguationReader
kb/wordnet.py:21: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary, Tokenizer
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_______________________________ ERROR collecting tests/evaluation/test_semeval2010_task8.py _______________________________
tests/evaluation/test_semeval2010_task8.py:3: in <module>
    from kb.include_all import SemEval2010Task8Reader, SemEval2010Task8Metric
kb/include_all.py:2: in <module>
    from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:8: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_______________________________ ERROR collecting tests/evaluation/test_simple_classifier.py _______________________________
tests/evaluation/test_simple_classifier.py:3: in <module>
    from kb.include_all import SimpleClassifier, F1Metric
kb/include_all.py:2: in <module>
    from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:8: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_________________________________ ERROR collecting tests/evaluation/test_tacred_reader.py _________________________________
tests/evaluation/test_tacred_reader.py:5: in <module>
    from allennlp.data import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/__init__.py:1: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/__init__.py:10: in <module>
    from allennlp.data.dataset_readers.ccgbank import CcgBankDatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/ccgbank.py:9: in <module>
    from allennlp.data.dataset_readers.dataset_reader import DatasetReader
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/dataset_readers/dataset_reader.py:4: in <module>
    from allennlp.data.instance import Instance
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/instance.py:4: in <module>
    from allennlp.data.fields import TextField, ListField, SpanField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/__init__.py:7: in <module>
    from allennlp.data.fields.array_field import ArrayField
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:10: in <module>
    class ArrayField(Field[numpy.ndarray]):
../miniconda3/envs/knowbert/lib/python3.6/site-packages/allennlp/data/fields/array_field.py:45: in ArrayField
    @overrides
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:83: in overrides
    return _overrides(method, check_signature, check_at_runtime)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:170: in _overrides
    _validate_method(method, super_class, check_signature)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/overrides.py:189: in _validate_method
    ensure_signature_is_compatible(super_method, method, is_static)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:102: in ensure_signature_is_compatible
    ensure_return_type_compatibility(super_type_hints, sub_type_hints, method_name)
../miniconda3/envs/knowbert/lib/python3.6/site-packages/overrides/signature.py:303: in ensure_return_type_compatibility
    f"{method_name}: return type `{sub_return}` is not a `{super_return}`."
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
_______________________________ ERROR collecting tests/evaluation/test_ultra_fine_reader.py _______________________________
tests/evaluation/test_ultra_fine_reader.py:5: in <module>
    from allennlp.data import DatasetReader, DataIterator, Vocabulary
    [... identical allennlp/overrides import chain as above ...]
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
__________________________________ ERROR collecting tests/evaluation/test_wic_reader.py ___________________________________
tests/evaluation/test_wic_reader.py:3: in <module>
    from kb.include_all import WicDatasetReader
kb/include_all.py:2: in <module>
    from kb.kg_embedding import KGTupleReader, KGTupleModel
kb/kg_embedding.py:8: in <module>
    from allennlp.data import DatasetReader, Token, Vocabulary
    [... identical allennlp/overrides import chain as above ...]
E   TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field.Field`.
==================================================== warnings summary =====================================================
../miniconda3/envs/knowbert/lib/python3.6/site-packages/plac_ext.py:6
  /Users/ytchan/miniconda3/envs/knowbert/lib/python3.6/site-packages/plac_ext.py:6: DeprecationWarning: the imp module is deprecated in favour of importlib; see the module's documentation for alternative uses
    import imp

-- Docs: https://docs.pytest.org/en/stable/warnings.html
================================================= short test summary info =================================================
ERROR tests/test_bert_pretraining_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.d...
ERROR tests/test_bert_tokenizer_and_candidate_generator.py - TypeError: ArrayField.empty_field: return type `None` is no...
ERROR tests/test_common.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field....
ERROR tests/test_dict_field.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.fi...
ERROR tests/test_entity_linking.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.field...
ERROR tests/test_kg_embedding.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields....
ERROR tests/test_kg_probe_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fiel...
ERROR tests/test_knowbert.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.fiel...
ERROR tests/test_metrics.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field...
ERROR tests/test_multitask.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.fie...
ERROR tests/test_self_attn_iterator.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.f...
ERROR tests/test_span_attention_layer.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data...
ERROR tests/test_wiki_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.f...
ERROR tests/test_wordnet.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.data.fields.field...
ERROR tests/evaluation/test_semeval2010_task8.py - TypeError: ArrayField.empty_field: return type `None` is not a `allen...
ERROR tests/evaluation/test_simple_classifier.py - TypeError: ArrayField.empty_field: return type `None` is not a `allen...
ERROR tests/evaluation/test_tacred_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp....
ERROR tests/evaluation/test_ultra_fine_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allen...
ERROR tests/evaluation/test_wic_reader.py - TypeError: ArrayField.empty_field: return type `None` is not a `allennlp.dat...
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 19 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================= 1 warning, 19 errors in 16.54s ==============================================

I am wondering if I did anything wrong. I would love your thoughts on how I can get the tests to pass. Thanks!

Additional note: not sure if this is relevant, but it seems I am installing AllenNLP 0.8.3-unreleased instead of 0.8.2, as discussed in issue #39.
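
A note for anyone hitting the same collection failure: the TypeError originates in the third-party overrides package, whose newer releases enforce return-type checks that AllenNLP 0.8.x predates. A likely workaround, assuming the package version is indeed the culprit, is to pin an older release (e.g. pip install "overrides==3.1.0") and re-run the tests. The exact pin is an assumption based on similar reports against old AllenNLP versions, not something verified against this repository; any release predating the signature checks should behave the same.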

Problem with evaluating tacred

Hi, I tried your code for evaluating TACRED, but ran into a bizarre error saying that in kb/multitask.py, line 66, file_path needs to be Dict[str, str] but got a str. I checked the file_path and it is the same as the evaluation_file you outlined (in my case tacred_test.json). It should be a str, right? Can you help me with this?
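
To illustrate what the error is asking for: the reader in kb/multitask.py dispatches on a dataset name, so (going by the error message itself) its read() wants a dict mapping name to path rather than a bare string. A minimal sketch, assuming you are calling the reader directly and that "tacred" is the key your config uses (both assumptions):

    # hypothetical: wrap the bare path in a dict keyed by dataset name
    instances = reader.read({"tacred": "tacred_test.json"})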

kg_tuple is not a registered name for Model

I am running the following command (from here) to extract the TuckER embeddings for the synsets from the trained model:

python bin/combine_wordnet_embeddings.py --extract_tucker --tucker_archive_file $WORKDIR/wordnet_tucker/model.tar.gz --vocab_file $WORKDIR/wordnet_synsets_mask_null_vocab.txt --tucker_hdf5_file $WORKDIR/tucker_embeddings.hdf5

However, it shows the error: allennlp.common.checks.ConfigurationError: 'kg_tuple is not a registered name for Model'. Could you tell me what I should do?
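
In case it helps while waiting for an answer: in AllenNLP, a "not a registered name" error usually means the module containing the corresponding @Model.register(...) decorator was never imported. A minimal sketch, assuming the registration lives in this repository's kb package (the script normally handles this itself, so the import mainly matters if you are loading the archive from your own code):

    # importing kb.include_all runs the @Model.register(...) decorators,
    # which should make "kg_tuple" resolvable when the archive is loaded
    import kb.include_all  # noqa: F401

If the import itself fails, check that the kb package installed cleanly into the active environment.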

Different F1 scores with the same hyperparameters and same random seeds

Hello,

I trained a KnowBert language model on some medical data and want to fine-tune it on a relation classification task. However, I get different micro/macro F1 scores every time I fine-tune it, even though I keep the same hyperparameters and the same random seeds.

Does anybody know what the problem could be?

Thanks in advance,
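
A note in case it is useful: fixed seeds alone do not guarantee reproducibility on GPU, because some cuDNN kernels are nondeterministic. Below is a minimal sketch of the standard PyTorch determinism settings; this is generic PyTorch advice, not something specific to this repository, and the seed values are arbitrary examples:

    import random
    import numpy
    import torch

    # seed every RNG that training touches
    random.seed(1)
    numpy.random.seed(1)
    torch.manual_seed(1)
    torch.cuda.manual_seed_all(1)

    # force deterministic cuDNN kernels (slower, but reproducible)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False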

Can't seem to replicate perplexity for KnowBert-Wiki & KnowBert-W+W

Hello AllenAI team,

First of all, I'd like to say I find this approach brilliant and fascinating and I would like to thank you for your work.

I have been experimenting with KnowBert for a while now, and I regularly run into perplexity issues with my custom variations of KnowBert. Without access to the original "pre-training" Wikipedia + books corpus (or, as I have come to call it, the "re-training" corpus, to avoid confusing the "pre-training" phase that integrates the KAR into BERT with the actual pre-training of BERT itself), I had not gotten around to attempting to replicate the perplexity results your paper reports for freshly (p)retrained KnowBert models.

I finally decided to attempt to replicate the results by gathering my own version of the Wikipedia + books corpus. I subsequently trained the KARs and (p)retrained the KnowBert-Wordnet, KnowBert-Wiki, and KnowBert-W+W models using slightly modified versions of the Jsonnet files in training_config/pretraining/ (the only modification being URLs changed to local paths pointing to pre-downloaded files, so that training can run offline).

I ran training with the following command:

allennlp train -s $OUTPUT_DIRECTORY --file-friendly-logging --include-package kb.include_all training_config/pretraining/knowbert_<variant[_linker]>_offline.jsonnet

I then evaluated perplexity of the retrained models on a heldout Wiki+Books shard using:

python bin/evaluate_perplexity.py -m </path/to/file>/model.tar.gz -e </path/to/file>/shard_heldout.txt

I also fine-tuned the models on an NER task to see what the impact of high perplexity is on downstream tasks.

Here are the perplexity and NER F1 results for each version of KnowBert (my reproductions alongside the numbers reported in the paper):

Model                        PPL       NER F1
KnowBert-Wordnet (mine)      4.8       0.84
KnowBert-Wordnet (AllenAI)   4.1       N/A
KnowBert-Wiki (mine)         27,833.6  0.00
KnowBert-Wiki (AllenAI)      4.3       N/A
KnowBert-W+W (mine)          13,760.4  0.80
KnowBert-W+W (AllenAI)       3.5       N/A

As you can see, the perplexity and performance of my reproduction of KnowBert-Wordnet are consistent with the paper, but I cannot get the Wiki and W+W editions to behave as expected. The perplexities of KnowBert-Wiki and KnowBert-W+W are actually more consistent with those of my custom KnowBert models. The NER performance is also puzzling, as the fine-tuned KnowBert-Wiki did not produce a single true positive in evaluation (it did, however, produce a few in validation).

My criterion for stopping the (p)retraining is running out of allocated computation time, i.e. 7 days of training on a cluster equipped with Nvidia V100 GPUs.

Can you discern anything I might be doing incorrectly? I haven't re-run the experiments many times, given how computationally expensive they are. Given the KAR's end-of-training report (see below), I would not think my Wikipedia KAR is simply stuck in a really bad local optimum.

I would greatly appreciate any help or pointers to figure out what the source of my inability to replicate your results might be.
Thank you for your time, and I wish you continued success in your ongoing endeavours.
-- Guy

Annex: Wikipedia KAR end-of-training report excerpt:

  "training_wiki_el_recall": 0.9242674921278753,
  "training_wiki_el_f1": 0.9535818512195965,
  "training_wiki_span_precision": 0.9903611032129656,
  "training_wiki_span_recall": 0.9294710999626408,
  "training_wiki_span_f1": 0.9589504983205271,
  "training_loss": 0.00021481883407398835,```

[Freshman] How to apply this model to a binary classification task

I am a new student in the NLP domain. I have read your paper and want to apply your model to a binary classification task. There are some questions I am facing:
1. If I want to run one of the demos you've provided, I don't know which file to run first.
2. If I want to apply your model to binary sentence classification, I don't know how to use your model to produce embeddings.

Sorry to disturb you with such easy questions; I am looking forward to your reply. Thank you in advance.

Binary Text Classification

How do I fine-tune KnowBERT for binary text classification tasks?

For example, given a sequence of words, predict its polarity.

Thanks!
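
Not an official recipe, but one way to sketch this on top of the programmatic embedding API: run the batcher and model to get contextual_embeddings, pool the first ([CLS]) token, and train a small classification head on it. The pooling choice and head below are assumptions about one reasonable design, not the repository's method:

    import torch

    # `model` and `batcher` loaded as in the README's embedding example
    classifier = torch.nn.Linear(768, 2)  # 768 = BERT-base hidden size;
                                          # check embed_dim on your archive

    def classify(sentences):
        logits = []
        for batch in batcher.iter_batches(sentences, verbose=False):
            output = model(**batch)
            # (batch_size, seq_len, 768) -> take the first ([CLS]) token
            cls_vectors = output['contextual_embeddings'][:, 0, :]
            logits.append(classifier(cls_vectors))
        return torch.cat(logits)

Train the head (and optionally fine-tune the encoder) with torch.nn.functional.cross_entropy against 0/1 polarity labels.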

Download link for downstream tasks

For the downstream task datasets OpenEntity, TACRED, SemEval2010 Task 8, and WiC, I can find the download links for TACRED, WiC, and SemEval2010 Task 8, but the download link for OpenEntity is inaccessible. Could you provide us with the OpenEntity dataset?

Update: the TACRED dataset (e.g., train.txt) I downloaded from the link above is not in the format expected by the processing script (which needs train.json).

Embedding sentences programmatically

Hi,

I want to produce contextual embeddings of sentences programmatically with KnowBert, but the README only talks about how to reproduce results and how to fine-tune on downstream tasks.

How can I use this repository to obtain only contextual embeddings?
