
Comments (12)

vinid commented on June 29, 2024

Hi, I solved it by following allenai/allennlp#2929; there is a related pull request.

When no credentials are found, it falls back to unsigned requests.
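
For reference, here is a minimal sketch of that fallback behaviour, assuming a publicly readable bucket: try a signed request first and retry unsigned if no credentials are configured. The `get_s3_etag` helper and its bucket/key arguments are illustrative, not part of allennlp's API; the actual fix is the linked pull request.

```python
# Minimal sketch of the "fall back to unsigned requests" behaviour described in
# allenai/allennlp#2929. The helper name and bucket/key are placeholders.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import NoCredentialsError

def get_s3_etag(bucket: str, key: str) -> str:
    try:
        # Signed request: works when AWS credentials are configured.
        return boto3.resource("s3").Object(bucket, key).e_tag
    except NoCredentialsError:
        # No credentials found: retry anonymously, which is sufficient for
        # publicly readable buckets like the allennlp one.
        unsigned = boto3.resource("s3", config=Config(signature_version=UNSIGNED))
        return unsigned.Object(bucket, key).e_tag
```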

matt-peters commented on June 29, 2024

Thanks for the fix -- I updated the upstream branch of allennlp to include this commit, see #14. You can fix it by upgrading allennlp with `pip install git+git://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba`.

matt-peters commented on June 29, 2024

I'd suggest building a new environment to install the dependencies. It looks like you are trying to install the code in an existing environment with tensorflow and other packages that require versions incompatible with some of the KnowBert dependencies. The README includes instructions on how to build a new conda environment and install the code.

matt-peters commented on June 29, 2024

It is the s3 references. You can replace any occurrence of s3://allennlp with https://allennlp.s3-us-west-2.amazonaws.com and it should work.
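
If there are many files to edit, a small helper along these lines can do the substitution; this is only a sketch, and `config.jsonnet` is a placeholder filename.

```python
# Sketch: rewrite s3://allennlp references in a config file to the
# equivalent public https URLs suggested above.
from pathlib import Path

S3_PREFIX = "s3://allennlp"
HTTPS_PREFIX = "https://allennlp.s3-us-west-2.amazonaws.com"

def rewrite_s3_references(config_path: str) -> None:
    path = Path(config_path)
    path.write_text(path.read_text().replace(S3_PREFIX, HTTPS_PREFIX))

# Example usage (placeholder filename):
# rewrite_s3_references("config.jsonnet")
```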

matt-peters commented on June 29, 2024

FYI, the configuration files in training_config/ on master now use the https URLs instead of s3.

NooneBug commented on June 29, 2024

I am also attaching the traceback:

```
<ipython-input-4-0df14ba8d78f> in <module>
      1 archive_file = 'https://allennlp.s3-us-west-2.amazonaws.com/knowbert/models/knowbert_wiki_wordnet_model.tar.gz'
      2 params = Params({"archive_file": archive_file})
----> 3 model = ModelArchiveFromParams.from_params(params=params)
      4 batcher = KnowBertBatchifier(archive_file)

~/knowBert/kb/kb/include_all.py in from_params(cls, vocab, params)
     48             archive = load_archive(archive_file, overrides=json.dumps({'model': overrides.as_dict()}))
     49         else:
---> 50             archive = load_archive(archive_file)
     51         return archive.model
     52 

~/knowBert/venv/lib/python3.7/site-packages/allennlp/models/archival.py in load_archive(archive_file, cuda_device, overrides, weights_file)
    228                        weights_file=weights_path,
    229                        serialization_dir=serialization_dir,
--> 230                        cuda_device=cuda_device)
    231 
    232     return Archive(model=model, config=config)

~/knowBert/venv/lib/python3.7/site-packages/allennlp/models/model.py in load(cls, config, serialization_dir, weights_file, cuda_device)
    327         # This allows subclasses of Model to override _load.
    328         # pylint: disable=protected-access
--> 329         return cls.by_name(model_type)._load(config, serialization_dir, weights_file, cuda_device)
    330 
    331     def extend_embedder_vocab(self, embedding_sources_mapping: Dict[str, str] = None) -> None:

~/knowBert/venv/lib/python3.7/site-packages/allennlp/models/model.py in _load(cls, config, serialization_dir, weights_file, cuda_device)
    265         # want the code to look for it, so we remove it from the parameters here.
    266         remove_pretrained_embedding_params(model_params)
--> 267         model = Model.from_params(vocab=vocab, params=model_params)
    268 
    269         # If vocab+embedding extension was done, the model initialized from from_params

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    287                 extras = {k: v for k, v in extras.items() if takes_arg(subclass.from_params, k)}
    288 
--> 289             return subclass.from_params(params=params, **extras)
    290         else:
    291             # This is not a base class, so convert our params and extras into a dict of kwargs.

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    298             else:
    299                 # This class has a constructor, so create kwargs for it.
--> 300                 kwargs = create_kwargs(cls, params, **extras)
    301 
    302             return cls(**kwargs)  # type: ignore

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in create_kwargs(cls, params, **extras)
    192 
    193             for key, value_params in params.pop(name, Params({})).items():
--> 194                 value_dict[key] = value_cls.from_params(params=value_params, **extras)
    195 
    196             kwargs[name] = value_dict

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    287                 extras = {k: v for k, v in extras.items() if takes_arg(subclass.from_params, k)}
    288 
--> 289             return subclass.from_params(params=params, **extras)
    290         else:
    291             # This is not a base class, so convert our params and extras into a dict of kwargs.

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    298             else:
    299                 # This class has a constructor, so create kwargs for it.
--> 300                 kwargs = create_kwargs(cls, params, **extras)
    301 
    302             return cls(**kwargs)  # type: ignore

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in create_kwargs(cls, params, **extras)
    157                     kwargs[name] = annotation.by_name(subparams)()
    158                 else:
--> 159                     kwargs[name] = annotation.from_params(params=subparams, **subextras)
    160             elif not optional:
    161                 # Not optional and not supplied, that's an error!

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    287                 extras = {k: v for k, v in extras.items() if takes_arg(subclass.from_params, k)}
    288 
--> 289             return subclass.from_params(params=params, **extras)
    290         else:
    291             # This is not a base class, so convert our params and extras into a dict of kwargs.

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    298             else:
    299                 # This class has a constructor, so create kwargs for it.
--> 300                 kwargs = create_kwargs(cls, params, **extras)
    301 
    302             return cls(**kwargs)  # type: ignore

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in create_kwargs(cls, params, **extras)
    157                     kwargs[name] = annotation.by_name(subparams)()
    158                 else:
--> 159                     kwargs[name] = annotation.from_params(params=subparams, **subextras)
    160             elif not optional:
    161                 # Not optional and not supplied, that's an error!

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    287                 extras = {k: v for k, v in extras.items() if takes_arg(subclass.from_params, k)}
    288 
--> 289             return subclass.from_params(params=params, **extras)
    290         else:
    291             # This is not a base class, so convert our params and extras into a dict of kwargs.

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/from_params.py in from_params(cls, params, **extras)
    300                 kwargs = create_kwargs(cls, params, **extras)
    301 
--> 302             return cls(**kwargs)  # type: ignore

~/knowBert/kb/kb/wordnet.py in __init__(self, embedding_file, entity_dim, entity_file, vocab_file, entity_h5_key, dropout, pos_embedding_dim, include_null_embedding)
    710             # includes special, e.g. '@@PADDING@@' -> '@@PADDING@@'
    711             entity_to_pos = {}
--> 712             with JsonFile(cached_path(entity_file), 'r') as fin:
    713                 for node in fin:
    714                     if node['type'] == 'synset':

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/file_utils.py in cached_path(url_or_filename, cache_dir)
     96     if parsed.scheme in ('http', 'https', 's3'):
     97         # URL, so get it from the cache (downloading if necessary)
---> 98         return get_from_cache(url_or_filename, cache_dir)
     99     elif os.path.exists(url_or_filename):
    100         # File, and it exists.

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/file_utils.py in get_from_cache(url, cache_dir)
    192     # Get eTag to add to filename, if it exists.
    193     if url.startswith("s3://"):
--> 194         etag = s3_etag(url)
    195     else:
    196         response = requests.head(url, allow_redirects=True)

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/file_utils.py in wrapper(url, *args, **kwargs)
    140     def wrapper(url: str, *args, **kwargs):
    141         try:
--> 142             return func(url, *args, **kwargs)
    143         except ClientError as exc:
    144             if int(exc.response["Error"]["Code"]) == 404:

~/knowBert/venv/lib/python3.7/site-packages/allennlp/common/file_utils.py in s3_etag(url)
    156     bucket_name, s3_path = split_s3_path(url)
    157     s3_object = s3_resource.Object(bucket_name, s3_path)
--> 158     return s3_object.e_tag
    159 
    160 

~/knowBert/venv/lib/python3.7/site-packages/boto3/resources/factory.py in property_loader(self)
    337             if self.meta.data is None:
    338                 if hasattr(self, 'load'):
--> 339                     self.load()
    340                 else:
    341                     raise ResourceLoadException(

~/knowBert/venv/lib/python3.7/site-packages/boto3/resources/factory.py in do_action(self, *args, **kwargs)
    503             # instance via ``self``.
    504             def do_action(self, *args, **kwargs):
--> 505                 response = action(self, *args, **kwargs)
    506                 self.meta.data = response
    507             # Create the docstring for the load/reload mehtods.

~/knowBert/venv/lib/python3.7/site-packages/boto3/resources/action.py in __call__(self, parent, *args, **kwargs)
     81                     operation_name, params)
     82 
---> 83         response = getattr(parent.meta.client, operation_name)(**params)
     84 
     85         logger.debug('Response: %r', response)

~/knowBert/venv/lib/python3.7/site-packages/botocore/client.py in _api_call(self, *args, **kwargs)
    274                     "%s() only accepts keyword arguments." % py_operation_name)
    275             # The "self" in this scope is referring to the BaseClient.
--> 276             return self._make_api_call(operation_name, kwargs)
    277 
    278         _api_call.__name__ = str(py_operation_name)

~/knowBert/venv/lib/python3.7/site-packages/botocore/client.py in _make_api_call(self, operation_name, api_params)
    571         else:
    572             http, parsed_response = self._make_request(
--> 573                 operation_model, request_dict, request_context)
    574 
    575         self.meta.events.emit(

~/knowBert/venv/lib/python3.7/site-packages/botocore/client.py in _make_request(self, operation_model, request_dict, request_context)
    590     def _make_request(self, operation_model, request_dict, request_context):
    591         try:
--> 592             return self._endpoint.make_request(operation_model, request_dict)
    593         except Exception as e:
    594             self.meta.events.emit(

~/knowBert/venv/lib/python3.7/site-packages/botocore/endpoint.py in make_request(self, operation_model, request_dict)
    100         logger.debug("Making request for %s with params: %s",
    101                      operation_model, request_dict)
--> 102         return self._send_request(request_dict, operation_model)
    103 
    104     def create_request(self, params, operation_model=None):

~/knowBert/venv/lib/python3.7/site-packages/botocore/endpoint.py in _send_request(self, request_dict, operation_model)
    130     def _send_request(self, request_dict, operation_model):
    131         attempts = 1
--> 132         request = self.create_request(request_dict, operation_model)
    133         context = request_dict['context']
    134         success_response, exception = self._get_response(

~/knowBert/venv/lib/python3.7/site-packages/botocore/endpoint.py in create_request(self, params, operation_model)
    114                 op_name=operation_model.name)
    115             self._event_emitter.emit(event_name, request=request,
--> 116                                      operation_name=operation_model.name)
    117         prepared_request = self.prepare_request(request)
    118         return prepared_request

~/knowBert/venv/lib/python3.7/site-packages/botocore/hooks.py in emit(self, event_name, **kwargs)
    354     def emit(self, event_name, **kwargs):
    355         aliased_event_name = self._alias_event_name(event_name)
--> 356         return self._emitter.emit(aliased_event_name, **kwargs)
    357 
    358     def emit_until_response(self, event_name, **kwargs):

~/knowBert/venv/lib/python3.7/site-packages/botocore/hooks.py in emit(self, event_name, **kwargs)
    226                  handlers.
    227         """
--> 228         return self._emit(event_name, kwargs)
    229 
    230     def emit_until_response(self, event_name, **kwargs):

~/knowBert/venv/lib/python3.7/site-packages/botocore/hooks.py in _emit(self, event_name, kwargs, stop_on_response)
    209         for handler in handlers_to_call:
    210             logger.debug('Event %s: calling handler %s', event_name, handler)
--> 211             response = handler(**kwargs)
    212             responses.append((handler, response))
    213             if stop_on_response and response is not None:

~/knowBert/venv/lib/python3.7/site-packages/botocore/signers.py in handler(self, operation_name, request, **kwargs)
     88         # this method is invoked to sign the request.
     89         # Don't call this method directly.
---> 90         return self.sign(operation_name, request)
     91 
     92     def sign(self, operation_name, request, region_name=None,

~/knowBert/venv/lib/python3.7/site-packages/botocore/signers.py in sign(self, operation_name, request, region_name, signing_type, expires_in, signing_name)
    158                     raise e
    159 
--> 160             auth.add_auth(request)
    161 
    162     def _choose_signer(self, operation_name, signing_type, context):

~/knowBert/venv/lib/python3.7/site-packages/botocore/auth.py in add_auth(self, request)
    355     def add_auth(self, request):
    356         if self.credentials is None:
--> 357             raise NoCredentialsError
    358         datetime_now = datetime.datetime.utcnow()
    359         request.context['timestamp'] = datetime_now.strftime(SIGV4_TIMESTAMP)

NoCredentialsError: Unable to locate credentials
```

Lee4396 commented on June 29, 2024

Hi, I am getting the same error here. Could you tell me how you managed to solve it?

Hadjerkhd commented on June 29, 2024

Hello, I'm using one of your models, KnowBert. When trying to apply the fix,

Thanks for the fix -- I updated the upstream branch of allennlp to include this commit, see #14 . You can fix it by upgrading allennlp with "pip install git+git://github.com/matt-peters/allennlp.git@2d7ba1cb108428aaffe2dce875648253b44cb5ba"

I got the following incompatibility error messages:

ERROR: tensorflow 2.1.0 has requirement wrapt>=1.11.1, but you'll have wrapt 1.10.11 which is incompatible.
ERROR: tensorflow-cpu 2.1.0 has requirement wrapt>=1.11.1, but you'll have wrapt 1.10.11 which is incompatible.
ERROR: fr-core-news-sm 2.2.5 has requirement spacy>=2.2.2, but you'll have spacy 2.0.18 which is incompatible.
ERROR: fr-core-news-md 2.2.5 has requirement spacy>=2.2.2, but you'll have spacy 2.0.18 which is incompatible.
ERROR: en-core-web-sm 2.2.5 has requirement spacy>=2.2.2, but you'll have spacy 2.0.18 which is incompatible.
ERROR: astroid 2.3.1 has requirement wrapt==1.11.*, but you'll have wrapt 1.10.11 which is incompatible.

Hadjerkhd commented on June 29, 2024

Hello,
even after applying the fix (which installed cleanly, with no dependency/version conflict errors) I still get the same error. I tried both the virtual env and the default env:

```
Traceback (most recent call last):
  File "/home/hkhaldi/anaconda3/bin/allennlp", line 8, in <module>
    sys.exit(run())
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/run.py", line 18, in run
    main(prog="allennlp")
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/commands/__init__.py", line 101, in main
    args.func(args)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/commands/train.py", line 103, in train_model_from_args
    args.force)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/commands/train.py", line 136, in train_model_from_file
    return train_model(params, serialization_dir, file_friendly_logging, recover, force)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/commands/train.py", line 184, in train_model
    pieces = TrainerPieces.from_params(params, serialization_dir, recover)  # pylint: disable=no-member
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/training/trainer.py", line 850, in from_params
    all_datasets = training_util.datasets_from_params(params)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/training/util.py", line 129, in datasets_from_params
    dataset_reader = DatasetReader.from_params(params.pop('dataset_reader'))
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/common/from_params.py", line 300, in from_params
    kwargs = create_kwargs(cls, params, **extras)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/common/from_params.py", line 159, in create_kwargs
    kwargs[name] = annotation.from_params(params=subparams, **subextras)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/common/from_params.py", line 289, in from_params
    return subclass.from_params(params=params, **extras)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/allennlp/common/from_params.py", line 302, in from_params
    return cls(**kwargs)  # type: ignore
  File "./kb/bert_tokenizer_and_candidate_generator.py", line 54, in __init__
    bert_model_type, do_lower_case=do_lower_case
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/pytorch_pretrained_bert/tokenization.py", line 176, in from_pretrained
    resolved_vocab_file = cached_path(vocab_file, cache_dir=cache_dir)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/pytorch_pretrained_bert/file_utils.py", line 106, in cached_path
    return get_from_cache(url_or_filename, cache_dir)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/pytorch_pretrained_bert/file_utils.py", line 194, in get_from_cache
    etag = s3_etag(url)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/pytorch_pretrained_bert/file_utils.py", line 140, in wrapper
    return func(url, *args, **kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/pytorch_pretrained_bert/file_utils.py", line 156, in s3_etag
    return s3_object.e_tag
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/boto3/resources/factory.py", line 339, in property_loader
    self.load()
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/boto3/resources/factory.py", line 505, in do_action
    response = action(self, *args, **kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/boto3/resources/action.py", line 83, in __call__
    response = getattr(parent.meta.client, operation_name)(**params)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/client.py", line 316, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/client.py", line 613, in _make_api_call
    operation_model, request_dict, request_context)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/client.py", line 632, in _make_request
    return self._endpoint.make_request(operation_model, request_dict)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/endpoint.py", line 102, in make_request

    return self._send_request(request_dict, operation_model)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/endpoint.py", line 132, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/endpoint.py", line 116, in create_request
    operation_name=operation_model.name)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/signers.py", line 90, in handler
    return self.sign(operation_name, request)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/signers.py", line 160, in sign
    auth.add_auth(request)
  File "/home/hkhaldi/anaconda3/lib/python3.7/site-packages/botocore/auth.py", line 357, in add_auth
    raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
```

Any thoughts about that? Thanks in advance.

matt-peters commented on June 29, 2024

What command are you running? I've updated all the code/configuration files to replace references to s3 with the equivalent https requests to avoid this issue. If you are still seeing errors, I'd suggest:

  1. Make sure you are using the latest master branch.
  2. Remove any stale previously cached/downloaded files from your machine. allennlp caches downloaded files locally, possibly in ~/.allennlp/cache; clearing it will force allennlp to re-download the model archive files (see the sketch below).
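
A rough sketch of step 2 follows; the cache location is an assumption taken from the "possibly in ~/.allennlp/cache" note above, so verify the actual path on your machine before deleting anything.

```python
# Sketch: clear allennlp's local download cache so the model archives are
# re-downloaded. The cache path is assumed, not confirmed; check it first.
import shutil
from pathlib import Path

cache_dir = Path.home() / ".allennlp" / "cache"
if cache_dir.exists():
    shutil.rmtree(cache_dir)
    print(f"Removed {cache_dir}")
else:
    print(f"No cache found at {cache_dir}")
```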

Hadjerkhd commented on June 29, 2024

1. Yes, I'm on the master branch.
2. I tried removing the cached files, but it's still not working.

I'm using the following command to fine-tune KnowBert on the relation extraction task (using the SemEval2010 Task 8 dataset):

!allennlp train --file-friendly-logging --include-package kb.include_all config.jsonnet -s trained_model

using the config file:

{ "dataset_reader": { "type": "semeval2010_task8", "entity_masking": "entity_markers", "tokenizer_and_candidate_generator": { "type": "bert_tokenizer_and_candidate_generator", "bert_model_type": "s3://allennlp/knowbert/models/bert-base-uncased-tacred-entity-markers-vocab.txt", "do_lower_case": true, "entity_candidate_generators": { "wiki": { "type": "wiki", }, "wordnet": { "type": "wordnet_mention_generator", "entity_file": "s3://allennlp/knowbert/wordnet/entities.jsonl" } }, "entity_indexers": { "wiki": { "type": "characters_tokenizer", "namespace": "entity_wiki", "tokenizer": { "type": "word", "word_splitter": { "type": "just_spaces" } } }, "wordnet": { "type": "characters_tokenizer", "namespace": "entity_wordnet", "tokenizer": { "type": "word", "word_splitter": { "type": "just_spaces" } } } } } }, "iterator": { "iterator": { "type": "basic", "batch_size": 32 }, "type": "self_attn_bucket", "batch_size_schedule": "base-12gb-fp32" }, "model": { "model": { "type": "from_archive", "archive_file": "s3://allennlp/knowbert/models/knowbert_wiki_wordnet_model.tar.gz", }, "type": "simple-classifier", "bert_dim": 768, "concat_word_a_b": true, "include_cls": false, "metric_a": { "type": "semeval2010_task8_metric" }, "num_labels": 19, "task": "classification" }, "train_data_path": "train.json", #train data "validation_data_path": "dev.json", #dev data "trainer": { "cuda_device": 0, "gradient_accumulation_batch_size": 32, "learning_rate_scheduler": { "type": "slanted_triangular", "num_epochs": 3, "num_steps_per_epoch": 234.375 }, "num_epochs": 3, "num_serialized_models_to_keep": 1, "optimizer": { "type": "bert_adam", "b2": 0.98, "lr": 5e-05, "max_grad_norm": 1, "parameter_groups": [ [ [ "bias", "LayerNorm.bias", "LayerNorm.weight", "layer_norm.weight" ], { "weight_decay": 0 } ] ], "t_total": -1, "weight_decay": 0.01 }, "should_log_learning_rate": true, "validation_metric": "+f1" }, "vocabulary": { "directory_path": "s3://allennlp/knowbert/models/vocabulary_wordnet_wiki.tar.gz" } }

Could the s3:// references in the config file be the cause of this error?

Hadjerkhd commented on June 29, 2024

This:

It is the s3 references. You can replace any occurrence of s3://allennlp with https://allennlp.s3-us-west-2.amazonaws.com and it should work.

This is working for me, thank you!
