
pangeo-data / pangeo-cmip6-examples

55 stars · 12 watchers · 22 forks · 951 KB

Examples of analysis of CMIP6 data using xarray and dask

License: BSD 3-Clause "New" or "Revised" License

Languages: Jupyter Notebook 100.00%, Dockerfile 0.01%

pangeo-cmip6-examples's Introduction

Pangeo CMIP6 Examples


Examples of analysis of CMIP6 data using xarray and dask.

Try these notebooks on pangeo.binder.io.

See http://pangeo.io to learn more about the Pangeo project.

pangeo-cmip6-examples's People

Contributors

rabernat


pangeo-cmip6-examples's Issues

not using local dask config

I don't think this repo is using its .dask/config.yaml; instead it's defaulting to the Dask configs from pangeo-stacks. It would be great if someone could update this Binder setup to use the local Dask config so we can force the Dask pods onto the worker pool. We should also add:

      nodeSelector:
        dask-worker: True

to the dask-kubernetes template, like we do here: https://github.com/pangeo-data/pangeo-example-notebooks/blob/febe8ff2609d0284708115b0a589935d63d919c3/.dask/config.yaml#L21-L22
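
A minimal sketch of what picking up the local config could look like from inside a notebook, assuming the repo is checked out at ~/pangeo-cmip6-examples and that .dask/config.yaml carries a kubernetes.worker-template entry; the path and keys are assumptions for illustration, and the durable fix belongs in the Binder image / start script rather than in notebook code:

    # Hypothetical sketch: merge the repo-local .dask config over the
    # pangeo-stacks defaults and check that the nodeSelector is present.
    import os

    import dask
    import dask.config

    local_config_dir = os.path.expanduser("~/pangeo-cmip6-examples/.dask")

    # Merge the local config files over whatever dask has already loaded.
    local = dask.config.collect(paths=[local_config_dir], env={})
    dask.config.update(dask.config.config, local)

    # If the merge worked, the worker template should now pin dask pods to
    # the worker pool via the nodeSelector shown above.
    print(dask.config.get("kubernetes.worker-template.spec.nodeSelector", default=None))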

Xarray/Dask Exception in cmip6_precip_analysis

Xarray/Dask are throwing a new error in the cmip6_precip_analysis notebook.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-13-89aa342a55bc> in <module>
      3     da = da.chunk({'lat': 1, 'lon': None, 'time': None})
      4     return xr_histogram(da, bins, ['lon', 'time'], density=False)
----> 5 pr_3hr_hist = ds.pr.groupby('time.year').apply(func)
      6 pr_3hr_hist

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/groupby.py in apply(self, func, shortcut, args, **kwargs)
    572         applied = (maybe_wrap_array(arr, func(arr, *args, **kwargs))
    573                    for arr in grouped)
--> 574         return self._combine(applied, shortcut=shortcut)
    575 
    576     def _combine(self, applied, restore_coord_dims=False, shortcut=False):

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/groupby.py in _combine(self, applied, restore_coord_dims, shortcut)
    576     def _combine(self, applied, restore_coord_dims=False, shortcut=False):
    577         """Recombine the applied objects like the original."""
--> 578         applied_example, applied = peek_at(applied)
    579         coord, dim, positions = self._infer_concat_args(applied_example)
    580         if shortcut:

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/utils.py in peek_at(iterable)
    152     """
    153     gen = iter(iterable)
--> 154     peek = next(gen)
    155     return peek, itertools.chain([peek], gen)
    156 

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/groupby.py in <genexpr>(.0)
    571             grouped = self._iter_grouped()
    572         applied = (maybe_wrap_array(arr, func(arr, *args, **kwargs))
--> 573                    for arr in grouped)
    574         return self._combine(applied, shortcut=shortcut)
    575 

<ipython-input-13-89aa342a55bc> in func(da)
      2 def func(da):
      3     da = da.chunk({'lat': 1, 'lon': None, 'time': None})
----> 4     return xr_histogram(da, bins, ['lon', 'time'], density=False)
      5 pr_3hr_hist = ds.pr.groupby('time.year').apply(func)
      6 pr_3hr_hist

<ipython-input-12-9c2fe48cd1a0> in xr_histogram(data, bins, dims, **kwargs)
     10                          output_dtypes=['f8'],
     11                          output_sizes={output_dim_name: len(bins_c)},
---> 12                          vectorize=True, dask='parallelized')
     13     res[output_dim_name] = output_dim_name, bins_c
     14     res[output_dim_name].attrs.update(data.attrs)

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/computation.py in apply_ufunc(func, input_core_dims, output_core_dims, exclude_dims, vectorize, join, dataset_join, dataset_fill_value, keep_attrs, kwargs, dask, output_dtypes, output_sizes, *args)
    967                                      join=join,
    968                                      exclude_dims=exclude_dims,
--> 969                                      keep_attrs=keep_attrs)
    970     elif any(isinstance(a, Variable) for a in args):
    971         return variables_vfunc(*args)

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/computation.py in apply_dataarray_vfunc(func, signature, join, exclude_dims, keep_attrs, *args)
    215 
    216     data_vars = [getattr(a, 'variable', a) for a in args]
--> 217     result_var = func(*data_vars)
    218 
    219     if signature.num_outputs > 1:

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/computation.py in apply_variable_ufunc(func, signature, exclude_dims, dask, output_dtypes, output_sizes, keep_attrs, *args)
    562             raise ValueError('unknown setting for dask array handling in '
    563                              'apply_ufunc: {}'.format(dask))
--> 564     result_data = func(*input_data)
    565 
    566     if signature.num_outputs == 1:

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/computation.py in func(*arrays)
    556                 return _apply_blockwise(
    557                     numpy_func, arrays, input_dims, output_dims,
--> 558                     signature, output_dtypes, output_sizes)
    559         elif dask == 'allowed':
    560             pass

/srv/conda/envs/notebook/lib/python3.7/site-packages/xarray/core/computation.py in _apply_blockwise(func, args, input_dims, output_dims, signature, output_dtypes, output_sizes)
    658 
    659     return blockwise(func, out_ind, *blockwise_args, dtype=dtype,
--> 660                      concatenate=True, new_axes=output_sizes)
    661 
    662 

/srv/conda/envs/notebook/lib/python3.7/site-packages/dask/array/blockwise.py in blockwise(func, out_ind, name, token, dtype, adjust_chunks, new_axes, align_arrays, concatenate, meta, *args, **kwargs)
    231         from .utils import compute_meta
    232 
--> 233         meta = compute_meta(func, dtype, *args[::2], **kwargs)
    234     if meta is not None:
    235         return Array(graph, out, chunks, meta=meta)

/srv/conda/envs/notebook/lib/python3.7/site-packages/dask/array/utils.py in compute_meta(func, _dtype, *args, **kwargs)
    118         # with np.vectorize, such as dask.array.routines._isnonzero_vec().
    119         if isinstance(func, np.vectorize):
--> 120             meta = func(*args_meta)
    121         else:
    122             try:

/srv/conda/envs/notebook/lib/python3.7/site-packages/numpy/lib/function_base.py in __call__(self, *args, **kwargs)
   2089             vargs.extend([kwargs[_n] for _n in names])
   2090 
-> 2091         return self._vectorize_call(func=func, args=vargs)
   2092 
   2093     def _get_ufunc_and_otypes(self, func, args):

/srv/conda/envs/notebook/lib/python3.7/site-packages/numpy/lib/function_base.py in _vectorize_call(self, func, args)
   2155         """Vectorized call to `func` over positional `args`."""
   2156         if self.signature is not None:
-> 2157             res = self._vectorize_call_with_signature(func, args)
   2158         elif not args:
   2159             res = func()

/srv/conda/envs/notebook/lib/python3.7/site-packages/numpy/lib/function_base.py in _vectorize_call_with_signature(self, func, args)
   2229                             for dims in output_core_dims
   2230                             for dim in dims):
-> 2231                 raise ValueError('cannot call `vectorize` with a signature '
   2232                                  'including new output dimensions on size 0 '
   2233                                  'inputs')

ValueError: cannot call `vectorize` with a signature including new output dimensions on size 0 inputs
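
For context: the failure comes from dask computing output metadata by calling the np.vectorize-wrapped function on size-0 arrays, which numpy rejects when the gufunc signature introduces a new output dimension (here, the histogram bin axis). One possible workaround, sketched below and reconstructed only loosely from the traceback (the helper's internals, the bin-center handling, and the output dimension name are assumptions), is to drop vectorize=True and loop over the broadcast dimensions inside the wrapped function so np.vectorize is never involved:

    # Sketch of a vectorize-free variant of the notebook's xr_histogram helper.
    # Reconstructed from the traceback above; details are assumptions.
    import numpy as np
    import xarray as xr

    def xr_histogram(data, bins, dims, **kwargs):
        bins_c = 0.5 * (bins[1:] + bins[:-1])   # bin centers
        output_dim_name = "bin"                 # assumed output dimension name

        def _hist_block(a):
            # apply_ufunc moves the core dims (here lon, time) to the end,
            # so flatten them and loop over the remaining leading dims.
            if a.size == 0:
                # dask may probe with size-0 arrays while inferring metadata.
                return np.zeros(a.shape[:-len(dims)] + (len(bins_c),), dtype="f8")
            n_core = int(np.prod(a.shape[-len(dims):]))
            lead_shape = a.shape[:-len(dims)]
            flat = a.reshape(-1, n_core)
            out = np.empty((flat.shape[0], len(bins_c)), dtype="f8")
            for i, row in enumerate(flat):
                out[i], _ = np.histogram(row, bins=bins, **kwargs)
            return out.reshape(lead_shape + (len(bins_c),))

        res = xr.apply_ufunc(
            _hist_block, data,
            input_core_dims=[dims],
            output_core_dims=[[output_dim_name]],
            output_dtypes=["f8"],
            output_sizes={output_dim_name: len(bins_c)},
            dask="parallelized")
        res[output_dim_name] = output_dim_name, bins_c
        res[output_dim_name].attrs.update(data.attrs)
        return res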

Pangeo binder notebook throws exception when starting cluster

Hello! I work with @balaji-gfdl and @aradhakrishnanGFDL and I'm trying to learn more about Pangeo. I was able to launch the Pangeo binder (https://binder.pangeo.io/v2/gh/pangeo-data/pangeo_cmip6_examples/master) and start a notebook (cmip6_precip_analysis.ipynb or cmip6_PT_analysis.ipynb).

The first cell, which loads xarray, numpy, and matplotlib, runs fine. The second cell, which starts the dask cluster, fails with a Python exception. Could someone take a look? I don't think it's specific to my environment; another user I asked to try it hit the same failure.

Thank you very much,
Chris Blanton

---------------------------------------------------------------------------
ApiException                              Traceback (most recent call last)
<ipython-input-2-9ac3597415a5> in <module>
      1 from dask.distributed import Client, progress
      2 from dask_kubernetes import KubeCluster
----> 3 cluster = KubeCluster(n_workers=20)
      4 client = Client(cluster)
      5 cluster

/srv/conda/envs/notebook/lib/python3.6/site-packages/dask_kubernetes/core.py in __init__(self, pod_template, name, namespace, n_workers, host, port, env, auth, **kwargs)
    239         if n_workers:
    240             try:
--> 241                 self.scale(n_workers)
    242             except Exception:
    243                 self.cluster.close()

/srv/conda/envs/notebook/lib/python3.6/site-packages/dask_kubernetes/core.py in scale(self, n)
    394         pods = self._cleanup_terminated_pods(self.pods())
    395         if n >= len(pods):
--> 396             self.scale_up(n, pods=pods)
    397             return
    398         else:

/srv/conda/envs/notebook/lib/python3.6/site-packages/dask_kubernetes/core.py in scale_up(self, n, pods, **kwargs)
    473                     new_pods.append(
    474                         self.core_api.create_namespaced_pod(
--> 475                             self.namespace, self.pod_template
    476                         )
    477                     )

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py in create_namespaced_pod(self, namespace, body, **kwargs)
   6113             return self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)
   6114         else:
-> 6115             (data) = self.create_namespaced_pod_with_http_info(namespace, body, **kwargs)
   6116             return data
   6117 

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/apis/core_v1_api.py in create_namespaced_pod_with_http_info(self, namespace, body, **kwargs)
   6204                                         _preload_content=params.get('_preload_content', True),
   6205                                         _request_timeout=params.get('_request_timeout'),
-> 6206                                         collection_formats=collection_formats)
   6207 
   6208     def create_namespaced_pod_binding(self, name, namespace, body, **kwargs):

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/api_client.py in call_api(self, resource_path, method, path_params, query_params, header_params, body, post_params, files, response_type, auth_settings, async_req, _return_http_data_only, collection_formats, _preload_content, _request_timeout)
    332                                    body, post_params, files,
    333                                    response_type, auth_settings,
--> 334                                    _return_http_data_only, collection_formats, _preload_content, _request_timeout)
    335         else:
    336             thread = self.pool.apply_async(self.__call_api, (resource_path, method,

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/api_client.py in __call_api(self, resource_path, method, path_params, query_params, header_params, body, post_params, files, response_type, auth_settings, _return_http_data_only, collection_formats, _preload_content, _request_timeout)
    166                                      post_params=post_params, body=body,
    167                                      _preload_content=_preload_content,
--> 168                                      _request_timeout=_request_timeout)
    169 
    170         self.last_response = response_data

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/api_client.py in request(self, method, url, query_params, headers, post_params, body, _preload_content, _request_timeout)
    375                                          _preload_content=_preload_content,
    376                                          _request_timeout=_request_timeout,
--> 377                                          body=body)
    378         elif method == "PUT":
    379             return self.rest_client.PUT(url,

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/rest.py in POST(self, url, headers, query_params, post_params, body, _preload_content, _request_timeout)
    264                             _preload_content=_preload_content,
    265                             _request_timeout=_request_timeout,
--> 266                             body=body)
    267 
    268     def PUT(self, url, headers=None, query_params=None, post_params=None, body=None, _preload_content=True,

/srv/conda/envs/notebook/lib/python3.6/site-packages/kubernetes/client/rest.py in request(self, method, url, query_params, headers, body, post_params, _preload_content, _request_timeout)
    220 
    221         if not 200 <= r.status <= 299:
--> 222             raise ApiException(http_resp=r)
    223 
    224         return r

ApiException: (422)
Reason: Unprocessable Entity
HTTP response headers: HTTPHeaderDict({'Audit-Id': 'f81ccb55-ffa2-4413-9b4d-cda5ca6736b4', 'Content-Type': 'application/json', 'Date': 'Wed, 31 Jul 2019 20:29:29 GMT', 'Content-Length': '1829'})
HTTP response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Pod \"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1jh7fr\" is invalid: [metadata.generateName: Invalid value: \"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1\": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*'), metadata.name: Invalid value: \"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1jh7fr\": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')]","reason":"Invalid","details":{"name":"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1jh7fr","kind":"Pod","causes":[{"reason":"FieldValueInvalid","message":"Invalid value: \"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1\": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')","field":"metadata.generateName"},{"reason":"FieldValueInvalid","message":"Invalid value: \"dask-pangeo-data-pan-_cmip6_examples-it30r1l7-c81e8998-1jh7fr\": a DNS-1123 subdomain must consist of lower case alphanumeric characters, '-' or '.', and must start and end with an alphanumeric character (e.g. 'example.com', regex used for validation is '[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*')","field":"metadata.name"}]},"code":422}
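
The 422 appears to come from the generated pod name: dask-kubernetes builds it from the Binder user/repo slug, and pangeo_cmip6_examples contains an underscore, which Kubernetes rejects for DNS-1123 names. The proper fix is in the Binder/worker-template configuration, but as a rough stop-gap sketch (assuming this dask-kubernetes version honours an explicit name argument, as its signature in the traceback suggests), one could supply a sanitized cluster name:

    # Hedged workaround sketch, not verified on this binder: give the cluster
    # an explicit RFC 1123-compatible name so the generated pod names contain
    # no underscores. The raw name below is an assumption for illustration.
    import re

    from dask.distributed import Client
    from dask_kubernetes import KubeCluster

    raw_name = "dask-pangeo_cmip6_examples"
    safe_name = re.sub(r"[^a-z0-9.-]", "-", raw_name.lower()).strip("-.")

    cluster = KubeCluster(name=safe_name, n_workers=20)
    client = Client(cluster)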

SSL Cert Verification Error in xr.open_zarr

Hi! I've been running through the basic_search_and_load.ipynb notebook, but I'm having trouble with xr.open_zarr. I can ping www.googleapis.com just fine, so I'm not sure what is causing the error. I'm running macOS 10.15.7 on a Mac mini server.

ds = xr.open_zarr(mapper, consolidated=True)

SSLCertVerificationError                  Traceback (most recent call last)
/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in _wrap_create_connection(self, req, timeout, client_error, *args, **kwargs)
    945             with CeilTimeout(timeout.sock_connect):
--> 946                 return await self._loop.create_connection(*args, **kwargs)  # type: ignore  # noqa
    947         except cert_errors as exc:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/asyncio/base_events.py in create_connection(self, protocol_factory, host, port, ssl, family, proto, flags, sock, local_addr, server_hostname, ssl_handshake_timeout)
    988             sock, protocol_factory, ssl, server_hostname,
--> 989             ssl_handshake_timeout=ssl_handshake_timeout)
    990         if self._debug:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/asyncio/base_events.py in _create_connection_transport(self, sock, protocol_factory, ssl, server_hostname, server_side, ssl_handshake_timeout)
   1016         try:
-> 1017             await waiter
   1018         except:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/asyncio/sslproto.py in data_received(self, data)
    529         try:
--> 530             ssldata, appdata = self._sslpipe.feed_ssldata(data)
    531         except Exception as e:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/asyncio/sslproto.py in feed_ssldata(self, data, only_handshake)
    188                 # Call do_handshake() until it doesn't raise anymore.
--> 189                 self._sslobj.do_handshake()
    190                 self._state = _WRAPPED

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/ssl.py in do_handshake(self)
    773         """Start the SSL/TLS handshake."""
--> 774         self._sslobj.do_handshake()
    775 

SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate (_ssl.c:1091)

The above exception was the direct cause of the following exception:

ClientConnectorCertificateError           Traceback (most recent call last)
<ipython-input-21-fadab0c7272e> in <module>()
      9 
     10 # open it using xarray and zarr
---> 11 ds = xr.open_zarr(mapper, consolidated=True)
     12 #ds

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/xarray/backends/zarr.py in open_zarr(store, group, synchronizer, chunks, decode_cf, mask_and_scale, decode_times, concat_characters, decode_coords, drop_variables, consolidated, overwrite_encoded_chunks, chunk_store, decode_timedelta, use_cftime, **kwargs)
    665         group=group,
    666         consolidated=consolidated,
--> 667         chunk_store=chunk_store,
    668     )
    669     ds = maybe_decode_store(zarr_store)

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/xarray/backends/zarr.py in open_group(cls, store, mode, synchronizer, group, consolidated, consolidate_on_close, chunk_store)
    288         if consolidated:
    289             # TODO: an option to pass the metadata_key keyword
--> 290             zarr_group = zarr.open_consolidated(store, **open_kwargs)
    291         else:
    292             zarr_group = zarr.open_group(store, **open_kwargs)

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/zarr/convenience.py in open_consolidated(store, metadata_key, mode, **kwargs)
   1172 
   1173     # setup metadata sotre
-> 1174     meta_store = ConsolidatedMetadataStore(store, metadata_key=metadata_key)
   1175 
   1176     # pass through

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/zarr/storage.py in __init__(self, store, metadata_key)
   2672 
   2673         # retrieve consolidated metadata
-> 2674         meta = json_loads(store[metadata_key])
   2675 
   2676         # check format of consolidated metadata

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/fsspec/mapping.py in __getitem__(self, key, default)
    130         k = self._key_to_str(key)
    131         try:
--> 132             result = self.fs.cat(k)
    133         except self.missing_exceptions:
    134             if default is not None:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/fsspec/asyn.py in cat(self, path, recursive, on_error, **kwargs)
    228             ex = next(filter(is_exception, out), False)
    229             if ex:
--> 230                 raise ex
    231         if (
    232             len(paths) > 1

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/gcsfs/core.py in _cat_file(self, path)
    824         """ Simple one-shot get of file data """
    825         u2 = self.url(path)
--> 826         headers, out = await self._call("GET", u2)
    827         return out
    828 

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/gcsfs/core.py in _call(self, method, path, json_out, info_out, *args, **kwargs)
    492                     headers=headers,
    493                     data=datain,
--> 494                     timeout=self.requests_timeout,
    495                 ) as r:
    496                     import json

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/client.py in __aenter__(self)
   1081 
   1082     async def __aenter__(self) -> _RetType:
-> 1083         self._resp = await self._coro
   1084         return self._resp
   1085 

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/client.py in _request(self, method, str_or_url, params, data, json, cookies, headers, skip_auto_headers, auth, allow_redirects, max_redirects, compress, chunked, expect100, raise_for_status, read_until_eof, proxy, proxy_auth, timeout, verify_ssl, fingerprint, ssl_context, ssl, proxy_headers, trace_request_ctx, read_bufsize)
    491                                 req,
    492                                 traces=traces,
--> 493                                 timeout=real_timeout
    494                             )
    495                     except asyncio.TimeoutError as exc:

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in connect(self, req, traces, timeout)
    526 
    527             try:
--> 528                 proto = await self._create_connection(req, traces, timeout)
    529                 if self._closed:
    530                     proto.close()

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in _create_connection(self, req, traces, timeout)
    867         else:
    868             _, proto = await self._create_direct_connection(
--> 869                 req, traces, timeout)
    870 
    871         return proto

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in _create_direct_connection(self, req, traces, timeout, client_error)
   1021         else:
   1022             assert last_exc is not None
-> 1023             raise last_exc
   1024 
   1025     async def _create_proxy_connection(

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in _create_direct_connection(self, req, traces, timeout, client_error)
   1003                     server_hostname=hinfo['hostname'] if sslcontext else None,
   1004                     local_addr=self._local_addr,
-> 1005                     req=req, client_error=client_error)
   1006             except ClientConnectorError as exc:
   1007                 last_exc = exc

/Users/odyssey/anaconda3/envs/TEST_ENV/lib/python3.7/site-packages/aiohttp/connector.py in _wrap_create_connection(self, req, timeout, client_error, *args, **kwargs)
    947         except cert_errors as exc:
    948             raise ClientConnectorCertificateError(
--> 949                 req.connection_key, exc) from exc
    950         except ssl_errors as exc:
    951             raise ClientConnectorSSLError(req.connection_key, exc) from exc

ClientConnectorCertificateError: Cannot connect to host www.googleapis.com:443 ssl:True [SSLCertVerificationError: (1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate (_ssl.c:1091)')]
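
This is usually not specific to the notebook: a "self signed certificate" failure generally means Python is validating against a CA bundle that does not contain the certificate actually presented (commonly a TLS-inspecting proxy or an outdated bundle). A minimal, hedged sketch of a frequently suggested workaround is to point the SSL environment variables at a bundle that does contain it before gcsfs opens its HTTP session; the paths and store URL below are placeholders:

    # Sketch only: make gcsfs/aiohttp validate against certifi's CA bundle.
    # If you are behind a TLS-inspecting proxy, append the proxy's CA
    # certificate to this bundle (or point the variables at your own bundle).
    import os

    import certifi

    os.environ["SSL_CERT_FILE"] = certifi.where()
    os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()

    import gcsfs
    import xarray as xr

    gcs = gcsfs.GCSFileSystem(token="anon")
    mapper = gcs.get_mapper("gs://...")   # placeholder: use the zstore path from the CMIP6 catalog
    ds = xr.open_zarr(mapper, consolidated=True)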

link for Tier 4 experiments points to Tier 1

I'm looking at CMIP6 on Google Cloud for the first time and trying to figure out what is available here.

On this page: https://docs.google.com/document/d/1yUx6jr9EdedCOLd--CPdTfGDwEwzPpCF6p1jRmqx-0Q/edit the link to the Tier 4 experiments doc points to the Tier 1 experiments doc.

What is the correct link to the Tier 4 experiments doc?

(Also, is this still the best repo to use for starting to explore CMIP6 on Google or has it been superseded?)

@naomi-henderson I see your name in the google doc, perhaps you can help?
