
planetary-computer-sdk-for-python's Introduction

Planetary Computer SDK for Python

Python library for interacting with the Microsoft Planetary Computer.

For general questions or discussions about the Planetary Computer, use the microsoft/PlanetaryComputer repository.

Installation

pip install planetary-computer

If you have an API subscription key, you may provide it to the library by using the included configuration CLI:

planetarycomputer configure

Alternatively, a subscription key may be provided by setting the PC_SDK_SUBSCRIPTION_KEY environment variable. A subscription key is not required to interact with the service, but supplying one allows less restrictive rate limiting.
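
The key can also be set programmatically for the current session; a small example using set_subscription_key, which the package exports (the key value below is a placeholder):

import planetary_computer

planetary_computer.set_subscription_key("<your-subscription-key>")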

Usage

This library assists with signing Azure Blob Storage URLs. The sign function operates on an HREF string as well as on several PySTAC objects: Asset, Item, and ItemCollection. It also accepts a pystac-client ItemSearch, in which case it performs the search and returns the resulting ItemCollection with all assets signed.

Automatic signing

If you're using pystac-client, we recommend using its modifier feature to automatically sign results with planetary_computer.sign_inplace:

import planetary_computer
import pystac_client

api = pystac_client.Client.open(
   'https://planetarycomputer.microsoft.com/api/stac/v1',
   modifier=planetary_computer.sign_inplace,
)

Now all the results you get from that client will be signed.
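
For example, a search made through this client returns items whose asset HREFs already include SAS tokens. The collection, bounding box, and asset name below are illustrative, and item_collection() assumes a recent pystac-client:

search = api.search(
    collections=["sentinel-2-l2a"],
    bbox=[-122.5, 47.5, -122.0, 48.0],
    max_items=5,
)
items = search.item_collection()

# The asset href already carries a SAS token, so it can be opened directly.
print(items[0].assets["visual"].href)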

Manual signing

Alternatively, you can manually call planetary_computer.sign on your results.

from pystac import Asset, Item, ItemCollection
from pystac_client import ItemSearch
import planetary_computer as pc


# The sign function may be called directly on the Item
raw_item: Item = ...
item: Item = pc.sign(raw_item)
# Now use the item however you want. All appropriate assets are signed for read access.

# The sign function also works with an Asset
raw_asset: Asset = raw_item.assets['SR_B4']
asset = pc.sign(raw_asset)

# The sign function also works with an HREF
raw_href: str = raw_asset.href
href = pc.sign(raw_href)

# The sign function also works with an ItemCollection
raw_item_collection = ItemCollection([raw_item])
item_collection = pc.sign(raw_item_collection)

# The sign function also accepts an ItemSearch, and signs the resulting ItemCollection
search = ItemSearch(
    url=...,
    bbox=...,
    collections=...,
    limit=...,
    max_items=...,
)
signed_item_collection = pc.sign(search)

Convenience methods

You'll occasionally need to interact with the Blob Storage container directly, rather than using STAC items. We include two convenience methods for this: get_container_client and get_adlfs_filesystem.
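
A sketch of how these helpers might be used; both functions are exported by the package, but the keyword arguments and the storage account and container names below are illustrative assumptions rather than documented signatures:

import planetary_computer

# An azure.storage.blob.ContainerClient with a SAS token applied (assumed signature)
container_client = planetary_computer.get_container_client(
    account_name="naipeuwest", container_name="naip"
)

# An adlfs.AzureBlobFileSystem for fsspec-style access (assumed signature)
fs = planetary_computer.get_adlfs_filesystem(
    account_name="naipeuwest", container_name="naip"
)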

Development

The following steps set up a local development environment:

## Create and activate venv
python3 -m venv env
source env/bin/activate

## Install requirements
python3 -m pip install -r requirements-dev.txt

## Install locally
pip install -e .

## Format code
./scripts/format

## Run tests
./scripts/test

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.


planetary-computer-sdk-for-python's Issues

Missing link from Asset to owner Item after signing STAC Item

The signed STAC Item returned from pc.sign is missing the links from its Assets back to the Item:

assert signed_item.assets[name].owner is item

In the example below, the second assert fails.

from pystac import Item
import planetary_computer as pc

item = Item.from_file("S2B_MSIL2A_20190629T212529_R043_T06VVN_20201006T080531.json")
assert item.assets["B01"].owner is item
signed_item = pc.sign(item)
assert signed_item.assets["B01"].owner is signed_item

Sign VRTs

GDAL 3.4 added new STACIT and STACTA drivers: https://gdal.org/drivers/raster/stacit.html, https://gdal.org/drivers/raster/stacta.html

My understanding is that these hit a STAC endpoint and build up a VRT with URLs to the STAC items or assets.

gdalinfo "STACIT:\"https://planetarycomputer.microsoft.com/api/stac/v1/search?collections=naip&bbox=-100,40,-99,41&datetime=2019-01-01T00:00:00Z%2F..\":asset=image" > image.vrt

c/Users/taugspurger via 🐍 v3.10.0 via 🅒 gdal=3.4 took 2s
cat image.vrt
Driver: VRT/Virtual Raster
Files: /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909907_ne_14_060_20190709.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009963_se_14_060_20190709.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009964_se_14_060_20190709.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009964_sw_14_060_20190709.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909907_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39100/m_3910008_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909906_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909906_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909905_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909905_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909904_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909904_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909903_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909903_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909902_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909902_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909901_nw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909901_ne_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009957_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40100/m_4010064_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009963_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009962_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009962_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009961_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009961_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009960_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009960_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009959_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009959_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009958_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009958_se_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40099/m_4009957_sw_14_060_20190711.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39098/m_3909801_nw_14_060_20190713.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/40098/m_4009857_sw_14_060_20190713.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909908_nw_14_060_20190828.tif
       /vsicurl/https://naipeuwest.blob.core.windows.net/naip/v002/ks/2019/ks_60cm_2019/39099/m_3909908_ne_14_060_20190828.tif
Size is 161196, 25023
Coordinate System is:
PROJCRS["NAD83 / UTM zone 14N",
    BASEGEOGCRS["NAD83",
        DATUM["North American Datum 1983",
            ELLIPSOID["GRS 1980",6378137,298.257222101,
                LENGTHUNIT["metre",1]]],
        PRIMEM["Greenwich",0,
            ANGLEUNIT["degree",0.0174532925199433]],
        ID["EPSG",4269]],
    CONVERSION["UTM zone 14N",
        METHOD["Transverse Mercator",
            ID["EPSG",9807]],
        PARAMETER["Latitude of natural origin",0,
            ANGLEUNIT["degree",0.0174532925199433],
            ID["EPSG",8801]],
        PARAMETER["Longitude of natural origin",-99,
            ANGLEUNIT["degree",0.0174532925199433],
            ID["EPSG",8802]],
        PARAMETER["Scale factor at natural origin",0.9996,
            SCALEUNIT["unity",1],
            ID["EPSG",8805]],
        PARAMETER["False easting",500000,
            LENGTHUNIT["metre",1],
            ID["EPSG",8806]],
        PARAMETER["False northing",0,
            LENGTHUNIT["metre",1],
            ID["EPSG",8807]]],
    CS[Cartesian,2],
        AXIS["(E)",east,
            ORDER[1],
            LENGTHUNIT["metre",1]],
        AXIS["(N)",north,
            ORDER[2],
            LENGTHUNIT["metre",1]],
    USAGE[
        SCOPE["Engineering survey, topographic mapping."],
        AREA["North America - between 102°W and 96°W - onshore and offshore. Canada - Manitoba; Nunavut; Saskatchewan. United States (USA) - Iowa; Kansas; Minnesota; Nebraska; North Dakota; Oklahoma; South Dakota; Texas."],
        BBOX[25.83,-102,84,-96]],
    ID["EPSG",26914]]
Data axis to CRS axis mapping: 1,2
Origin = (408965.400000000023283,4435589.400000000372529)
Pixel Size = (0.600000000000000,-0.600000000000003)
Corner Coordinates:
Upper Left  (  408965.400, 4435589.400) (100d 4' 2.98"W, 40d 3'56.33"N)
Lower Left  (  408965.400, 4420575.600) (100d 3'55.41"W, 39d55'49.44"N)
Upper Right (  505683.000, 4435589.400) ( 98d56' 0.08"W, 40d 4'13.97"N)
Lower Right (  505683.000, 4420575.600) ( 98d56' 0.55"W, 39d56' 6.99"N)
Center      (  457324.200, 4428082.500) ( 99d29'59.87"W, 40d 0' 6.67"N)
Band 1 Block=128x128 Type=Byte, ColorInterp=Red
  Description = Red
Band 2 Block=128x128 Type=Byte, ColorInterp=Green
  Description = Green
Band 3 Block=128x128 Type=Byte, ColorInterp=Blue
  Description = Blue
Band 4 Block=128x128 Type=Byte, ColorInterp=Undefined
  Description = NIR
  Metadata:
    description=near-infrared

It'd be great if sign could handle this case. I imagine sign then accepting:

  • A string that looks like a VRT: find the URLs and sign them, returning a new string.
  • An os.PathLike, inferring whether the contents look like a VRT (is that hard?). The output would be... a new file? Written in place? A string?
  • Open file objects

Some of these are tricky, since sign already takes a string that it assumes is a URL. But I think it'd be convenient to stretch things a bit, and it shouldn't introduce any ambiguities.
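
A minimal sketch of the string case, assuming the VRT references blob-storage URLs behind /vsicurl/ and reusing the package's existing sign_url; this helper is not part of the library:

import re

import planetary_computer


def sign_vrt_string(vrt_xml: str) -> str:
    """Return a copy of a VRT string with every /vsicurl/ blob URL signed."""
    pattern = re.compile(r"/vsicurl/(https://[^<\s]+)")

    def _sign(match: re.Match) -> str:
        return "/vsicurl/" + planetary_computer.sign_url(match.group(1))

    return pattern.sub(_sign, vrt_xml)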

Support pydantic>=2.0.0

Currently, a pip install planetary-computer followed by planetarycomputer configure will fail, because pydantic 2.0 moved BaseSettings into the separate pydantic-settings package:

» planetarycomputer configure
Traceback (most recent call last):
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/.direnv/python-3.10.10/bin/planetarycomputer", line 5, in <module>
    from planetary_computer.scripts.cli import app
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/planetary_computer/__init__.py", line 4, in <module>
    from planetary_computer.sas import (
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/planetary_computer/sas.py", line 20, in <module>
    from planetary_computer.settings import Settings
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/planetary_computer/settings.py", line 11, in <module>
    class Settings(pydantic.BaseSettings):
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/.direnv/python-3.10.10/lib/python3.10/site-packages/pydantic/__init__.py", line 206, in __getattr__
    return _getattr_migration(attr_name)
  File "/home/taugspurger/src/Microsoft/planetary-computer-sdk-for-python/.direnv/python-3.10.10/lib/python3.10/site-packages/pydantic/_migration.py", line 279, in wrapper
    raise PydanticImportError(
pydantic.errors.PydanticImportError: `BaseSettings` has been moved to the `pydantic-settings` package. See https://docs.pydantic.dev/2.0/migration/#basesettings-has-moved-to-pydantic-settings for more details.

A temporary workaround for users is to pip install planetary-computer pydantic-settings.


I'm not sure how to fix this. I'd rather not put a hard requirement on pydantic>=2.0.0, in case there are downstream dependencies unable to migrate. I don't think there's a way to use environment markers to specify an optional dependency that is only needed for certain versions of another package.

Ideally pydantic would provide some kind of pydantic[settings] extra that worked with both 1.x and 2.x.

It's probably easiest to just remove the usage of pydantic and drop the dependency.
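
A minimal sketch of what a pydantic-free replacement could look like, reading the same PC_SDK_SUBSCRIPTION_KEY environment variable documented above; the attribute names and default SAS endpoint here are assumptions, not the library's actual Settings class:

import os
from dataclasses import dataclass
from typing import Optional


@dataclass
class Settings:
    subscription_key: Optional[str] = None
    sas_url: str = "https://planetarycomputer.microsoft.com/api/sas/v1/token"

    @classmethod
    def get(cls) -> "Settings":
        # Read the key from the environment, falling back to None.
        return cls(subscription_key=os.environ.get("PC_SDK_SUBSCRIPTION_KEY"))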

Add retry / backoff logic

Azure SDKs have built-in support for retrying and backoff based on the response status codes. This package should do the same.

Ideally, we would

  1. Retry on 429 (Too Many Requests) errors, waiting the number of seconds indicated in the response
  2. Retry on 50x errors and other connection issues

https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/core/azure-core/CLIENT_LIBRARY_DEVELOPER.md and https://devblogs.microsoft.com/azure-sdk/custom-transport-in-python-sdk-an-httpx-experiment/ have some background on the design of how Azure SDKs handle network I/O. I don't think we need a full-blown pluggable transport system yet, but we should gracefully handle retries.
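
A minimal sketch of one approach, mounting a urllib3 Retry policy on the requests session used for token requests; the status codes and backoff values here are illustrative, not the package's implementation:

import requests
import requests.adapters
from urllib3.util.retry import Retry


def make_retry_session() -> requests.Session:
    """Session that retries 429s (honoring Retry-After) and 50x errors."""
    retry = Retry(
        total=5,
        backoff_factor=1.0,
        status_forcelist=[429, 500, 502, 503, 504],
        allowed_methods=["GET"],
        respect_retry_after_header=True,
    )
    session = requests.Session()
    session.mount("https://", requests.adapters.HTTPAdapter(max_retries=retry))
    return session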

pip install without hyphen

Shouldn't the pip install command be
pip install planetary-computer
rather than
pip install planetarycomputer
(without the hyphen)?

`sign_items` convenience method

It's a one-liner, but I find myself rewriting

items = search.items_as_collection()
items = pystac_client.ItemCollection([pc.sign_assets(item) for item in items.features])

a lot. It might be nice to have this built in.

Could be even nicer to be able to pass in a Search and have it load and sign the items without making an intermediate ItemCollection?
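
A sketch of what the proposed helper might look like; it assumes a pystac-client ItemSearch with an items() iterator and is not part of the released API:

import pystac

import planetary_computer as pc


def sign_items(search) -> pystac.ItemCollection:
    """Run the search and return an ItemCollection with every asset signed."""
    return pystac.ItemCollection([pc.sign(item) for item in search.items()])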

Sign DataFrames from stac-geoparquet

xref microsoft/PlanetaryComputer#200

It might be handy to take a semi-well-defined DataFrame schema (maybe just an assets column with dict/object dtype?), like those returned by stac-geoparquet, and sign the URLs it contains.

import geopandas
import planetary_computer
import pystac_client

catalog = pystac_client.Client.open(
    "https://planetarycomputer.microsoft.com/api/stac/v1/",
    modifier=planetary_computer.sign_inplace,
)


asset = catalog.get_collection("io-lulc-9-class").assets["geoparquet-items"]

df = geopandas.read_parquet(
    asset.href, storage_options=asset.extra_fields["table:storage_options"]
)

# Proposed behavior: sign the asset URLs contained in the DataFrame
df = planetary_computer.sign(df)

And now the URLs in df are signed.
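
A sketch (not library functionality) of what signing such a DataFrame could involve, assuming an assets column of dicts keyed by asset name with an href field, as in stac-geoparquet output:

import planetary_computer


def sign_assets_column(df):
    """Return a copy of df with every asset href replaced by a signed URL."""

    def _sign_row(assets: dict) -> dict:
        return {
            name: {**asset, "href": planetary_computer.sign_url(asset["href"])}
            for name, asset in assets.items()
        }

    out = df.copy()
    out["assets"] = out["assets"].apply(_sign_row)
    return out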

Update to new version of pystac and pystac-client

Once pystac 1.0 is released we'll need to update this library. pystac-client will also move to pystac 1.0, which includes removing ItemCollection from pystac-client and moving it into pystac. We should determine whether we still want to keep pystac-client as a dependency, or potentially remove it and work only with pystac objects.

This issue is blocked until new releases of the upstream libraries are available.

sign reference filesystems

For kerchunk / reference file systems, we end up with an object like

{
  "version": 1,
  "templates": {
    "a": "https://nasagddp.blob.core.windows.net/nex-gddp-cmip6/NEX/GDDP-CMIP6/ACCESS-CM2/historical/r1i1p1f1/hurs/hurs_day_ACCESS-CM2_historical_r1i1p1f1_gn_1950.nc"
  ...
  }
...
}

For users to access the data, they must sign the URLs in the values of templates.

for k, v in references["templates"].items():
    references["templates"][k] = planetary_computer.sign(v)

Unfortunately for us, it's just a plain dictionary, so we have to actually dig into the mapping object to check its structure (version, templates, refs as top-level keys might be enough). It might be worth asking kerchunk to expose a class with the same mapping interface.
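
A small sketch of the check-and-sign step described above, applied to a kerchunk reference mapping; this helper is hypothetical, not part of the library:

import planetary_computer


def sign_references(references: dict) -> dict:
    """Return a copy of a kerchunk reference dict with its template URLs signed."""
    signed = dict(references)
    signed["templates"] = {
        key: planetary_computer.sign_url(url)
        for key, url in references.get("templates", {}).items()
    }
    return signed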

Reconstruct STAC Items from geoparquet rows

Apologies if this isn't the right repo to ask this question, but I was looking to reconstruct PySTAC Items from geoparquet rows (https://planetarycomputer.microsoft.com/docs/quickstarts/stac-geoparquet/#Bulk-STAC-item-queries-with-GeoParquet). Is there a convenience method of the flavor item = planetary_computer.item_from_geoparquet_row(dataframe, index) or something similar that I'm not finding? If not, would this be the place to open a PR to add such functionality, or is there a better repo?

I realize iterating over rows of a geopandas dataframe is an antipattern, but I think it's required for my use case (bulk load STAC items, modify them, then write them back out). If there's a better way, any guidance would be appreciated.

`sign` raises for certain asset HREFs

Some recent changes to our STAC items exposed an issue with trying to sign all the assets in a STAC item. We now include a tilejson asset. Calling sign(item) signs all the assets, and it raises when it reaches the tilejson asset.

This example uses the STAC item at https://planetarycomputer-staging.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items/S2B_MSIL2A_20201228T190809_R013_T10TET.

import pystac
import planetary_computer

item = pystac.Item.from_file("https://planetarycomputer-staging.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items/S2B_MSIL2A_20201228T190809_R013_T10TET")  # note the `-staging`
planetary_computer.sign(item)

That raises with

---------------------------------------------------------------------------
HTTPError                                 Traceback (most recent call last)
/tmp/ipykernel_1243/775917750.py in <module>
      3 
      4 item = pystac.Item.from_file("https://planetarycomputer-staging.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items/S2B_MSIL2A_20201228T190809_R013_T10TET")
----> 5 planetary_computer.sign(item)

/srv/conda/envs/notebook/lib/python3.8/functools.py in wrapper(*args, **kw)
    873                             '1 positional argument')
    874 
--> 875         return dispatch(args[0].__class__)(*args, **kw)
    876 
    877     funcname = getattr(func, '__name__', 'singledispatch function')

/srv/conda/envs/notebook/lib/python3.8/site-packages/planetary_computer/sas.py in _sign_item(item)
    117     signed_item = item.clone()
    118     for key in signed_item.assets:
--> 119         signed_item.assets[key] = sign(signed_item.assets[key])
    120     return signed_item
    121 

/srv/conda/envs/notebook/lib/python3.8/functools.py in wrapper(*args, **kw)
    873                             '1 positional argument')
    874 
--> 875         return dispatch(args[0].__class__)(*args, **kw)
    876 
    877     funcname = getattr(func, '__name__', 'singledispatch function')

/srv/conda/envs/notebook/lib/python3.8/site-packages/planetary_computer/sas.py in _sign_asset(asset)
    133     """
    134     signed_asset = asset.clone()
--> 135     signed_asset.href = sign(signed_asset.href)
    136     return signed_asset
    137 

/srv/conda/envs/notebook/lib/python3.8/functools.py in wrapper(*args, **kw)
    873                             '1 positional argument')
    874 
--> 875         return dispatch(args[0].__class__)(*args, **kw)
    876 
    877     funcname = getattr(func, '__name__', 'singledispatch function')

/srv/conda/envs/notebook/lib/python3.8/site-packages/planetary_computer/sas.py in _sign_url(url)
     94         )
     95         response = requests.get(token_request_url, headers=headers)
---> 96         response.raise_for_status()
     97         token = SASToken(**response.json())
     98         if not token:

/srv/conda/envs/notebook/lib/python3.8/site-packages/requests/models.py in raise_for_status(self)
    941 
    942         if http_error_msg:
--> 943             raise HTTPError(http_error_msg, response=self)
    944 
    945     def close(self):

HTTPError: 404 Client Error: Not Found for url: https://planetarycomputer.microsoft.com/api/sas/v1/token/planetarycomputer-staging/api

The tilejson asset has an href of 'https://planetarycomputer-staging.microsoft.com/api/data/v1/item/tilejson.json?collection=sentinel-2-l2a&items=S2B_MSIL2A_20201228T190809_R013_T10TET&assets=visual-10m&bidx=1,2,3&nodata=0'. Looking at the values there:

     82     settings = Settings.get()
     83     account, container = parse_blob_url(url)
     84     token_request_url = f"{settings.sas_url}/{account}/{container}"
     85     token = TOKEN_CACHE.get(token_request_url)
     86 
     87     # Refresh the token if there's less than a minute remaining,
     88     # in order to give a small amount of buffer
     89     if not token or token.ttl() < 60:
     90         headers = (
     91             {"Ocp-Apim-Subscription-Key": settings.subscription_key}
     92             if settings.subscription_key
     93             else None
     94         )
     95         response = requests.get(token_request_url, headers=headers)
---> 96         response.raise_for_status()
     97         token = SASToken(**response.json())
     98         if not token:
     99             raise ValueError(f"No token found in response: {response.json()}")
    100         TOKEN_CACHE[token_request_url] = token
    101     return token.sign(url).href
    102 

ipdb>  pp url
'https://planetarycomputer-staging.microsoft.com/api/data/v1/item/tilejson.json?collection=sentinel-2-l2a&items=S2B_MSIL2A_20201228T190809_R013_T10TET&assets=visual-10m&bidx=1,2,3&nodata=0'
ipdb>  pp account, container
('planetarycomputer-staging', 'api')
ipdb>  pp token_request_url
'https://planetarycomputer.microsoft.com/api/sas/v1/token/planetarycomputer-staging/api'

I guess the issue is that we're assuming that the URL is pointing to blob storage, when in fact it's pointing to our data API. Does anyone have thoughts on how to handle this? Skip signing when it doesn't look like a blob storage URL?
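
One possible guard, sketched under the assumption that only hosts ending in .blob.core.windows.net should be signed; this is not the library's current behavior:

from urllib.parse import urlparse


def looks_like_blob_storage(href: str) -> bool:
    """True if the href points at Azure Blob Storage rather than the data API."""
    return urlparse(href).netloc.endswith(".blob.core.windows.net")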

Detect double-signing URLs

If you accidentally sign a URL twice, you'll get a URL that can't be opened by rasterio. I wonder if we can detect this case and either raise, warn, or silently skip the second signing?

>>> import pystac
>>> import rasterio
>>> import planetary_computer as pc

>>> item = pystac.Item.from_file("https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items/S2A_MSIL2A_20210803T062221_R048_T40GCP_20210803T154836")
>>> signed_item = pc.sign(item)

>>> rasterio.open(pc.sign(signed_item).assets["B02"].href)

RasterioIOError: HTTP response code: 403
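
A sketch of one way such a check could work, assuming an already-signed URL can be recognized by its SAS-style sig and se query parameters; detecting and then warning or skipping would follow from this:

from urllib.parse import parse_qs, urlparse


def appears_signed(href: str) -> bool:
    """True if the URL already carries SAS-style signature/expiry parameters."""
    query = parse_qs(urlparse(href).query)
    return "sig" in query and "se" in query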

Error when Importing `planetary_computer` Package

I recently installed the planetary-computer package (v1.0.0) from conda-forge. It seems like this version of planetary-computer doesn't work with the version of pydantic that is installed alongside it. I am attempting to import it in a Python 3.9 environment, but the import fails with the following error:

import planetary_computer

ImportError                               Traceback (most recent call last)
Cell In[2], line 9
      7 import numpy as np
      8 import pandas as pd
----> 9 import planetary_computer
     10 import rasterio
     11 import rasterio.features

File ~/envs/aug/lib/python3.9/site-packages/planetary_computer/__init__.py:4
      1 """Planetary Computer Python SDK"""
      2 # flake8:noqa
----> 4 from planetary_computer.sas import (
      5     sign,
      6     sign_inplace,
      7     sign_url,
      8     sign_item,
      9     sign_assets,
     10     sign_asset,
     11     sign_item_collection,
     12 )
     13 from planetary_computer.settings import set_subscription_key
     14 from planetary_computer._adlfs import get_adlfs_filesystem, get_container_client

File ~/envs/aug/lib/python3.9/site-packages/planetary_computer/sas.py:13
     11 import requests.adapters
     12 import packaging.version
---> 13 import pydantic
     14 from pydantic import BaseModel, Field
     15 from pystac import Asset, Item, ItemCollection, STACObjectType, Collection

File ~/envs/aug/lib/python3.9/site-packages/pydantic/__init__.py:13
      3 import pydantic_core
      4 from pydantic_core.core_schema import (
      5     FieldSerializationInfo,
      6     FieldValidationInfo,
   (...)
     10     ValidatorFunctionWrapHandler,
     11 )
---> 13 from . import dataclasses
     14 from ._internal._annotated_handlers import (
     15     GetCoreSchemaHandler as GetCoreSchemaHandler,
     16 )
     17 from ._internal._annotated_handlers import (
     18     GetJsonSchemaHandler as GetJsonSchemaHandler,
     19 )

File ~/envs/aug/lib/python3.9/site-packages/pydantic/dataclasses.py:11
      7 from typing import TYPE_CHECKING, Any, Callable, Generic, NoReturn, TypeVar, overload
      9 from typing_extensions import Literal, dataclass_transform
---> 11 from ._internal import _config, _decorators, _typing_extra
     12 from ._internal import _dataclasses as _pydantic_dataclasses
     13 from ._migration import getattr_migration

File ~/envs/aug/lib/python3.9/site-packages/pydantic/_internal/_decorators.py:15
     12 from typing_extensions import Literal, TypeAlias, is_typeddict
     14 from ..errors import PydanticUserError
---> 15 from ..fields import ComputedFieldInfo
     16 from ._core_utils import get_type_ref
     17 from ._internal_dataclass import slots_true

File ~/envs/aug/lib/python3.9/site-packages/pydantic/fields.py:18
     15 from pydantic_core import PydanticUndefined
     16 from typing_extensions import Unpack
---> 18 from . import types
     19 from ._internal import _decorators, _fields, _generics, _internal_dataclass, _repr, _typing_extra, _utils
     20 from .errors import PydanticUserError

File ~/envs/aug/lib/python3.9/site-packages/pydantic/types.py:32
     29 from pydantic_core import CoreSchema, PydanticCustomError, PydanticKnownError, core_schema
     30 from typing_extensions import Annotated, Literal, Protocol, deprecated
---> 32 from ._internal import (
     33     _annotated_handlers,
     34     _fields,
     35     _internal_dataclass,
     36     _known_annotated_metadata,
     37     _utils,
     38     _validators,
     39 )
     40 from ._migration import getattr_migration
     41 from .config import ConfigDict

File ~/envs/aug/lib/python3.9/site-packages/pydantic/_internal/_fields.py:12
      8 from typing import TYPE_CHECKING, Any
     10 from pydantic_core import PydanticUndefined
---> 12 from . import _typing_extra
     13 from ._config import ConfigWrapper
     14 from ._repr import Representation

File ~/envs/aug/lib/python3.9/site-packages/pydantic/_internal/_typing_extra.py:13
     10 from types import GetSetDescriptorType
     11 from typing import TYPE_CHECKING, Any, ForwardRef
---> 13 from typing_extensions import Annotated, Final, Literal, TypeAliasType, TypeGuard, get_args, get_origin
     15 if TYPE_CHECKING:
     16     from ._dataclasses import StandardDataclass

ImportError: cannot import name 'TypeAliasType' from 'typing_extensions' (/home/jovyan/.local/lib/python3.9/site-packages/typing_extensions.py)

The package versions that were downloaded when I installed planetary-computer were:

    package                    |            build
    ---------------------------|-----------------
    annotated-types-0.6.0      |     pyhd8ed1ab_0          17 KB  conda-forge
    planetary-computer-1.0.0   |     pyhd8ed1ab_0          18 KB  conda-forge
    pydantic-2.0.3             |     pyhd8ed1ab_1         245 KB  conda-forge
    pydantic-core-2.3.0        |   py39h9fdd4d6_0         1.3 MB  conda-forge
    python-dotenv-1.0.0        |     pyhd8ed1ab_1          23 KB  conda-forge
    ------------------------------------------------------------

I've tried installing earlier versions of pydantic instead, but I don't know which version is necessary for the missing component of the package. Thank you for your help!
