olivierhagolle / peps_download

Tool to download Sentinel images from the PEPS Sentinel mirror site: https://peps.cnes.fr

Python 100.00%
data-downloader earth-observation remote-sensing sentinel-1 sentinel-2 sentinel-3

peps_download's Introduction

I am a researcher at the CESBIO laboratory in France and an engineer at CNES, the French space agency. My research focuses on time series of high-resolution optical remote sensing images. As the "observation systems" team leader at CESBIO, I am also trying to understand passive and active microwaves and in situ measurements, but there is a long way to go on that road. I am the main author of the MAJA multi-temporal cloud detection and atmospheric correction software. If you look at my code, you will quickly find that, as far as coding goes, I am an amateur with good will...

To show some of our research, we run a very active blog: https://labo.obs-mip.fr/multitemp

peps_download's People

Contributors

ahuarte47, bergsmae, floriandeboissieu, olivierhagolle, sebastienpeillet


peps_download's Issues

dates not working as expected

Asking from January 10th to January 10th:

python peps_download.py -c S2 -a /home/atos/cron/peps.txt -w /home/atos/cron/downloads -d 2016-01-10 -f 2016-01-10 --latmin=48.6197 --lonmin=13.9898 --latmax=50.3805 --lonmax=17.6958

This request shows no results.

Now asking from January 11th to January 11th:
python peps_download.py -c S2 -a /home/atos/cron/peps.txt -w /home/atos/cron/downloads -d 2016-01-11 -f 2016-01-11 --latmin=48.6197 --lonmin=13.9898 --latmax=50.3805 --lonmax=17.6958

No results.

But if I ask from January 10th to January 11th:

python peps_download.py -c S2 -a /home/atos/cron/peps.txt -w /home/atos/cron/downloads -d 2016-01-10 -f 2016-01-11 --latmin=48.6197 --lonmin=13.9898 --latmax=50.3805 --lonmax=17.6958

I get results. How is this possible? And how can I know the date of the downloaded data?
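
A possible explanation: when only a bare date is given, the completionDate bound appears to act as an exclusive cutoff, so a same-day query matches nothing. The script also accepts full timestamps (see the next issue), which makes a single-day search unambiguous. A minimal sketch of the equivalent catalogue query, assuming the resto GeoJSON fields that appear elsewhere on this page (title, startDate):

import json
import urllib.request

# Same bounding box as above; box order is lonmin,latmin,lonmax,latmax,
# matching the other curl commands on this page.
url = ("https://peps.cnes.fr/resto/api/collections/S2/search.json"
       "?box=13.9898,48.6197,17.6958,50.3805"
       "&startDate=2016-01-10T00:00:00"
       "&completionDate=2016-01-10T23:59:59"
       "&maxRecords=500")
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)
# The acquisition date of each product answers "how can I know the date":
for feature in data.get("features", []):
    props = feature["properties"]
    print(props.get("title"), props.get("startDate"))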

problem with peps_download tmp.tmp file not found

Hello,
thank you for this nice tool. I have been using it for quite some time, but today something weird happened.

I do:
python peps_download.py -c S2 -a /home/atos/cron/peps.txt -w /home/atos/cron/downloads -d 2016-01-13T00:00:00 -f 2016-01-13T23:59:59 --latmin=38.7060 --lonmin=30.8088 --latmax=40.7339 --lonmax=33.9015

But I got:
File "peps_download.py", line 168, in
with open(tmpfile) as f_tmp:
IOError: [Errno 2] No such file or directory: '/home/atos/cron/downloads/tmp.tmp'

I do not know what the problem is; my first thought was permissions, but I gave full permissions to the folder and nothing changed.

any idea?
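
A likely cause: the script shells out to curl with -o tmp.tmp, and if curl fails (network, proxy, authentication), the file is never created, so the subsequent open() raises IOError. A defensive sketch, not the script's actual code (the URL and PRODUCT_ID are placeholders):

import os
import subprocess

tmpfile = "/home/atos/cron/downloads/tmp.tmp"
download_url = "https://peps.cnes.fr/resto/collections/S2/PRODUCT_ID/download/?issuerId=peps"  # PRODUCT_ID is a placeholder
ret = subprocess.call(["curl", "-o", tmpfile, "-k", "-u", "EMAIL:PASSWORD", download_url])
if ret != 0 or not os.path.exists(tmpfile):
    raise RuntimeError("curl failed (exit code %d); nothing written to %s" % (ret, tmpfile))
with open(tmpfile) as f_tmp:
    header = f_tmp.read(100)  # inspect the response as the script would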

error with password file

Hello,

I am running the script with a peps.txt file that contains a single line: my email address, a space, and my password. The email address and password work on the PEPS website, yet I keep getting "error with password file". I even tried deleting the special character, as suggested in another issue, but I still get the same error message. Can someone help me?
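
For reference, a minimal sketch of how a one-line "email password" file can be read; a trailing newline or a Windows carriage return is a plausible cause of "error with password file" (an assumption, not confirmed against the script's code), so stripping whitespace first is a cheap safeguard:

with open("peps.txt") as f:
    line = f.read().strip()           # drop a trailing "\n" or "\r\n"
email, password = line.split(" ", 1)  # one space separates the two fields
print("parsed credentials for", email)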

Nothing happens in the command prompt

Hello,
I am executing a command given as an example for downloading S2 pictures over Toulouse, but nothing happens in my console. No error message, but no pictures downloaded either.
I have Python 2.7 and curl (which is present on Windows 10).
Do you have any idea why it doesn't work?

Trouble with S2ST mode

When I use S2ST mode, the search breaks:

syntax:
python peps_download.py -c S2ST -l 'Toulouse' -a peps.txt -d 2017-01-01 -f 2017-02-01

return:
'startDate' is not recognized as an internal or external command,
operable program or batch file.
'completionDate' is not recognized as an internal or external command,
operable program or batch file.
'maxRecords' is not recognized as an internal or external command,
operable program or batch file.

The code finds only the data available on disk and ignores the date limits I imposed.

With a tile number, it gets worse:
syntax:
python peps_download.py -c S2ST -t 31TCJ -a peps.txt -d 2017-01-01 -f 2017-02-01

return:
'startDate' is not recognized as an internal or external command,
operable program or batch file.
'completionDate' is not recognized as an internal or external command,
operable program or batch file.
'maxRecords' is not recognized as an internal or external command,
operable program or batch file.
Value for "tileid" must follow the pattern ^[0-6][0-9][A-Za-z]([A-Za-z]){0,2}%?$

As far as I can tell, either the parser is not working properly or the curl query rules for the database have changed.
Any suggestions?
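
A likely cause for the 'startDate' is not recognized messages: on Windows, cmd.exe splits an unquoted command line at each "&", so everything after the first query parameter is executed as a separate command, and the catalogue only ever sees the part before the first "&" (which also explains why the date limits are ignored). A hedged sketch of a quoting-proof call, passing the URL as a single argument with no shell involved:

import subprocess

url = ("https://peps.cnes.fr/resto/api/collections/S2ST/search.json"
       "?q=Toulouse&startDate=2017-01-01&completionDate=2017-02-01&maxRecords=500")
# An argument list avoids shell parsing entirely, so "&" needs no escaping.
subprocess.run(["curl", "-k", "-o", "search.json", url], check=True)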

problem with parsing data to download

I downloaded the new version of this project. After setting my user name and password in peps.txt, I ran the example:

python ./peps_download.py -c S2ST -l 'Toulouse' -a peps.txt -d 2017-01-01 -f 2017-02-0

As a response in the console I received:

curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2ST/search.json?q=Toulouse\&startDate=2017-01-01\&completionDate=2017-02-01\&maxRecords=500
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 16671    0 16671    0     0  25079      0 --:--:-- --:--:-- --:--:-- 25069
(u'S2A_MSIL1C_20170126T105321_N0204_R051_T31TCJ_20170126T105612', u'unknown')
(u'S2A_MSIL1C_20170116T105401_N0204_R051_T31TCJ_20170116T105355', u'unknown')
##########################
2 products to download
##########################
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 16671    0 16671    0     0  27746      0 --:--:-- --:--:-- --:--:-- 27785
(u'S2A_MSIL1C_20170126T105321_N0204_R051_T31TCJ_20170126T105612', u'unknown')
(u'S2A_MSIL1C_20170116T105401_N0204_R051_T31TCJ_20170116T105355', u'unknown')

Nothing was downloaded. I looked into the code and found a problem in the parse_catalog function, at line 63:
print(prod, data["features"][i]["properties"]["storage"]["mode"])
Here data["features"][i]["properties"]["storage"]["mode"] is "unknown"; an error occurs, but it is handled by a simple pass at line 96.
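
A more tolerant lookup, sketched as a standalone helper rather than the script's actual code, would default to "unknown" instead of raising and being silently swallowed by that pass:

def storage_mode(feature):
    """Return the storage mode of a resto feature, defaulting to 'unknown'
    instead of raising when the field is absent."""
    return feature.get("properties", {}).get("storage", {}).get("mode", "unknown")

print(storage_mode({"properties": {}}))  # -> unknown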

peps - limited data download

Hello everyone,
I'm using the peps tool to download Sentinel-2 data from 2016 to 2020 over a specific area. The trouble is that the tool downloads only 500 files, and if I run it again, the terminal tells me those 500 files are already in the folder and does not download the other days that I need. Those 500 files span from 2020 back to November 2019; I need data back to 2016.

Is there a limit in the tool so that it can only download 500 files?

If anybody can help me, I'd be very grateful... Thanks for the help!!!

This is my input line to download the files:
python ./peps_download.py -c S2ST --lonmin -41.1 --lonmax -38.5 --latmin -22.1 --latmax -17.5 -a peps.txt -d 2016-01-01 -f 2020-01-01
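The search commands throughout this page all carry maxRecords=500, so a single catalogue query returns at most 500 products. One workaround (a sketch, not a feature of the tool) is to split the 2016-2020 period into shorter windows and run the downloader once per window:

from datetime import date, timedelta

start, end = date(2016, 1, 1), date(2020, 1, 1)
window = timedelta(days=90)
d = start
while d < end:
    d_next = min(d + window, end)
    # Print one downloader invocation per 90-day window.
    print("python ./peps_download.py -c S2ST --lonmin -41.1 --lonmax -38.5 "
          "--latmin -22.1 --latmax -17.5 -a peps.txt -d %s -f %s" % (d, d_next))
    d = d_next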

Failed to connect to proxy.truc.fr port 8080: Connection timed out

Bonjour Olivier,

When trying to download a tile, I encountered the following error:
Failed to connect to proxy.truc.fr port 8080: Connection timed out

python ./peps_download.py  -c S2 -l 'Toulouse' -a peps.txt -d 2015-12-15 -f 2015-12-01
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2/search.json?q=Toulouse\&startDate=2015-12-15\&completionDate=2015-12-01\&maxRecords=500
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:02:10 --:--:--     0curl: (7) Failed to connect to proxy.truc.fr port 8080: Connection timed out
Traceback (most recent call last):
  File "./peps_download.py", line 307, in <module>
    prod, download_dict, storage_dict, size_dict = parse_catalog(options.search_json_file)
  File "./peps_download.py", line 47, in parse_catalog
    with open(search_json_file) as data_file:
FileNotFoundError: [Errno 2] No such file or directory: 'search.json'

Do you know how to avoid this problem? Do you have a tutorial for connecting through a proxy?

Best regards,

Virginie
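
Two observations, hedged: first, curl (which peps_download.py shells out to) honours the standard http_proxy/https_proxy environment variables, so exporting them is one way through a proxy; second, the -d and -f dates in the command above appear to be swapped (startDate=2015-12-15 falls after completionDate=2015-12-01). A sketch of both fixes together, with the proxy host taken from the report:

import os
import subprocess

env = dict(os.environ,
           http_proxy="http://proxy.truc.fr:8080",
           https_proxy="http://proxy.truc.fr:8080")
subprocess.run(["python", "./peps_download.py", "-c", "S2", "-l", "Toulouse",
                "-a", "peps.txt", "-d", "2015-12-01", "-f", "2015-12-15"],
               env=env, check=True)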

Cannot use the peps_download.py on Windows

Hi Olivier,
My colleagues and I would like to use peps_download.py on Windows, but we get errors with the parameters. On Linux it works fine, but we need to use it on Windows as well.
The parameters for which we get errors are "lat", "lon", and the dates.

Sentinel-1 download produces 0 KB zip files (probably files on tape)

Hello,

After running the tool over French Guiana with the settings listed at the end of this issue, only 137 of the 203 files found by the script were downloaded correctly. The others are indeed .zip files, but with a size of 0 KB. I have not checked all of them, but some are on tape; maybe that is the problem.

However, after the download I deleted the bad files and ran the script again without changing anything, in order to complete the download, and it seems to be working well (not yet finished).

I use Xubuntu on an OVH server.

Best regards

Examples of files that were not downloaded correctly:
S1B_IW_GRDH_1SDV_20170823T214212_20170823T214241_007073_00C75F_E440.zip
S1B_IW_GRDH_1SDV_20170730T214239_20170730T214308_006723_00BD39_5883.zip
S1A_IW_GRDH_1SDV_20170829T214323_20170829T214348_018144_01E7AE_17C8.zip
S1B_IW_GRDH_1SDV_20170811T214211_20170811T214240_006898_00C24E_21DA.zip
S1A_IW_GRDH_1SDV_20171004T214324_20171004T214349_018669_01F7D5_C9F9.zip
S1A_IW_GRDH_1SDV_20170805T214347_20170805T214412_017794_01DD18_B2BC.zip
S1A_IW_GRDH_1SDV_20171109T214255_20171109T214324_019194_0207E6_9D21.zip
S1A_IW_GRDH_1SDV_20170724T214346_20170724T214411_017619_01D7BE_DA1D.zip
S1A_IW_GRDH_1SDV_20170829T214348_20170829T214413_018144_01E7AE_B8CE.zip
S1A_IW_GRDH_1SDV_20171028T214255_20171028T214324_019019_02027F_A9F0.zip
S1A_IW_GRDH_1SDV_20170817T214347_20170817T214412_017969_01E264_4B99.zip
S1B_IW_GRDH_1SDV_20170811T214240_20170811T214308_006898_00C24E_B2AD.zip
S1A_IW_GRDH_1SDV_20170817T214253_20170817T214322_017969_01E264_30D8.zip
S1A_IW_GRDH_1SDV_20170817T214322_20170817T214347_017969_01E264_E56A.zip
S1A_IW_GRDH_1SDV_20171004T214255_20171004T214324_018669_01F7D5_F51A.zip
S1A_IW_GRDH_1SDV_20170822T215151_20170822T215216_018042_01E4A7_8E71.zip
S1B_IW_GRDH_1SDV_20170804T215040_20170804T215105_006796_00BF5A_107E.zip
S1B_IW_GRDH_1SDV_20170909T215013_20170909T215042_007321_00CE9F_4E20.zip
S1B_IW_GRDH_1SDV_20170804T215105_20170804T215131_006796_00BF5A_05A8.zip
S1A_IW_GRDH_1SDV_20170903T215126_20170903T215151_018217_01E9EE_8425.zip
S1A_IW_GRDH_1SDV_20170903T215057_20170903T215126_018217_01E9EE_DEBB.zip
S1A_IW_GRDH_1SDV_20170810T215150_20170810T215216_017867_01DF58_5144.zip
S1B_IW_GRDH_1SDV_20171108T215042_20171108T215107_008196_00E7C9_A8C6.zip
S1A_IW_GRDH_1SDV_20170810T215056_20170810T215125_017867_01DF58_2EDC.zip
S1B_IW_GRDH_1SDV_20171027T215043_20171027T215108_008021_00E2BF_9D8F.zip
S1B_IW_GRDH_1SDV_20170804T215011_20170804T215040_006796_00BF5A_D205.zip
S1B_IW_GRDH_1SDV_20171108T215013_20171108T215042_008196_00E7C9_48D6.zip
S1B_IW_GRDH_1SDV_20171027T215014_20171027T215043_008021_00E2BF_A9CA.zip
S1B_IW_GRDH_1SDV_20170816T215012_20170816T215041_006971_00C476_FA2C.zip
S1B_IW_GRDH_1SDV_20171108T215107_20171108T215133_008196_00E7C9_60B7.zip
S1B_IW_GRDH_1SDV_20171027T215108_20171027T215133_008021_00E2BF_46FE.zip
S1B_IW_GRDH_1SDV_20170816T215041_20170816T215106_006971_00C476_F284.zip
S1A_IW_GRDH_1SDH_20171023T091157_20171023T091226_018938_02000F_C7AD.zip
S1A_IW_GRDH_1SDV_20170729T215150_20170729T215215_017692_01DA02_0688.zip
S1A_IW_GRDH_1SDV_20170822T215126_20170822T215151_018042_01E4A7_D86A.zip
S1A_IW_GRDH_1SDV_20180206T215124_20180206T215149_020492_0230C6_9DB3.zip
S1A_IW_GRDH_1SDV_20170810T215125_20170810T215150_017867_01DF58_2669.zip
S1A_IW_GRDH_1SDV_20170822T215057_20170822T215126_018042_01E4A7_0F58.zip
S1B_IW_GRDH_1SDV_20170816T215106_20170816T215131_006971_00C476_A039.zip
S1B_IW_GRDH_1SDV_20170909T215042_20170909T215107_007321_00CE9F_D0F5.zip

By contrast, files that were downloaded correctly:
S1A_IW_GRDH_1SDV_20180108T214347_20180108T214412_020069_022343_B906.zip
S1A_IW_GRDH_1SDV_20180201T214346_20180201T214411_020419_022E65_EC8E.zip
S1A_IW_GRDH_1SDV_20170910T214323_20170910T214348_018319_01ED1D_E048.zip
S1A_IW_GRDH_1SDV_20171203T214254_20171203T214323_019544_0212E3_C69F.zip
S1A_IW_GRDH_1SDV_20171215T214348_20171215T214413_019719_021857_D1B7.zip
S1A_IW_GRDH_1SDV_20180120T214321_20180120T214346_020244_0228D1_BCDC.zip
S1A_IW_GRDH_1SDV_20170922T214324_20170922T214349_018494_01F27C_3A45.zip
S1A_IW_GRDH_1SDV_20171227T214347_20171227T214412_019894_021DC4_94C7.zip
S1B_IW_GRDH_1SDV_20170904T214212_20170904T214241_007248_00CC74_FFFA.zip
S1A_IW_GRDH_1SDV_20171004T214349_20171004T214414_018669_01F7D5_2D41.zip
S1B_IW_GRDH_1SDV_20171221T214241_20171221T214309_008823_00FB62_4D29.zip
S1B_IW_GRDH_1SDV_20171010T214213_20171010T214242_007773_00DBA7_FBA3.zip
S1A_IW_GRDH_1SDV_20171203T214348_20171203T214413_019544_0212E3_6EE1.zip
S1B_IW_GRDH_1SDV_20171103T214213_20171103T214242_008123_00E5A6_23E9.zip
S1B_IW_GRDH_1SDV_20180114T214240_20180114T214308_009173_0106C1_7F29.zip
S1B_IW_GRDH_1SDV_20180126T214239_20180126T214308_009348_010C7A_74EA.zip
S1A_IW_GRDH_1SDV_20180201T214252_20180201T214321_020419_022E65_B79B.zip
S1A_IW_GRDH_1SDV_20170910T214254_20170910T214323_018319_01ED1D_B866.zip
S1B_IW_GRDH_1SDV_20180102T214240_20180102T214309_008998_01010F_3507.zip
S1B_IW_GRDH_1SDV_20180126T214210_20180126T214239_009348_010C7A_176D.zip
S1A_IW_GRDH_1SDV_20171215T214323_20171215T214348_019719_021857_B538.zip
S1A_IW_GRDH_1SDV_20180108T214322_20180108T214347_020069_022343_BC97.zip
S1A_IW_GRDH_1SDV_20180201T214321_20180201T214346_020419_022E65_720B.zip
S1A_IW_GRDH_1SDV_20180120T214252_20180120T214321_020244_0228D1_5778.zip
S1B_IW_GRDH_1SDV_20171115T214242_20171115T214310_008298_00EAE7_4584.zip
S1B_IW_GRDH_1SDV_20170904T214241_20170904T214309_007248_00CC74_E835.zip
S1B_IW_GRDH_1SDV_20171010T214242_20171010T214310_007773_00DBA7_CE61.zip
S1A_IW_GRDH_1SDV_20171227T214322_20171227T214347_019894_021DC4_C914.zip
S1A_IW_GRDH_1SDV_20171016T214255_20171016T214324_018844_01FD34_46C5.zip
S1A_IW_GRDH_1SDV_20170829T214254_20170829T214323_018144_01E7AE_CAF1.zip
S1B_IW_GRDH_1SDV_20180102T214211_20180102T214240_008998_01010F_98EC.zip
S1B_IW_GRDH_1SDV_20180114T214211_20180114T214240_009173_0106C1_DAE5.zip
S1A_IW_GRDH_1SDV_20170724T214321_20170724T214346_017619_01D7BE_8558.zip
S1B_IW_GRDH_1SDV_20170916T214242_20170916T214310_007423_00D19D_B083.zip
S1B_IW_GRDH_1SDV_20170916T214213_20170916T214242_007423_00D19D_947D.zip
S1B_IW_GRDH_1SDV_20171103T214242_20171103T214311_008123_00E5A6_231C.zip
S1B_IW_GRDH_1SDV_20171115T214213_20171115T214242_008298_00EAE7_14F0.zip
S1A_IW_GRDH_1SDV_20170922T214255_20170922T214324_018494_01F27C_FE37.zip
S1A_IW_GRDH_1SDV_20171215T214254_20171215T214323_019719_021857_045C.zip
S1A_IW_GRDH_1SDV_20180120T214346_20180120T214411_020244_0228D1_EA2E.zip
S1A_IW_GRDH_1SDV_20171121T214255_20171121T214324_019369_020D6C_055A.zip
S1A_IW_GRDH_1SDV_20171203T214323_20171203T214348_019544_0212E3_CBBC.zip
S1B_IW_GRDH_1SDV_20171221T214212_20171221T214241_008823_00FB62_4A82.zip
S1A_IW_GRDH_1SDV_20180108T214253_20180108T214322_020069_022343_4EB5.zip
S1A_IW_GRDH_1SDV_20171227T214253_20171227T214322_019894_021DC4_998F.zip
S1B_IW_GRDH_1SDV_20171209T214212_20171209T214241_008648_00F5D1_0893.zip
S1B_IW_GRDH_1SDV_20171209T214241_20171209T214310_008648_00F5D1_0132.zip
S1A_IW_GRDH_1SDV_20170922T214349_20170922T214414_018494_01F27C_899C.zip
S1A_IW_GRDH_1SDV_20171121T214349_20171121T214414_019369_020D6C_B2D9.zip
S1A_IW_GRDH_1SDV_20171016T214324_20171016T214349_018844_01FD34_97C1.zip
S1A_IW_GRDH_1SDV_20171121T214324_20171121T214349_019369_020D6C_5D62.zip
S1A_IW_GRDH_1SDV_20171016T214349_20171016T214414_018844_01FD34_E101.zip
S1B_IW_GRDH_1SDV_20170928T214213_20170928T214242_007598_00D6A3_EF3C.zip
S1B_IW_GRDH_1SDV_20170928T214242_20170928T214310_007598_00D6A3_32BF.zip
S1B_IW_GRDH_1SDV_20171127T214213_20171127T214242_008473_00F044_4E1B.zip
S1B_IW_GRDH_1SDV_20171127T214242_20171127T214310_008473_00F044_2E53.zip
S1A_IW_GRDH_1SDV_20171102T215059_20171102T215128_019092_0204C2_0611.zip
S1B_IW_GRDH_1SDV_20180131T215040_20180131T215105_009421_010EE9_770D.zip
S1A_IW_GRDH_1SDV_20171208T215151_20171208T215217_019617_021535_0FDE.zip
S1B_IW_GRDH_1SDV_20170921T215013_20170921T215042_007496_00D3BE_CEA6.zip
S1B_IW_GRDH_1SDV_20180107T215041_20180107T215106_009071_010379_DF62.zip
S1B_IW_GRDH_1SDV_20171202T215107_20171202T215133_008546_00F29F_8589.zip
S1A_IW_GRDH_1SDV_20171102T215128_20171102T215153_019092_0204C2_63A9.zip
S1A_IW_GRDH_1SDV_20171021T215153_20171021T215218_018917_01FF74_4E44.zip
S1A_IW_GRDH_1SDH_20171104T091226_20171104T091254_019113_020564_3FD7.zip
S1A_IW_GRDH_1SDV_20170927T215152_20170927T215218_018567_01F4BD_2159.zip
S1B_IW_GRDH_1SDV_20171226T215106_20171226T215132_008896_00FDC6_851A.zip
S1A_IW_GRDH_1SDV_20180125T215056_20180125T215125_020317_022B30_B31E.zip
S1A_IW_GRDH_1SDH_20171104T091157_20171104T091226_019113_020564_1559.zip
S1A_IW_GRDH_1SDV_20171114T215058_20171114T215127_019267_020A38_9E74.zip
S1A_IW_GRDH_1SDV_20170915T215127_20170915T215152_018392_01EF62_520A.zip
S1A_IW_GRDH_1SDV_20171208T215057_20171208T215126_019617_021535_34E7.zip
S1B_IW_GRDH_1SDV_20171003T215014_20171003T215043_007671_00D8C2_152B.zip
S1A_IW_GRDH_1SDH_20180103T091155_20180103T091224_019988_0220C0_F543.zip
S1A_IW_GRDH_1SDH_20171128T091156_20171128T091225_019463_021062_C4FC.zip
S1A_IW_GRDH_1SDV_20171220T215057_20171220T215126_019792_021AA8_8666.zip
S1A_IW_GRDH_1SDV_20171114T215152_20171114T215218_019267_020A38_2704.zip
S1B_IW_GRDH_1SDV_20171214T215013_20171214T215042_008721_00F82C_81A4.zip
S1A_IW_GRDH_1SDV_20171126T215152_20171126T215217_019442_020FC2_BFED.zip
S1A_IW_GRDH_1SDV_20171009T215058_20171009T215127_018742_01FA0D_3F88.zip
S1B_IW_GRDH_1SDV_20171003T215043_20171003T215108_007671_00D8C2_A0D4.zip
S1A_IW_GRDH_1SDV_20171126T215058_20171126T215127_019442_020FC2_322E.zip
S1A_IW_GRDH_1SDV_20170903T215151_20170903T215217_018217_01E9EE_EF29.zip
S1A_IW_GRDH_1SDV_20171126T215127_20171126T215152_019442_020FC2_1A9F.zip
S1B_IW_GRDH_1SDV_20171120T215042_20171120T215107_008371_00ED21_2B4A.zip
S1A_IW_GRDH_1SDV_20171021T215059_20171021T215128_018917_01FF74_C7BF.zip
S1A_IW_GRDH_1SDV_20171114T215127_20171114T215152_019267_020A38_8EA3.zip
S1A_IW_GRDH_1SDH_20180103T091224_20180103T091252_019988_0220C0_129A.zip
S1B_IW_GRDH_1SDV_20170921T215107_20170921T215133_007496_00D3BE_C4E9.zip
S1A_IW_GRDH_1SDV_20171009T215152_20171009T215218_018742_01FA0D_2455.zip
S1B_IW_GRDH_1SDV_20180107T215106_20180107T215131_009071_010379_9B07.zip
S1B_IW_GRDH_1SDV_20180119T215040_20180119T215105_009246_01092B_CF04.zip
S1B_IW_GRDH_1SDV_20171120T215013_20171120T215042_008371_00ED21_F9ED.zip
S1A_IW_GRDH_1SDV_20170915T215152_20170915T215217_018392_01EF62_1A7A.zip
S1A_IW_GRDH_1SDV_20170927T215127_20170927T215152_018567_01F4BD_0444.zip
S1B_IW_GRDH_1SDV_20171202T215042_20171202T215107_008546_00F29F_C125.zip
S1B_IW_GRDH_1SDV_20171202T215013_20171202T215042_008546_00F29F_150B.zip
S1A_IW_GRDH_1SDV_20171220T215126_20171220T215151_019792_021AA8_DC55.zip
S1A_IW_GRDH_1SDH_20180115T091154_20180115T091223_020163_02264B_D99B.zip
S1A_IW_GRDH_1SDH_20180127T091154_20180127T091223_020338_022BD6_A02C.zip
S1A_IW_GRDH_1SDV_20171102T215153_20171102T215218_019092_0204C2_7A33.zip
S1B_IW_GRDH_1SDV_20170909T215107_20170909T215132_007321_00CE9F_B8B9.zip
S1A_IW_GRDH_1SDV_20180101T215150_20180101T215216_019967_02201B_3497.zip
S1B_IW_GRDH_1SDV_20171015T215043_20171015T215108_007846_00DDAE_3945.zip
S1B_IW_GRDH_1SDV_20180131T215105_20180131T215130_009421_010EE9_203D.zip
S1A_IW_GRDH_1SDV_20180113T215056_20180113T215125_020142_0225A4_7611.zip
S1B_IW_GRDH_1SDV_20180131T215011_20180131T215040_009421_010EE9_F345.zip
S1A_IW_GRDH_1SDV_20180101T215056_20180101T215125_019967_02201B_2307.zip
S1B_IW_GRDH_1SDV_20180119T215105_20180119T215131_009246_01092B_253B.zip
S1B_IW_GRDH_1SDV_20171226T215041_20171226T215106_008896_00FDC6_EB33.zip
S1B_IW_GRDH_1SDV_20170921T215042_20170921T215107_007496_00D3BE_5FC9.zip
S1B_IW_GRDH_1SDV_20171015T215108_20171015T215133_007846_00DDAE_C338.zip
S1A_IW_GRDH_1SDV_20170729T215056_20170729T215125_017692_01DA02_94C9.zip
S1A_IW_GRDH_1SDV_20171208T215126_20171208T215151_019617_021535_491B.zip
S1A_IW_GRDH_1SDV_20170729T215125_20170729T215150_017692_01DA02_286C.zip
S1B_IW_GRDH_1SDV_20171003T215108_20171003T215133_007671_00D8C2_5B55.zip
S1A_IW_GRDH_1SDV_20180125T215125_20180125T215150_020317_022B30_D4FF.zip
S1B_IW_GRDH_1SDV_20171120T215107_20171120T215133_008371_00ED21_F73E.zip
S1B_IW_GRDH_1SDV_20171015T215014_20171015T215043_007846_00DDAE_AD40.zip
S1B_IW_GRDH_1SDV_20180119T215011_20180119T215040_009246_01092B_BCCC.zip
S1A_IW_GRDH_1SDV_20180113T215150_20180113T215216_020142_0225A4_3499.zip
S1A_IW_GRDH_1SDV_20170915T215058_20170915T215127_018392_01EF62_D7B9.zip
S1A_IW_GRDH_1SDV_20171009T215127_20171009T215152_018742_01FA0D_E60E.zip
S1B_IW_GRDH_1SDV_20171226T215012_20171226T215041_008896_00FDC6_4CDC.zip
S1A_IW_GRDH_1SDV_20180113T215125_20180113T215150_020142_0225A4_99A9.zip
S1A_IW_GRDH_1SDV_20171220T215151_20171220T215216_019792_021AA8_37D3.zip
S1B_IW_GRDH_1SDV_20171214T215107_20171214T215132_008721_00F82C_1B64.zip
S1A_IW_GRDH_1SDV_20170927T215058_20170927T215127_018567_01F4BD_076C.zip
S1B_IW_GRDH_1SDV_20180107T215012_20180107T215041_009071_010379_485B.zip
S1A_IW_GRDH_1SDH_20180115T091223_20180115T091252_020163_02264B_E8A9.zip
S1A_IW_GRDH_1SDH_20180127T091223_20180127T091251_020338_022BD6_1352.zip
S1A_IW_GRDH_1SDV_20180101T215125_20180101T215150_019967_02201B_CFB3.zip
S1A_IW_GRDH_1SDH_20171023T091226_20171023T091254_018938_02000F_9EB1.zip
S1A_IW_GRDH_1SDH_20171128T091225_20171128T091254_019463_021062_178E.zip
S1A_IW_GRDH_1SDV_20180125T215150_20180125T215215_020317_022B30_5C9E.zip
S1A_IW_GRDH_1SDV_20171021T215128_20171021T215153_018917_01FF74_5BF6.zip
S1B_IW_GRDH_1SDV_20171214T215042_20171214T215107_008721_00F82C_E2EA.zip

Sentinel_Sensor = 'S1'
Product_Type = 'GRD'
Sensor_Mode = 'IW'
Polarisation = 'VV VH'
Relative_Orbit_To_Process 120 and 47
Start_Date = '2016-11-01'
End_Date = '2018-02-10'
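
A small sketch automating the manual cleanup described above: find the zero-byte archives left by tape-staged products, delete them, and re-run the downloader afterwards (the directory path is a placeholder):

import os

download_dir = "."  # the -w download directory
for name in sorted(os.listdir(download_dir)):
    path = os.path.join(download_dir, name)
    if name.endswith(".zip") and os.path.getsize(path) == 0:
        print("empty archive, needs re-download:", name)
        os.remove(path)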

Error downloading test files

Hello, I am trying to download sentinel-2 data. I got a PEPS password which works on the PEPS website. But I get the below error trying one of the suggested download calls. Any suggestions (or other alternative methods for bulk sentinel-2 data) are appreciated.

 python ./peps_download.py -c S2 --lon 1 --lat 43.5 -a peps.txt -d 2015-11-01 -f 2015-12-01
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2/search.json?lat=43.500000\&lon=1.000000\&startDate=2015-11-01\&completionDate=2015-12-01\&maxRecords=500
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 50304    0 50304    0     0  38479      0 --:--:--  0:00:01 --:--:-- 38517
S2A_OPER_PRD_MSIL1C_PDMC_20151202T224525_R008_V20151130T105641_20151130T105641 2015-11-30T10:56:41Z
S2A_OPER_PRD_MSIL1C_PDMC_20160309T091505_R051_V20151113T105818_20151113T105818 2015-11-13T10:58:18Z
S2A_OPER_PRD_MSIL1C_PDMC_20160715T112823_R051_V20151113T105818_20151113T105818 2015-11-13T10:58:18Z
./tmp.tmp
curl -o ./tmp.tmp -k -u EMAIL:PASSWORD https://peps.cnes.fr/resto/collections/S2/65001467-4472-5d22-b552-5d611ddd0346/download/?issuerId=peps
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    44  100    44    0     0     54      0 --:--:-- --:--:-- --:--:--    54
Result is a text file
{u'ErrorCode': 404, u'ErrorMessage': u'Not Found'}

Add tile ID to readme

Hi, this is a small request: could you add an example with the tile ID option to the Readme?

I had initially specified the tileid '-t' parameter wrongly and got the error:
Value for "tileid" must follow the pattern ^[0-6][0-9][A-Za-z]([A-Za-z]){0,2}%?$

It is working fine now that I am using it properly, but it might be helpful for new users to have an obvious example.

Thanks so much for sharing this! I am using python 3 and it's working great, really speeding up S2 downloads. Cheers.
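
For reference, a tile-ID invocation in the same form as the other commands on this page (31TCJ is the Toulouse tile used in several examples here):

python ./peps_download.py -c S2ST -t 31TCJ -a peps.txt -d 2017-01-01 -f 2017-02-01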

Update to continue using peps_download

Hi,

A change to the code is necessary, at line 390, to keep using peps_download.py as of today (30/01/2024):

if storage_dict[prod] == "disk":

Should be replaced by:

if storage_dict[prod] in ["disk", "tier2"]:

Error when trying to download Sentinel-2 images

I use the peps_download.py script to download Sentinel-2 images with the following command:
python peps_download.py -c S2 --lonmin -9.27 --lonmax 44.81 --latmin 34.49 --latmax 71.2 -a peps.txt -d 2016-08-02 -f 2016-08-11 -w /aphrodite02/data/ottt1/SentinelMilano2/

Today I received the following error from the script:
Result is a text file (might come from a wrong password file)
{u'ErrorCode': 420, u'ErrorMessage': u'week|10000000'}

Can someone help me with this?

Error: Value for "box" must follow pattern

Hi Olivier,

I am currently having issues with peps on a Ubuntu 18.04 machine running the following command:

$ python peps_download.py -d 2019-03-31 -f 2019-04-02 -a peps.txt --lonmax 1.8886830142 -c S1 --latmax 44.2478306835 -m IW --lonmin 0.4959285929 -n False -w . -p GRD --windows False --latmin 43.2382625528

The password file set to my credentials. I get the following output and error:

curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S1/search.json?box=0.4959285929,43.2382625528,1.8886830142,44.2478306835^&startDate=2019-03-31^&completionDate=2019-04-02^&maxRecords=500^&productType=GRD^&sensorMode=IW
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    94  100    94    0     0     24      0  0:00:03  0:00:03 --:--:--    24
Value for "box" must follow the pattern ^[0-9\.\,\-]*$

The interesting thing, though: when I run the same command on HAL, it works. Both times my interpreter is Python 2.7.12.

Did you ever encounter this error?

Kind regards,
Peter
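
A plausible explanation, judging from the "^&" separators in the printed curl command: the script applied cmd.exe-style caret escaping even though this is Ubuntu, and bash then passed the literal "^" characters into the query, which breaks the box pattern. The command passes --windows False, but an option parsed as the *string* "False" is still truthy in Python:

# Any non-empty string is truthy, including "False":
print(bool("False"))  # -> True

So a hedged fix is simply to omit --windows (and likewise -n) on Linux and let the defaults apply, rather than passing literal strings.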

404 Not found

Any help for:
`sudo python ./peps_download.py -c S2 -l 'Denmark' -a peps.txt -d 2015-06-01 -f 2015-07-31
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2/search.json?q=Denmark\&startDate=2015-06-01\&completionDate=2015-07-31\&maxRecords=500
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 39500 0 39500 0 0 38979 0 --:--:-- 0:00:01 --:--:-- 38993
S2A_OPER_PRD_MSIL1C_PDMC_20160525T004131_R108_V20150730T103016_20150730T103016 disk
S2A_OPER_PRD_MSIL1C_PDMC_20160823T203138_R108_V20150730T103016_20150730T103016 disk
S2A_OPER_PRD_MSIL1C_PDMC_20160529T082535_R051_V20150726T105024_20150726T105024 tape

Stage tape product: S2A_OPER_PRD_MSIL1C_PDMC_20160529T082535_R051_V20150726T105024_20150726T105024
##########################
3 products to download
##########################
% Total % Received % Xferd Average Speed Time Time Time Current
[two curl progress meters printed concurrently; output interleaved and unreadable]
100 44 100 44 0 0 85 0 --:--:-- --:--:-- --:--:-- 85
100 39500 0 39500 0 0 57455 0 --:--:-- --:--:-- --:--:-- 57412
S2A_OPER_PRD_MSIL1C_PDMC_20160525T004131_R108_V20150730T103016_20150730T103016 disk
S2A_OPER_PRD_MSIL1C_PDMC_20160823T203138_R108_V20150730T103016_20150730T103016 disk
S2A_OPER_PRD_MSIL1C_PDMC_20160529T082535_R051_V20150726T105024_20150726T105024 tape

Download of product : S2A_OPER_PRD_MSIL1C_PDMC_20160823T203138_R108_V20150730T103016_20150730T103016
curl -o ./tmp_1523469183.24.tmp -k -u [email protected]:secret https://peps.cnes.fr/resto/collections/S2/b06fcf53-d547-5c69-9b87-5e31851e1499/download/?issuerId=peps
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 44 100 44 0 0 68 0 --:--:-- --:--:-- --:--:-- 68
Result is a text file (might come from a wrong password file)
{u'ErrorCode': 404, u'ErrorMessage': u'Not Found'}
`

Negative latitudes

Hello,
I don't know how to download products at negative latitudes.
Is there a special syntax?

Here is my log:
oo@VM-XUBUNTU:/mnt/g_verbatim/peps_download-master$ python peps_download.py -a peps.txt -lon 167.2 --lat -22.2 -d 2019-06-17 -f 2019-09-07 -c S2ST
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2ST/search.json?q=on\&startDate=2019-06-17\&completionDate=2019-09-07\&maxRecords=500
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 942 100 942 0 0 596 0 0:00:01 0:00:01 --:--:-- 596

no product corresponds to selection criteria

Best regards,
Aurélien
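
The printed curl command holds the clue: the query contains q=on, meaning -lon (single dash) was parsed as the -l option with the value "on", so the location search became a text search for "on". Negative latitudes themselves are fine; use the double-dash form, and attach negative values with "=" so they are not mistaken for options. A corrected invocation in the same style as the original:

python peps_download.py -a peps.txt --lon=167.2 --lat=-22.2 -d 2019-06-17 -f 2019-09-07 -c S2ST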

IOError: [Errno 2] No such file or directory: 'search.json'

Hi!
I'm from Brazil.

Using the following line:
$ python peps_download.py -c S2ST -a peps.txt -t 23kks -d 2018-01-01 -f 2018-12-27 -w /home/x12239181/Documentos/sentinel/teste_python/
I got this:
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2ST/search.json?tileid=23kks\&startDate=2018-01-01\&completionDate=2018-12-27\&maxRecords=500
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:02:10 --:--:--     0
curl: (7) Failed to connect to peps.cnes.fr port 443: Connection timed out
Traceback (most recent call last):
  File "peps_download.py", line 276, in <module>
    prod, download_dict, storage_dict, size_dict = parse_catalog(options.search_json_file)
  File "peps_download.py", line 47, in parse_catalog
    with open(search_json_file) as data_file:
IOError: [Errno 2] No such file or directory: 'search.json'

My OS: Debian 9
Python: 2.7.15

Problem with searching data

I suppose the resto service for searching data in the PEPS repository has changed. I can't get any data using peps_download. I tried querying the search service directly, as below:

https://peps.cnes.fr/resto/api/collections/S2ST/search.json?box=14.0,49.0,24.0,55.0\&startDate=2018-05-01\&completionDate=2018-06-01\&maxRecords=5000

And I'm getting error:
{"ErrorMessage":"Value for "box" must follow the pattern ^[0-9\.\,\-]*$","ErrorCode":400}

Do you have any documentation of the PEPS API? Or can you confirm that something has changed, and what?

Parameter not recognized... and json.decoder.JSONDecodeError:

I got the error below when trying to run the program with Python 3.9.0 on a Windows 10 machine.
'''
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2ST/search.json?q='Toulouse'\&startDate=2017-01-01\&completionDate=2017-02-01\&maxRecords=500
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2353 0 2353 0 0 2353 0 --:--:-- 0:00:01 --:--:-- 1750
'startDate' is not recognized as an internal or external command,
operable program or batch file.
'completionDate' is not recognized as an internal or external command,
operable program or batch file.
'maxRecords' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
File "C:\Users\nguyet78\Google Drive\DB\peps_download-python3\peps_download-python3\peps_download.py", line 245, in
prod, download_dict, storage_dict = parse_catalog(options.search_json_file)
File "C:\Users\nguyet78\Google Drive\DB\peps_download-python3\peps_download-python3\peps_download.py", line 45, in parse_catalog
data = json.load(data_file)
File "C:\python39\lib\json_init_.py", line 293, in load
return loads(fp.read(),
File "C:\python39\lib\json_init_.py", line 346, in loads
return _default_decoder.decode(s)
File "C:\python39\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\python39\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 2 column 2 (char 2)
'''

wrong URL?

Hi Olivier,
when I use your script I get this:

python ./peps_download.py -c S1 -p GRD -l 'Toulouse' -a peps.txt -d 2015-11-01 -f 2015-12-01 -w /tmp
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S1/search.json?q=Toulouse\&startDate=2015-11-01\&completionDate=2015-12-01\&maxRecords=500\&productType=GRD\&sensorMode=
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 68932    0 68932    0     0   100k      0 --:--:-- --:--:-- --:--:--  101k
S1A_IW_GRDH_1SDV_20151126T174653_20151126T174718_008779_00C83F_ADCB 5fcbc1da-8e97-50a7-9228-17f10b24df55 2015-11-26T17:46:53.036Z disk
S1A_IW_GRDH_1SDV_20151125T060026_20151125T060051_008757_00C7A0_AF51 d2ba8d1f-da94-59e6-a6ce-5d29c6af0843 2015-11-25T06:00:26.343Z disk
S1A_IW_GRDH_1SDV_20151114T174712_20151114T174737_008604_00C35D_F501 3d9b5fca-0938-5614-a794-381eafd173b3 2015-11-14T17:47:12.758Z disk
S1A_IW_GRDH_1SDV_20151114T174647_20151114T174712_008604_00C35D_F136 5c97d076-0fd2-55f8-ad68-b63c8347886c 2015-11-14T17:46:47.757Z disk
S1A_IW_GRDH_1SDV_20151113T060026_20151113T060051_008582_00C2BD_DC5F 8ac17d36-5dab-5e54-afde-fc935ae6fb37 2015-11-13T06:00:26.469Z disk
S1A_IW_GRDH_1SDV_20151101T060024_20151101T060049_008407_00BE08_7776 9ae8aae0-16a5-53af-8e85-a51df810db96 2015-11-01T06:00:24.831Z disk

Download of product : S1A_IW_GRDH_1SDV_20151113T060026_20151113T060051_008582_00C2BD_DC5F
curl -o /tmp/tmp_1507277104.61.tmp -k -u [email protected]:mypasseword https://peps.cnes.fr/resto/collections/S1/8ac17d36-5dab-5e54-afde-fc935ae6fb37/download/?issuerId=peps
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    44  100    44    0     0    132      0 --:--:-- --:--:-- --:--:--   132
Result is a text file (might come from a wrong password file)
{u'ErrorCode': 404, u'ErrorMessage': u'Not Found'}

And it turns out that https://peps.cnes.fr/resto/collections/S1/8ac17d36-5dab-5e54-afde-fc935ae6fb37/download/?issuerId=peps does not exist.
Do you have any tips?
Cheers

Error running example code

Hello,
I tried to run the example code on Windows 10 and I get the following error:

(py27) C:\Users\jhammar\Anaconda3\envs\py27>python ./peps_download.py -c S2ST -l 'Toulouse' -a peps.txt -d 2017-01-01 -f 2017-02-01
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2ST/search.json?q='Toulouse'\&startDate=2017-01-01\&completionDate=2017-02-01\&maxRecords=500
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 93589 0 93589 0 0 93402 0 --:--:-- 0:00:01 --:--:-- 93402
'startDate' is not recognized as an internal or external command,
operable program or batch file.
'completionDate' is not recognized as an internal or external command,
operable program or batch file.
'maxRecords' is not recognized as an internal or external command,
operable program or batch file.
(u'S2A_MSIL1C_20201017T105041_N0209_R051_T31TCJ_20201017T130423', u'disk')
(u'S2B_MSIL1C_20201101T105209_N0209_R051_T31TCJ_20201101T120122', u'disk')
(u'S2B_MSIL1C_20201211T105349_N0209_R051_T31TCJ_20201211T130154', u'disk')
...
##########################
20 products to download
##########################
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 93590 0 93590 0 0 124k 0 --:--:-- --:--:-- --:--:-- 124k
'startDate' is not recognized as an internal or external command,
operable program or batch file.
'completionDate' is not recognized as an internal or external command,
operable program or batch file.
'maxRecords' is not recognized as an internal or external command,
operable program or batch file.
(u'S2A_MSIL1C_20201017T105041_N0209_R051_T31TCJ_20201017T130423', u'disk')
(u'S2B_MSIL1C_20201101T105209_N0209_R051_T31TCJ_20201101T120122', u'disk')
(u'S2B_MSIL1C_20201211T105349_N0209_R051_T31TCJ_20201211T130154', u'disk')
...
Traceback (most recent call last):
File "./peps_download.py", line 393, in
print("\nDownload of product : %s" % prod)
LookupError: unknown encoding: cp65001

Does anyone have an idea what the problem might be?
Very thankful for any tips :)
Best,
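
cp65001 is the Windows code page for UTF-8, which Python 2.7 cannot map to a codec. A commonly reported workaround (an assumption here, not something from this repository) is to force the I/O encoding before launching the script, e.g. set PYTHONIOENCODING=utf-8 in cmd.exe, or from Python:

import os
import subprocess

env = dict(os.environ, PYTHONIOENCODING="utf-8")
subprocess.call(["python", "./peps_download.py", "-c", "S2ST", "-l", "Toulouse",
                 "-a", "peps.txt", "-d", "2017-01-01", "-f", "2017-02-01"], env=env)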

Sentinel 1 A or B

Hi, how can I download only Sentinel-1A or only Sentinel-1B data? With the script I can only download S1 data as a whole.
Sorry for my English, and thanks.
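
One hedged approach that needs no extra catalogue option: Sentinel-1 product identifiers start with S1A_ or S1B_ (as the file lists above show), so you can search the S1 collection and keep only the platform you want before downloading:

def keep_platform(product_names, platform="S1A"):
    """Filter product identifiers by platform prefix ('S1A' or 'S1B')."""
    return [p for p in product_names if p.startswith(platform + "_")]

names = ["S1A_IW_GRDH_1SDV_20151126T174653_...", "S1B_IW_GRDH_1SDV_20170823T214212_..."]
print(keep_platform(names, "S1B"))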

Cannot download data, error or date

Hi, I'm trying to download data using

$ ./peps_download.py -l 'Chypre' -a peps.txt -d 2016-01-01 -f 2016-04-10
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S2/search.json?q=Chypre\&startDate=2016-01-01\&completionDate=2016-04-10\&maxRecords=500
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1345 100 1345 0 0 2971 0 --:--:-- --:--:-- --:--:-- 3320
Not product matches the criteria

Executing the HTTP GET, I receive

{
ErrorMessage: "Value for "startDate" must follow the pattern ^[0-9]{4}-[0-9]{2}-[0-9]{2}(T[0-9]{2}:[0-9]{2}:[0-9]{2}(.[0-9]+)?(|Z|[+-][0-9]{2}:[0-9]{2}))?$",
ErrorCode: 400
}

Has something changed in the resto service?

Thanks

Frédéric

Downloader got stuck in an infinite loop

Dear Mr. Olivier,
Thank you for developing such an effective tool for downloading Sentinel products.

Recently, I have been trying to download Sentinel-2 images covering the French Guiana area in a period from 2015 until now.
However, whenever I execute the program, it gets stuck and does not download the remaining images.
It keeps saying: "5 remaining products are on tape, lets's wait 1 minute before trying again"
Could you please help me solve this problem?

Here is my command line:
python ./peps_download.py -c S2ST -t 21NZG -a peps.txt -d 2019-01-01 -f 2020-08-21 -p S2MSI1C

All the best,
Manh
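
A bounded-retry sketch for the tape-staging wait, assuming a hypothetical helper that re-queries the catalogue; the point is to stop after a fixed number of attempts instead of looping forever when staging never completes server-side:

import time

def remaining_tape_products():
    """Hypothetical helper: re-run the search and count products whose
    storage mode is still 'tape'. Stubbed here for illustration."""
    return 0

MAX_TRIES = 30
for attempt in range(MAX_TRIES):
    remaining = remaining_tape_products()
    if remaining == 0:
        break
    print("%d products still on tape, waiting 1 minute (attempt %d/%d)"
          % (remaining, attempt + 1, MAX_TRIES))
    time.sleep(60)
else:
    raise SystemExit("tape staging did not complete; try again later")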

Bounding box of lat/lon min not working as expected

I am calling the download function as:

python ./peps_download.py -c S2 --latmin 50.366051 --latmax 51.801397 --lonmin -116.934815 --lonmax -114.331055 -a peps.txt -d 2015-10-01 -f 2016-12-22 -w /home/.....

But the resulting tiles that are downloaded do not match the extent of lat/long. See image below (green dots are the corners of the extent I requested). Any idea what may be going wrong? Is my syntax correct?

Alternatively, if I know the tiles (i.e. UPS) that I want, is it possible to just download those tiles?

Thank you.

[image: downloaded tiles plotted against the requested extent, whose corners are marked with green dots]

weird problem on peps_download

Hi,
I have been using this nice tool for a long time, though not in recent weeks; now that I am back with it, every time I run:
python peps_download.py -c S2 -a /home/atos/cron/peps.txt -w /home/atos/cron/downloads -d 02/01/2016T00:00:00 -f 02/01/2016T23:59:59 --latmin=56.3814 --lonmin=22.1911 --latmax=57.0089 --lonmax=24.1131

I got:
Traceback (most recent call last):
File "peps_download.py", line 136, in
for i in range(len(data["features"])):
KeyError: 'features'

I do not know why...
Any idea?
Thank you in advance.
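
A probable cause: the dates above are passed as 02/01/2016T00:00:00, while the catalogue expects the ISO pattern quoted in another issue on this page (YYYY-MM-DD...). A malformed date makes the server return an error document with no "features" key, hence the KeyError. A small normalisation sketch for that input format:

from datetime import datetime

def to_iso(d):
    """Convert 'DD/MM/YYYYTHH:MM:SS' (the format used above) to ISO 8601."""
    return datetime.strptime(d, "%d/%m/%YT%H:%M:%S").strftime("%Y-%m-%dT%H:%M:%S")

print(to_iso("02/01/2016T00:00:00"))  # -> 2016-01-02T00:00:00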

Search Sentinel-1 by relative orbit

Hi Olivier,

I wondered if there is an option to search Sentinel 1 images by relative orbit number or orbit direction in the code.

Kind regards,
Jesko
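
Other reports on this page do invoke a relative-orbit switch as -o (for example -o 30 and -o 51), so the option appears to exist; orbit direction does not show up in any command here. A usage example in the same style:

python ./peps_download.py -c S1 -p GRD -l 'Toulouse' -o 30 -a peps.txt -d 2015-11-01 -f 2015-12-01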

Values for "lat" must be numeric

Hi, I'm trying to use this simple script but I really don't understand the error. Any ideas on how to fix this?

python.exe .\peps\peps_download.py -a peps.txt -c S1 -p GRD --lon 0 --lat 0 -o 30 -w /Images/Radar -d 2019-06-01 -f 2019-06-24

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100    68  100    68    0     0     68      0  0:00:01 --:--:--  0:00:01   167

'lon' is not recognized as an internal or external command, operable program or batch file.
'startDate' is not recognized as an internal or external command, operable program or batch file.
'completionDate' is not recognized as an internal or external command, operable program or batch file.
'maxRecords' is not recognized as an internal or external command, operable program or batch file.
'productType' is not recognized as an internal or external command, operable program or batch file.
'sensorMode' is not recognized as an internal or external command, operable program or batch file.
curl -k -o search.json https://peps.cnes.fr/resto/api/collections/S1/search.json?lat=0.000000\&lon=0.000000\&startDate=2019-06-01\&completionDate=2019-06-24\&maxRecords=500\&productType=GRD\&sensorMode=
Value for "lat" must be numeric

top_secret special characters don't pass

I had some trouble with my PEPS password because I usually use special characters such as * and $.
I changed my password and now it works. Maybe you can fix this, because others may have the same issue.
Bastien @ La TeleScop
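
A probable cause, hedged: the password is spliced into a shell command for curl, so characters such as $ and * get expanded or globbed by the shell. One workaround that keeps credentials off the command line entirely is curl's --netrc-file option (a standard curl flag); the login, password, and PRODUCT_ID below are placeholders:

import subprocess

with open(".peps_netrc", "w") as f:
    f.write("machine peps.cnes.fr login EMAIL password PA$SW*RD\n")
url = "https://peps.cnes.fr/resto/collections/S2/PRODUCT_ID/download/?issuerId=peps"  # PRODUCT_ID is a placeholder
subprocess.run(["curl", "--netrc-file", ".peps_netrc", "-k", "-o", "out.zip", url])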

No corresponding products

Hi,

I have used the peps_downloader successfully before, but currently I cannot find any of the products I'm interested in (at the moment, S1 products for Ireland). The script returns "no product corresponds to selection criteria". I have tried loosening the criteria as much as possible to see if I can find any product at all, and I have also tried the example scripts from GitHub (including the S2 examples), but all attempts returned the same message. I suspect an issue with PEPS itself, but I wanted to double-check.

Kind regards,
Jesko

ErrorCode:404

When I run your examples or my tests the answer is always the same:

Result is a text file
{u'ErrorCode': 404, u'ErrorMessage': u'Not Found'}

Trouble in downloading (JSONDecodeError)

For the last 3 days there was no problem downloading with this command, but today I get the error below. How can I solve this problem?

Error:

File "./peps_download.py", line 308, in
prod, download_dict, storage_dict, size_dict = parse_catalog(options.search_json_file)
File "./peps_download.py", line 48, in parse_catalog
data = json.load(data_file)
File "C:\Users\palm\AppData\Local\Programs\Python\Python37\lib\json_init_.py", line 296, in load
parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
File "C:\Users\palm\AppData\Local\Programs\Python\Python37\lib\json_init_.py", line 348, in loads
return _default_decoder.decode(s)
File "C:\Users\palm\AppData\Local\Programs\Python\Python37\lib\json\decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "C:\Users\palm\AppData\Local\Programs\Python\Python37\lib\json\decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Value for "lat" must be numeric

Hello,
I tried to run the example code: python ./peps_download.py -c S2 --lon 1 --lat 43.5 -a peps.txt -d 2015-11-01 -f 2015-12-01 -o 51
but get this error: Value for "lat" must be numeric
Does anyone have an idea what might be wrong?
Best

Products on tape don't download

When downloading Sentinel-3 scenes, all the files are downloaded; however, the products that are on tape are skipped and the resulting zip files are 0 bytes. The products on disk are retrieved correctly.
No error messages to report.
