
kleok / floodpy


Flood Python Toolbox

Home Page: https://floodpy.readthedocs.io/en/latest/

License: Other

Python 0.30% Shell 0.01% Jupyter Notebook 99.70%
flood flood-monitoring floods python remote-sensing sentinel-1 surface-water synthetic-aperture-radar

floodpy's Introduction

FLOODPY - FLOOD PYthon toolbox


Introduction

The FLOod Mapping PYthon toolbox is a free and open-source Python toolbox for mapping floodwater. It exploits dense Sentinel-1 GRD intensity time series through four processing steps:

1. Selection of the Sentinel-1 images that capture the pre-flood (baseline) state and the flood state.
2. Preprocessing of the selected images to create a co-registered stack of all pre-flood and flood images.
3. Statistical temporal analysis, producing a t-score map that represents the changes due to the flood event.
4. Multi-scale iterative thresholding of the t-score map to extract the final flood map.

We believe the end-user community can benefit from exploiting FLOODPY's floodwater maps.
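As an illustration of the statistical temporal analysis step, a per-pixel t-score can be sketched with a one-sample t-statistic. This is a simplified, illustrative formulation, not FLOODPY's exact implementation:

```python
import numpy as np

def t_score_map(baseline_stack, flood_image):
    """Per-pixel t-score of the flood image against the baseline time series.

    baseline_stack: (T, H, W) array of pre-flood backscatter values
    flood_image:    (H, W) array for the flood date
    Large-magnitude scores indicate change relative to the baseline.
    """
    mean = baseline_stack.mean(axis=0)
    std = baseline_stack.std(axis=0, ddof=1)
    n = baseline_stack.shape[0]
    return (flood_image - mean) / (std / np.sqrt(n) + 1e-12)

rng = np.random.default_rng(0)
baseline = rng.normal(-10.0, 1.0, size=(12, 4, 4))  # 12 pre-flood acquisitions
flood = baseline.mean(axis=0) - 5.0                 # strong backscatter drop (open water)
scores = t_score_map(baseline, flood)
print(scores.shape)  # (4, 4)
```

A flood-induced backscatter drop yields strongly negative scores, which a thresholding step (FLOODPY's step four) can then separate from the unchanged background.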

This is research code provided to you "as is" with NO WARRANTIES OF CORRECTNESS. Use at your own risk.

1. Installation

The installation notes below have been tested only on Linux. Recommended setup: Python 3.9+ and SNAP 9.0+.

1.1 Install SNAP gpt including the Sentinel-1 toolbox

You can download SNAP manually, or install it using the following commands:

chmod +x install_snap.sh
./install_snap.sh

1.2 Account setup for downloading Sentinel-1 acquisitions

Although we provide credentials for demonstration purposes, we encourage you to create your own account to avoid problems caused by traffic limits.

1.3 Account setup for downloading global atmospheric model data

Currently, FloodPy relies on ERA-5 data, which is redistributed through the Copernicus Climate Data Store (CDS). If you do not yet have a user account, create one here. After creating your profile, you will find your user ID (UID) and personal API key on your user profile page.

  • Option 1: manually create a .cdsapirc file under your HOME directory with the following content:

url: https://cds.climate.copernicus.eu/api/v2
key: UID:personal API Key

  • Option 2: use the provided script:

chmod +x install_CDS_key.sh
./install_CDS_key.sh
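If you prefer to script the first option, the .cdsapirc file can also be written programmatically. A minimal sketch with placeholder credentials (write_cdsapirc is a hypothetical helper, not part of FLOODPY):

```python
from pathlib import Path
import tempfile

def write_cdsapirc(uid: str, api_key: str, home: Path) -> Path:
    """Write a minimal .cdsapirc for the Copernicus Climate Data Store."""
    rc = home / ".cdsapirc"
    rc.write_text(
        "url: https://cds.climate.copernicus.eu/api/v2\n"
        f"key: {uid}:{api_key}\n"
    )
    return rc

# Placeholder credentials, written to a temporary directory for safety;
# in real use, pass home=Path.home() and your own UID and API key.
with tempfile.TemporaryDirectory() as tmp:
    content = write_cdsapirc("12345", "abcd-ef01", home=Path(tmp)).read_text()
print(content.splitlines()[0])  # url: https://cds.climate.copernicus.eu/api/v2
```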

1.4 Download FLOODPY

You can download the FLOODPY toolbox using the following command:

git clone https://github.com/kleok/FLOODPY.git

1.5 Create python environment for FLOODPY

FLOODPY is written in Python 3 and relies on several Python modules. We suggest installing them using conda.

  • Using conda: create a new conda environment with the required packages using the file FLOODPY_env.yml.
conda env create -f path_to_FLOODPY/FLOODPY_env.yml

1.6 Set environmental variables (Optional)

Append the following to your .bashrc file:

export FLOODPY_HOME=path_of_the_FLOODPY_folder
export PYTHONPATH=${PYTHONPATH}:${FLOODPY_HOME}
export PATH=${PATH}:${FLOODPY_HOME}/floodpy

2. Running FLOODPY

FLOODPY generates a map of flooded regions based on Sentinel-1 GRD products and meteorological data. Sentinel-1 orbit files are downloaded using the sentineleof package. You can run FLOODPY using the provided Jupyter notebooks as templates.

3. Documentation and citation

Algorithms implemented in the software are described in detail in our publications. If FLOODPY is useful for you, we encourage you to cite the following work:

  • Karamvasis K, Karathanassi V. FLOMPY: An Open-Source Toolbox for Floodwater Mapping Using Sentinel-1 Intensity Time Series. Water. 2021; 13(21):2943. https://doi.org/10.3390/w13212943

You can also have a look at other works that use FLOODPY.

4. Contact us

Feel free to open an issue, comment, or pull request. We would love to hear your thoughts and recommendations. Any help is very welcome! ❤️

floodpy's People

Contributors

alekfal, kleok, mylonasma, olyna, palexantonakis, scottstanie


floodpy's Issues

Release of FLOMPY LPS2022 version and remove LPS2022 funcs

  • Releasing a legacy LPS2022 version. ✔️
  • Remove all the LPS2022 functionalities including:
    1. Crop_delineation sub-package and methods. ✔️
    2. Crop-delineation_unet sub-package and methods. ✔️
  • Refactoring Sentinel 2 preprocessing procedures for future flood mapping applications using Sentinel 2. ✔️ (#44)

Floodpy beginposition error

Hello, I am a third-year bachelor's student in Geographic Information Systems and I am using FLOODPY for a flood mapping study in my area. I have encountered the following error several times; please help me resolve it. Also, I don't understand the days_back part. Would you be so kind as to give me a short explanation? In the documentation it's not so clear. Below is the error message. Thank you.

[attached screenshot: error message]

Support GFS meteorological data

GFS data can be used when ERA5 data are not available. This is relevant for the analysis of ongoing flood events.

requirements update

The requirements.txt doesn't contain pytorch, torchvision and other new dependencies.

ValueError: autodetected range of [-inf, inf] is not finite

I've modified the configuration file to select my desired date and lat-long:

# B1. The datetime of flood event (Format is YYYYMMDDTHHMMSS)
Flood_datetime = 20170910T043000

# C2. AOI BBOX (WGS84)
LONMIN=81.29
LATMIN=24.61
LONMAX=81.46
LATMAX=24.74

After running the command FLOODPYapp.py FLOODPYapp_Pine.cfg --end Floodwater_classification, I get the following error. The previous steps seem to work fine. Any advice would be greatly appreciated! Note: I am using WSL (Ubuntu 22.04.1 LTS) and the default cfg file works without any problems.

******************** step - Floodwater_classification ********************
Calculating Bi/multi modality mask...
Calculating global flood mask using global threshold...
Traceback (most recent call last):
  File "/mnt/c/Users/acardaras/Documents/FLOODPY-main/floodpy/FLOODPYapp.py", line 441, in <module>
    main()
  File "/mnt/c/Users/acardaras/Documents/FLOODPY-main/floodpy/FLOODPYapp.py", line 432, in main
    app.run(steps=inps.runSteps, plot=inps.plot)
  File "/mnt/c/Users/acardaras/Documents/FLOODPY-main/floodpy/FLOODPYapp.py", line 400, in run
    self.run_get_flood_map(sname)
  File "/mnt/c/Users/acardaras/Documents/FLOODPY-main/floodpy/FLOODPYapp.py", line 351, in run_get_flood_map
    self.Flood_local_map_RG_morph,] = Calc_flood_map(Preprocessing_dir = self.Preprocessing_dir,
  File "/home/acardaras/.local/lib/python3.10/site-packages/floodpy/Floodwater_classification/Classification.py", line 97, in Calc_flood_map
    glob_thresh = threshold_Otsu(t_scores_flatten)
  File "/home/acardaras/.local/lib/python3.10/site-packages/floodpy/Floodwater_classification/Thresholding_methods.py", line 51, in threshold_Otsu
    return threshold_otsu(data.flatten())
  File "/home/acardaras/.local/lib/python3.10/site-packages/skimage/filters/thresholding.py", line 364, in threshold_otsu
    counts, bin_centers = _validate_image_histogram(image, hist, nbins)
  File "/home/acardaras/.local/lib/python3.10/site-packages/skimage/filters/thresholding.py", line 306, in _validate_image_histogram
    counts, bin_centers = histogram(
  File "/home/acardaras/.local/lib/python3.10/site-packages/skimage/_shared/utils.py", line 394, in fixed_func
    return func(*args, **kwargs)
  File "/home/acardaras/.local/lib/python3.10/site-packages/skimage/exposure/exposure.py", line 266, in histogram
    hist, bin_centers = _histogram(image, nbins, source_range, normalize)
  File "/home/acardaras/.local/lib/python3.10/site-packages/skimage/exposure/exposure.py", line 300, in _histogram
    hist, bin_edges = np.histogram(image, bins=bins, range=hist_range)
  File "<__array_function__ internals>", line 5, in histogram
  File "/usr/lib/python3/dist-packages/numpy/lib/histograms.py", line 793, in histogram
    bin_edges, uniform_bins = _get_bin_edges(a, bins, range, weights)
  File "/usr/lib/python3/dist-packages/numpy/lib/histograms.py", line 426, in _get_bin_edges
    first_edge, last_edge = _get_outer_edges(a, range)
  File "/usr/lib/python3/dist-packages/numpy/lib/histograms.py", line 323, in _get_outer_edges
    raise ValueError(
ValueError: autodetected range of [-inf, inf] is not finite
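The traceback shows the t-score array reaching threshold_otsu with no finite values, so np.histogram cannot autodetect a range. A hedged diagnostic sketch (not an official fix) is to mask non-finite values before thresholding and fail early with a clearer message:

```python
import numpy as np

def finite_values(t_scores: np.ndarray) -> np.ndarray:
    """Return only the finite t-score values; raise if nothing usable remains."""
    finite = t_scores[np.isfinite(t_scores)]
    if finite.size == 0:
        raise ValueError("No finite t-scores: check the AOI and preprocessing output.")
    return finite

data = np.array([1.0, np.inf, -np.inf, np.nan, 2.0])
vals = finite_values(data)
print(vals)  # [1. 2.]
```

If the filtered array is empty, the problem is upstream (e.g. an AOI that falls outside the processed stack), not in the thresholding itself.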

AssertionError: Houston we've got a problem. We encountered a problem in the coregistration procedure due to a small number of images before flood.

Hello again! I'm running Floodpy for a new AOI however this error is showing:

image

I have increased before_flood_days to 100 and I still get this error. I didn't get this in an earlier Floodpy version.

I'm running FLOODPYapp.py FLOODPYapp_template.cfg all at once, because with --dostep I would get an error when preprocessing Sentinel-1 data.

Any way to solve this? Thanks!

Floodpy_app.query_S1_data(): TypeError: Cannot compare tz-naive and tz-aware datetime-like objects

@kleok ,

I've opened a new ticket because the discussion in the geometry issue is continuing elsewhere and may be confusing :)

To recapitulate:
When running Floodpy_app.query_S1_data() there comes this error:
TypeError: Cannot compare tz-naive and tz-aware datetime-like objects

So, to continue the discussion from here:

The plot is OK and the .cdsapirc is OK. I've added this to the code of Query_Sentinel_1_products.py, so you can see the formats that are parsed:

print(query_df.head())
print(query_df.index)

Output:

<class 'datetime.datetime'>
<class 'datetime.datetime'>
                                   @odata.mediaContentType  \
beginningDateTime                                            
2023-07-04 05:11:11.651000+00:00  application/octet-stream   
2023-07-05 16:59:05.125000+00:00  application/octet-stream   
2023-07-09 05:19:23.871000+00:00  application/octet-stream   
2023-07-10 17:06:57.845000+00:00  application/octet-stream   
2023-07-10 17:07:22.845000+00:00  application/octet-stream   

                                                                    Id  \
beginningDateTime                                                        
2023-07-04 05:11:11.651000+00:00  a35fb411-5789-4ce5-b8d9-f468d7dd276f   
2023-07-05 16:59:05.125000+00:00  3ba49f42-05aa-4cf9-b5ad-2546104e87ab   
2023-07-09 05:19:23.871000+00:00  46dadcb1-446f-47da-ab82-a39e162294b7   
2023-07-10 17:06:57.845000+00:00  5edb8f38-9cc2-49d9-ab4b-21952e42cf1d   
2023-07-10 17:07:22.845000+00:00  22454991-db23-4216-8195-be7a9e07fbf6   

                                                                               Name  \
beginningDateTime                                                                     
2023-07-04 05:11:11.651000+00:00  S1A_IW_GRDH_1SDV_20230704T051111_20230704T0511...   
2023-07-05 16:59:05.125000+00:00  S1A_IW_GRDH_1SDV_20230705T165905_20230705T1659...   
2023-07-09 05:19:23.871000+00:00  S1A_IW_GRDH_1SDV_20230709T051923_20230709T0519...   
2023-07-10 17:06:57.845000+00:00  S1A_IW_GRDH_1SDV_20230710T170657_20230710T1707...   
2023-07-10 17:07:22.845000+00:00  S1A_IW_GRDH_1SDV_20230710T170722_20230710T1707...   

                                               ContentType  ContentLength  \
beginningDateTime                                                           
2023-07-04 05:11:11.651000+00:00  application/octet-stream     1776442517   
2023-07-05 16:59:05.125000+00:00  application/octet-stream     1772251838   
2023-07-09 05:19:23.871000+00:00  application/octet-stream     1777636660   
2023-07-10 17:06:57.845000+00:00  application/octet-stream     1765677503   
2023-07-10 17:07:22.845000+00:00  application/octet-stream     1766577336   

                                                OriginDate  \
beginningDateTime                                            
2023-07-04 05:11:11.651000+00:00  2023-07-04T06:03:21.575Z   
2023-07-05 16:59:05.125000+00:00  2023-07-05T17:49:26.467Z   
2023-07-09 05:19:23.871000+00:00  2023-07-09T06:12:38.712Z   
2023-07-10 17:06:57.845000+00:00  2023-07-10T17:58:58.919Z   
2023-07-10 17:07:22.845000+00:00  2023-07-10T16:07:56.405Z   

                                           PublicationDate  \
beginningDateTime                                            
2023-07-04 05:11:11.651000+00:00  2023-07-04T06:10:48.876Z   
2023-07-05 16:59:05.125000+00:00  2023-07-05T17:59:36.979Z   
2023-07-09 05:19:23.871000+00:00  2023-07-09T06:24:02.959Z   
2023-07-10 17:06:57.845000+00:00  2023-07-10T18:08:33.852Z   
2023-07-10 17:07:22.845000+00:00  2023-07-10T18:12:57.475Z   

                                          ModificationDate  Online  \
beginningDateTime                                                    
2023-07-04 05:11:11.651000+00:00  2023-07-04T06:11:03.504Z    True   
2023-07-05 16:59:05.125000+00:00  2023-07-05T18:04:00.551Z    True   
2023-07-09 05:19:23.871000+00:00  2023-07-09T06:30:45.535Z    True   
2023-07-10 17:06:57.845000+00:00  2023-07-10T18:08:50.294Z    True   
2023-07-10 17:07:22.845000+00:00  2023-07-10T18:13:13.899Z    True   

                                 EvictionDate  ...  \
beginningDateTime                              ...   
2023-07-04 05:11:11.651000+00:00               ...   
2023-07-05 16:59:05.125000+00:00               ...   
2023-07-09 05:19:23.871000+00:00               ...   
2023-07-10 17:06:57.845000+00:00               ...   
2023-07-10 17:07:22.845000+00:00               ...   

                                                                        ContentDate  \
beginningDateTime                                                                     
2023-07-04 05:11:11.651000+00:00  {'Start': '2023-07-04T05:11:11.651Z', 'End': '...   
2023-07-05 16:59:05.125000+00:00  {'Start': '2023-07-05T16:59:05.125Z', 'End': '...   
2023-07-09 05:19:23.871000+00:00  {'Start': '2023-07-09T05:19:23.871Z', 'End': '...   
2023-07-10 17:06:57.845000+00:00  {'Start': '2023-07-10T17:06:57.845Z', 'End': '...   
2023-07-10 17:07:22.845000+00:00  {'Start': '2023-07-10T17:07:22.845Z', 'End': '...   

                                                                          Footprint  \
beginningDateTime                                                                     
2023-07-04 05:11:11.651000+00:00  geography'SRID=4326;POLYGON ((15.885898 44.570...   
2023-07-05 16:59:05.125000+00:00  geography'SRID=4326;POLYGON ((12.004252 45.222...   
2023-07-09 05:19:23.871000+00:00  geography'SRID=4326;POLYGON ((13.87602 44.6898...   
2023-07-10 17:06:57.845000+00:00  geography'SRID=4326;POLYGON ((10.31477 43.9351...   
2023-07-10 17:07:22.845000+00:00  geography'SRID=4326;POLYGON ((9.496206 46.9349...   

                                                                       GeoFootprint  \
beginningDateTime                                                                     
2023-07-04 05:11:11.651000+00:00  {'type': 'Polygon', 'coordinates': [[[15.88589...   
2023-07-05 16:59:05.125000+00:00  {'type': 'Polygon', 'coordinates': [[[12.00425...   
2023-07-09 05:19:23.871000+00:00  {'type': 'Polygon', 'coordinates': [[[13.87602...   
2023-07-10 17:06:57.845000+00:00  {'type': 'Polygon', 'coordinates': [[[10.31477...   
2023-07-10 17:07:22.845000+00:00  {'type': 'Polygon', 'coordinates': [[[9.496206...   

                                                                         Attributes  \
beginningDateTime                                                                     
2023-07-04 05:11:11.651000+00:00  [{'@odata.type': '#OData.CSC.StringAttribute',...   
2023-07-05 16:59:05.125000+00:00  [{'@odata.type': '#OData.CSC.StringAttribute',...   
2023-07-09 05:19:23.871000+00:00  [{'@odata.type': '#OData.CSC.StringAttribute',...   
2023-07-10 17:06:57.845000+00:00  [{'@odata.type': '#OData.CSC.StringAttribute',...   
2023-07-10 17:07:22.845000+00:00  [{'@odata.type': '#OData.CSC.StringAttribute',...   

                                 sliceNumber orbitDirection  processorVersion  \
beginningDateTime                                                               
2023-07-04 05:11:11.651000+00:00          20     DESCENDING            003.61   
2023-07-05 16:59:05.125000+00:00          11      ASCENDING            003.61   
2023-07-09 05:19:23.871000+00:00          20     DESCENDING            003.61   
2023-07-10 17:06:57.845000+00:00          10      ASCENDING            003.61   
2023-07-10 17:07:22.845000+00:00          11      ASCENDING            003.61   

                                 relativeOrbitNumber platformSerialIdentifier  \
beginningDateTime                                                               
2023-07-04 05:11:11.651000+00:00                  22                        A   
2023-07-05 16:59:05.125000+00:00                  44                        A   
2023-07-09 05:19:23.871000+00:00                  95                        A   
2023-07-10 17:06:57.845000+00:00                 117                        A   
2023-07-10 17:07:22.845000+00:00                 117                        A   

                                         beginningDateTime  
beginningDateTime                                           
2023-07-04 05:11:11.651000+00:00  2023-07-04T05:11:11.651Z  
2023-07-05 16:59:05.125000+00:00  2023-07-05T16:59:05.125Z  
2023-07-09 05:19:23.871000+00:00  2023-07-09T05:19:23.871Z  
2023-07-10 17:06:57.845000+00:00  2023-07-10T17:06:57.845Z  
2023-07-10 17:07:22.845000+00:00  2023-07-10T17:07:22.845Z  

[5 rows x 22 columns]
DatetimeIndex(['2023-07-04 05:11:11.651000+00:00',
               '2023-07-05 16:59:05.125000+00:00',
               '2023-07-09 05:19:23.871000+00:00',
               '2023-07-10 17:06:57.845000+00:00',
               '2023-07-10 17:07:22.845000+00:00',
               '2023-07-16 05:11:12.614000+00:00',
               '2023-07-17 16:59:06.007000+00:00',
               '2023-07-21 05:19:24.419000+00:00',
               '2023-07-22 17:06:58.512000+00:00',
               '2023-07-22 17:07:23.511000+00:00',
               '2023-07-28 05:11:13.147000+00:00',
               '2023-07-29 16:59:06.631000+00:00',
               '2023-08-02 05:19:24.848000+00:00',
               '2023-08-03 17:06:59.681000+00:00',
               '2023-08-03 17:07:24.681000+00:00',
               '2023-08-09 05:11:13.195000+00:00',
               '2023-08-10 16:59:06.548000+00:00',
               '2023-08-14 05:19:25.607000+00:00',
               '2023-08-15 17:06:59.681000+00:00',
               '2023-08-15 17:07:24.680000+00:00',
               '2023-08-21 05:11:14.215000+00:00',
               '2023-08-22 16:59:07.780000+00:00',
               '2023-08-26 05:19:26.539000+00:00',
               '2023-08-27 17:07:00.546000+00:00',
               '2023-08-27 17:07:25.546000+00:00',
               '2023-09-02 05:11:15.257000+00:00',
               '2023-09-03 16:59:08.728000+00:00',
               '2023-09-07 05:19:26.988000+00:00',
               '2023-09-08 17:07:01.075000+00:00',
               '2023-09-08 17:07:26.075000+00:00'],
              dtype='datetime64[ns, UTC]', name='beginningDateTime', freq=None)

and it finishes with the same error:
TypeError: Cannot compare tz-naive and tz-aware datetime-like objects

In this part of the output you can see that there are two date formats: one with the timezone indication and one without. Could this be the cause?

                                         beginningDateTime  
beginningDateTime                                           
2023-07-04 05:11:11.651000+00:00  2023-07-04T05:11:11.651Z 
2023-07-05 16:59:05.125000+00:00  2023-07-05T16:59:05.125Z  
2023-07-09 05:19:23.871000+00:00  2023-07-09T05:19:23.871Z  
2023-07-10 17:06:57.845000+00:00  2023-07-10T17:06:57.845Z  
2023-07-10 17:07:22.845000+00:00  2023-07-10T17:07:22.845Z  

By the way, I use SNAP 10 (the docs say SNAP 9.0+).
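The suspected mismatch is plausible: pandas refuses to compare tz-aware and tz-naive datetimes. A minimal, self-contained sketch reproducing the error and the usual remedy (localizing the naive timestamp):

```python
import pandas as pd

idx = pd.DatetimeIndex(["2023-07-04 05:11:11+00:00"])  # tz-aware (UTC) index
naive = pd.Timestamp("2023-07-04")                     # tz-naive timestamp

try:
    _ = idx > naive  # raises: cannot compare tz-naive and tz-aware objects
    raised = False
except TypeError:
    raised = True

aware = naive.tz_localize("UTC")  # attach a timezone to make them comparable
result = (idx > aware).tolist()
print(raised, result)  # True [True]
```

Either localizing the query dates (as above) or stripping the timezone from the index with tz_localize(None) makes both sides consistent; which side FLOODPY should normalize is a design decision for the maintainers.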

Creating orbit geometries from the two-satellite Sentinel-1 era (A and B) creates MultiPolygons

In FLOODPYapp.py:

self.query_S1_df['geometry'] = self.query_S1_df.GeoFootprint.apply(create_polygon)

calls create_polygon() from geo_utils.py:

from shapely.geometry import Polygon

def create_polygon(coordinates):
    return Polygon(coordinates['coordinates'][0])

This produces a MultiPolygon for Sentinel-1 data before circa 2022.

Such a geometry raises this error:
ValueError: A LinearRing must have at least 3 coordinate tuples
after calling:
Floodpy_app.sel_S1_data(sel_flood_date)

You can try to reproduce the error with these parameters:

pre_flood_start = '20191019T000000'
pre_flood_end = '20191027T000000'
flood_start = '20191115T000000'
flood_end = '20191118T00000'

# AOI BBOX (WGS84)
LONMIN = 12.42
LATMIN = 45.43
LONMAX = 13.11
LATMAX = 45.86
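A defensive variant could build the geometry with shapely.geometry.shape, which handles both Polygon and MultiPolygon GeoJSON instead of assuming coordinates[0] is an exterior ring. This is an illustrative sketch (create_geometry is a hypothetical replacement, not necessarily the upstream fix):

```python
from shapely.geometry import shape

def create_geometry(geojson_geom: dict):
    """Build a shapely geometry from a GeoJSON dict, whatever its type."""
    return shape(geojson_geom)

poly = create_geometry({"type": "Polygon",
                        "coordinates": [[(0, 0), (1, 0), (1, 1), (0, 0)]]})
multi = create_geometry({"type": "MultiPolygon",
                         "coordinates": [[[(0, 0), (1, 0), (1, 1), (0, 0)]],
                                         [[(2, 2), (3, 2), (3, 3), (2, 2)]]]})
print(poly.geom_type, multi.geom_type)  # Polygon MultiPolygon
```

With shape(), a footprint spanning the antimeridian or stitched from two satellites still yields a valid geometry, and any downstream intersection test works on both types.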

Stuck in Floodpy_app.create_S1_stack(overwrite=False) and missing LAI error

I'm trying to run the example notebooks, but I can't get any further than the pre-processing. When running Floodpy_app.create_S1_stack(overwrite=False), I get this message under the cell in Jupyter:

Coregistrating the primary image (2 GRDs): 
 S1A_IW_GRDH_1SDV_20230906T043947_20230906T044012_050202_060AD8_440E.SAFE.zip 
 S1A_IW_GRDH_1SDV_20230906T044012_20230906T044037_050202_060AD8_49FE.SAFE.zip
with secondary image (2 GRDs): 
 S1A_IW_GRDH_1SDV_20230708T043943_20230708T044008_049327_05EE7E_A3D4.SAFE.zip 
 S1A_IW_GRDH_1SDV_20230708T044008_20230708T044033_049327_05EE7E_C11B.SAFE.zip

It seems to be stuck there, and letting it run for ~2 hours doesn't result in any progress. Looking at htop, it's running gpt from SNAP with lots of cores active. I've only changed paths and credentials at the top, with no further changes to the notebook. I've tried SNAP 9.0 and 10.0, running Ubuntu 22.04 on an AMD 5950x with 128 GB RAM.

Then I tried a different region, by changing the location to:

LONMIN = 9.16
LATMIN = 56.08
LONMAX = 9.96
LATMAX = 56.50

I kept the flood and pre-flood start/end dates the same, and used sel_flood_date = '2023-09-08T17:07:51.075000000'. This instead gave me the following error in Floodpy_app.create_S1_stack(overwrite=False):

---------------------------------------------------------------------------
NoDataInBounds                            Traceback (most recent call last)
Cell In[13], line 1
----> 1 Floodpy_app.create_S1_stack(overwrite=False)

File ~/Downloads/FLOODPY/floodpy/FLOODPYapp.py:168, in FloodwaterEstimation.create_S1_stack(self, overwrite)
    165 self.DEM_filename = os.path.join(self.Preprocessing_dir,'DEM.nc')
    166 self.LIA_filename = os.path.join(self.Preprocessing_dir,'LIA.nc')
--> 168 Run_Preprocessing(self, overwrite = overwrite)

File ~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/Preprocessing_S1_data.py:94, in Run_Preprocessing(Floodpy_app, overwrite)
     89     print("Please use a smaller AOI.")
     91 if not os.path.exists(flood_xarray_outfile):
     92 
     93     # create xarray for flood (primary) image
---> 94     create_coreg_xarray(netcdf4_out_filename = flood_xarray_outfile,
     95                         snap_hdf5_in_filename = flood_hdf5_outfile+ '.h5',
     96                         geojson_bbox = Floodpy_app.geojson_bbox,
     97                         ref_tif_file = Floodpy_app.lc_mosaic_filename,
     98                         primary_image = True,
     99                         delete_hdf5 = True)
    101 for Pre_flood_date in Pre_flood_dates:
    102     S1_pre_flood_rows = Floodpy_app.query_S1_sel_df.loc[pd.to_datetime(Pre_flood_date): pd.to_datetime(Pre_flood_date) + pd.Timedelta(hours=24)]

File ~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/xarray_funcs.py:162, in create_coreg_xarray(netcdf4_out_filename, snap_hdf5_in_filename, geojson_bbox, ref_tif_file, primary_image, delete_hdf5)
    159 LIA = S1_file['localIncidenceAngle'][:]
    160 LIA[LIA==0] = np.nan
--> 162 save_LIA_xarray(LIA = LIA,
    163                 S1_lon_vector = S1_lon_vector,
    164                 S1_lat_vector = S1_lat_vector,
    165                 geojson_bbox = geojson_bbox,
    166                 ref_xarray = ref_xarray,
    167                 LIA_xarray_outfile = os.path.join(out_dir,'LIA.nc'))
    169 save_DEM_xarray(DEM = Elevation,
    170                 S1_lon_vector = S1_lon_vector,
    171                 S1_lat_vector = S1_lat_vector,
    172                 geojson_bbox = geojson_bbox,
    173                 ref_xarray = ref_xarray,
    174                 DEM_xarray_outfile = os.path.join(out_dir,'DEM.nc'))
    176 np.save(os.path.join(out_dir,'S1_lat_vector.npy'), S1_lat_vector)

File ~/Downloads/FLOODPY/floodpy/Preprocessing_S1_data/xarray_funcs.py:74, in save_LIA_xarray(LIA, S1_lon_vector, S1_lat_vector, geojson_bbox, ref_xarray, LIA_xarray_outfile)
     70 df.y.attrs['axis'] = 'Y'
     72 df.rio.write_crs("epsg:4326", inplace=True)
---> 74 df_coreg = df.rio.clip(gpd.read_file(geojson_bbox)['geometry'])
     76 S1_coreg = df_coreg.rio.reproject_match(ref_xarray)
     77 S1_coreg.to_netcdf(LIA_xarray_outfile, format='NETCDF4')

File ~/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_dataset.py:380, in RasterDataset.clip(self, geometries, crs, all_touched, drop, invert, from_disk)
    377 try:
    378     x_dim, y_dim = _get_spatial_dims(self._obj, var)
    379     clipped_dataset[var] = (
--> 380         self._obj[var]
    381         .rio.set_spatial_dims(x_dim=x_dim, y_dim=y_dim, inplace=True)
    382         .rio.clip(
    383             geometries,
    384             crs=crs,
    385             all_touched=all_touched,
    386             drop=drop,
    387             invert=invert,
    388             from_disk=from_disk,
    389         )
    390     )
    391 except MissingSpatialDimensionError:
    392     if len(self._obj[var].dims) >= 2 and not get_option(
    393         SKIP_MISSING_SPATIAL_DIMS
    394     ):

File ~/anaconda3/envs/floodpy_gpu/lib/python3.8/site-packages/rioxarray/raster_array.py:943, in RasterArray.clip(self, geometries, crs, all_touched, drop, invert, from_disk)
    931     cropped_ds = _clip_xarray(
    932         self._obj,
    933         geometries=geometries,
   (...)
    936         invert=invert,
    937     )
    939 if (
    940     cropped_ds.coords[self.x_dim].size < 1
    941     or cropped_ds.coords[self.y_dim].size < 1
    942 ):
--> 943     raise NoDataInBounds(
    944         f"No data found in bounds.{_get_data_var_message(self._obj)}"
    945     )
    947 # make sure correct attributes preserved & projection added
    948 _add_attrs_proj(cropped_ds, self._obj)

NoDataInBounds: No data found in bounds. Data variable: LIA

I tried a couple of different regions too, but got the exact same error.

Has anyone encountered similar issues? Any ideas on how to fix them? 🙂

subprocess.CalledProcessError: Command [snap/bin/gpt]

I occasionally encounter an error related to the gpt subprocess on different AOIs. I've attached an image of the error message and provided an example .cfg modification that causes this error on my machine.

# B1. The datetime of flood event (Format is YYYYMMDDTHHMMSS)
Flood_datetime = 20170910T043000

# B2. Days before flood event for baseline stack construction
before_flood_days = 60

# B3. Days after flood event
after_flood_days = 10

# C2. AOI BBOX (WGS84)
LONMIN=-81.46
LATMIN=24.61
LONMAX=-81.29
LATMAX=24.74

[attached screenshot: error message]

IndexError: cannot do a non-empty take from an empty axes

I have been trying to figure out the error I keep getting on the final step, Floodwater_classification, which has to do with the "non-empty take from an empty axes" error. I've looked at the numpy function that raises it, and still can't figure out where it's coming from within the Flompy codebase.

Please find attached the error I keep getting. Any suggestions on how to go around this will be helpful.
[attached screenshot: index error]

assert (os.path.exists(Flood_image)) AssertionError

Good day Kleanthis,

You have been of invaluable assistance to myself and my colleague Nicolene Botha. After your last update we were able to obtain the flood map image for 5 February 2022. We are still going to look at the relOrbit variable as you suggested.

However, with the changes you made I can no longer get the flood map image for 9 December 2019, which I was able to get prior to the latest update. This is also the case for several other dates. It gives an assertion error and it appears that the file indicated in the flood_S1_filename.csv as the main image is not downloaded, hence the error.

I attach the error message, map.geojson and configuration file. In the configuration file you will see all the other dates of interest that are all giving the same assertion error.
FLOMPYapp.zip

Thanks again for your prompt assistance with every issue we have encountered. It is great when a developer is still actively maintaining their code.

Kind regards
Natasha

Issue Installing snap.sh

Hello!

I found this toolkit on LinkedIn and I find it super useful.

I'm trying to follow the instructions to do all the installation steps; however, I'm a Windows user, so I'm using WSL to get around that. When installing snap.sh, though, the installation never stops, as shown in the following image. Do you have any idea how long it should take? It has been cycling through processes for about 2 hours.

[attached screenshot: installation output]

If I have SNAP on Windows, do I have to install it again through Linux?

Thank you!

EDIT1: I'm trying to do all the steps, but when I reach this point I can't download the precipitation data for my AOI: without python3 it says the command is not found, and with python3 it says the module is not found. Any help?

[attached screenshot: command output]

Support generation of multiple flood maps

Multiple flood map functionality

  • Enable the user to select a particular time range
  • Identify the times and dates for which Floodpy can provide flood maps
  • Download all data that should be processed in order to generate the flood maps
  • Generate the different flood maps
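The first two bullets could be sketched with a small helper (candidate_flood_dates is a hypothetical name, not Floodpy's API): given the index of available Sentinel-1 acquisitions, select those inside the user's time range, with one candidate flood map per acquisition.

```python
import pandas as pd

def candidate_flood_dates(acquisitions: pd.DatetimeIndex,
                          start: str, end: str) -> list:
    """Acquisition dates inside the user-selected time range."""
    mask = (acquisitions >= pd.Timestamp(start, tz="UTC")) & \
           (acquisitions <= pd.Timestamp(end, tz="UTC"))
    return list(acquisitions[mask])

# Assumed example acquisition dates (12-day repeat), UTC:
acqs = pd.DatetimeIndex(["2023-07-04", "2023-07-16", "2023-07-28"], tz="UTC")
dates = candidate_flood_dates(acqs, "2023-07-10", "2023-07-30")
print(len(dates))  # 2
```

Each returned date would then drive one download-process-classify cycle to produce its flood map.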

ValueError: cannot convert float NaN to integer

FLOMPY runs well until it gets to step 4 ('Statistical_analysis'). I get the following error, which seems to come from converting NaN values to integers at line 109 of Generate_aux.py (i.e. utm_zone = int(np.floor((representative_longitude + 180) / 6) + 1)). Please see the error below, and the attached image of the config file.

******************** step - Statistical_analysis ********************
/home/lesetl/anaconda3/envs/flompy/lib/python3.10/site-packages/numpy/core/fromnumeric.py:3474: RuntimeWarning: Mean of empty slice.
return _methods._mean(a, axis=axis, dtype=dtype,
/home/lesetl/anaconda3/envs/flompy/lib/python3.10/site-packages/numpy/core/_methods.py:189: RuntimeWarning: invalid value encountered in double_scalars
ret = ret.dtype.type(ret / rcount)
Traceback (most recent call last):
File "/home/lesetl/testbed/FLOMPY/flompy/FLOMPYapp.py", line 386, in <module>
main()
File "/home/lesetl/testbed/FLOMPY/flompy/FLOMPYapp.py", line 377, in main
app.run(steps=inps.runSteps, plot=inps.plot)
File "/home/lesetl/testbed/FLOMPY/flompy/FLOMPYapp.py", line 348, in run
self.run_multitemporal_statistics(sname)
File "/home/lesetl/testbed/FLOMPY/flompy/FLOMPYapp.py", line 311, in run_multitemporal_statistics
get_S1_aux (self.Preprocessing_dir)
File "/home/lesetl/testbed/FLOMPY/flompy/Generate_aux.py", line 146, in get_S1_aux
UTM_CRS_EPSG = WGS84_to_UTM(lon_list, lat_list)
File "/home/lesetl/testbed/FLOMPY/flompy/Generate_aux.py", line 109, in WGS84_to_UTM
utm_zone = int(np.floor((representative_longitude + 180) / 6) + 1)
ValueError: cannot convert float NaN to integer
(flompy) -bash-4.2$ python FLOMPYapp.py FLOMPYapp_template.cfg
FLOod Mapping PYthon toolbox (FLOMPY) v.1.0
Copyright (c) 2021-2022 Kleanthis Karamvasis, [email protected]
Remote Sensing Laboratory of National Technical University of Athens

conf_file
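
For reference, the failing formula from `Generate_aux.py` can be reproduced with a guard that turns the cryptic NaN-to-int crash into a readable error; the guard and function name below are my own suggestion, not FLOODPY code:

```python
import numpy as np

def wgs84_to_utm_zone(representative_longitude):
    """UTM zone from a WGS84 longitude, as in Generate_aux.py line 109,
    but raising a clear error instead of 'cannot convert float NaN to integer'."""
    if not np.isfinite(representative_longitude):
        # NaN here usually means no valid coordinates were read from
        # the preprocessed Sentinel-1 stack
        raise ValueError("Representative longitude is NaN: "
                         "check that the preprocessing step produced output.")
    return int(np.floor((representative_longitude + 180) / 6) + 1)

wgs84_to_utm_zone(-87.3)  # → 16
```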

No Sentinel-1 data downloaded post Flood_datetime.

I have made the following modifications to my .cfg file and have noticed that the Sentinel_1_imagery directory only contains data prior (Aug 18 - Sept 15) to my specified Flood_datetime (Sept 16). Is this the expected behavior, and is there any way to ensure imagery is downloaded on or after the specified Flood_datetime?

# B1. The datetime of flood event (Format is YYYYMMDDTHHMMSS)
Flood_datetime = 20200916T050000

# B2. Days before flood event for baseline stack construction
before_flood_days = 40

# B3. Days after flood event
after_flood_days = 3

# C2. AOI BBOX (WGS84)
LONMIN=-87.33
LATMIN=30.34
LONMAX=-87.26
LATMAX=30.40
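
For what it's worth, the search window these parameters should produce can be computed directly (a sketch of the windowing logic as I understand it from the config comments, not FLOODPY's actual code):

```python
from datetime import datetime, timedelta

# Parameters copied from the config above
flood_dt = datetime.strptime("20200916T050000", "%Y%m%dT%H%M%S")
start = flood_dt - timedelta(days=40)   # before_flood_days
end = flood_dt + timedelta(days=3)      # after_flood_days

# Any acquisition with start <= t <= end should be queried, so imagery
# up to 2020-09-19 would be expected, not only pre-flood data.
print(start.date(), end.date())
```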

Extract the water extent before flood event

The pre-flood water extent will be extracted by Sentinel-1 imagery and Sentinel-2 images before the flood.

  • For Sentinel-1 acquisitions (@kleok)
      • We select only recent (last month) Sentinel-1 acquisitions that are not related to heavy precipitation events.
      • We calculate the median backscatter coefficient over time.
      • We download a third-party water-extent dataset (e.g. JRC).
      • We create a buffer zone around the water extent.
      • We calculate a threshold (e.g. Otsu) and apply it to the median backscatter coefficient.
  • For Sentinel-2 acquisitions (@alekfal)
      • We select only recent (last month) Sentinel-2 acquisitions with low cloud coverage (< 10%).
      • We calculate MNDWI and apply cloud masks.
      • We calculate the median MNDWI values over time.
      • We calculate a suitable MNDWI threshold within the buffer zone. We can have a look here.
      • We apply the threshold to the median MNDWI values.
  • Combination of Sentinel-1 and Sentinel-2 water masks (@kleok, @alekfal)
      • The final map is calculated as the union of the Sentinel-1 and Sentinel-2 water masks.
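
The Otsu-thresholding step above might look roughly like this minimal sketch applied to a fake median backscatter image; `otsu_threshold` is a bare-bones illustration, not FLOODPY's multi-scale iterative algorithm:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the threshold that maximizes between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist.cumsum()                 # cumulative count below each bin
    total = w[-1]
    mu = (hist * centers).cumsum()    # cumulative intensity sum
    # between-class variance for every candidate split (0/0 -> NaN at ends)
    with np.errstate(divide="ignore", invalid="ignore"):
        var = (mu[-1] * w - mu * total) ** 2 / (w * (total - w))
    return centers[np.nanargmax(var)]

rng = np.random.default_rng(0)
# fake median backscatter (dB): water around -22 dB, land around -10 dB
median_db = np.concatenate([rng.normal(-22, 1, 500), rng.normal(-10, 1, 500)])
water_mask = median_db < otsu_threshold(median_db)
```

Applying the same threshold inside the buffered water-extent zone would give the pre-flood water mask.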

ValueError: '20220128T164719' is not in list

Good day,

I tried changing the relOrbit value to 50 and 116, as you suggested to my colleague in issue #1. Both resulted in a ValueError for a date, albeit different ones: for relOrbit = 50 the error was for "20211213T033455", and for relOrbit = 116 it was for "20220128T164719".
Screenshot from 2022-06-20 19-54-10

Thanks in advance!
Natasha

Issue in Preprocessing_S1_data.py

Hi Team,
I am running FLOODPY for an event that happened in Solingen, Germany, but I am getting the below error in Preprocessing_S1_data.py.

File "C:\Users\Pmathur\Anaconda3\envs\floodpy\lib\site-packages\floodpy\Preprocessing_S1_data\Preprocessing_S1_data.py", line 528, in Run_Preprocessing
Subset_AOI,

UnboundLocalError: local variable 'Subset_AOI' referenced before assignment

It might be happening because two adjacent GRD tiles are used for the pre-event dates (e.g. rows 2 and 3 in the table below), while the flood-event date is covered by only one GRD tile:
# | ERA5_tp_mm | baseline | S1_GRD | Datetime
1 | 23.0889534 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1A_IW_GRDH_1SDV_20210609T055025_20210609T055050_038259_0483CC_2C8D.zip | 6/9/2021 5:50
2 | 23.73645536 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210615T054935_20210615T055000_027363_0344A0_DAC1.zip | 6/15/2021 5:49
3 | 23.73645536 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210615T055000_20210615T055025_027363_0344A0_13F8.zip | 6/15/2021 5:50
4 | 22.57493026 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1A_IW_GRDH_1SDV_20210621T055025_20210621T055050_038434_048908_F67D.zip | 6/21/2021 5:50
5 | 37.8658546 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210627T054935_20210627T055000_027538_03498A_B833.zip | 6/27/2021 5:49
6 | 37.8658546 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210627T055000_20210627T055025_027538_03498A_86D9.zip | 6/27/2021 5:50
7 | 47.67774148 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1A_IW_GRDH_1SDV_20210703T055026_20210703T055051_038609_048E45_DD91.zip | 7/3/2021 5:50
8 | 71.55205072 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210709T054936_20210709T055001_027713_034EA7_4B41.zip | 7/9/2021 5:49
9 | 71.60350636 | TRUE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1B_IW_GRDH_1SDV_20210709T055001_20210709T055026_027713_034EA7_EA87.zip | 7/9/2021 5:50
10 | 164.908054 | FALSE | E:\Vishal_2022\FloodEventResponse\SOLINGEN\Sentinel_1_imagery\S1A_IW_GRDH_1SDV_20210715T055027_20210715T055052_038784_049389_5004.zip | 7/15/2021 5:50

PARAMETERS USED:
#A1. The name of your project without special characters.
Projectname = 'Solingen'

#A2. The location that everything is going to be saved. Make sure
projectfolder = r'E:\Vishal_2022\FloodEventResponse\SOLINGEN'

#A3. The location of floodpy code
src_dir = r'C:\Users\Pmathur\FLOODPY-main\floodpy'

#A4. SNAP ORBIT DIRECTORY
snap_dir = r'C:\Users\Pmathur.snap\auxdata\Orbits\Sentinel-1'

#A5. SNAP GPT full path
GPTBIN_PATH = r"C:\Program Files\snap\bin\gpt.exe"
Flood_datetime = '20210714T000000'#20200921T030000'

before_flood_days = 40

after_flood_days = 3

AOI_File = "None"
LONMIN=6.5
LATMIN=51.1
LONMAX=7.2
LATMAX=51.3

days_back = 12

accumulated_precipitation_threshold = 120

#E1. The number of Sentinel-1 relative orbit. The default
S1_type = 'GRD'
relOrbit = 'Auto'

#E3. The minimum mapping unit area in square meters
minimum_mapping_unit_area_m2=4000

#E4. Computing resources to employ
CPU=8
RAM='20G'

Please have a look and let me know how we can run it for the above parameters.
Thanks
Vishal
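
An `UnboundLocalError` like this usually means no branch assigned `Subset_AOI` before it was used. A defensive pattern (my suggestion, not the actual FLOODPY fix) is to initialize the variable and fail with a clear message; `pick_subset` and its inputs below are hypothetical:

```python
def pick_subset(candidates):
    """Illustrates guarding against UnboundLocalError: initialize the
    result before the conditional branches that may (or may not) set it."""
    Subset_AOI = None
    for c in candidates:
        if c.get("intersects_aoi"):
            Subset_AOI = c
            break
    if Subset_AOI is None:
        raise RuntimeError(
            "No image intersects the AOI; check relOrbit/AOI settings "
            "and whether the flood date has full tile coverage.")
    return Subset_AOI

tiles = [{"id": "t1", "intersects_aoi": False},
         {"id": "t2", "intersects_aoi": True}]
pick_subset(tiles)["id"]  # → "t2"
```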

'NoneType' object is not subscriptable' when running Run_preprocessing_S1_data

Hello! I downloaded the latest version of the repo and started a new project to test a flood event that occurred in Portugal.

However, running Run_preprocessing_S1_data gives me a "NoneType" error, as follows:


TypeError Traceback (most recent call last)
Input In [13], in <cell line: 1>()
----> 1 app.run_preprocessing_S1_data('Preprocessing_S1_data')

File ~/FLOODPY/floodpy/FLOODPYapp.py:317, in FloodwaterEstimation.run_preprocessing_S1_data(self, step_name)
315 def run_preprocessing_S1_data(self, step_name):
--> 317 Get_images_for_baseline_stack(projectfolder = self.projectfolder,
318 S1_dir = self.S1_dir,
319 Precipitation_data = self.precipitation_df,
320 flood_datetime = self.flood_datetime,
321 days_back = self.days_back,
322 rain_thres=self.rain_thres)
324 Run_Preprocessing(projectfolder = self.projectfolder,
325 gpt_exe = self.gptcommand,
326 graph_dir = self.graph_dir,
327 S1_dir = self.S1_dir,
328 geojson_S1 = self.geojson_S1,
329 Preprocessing_dir = self.Preprocessing_dir)
331 return 0

File ~/FLOODPY/floodpy/Preprocessing_S1_data/Classify_S1_images.py:22, in Get_images_for_baseline_stack(projectfolder, S1_dir, Precipitation_data, flood_datetime, days_back, rain_thres)
10 def Get_images_for_baseline_stack(projectfolder,
11 S1_dir,
12 Precipitation_data,
13 flood_datetime,
14 days_back=5,
15 rain_thres=20):
16 '''
17 Creates a pandas DataFrame of Sentinel-1 acquisitions. Creates a column with
18 boolean values with name 'baseline'. If True the particular acquisition
19 can be used for calculation of baseline stack
20
21 '''
---> 22 Precipitation_data.index = Precipitation_data['Datetime']
23 Precipitation_data.drop(columns = ['Datetime'], inplace=True)
25 # calculate cumulative rain over the past days (given) for each date

TypeError: 'NoneType' object is not subscriptable

image:
image

previous step:

image
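
The traceback shows `self.precipitation_df` is `None`, i.e. the precipitation-download step never produced a DataFrame. A guard like the following (my suggestion; `check_precipitation` is hypothetical) would surface the real cause earlier than the `'NoneType' object is not subscriptable` crash:

```python
import pandas as pd

def check_precipitation(df):
    """Fail early with a clear message instead of crashing later
    inside Get_images_for_baseline_stack."""
    if df is None or "Datetime" not in df:
        raise RuntimeError(
            "Precipitation data is missing; rerun the precipitation "
            "download step and check your API credentials.")
    # same reindexing the real code performs on the 'Datetime' column
    return df.set_index("Datetime")

good = pd.DataFrame({"Datetime": pd.to_datetime(["2022-01-01"]),
                     "ERA5_tp_mm": [1.2]})
check_precipitation(good)  # works; check_precipitation(None) raises
```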
