mapme-initiative / mapme.forest
Home Page: https://mapme-initiative.github.io/mapme.forest
License: GNU General Public License v3.0
Hi @goergen95. I think we can already deactivate this repo and link to the new package, or what do you think? The only thing here that is not in the new package yet is the fragstats functionality, but I guess that alone would not justify active maintenance, would it?
Hi @goergen95. The package's download function seems to be broken.
library(raster)
library(igraph)
library(tibble)
library(sf)
library(magrittr)
library(dplyr)
library(tidyr)
library(ggplot2)
library(stringr)
library(mapme.forest)
# read in polygons of interest
aoi = st_read(system.file("extdata", "aoi_polys.gpkg", package = "mapme.forest"))
# download GFW data for the area of interest
raster_files = downloadfGFW(shape = aoi,
                            basename = "pkgTest",
                            dataset = "GFC-2018-v1.6",
                            outdir = "../data/",
                            keepTmpFiles = TRUE,
                            .tmpdir = "../data/tmp")
gives this error message:
trying URL 'https://storage.googleapis.com/earthenginepartners-hansen/GFC-2018-v1.6/Hansen_GFC-2018-v1.6_treecover2000_20N_100E.tif'
Content type 'application/octet-stream' length 518834343 bytes (494.8 MB)
==================================================
downloaded 494.8 MB
trying URL 'https://storage.googleapis.com/earthenginepartners-hansen/GFC-2018-v1.6/Hansen_GFC-2018-v1.6_lossyear_20N_100E.tif'
Content type 'image/tiff' length 86248019 bytes (82.3 MB)
==================================================
downloaded 82.3 MB
trying URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif'
Error in download.file(urls[i], localname) :
cannot open URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif'
In addition: Warning message:
In download.file(urls[i], localname) :
cannot open URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif': HTTP status was '403 Forbidden'
Any ideas for a quick fix?
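For reference, a quick way to check whether the problem is the CO2 URL itself rather than the download code is to query the HTTP status of the S3 link directly; a minimal sketch, assuming the httr package is installed (it is not a dependency of mapme.forest):
library(httr)
co2_url <- "http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif"
# a HEAD request fetches only the response headers, not the ~80 MB file
resp <- HEAD(co2_url)
status_code(resp)  # a 403 here would confirm that the bucket rejects the request itself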
Hi @goergen95.
Did you accidentally overwrite something again with the CO2 layer download functions? I am currently unable to download GFW data, so I cannot test #5 .
devtools::install_github("mapme-initiative/mapme.forest")
library(sf)
library(mapme.forest)
## TEST DOWNLOAD
# read in polygons of interest
aoi = st_read(system.file("extdata", "aoi_polys.gpkg", package = "mapme.forest"))
# download GFW data for the area of interest
raster_files = downloadfGFW(shape = aoi,
                            basename = "pkgTest",
                            dataset = "GFC-2019-v1.7",
                            outdir = "shared/xyz/data/",
                            keepTmpFiles = TRUE)
gives me:
trying URL 'https://storage.googleapis.com/earthenginepartners-hansen/GFC-2019-v1.7/Hansen_GFC-2019-v1.7_treecover2000_20N_100E.tif'
Content type 'application/octet-stream' length 518834343 bytes (494.8 MB)
==================================================
downloaded 494.8 MB
trying URL 'https://storage.googleapis.com/earthenginepartners-hansen/GFC-2019-v1.7/Hansen_GFC-2019-v1.7_lossyear_20N_100E.tif'
Content type 'image/tiff' length 91984705 bytes (87.7 MB)
==================================================
downloaded 87.7 MB
trying URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif'
Error in download.file(urls[i], localname) :
cannot open URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif'
In addition: Warning messages:
1: In if (!is.na(co2tiles)) { :
the condition has length > 1 and only the first element will be used
2: In download.file(urls[i], localname) :
cannot open URL 'http://gfw2-data.s3.amazonaws.com/climate/Hansen_emissions/2018_loss/per_pixel/20N_100E_tCO2_pixel_AGB_masked_by_loss.tif': HTTP status was '403 Forbidden'
So I guess it correctly downloads the treecover and lossyear layers from Google, but then again fails with the Amazon S3 download.
Also, I noticed that the function is now called downloadfGFW instead of downloadGFW. Not sure if that is on purpose.
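As a side note, which download function(s) the installed version actually exports can be checked directly; a minimal sketch:
# list everything exported by the installed package
getNamespaceExports("mapme.forest")
# or check the two candidate names explicitly (after library(mapme.forest))
exists("downloadGFW")
exists("downloadfGFW")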
Hi @goergen95.
Me again.
It seems like the GRASS functions are currently not working as expected.
devtools::install_github("mapme-initiative/mapme.forest")
library(sf)
library(mapme.forest)
library(raster)
## TEST DOWNLOAD
# read in polygons of interest
aoi = st_read(system.file("extdata", "aoi_polys.gpkg", package = "mapme.forest"))
# download GFW data for the area of interest
raster_files = downloadGFW(shape = aoi,
                           basename = "pkgTest",
                           dataset = "GFC-2020-v1.8",
                           outdir = "../../johannes/data/",
                           keepTmpFiles = TRUE)
# note: in case the projection between the aoi and the downloaded rasters differs, the aoi needs to be reprojected first
treeCover = "../../johannes/data/pkgTest_treecover2000.tif"
lossYear = "../../johannes/data/pkgTest_lossyear.tif"
co2Layer = "../../johannes/data/pkgTest_co2_emission.tif"
# `grass` should point to the local GRASS installation, e.g. "/usr/lib/grass78/"
test <- statsGRASS(grass = grass,
                   addon_base = "../../johannes/mapme.protectedareas/data-raw/addons/",
                   areas = aoi,
                   tree_cover = treeCover,
                   tree_loss = lossYear,
                   tree_co2 = co2Layer,
                   idcol = "id",
                   thresholdClump = 6,
                   thresholdCover = 10,
                   years = 2001:2020,
                   saveRaster = FALSE,
                   .tmpdir = "~/shared/datalake/tempdir/")
results in:
Warning in statsGRASS(grass = grass, addon_base = "../../johannes/mapme.protectedareas/data-raw/addons/", :
IMPORTANT WARNING: The use of the CO2 emission layer during analysis is currently discouraged.
Several routines need to be adapted since the usage of a new data set by Harris et al (2021) (see https://www.nature.com/articles/s41558-020-00976-6)
Check out https://github.com/mapme-initiative/mapme.forest/issues/7 to recieve information if the issue has been solved.
ERROR: Variable 'LOCATION_NAME' not set
Error in if (!compatible) { : argument is of length zero
In addition: Warning message:
In system(paste("g.version", get("addEXE", envir = .GRASS_CACHE), :
running command 'g.version' had status 1
Can you confirm?
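The unset LOCATION_NAME variable and the failing g.version call look like no GRASS session is active when statsGRASS shells out to GRASS. A minimal sketch for checking the GRASS setup up front, assuming GRASS 7.8 lives under /usr/lib/grass78/ (adjust the path to your installation):
library(rgrass7)
grass <- "/usr/lib/grass78/"  # assumed installation path
# initGRASS starts a throwaway session and sets LOCATION_NAME and related variables
initGRASS(gisBase = grass, home = tempdir(), override = TRUE)
# if this prints a version string, the GRASS environment itself is usable
execGRASS("g.version")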
While running the code below:
library(sf)
library(terra)
library(dplyr)   # needed for the pipe and select() used in area_comp() below
remotes::install_github("mapme-initiative/mapme.forest", ref = "terra")
library(mapme.forest)
# ---- test download function and prepare data -----
# load raster files
raster_files <- rast(paste0("../../datalake/mapme.protectedareas/input/global_forest_watch/",
list.files("../../datalake/mapme.protectedareas/input/global_forest_watch/")))
# load region of interest
aoi <- read_sf("../../datalake/mapme.protectedareas/input/wdpa_kfw/wdpa_kfw_spatial_latinamerica_2021-02-01_supportedPAs_unique.gpkg")
# we need to reproject the aoi object
aoi <- st_transform(aoi, crs = st_crs(raster_files))
# create function
area_comp <- function(aoi) {
  # crop the rasters to the extent of the aoi
  rasters <- terra::crop(raster_files, aoi)
  # assign proper names to the layers
  names(rasters) <- c("co2_emission", "lossyear", "treecover")
  # ----- test prepTC -----
  prep_treeCover <- prepTC(
    inputForestMap = rasters$treecover,
    thresholdCover = 10,
    thresholdClump = 6)
  # ----- test yearly forest mask -----
  yearlyForestMask <- getTM(
    inputForestMap = prep_treeCover,
    inputLossMap = rasters$lossyear,
    years = 2001:2020)
  # ----- test AreaCalc -----
  result_latlon <- AreaCalc(yearlyForestMask,
                            studysite = aoi,
                            latlon = TRUE,
                            polyName = "WDPA_PID",
                            ncores = 1,
                            years = 2001:2020)
  # ----- remove other columns -----
  result <- result_latlon %>%
    select(-c(2:50))
  # ----- return results -----
  return(result)
}
### run function for first polygon **WORKS FINE**
area_comp(aoi[1, ])
### run function for the fourth polygon **THROWS ERROR**
area_comp(aoi[4, ])
# Error: [%in%] no matches supplied
# 8. stop("[", f, "] ", emsg, ..., call. = FALSE)
# 7. error(f, x@ptr$getError())
# 6. messages(x, "%in%")
# 5. clummy %in% clumpTmp
# 4. clummy %in% clumpTmp
# 3. `[<-`(`*tmp*`, clummy %in% clumpTmp, value = 0)
# 2. prepTC(inputForestMap = rasters$treecover, thresholdCover = 10,
# thresholdClump = 6)
# 1. area_comp(aoi[4, ])
It seems like the error comes from the prepTC function.
@goergen95, could you please help me here?
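Not a fix, but for context a minimal sketch of what the clump-removal step presumably looks like when written with terra, with a guard for the case that seems to trigger the error (no clumps match, so %in% receives an empty vector); patches(), freq(), and the function name here are illustrative assumptions, not the package's actual internals:
remove_small_clumps <- function(forest_mask, threshold_clump = 6) {
  # label connected forest patches and count their cell frequencies
  clumps <- terra::patches(forest_mask, directions = 8, zeroAsNA = TRUE)
  f <- terra::freq(clumps)
  small_ids <- f$value[f$count < threshold_clump]
  # guard: with no small patches, `clumps %in% small_ids` would be handed an
  # empty vector, which appears to be what raises "[%in%] no matches supplied"
  if (length(small_ids) == 0) return(forest_mask)
  forest_mask[clumps %in% small_ids] <- 0
  forest_mask
}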
We use parallel processing to speed up the analysis with the statsGRASS function. Since some of the layers are stored temporarily, we run into storage issues and get this error message:
sh: 1: cannot create /tmp/Rtmpe3Ce80/file41787aa11ae6.err: No space left on device
If we could change the folder where the internal functions store temporary data, that would help us, because we could just switch to a bigger storage unit (see the sketch below).
@Ohm-Np for your information
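A possible interim workaround, until the temporary directory is configurable everywhere, is to point the temp locations that R and the raster/terra back-ends use at a larger volume; a minimal sketch with placeholder paths:
# redirect the raster package's temporary files to a larger volume
raster::rasterOptions(tmpdir = "/path/to/big/volume/raster_tmp")
# the same for terra, if the terra branch is used
terra::terraOptions(tempdir = "/path/to/big/volume/terra_tmp")
# R's own tempdir() (where the .err files from the message above end up) is taken
# from the TMPDIR environment variable, which has to be set before the session
# starts, e.g. in the shell or in ~/.Renviron:  TMPDIR=/path/to/big/volume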
I have done an area comparison for one polygon using the grass functions, the raster functions (from branch master), and the terra functions (from branch terra), and found some dissimilarities in the results, as you can see in the table.
Area of polygon: 1094.880915 sqkm
| WDPA_ID | name | value (grass) sqkm | value (raster) sqkm | value (terra) sqkm |
|---|---|---|---|---|
| 6675 | area_2001 | 997.573036 | 126.3151016 | 126.4386520 |
| 6675 | area_2020 | 872.536478 | 21.0387268 | 21.0594864 |
| - | - | - | - | - |
| 6675 | loss_2002 | 0.762019 | 0.7674429 | 1.3987885 |
| 6675 | loss_2019 | 8.174405 | 8.178824 | 0.5946693 |
| 6675 | loss_2020 | 21.054441 | 21.03873 | 8.1867638 |
The areas from raster and terra are almost the same, but the area from GRASS seems reasonable. However, the loss values are similar for GRASS and raster. For terra, it might be an error in the subtraction on my side while computing the loss, as I didn't use the lossCalc function.
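For what it's worth, one way to compute the yearly loss without lossCalc is to difference the yearly area columns; a minimal sketch on a toy data frame (column names and numbers are made up for illustration):
library(dplyr)
library(tidyr)
# toy yearly forest area (sqkm) for one polygon
areas <- tibble::tibble(WDPA_PID = "6675",
                        area_2001 = 126.44, area_2002 = 125.04, area_2003 = 124.50)
losses <- areas %>%
  pivot_longer(starts_with("area_"), names_to = "year",
               names_prefix = "area_", values_to = "area") %>%
  group_by(WDPA_PID) %>%
  arrange(year, .by_group = TRUE) %>%
  mutate(loss = lag(area) - area)  # loss in year t = area in t-1 minus area in t
losses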
The terra package is the successor of raster and is supposedly better at handling large spatial datasets. I suggest that we substitute all raster functions with terra functions. This might enable us to also process large data without relying on GRASS and the rather complicated installation of all its dependencies. I created a small test to benchmark the performance. You can see that terra even seems to outperform GRASS, which is quite surprising to me; however, I only compared them for simple zonal statistics.
library(raster)
library(terra)
library(rgrass7)
set.seed(10)
## COMPARE RASTER TO RASTER ZONAL
# zonal statistics using the raster package
r <- raster::raster(ncols=4000, nrows=4000)
raster::values(r) <- runif(ncell(r)) * 1:ncell(r)
z <- r
raster::values(z) <- rep(1:5, each=raster::ncell(z)/5)
# for large files, use a character value rather than a function
system.time({ raster::zonal(r, z, 'sum') })
# zonal statistics using the terra package
x <- terra::rast(nrows=4000, ncols=4000)
terra::values(x) <- runif(terra::ncell(x)) * 1:terra::ncell(x)
zz <- x
terra::values(zz) <- rep(1:5, each=terra::ncell(zz)/5)
# calculate zonal
system.time({ terra::zonal(x, zz, 'sum') })
terra::writeRaster(x, "test_raster.tif", overwrite=TRUE)
terra::writeRaster(zz, "test_zonal.tif", overwrite=TRUE, datatype="INT4U")
# zonal statistics using GRASS
initGRASS(gisBase = "/usr/lib/grass78/",
          home = "./",
          addon_base = "./data-raw/addons",
          override = TRUE)
# Import raster to GRASS
execGRASS(
  "r.in.gdal",
  flags = c("overwrite", "o"),
  parameters = list(input = "test_raster.tif",
                    output = "myraster")
)
# set grass region from raster
execGRASS("g.region",
          parameters = list(raster = "myraster"))
execGRASS(
  "r.in.gdal",
  flags = c("overwrite", "o"),
  parameters = list(input = "test_zonal.tif",
                    output = "myzones")
)
system.time({
  execGRASS(
    "r.stats.zonal",
    flags = c("overwrite"),
    parameters = list(base = "myzones",
                      cover = "myraster",
                      method = "sum",
                      output = "mystats")
  )
})
@goergen95: I will try to do a first draft converting some of the scripts when I find a suitable time slot in the upcoming days. You might nevertheless want to confirm the benchmarks and then eventually help out and check in if I get stuck.
@Ohm-Np: This is just for your information. You can use the package as is in the meantime.
Until some issues with the analysis of the CO2 emission layer have been resolved, its use in analyses is discouraged. With the integration of a new dataset, several functions need to be adapted. Warnings pointing to this issue are raised whenever functionality involving the CO2 layer is used. This issue will contain updated information once available.