
sci's Introduction

SCI

Spatial PSM for R

Below is an example of how to use this package. You can also find this example, with much more extensive documentation, in the /man directory of the package, along with an example.shp file.

Please note that this package is an early alpha release: it has known instabilities, bugs, and errors, and its functionality is limited. The examples included currently focus on cross-sectional analyses, but the package can also support spatio-temporal panel modeling via a two-way clustering algorithm; see Stage2PSM() for details. Examples of this modeling approach are forthcoming.

If you encounter issues, do not hesitate to contact me ([email protected]).

#Package and data loading

library(devtools)

devtools::install_github("itpir/SCI@master")

library(SCI)

library(maptools)  #readShapePoly() below is provided by the maptools package

shpfile <- file.path(getwd(),"man","data","example.shp")

dta_Shp <- readShapePoly(shpfile)

#Variable construction examples

dta_Shp$pre_trend_NDVI <- timeRangeTrend(dta_Shp,"MeanL_[0-9][0-9][0-9][0-9]",1982,1995,"SP_ID")

dta_Shp$NDVI_trend_01_10 <- timeRangeTrend(dta_Shp,"MeanL_[0-9][0-9][0-9][0-9]",2001,2010,"SP_ID")

dta_Shp$pre_trend_temp_mean <- timeRangeTrend(dta_Shp,"MeanT_[0-9][0-9][0-9][0-9]",1982,1995,"SP_ID")

dta_Shp$post_trend_temp_01_10 <- timeRangeTrend(dta_Shp,"MeanT_[0-9][0-9][0-9][0-9]",2001,2010,"SP_ID")

dta_Shp$pre_trend_precip_mean <- timeRangeTrend(dta_Shp,"MeanP_[0-9][0-9][0-9][0-9]",1982,1995,"SP_ID")

dta_Shp$post_trend_precip_01_10 <- timeRangeTrend(dta_Shp,"MeanP_[0-9][0-9][0-9][0-9]",2001,2010,"SP_ID")
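As an optional sanity check (not part of the original example), the constructed trend columns can be summarized with base R before building the model; this uses only the column names created above.

#Optional: quick summary of the constructed trend variables

summary(dta_Shp@data[,c("pre_trend_NDVI","NDVI_trend_01_10","pre_trend_temp_mean","post_trend_temp_01_10","pre_trend_precip_mean","post_trend_precip_01_10")])

hist(dta_Shp@data$pre_trend_NDVI, main="Pre-period NDVI trend (1982-1995)", xlab="Trend")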

dta_Shp@data["TrtBin"] <- 0

dta_Shp@data$TrtBin[dta_Shp@data$demend_y <= 2001] <- 1

dta_Shp@data$NA_check <- 0

dta_Shp@data$NA_check[is.na(dta_Shp@data$demend_y)] <- 1

int_Shp <- dta_Shp[dta_Shp@data$NA_check != 1,]

dta_Shp <- int_Shp
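As a quick check (not in the original example), it can help to confirm how many units fall into each treatment group after dropping rows with a missing demend_y.

#Optional: treatment group sizes after removing rows with missing demend_y

table(dta_Shp@data$TrtBin)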

#Modeling examples

psmModel <- "TrtBin ~ terrai_are + Pop_1990 + MeanT_1995 + pre_trend_temp_mean + MeanP_1995 + pre_trend_NDVI + Slope + Elevation + MeanL_1995 + Riv_Dist + Road_dist + pre_trend_precip_mean"

psmRes <- SpatialCausalPSM(dta_Shp,mtd="logit",psmModel,drop="support",visual=TRUE)

PSMdistDecay(psmRes$data,"PSM_trtProb",start=10,end=600,h=20)
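It can also be useful (optional, not part of the original example) to compare the fitted propensity scores across treatment groups; this assumes psmRes$data keeps the TrtBin column alongside the fitted PSM_trtProb column, as the calls above and below imply.

#Optional: compare fitted propensity scores by treatment status

tapply(psmRes$data$PSM_trtProb, psmRes$data$TrtBin, summary)

boxplot(psmRes$data$PSM_trtProb ~ psmRes$data$TrtBin, main="Propensity score by treatment group", xlab="TrtBin", ylab="PSM_trtProb")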

drop_set <- c(drop_unmatched=TRUE, drop_method="SD", drop_thresh=0.25)

psm_Pairs <- SAT(dta = psmRes$data, mtd = "fastNN",constraints=c(distance=246),psm_eq = psmModel, ids = "id", drop_opts = drop_set, visual="TRUE", TrtBinColName="TrtBin")

analyticModel <- "NDVI_trend_01_10 ~ TrtBin + terrai_are + Pop_1990 + MeanT_1995 + pre_trend_temp_mean + MeanP_1995 + pre_trend_NDVI + Slope + Elevation + MeanL_1995 + Riv_Dist + Road_dist + pre_trend_precip_mean"

summary(lm(analyticModel, psm_Pairs))

Stage2PSM(analyticModel,psm_Pairs,type="lm",table_out=TRUE)


sci's Issues

Spatial Correlogram Improvements

The biggest issue is that the spatial correlogram needs to handle different projections and distance specifications more gracefully. More generally, it needs many more options (for example, visualizations) and an automated way to pass its findings to the PSM first stage.

Improve stargazer table outputs

The stargazer table outputs are very poorly constructed right now.

Further, they are emitted as raw HTML; ideally, users should be able to copy and paste a graphic (PNG or JPG), or perhaps even export a PDF or Word document.

Stargazer Code Isolation

Make the code more compatible with stargazer by isolating functions that output the relevant data for table construction.

Stage2PSM - Factor Interactions & Standardized Models

The current approach to model standardization - taking the standard deviation of all input data - does not necessarily work in the case of factor interactions. For example, if a factor is interacted with a numeric value (e.g., years), the interaction (:) operator can produce different results in the pre- and post-standardization cases. The root cause of this is currently not clear; a minimal reproduction sketch follows.
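Below is a minimal, self-contained sketch (simulated data, base R only, not package code) illustrating the discrepancy: standardizing the numeric input before forming the interaction is not equivalent to standardizing the interaction columns themselves, because the interaction columns contain zeros for rows in the other factor level and therefore have a different standard deviation.

#Minimal reproduction with simulated data (hypothetical example, not package code)

set.seed(1)

years <- rnorm(100, mean=2000, sd=5)

grp <- factor(sample(c("A","B"), 100, replace=TRUE))

y <- rnorm(100)

#Pre-standardization: scale the numeric input, then form the interaction

coef(lm(y ~ scale(years):grp))

#Post-standardization: form the interaction columns first, then scale them

X <- model.matrix(~ years:grp)[,-1]

coef(lm(y ~ scale(X)))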

Binary Treatments vs. Continuous

Only binary treatment values are currently enabled, and this is not made clear in the function code. Either enable continuous treatment values or better document the limitations of the code.
