
linkagescape / linkage-mapper

Stars: 38 · Watchers: 10 · Forks: 12 · Size: 76.21 MB

ArcGIS tools to automate mapping and prioritization of wildlife habitat corridors

Home Page: https://circuitscape.org/linkagemapper/

License: GNU General Public License v3.0

Languages: Python 80.20%, XSLT 19.41%, Batchfile 0.38%
Topics: conservation, wildlife-corridors, habitat-connectivity

linkage-mapper's Introduction

linkage-mapper

Linkage Mapper connectivity analysis toolbox

Please visit the Linkage Mapper Website at https://linkagemapper.org/

linkage-mapper's People

Contributors

aprisbrey, bmcrae, dkav, johngallo, natmills, rgreenefl, venkanna37

Stargazers: 38

Watchers: 10

linkage-mapper's Issues

Update handling of model runs where no corridors are found

  • Tools should exit without error. The tool is working correctly.
  • Code that identifies the absence of corridors should be consolidated into one module rather than being repeated in six different modules.
  • Step 4 has no such check; instead, the tool crashes.
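A consolidated check could be a single shared helper that every step calls; the following is a minimal sketch (all names are hypothetical, and the real modules work with arcpy datasets rather than plain lists):

```python
def corridors_found(lcp_list):
    """Return True if the run produced at least one corridor (LCP)."""
    return bool(lcp_list)


def exit_if_no_corridors(lcp_list, step_name):
    """Exit cleanly, without raising, when a run legitimately finds no corridors.

    Hypothetical shared helper: each step (including Step 4) would call this
    instead of repeating its own check across six modules.
    """
    if not corridors_found(lcp_list):
        print("%s: no corridors found - exiting without error." % step_name)
        return True
    return False
```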

ActiveX Control Warning on Windows 10 in Tools

On Windows 10 Creators Update, when you select an empty field in a toolbox, an ActiveX control warning appears. See this GeoNet discussion for more details.

A fix will require updating all tool stylesheets. Locally, I have updated LmDlgContent.xsl, and the changes seem to have resolved the issue. The fix is based on MdDlgContent.xsl from ArcGIS 10.6. I do not know if there are implications for other setups.

Make maximum linkage length relative to mean permeability

Animals living in highly fragmented areas move less than animals of the same species in wilder areas: Animals worldwide stick close to home when humans move in (Nature News, 25 January 2018). Hence our single threshold for maximum linkage length is too simplistic. Or do we have the option of making maximum linkage length based on cost rather than euclidean distance? If we do, put this into the documentation as best practice and cite this research.
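A cost-weighted cap is implicitly "relative to permeability", because the same euclidean length accumulates more cost in fragmented landscapes. A minimal sketch of the two threshold styles (parameter names are illustrative, not the tool's actual settings):

```python
def keep_linkage(euc_dist, cwd, max_euc=None, max_cwd=None):
    """Keep a linkage only if it passes whichever caps are set.

    euc_dist: straight-line length of the linkage
    cwd:      cost-weighted distance along the least-cost path
    A max_cwd cap lets low-resistance linkages be longer before being
    dropped, unlike a fixed euclidean cutoff.
    """
    if max_euc is not None and euc_dist > max_euc:
        return False
    if max_cwd is not None and cwd > max_cwd:
        return False
    return True
```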

Identify and fix any inefficiencies in raster object and workspace use

I'm trying to run LP with expert corridor importance values for a very large project file. It fails because it appears Spatial Analyst calculations during the calculation of the blended priority are saving to my OS drive by default and filling it up before the run can finish. I've looked around in the lp_settings and lp_main scripts and can't seem to find a way to change this. I've also searched ESRI forums, to no avail.

Currently, it goes to: C:/Users/my_drive/AppData/Local/ESRI/Desktop10.2/SpatialAnalyst

Would setting KEEPINTERMEDIATE to True affect this by saving intermediate files elsewhere?

Any suggestions would be much appreciated.

Input parameters are not validated when run from command line

Input parameters passed via the command line are not validated, so tools exit without a clear explanation. Validation code from script tools (.tbx) is not available from the command line, although it may be possible for validation code to be shared with an ArcGIS Python toolbox (.pyt).
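Since .tbx validation does not run outside ArcGIS, checks like these would have to live in the scripts themselves (or be shared via a .pyt). A sketch with a hypothetical argument layout:

```python
import os


def validate_args(argv):
    """Illustrative command-line validation for a Linkage Mapper script.

    The argument layout (script name, project directory, core feature
    class) is an assumption for the example, not the tool's actual
    signature.
    """
    errors = []
    if len(argv) < 3:
        errors.append("usage: lm_master.py <project_dir> <core_fc> [...]")
        return errors
    if not os.path.isdir(argv[1]):
        errors.append("Project directory does not exist: %s" % argv[1])
    return errors
```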

Beta Testing

Thank you for being a beta tester!

Right now we are primarily looking for bugs and errors, and will focus on optimization in a later release. That said, possible enhancements and ideas are always welcome.

There is a spectrum of testing that could be done. Find a place on the spectrum that works for you!

At one end of the spectrum, you can simply:

  1. Run the demo data on your machine, and
  2. Report out how it did, what tools you tested, and what operating system and ArcGIS versions you are using.
  3. You can report out in the comments below (preferred) or by e-mail for me to paste below (unless you want to stay anonymous).

At the other end of the spectrum, you could do as many of the following steps as you can:

  1. Do the above with your project-specific data.
  2. You can also attach your log file and document other key computer specs, such as how much RAM you have, your processing speed, and the space remaining on the drive you are writing to.
  3. You can log bugs, enhancements, or ideas by clicking "Create New Issue" (preferred), or simply in your comments.
  4. You can describe your input data in detail, or upload your resistance surface and/or cores and/or outputs for illustration, in the Group or in the Gallery. (Contact me if you do not yet have upload privileges. It is quicker to post to the Gallery, which is more for final results but could work for this.)
  5. Anything big I am forgetting?

Release date? We were hoping for the end of June, but that is too soon. E-mail me for the exact target date. Also, anyone can do any of the above after the release too; it would still be very helpful. Finally, I'd like to publicly thank all of you on the community list, so please let me know if you would rather I not mention your help.

Thanks!

New fields are added to the input core table and not to a duplicate

Reviewing how new core-related data are added to the core table, I discovered that the new fields are added to the input core table and not to a duplicate. While this seems to be the intent, based on the documentation I had assumed that the fields would be added to a duplicate.

Pinchpoint Mapper Failing

I am doing the LM_Lab, and each time I try to run the Pinchpoint Mapper in Step 3 it fails. I have restarted ArcMap numerous times and made sure that I am following the tutorial exactly, but that has not resolved the issue.

I'm using Windows 10.0 64 bit, ArcMap 10.6, and Linkage Mapper 2.0.0.

I've included an image of the failed message below:

[screenshot: pinchpoint_error2]

Move Truncate Options into Step 5

I think the now-visible truncate options should be moved up into Step 5 within the toolbox. The Additional Options section is, to my mind, only for truly optional settings. CLM (and other files) would have to be amended to reflect the change in parameter order if the update were made.

if v10plus:
    config.OUTPUTFORMODELBUILDER = nullstring(arg[19])
    config.WRITETRUNCRASTER = str2bool(arg[20])
    config.CWDTHRESH = int(arg[21])
    config.LMCUSTSETTINGS = nullstring(arg[22])
else:
    config.OUTPUTFORMODELBUILDER = None
    # these two settings are hardcoded based on the values used in lm_settings,
    # before they were daylighted in the toolbox in LM 2.0.0+ for ArcGIS10.x
    config.WRITETRUNCRASTER = True
    config.CWDTHRESH = 200000
    config.LMCUSTSETTINGS = None

On new feature, Core_start and Core_end not calculating properly

Hello Darren,

I think I found a bug/error in the new code implementation.

The Climate Analog Ratio is not calculating correctly for linkages in which the hotter/drier core area has the higher core_id of the pair. I think that is the issue. I suspect that the code for determining Core_Start and Core_End needs to be revised.

Here is what I see, using the modoc demo data:

The _LCPs output file:
[image]

and the output values in the cores file:

[image]

By my calculations, core 2 is hotter/drier than core 1 at the current time, so core 1 should be the Destination Core (Core_End). Therefore the C Analog ratio is the climate of core 1 in the future divided by the climate of core 2 in the present = 608/660 = 0.9212, not 1.25. I think the 1.25 occurs because core 1 is marked as the start core.

I checked the linkage of cores 1 and 4, and in this case core 1 is hotter, so the start and end cores are correct, and the CAnalog ratio is 445/553 = 0.804, which is what we get.

I note that Core_Start and Core_End exactly replicate From_Core and To_Core, so that might be a hint for the troubleshooting. Please let me know if I should clarify anything more. Thanks!
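The intended ordering can be sketched in a few lines (dict keys are hypothetical stand-ins for the table fields): the start core should be whichever core is hotter/drier at present, and the ratio is the destination core's future climate over the start core's present climate.

```python
def analog_ratio(start_core, end_core):
    """Climate analog ratio sketch.

    start_core must be the core that is hotter/drier at present (the
    source); end_core is the destination. Returns the destination's
    future climate divided by the source's present climate.
    """
    return end_core["clim_future"] / start_core["clim_now"]
```

Using the numbers above, with core 2 (present climate 660) as the start core and core 1 (future climate 608) as the destination, this yields about 0.9212 rather than 1.25.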

Fix the Multi-tool Combination

It may be best to move the multi-tool combination into a new toolbox called LM Utilities Beta, or something like that. You can add the other combo tools that are under development.

In the meantime, remove the red X's in the Multi-tool Combination by opening and closing it in ArcMap, at the least,
and fix a weight that was 0.02 but should have been 0.33 to be equal to the others.

Update Build to have a single set of demo data

I encountered a few issues with the demo data while testing.

Issue 1:
[image]
This error was thrown when I tried to open LP Demo Arc10.mxd using ArcMap 10.3.1. The .mxd needs to be made backwards compatible so that everyone using Arc10.x can open it.

Issue 2:
The demo data for climate linkage mapper and the demo data for LM/LPM are different (different cores, resistance, etc.); the LPM demo is designed to work only with the Linkage Mapper outputs. If we want to make climate linkage mapper and LPM fully compatible, there should be matching demo data OR an added section in the LPM tutorial explaining how to use it with the results of climate linkage mapper (pointing to the appropriate inputs, etc.).

Error in LP in calculating Core Area Value when some centrality values are 0 (CF_Centrality)

This bug can occur when there are no errors, but it is very rare/extreme. It occurs when the centrality value of a core is 0. The case below was caused in part by an island, separated from the mainland by "infinite" resistance (user error), but also, I believe, by some small slivers on the mainland periphery. Whoever tackles this bug should first look at the 0-valued CF_Central cores in the vm10 data to be sure they are on the periphery.

At the very beginning of a Linkage Priority run (v190403_vm10), I get the following error:

Retreiving outputs from Linkage Pathways model run
Calculating permeability for each LCP line
Calculating relative closeness for each LCP line
Calculating Core Area Value (CAV) and its components for each core
unsupported operand type(s) for +: 'float' and 'NoneType'
Traceback (most recent call last):
File "E:\Gallo\GIS\Projects\NPLCD_2018_john_gallo\NPLCD\Tasks\Connectivity_LM\Tools\Models\Linkage_Mapper\toolbox\scripts\lp_main.py", line 845, in main
run_analysis()
File "E:\Gallo\GIS\Projects\NPLCD_2018_john_gallo\NPLCD\Tasks\Connectivity_LM\Tools\Models\Linkage_Mapper\toolbox\scripts\lp_main.py", line 780, in run_analysis
calc_cav(core_lyr)
File "E:\Gallo\GIS\Projects\NPLCD_2018_john_gallo\NPLCD\Tasks\Connectivity_LM\Tools\Models\Linkage_Mapper\toolbox\scripts\lp_main.py", line 659, in calc_cav
True)
File "E:\Gallo\GIS\Projects\NPLCD_2018_john_gallo\NPLCD\Tasks\Connectivity_LM\Tools\Models\Linkage_Mapper\toolbox\scripts\lp_main.py", line 260, in normalize_field
in_field + "!) / " + str(max_val), "PYTHON_9.3")

Completed script LinkagePriority2...
Failed to execute (Linkage Priority (2)).
Failed at Sat Apr 6 17:29:43 2019 (Elapsed Time: 35.34 seconds)
Failed to execute (a30 Linkage PriorityNPLCD).
Failed at Sat Apr 6 17:29:43 2019 (Elapsed Time: 36.62 seconds)

Meanwhile, a run on another machine (v190405_vm6) passed through this part of the code with no problem. One of the differences is that on this local machine (vm10) there are more core fields, and some of the CF_Centrality values are 0. On the other machine, there are no 0s. (Whether or not that is an error is my next line of inquiry.) Regardless, the LM code should be fixed to allow for CF_Central = 0, right?

Note, the LP settings are such that all the CAV fields except for centrality are set to 0. Those parameter values are available upon request.
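A None- and zero-tolerant version of the normalization could look like this sketch (pure Python for illustration; the real normalize_field writes to a table field via CalculateField_management, and the function name here is hypothetical):

```python
def normalize_by_max(values):
    """Score-range normalize by the maximum, tolerating None entries
    and an all-zero column (e.g. CF_Central = 0 everywhere), instead
    of raising on 'float' + 'NoneType' or dividing by zero."""
    present = [v for v in values if v is not None]
    if not present:
        return values[:]  # nothing to normalize
    max_val = max(present)
    if max_val == 0:
        return [0.0 if v is not None else None for v in values]
    return [v / float(max_val) if v is not None else None for v in values]
```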

Full error message: Document0403vm10_firsterrormessage.zip

core_centrality_v190405_vm6.gdb.zip
core_centrality_v190403_vm10.gdb.zip

Blended priority only in new beta releases

I've just tried the LP tool in several of the beta releases, and when the run is complete, the only output added to the corridor geodatabase in my project folder is the blended priority. None of the other outputs are being written to that gdb.

I started with beta release 5, and then also tried 2 and 4, and no luck.
I verified in lp_settings.py that I had the following:
CALCCSPBP = 2 # No_Calc=0, CSP=1, CSP_BP=2

It looks as though selecting option 2 should trigger production of the blended priority as well as the other outputs to corridor.gdb, but I can't seem to get those outputs.

Ideas? Please let me know if you'd like me to send a log file or other output.
Thanks!

Make directory structure of repository match that of the build

Or vice versa. As is, making any new release is annoying and time-consuming. It also leads to errors in beta testing, such as relative paths to files in the demo folder of the repository, which gets renamed to the LP_Demo folder in the build, or something like that.

Issue with zonal statistics dropping core polygons

# calc aerial mean ocav for each core
ocav_table = ZonalStatisticsAsTable(lp_env.COREFC, lp_env.COREFN, ocav_raster,
                                    os.path.join(lm_env.SCRATCHDIR, "scratch.gdb",
                                                 "core_ocav_stats"))
arcpy.AddJoin_management("core_lyr", lp_env.COREFN, ocav_table, lp_env.COREFN)

Encountering an error linked back to calculating zonal statistics for the OCAV. Zonal statistics output is dropping several of the polygons, so when the zonal stats table is joined back to the core layer, these stats values are null for the dropped polygons. Then when this part of the code is reached (calculating CAV normalization):
else:
    arcpy.CalculateField_management(in_table, out_field,
                                    "(!" + in_field + "! - " + str(min_val) + ") / " + str(max_val - min_val),
                                    "PYTHON_9.3")

the script stops and an error is thrown: "unsupported operand type(s) for -: 'float' and 'NoneType'" since it's trying to do calculations with null values.

This also occurred (after I removed the OCAV option from the GUI) in doing the same calculation for climate envelope - two cores received null values for "clim_env" and the calculation could not be completed.
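One possible mitigation is to backfill the dropped cores before normalization; the fallback policy here (mean of the present values) is my assumption for illustration, not current tool behavior, and the names are hypothetical:

```python
def fill_missing_stats(core_ids, stats):
    """Backfill cores that ZonalStatisticsAsTable dropped.

    stats maps core id -> zonal mean; dropped cores are absent (or None
    after the join). Substituting the mean of the present values keeps
    later min/max normalization from hitting None operands.
    """
    present = [v for v in stats.values() if v is not None]
    fallback = sum(present) / float(len(present)) if present else 0.0
    return {cid: (stats[cid] if stats.get(cid) is not None else fallback)
            for cid in core_ids}
```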

Results from climate linkage mapper incompatible with LPM

Issue - results from climate linkage mapper cannot be used with linkage priority mapper.

I used the demo datasets to try to reproduce the issue and encountered issues specific to the demo that should be fixed, in addition to two issues with climate linkage mapper itself and with CLM in conjunction with LPM. I'll put the demo problems in another issue.

CLM/LPM Issues

Issue 1: Climate linkage mapper tuple error
CLM starts using cc_main.py and jumps over to lm_config.py:

# Optional parameters that only apply to 2.0.0+ toolbox, for ArcGIS 10.x only
v10plus = True
installD = config.gp.GetInstallInfo("desktop")
try:
    arcgis_major_version = installD['Version'].split(".")[0]
    # config.gp.AddMessage("VER:" + arcgis_major_version)
    if arcgis_major_version == "9":
        v10plus = False
except:
    pass
if v10plus:
    config.OUTPUTFORMODELBUILDER = nullstring(arg[19])
    config.WRITETRUNCRASTER = str2bool(arg[20])
    config.CWDTHRESH = int(arg[21])
    config.LMCUSTSETTINGS = nullstring(arg[22])
else:
    config.OUTPUTFORMODELBUILDER = None
    # these two settings are hardcoded based on the values used in lm_settings,
    # before they were daylighted in the toolbox in LM 2.0.0+ for ArcGIS10.x
    config.WRITETRUNCRASTER = True
    config.CWDTHRESH = 200000
    config.LMCUSTSETTINGS = None

Lines 248-252 are trying to access arguments 19-22 when the user is using ArcGIS 10.x; however, the climate linkage mapper GUI does not have these variables as inputs, so there is nothing to feed in here, and the user gets a tuple index error:
[screenshot: tuple_error]

I fixed this for testing by changing the variables to the hardcoded values in lines 254-259. To avoid this issue, the GUI should either be updated to accept these variables, or they should be hardcoded as they are for earlier versions of ArcGIS.

Issue 2: Parameters error

My guess is that this is the main issue that is causing CLM and LPM to be incompatible.
[screenshot: params_error_2]

# read parameters section from file and turn into tuple for passing
parms = ""
with open(last_lm_log) as log_file:
    for line in log_file:
        if line[0:13] == "Parameters:\t[":
            parms = line[13:len(line) - 3]
            break
lm_arg = tuple(parms.replace("'", "").split(", "))
if parms == "":
    raise Exception("ERROR: Log file for last Linkage Mapper run does not contain a Parameters line")

These lines (line 147 specifically) search the log file generated by linkage mapper for a parameters entry that begins with a left bracket. When linkage mapper runs, it writes its parameters to the log file as a list, so they are enclosed in brackets. However, when climate linkage mapper generates its log file, it writes a tuple and therefore encloses the parameters in parentheses.

Compare cc_main.py here (specifically 111-114 - setting lm_arg as a tuple)

def config_lm():
    """Configure Linkage Mapper"""
    lm_arg = (_SCRIPT_NAME, cc_env.proj_dir, cc_env.prj_core_fc,
              cc_env.core_fld, cc_env.prj_resist_rast, "false", "false", "#",
              "#", "true", "false", cc_env.prune_network, cc_env.max_nn,
              cc_env.nn_unit, cc_env.keep_constelations, "true", "#", "#", "#")
    lm_env.configure(lm_env.TOOL_CC, lm_arg)
    lm_util.create_dir(lm_env.DATAPASSDIR)
    lm_util.gprint('\nClimate Linkage Mapper Version ' + lm_env.releaseNum)
    lm_util.gprint('NOTE: This tool runs best with BACKGROUND '
                   'PROCESSING (see user guide).')


def log_setup():
    """Set up Linkage Mapper logging"""
    lm_util.create_dir(lm_env.LOGDIR)
    lm_util.create_dir(lm_env.MESSAGEDIR)
    lm_env.logFilePath = lm_util.create_log_file(lm_env.MESSAGEDIR,
                                                 lm_env.TOOL,
                                                 lm_env.PARAMS)

to lm_util.py here (specifically 1243 - writing parameters as a list):

def create_log_file(messageDir, toolName, inParameters):
    ft = tuple(time.localtime())
    timeNow = time.ctime()
    fileName = ('%s_%s_%s_%s%s_%s.txt' % (ft[0], ft[1], ft[2], ft[3], ft[4], toolName))
    filePath = os.path.join(messageDir, fileName)
    try:
        logFile = open(filePath, 'a')
    except:
        logFile = open(filePath, 'w')
    if inParameters is not None:
        logFile.write('*' * 70 + '\n')
        logFile.write('Linkage Mapper log file: %s \n\n' % (toolName))
        logFile.write('Start time:\t%s \n' % (timeNow))
        logFile.write('Parameters:\t%s \n\n' % (inParameters))

So cc_main.py should either be changed to generate a list of parameters in the log_file rather than a tuple, or lp_main.py should be changed to search for both. The former would maintain more consistency.

I changed the parentheses to brackets in the log file for climate LM just to keep testing, and after that I ran into no further issues running LPM with the outputs from climate LM. I also ran into no issues running pinchpoint mapper or barrier mapper with the climate LM outputs.
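The "search for both" alternative can be sketched as a small parser that accepts either delimiter (a pure-Python stand-in for the log-reading code above; the function name is hypothetical):

```python
def parse_lm_params(log_text):
    """Extract the Parameters value from a Linkage Mapper log.

    Accepts both the list-style "[...]" written by Linkage Pathways and
    the tuple-style "(...)" written by Climate Linkage Mapper.
    """
    for line in log_text.splitlines():
        if line.startswith("Parameters:\t"):
            body = line.split("\t", 1)[1].strip()
            if body and body[0] in "([" and body[-1] in ")]":
                body = body[1:-1]  # drop either delimiter pair
            return tuple(body.replace("'", "").split(", "))
    raise ValueError("Log file does not contain a Parameters line")
```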

Demo core data includes new LP fields

I noticed that the demo core data (both core.shp and LP_Demo_Modoc.gdb/cores) includes new LP fields, presumably added during a test run. I would suggest that demo data should not include data derived during a model run.

Extraneous Comment Section?

@rgreenefl is this commented section still required?

# config.USELMSETTINGS = str2bool(arg[19]) #In progress. Will need to change 9.3 toolbox too.
# if config.MAXEUCDIST == 0: Now done in nullfloat
# config.MAXEUCDIST = None
#Default values for lm_settings (used when box is not checked)
### USER SETTABLE VARIABLES
# CALCNONNORMLCCS = False # Mosiac non-normalized LCCs in step 5 (Boolean- set to True or False)
# WRITETRUNCRASTER = True # Truncate mosaic corridor raster at an upper limit, i.e., use width cutoff. (Boolean- set to True or False)
# CWDTHRESH = 200000 # CWD corridor width cutoff to use in truncated raster (Integer)
# MINCOSTDIST = None # Minimum cost distance- any corridor shorter than this will not be mapped (Integer)
# MINEUCDIST = None # Minimum euclidean distance- any core areas closer than this will not be connected (Integer)
# SAVENORMLCCS = False # Save inidvidual normalized LCC grids, not just mosaic (Boolean- set to True or False)
# SIMPLIFY_CORES = True # Simplify cores before calculating distances (Boolean- set to True or False)
# # This speeds up distance calculations in step 2,
# # but Euclidean distances will be less precise.
# if config.USELMSETTINGS: #In progress. Will need to change 9.3 toolbox too.

Make Permeability more normally distributed

Right now, raw permeability is the LCP length (the smaller number) divided by the cost-weighted distance of the LCP (the larger number).

Just as 1/2, 1/3, 1/4, 1/5 do not have the same relative differences between values as 2/1, 3/1, 4/1, and 5/1, the same applies here.

So you get a much more normal distribution if you take the inverse of the above. That is a field that already exists: cwd_to_Path_Length_Ratio. The quick fix is to delete "Raw Perm", use cwd_to_Path_Length_Ratio as the raw input, and then add a new option in LP_Settings to do Inverse Score Range Normalization, setting it as the default for this field, yielding Rel_Perm.

For now, this can be done in a post-processing model and then patched in as an expert-opinion variable, giving Rel_Perm a weight of 0.
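Inverse score-range normalization, as proposed, could be sketched like this (the LP_Settings option and function names are hypothetical):

```python
def inverse_score_range(values):
    """Map the lowest value to 1 and the highest to 0.

    Applied to cwd_to_Path_Length_Ratio, a low ratio (a permeable
    linkage) scores high, yielding a Rel_Perm-style value with a more
    normal distribution than LCP length / CWD.
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)  # all linkages equally permeable
    return [(hi - v) / float(hi - lo) for v in values]
```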

Help prioritize linkages based on threat

The linkage priority tool can have an 11th factor: development threat. The input needed would be a map of the landscape showing the degree of threat for every pixel. Degree of threat could be calculated in one of many ways. Then the linkage would get a "degree of threat" value, which would go into the multi-criteria model. I can see two ways of getting this "degree of threat" value for each linkage. One is to use the Least Cost Path itself, the other is to use the linkage specific surface that covers the entire region, unless a bounding circle is used (What is the name of that layer again?).

  1. Option 1: LCP: get the Least Cost path from the processing outputs, overlay that on the "degree of threat" surface, and find out the mean degree of threat for each cell. Use that.
  2. Option 2: Connectivity surface: use the full connectivity surface (gradient of values) and multiply that by the degree of threat surface, then take the mean value per resulting pixel. Use that.

To do this feature best, it would be ideal to test both of these, evaluate, and choose one to implement. If the time available for this is tight, just use option 1 for this first version of this feature.
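Option 1 can be sketched as follows, with a list of cell indices standing in for the least-cost path's raster overlay (all structures are illustrative stand-ins for the actual rasters):

```python
def mean_threat_along_lcp(lcp_cells, threat_surface):
    """Mean 'degree of threat' over the cells an LCP crosses.

    lcp_cells:      (row, col) indices of the path cells
    threat_surface: 2-D grid of per-pixel threat values
    """
    vals = [threat_surface[r][c] for r, c in lcp_cells]
    return sum(vals) / float(len(vals))
```

The resulting per-linkage value would then enter the multi-criteria model like the other factors.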

Remove need to specify input resistance and core files for "Additional Tools"

In the future, we can change LM to make permanent copies of resistance and of cores in the "results" files, and then LP can use these as inputs. This can also apply to centrality mapper, barrier mapper, and pinchpoint mapper, right? You just specify the project directory that linkage pathways used...

Based on Closed Issue #9

Move repository to a location without a dash to facilitate dev

The multi-tool combinations are best built and tested in the repository. I made a demoOutputs folder in the repository to do this, and also added a line to .gitignore to ignore that folder. But that cannot happen while the repository is named as it currently is:

"Setting data frame spatial reference to that of core area feature class.
ERROR: Project directory cannot contain spaces, dashes, or special characters.

A record of run settings and messages can be found in your log directory:
D:\GIS\Repositories\linkage-mapper\demo\demoOutputs\v4ClimateFocus\run_history\log
"
So, how about we make a new repository named linkagemapper instead of linkage-mapper?
@dkav do you think this will be OK?

Multi-tool Combinations tools have a red x next to them

In ArcMap 10.6 on Windows 10, when I open the Multi-tool Combinations toolset I find most of the tools have a red x next to them. When I took a look at one (a20 Build Network and Map Linkages), I found it was due to the ‘Build Network and Map Linkages’ tool not being found. This same tool had a lot of values preset too.

Fix the default values of the multi-tool

The default value for OCAV is 0.7 and should arguably be 0, and they should all arguably be the same as the default values of the regular tool. Unless it is like this explicitly in the user guide? Double-check that before fixing. Wait until after the merge and historical update, if possible.

[image]

Quantifying the linkage priority values for Core Area Pairs based on their Climate Signature Values and Difference (Shortname = Climate Analog)

A feature requested by the Mexico CONABIO team. Here is my paraphrase, from the e-mail dialog below: "The user can determine if they are mapping connectivity for the near term, and use your formula, or connectivity for the long term, allowing for range shifts, and use the original formula."

Here is "your formula" from the e-mail below:

We think the faster solution is to modify line #534 in the "lp_main.py" script:

    diff_clim_env = abs(x_clim_env - y_clim_env)

to

    diff_clim_env = 1 - abs(x_clim_env - y_clim_env)
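The two behaviors could be exposed as a single switch, per the proposal; a sketch (the actual lp_main.py change would just replace the one line, and the flag name here is hypothetical):

```python
def diff_clim_env(x_clim_env, y_clim_env, near_term=False):
    """Climate-envelope difference term for linkage priority.

    near_term=False: original formula, rewarding large climate
                     differences (long-term connectivity, range shifts).
    near_term=True:  proposed CONABIO variant, rewarding similarity.
    Assumes climate envelopes are normalized to 0..1.
    """
    d = abs(x_clim_env - y_clim_env)
    return 1 - d if near_term else d
```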

Add option to save Inputs and/or intermediate data

Provide a boolean variable in the settings file to allow the user to save inputs and/or intermediate data. Documentation would need to be updated to inform users of this option.

From what I understand, only Linkage Priority currently saves intermediate data while other tools do not. Climate Linkage Mapper inconsistently tried to save clipped input data (see Pull Request #39).

Legacy Search Cursor Still in Use

Linkage Mapper still uses legacy search cursors. ESRI introduced arcpy.da cursors in ArcGIS 10.1 "to provide significantly faster performance over the previously existing set of cursor functions (arcpy.SearchCursor, arcpy.UpdateCursor, and arcpy.InsertCursor). The original cursors are provided only for continuing backward compatibility."

See this help page for more details.

Linkage Pathways step 5 won't run on its own

During the class, I ran Linkage Pathways with steps 1-5 checked, and that worked fine. Then I tried running it again with the same directory, with only step 5 checked and the truncation parameter changed from 200000 to 50000, and it did not work.
