natcap / invest
InVEST®: models that map and value the goods and services from nature that sustain and fulfill human life.
License: Apache License 2.0
This was reported by a user in https://community.naturalcapitalproject.org/t/trouble-with-versions-upgrading-to-3-8-on-windows-7/864. I'm currently waiting to hear back from the user about the specific exception being raised. I'll post it here when I hear back.
Apparently InVEST won't run on Windows 7. Usually this sort of thing is a DLL issue. I'm not necessarily saying that we absolutely need to support running InVEST under an old, unsupported operating system, but if there is a workaround, it would be nice to document it somewhere.
According to the Python download page, this shouldn't be an issue with Python itself, since only Windows XP and earlier are unsupported.
As a cross-platform developer, I sometimes need to create a datastack archive or parameter set on Windows or Mac OS X and then load the datastack on the other operating system. When I create a datastack on mac and load it on Windows, it works well. When I create a datastack on Windows and load it on mac, it doesn't. It looks like the paths are being saved with \\ separators, and perhaps that's a part of it.
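A minimal sketch (assuming nothing about InVEST's actual datastack code) of normalizing Windows-style separators when loading a datastack on a POSIX system:

import os
from pathlib import PureWindowsPath


def normalize_datastack_path(path):
    # A datastack saved on Windows stores paths like 'data\\landcover.tif';
    # PureWindowsPath parses backslash separators regardless of the host OS.
    if '\\' in path and os.sep == '/':
        return PureWindowsPath(path).as_posix()
    return path


print(normalize_datastack_path(r'data\landcover.tif'))  # data/landcover.tif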
See if there is a way to get around having to be an admin to install InVEST
Not sure I'll get to this. I did do some research, and it seems possible that Windows requires admin privileges when it finds words like "setup" or "install" in an executable's filename.
Originally reported by @dcdenu4 at https://bitbucket.org/natcap/invest/issues/3498/invest-requires-windows-admin-privileges
We had a case in the forums (https://community.naturalcapitalproject.org/t/notimplementederror-and-missing-gdal-data-environment-variable/730/3) where the user was trying to run the InVEST binaries but already had a GDAL_DATA environment variable defined, which then prevented the application from finding the local binaries.
Someone is likely to have a GDAL_DATA environment variable defined if they have a valid GDAL install on their system, but if it isn't of the correct version, it'll undoubtedly cause issues. We should be able to handle this case elegantly within the InVEST binaries.
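A minimal sketch of one way to handle this, assuming a pyinstaller-frozen binary (sys._MEIPASS is pyinstaller's unpack directory; the 'gdal-data' subdirectory name is an assumption about the bundle layout):

import os
import sys

if getattr(sys, 'frozen', False):
    # Running inside a pyinstaller bundle: force the bundled GDAL_DATA to
    # take precedence over any value already set on the user's system.
    bundled_gdal_data = os.path.join(sys._MEIPASS, 'gdal-data')  # assumed layout
    if os.path.exists(bundled_gdal_data):
        os.environ['GDAL_DATA'] = bundled_gdal_data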
This should cover the functionality existing in the File > Save As menu of the Qt UI.
This has happened fairly consistently for a while, but is not reproducible outside of the appveyor VM. Since only one InVEST model (coastal blue carbon) uses a Multi input, we’ll skip this class of tests for now, and consider refactoring the CBC UI to avoid this troublesome input.
For now we're skipping the test. Eventually we might factor out the instances of the Multi objects altogether.
Originally reported by @davemfish at https://bitbucket.org/natcap/invest/issues/3936/ui-tests-on-multi-inputs-crash-on-appveyor
The Installing InVEST page (https://invest.readthedocs.io/en/latest/installing.html) is missing pandas, psutil, and a couple of others that can be installed through pip but might be better to install through the system package manager.
Despite how common reclassifications are across InVEST, we don't handle errors very consistently. If a model's biophysical table is missing a row and that landcover code is encountered, either a KeyError is raised, or else there's a cryptic error coming from numpy about bins.
If there's an issue around reclassifications, InVEST should consistently and helpfully handle errors like these by raising a human-readable error message that contains enough information for a GIS person to decipher the source of the issue. Stacie Wolny specifically referenced the error message at https://bitbucket.org/natcap/invest/src/13fa555b96402d9b463aa524b170de01c046a7bf/src/natcap/invest/ndr/ndr.py#lines-931 as being clear and helpful.
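A minimal sketch of the kind of up-front check that would produce such a message; the function and variable names here are illustrative, not InVEST's actual API:

import numpy


def check_landcover_codes(block, value_map, nodata):
    # Compare the codes present in the raster block against the biophysical
    # table before reclassifying, so the user sees which codes are missing.
    codes_in_raster = set(numpy.unique(block[block != nodata]).tolist())
    missing = sorted(codes_in_raster - set(value_map))
    if missing:
        raise ValueError(
            'The landcover raster contains values that are missing from the '
            'biophysical table: %s. Check the table for missing rows or the '
            'raster for incorrect values.' % missing)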
The InVEST release process has been quite manual (documented here: https://bitbucket.org/natcap/invest/wiki/Release Checklist) and pretty involved, requiring a nontrivial amount of engineering time to become confident in our releases before we send them out.
In short, it would be a fantastic improvement to our team productivity if we could design a process that:
Other related questions we’ll want to consider are:
As a developer, I occasionally would like to see diagnostic logging from the User Interface while it is running, without needing to rebuild the binaries and re-run the application. This would have been really useful in the InVEST 3.8.0 release process, where my mac laptop's homebrew configuration was interfering with the InVEST binaries I was attempting to run. The specific issues observed were:
- The homebrew-installed GEOS (brew install geos) and GDAL (brew install osgeo-gdal-python) were being used instead of the bundled GEOS and GDAL. It would have been nice to see, if possible, where those libraries were being loaded from.
- When launching the application bundle (the .app), I was able to interact just fine with the inputs pane. But when I wanted to run the application itself, I could press the 'run' button, but nothing would happen. If I had access to the logging of the UI, I'd be able to take a look at what was going on. This issue was not reproducible when running the binaries through a shell.

InVEST uses python's logging package to log messages throughout the various layers of the application. It would be tremendously useful if logging could be directed to an appropriate place during application execution. Possibilities to consider include:
- The system log (console.app on mac, Event Viewer on Windows)

When we figure this out and implement a fix, it would be nice to document it in the InVEST API docs, User's Guide, or somewhere else useful to us for later reference.
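For the log-to-a-file option, a minimal sketch using only the standard logging package (the filename and format are illustrative):

import logging
import os
import tempfile

log_path = os.path.join(tempfile.gettempdir(), 'invest-ui.log')  # assumed location
handler = logging.FileHandler(log_path)
handler.setFormatter(logging.Formatter(
    '%(asctime)s %(name)s %(levelname)s %(message)s'))
root_logger = logging.getLogger()
root_logger.addHandler(handler)  # every layer of the app logs through the root
root_logger.setLevel(logging.DEBUG)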
InVEST tests currently fail when using GDAL 3+. This is mostly related to how GDAL 3 has changed its handling of spatial references, and particularly coordinate transformations. This issue should be blocked on the Pygeoprocessing version of this issue. The main consideration is whether we should support both GDAL 2 and GDAL 3 or move on to GDAL 3 and drop support for GDAL 2. The complication is that there are slight variations in spatial coordinates pertaining to bounding boxes and extents, which would mean that in certain cases we would have to carry around expected test data for both GDAL 2 runs and GDAL 3 runs. That is doable, but perhaps not ideal?
InVEST currently produces both ESRI Shapefile and GeoPackage output vectors. When the file suffix is applied to the filename (and thus the layer name) in a Shapefile, layers in a GeoPackage should also have their suffix appended so that the layers will appear in QGIS and Arc with the suffix attached.
GeoPackages are used in the following models/tools (based on a grep):
Also, it would be useful to add this to the InVEST Spec for future use.
ALSO: Arc might have a maximum layer name length for geopackages, so let's look into this to make sure we understand the constraints. What's the limit? Does it only apply to certain versions of Arc? Does QGIS have such a limit?
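A minimal sketch of creating a GeoPackage whose layer name carries the suffix, assuming the suffix is already baked into the target filename (standard GDAL python API, not InVEST's actual vector-creation code):

import os

from osgeo import gdal, ogr, osr


def create_suffixed_gpkg(target_path, srs_wkt):
    # e.g. target_path = 'watershed_results_suffix.gpkg' yields a layer named
    # 'watershed_results_suffix', which QGIS and Arc display with the suffix.
    driver = gdal.GetDriverByName('GPKG')
    vector = driver.Create(target_path, 0, 0, 0, gdal.GDT_Unknown)
    layer_name = os.path.splitext(os.path.basename(target_path))[0]
    srs = osr.SpatialReference()
    srs.ImportFromWkt(srs_wkt)
    vector.CreateLayer(layer_name, srs, ogr.wkbPolygon)
    vector = None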
Occasionally (this has come up twice now to my knowledge), successive reads/writes to an LZW-compressed GeoTiff, such as within a batch script, will result in errors like the one below being produced.
ERROR 1: LZWDecode:Corrupted LZW table at scanline 1280
ERROR 1: TIFFReadEncodedTile() failed.
ERROR 1: C:\Temp\SWY_sensitivity_test2\L_sum_avail_CMCC_TFA_500.tif, band 1: IReadBlock failed at X offset 5, Y offset 5: TIFFReadEncodedTile() failed.
Traceback (most recent call last):
File "src\natcap\invest\seasonal_water_yield\seasonal_water_yield_core.pyx", line 309, in natcap.invest.seasonal_water_yield.seasonal_water_yield_core._ManagedRaster._load_block
AttributeError: 'NoneType' object has no attribute 'astype'
The only known workaround for this at the moment is to write uncompressed GeoTiffs, a change that requires passing the appropriate parameter to each pygeoprocessing function call.
A solution for this might involve touching many components of InVEST, and so should probably go through a design phase first.
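A minimal sketch of the workaround, assuming a pygeoprocessing version that accepts the raster_driver_creation_tuple keyword (older releases spelled this option differently; file paths are illustrative):

from osgeo import gdal
import pygeoprocessing

# Creation options without COMPRESS=LZW, so blocks are written uncompressed.
UNCOMPRESSED_GTIFF = ('GTIFF', ('TILED=YES', 'BIGTIFF=IF_SAFER'))

pygeoprocessing.raster_calculator(
    [('l_avail.tif', 1)], lambda block: block * 2.0, 'doubled.tif',
    gdal.GDT_Float32, -1.0,
    raster_driver_creation_tuple=UNCOMPRESSED_GTIFF)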
When creating a datastack archive, InVEST creates a randomly-generated directory that will definitely not conflict in name with other directories in the same datastack. A downside to this is that you have no idea what a given raster is supposed to represent unless you're also looking at the parameters.json file.
Can't we at least include the args key names in the directory name? It would be very nice to be able to just look at the uncompressed datastack on disk and know which parameter I'm looking at.
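A minimal sketch of that idea: keep a random suffix for uniqueness but lead with the args key (illustrative names, not the datastack module's actual code):

import tempfile


def make_data_dir(datastack_root, args_key):
    # Yields e.g. 'lulc_raster_path_a1b2c3' instead of an opaque random name;
    # mkdtemp still guarantees the directory is unique within the datastack.
    return tempfile.mkdtemp(prefix='%s_' % args_key, dir=datastack_root)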
Reported in https://community.naturalcapitalproject.org/t/hra-value-error-encountered/659/3
If we're running on a mac but the info CSV refers to stressor vectors that have Windows-style directory separators, the model crashes.
When testing out the InVEST 3.8.0 release binaries, I could launch and run the models by reaching into the application and running the compiled invest binary. But then if I tried running a model that loaded geometries from a vector, the model would segfault. The segfault traceback when n_workers=-1 looked like this:
Current thread 0x0000700010e0e000 (most recent call first):
File "env/lib/python3.7/site-packages/shapely/geometry/polygon.py", line 494 in geos_polygon_from_py
File "env/lib/python3.7/site-packages/shapely/geometry/polygon.py", line 240 in __init__
File "env/lib/python3.7/site-packages/shapely/geometry/geo.py", line 19 in box
File "env/lib/python3.7/site-packages/taskgraph/Task.py", line 1071 in _call
File "env/lib/python3.7/site-packages/taskgraph/Task.py", line 389 in _task_executor
File "env/lib/python3.7/threading.py", line 870 in run
File "env/lib/python3.7/threading.py", line 926 in _bootstrap_inner
File "env/lib/python3.7/threading.py", line 890 in _bootstrap
Thread 0x0000700010408000 (most recent call first):
File "env/lib/python3.7/threading.py", line 296 in wait
File "env/lib/python3.7/threading.py", line 552 in wait
File "env/lib/python3.7/site-packages/taskgraph/Task.py", line 1231 in join
File "env/lib/python3.7/site-packages/taskgraph/Task.py", line 713 in join
File "env/lib/python3.7/site-packages/natcap/invest/scenic_quality/scenic_quality.py", line 458 in execute
File "env/lib/python3.7/site-packages/natcap/invest/ui/model.py", line 1645 in _logged_target
File "env/lib/python3.7/site-packages/natcap/invest/ui/execution.py", line 68 in run
File "env/lib/python3.7/threading.py", line 926 in _bootstrap_inner
File "env/lib/python3.7/threading.py", line 890 in _bootstrap
Thread 0x000000010b383dc0 (most recent call first):
File "src/natcap/invest/cli.py", line 561 in main
File "src/natcap/invest/cli.py", line 575 in <module>
Segmentation fault: 11
After digging into this some more, it turned out that my homebrew install had GEOS 3.7.3 and GDAL 3.0.1 installed due to a homebrew-managed QGIS install, and that the InVEST binaries were clearly loading the GEOS binaries from the homebrew Cellar (verified by sampling the files that the invest process had open, visible in Activity Monitor). See the attached sample log for more information.
My questions around this are:
- Why are the GEOS libraries being loaded from the homebrew Cellar and not from the bundled binaries?

It would appear that pyinstaller modifies LD_LIBRARY_PATH ... maybe that has something to do with this?
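One way to investigate from outside the process is to list which dylibs the running binary actually has open; a minimal sketch (the PID is illustrative):

import subprocess


def open_geos_gdal_dylibs(pid):
    # `lsof -p PID` lists every file a process has open, including the
    # shared libraries the dynamic linker resolved.
    result = subprocess.run(
        ['lsof', '-p', str(pid)], capture_output=True, text=True, check=True)
    return [line for line in result.stdout.splitlines()
            if '.dylib' in line
            and ('geos' in line.lower() or 'gdal' in line.lower())]


for line in open_geos_gdal_dylibs(12345):  # replace 12345 with the invest PID
    print(line)  # paths under /usr/local/Cellar indicate homebrew libraries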
When we started developing and shipping binaries for InVEST, 32-bit binaries were an obvious choice, as so many computers were still running Windows XP or were running on very old hardware. In this modern age, I can't remember the last time I saw someone working with a 32-bit Windows computer, and macs are all 64-bit anyways. The only computer system I can think of that runs a 32-bit OS is the Raspberry Pi, but that'll be on linux and would therefore not be using our prebuilt binaries.
The recreation model raises a MemoryError on very large AOIs.
To recreate this issue, create a vector polygon that extends over the continental United States. Then run the model with that single polygon and observe a server-side memory error after a few minutes.
Previous debugging has shown the memory error occurs during the point collection phase where a long list of all covered points is created in memory.
There may be a variety of solutions, including an out-of-core point list, an enhanced algorithm that sidesteps the need to maintain a point list in memory, or advanced techniques of subdividing large polygons such that each sub-polygon's points fit in memory, then combining the results after the fact.
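A minimal sketch of the subdivision option, using shapely (purely illustrative; the cell size would need tuning against available memory):

from shapely.geometry import box


def subdivide(polygon, cell_size):
    # Clip the AOI against a regular grid so each piece covers few enough
    # points to fit in memory; results are combined after processing.
    minx, miny, maxx, maxy = polygon.bounds
    pieces = []
    x = minx
    while x < maxx:
        y = miny
        while y < maxy:
            piece = polygon.intersection(
                box(x, y, x + cell_size, y + cell_size))
            if not piece.is_empty:
                pieces.append(piece)
            y += cell_size
        x += cell_size
    return pieces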
We'll get to this post PyGeoprocessing 1.0 refactor.
Originally reported by @richpsharp at https://bitbucket.org/natcap/invest/issues/3614/rec-model-memory-errors
During point snapping, if we have two pixels with the same snapping distance, we should use the Flow Accumulation raster to break the tie. The stream pixel with the highest flow accumulation should be picked over those that are the same distance but with lower flow accumulation values.
This sometimes causes our tests to fail (https://ci.appveyor.com/project/davemfish/invest/builds/28437818/job/y1psai9ejk8h2htg)
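A minimal sketch of the tie-break described above (illustrative data structure, not the model's actual snapping code):

def snap_to_stream(candidates):
    # candidates: list of (distance, flow_accumulation, row, col) tuples for
    # stream pixels within the snapping distance. Sort by distance first,
    # then prefer the highest flow accumulation among equidistant pixels.
    return min(candidates, key=lambda c: (c[0], -c[1]))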
It's clear what the problem is, but unclear what the correct scientific solution is.
fisheries_hst.py line 144 divides by array n_a, which is populated by a user-input matrix of 1s and 0s (see lines 109, 110 for n_a, and trace it back to the user input in fisheries_hst_io.py)
fisheries_hst.py line 110 is especially suspicious: n_a[n_a == 0] = 0
Originally reported by @davemfish at https://bitbucket.org/natcap/invest/issues/3925/fisheries-hst-floatingpointerror-invalid
When we refactored SDR to use Multiple Flow Direction, we failed to update SDR's RKLS calculations: aspect (a component of the LS equation) should now reflect the possibility that a pixel can flow into 8 possible neighbors, but LS does not currently account for this.
After talking about this with Rich, we decided on the following, corrected approach:
FV = "flow value", the packed 64-bit, 8-direction (4 bits per direction) value coming from MFD Flow Direction
d = the integer direction in the form:
3 2 1
4 x 0
5 6 7
P_i(d) = the proportion of flow in pixel i in direction d
LS_i' = a modified LS equation that removes the x_d term entirely
x_d = the length of the pixel from x to neighbor d. If d is even, this will be 1, sqrt(2) otherwise.

P_i(d) = FV(d) / sum(FV(j) for j in {0..7})
FV_avg = sum(1/(x_d) * P_i(d) for d in {0..7})
Compute LS_i' with the modified LS equation.
RKLS_i = R_i * K_i * LS_i' * FV_avg
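A minimal sketch of computing FV_avg from a packed MFD flow value, following the definitions above (bit layout per the description, not the Cython implementation):

import math


def fv_average(flow_value):
    # Unpack 4 bits per direction d in 0..7 from the MFD flow value.
    fv = [(flow_value >> (4 * d)) & 0xF for d in range(8)]
    total = sum(fv)
    if total == 0:
        return 0.0
    # x_d is 1 for even (cardinal) directions and sqrt(2) for odd (diagonal).
    return sum(
        (fv[d] / total) * (1.0 / (1.0 if d % 2 == 0 else math.sqrt(2)))
        for d in range(8))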
HISTORY.rst
The rec model server accumulates files in a cache directory each time a user runs the model. Old workspaces need to be removed periodically because disk space is limited.
A solution is to have a shell script run on cron to find and remove workspaces. We'll track that script here in invest/scripts/recreation
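A minimal sketch of such a cleanup script, runnable from cron (the cache path and 30-day cutoff are illustrative):

import shutil
import time
from pathlib import Path

CACHE_ROOT = Path('/usr/local/recreation-server/cache')  # assumed location
MAX_AGE = 30 * 24 * 3600  # 30 days, in seconds

for workspace in CACHE_ROOT.iterdir():
    if workspace.is_dir() and time.time() - workspace.stat().st_mtime > MAX_AGE:
        shutil.rmtree(workspace)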
When running an InVEST model, specifically an under-development version of SDR that uses PyGeoprocessing 1.6.x, logging does not appear in the log for pygeoprocessing.routing, though it does seem to show up as a log to Task._call. James and I did some debugging and we found that, outside of InVEST, the routing module does log, so we suspect something in the way InVEST is handling logging. Here's a short snippet that uses routing and taskgraph and shows log messages from the routing module:
import logging

import taskgraph
import pygeoprocessing.routing

logging.basicConfig(level=logging.DEBUG)


def main():
    task_graph = taskgraph.TaskGraph('taskgraph_cache', 8)
    target_filled_dem_raster_path = 'filled.tif'
    dem_raster_path_band = (
        r"C:\Users\rpsharp\Documents\bitbucket_repos\invest\data\invest-data\Base_Data\Freshwater\dem",
        1)
    task_graph.add_task(
        func=pygeoprocessing.routing.fill_pits,
        args=(dem_raster_path_band, target_filled_dem_raster_path),
        target_path_list=[target_filled_dem_raster_path])
    task_graph.close()
    task_graph.join()


if __name__ == '__main__':
    main()
Originally reported by @richpsharp at https://bitbucket.org/natcap/invest/issues/3859/pygeoprocessingrouting-not-showing-up-in
The InVEST User's Guide has recently been migrated to GitHub (https://github.com/natcap/invest.users-guide) and we'll need to make sure that this is reflected in the Makefile.
When natcap.invest supported both python2 and python3, we used the future package and a few other things (like conditionally importing Queue vs queue) for compatibility. It would be great to trim this to whatever degree is reasonable.
Known parts to this:
- Remove future from requirements.txt
- Remove future and future-defined imports (builtins, others?) throughout the package
- Replace basestring with str (hra.py, ... )

The InVEST setup.py and python README both refer to bitbucket and/or mercurial. We should update these to use the latest repo URL.
Habitat Quality has been updated a couple of times in the past few years, mostly for pygeoprocessing updates, but it's still largely unchanged. Even though we might be doing some major work on it in the near future, it'd be good to bring it up to the latest InVEST development standards and also update it to use taskgraph.
If the nodata value of a threat raster is undefined, but values other than 0 or 1 are present, we crash with: TypeError: ufunc 'isfinite' not supported for the input types
Here's an example from a user: https://community.naturalcapitalproject.org/t/typeerror-ufunc-isfinite-in-hq-model/874/2
It would be pretty straightforward to wrap this in a try/except and give clear direction to the user to check their nodata value:
zero_threat = numpy.isclose(block, threat_nodata)
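A minimal sketch of that try/except around the line above (the sample inputs and message wording are illustrative):

import numpy

block = numpy.array([[0, 1, 3]])  # example threat block with an invalid value
threat_nodata = None  # undefined nodata, as read from the raster

try:
    zero_threat = numpy.isclose(block, threat_nodata)
except TypeError:
    # numpy.isclose(block, None) raises the "ufunc 'isfinite'" TypeError when
    # the raster has no defined nodata value.
    raise ValueError(
        'The threat raster does not have a defined nodata value. Please '
        'define a nodata value on the raster and re-run the model.')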
This model has a very strict requirement for acceptable values in the soil group raster: (1, 2, 3, 4) are the only acceptable values.
Right now a raster with another value yields ValueError: invalid entry in choice array. We can alert the user to the exact data problem in this case.
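A minimal sketch of an up-front check (illustrative names, not the model's actual code):

import numpy


def validate_soil_groups(block, nodata):
    # Report exactly which invalid codes appear instead of letting
    # numpy.choose fail with "invalid entry in choice array".
    found = set(numpy.unique(block[block != nodata]).tolist())
    invalid = sorted(found - {1, 2, 3, 4})
    if invalid:
        raise ValueError(
            'The soil group raster must contain only the values 1, 2, 3, 4; '
            'found invalid values: %s' % invalid)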
Right now the build pipelines share the make deploy target, which is good. But since we don't build the sample data in the Mac pipeline, make deploy crashes on the Mac when it tries to rsync the sample data to our releases bucket.
We can either modify make deploy or the ci/mac_mini_build_process.sh script to better handle this.
Implement InVEST Pest control model.
Design doc: https://docs.google.com/document/d/1iQagCqzdaMOO8whCL9-VbZFYlrq5ky1qf1xJRrSXeks/edit?usp=sharing
My algorithms are not reproducing the same data as the original model. On hold until we can figure out what the issue might be.
Originally reported by @richpsharp at https://bitbucket.org/natcap/invest/issues/3630/invest-pest-control-model
On bitbucket, we had some appveyor integration that worked pretty well, but would take forever. We still need to test things, but I think we should revisit where these individual steps are done:
I think we should keep the binary build on appveyor for the moment, at least until we get a binary build working as expected on actions. Since the build already works there, I don't think this part is as critical to move, and execution could still take place in parallel with the other test steps.
We should move all tests to github actions, as this is a build matrix that can be done in parallel (which is something that AppVeyor doesn't provide without paying for it). Let's start out by replicating the builds we currently have on appveyor:
If possible, it'd also be nice to run model tests on both python 3.6 and python 3.7 (32- and 64-bit architectures)
We should do this on an available python install ... maybe whatever's on the PATH.
We had previously set up a script at scripts/update_installer_urls.py
to keep track of the latest application binaries, with the intent of having that be our main entry point for downloading the latest release of InVEST. Since then, we decided that the new drupal site's scheduling functionality will be a better approach for updating the latest binaries on the natcap website.
To use this new functionality, we'll need to address a couple things:
Every once in a while we get errors like the one linked below from the forums, where the model crashes on one exception but the underlying error (which is logged to the logfile) is actually that the user ran out of disk space.
Maybe we can raise an informative exception if there's an error of sufficient severity?
Or maybe we can report the last few GDAL error codes in the log that might have affected the final exception?
https://community.naturalcapitalproject.org/t/example-data-for-annual-water-yield-model/308/10
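A minimal sketch of the second idea: capture the last few GDAL errors with a custom error handler so they can be attached to the final exception (standard GDAL python API; the deque size is arbitrary):

from collections import deque

from osgeo import gdal

_recent_gdal_errors = deque(maxlen=5)


def _capture_gdal_error(err_class, err_num, err_msg):
    _recent_gdal_errors.append((err_num, err_msg))


gdal.PushErrorHandler(_capture_gdal_error)
# On a model crash, list(_recent_gdal_errors) could be included in the raised
# exception; disk-full failures typically surface here as I/O write errors.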
Python 3.8 is the most recently-released version of python and we should make sure that the package supports it.
Right now this is the only component of this project depending on redux to manage state. The use of redux here is just a legacy from the prototyping of the HRA app, not a real design decision.
So unless we decide that redux is useful elsewhere, we can probably do without it here.
When building the InVEST binary build on github actions for #26 , I noticed that about 7.5 minutes of the build time was being occupied by the pyinstaller part of the build, and, specifically, the phase where it is "Looking for dynamic libraries" and generating a pile of warnings that look like this:
278982 WARNING: lib not found: libbispeu.KX7AQLB2Z5NFTCADW76YV5UMLBQNDALD.gfortran-win32.dll dependency of C:\hostedtoolcache\windows\Python\3.7.6\x86\lib\site-packages\scipy\interpolate\_fitpack.cp37-win32.pyd
281314 WARNING: lib not found: libdfitpack.EM6D2ZEFEP35UVNTFVXVVRYRWM22ZXEY.gfortran-win32.dll dependency of C:\hostedtoolcache\windows\Python\3.7.6\x86\lib\site-packages\scipy\interpolate\dfitpack.cp37-win32.pyd
299317 WARNING: lib not found: libansari.Q4BAGRNANLWD2YZJOKYPOAUIOLXW2LXK.gfortran-win32.dll dependency of C:\hostedtoolcache\windows\Python\3.7.6\x86\lib\site-packages\scipy\stats\statlib.cp37-win32.pyd
Example build job with these warnings: https://github.com/phargogh/invest/runs/471001452?check_suite_focus=true
The build runs, so this isn't exactly an error, but it sure would be nice to trim ~7.5 minutes off of our ~20 minute runtime for the build job.
Marking this with a complexity estimate of 13 to indicate the possible worst-case situation for this fix. It's quite possible that just upgrading pyinstaller would fix the issue, or maybe it's an environment thing (the conda-based mac build doesn't have this issue), but it could also be something deeper.
This actually has implications for datastack.extract_parameters_from_logfile, which expects to find a line starting with "Arguments". It's simply a matter of adding a newline in the log message from cli.py.
A user experienced an issue where her pools CSV had integer lucode values but floating-point column values. When calling utils.build_lookup_from_csv(), the function returns a dictionary with the lucode keys as floats. If all column values are set to integers, then lucode is also returned as an integer in the dictionary.
https://community.naturalcapitalproject.org/t/keyerror-invest-3-8-0-carbon-storage-model/850
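A minimal sketch of one possible fix: cast the keys back to int after building the lookup (the call signature here is assumed from the issue, not verified against the current utils module):

from natcap.invest import utils

pools = utils.build_lookup_from_csv('carbon_pools.csv', 'lucode')
# Coerce float keys like 1.0 back to int so integer lucode lookups succeed.
pools = {int(lucode): row for lucode, row in pools.items()}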
Once #24 and issue #22 are resolved, it would be really, really nice to move our binary build process from AppVeyor to Github Actions. The problems with AppVeyor binary builds are:
HISTORY.rst
PyTest has become probably the most-used testing framework for python applications, while nosetests (https://pypi.org/project/nose/#history) hasn't been updated since 2015.
Although nosetests works, I don't know of a reason to stick with something that isn't actively maintained and hasn't been for some time. I think we should switch to pytest.
Mentioned on the forums: https://community.naturalcapitalproject.org/t/error-running-offshore-wind-energy-with-custom-aoi-and-custom-grid-land-points/513
Original issue: https://bitbucket.org/natcap/invest/issues/3922/wind-energy-fails-when-no-grid-points
Apparently I was already working on this in my fork on the bugfix/WINDENERGY-3922-error-when-no-grid-pts-intersect-aoi branch. Might want to dig that out.
We just had an issue while working on natcap/taskgraph#14 where we needed to update to actions/checkout@v2 for the actions themselves to work as expected. Let's update to v2 here as well across our various workflows.
This has happened on the forums a couple of times now, but here's the latest example: https://community.naturalcapitalproject.org/t/pollination-model-error-after-updating-to-invest-3-8-0-unable-to-initiate-run/856
When a user provides an overviews file instead of the GeoTiff it's associated with, the overviews file triggers the validation error stating that the file must be projected. It would be nice if we could check if this file is an overviews file and provide a more specific and helpful error message for this common use case.
Marked as a 3 difficulty because the slightly more interesting thing will be figuring out how to build an overviews file within GDAL as part of the tests. I think maybe @richpsharp has done that before?
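For the test-data question, building an external overviews file is standard GDAL usage; a minimal sketch (the file path is illustrative):

from osgeo import gdal

raster = gdal.OpenEx('landcover.tif', gdal.OF_RASTER)  # read-only open
# For a read-only GeoTiff, overviews land in an external landcover.tif.ovr,
# which is the sort of file users have been passing in by mistake.
raster.BuildOverviews('NEAREST', [2, 4, 8])
raster = None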
If I'm running a parameter study of a single model, let's say SDR, it would be really nice to be able to re-run the model in multiple workspaces (to keep the files separate), but only recompute the things that are needed. It seems reasonable to pass in the taskgraph cache dir as an optional argument, and, if it's provided, the cache location should be reused.
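A minimal sketch of the optional-argument idea; the taskgraph_cache_dir args key is an assumption, not an existing InVEST parameter:

import os

import taskgraph


def execute(args):
    # Reuse a shared cache when provided so re-runs in a fresh workspace can
    # skip already-computed tasks; otherwise fall back to a per-workspace one.
    cache_dir = args.get('taskgraph_cache_dir') or os.path.join(
        args['workspace_dir'], 'taskgraph_cache')
    graph = taskgraph.TaskGraph(cache_dir, -1)
    # ... add tasks as usual ...
    graph.close()
    graph.join()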
It looks like there's a test in the UI suite that's flakily failing. It'd be better to take a look at it than to ignore it. https://github.com/natcap/invest/pull/33/checks?check_run_id=468080170
This came up while taking a look at #33.
Reported at https://forums.naturalcapitalproject.org/index.php?p=/discussion/1409/attributeerror-float-object-has-no-attribute-lower-in-running-invest-blue-carbon-model. Data is attached to the forums post.
Full exception:
Traceback (most recent call last):
File "c:\users\natcap-servers\jenkins-home\workspace\natcap.invest\label\gce-windows-1\release_env\lib\site-packages\natcap\invest\ui\execution.py", line 68, in run
File "c:\users\natcap-servers\jenkins-home\workspace\natcap.invest\label\gce-windows-1\release_env\lib\site-packages\natcap\invest\ui\model.py", line 1505, in _logged_target
File "c:\users\natcap-servers\jenkins-home\workspace\natcap.invest\label\gce-windows-1\release_env\lib\site-packages\natcap\invest\coastal_blue_carbon\coastal_blue_carbon.py", line 126, in execute
File "c:\users\natcap-servers\jenkins-home\workspace\natcap.invest\label\gce-windows-1\release_env\lib\site-packages\natcap\invest\coastal_blue_carbon\coastal_blue_carbon.py", line 684, in get_inputs
File "c:\users\natcap-servers\jenkins-home\workspace\natcap.invest\label\gce-windows-1\release_env\lib\site-packages\natcap\invest\coastal_blue_carbon\coastal_blue_carbon.py", line 684, in <genexpr>
AttributeError: 'float' object has no attribute 'lower'
06/19/2018 10:22:10 natcap.invest.ui.execution INFO Execution finished
This might have been addressed by the pandas refactor, but it'll be worth checking to make sure this is indeed resolved.
Originally reported at https://bitbucket.org/natcap/invest/issues/3749/coastal-blue-carbon-fails-when-lulc-lookup.
Here's an error report: https://community.naturalcapitalproject.org/t/urban-cooling-model-error/824/2
The model expects the args in question to either have values that cast to float, or for the key to be missing from the args dict altogether. Not handled is the case where args['cc_weight_shade'] = ''. Ditto for the other two cc_weight args.
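A minimal sketch of handling the empty-string case; the default weights shown are assumptions for illustration:

DEFAULT_WEIGHTS = {  # assumed defaults, for illustration only
    'cc_weight_shade': 0.6,
    'cc_weight_albedo': 0.2,
    'cc_weight_eti': 0.2,
}


def resolve_cc_weights(args):
    weights = {}
    for key, default in DEFAULT_WEIGHTS.items():
        value = args.get(key, '')
        # Treat '' (and missing keys) the same way: fall back to the default.
        weights[key] = float(value) if str(value).strip() else default
    return weights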
It would be really nice to be able to create wheels as a part of our regular build processes. manylinux would be great, but that's a smaller concern than windows wheels. Windows would be the highest priority, mac second.