codeforkjeff / dbt-sqlite
A SQLite adapter plugin for dbt (data build tool)
License: Apache License 2.0
If you run pip install dbt-sqlite and then pip uninstall dbt-sqlite, it will ask to confirm deleting a lot of files from dbt-core. Something is messed up about the packaging.
This seems to happen with other dbt adapter packages too.
Adding the __version__.py file with a version variable set to the package version allows the adapter version to be printed when dbt --version is invoked.
Current behavior:
$ dbt --version
installed version: 1.0.0
latest version: 1.0.0
Up to date!
Plugins:
Expected behavior:
$ dbt --version
installed version: 1.0.0
latest version: 1.0.0
Up to date!
Plugins:
- sqlite: 0.2.2
Reference: https://docs.getdbt.com/docs/contributing/building-a-new-adapter#__version__py
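For reference, the file in question is tiny; a sketch (the version string here is illustrative):

```python
# dbt/adapters/sqlite/__version__.py (version number is illustrative)
version = "0.2.2"
```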
Thanks for your work on this adapter, it's great.
The batch size to use when loading seeds is specified here as 100000.
I'm not sure how the 100k value was determined, but I ran into the following SQLite error when trying to load a seed file of 70k rows, 5 cols, 2.37MB (a dictionary of common words in 14 languages):
17:46:24 7 of 7 ERROR loading seed file main.synth_words ................................ [ERROR in 2.54s]
...
17:46:24 Database Error in seed synth_words (seeds/synth_words.csv)
17:46:24 too many SQL variables
I resolved the error by manually changing the batch size in macros/materializations/seed/seed.sql
from 100k down to 10k.
SQLite's limits include SQLITE_MAX_VARIABLE_NUMBER, which defaults to 999 for SQLite versions prior to 3.32.0 (2020-05-22) and 32766 for 3.32.0 and later.
(I'm running SQLite 3.31.1.) It seems that the number of variables used in the seed query scales with batch size.
Sidebar: dbt itself specifies a maximum file size for seeds but this is only used to determine whether or not to hash the file. dbt discourages using very large seed files, but does not impose a limit as far as I can tell.
Would it be possible to decrease the batch size from 100k to, say, 10k or so? And/or make it a configurable seed_batch_size parameter?
My guess is that the optimal value probably depends on both the number of rows and columns in the seed file, so it may be difficult to hard-code and justify any specific batch size. But something smaller, while resulting in more queries against the database, should still be fairly performant.
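As a rough sketch of the sizing constraint (the helper name is hypothetical): each inserted row consumes one bound variable per column, so the safe batch size is just the variable limit divided by the column count.

```python
# Sketch: choose a seed batch size that keeps each INSERT under SQLite's
# bound-variable limit (SQLITE_MAX_VARIABLE_NUMBER: 999 before 3.32.0,
# 32766 from 3.32.0 on).

def max_seed_batch_size(num_columns: int, max_variables: int = 999) -> int:
    """Largest row batch whose INSERT uses at most max_variables placeholders."""
    if num_columns <= 0:
        raise ValueError("seed must have at least one column")
    return max(1, max_variables // num_columns)
```

With the 5-column seed above and the default 999-variable limit, this gives batches of 199 rows; with the 32766 limit, 6553 rows. Both are far below the current 100k default.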
Since using SQLite is always going to be local, we can (mis)use dbt's python models to run code, probably in a subprocess.
Firstly, thank you for building and releasing dbt-sqlite. I find it really useful!
With dbt-sqlite==0.1.0 and dbt==0.19.1, I am encountering the following error when running dbt compile:
Running with dbt=0.19.1
Encountered an error:
Field "extensions" of type typing.Union[str, NoneType] is missing in dbt.adapters.sqlite.connections.SQLiteCredentials instance
Previously, with dbt==0.19.0, everything was running smoothly for me.
I took another look at the dbt-sqlite README.md section on how to set up profiles.yml. This part in particular:
# optional: semi-colon separated list of file paths for SQLite extensions to load.
# digest.so is needed to provide for snapshots to work; see README
extensions: "/path/to/sqlite-digest/digest.so"
The error message I encountered leads me to consider that maybe extensions is no longer optional in dbt==0.19.1.
I was able to get around the error by following the idea of this comment in the dbt Slack group: https://getdbt.slack.com/archives/CJN7XRF1B/p1618594722275700?thread_ts=1618545451.270400&cid=CJN7XRF1B
For me it looks like:
extensions: "{{ 'None' | as_native }}"
Doing that eliminated the error for me and didn't seem to break anything with dbt-sqlite 0.18.x or 0.19.0.
There might be a simpler solution; I don't know enough about extensions to say. I just wanted to call it to your attention.
There is a sqlite__datediff_broken macro that needs to be fixed, more fully implemented, renamed, and tested.
Dependabot updates could be set up to make sure the package is tested with the latest dbt-tests-adapter by putting it in a pip constraints file.
Happy to submit a PR if there's interest!
We've just published the release cut of dbt-core 1.2.0, dbt-core 1.2.0rc1 (PyPI | GitHub release notes).
dbt-labs/dbt-core#5468 is an open discussion with more detailed information, and dbt-labs/dbt-core#5474 is for keeping track of the community's progress on releasing 1.2.0.
Below is a checklist of work that would enable a successful 1.2.0 release of your adapter.
BaseDocsGenerate and BaseDocsGenReferences
dbt-labs/dbt-core#5432 might make it into the second release cut in the next week, in which case you may also want to:
The Available Adapters page is one of the dbt community's most-visited docs pages. It would be of great benefit for first-time visitors to the dbt docs to see:
dbt-labs/docs.getdbt.com#1489 exists to address this with all as-of-yet undocumented adapters.
We just released Documenting a new adapter, a new guide on how to add an adapter to the Available Adapters page. I'd love to see this adapter on that page, so feel free to reach out with any questions/blockers by either replying to this issue, or posting in the #adapter-ecosystem channel of the dbt Community Slack.
Looking forward to the contribution @codeforkjeff!
The closest thing in sqlite is group_concat. Maybe there's an extension that provides this? sqlean doesn't have it.
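A quick sqlite3 check of what group_concat gives you. Note it takes an optional separator but, unlike listagg, offers no ORDER BY within the group:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE t (grp TEXT, val TEXT);
    INSERT INTO t VALUES ('a', 'x'), ('a', 'y'), ('b', 'z');
""")
# group_concat(expr, separator) is SQLite's closest analog to listagg;
# the order of values within each group is not guaranteed.
rows = conn.execute(
    "SELECT grp, group_concat(val, ', ') FROM t GROUP BY grp ORDER BY grp"
).fetchall()
```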
@codeforkjeff I found another issue and I think it comes from the upgrade. What do you think?
Running with dbt=0.20.0
Found 5 models, 19 tests, 0 snapshots, 0 analyses, 538 macros, 0 operations, 2 seed files, 0 sources, 0 exposures
03:54:44 | Concurrency: 1 threads (target='dev')
03:54:44 |
03:54:45 | Done.
03:54:45 | Building catalog
Encountered an error while generating catalog: Database Error
no such table: main_dbt_test__audit.sqlite_master
dbt encountered 1 failure while writing the catalog
03:54:45 | Catalog written to /dbt/target/catalog.json
Same as before your changes: for the other commands, it uses a wrong database name instead of just main.
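The failure is easy to reproduce directly with sqlite3: main.sqlite_master always exists, but a schema name that was never ATTACHed does not.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# The built-in main schema always exposes its catalog table:
conn.execute("SELECT name FROM main.sqlite_master").fetchall()

# A schema that was never ATTACHed cannot be queried:
try:
    conn.execute("SELECT name FROM main_dbt_test__audit.sqlite_master")
    failed = False
except sqlite3.OperationalError:
    # sqlite reports: no such table: main_dbt_test__audit.sqlite_master
    failed = True
```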
dbt will create the model successfully on the first run, but on a subsequent run, I get this error:
Compilation Error in model CamelCaseTest (models\CamelCaseTest.sql)
When searching for a relation, dbt found an approximate match. Instead of guessing
which relation to use, dbt will move on. Please delete "main"."CamelCaseTest", or rename it to be less ambiguous.
Searched for: main.camelcasetest
Found: "main"."CamelCaseTest"
> in macro materialization_table_sqlite (macros\materializations\table\table.sql)
> called by model CamelCaseTest (models\CamelCaseTest.sql)
Done. PASS=0 WARN=0 ERROR=1 SKIP=0 TOTAL=1
Not sure why it is searching for a table name in all lower case, instead of the original CamelCaseTest passed in through dbt run -m CamelCaseTest.
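The underlying behavior comes from SQLite itself: sqlite_master preserves the casing used at CREATE time, but name resolution is case-insensitive, which is presumably what trips up dbt's exact-match relation lookup. A quick demonstration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "CamelCaseTest" (id INT)')

# sqlite_master preserves the casing used at CREATE time...
(name,) = conn.execute("SELECT name FROM sqlite_master").fetchone()

# ...but identifier lookups are case-insensitive, so the lowercase
# name still resolves to the same table without error:
conn.execute("SELECT * FROM camelcasetest").fetchall()
```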
Minor version v1.7 is targeted for final release on October 26, 2023. As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment to ensure users of your adapter can make use of this new minor version.
dbt-labs/dbt-core#8307 is an open discussion with more detailed information. If you have questions, please put them there!
The latest version of dbt Core, dbt-core==1.5.0rc1, was published on April 13, 2023 (PyPI | GitHub).
dbt-labs/dbt-core#7213 is an open discussion with more detailed information. If you have questions, please put them there!
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.5.0 release of your adapter.
1.6.0
FYI, dbt-core==1.6.0 is expected to be released at the end of July, with a release cut at least two weeks prior.
Several tests in test_data_types.py are currently failing, and the issues around types are also what's behind the failures in TestDocsGenerateSqlite and TestDocsGenReferencesSqlite.
Here are some things the adapter needs to account for:
- create table statements, including the case
The incremental materialization in 1.1.x of this adapter is a little out of date with changes in dbt-core 1.1.x:
- get_delete_insert_merge_sql macro
Here is the error I received during my test:
00:49:42 Completed with 1 error and 0 warnings:
00:49:42
00:49:42 Failure in test assert_equal_test_dateadd_actual__expected (models/test_dateadd.yml)
00:49:42 Got 4 results, configured to fail if != 0
00:49:42
00:49:42 compiled Code at target/compiled/test/models/test_dateadd.yml/assert_equal_test_dateadd_actual__expected.sql
00:49:42
00:49:42 Done. PASS=2 WARN=0 ERROR=1 SKIP=0 TOTAL=3
Here is the sample seed used for the test:
seeds__data_dateadd_csv = """from_time,interval_length,datepart,result
2018-01-01 01:00:00,1,day,2018-01-02 01:00:00
2018-01-01 01:00:00,1,month,2018-02-01 01:00:00
2018-01-01 01:00:00,1,year,2019-01-01 01:00:00
2018-01-01 01:00:00,1,hour,2018-01-01 02:00:00
,1,day,
"""
Since our existing dateadd macro only uses date, we are not able to pass this test, which has datetime values in it.
{% macro sqlite__dateadd(datepart, interval, from_date_or_timestamp) %}
    date(
        {{ from_date_or_timestamp }},
        "{{ interval }} {{ datepart }}"
    )
{% endmacro %}
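The difference is visible directly in sqlite3: date() applies the modifier but then drops the time component, while datetime() keeps it, so a fix would presumably switch the macro to datetime():

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# date() applies the modifier but formats the result as a bare date:
(d,) = conn.execute("SELECT date('2018-01-01 01:00:00', '+1 hour')").fetchone()

# datetime() preserves the time component:
(dt,) = conn.execute(
    "SELECT datetime('2018-01-01 01:00:00', '+1 hour')"
).fetchone()
```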
The latest version of dbt Core, dbt-core==1.4.0, was published on January 25, 2023 (PyPI | GitHub). In fact, a patch, dbt-core==1.4.1 (PyPI | GitHub), was also released on the same day.
dbt-labs/dbt-core#6624 is an open discussion with more detailed information. If you have questions, please put them there! dbt-labs/dbt-core#6849 is for keeping track of the community's progress on releasing 1.4.0
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.4.0 release of your adapter.
FYI, dbt-core==1.5.0 is expected to be released at the end of April. Please plan on allocating more effort to upgrade support compared to previous minor versions. Expect to hear more in the middle of April.
At a high level, expect much greater adapter test coverage (a very good thing!) and some likely heavy renaming and restructuring as the API-ification of dbt-core is now well underway. See https://github.com/dbt-labs/dbt-core/milestone/82 for more information.
The latest release cut for 1.3.0, dbt-core==1.3.0rc2, was published on October 3, 2022 (PyPI | GitHub). We are targeting releasing the official cut of 1.3.0 in time for the week of October 16 (in time for the Coalesce conference).
We're trying to establish the following precedent w.r.t. minor versions:
Partner adapter maintainers release their adapter's minor version within four weeks of the initial RC being released. Given the delay on our side in notifying you, we'd like to set a target date of November 7 (four weeks from today) for maintainers to release their minor version.
Timeframe | Date (intended) | Date (Actual) | Event |
---|---|---|---|
D - 3 weeks | Sep 21 | Oct 10 | dbt Labs informs maintainers of upcoming minor release |
D - 2 weeks | Sep 28 | Sep 28 | core 1.3 RC is released |
Day D | October 12 | Oct 12 | core 1.3 official is published |
D + 2 weeks | October 26 | Nov 7 | dbt-adapter 1.3 is published |
dbt-labs/dbt-core#6011 is an open discussion with more detailed information, and dbt-labs/dbt-core#6040 is for keeping track of the community's progress on releasing 1.3.0.
Below is a checklist of work that would enable a successful 1.3.0 release of your adapter.
I'm sure this is a weird issue...dbt started failing this morning on multiple independent machines.
With dbt-sqlite, I have a cron job that runs dbt run every hour and has for months. This morning it started failing. I started debugging and it's gotten weird: when I hit a wall, I opened up the project on another machine that had a backup from yesterday. When running dbt run, I got the same error. dbt run had been run on this version of the database multiple times since yesterday without issue on the original machine. My working theory was that some newly ingested data was causing issues, but this seems to rule that out.
I haven't changed anything in days...no source code, no models, no binaries, no system update, no system reboot, etc. It's been on a server running in the background. The only thing I changed, after the error started happening I updated the config to not report metrics back to dbt in hopes a network problem was causing it, but that didn't help.
Here's the output of dbt debug:
16:37:56 Running with dbt=1.1.3
dbt version: 1.1.3
python version: 3.7.16
python path: /Users/cboden/.local/pipx/venvs/dbt-sqlite/bin/python
os info: Darwin-18.7.0-x86_64-i386-64bit
Using profiles.yml file at /Users/cboden/.dbt/profiles.yml
Using dbt_project.yml file at /Users/cboden/redacted/path/to/project/dbt/dbt_project.yml
Configuration:
profiles.yml file [OK found and valid]
dbt_project.yml file [OK found and valid]
Required dependencies:
- git [OK found]
Connection:
database: database
schema: main
schemas_and_paths: {'main': '/Users/cboden/redacted/path/to/project/project-database.db'}
schema_directory: /Users/cboden/redacted/path/to/project
Connection test: [ERROR]
1 check failed:
dbt was unable to connect to the specified database.
The database returned the following error:
>Runtime Error
Database Error
malformed database schema (my_redacted_model_name) - view "my_redacted_model_name" cannot reference objects in database main
Check your database credentials and try again. For more information, visit:
https://docs.getdbt.com/docs/configure-your-profile
Here is the output from dbt.log while running dbt debug:
============================== 2023-03-30 16:37:56.839790 | decf6338-ceb0-41d7-a70e-b35f5c98e6d3 ==============================
16:37:56.839808 [info ] [MainThread]: Running with dbt=1.1.3
16:37:56.840766 [debug] [MainThread]: running dbt with arguments {'write_json': True, 'use_colors': True, 'printer_width': 80, 'version_check': True, 'partial_parse': True, 'static_parser': True, 'profiles_dir': '/Users/cboden/.dbt', 'send_anonymous_usage_stats': False, 'event_buffer_size': 100000, 'quiet': False, 'no_print': False, 'cache_selected_only': False, 'config_dir': False, 'which': 'debug', 'indirect_selection': 'eager'}
16:37:56.841150 [debug] [MainThread]: Tracking: do not track
16:37:56.946340 [debug] [MainThread]: Executing "git --help"
16:37:56.957930 [debug] [MainThread]: STDOUT: "b"usage: git [--version] [--help] [-C <path>] [-c <name>=<value>]\n [--exec-path[=<path>]] [--html-path] [--man-path] [--info-path]\n [-p | --paginate | -P | --no-pager] [--no-replace-objects] [--bare]\n [--git-dir=<path>] [--work-tree=<path>] [--namespace=<name>]\n <command> [<args>]\n\nThese are common Git commands used in various situations:\n\nstart a working area (see also: git help tutorial)\n clone Clone a repository into a new directory\n init Create an empty Git repository or reinitialize an existing one\n\nwork on the current change (see also: git help everyday)\n add Add file contents to the index\n mv Move or rename a file, a directory, or a symlink\n reset Reset current HEAD to the specified state\n rm Remove files from the working tree and from the index\n\nexamine the history and state (see also: git help revisions)\n bisect Use binary search to find the commit that introduced a bug\n grep Print lines matching a pattern\n log Show commit logs\n show Show various types of objects\n status Show the working tree status\n\ngrow, mark and tweak your common history\n branch List, create, or delete branches\n checkout Switch branches or restore working tree files\n commit Record changes to the repository\n diff Show changes between commits, commit and working tree, etc\n merge Join two or more development histories together\n rebase Reapply commits on top of another base tip\n tag Create, list, delete or verify a tag object signed with GPG\n\ncollaborate (see also: git help workflows)\n fetch Download objects and refs from another repository\n pull Fetch from and integrate with another repository or a local branch\n push Update remote refs along with associated objects\n\n'git help -a' and 'git help -g' list available subcommands and some\nconcept guides. See 'git help <command>' or 'git help <concept>'\nto read about a specific subcommand or concept.\n""
16:37:56.958720 [debug] [MainThread]: STDERR: "b''"
16:37:56.962706 [debug] [MainThread]: Acquiring new sqlite connection "debug"
16:37:56.964009 [debug] [MainThread]: Using sqlite connection "debug"
16:37:56.964372 [debug] [MainThread]: On debug: select 1 as id
16:37:56.964754 [debug] [MainThread]: Opening a new connection, currently in state init
16:37:56.970526 [debug] [MainThread]: On debug: No close available on handle
16:37:56.971823 [debug] [MainThread]: Connection 'debug' was properly closed.
I'm bewildered and don't know where to go from here. Any help would be greatly appreciated.
Hi,
love the idea of a dbt sqlite adapter for demos and stuff.
Am working on a demo and wanted to show incremental functionality.
If you provide a unique key, the macro sqlite_incremental_upsert tries to execute two statements (a delete and an insert) in one call, leading to the "You can only execute one statement at a time." error.
I'll try to share the demo as well (I'll have to configure GitHub properly first) and will try to help once I've gotten deeper into the sqlite adapter.
But maybe you already have a quick idea how to solve this (or can see what I'm doing wrong).
Cheers Hannes
The model is basically
{{ config(unique_key='unique_key') }}
SELECT
*
FROM {{ref('STAGE_TABLE')}}
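The error reproduces with plain sqlite3, whose execute() accepts only a single statement; a sketch of one possible workaround (table names here are illustrative) is to run the delete and the insert as separate execute() calls inside one transaction:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE tgt (unique_key INT PRIMARY KEY, v TEXT);
    CREATE TABLE src (unique_key INT, v TEXT);
    INSERT INTO src VALUES (1, 'new');
""")

# Two statements in one execute() call fail with
# "You can only execute one statement at a time."
# (sqlite3.Warning on older Pythons, ProgrammingError on newer ones)
try:
    conn.execute(
        "DELETE FROM tgt WHERE unique_key IN (SELECT unique_key FROM src); "
        "INSERT INTO tgt SELECT * FROM src;"
    )
except (sqlite3.Warning, sqlite3.ProgrammingError):
    pass

# Workaround: issue the statements separately within the same transaction.
conn.execute("DELETE FROM tgt WHERE unique_key IN (SELECT unique_key FROM src)")
conn.execute("INSERT INTO tgt SELECT * FROM src")
conn.commit()
```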
Please add support for the latest version. With the latest dbt version I get:
Running with dbt=0.20.0
Found 5 models, 19 tests, 0 snapshots, 0 analyses, 537 macros, 0 operations, 1 seed file, 0 sources, 0 exposures
Encountered an error:
Runtime Error
Database Error
no such table: main_dbt_test__audit.sqlite_master
It should be main.sqlite_master instead of main_dbt_test__audit.sqlite_master.
Thanks
SQLite on Mac OS doesn't have support for loadable extensions compiled in, so enable_load_extension() fails. This is true for Big Sur's Python 2.7.16 and 3.8.2. The Python docs also mention this issue.
The fix is to not call enable_load_extension() unless there are extensions to load, so that the adapter can at least work on Mac OS with the pythons that ship with it, w/o extensions. Snapshots will not work due to missing the md5() function. To get those working, people running Mac OS will need to compile sqlite/python or get some other python distribution (maybe Anaconda or Intel's python dist?).
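A minimal sketch of that guard (the function name is hypothetical, not the adapter's actual API): only touch extension loading when extensions are actually configured.

```python
import sqlite3

def connect_sqlite(path: str, extensions: tuple = ()) -> sqlite3.Connection:
    """Open a connection, loading extensions only if any are configured,
    so builds without loadable-extension support (e.g. the Pythons that
    ship with Mac OS) still work."""
    conn = sqlite3.connect(path)
    if extensions:
        # Only reached when extensions are configured; raises on builds
        # compiled without loadable-extension support.
        conn.enable_load_extension(True)
        for ext in extensions:
            conn.load_extension(ext)
        conn.enable_load_extension(False)
    return conn
```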
I have a 2.1GB CSV file in the seeds directory. When I run dbt seed, it works for a while then crashes:
$ dbt seed
18:40:13 Running with dbt=1.1.0
18:40:14 Found 2 models, 0 tests, 0 snapshots, 0 analyses, 173 macros, 0 operations, 2 seed files, 0 sources, 0 exposures, 0 metrics
18:40:14
18:40:14 Concurrency: 1 threads (target='dev')
18:40:14
18:40:14 1 of 2 START seed file main.DATA ............................................. [RUN]
[1] 702783 killed dbt seed
$ /usr/lib/python3.9/multiprocessing/resource_tracker.py:216: UserWarning: resource_tracker: There appear to be 2 leaked semaphore objects to clean up at shutdown
warnings.warn('resource_tracker: There appear to be %d '
Running dmesg shows:
[553066.736092] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=user.slice,mems_allowed=0,global_oom,task_memcg=/,task=dbt,pid=709386,uid=1000
[553066.736147] Out of memory: Killed process 709386 (dbt) total-vm:4719592kB, anon-rss:3142892kB, file-rss:8kB, shmem-rss:4kB, UID:1000 pgtables:6252kB oom_score_adj:0
[553067.100938] oom_reaper: reaped process 709386 (dbt), now anon-rss:0kB, file-rss:0kB, shmem-rss:4kB
Running on a system with limited RAM, obviously, but is there any reason it would run out of RAM while inserting data into the DB?
Thanks for any pointers!
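Large seeds would be kinder to memory if streamed in fixed-size batches rather than read wholesale; a sketch of that approach (the helper name is hypothetical, not the adapter's code):

```python
import csv
import sqlite3
from itertools import islice

def load_seed_in_batches(conn, csv_path, table, batch_size=10_000):
    """Stream a seed CSV into SQLite without holding all rows in memory."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        placeholders = ", ".join("?" for _ in header)
        sql = f"INSERT INTO {table} VALUES ({placeholders})"
        while True:
            # Pull at most batch_size rows from the file at a time.
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            conn.executemany(sql, batch)
            conn.commit()
```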
See https://docs.getdbt.com/docs/contributing/building-a-new-adapter#other-files
Couldn't figure out how to do the schemas_and_paths and extensions parts
fixed:
type: sqlite
threads: 1
database: "database"
schema: "main"
prompts:
schemas_and_paths:
main:
hint: '/my_project/data/etl.db'
schema_directory:
hint: '/my_project/data'
extensions:
- "/path/to/sqlean/crypto.so"
Hey,
I am using dbt-sqlite 1.4.0 with dbt-core 1.5.2 and I get the following error:
File "XXX\.venv\Lib\site-packages\dbt\adapters\sqlite\connections.py", line 40, in unique_field
return self.host
^^^^^^^^^
AttributeError: 'SQLiteCredentials' object has no attribute 'host'
The problem is the telemetry hash in SQLiteCredentials, which subclasses Credentials:
@property
def unique_field(self):
"""
Hashed and included in anonymous telemetry to track adapter adoption.
Pick a field that can uniquely identify one team/organization building with this adapter
"""
return self.host
Credentials from dbt.adapters.base has no host field. Maybe change it to self.schema_directory?
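A minimal sketch of the suggested fix; the stand-in class below is illustrative, not the real SQLiteCredentials. The telemetry identifier is derived from a field the credentials actually carry:

```python
from dataclasses import dataclass

@dataclass
class SQLiteCredentialsSketch:
    """Illustrative stand-in for SQLiteCredentials."""
    schema: str
    schema_directory: str

    @property
    def unique_field(self) -> str:
        # schema_directory exists on these credentials, unlike host,
        # so the telemetry hash no longer raises AttributeError.
        return self.schema_directory
```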
Best Regards!
Minor version v1.6 is targeted for final release on July 27, 2023. As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment to ensure users of your adapter can make use of this new minor version.
dbt-labs/dbt-core#7958 is an open discussion with more detailed information. If you have questions, please put them there!
The above linked guide has more information, but below is a high-level checklist of work that would enable a successful 1.6.0 release of your adapter:
### Tasks
- [ ] SUPPORT: materialized views
- [ ] SUPPORT: new `clone` command
- [ ] BEHIND THE SCENES: Drop support for Python 3.7 (if you haven't already)
- [ ] BEHIND THE SCENES: new arg for `adapter.execute()`
- [ ] BEHIND THE SCENES: ensure support for revamped `dbt debug`
- [ ] BEHIND THE SCENES: Add support for new/modified relevant tests
1.7.0
FYI, dbt-core==1.7.0 is expected to be released on October 12, 2023, in time for Coalesce, the annual analytics engineering conference!
This issue has appeared since version 1.4 of dbt-core
Minor version v1.8 is targeted for final release within dbt Core on May 9, 2024. As a maintainer of a dbt adapter, we strongly encourage you to release a corresponding minor version increment to ensure users of your adapter can make use of this new minor version.
As of dbt-core v1.8.0, we no longer need to encourage you to release a new minor version anytime we do. After following the linked upgrade guide, we guarantee your adapter will be forward compatible with all future minor versions of dbt-core (at least until v2.0, which is not yet planned).
Another major win: you can now make your adapter truly SemVer compliant, as you can release new versions of your adapter without needing to wait for a new dbt-core release. You can actually follow
dbt-labs/dbt-core#9798 is an open discussion with more detailed information. If you have questions, please put them there!
The docs have been updated a lot since I last looked at them. Do a review and make sure this plugin does everything it's supposed to.
https://docs.getdbt.com/docs/contributing/building-a-new-adapter
Due to some personal issues, I unfortunately don't have time to maintain this project anymore. This adapter works with dbt 1.4.0 but there have been two minor versions since then that probably require changes for full compatibility.
If anyone wants to take this over, please let me know and we can coordinate it somehow.
Not sure how feasible this is.
I think this is doable? The slightly tricky part is determining whether the passed-in value is a date or a timestamp.
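One way to make that determination is a simple shape check on the literal; a sketch (the helper name and the exact pattern are assumptions, not the adapter's code):

```python
import re

def looks_like_timestamp(value: str) -> bool:
    """True for 'YYYY-MM-DD HH:MM[:SS]' literals, False for bare 'YYYY-MM-DD',
    so a dateadd-style macro could emit datetime() or date() accordingly."""
    return bool(
        re.fullmatch(r"\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}(:\d{2})?", value)
    )
```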