A scalable, distributed, collaborative, document-graph database, for the realtime web
Home Page: https://surrealdb.com
License: Other
One should be able to define link constraints to multiple tables. Currently it's only possible to create a link constraint on a single type.
DEFINE FIELD link ON entity TYPE record(person);
We should be able to constrain record types to multiple types (in a polymorphic way).
DEFINE FIELD link ON entity TYPE record(person, organisation, cause);
No alternative methods exist.
surreal 1.0.0-beta.5 for macos on aarch64
No response
The original points for this issue can be seen in #32.
Currently there is no way of finding out which Namespace or Database is currently selected.
We should have a new statement type or a function for retrieving the current Namespace and current Database.
It is not yet possible to get the currently selected NS or DB.
surreal 1.0.0-beta.6 for macos on aarch64
No response
The syntax for defining an embedded JavaScript function is a little obtuse and obscure.
CREATE event SET name = fn::future -> () => {
return 'my js function';
};
Why not define embedded JavaScript functions just like in JavaScript...
CREATE event SET name = function() {
return 'my js function';
};
We'll also enable the ability to define functions using a shortened syntax (similar to Rust)...
CREATE event SET name = fn() {
return 'my js function';
};
surreal 1.0.0-beta.2 for macos on aarch64
No response
It appears the Docker image is not public?
docker run --pull --rm -p 8000:8000 surrealdb/surrealdb:latest start
docker: Error response from daemon: No such image: surrealdb/surrealdb:latest.
See 'docker run --help'.
docker run --pull --rm -p 8000:8000 surrealdb/surrealdb:latest start
To download the image
N/A
No response
Currently, after a recent update, no web requests make use of HTTP compression, as the warp library does not support setting the output compression based on the Accept-Encoding request header.
Currently there is no standard solution until seanmonstar/warp#513 gets added in to the warp library.
We could implement request compression by taking a similar approach to casper-network/casper-node#2077.
surreal 1.0.0-beta.5 for macos on aarch64
No response
Currently, when creating Vec<u8> keys for use in the datastore, we pass in references, which are then cloned before being serialized. In addition, when deserializing a datastore key, the value is cloned to create owned data.
This is unnecessary as the datastore key is never used or held beyond the end of a local function.
Zero-copy deserialization will ensure that we are not unnecessarily cloning &str values when serializing, and cloning data once again when deserializing.
With this improvement, writing to and reading from the key-value store should be quicker, with less memory allocation.
For this to work, we need to make the storekey deserializer accept borrowed data as can be seen in rmp-serde - https://github.com/3Hren/msgpack-rust/blob/master/rmp-serde/src/decode.rs#L909-L1003
#[derive(Clone, Debug, Eq, PartialEq, PartialOrd, Serialize, Deserialize)]
pub struct Ns<'a> {
    kv: &'a str,
    _a: &'a str,
    ns: &'a str,
}

impl<'a> Into<Vec<u8>> for Ns<'a> {
    fn into(self) -> Vec<u8> {
        self.encode().unwrap()
    }
}

impl<'a> From<&'a [u8]> for Ns<'a> {
    // Borrowed deserialization must borrow from the input buffer, so this
    // impl takes a slice rather than an owned Vec<u8>.
    fn from(val: &'a [u8]) -> Self {
        deserialize(val).unwrap()
    }
}

pub fn new<'a>(ns: &'a str) -> Ns<'a> {
    Ns::new(ns)
}

impl<'a> Ns<'a> {
    pub fn new(ns: &'a str) -> Ns<'a> {
        Ns {
            kv: BASE,
            _a: "!ns",
            ns,
        }
    }

    pub fn encode(&self) -> Result<Vec<u8>, Error> {
        Ok(serialize(self)?)
    }

    pub fn decode(v: &[u8]) -> Result<Ns, Error> {
        Ok(deserialize(v)?)
    }
}
No alternative methods.
surreal 1.0.0-beta.5 for macos on aarch64
No response
Currently running commands like
cargo test --no-default-features --features kv-mem --package surrealdb
result in test failures like
...
running 5 tests
Error: InvalidScript { message: "Embedded functions are not enabled." }
test script_function_simple ... FAILED
test script_function_module_os ... FAILED
Error: InvalidScript { message: "Embedded functions are not enabled." }
test script_function_types ... FAILED
Error: InvalidScript { message: "Embedded functions are not enabled." }
test script_function_arguments ... FAILED
Error: InvalidScript { message: "Embedded functions are not enabled." }
test script_function_context ... FAILED
...
failures:
script_function_arguments
script_function_context
script_function_module_os
script_function_simple
script_function_types
test result: FAILED. 0 passed; 5 failed; 0 ignored; 0 measured; 0 filtered out; finished in 0.01s
This makes it harder to test only a subset of features during development, which lengthens the test-debug cycle. Also, not testing all the various combinations of features to make sure they compile and run properly may mean that some configurations do not work.
Incorporate a tool like cargo hack in the CI. It can test all the different combinations of features to make sure they compile and run successfully.
Its --feature-powerset flag performs the check for the feature powerset, which includes --no-default-features and the default features of the package. This is useful to check that every combination of features is working properly. It also has a GitHub Action.
We could add cargo hack check --feature-powerset and cargo hack test --feature-powerset to CONTRIBUTING.md and ask engineers to run them before pushing. The advantage of doing it this way is that the CI won't be impacted. However, there is no guarantee that engineers will comply.
surreal 1.0.0-beta.7 for linux on x86_64
No response
SSE may be preferable over WS for certain users, as it provides a few benefits over WS:
Reference:
https://www.the-guild.dev/blog/graphql-over-sse
https://wundergraph.com/blog/deprecate_graphql_subscriptions_over_websockets
Support for SSE connections in addition to WS connections when making LIVE / realtime queries
N/A
1.0.0-beta.6
No response
When no root password is set with the command-line arguments -p or --pass, then instead of setting a randomly-generated password for the root user, we should disable root authentication altogether.
Disable root authentication if the root password is not specified.
Not applicable.
surreal 1.0.0-beta.2 for macos on aarch64
No response
SurrealDB is built upon various ordered and, to the extent they are distributed, range-partitioned key-value stores such as TiKV. This has the potential to make range queries on keys (record ids) very performant. However, SurQL lacks a dedicated syntax or performance guarantees for such queries.
Consider the following timeseries records grouped by game (the first index of each record id is the game name, and the second index is a timestamp of days since product launch):
[
{
id: ["Chess", 1],
players: 50
},
{
id: ["Chess", 2],
players: 15
},
...296 records omitted...
{
id: ["Chess", 299],
players: 15
},
{
id: ["Chess", 300],
players: 15
},
{
id: ["Tetris", 1],
players: 10
},
{
id: ["Tetris", 2],
players: 12
},
...296 records omitted...
{
id: ["Tetris", 299],
players: 26
},
{
id: ["Tetris", 300],
players: 23
}
]
Assuming each node can handle 150 records, a likely partitioning into four nodes would result in the following ranges:
Node 1: ["Chess", 1] to ["Chess", 150]
Node 2: ["Chess", 151] to ["Chess", 300]
Node 3: ["Tetris", 1] to ["Tetris", 150]
Node 4: ["Tetris", 151] to ["Tetris", 300]
A common query pattern will be to chart the data for a particular game for the last 90 days. Using the game Tetris as an example, that means getting all records between ["Tetris", 210] (inclusive) and ["Tetris", 300] (inclusive). Luckily, these records all reside on Node 4, so the underlying KV store can retrieve them in a single access (side note: if we were querying many more records, we might hit multiple nodes, but the ordering would make their disk accesses much more efficient and the number of nodes hit would be relatively minimal).
Idea 1 (.. and ..= to signify Range<Key> and RangeInclusive<Key>, respectively):
SELECT id, players FROM metrics:["Tetris", 210]..["Tetris", 301]
SELECT id, players FROM metrics:["Tetris", 210]..=["Tetris", 300]
Or, if preferable, idea 1.5:
SELECT id, players FROM metrics:["Tetris", 210]..metrics:["Tetris", 301]
SELECT id, players FROM metrics:["Tetris", 210]..=metrics:["Tetris", 300]
Idea 2 (support normal SQL's BETWEEN ... AND ... syntax, and make sure it is optimized to use a range lookup from the underlying KV-store):
SELECT id, players FROM metrics WHERE id BETWEEN ["Tetris", 210] AND ["Tetris", 300]
Idea 3 (no new syntax, just make sure the following optimizes to use a range lookup from the underlying KV-store):
SELECT id, players FROM metrics WHERE ["Tetris", 210] <= id AND id <= ["Tetris", 300]
Non-solution: Changing the schema to use a random record id and to have an index on game name and timestamp would throw spatial locality and, by extension, query performance out the window. Executing SELECT id, players FROM metrics WHERE name = "Tetris" AND timestamp BETWEEN 210 AND 300, assuming the existence of an ordered index on (name, timestamp), would play nicely with that index but then do 90 random accesses to fetch the actual records.
See also: https://discord.com/channels/902568124350599239/902568124350599242/1012746600315105401
surreal 1.0.0-beta.6 for linux on x86_64
If you define a UNIQUE index on a field, that field will only accept a single NONE value.
> DEFINE TABLE foo SCHEMAFULL;
[{"time":"27.67µs","status":"OK","result":null}]
> DEFINE FIELD name ON foo TYPE string;
[{"time":"27.043µs","status":"OK","result":null}]
> DEFINE FIELD national_id ON foo TYPE string;
[{"time":"46.241µs","status":"OK","result":null}]
> DEFINE INDEX national_id_idx ON foo FIELDS national_id UNIQUE;
[{"time":"69.913µs","status":"OK","result":null}]
# The first person with no national_id will be accepted
> CREATE foo SET name = "John Doe";
[{"time":"266.761µs","status":"OK","result":[{"id":"foo:yki2758ba7dwzastfsk1","name":"John Doe","national_id":"NONE"}]}]
# Any subsequent records without a national_id will be rejected
> CREATE foo SET name = "Jane Doe";
[{"time":"168.721µs","status":"ERR","detail":"Database index `national_id_idx` already contains `foo:4vgq4l8u4y211c3ekgn9`"}]
I would expect NONE values to be ignored.
surreal 1.0.0-beta.7 for linux on x86_64
No response
Currently we can only perform Backup/Restore and Import/Export with the CLI, and there is no way to initiate these tasks with SurrealQL.
I suggest adding SurrealQL statements for performing backup/restore and import/export of data.
BACKUP INTO '{collectionURI}';
RESTORE DATABASE bank FROM LATEST IN '{collectionURI}';
RESTORE TABLE bank.customers FROM LATEST IN '{collectionURI}';
Reference: https://www.cockroachlabs.com/docs/stable/backup.html
These statements will enable us to initiate and schedule backup/restore from code.
CLI can be used to perform backup/restore.
v1.0.0-beta.4
No response
In strict mode, if we try to use a non-existent namespace then SurrealDB should return an error and not status=ok.
info for kv; use ns test;
Current result:
[
	{
		"time": "1.28µs",
		"status": "OK",
		"result": null
	}
]
Expected result:
[
	{
		"time": "1.28µs",
		"status": "ERR",
		"detail": "The namespace test does not exist"
	}
]
v1.0.0-beta.4
The unstable options in .rustfmt.toml are currently not being used, as the CI is testing on the stable channel. These options lead to warning messages saying unstable features are only available in the nightly channel. This can be confusing and may prompt developers to run cargo +nightly fmt, which currently results in almost the entire codebase being reformatted.
$ cargo fmt
...
Warning: can't set `reorder_impl_items = true`, unstable features are only available in nightly channel.
...
Running cargo fmt on a formatted codebase shouldn't print any warnings.
surreal 1.0.0-beta.7 for linux on x86_64
No response
When fetching a multi-yield path expression, if the last path part uses an AS field name, then the overall aliased field name is ignored and not output.
Run the following SQL on a blank database:
CREATE person:1, person:2, person:3 RETURN NONE;
RELATE person:1->like->person:2;
RELATE person:3->like->person:2;
SELECT ->?->(person AS a)<-like<-(person AS b)->(like AS c)->(person AS d) AS people FROM person:1;
The result of the final SELECT query is:
{
"a": [
"person:2"
],
"b": [
"person:3",
"person:1"
],
"c": [
"like:cgxc3q8m0470vkcunu5b",
"like:qw1zus1o0xg607vw339l"
],
"d": [
"person:2",
"person:2"
]
}
Run the following SQL on a blank database:
CREATE person:1, person:2, person:3 RETURN NONE;
RELATE person:1->like->person:2;
RELATE person:3->like->person:2;
SELECT ->?->(person AS a)<-like<-(person AS b)->(like AS c)->(person AS d) AS people FROM person:1;
The result of the final SELECT query should be:
{
"a": [
"person:2"
],
"b": [
"person:3",
"person:1"
],
"c": [
"like:cgxc3q8m0470vkcunu5b",
"like:qw1zus1o0xg607vw339l"
],
"d": [
"person:2",
"person:2"
],
"people": [
"person:2",
"person:2"
]
}
surreal 1.0.0-beta.5 for macos on aarch64
No response
When running in production, especially in a Kubernetes environment, it is fairly common practice to include a /health or /health_check endpoint that is publicly available. This way the system can monitor the health status of the running instances.
Add a REST endpoint, /health, that returns an HTTP status of 200 when all is OK and 500 otherwise.
/health is the de facto standard.
all
No response
The original points for this issue can be seen in #32.
If SurrealDB is started for the first time, or if no Namespace or Database exists yet, then it doesn't make sense to specify a Namespace or Database using the --ns or --db arguments when running the command-line REPL with surreal sql.
The --ns and --db arguments should be optional.
Currently the arguments are required.
surreal 1.0.0-beta.6 for macos on aarch64
No response
When a new record contains a duplicate on a field defined as unique, the error message returns the ID of the new record. While this works out fine for the id field, on a field with a UNIQUE index this new ID doesn't always match the ID in the database and may even be a newly generated random ID.
> DEFINE TABLE foo SCHEMAFULL;
[{"time":"84.158µs","status":"OK","result":null}]
> DEFINE FIELD email_address ON foo TYPE string ASSERT is::email($value);
[{"time":"152.971µs","status":"OK","result":null}]
> DEFINE INDEX email_address_idx ON foo FIELDS email_address UNIQUE;
[{"time":"105.3µs","status":"OK","result":null}]
> CREATE foo SET email_address = "[email protected]";
[{"time":"778.488µs","status":"OK","result":[{"email_address":"[email protected]","id":"foo:ni00u6941w0xij9b6aev"}]}]
> CREATE foo SET email_address = "[email protected]";
[{"time":"214.592µs","status":"ERR","detail":"Database index `email_address_idx` already contains `foo:1z1vsan748ov7jfyhing`"}]
# Notice that the ID already in the database is `foo:ni00u6941w0xij9b6aev` not `foo:1z1vsan748ov7jfyhing`
# which is being reported in the error message.
I think
Database index `email_address_idx` contains a duplicate of a value in `foo:ni00u6941w0xij9b6aev`
or something like that would be more accurate, and it would make debugging easier down the line if someone does not handle this error right away and only notices it in the logs. In a case like this, the new ID will not even be in the database by the time one discovers the error, which would make it impossible to track down.
It would be nice if it said
Database index `email_address_idx` already contains `[email protected]`
but that might lead to gigantic error messages if the field value is large.
surreal 1.0.0-beta.7 for linux on x86_64
No response
Fields defined with a type of datetime cannot be set to NONE. They always default to time::now() and revert to that value even when explicitly set to NONE.
> DEFINE TABLE foo;
[{"time":"49.646µs","status":"OK","result":null}]
> DEFINE FIELD deleted_at ON foo TYPE datetime;
[{"time":"173.153µs","status":"OK","result":null}]
> CREATE foo;
[{"time":"198.576µs","status":"OK","result":[{"deleted_at":"2022-08-29T16:25:02.478436711Z","id":"foo:hx4rbszo4xajq8ilip2f"}]}]
# Notice deleted_at is set to time::now() instead of NONE ^^^
> UPDATE foo:hx4rbszo4xajq8ilip2f SET deleted_at = NONE;
[{"time":"150.578µs","status":"OK","result":[{"deleted_at":"2022-08-29T16:25:32.007420801Z","id":"foo:hx4rbszo4xajq8ilip2f"}]}]
# Again here, instead of setting the value to NONE, it regenerates time::now() and uses that new value instead
# Even if you explicitly set the default value to NONE
> DEFINE FIELD deleted_at ON foo TYPE datetime VALUE $value OR NONE;
[{"time":"26.726µs","status":"OK","result":null}]
# It will still override it with time::now()
> CREATE foo;
[{"time":"128.641µs","status":"OK","result":[{"deleted_at":"2022-08-29T16:44:10.777821074Z","id":"foo:qged50ou8ygw2bv49me1"}]}]
The expected value is NONE when there is no explicit default value defined and there is no ASSERT $value != NONE, instead of defaulting to time::now().
surreal 1.0.0-beta.7 for linux on x86_64
No response
When fetching nested fields, array values, and remote records, we are currently cloning all values which are fetched. This has an impact on performance.
Instead of using clone() by default when we are fetching or comparing fields, we should return Cow<'a, Value> values, resulting in a value only being cloned when it needs to be updated / written to.
We would also need to ensure that all Value types are based on Cow<'a, Value> values, as opposed to owned Value values.
So the following:
pub async fn get(&self, ctx: &Runtime, opt: &Options, txn: &Transaction, path: &[Part]) -> Result<Self, Error>;
would become:
pub async fn get(&'a self, ctx: &Runtime, opt: &Options, txn: &Transaction, path: &[Part]) -> Result<Cow<'a, Self>, Error>;
In addition, compute() functions would need to return Cow<'a, Value> values.
pub(crate) async fn compute(&'a self, ctx: &Context<'_>, opt: &Options, txn: &Transaction, doc: Option<&Value>) -> Result<Cow<'a, Value>, Error>
Currently the functionality works without modification, but query performance will be significantly improved when this change is made.
surreal 1.0.0-beta.5 for macos on aarch64
No response
Currently these only support floats. We may want to support more formats like:
40° 26′ 46″ N 79° 58′ 56″ W
N 40° 26′ 46″ W 79° 58′ 56″
40° 26.767' N 79° 58.933' W
40° 26′ 46″ 79° 58′ 56″, 40° 26′ 46″, 79° 58′ 56″, ...
N 40° 26.767' W 79° 58.933'
40° 26.767' 79° 58.933', 40° 26.767', 79° 58.933', ...
N 40.446° W 79.982°
40.446° N 79.982° W
40.446° 79.982°, 40.446,79.982, etc.
From Tobie's comment:
We would need to write it in the parser (as opposed to deferring to that crate), though, as that will be more performant. Similar to how we are parsing datetimes/durations...
Using the linked latlon crate. See Tobie's comment above on this.
surreal 1.0.0-beta.7 for linux on x86_64
No response
An error occurs when attempting to modify a field after creating an index on that field.
DEFINE TABLE PERSON;
CREATE person:bob SET age = 23;
UPDATE person set age = 10;
DEFINE INDEX age_idx ON person COLUMNS age;
UPDATE person SET age = 30;
Upon executing the last statement, I get this:
[
{
"detail": "There was a problem with a datastore transaction: Value being checked was not correct",
"status": "ERR",
"time": "190.625µs"
}
]
The age field is successfully assigned the new value (30)
surreal 1.0.0-beta.6 for macos on aarch64
Currently these functions allow some invalid inputs and disallow some valid ones. We are currently using regular expressions to implement these functions. While this is OK for some applications, for a database management system it's very important that we be as correct as possible. Especially considering that these functions are used to constrain data that will be stored in the database.
# Accepts invalid input
> SELECT * FROM is::domain("example-.com");
[{"time":"370.978µs","status":"OK","result":[true]}]
# Domain labels cannot end with a "-"
# Rejects valid input
> SELECT * FROM is::domain("食狮.**");
[{"time":"263.639µs","status":"OK","result":[false]}]
# That is a valid internationalised domain name
# Accepts invalid input
> SELECT * FROM is::email("[email protected]");
[{"time":"4.776857ms","status":"OK","result":[true]}]
# Email addresses cannot contain empty labels in the local-part
# Rejects valid input
> SELECT * FROM is::email("user@[fd79:cdcb:38cc:9dd:f686:e06d:32f3:c123]");
[{"time":"315.276µs","status":"OK","result":[false]}]
# IP addresses are valid email hosts
Of all the inputs here, email addresses are probably the hardest to get right.
# Returns 400 when input is not valid UTF-8
> SELECT * FROM is::uuid("67e55044-10b1-426f-9247-bb680e5\0e0c8");
{"code":400,"details":"Request problems detected","description":"There is a problem with your request. Refer to the documentation for further information.","information":"There was a problem with the database: Parse error on line 1 at character 16 when parsing '::uuid(\"67e55044-10b1-426f-9247-bb680e5\\0e0c8\");'"}
# This is not unique to UUIDs. It happens to all the functions I tested.
# Rejects valid input
> SELECT * FROM is::uuid("67e55044-10b1-426f-9247-bb680e5fe0c8");
[{"time":"351.335µs","status":"OK","result":[false]}]
# This is a valid UUIDv4
All parser and validation functions should parse correctly and return correct results.
We could try and fix our regular expressions but it will be very hard to get them right and it will be a pain to maintain them. Instead, I would like to propose that we delegate this functionality to external crates.
- The uuid crate. We can use that to validate UUIDs.
- The semver crate.
- The addr crate. I maintain the addr crate but that is not the reason I'm nominating it. It is small, fast and well-tested. It even supports no_std. It avoids heap allocations even when the std feature is enabled.
All these crates are lightweight, popular, fast and well tested. After these changes the only new crates that will be added to Cargo.lock are addr, which is 92.6 kB on crates.io, and psl-types, which is just 7.96 kB. On the positive side, having the tests for these functions stay in the external crates will result in our tests running faster, as opposed to importing them.
@tobiemh Let me know if you would like to go ahead with this. I will be happy to put together a PR right away.
surreal 1.0.0-beta.7 for linux on x86_64
No response
Safe code (no unsafe) making use of surrealdb::Datastore::new("rocksdb:...") and surrealdb::Datastore::transaction can trigger a segmentation fault, and the only warning was an internal comment in the relevant source code.
Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)
This is probably not an issue for the vast majority of users, as Datastore
s generally last for the lifetime of a process. As a result, I don't recommend sacrificing time/effort/features to fix it in the near future. I might think of a solution that sacrifices a negligible amount of performance or introduces a negligible memory leak, and submit it in a PR.
Possible mitigations include marking the relevant API unsafe, or using Box::leak to get a real, safe 'static reference, at the expense of a potential memory leak, optionally with a way (unsafe or otherwise) to de-allocate.
If you want me to implement the Box::leak idea (the straight-forward one), as opposed to waiting and thinking of a better solution, I can submit a PR.
[package]
name = "surrealdb_rocksdb_unsound"
version = "0.1.0"
edition = "2021"

[dependencies]
tokio = { version = "1.20", features = ["full"] }
surrealdb = "1.0.0-beta.7"

use surrealdb::Transaction;

#[tokio::main]
async fn main() {
    let mut transaction = get_transaction().await;
    println!("{:?}", transaction.put("uh", "oh").await.unwrap());
}

async fn get_transaction() -> Transaction {
    // The Datastore is dropped when this function returns, yet the
    // Transaction it produced still references the underlying storage.
    let datastore = surrealdb::Datastore::new("rocksdb:/tmp/rocks.db").await.unwrap();
    datastore.transaction(true, false).await.unwrap()
}
Either a proper fix, or marking the relevant functions as unsafe. At minimum, the unsoundness should be documented. Right now, the main clue is in an internal comment:
surrealdb/lib/src/kvs/rocksdb/mod.rs
Lines 38 to 49 in 738ba5d
surrealdb = "1.0.0-beta.7"
Selecting a record that contains a MultiPolygon crashes the database.
Run the following documented query:
UPDATE university:oxford SET locations = {
type: "MultiPolygon",
coordinates: [
[
[ [10.0, 11.2], [10.5, 11.9], [10.8, 12.0], [10.0, 11.2] ]
],
[
[ [9.0, 11.2], [10.5, 11.9], [10.3, 13.0], [9.0, 11.2] ]
]
]
};
Select the record or export the database:
SELECT * FROM university:oxford;
The database crashes with the following error message:
thread 'tokio-runtime-worker' panicked at 'called `Result::unwrap()` on an `Err` value: Syntax("unknown variant `Polygons`, expected one of `Point`, `Line`, `Polygon`, `MultiPoint`, `MultiLine`, `MultiPolygon`, `Collection`")', lib/src/sql/value/value.rs:97:60
Selects should return the record successfully and exporting should work as expected. I have already identified the source of the bug and will prepare and submit a pull request shortly.
surreal 1.0.0-beta.7 for linux on x86_64
No response
OS: Windows 10 x64
PS C:\WINDOWS\system32> iwr https://windows.surrealdb.com -useb | iex
.d8888b. 888 8888888b. 888888b.
d88P Y88b 888 888 'Y88b 888 '88b
Y88b. 888 888 888 888 .88P
'Y888b. 888 888 888d888 888d888 .d88b. 8888b. 888 888 888 8888888K.
'Y88b. 888 888 888P' 888P' d8P Y8b '88b 888 888 888 888 'Y88b
'888 888 888 888 888 88888888 .d888888 888 888 888 888 888
Y88b d88P Y88b 888 888 888 Y8b. 888 888 888 888 .d88P 888 d88P
'Y8888P' 'Y88888 888 888 'Y8888 'Y888888 888 8888888P' 8888888P'
Fetching the latest database version...
Fetching the host system architecture...
Installing surreal-v1.0.0-beta.5 for windows-amd64...
Invoke-WebRequest : The remote server returned an error: (403) Forbidden.
At line:54 char:5
+ Invoke-WebRequest $DownloadUrl -OutFile $Executable -UseBasicPars ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (System.Net.HttpWebRequest:HttpWebRequest) [Invoke-WebRequest], WebExc
eption
+ FullyQualifiedErrorId : WebCmdletWebResponseException,Microsoft.PowerShell.Commands.InvokeWebRequestCommand
PS C:\WINDOWS\system32>
Run the installation script
iwr https://windows.surrealdb.com -useb | iex
It's supposed to install SurrealDB.
1.0.0-beta.5
The original points for this issue can be seen in #32.
Once the --ns and --db arguments are made optional in the SurrealDB command-line REPL (#34), then if the Namespace or Database has not yet been specified, each query must include a USE NS ... DB ... statement before any other statement can be run.
The SurrealDB command-line should remember the last specified Namespace and Database which have been set with USE NS and USE DB.
No alternative methods.
surreal 1.0.0-beta.6 for macos on aarch64
No response
The original points for this issue can be seen in #25.
With the addition of a key-value store implementation for FoundationDB, SurrealDB will be able to run on a thoroughly tested and scalable key-value store, as an alternative to TiKV.
Currently the only other method for using distributed key-value backed storage is TiKV.
surreal 1.0.0-beta.6 for macos on aarch64
No response
When performing a graph traversal query with multi-yield path alias expressions, each additional output field is embedded within an array, instead of being output as a flat result.
Run the following SQL on a blank database:
CREATE person:1, person:2, person:3 RETURN NONE;
RELATE person:1->like->person:2;
RELATE person:3->like->person:2;
SELECT ->?->(person AS a)<-like<-(person AS b)->(like AS c)->person AS people FROM person:1;
The result of the final SELECT query is:
{
"a": [
"person:2"
],
"b": [
[
"person:3",
"person:1"
]
],
"c": [
[
[
"like:1fm28zli7kd49uzzl42q"
],
[
"like:ybvh0zbpfzbbsi1mcbai"
]
]
],
"people": [
[
[
[
"person:2"
]
],
[
[
"person:2"
]
]
]
]
}
Run the following SQL on a blank database:
CREATE person:1, person:2, person:3 RETURN NONE;
RELATE person:1->like->person:2;
RELATE person:3->like->person:2;
SELECT ->?->(person AS a)<-like<-(person AS b)->(like AS c)->person AS people FROM person:1;
The result of the final SELECT query should be:
{
"time": "4.921291ms",
"status": "OK",
"result": [
{
"a": [
"person:2"
],
"b": [
"person:1",
"person:3"
],
"c": [
"like:fy41kwb25ik92zxvej9g",
"like:vpf4yciv2067xtkidpq4"
],
"people": [
"person:2",
"person:2"
]
}
]
}
surreal 1.0.0-beta.5 for macos on aarch64
No response
When a request is made which causes an error response (400 or 500 status codes), the server does not set any CORS headers on the HTTP response. This means that client browsers cannot read the error response returned.
Make a POST request to the /sql HTTP endpoint, with a query that has a syntax error. The server returns a 400 error because the request failed, however the CORS headers are not present in the response.
The server should respond with the correct CORS headers regardless of whether the request succeeded or failed.
surreal 1.0.0-beta.2 for macos on aarch64
No response
type::table doesn't extract the table from the output of type::thing.
Current result:
SELECT * FROM <string> type::table(type::thing(1, 2));
[{"time":"24.292µs","status":"OK","result":["`1:2`"]}]
Expected result:
SELECT * FROM <string> type::table(type::thing(1, 2));
[{"time":"24.292µs","status":"OK","result":["1"]}]
surreal 1.0.0-beta.6 for linux on x86_64
Building this repo from source is a bit of a challenge right now, even for Rust developers. This is especially true if one wants to use TiKV as their KV store.
Nix flakes are the easiest way of distributing software from source that I know of. If someone has a flake-enabled Nix package manager (flakes are currently still an experimental feature), all they will need to do to install surreal on their system is nix profile install github:surrealdb/surrealdb. This will download, compile and install the latest commit along with all its dependencies. One can target specific versions, branches or even commits. They don't even need to install it in their environment: nix run github:surrealdb/surrealdb is equivalent to cargo run, except it will download not only this repo but also all its dependencies.
To turn the repo into a Nix flake, all we need to do is add two files to the root, flake.nix and flake.lock.
Keep installing dependencies manually.
surreal 1.0.0-beta.7 for linux on x86_64
No response
A few of the so-called synchronous functions are very computationally intensive. Some of them, namely the argon2, pbkdf2, and scrypt hash functions, are computationally intensive by design (to prevent brute force).
Futures are not supposed to synchronously perform too much computation, as they run on threads of the async executor. See the docs on this subject:
An implementation of poll should strive to return quickly, and should not block. Returning quickly prevents unnecessarily clogging up threads or event loops. If it is known ahead of time that a call to poll may end up taking awhile, the work should be offloaded to a thread pool (or something similar) to ensure that poll can return quickly.
(https://docs.rs/futures/latest/futures/future/trait.Future.html)
And yet, the expensive functions are called synchronously:
Lines 26 to 38 in 3d83f08
Performance would severely degrade when one or more of the expensive functions are running.
// Attempts to run any function
pub async fn run(ctx: &Context<'_>, name: &str, args: Vec<Value>) -> Result<Value, Error> {
    match name {
        v if v.starts_with("http") => {
            // HTTP functions are asynchronous
            asynchronous(ctx, name, args).await
        }
        v if v.starts_with("crypto") && (v.ends_with("argon2") || ...others...) => {
            // Computationally expensive functions are dispatched to a thread pool
            tokio::task::spawn_blocking(|| synchronous(ctx, name, args)).await
        }
        _ => {
            // Other functions are synchronous
            synchronous(ctx, name, args)
        }
    }
}
The reason this isn't a PR is that the tokio dependency doesn't seem to be available. It could maybe be replicated with std::thread::spawn and an spsc channel; let me know what you think.
surreal 1.0.0-beta.7 for linux on x86_64
Currently, when processing linked records, we process all records concurrently using try_join_all, with no limit on the number of futures processed at one time.
let futs = v.iter().map(|v| v.get(ctx, opt, txn, path));
try_join_all(futs).await.map(Into::into)
The files which are affected are:
surrealdb/lib/src/sql/value/get.rs
Lines 46 to 47 in 863830c
surrealdb/lib/src/sql/value/get.rs
Lines 76 to 77 in 863830c
surrealdb/lib/src/sql/value/set.rs
Lines 60 to 61 in 863830c
surrealdb/lib/src/sql/value/set.rs
Lines 86 to 87 in 863830c
surrealdb/lib/src/sql/value/del.rs
Lines 49 to 50 in 863830c
surrealdb/lib/src/sql/value/del.rs
Lines 114 to 115 in 863830c
We should look into replacing try_join_all with a buffered stream, which caps the number of futures polled concurrently. Something like the following...
futures::stream::iter(futs).buffer_unordered(10).try_collect::<Vec<_>>().await
join_all and try_join_all now make use of FuturesOrdered for performance reasons if the number of futures is large. Perhaps we therefore don't need to do anything here...
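For illustration, the effect of buffer_unordered(10) — at most a fixed number of units of work in flight at once — can be mimicked with standard-library threads. This is only an analogy sketch under that assumption, not the actual futures-based fix:

```rust
use std::sync::{mpsc, Arc, Mutex};
use std::thread;

// Process `inputs` with at most `limit` workers running at once,
// analogous to buffer_unordered(limit) capping concurrent futures.
fn run_bounded(inputs: Vec<u64>, limit: usize) -> Vec<u64> {
    let (job_tx, job_rx) = mpsc::channel::<u64>();
    let job_rx = Arc::new(Mutex::new(job_rx));
    let (out_tx, out_rx) = mpsc::channel::<u64>();

    let total = inputs.len();
    for v in inputs {
        job_tx.send(v).unwrap();
    }
    drop(job_tx); // close the queue so workers stop once it drains

    let mut workers = Vec::new();
    for _ in 0..limit {
        let rx = Arc::clone(&job_rx);
        let tx = out_tx.clone();
        workers.push(thread::spawn(move || loop {
            // Take the next job; the lock guard is dropped immediately.
            let job = { rx.lock().unwrap().recv() };
            match job {
                Ok(v) => tx.send(v * 2).unwrap(), // stand-in for fetching a record
                Err(_) => break,
            }
        }));
    }
    drop(out_tx);

    let mut results: Vec<u64> = out_rx.iter().collect();
    for w in workers {
        w.join().unwrap();
    }
    results.sort(); // completion order is unordered, like buffer_unordered
    assert_eq!(results.len(), total);
    results
}

fn main() {
    let out = run_bounded((1..=10u64).collect(), 3);
    assert_eq!(out, (1..=10u64).map(|v| v * 2).collect::<Vec<u64>>());
    println!("{out:?}");
}
```

The point is the same in both worlds: total work is unchanged, but peak resource usage is bounded by the limit rather than by the number of linked records.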
surreal 1.0.0-beta.5 for macos on aarch64
No response
N/A
If the database provided Twitter snowflake IDs, applications would not need to implement them themselves, which would be very friendly to application developers. A snowflake ID could be declared easily in SQL, such as below:
DEFINE FIELD id ON TABLE user TYPE snowflake(begin_time, node_id, sequence);
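For reference, a snowflake ID packs a timestamp, node ID, and sequence number into 64 bits. A minimal sketch of the bit layout (field widths follow Twitter's original scheme; the function name and the caller-supplied timestamp are illustrative only):

```rust
// Snowflake-style layout: 41 bits of milliseconds since a custom epoch,
// 10 bits of node ID, 12 bits of per-millisecond sequence number.
fn snowflake(ms_since_epoch: u64, node_id: u64, sequence: u64) -> u64 {
    assert!(node_id < (1 << 10));
    assert!(sequence < (1 << 12));
    (ms_since_epoch << 22) | (node_id << 12) | sequence
}

fn main() {
    let id = snowflake(1_000, 7, 42);
    // Decode the fields back out to show the layout.
    assert_eq!(id >> 22, 1_000);
    assert_eq!((id >> 12) & 0x3FF, 7);
    assert_eq!(id & 0xFFF, 42);
    println!("{id}");
}
```

Because the timestamp occupies the high bits, IDs generated this way sort roughly by creation time, which is the main appeal over random IDs.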
1.0.0-beta.6
This is a security feature: it prevents an attacker from directly reading the contents of a database without decrypting it first.
Please see https://discord.com/channels/902568124350599239/970336107206176768/1014654616442511523
Apparently this was supported in the now defunct GoLang version of SurrealDB.
TiKV, which SurrealDB supports as a KV backend, supports encryption at rest (see https://docs.pingcap.com/tidb/stable/encryption-at-rest#tikv-encryption-at-rest)
See: https://discord.com/channels/902568124350599239/970336107206176768/1014635769865977937
v1.0.0-beta.7
No response
When I try to use the example for the granular permissions I get a parse error. Maybe I am doing something wrong, but I just copied that SurrealQL code.
I used the docker version of SurrealDB.
Execute this SurrealQL query from the README.
DEFINE TABLE post SCHEMALESS
PERMISSIONS
FOR select
-- Published posts can be selected
WHERE published = true
-- A user can select all their own posts
OR user = $auth.id
FOR create
-- A user can create or update their own posts
WHERE user = $auth.id
FOR delete
-- A user can delete their own posts
WHERE user = $auth.id
-- Or an admin can delete any posts
OR $auth.admin = true
;
The Server responds with
{
"code": 400,
"details": "Request problems detected",
"description": "There is a problem with your request. Refer to the documentation for further information.",
"information": "There was a problem with the database: Parse error on line 8 at character 2 when parsing 'FOR create\n\t\t\t-- A user can create or update their own posts\n\t\t\tWHERE user = $auth.id\n\t\tFOR delete\n\t'"
}
I would expect that the example from the README would work, but maybe I am missing something. The DEFINE statement has almost no documentation yet. Maybe there's just an error in the README code.
1.0.0-beta.7
No response
Duplicated SCHEMAFULL FIELDS in export file
Create a simple SCHEMAFULL table with SDBQL
# tag:define
DEFINE TABLE tag SCHEMAFULL;
DEFINE FIELD name ON tag TYPE string;
DEFINE FIELD meta_data ON tag TYPE object;
DEFINE FIELD created_at ON tag TYPE datetime VALUE time::now();
DEFINE INDEX idx_name ON tag COLUMNS name UNIQUE;
# tag:create
CREATE tag:red CONTENT { name: "Red" };
CREATE tag:green CONTENT { name: "Green" };
CREATE tag:blue CONTENT { name: "Blue" };
CREATE tag:white CONTENT { name: "White" };
CREATE tag:black CONTENT { name: "Black" };
# tag:info
INFO FOR TABLE tag;
Export it, and note that the FIELDs are duplicated:
-- ------------------------------
-- TABLE: tag
-- ------------------------------
DEFINE TABLE tag SCHEMAFULL;
DEFINE FIELD created_at ON tag TYPE datetime VALUE time::now();
DEFINE FIELD meta_data ON tag TYPE object;
DEFINE FIELD name ON tag TYPE string;
DEFINE FIELD created_at ON tag TYPE datetime VALUE time::now();
DEFINE FIELD meta_data ON tag TYPE object;
DEFINE FIELD name ON tag TYPE string;
Expected the exported file to contain no duplicate FIELDs:
-- ------------------------------
-- TABLE: tag
-- ------------------------------
DEFINE TABLE tag SCHEMAFULL;
DEFINE FIELD created_at ON tag TYPE datetime VALUE time::now();
DEFINE FIELD meta_data ON tag TYPE object;
DEFINE FIELD name ON tag TYPE string;
surreal 1.0.0-beta.5 for linux on x86_64
When using the TiKV store, the database sometimes runs into ResolveLockError, and when that happens, it looks like it gets stuck in that state. Even restarting TiKV doesn't seem to make it go away.
The easiest way to trigger this error is by embedding the database in Rust. I was able to trigger it via the /sql endpoint too, but in that case the database seems to recover somehow. I'm able to consistently reproduce it by running the following Rust code repeatedly:
use surrealdb::{Datastore, Session};
#[tokio::main]
async fn main() -> Result<(), surrealdb::Error> {
let sql = "
BEGIN TRANSACTION;
DEFINE NAMESPACE foo;
USE NS foo;
DEFINE DATABASE bar;
COMMIT TRANSACTION;
";
let dbs = Datastore::new("tikv://127.0.0.1:2379").await?;
let ses = Session::for_kv();
let results = dbs.execute(sql, &ses, None, true).await?;
for result in results {
match result.output() {
Ok(record) => println!("{record}"),
Err(error) => eprintln!("{error}"),
}
}
Ok(())
}
Put the above code in main.rs.
Start TiKV...
$ tiup playground --mode tikv-slim
...
Playground Bootstrapping...
Start pd instance:v6.2.0
Start tikv instance:v6.2.0
...
Run the Rust code repeatedly...
$ for i in `seq 1 3`; do cargo run; done
Aug 30 15:02:55.547 INFO connect to tikv endpoint: "127.0.0.1:20160"
NONE
NONE
NONE
Aug 30 15:02:56.638 INFO connect to tikv endpoint: "127.0.0.1:20160"
The query was not executed due to a failed transaction
The query was not executed due to a failed transaction
There was a problem with a datastore transaction: PessimisticLock error: ResolveLockError
Aug 30 15:02:57.799 INFO connect to tikv endpoint: "127.0.0.1:20160"
The query was not executed due to a failed transaction
The query was not executed due to a failed transaction
There was a problem with a datastore transaction: PessimisticLock error: ResolveLockError
Once it runs into that error, even spinning up SurrealDB with
surreal start --log trace --user root --pass root tikv://127.0.0.1:2379
and running queries via the REPL doesn't work:
$ surreal sql --conn http://localhost:8000 --user root --pass root --ns foo --db bar
> DEFINE TABLE foo_bar SCHEMAFULL;
[{"time":"15.826006ms","status":"ERR","detail":"There was a problem with a datastore transaction: Failed to resolve lock"}]
If I don't spawn a new process each time, but instead move the loop into main.rs, it takes a bit longer to run into this.
Expected behaviour: it keeps returning NONE with no errors. It should never return ResolveLockError, no matter how many times you run it, and TiKV must remain in a good state.
surreal 1.0.0-beta.7 for linux on x86_64
No response
Currently, within a transaction, TABLE definitions, EVENT definitions, FIELD definitions, INDEX definitions, and foreign TABLE AS definitions are fetched for every record when reading or writing records.
Ideally we should fetch the definitions only once, and then use the cached values for all subsequent record processing within the transaction.
Use an in-transaction cache to store and cache configuration table records once they have been retrieved for the first time within a transaction.
This can be used when retrieving:
DEFINE TABLE statements for the document table
DEFINE EVENT statements for the document events
DEFINE FIELD statements for the document fields
DEFINE INDEX statements for the document indexes
DEFINE TABLE AS statements for the document foreign tables
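A minimal sketch of such an in-transaction cache, assuming a simple (kind, table) key and string-valued definitions (all names here are illustrative, not SurrealDB's actual types):

```rust
use std::collections::HashMap;

// In-transaction definition cache: keys combine the definition kind
// (table, event, field, index, foreign table) with the table name.
#[derive(Default)]
struct TxCache {
    entries: HashMap<(&'static str, String), String>,
    misses: usize,
}

impl TxCache {
    // Return the cached definition, fetching (and counting a miss)
    // only once per (kind, table) pair within the transaction.
    fn get_or_fetch(
        &mut self,
        kind: &'static str,
        table: &str,
        fetch: impl FnOnce() -> String,
    ) -> String {
        let key = (kind, table.to_string());
        if let Some(v) = self.entries.get(&key) {
            return v.clone();
        }
        self.misses += 1;
        let v = fetch();
        self.entries.insert(key, v.clone());
        v
    }
}

fn main() {
    let mut cache = TxCache::default();
    // Processing many records hits the datastore only once per definition.
    for _record in 0..100 {
        cache.get_or_fetch("field", "person", || {
            "DEFINE FIELD name ON person TYPE string".to_string()
        });
    }
    assert_eq!(cache.misses, 1);
    println!("misses: {}", cache.misses);
}
```

Scoping the cache to the transaction keeps it consistent by construction: definitions cannot change underneath it while the transaction is open.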
No alternative methods.
surreal 1.0.0-beta.5 for macos on aarch64
No response
When starting Docker, the container fails to run and instead the following error is returned:
docker: Error response from daemon: failed to create shim task: OCI runtime create failed: runc create failed: unable to start container process: exec: "/surreal": permission denied: unknown.
Run the following command:
docker run --rm -p 8000:8000 surrealdb/surrealdb:latest start
Should run without issue, and start the SurrealDB server
surreal 1.0.0-beta.2 for macos on aarch64
No response
The value of LET statements is instantly lost once they have been executed. Running
-- Define the parameter
LET $name = "tobie";
And then
-- Use the parameter
CREATE person SET name = $name;
results in the name field not being set:
> LET $name = "tobie";
[{"time":"17.363µs","status":"OK","result":null}]
> CREATE person:3 SET name = $name;
[{"time":"116.22µs","status":"OK","result":[{"id":"person:3"}]}]
Interestingly, however, pasting both statements together into the REPL works as expected.
Run
> LET $name = "tobie";
[{"time":"17.363µs","status":"OK","result":null}]
> CREATE person:3 SET name = $name;
[{"time":"116.22µs","status":"OK","result":[{"id":"person:3"}]}]
The value is stored in the variable.
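The observed behaviour is consistent with parameters living in a per-request session context: when both statements arrive in one request they share a context, but separate REPL lines are separate requests, each starting fresh. A sketch of that assumption (all names hypothetical):

```rust
use std::collections::HashMap;

// Hypothetical per-request context holding SurrealQL parameters.
struct Session {
    params: HashMap<String, String>,
}

impl Session {
    fn new() -> Self {
        Session { params: HashMap::new() }
    }
    // LET $name = "..." stores the value in this context only.
    fn let_param(&mut self, name: &str, value: &str) {
        self.params.insert(name.to_string(), value.to_string());
    }
    // $name resolves against this context only.
    fn resolve(&self, name: &str) -> Option<&String> {
        self.params.get(name)
    }
}

fn main() {
    // One request, two statements: the parameter survives.
    let mut ses = Session::new();
    ses.let_param("name", "tobie");
    assert_eq!(ses.resolve("name").map(String::as_str), Some("tobie"));

    // Two separate requests: the second gets a fresh context, so $name is gone.
    let ses2 = Session::new();
    assert_eq!(ses2.resolve("name"), None);
}
```

If this is the cause, the fix would be to persist LET bindings in the connection-level session rather than in the per-request context.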
surreal 1.0.0-beta.6 for linux on x86_64
I ran SurrealDB with a root username and password. The server runs well, but when I hit the REST API without including the username and password, I can still execute queries.
Output from surreal :
[2022-09-01 09:28:19] INFO surrealdb::iam Root authentication is enabled
[2022-09-01 09:28:19] INFO surrealdb::iam Root username is 'userdb'
[2022-09-01 09:28:19] INFO surrealdb::dbs Database strict mode is enabled
[2022-09-01 09:28:19] INFO surrealdb::kvs Connecting to kvs store at tikv://192.23.192.212:2079
[2022-09-01 09:28:19] INFO surrealdb::kvs Connected to kvs store at tikv://192.23.192.212:2079
[2022-09-01 09:28:19] INFO surrealdb::net Starting web server on 0.0.0.0:8000
[2022-09-01 09:28:19] INFO surrealdb::net Started web server on 0.0.0.0:8000
Start surreal db with :
surreal start --strict --log trace --user userdb --pass userdb123 tikv://192.23.192.212:2079
Query the HTTP REST API without a username and password:
curl -X POST \
-H "NS: myapplication" \
-H "DB: myapplication" \
-H "Content-Type: application/json" \
-d "SELECT * FROM time::day('2021-11-01T08:30:17+00:00');" \
http://192.23.192.210:8000/sql
It works, with the result:
[{"time":"957.355µs","status":"OK","result":[1]}]
If I query an actual table, the authentication works. For example:
curl -X POST \
-H "NS: myapplication" \
-H "DB: myapplication" \
-H "Content-Type: application/json" \
-d "SELECT * FROM person WHERE age > 18" \
http://192.23.192.210:8000/sql
The output is:
{"code":403,"details":"Authentication failed","description":"Your authentication details are invalid. Reauthenticate using valid authentication parameters.","information":"There was a problem with authentication"}%
Request Not Authenticated
1.0.0-beta.7
While trying to build with cargo build, the build fails:
chris@cb ~/s/surrealdb (main)> cargo build
Compiling proc-macro2 v1.0.43
Compiling unicode-ident v1.0.3
Compiling version_check v0.9.4
Compiling quote v1.0.21
Compiling syn v1.0.99
Compiling cfg-if v1.0.0
Compiling autocfg v1.1.0
Compiling libc v0.2.132
Compiling memchr v2.5.0
Compiling cc v1.0.73
Compiling once_cell v1.13.1
Compiling typenum v1.15.0
Compiling log v0.4.17
Compiling futures-core v0.3.23
Compiling pin-project-lite v0.2.9
Compiling pkg-config v0.3.25
Compiling bytes v1.2.1
Compiling futures-io v0.3.23
Compiling fastrand v1.8.0
Compiling bitflags v1.3.2
Compiling futures-sink v0.3.23
Compiling serde_derive v1.0.143
Compiling either v1.8.0
Compiling serde v1.0.143
Compiling anyhow v1.0.62
Compiling hashbrown v0.12.3
Compiling subtle v2.4.1
Compiling lazy_static v1.4.0
Compiling scopeguard v1.1.0
Compiling itoa v1.0.3
Compiling fnv v1.0.7
Compiling pin-utils v0.1.0
Compiling futures-channel v0.3.23
Compiling futures-task v0.3.23
Compiling futures-util v0.3.23
Compiling regex-syntax v0.6.27
Compiling futures v0.1.31
Compiling libm v0.2.5
Compiling base64 v0.13.0
Compiling ppv-lite86 v0.2.16
Compiling remove_dir_all v0.5.3
Compiling percent-encoding v2.1.0
Compiling smallvec v1.9.0
Compiling matches v0.1.9
Compiling cpufeatures v0.2.2
Compiling byteorder v1.4.3
Compiling opaque-debug v0.3.0
Compiling glob v0.3.0
Compiling tinyvec_macros v0.1.0
Compiling unicode-segmentation v1.9.0
Compiling unicode-bidi v0.3.8
Compiling parking v2.0.0
Compiling waker-fn v1.1.0
Compiling cache-padded v1.2.0
Compiling httparse v1.7.1
Compiling multimap v0.8.3
Compiling bindgen v0.57.0
Compiling fixedbitset v0.4.2
Compiling openssl-probe v0.1.5
Compiling fixedbitset v0.2.0
Compiling same-file v1.0.6
Compiling parking_lot_core v0.8.5
Compiling event-listener v2.5.3
Compiling ryu v1.0.11
Compiling shlex v0.1.1
Compiling rustc-hash v1.1.0
Compiling lazycell v1.3.0
Compiling peeking_take_while v0.1.2
Compiling getrandom v0.1.16
Compiling proc-macro-hack v0.5.19
Compiling async-task v4.3.0
Compiling semver v1.0.13
Compiling wasm-bindgen-shared v0.2.82
Compiling openssl v0.10.41
Compiling crc32fast v1.3.2
Compiling protobuf v2.27.1
Compiling foreign-types-shared v0.1.1
Compiling mime v0.3.16
Compiling try-lock v0.2.3
Compiling async-trait v0.1.57
Compiling cpuid-bool v0.2.0
Compiling httpdate v1.0.2
Compiling const_fn v0.4.9
Compiling serde_json v1.0.83
Compiling adler v1.0.2
Compiling native-tls v0.2.10
Compiling bumpalo v3.11.0
Compiling crossbeam-utils v0.8.11
Compiling tower-service v0.3.2
Compiling spin v0.5.2
Compiling strsim v0.10.0
Compiling encoding_rs v0.8.31
Compiling untrusted v0.7.1
Compiling atomic-waker v1.0.0
Compiling wasm-bindgen v0.2.82
Compiling curl v0.4.44
Compiling ident_case v1.0.1
Compiling stable_deref_trait v1.2.0
Compiling io-lifetimes v0.7.3
Compiling isahc v0.9.14
Compiling base64ct v1.5.1
Compiling relative-path v1.7.2
Compiling alloc-no-stdlib v2.0.3
Compiling http-types v2.12.0
Compiling bytes v0.5.6
Compiling hex v0.4.3
Compiling prometheus v0.12.0
Compiling ipnet v2.5.0
Compiling infer v0.2.3
Compiling rustix v0.35.9
Compiling paste v1.0.8
Compiling linux-raw-sys v0.0.46
Compiling utf-8 v0.7.6
Compiling safemem v0.3.3
Compiling time-macros v0.2.4
Compiling num_threads v0.1.6
Compiling endian-type v0.1.2
Compiling minimal-lexical v0.2.1
Compiling any_ascii v0.1.7
Compiling robust v0.2.3
Compiling quick-error v1.2.3
Compiling iana-time-zone v0.1.45
Compiling futures-timer v3.0.2
Compiling arc-swap v1.5.1
Compiling urlencoding v2.1.0
Compiling os_str_bytes v6.3.0
Compiling deunicode v1.3.1
Compiling utf8parse v0.2.0
Compiling scoped-tls v1.0.0
Compiling trice v0.1.0
Compiling unicode-width v0.1.9
Compiling half v1.8.2
Compiling termcolor v1.1.3
Compiling textwrap v0.15.0
Compiling libloading v0.7.3
Compiling instant v0.1.12
Compiling geographiclib-rs v0.2.1
Compiling tracing-core v0.1.29
Compiling thread_local v1.1.4
Compiling itertools v0.9.0
Compiling itertools v0.10.3
Compiling form_urlencoded v1.0.1
Compiling tinyvec v1.6.0
Compiling pem v1.1.0
Compiling concurrent-queue v1.2.4
Compiling nibble_vec v0.1.0
Compiling value-bag v1.0.0-alpha.9
Compiling generic-array v0.14.6
Compiling nom v5.1.2
Compiling standback v0.2.17
Compiling unicase v2.6.0
Compiling time v0.2.27
Compiling proc-macro-error-attr v1.0.4
Compiling cookie v0.14.4
Compiling proc-macro-error v1.0.4
Compiling openssl-src v111.22.0+1.1.1q
Compiling cmake v0.1.48
Compiling http v0.2.8
Compiling slab v0.4.7
Compiling lock_api v0.4.7
Compiling tokio v1.20.1
Compiling indexmap v1.9.1
Compiling num-traits v0.2.15
Compiling num-integer v0.1.45
Compiling async-io v1.8.0
Compiling num-bigint v0.4.3
Compiling hash32 v0.2.1
Compiling walkdir v2.3.2
Compiling async-lock v2.5.0
Compiling foreign-types v0.3.2
Compiling miniz_oxide v0.5.3
Compiling heck v0.3.3
Compiling clang-sys v1.3.3
Compiling alloc-stdlib v0.2.1
Compiling lexical-sort v0.3.1
Compiling clap_lex v0.2.4
Compiling fuzzy-matcher v0.3.7
Compiling radix_trie v0.2.1
Compiling libz-sys v1.1.8
Compiling libnghttp2-sys v0.1.7+1.45.0
Compiling curl-sys v0.4.56+curl-7.83.1
Compiling ring v0.16.20
Compiling rquickjs-sys v0.1.6
Compiling boringssl-src v0.2.0
Compiling openssl-sys v0.9.75
Compiling brotli-decompressor v2.3.2
Compiling async-channel v1.7.1
Compiling unicode-normalization v0.1.21
Compiling aho-corasick v0.7.18
Compiling futures-lite v1.12.0
Compiling twoway v0.1.8
Compiling buf_redux v0.8.4
Compiling nom v7.1.1
Compiling flate2 v1.0.24
Compiling http-body v0.4.5
Compiling headers-core v0.2.0
Compiling rustc_version v0.4.0
Compiling sluice v0.5.5
Compiling which v4.2.5
Compiling tempfile v3.3.0
Compiling mime_guess v2.0.4
Compiling spin v0.9.4
Compiling spinning_top v0.2.4
Compiling bitmaps v2.1.0
Compiling idna v0.2.3
Compiling flume v0.9.2
Compiling prost-build v0.9.0
Compiling prost-build v0.7.0
Compiling petgraph v0.6.2
Compiling petgraph v0.5.1
Compiling heapless v0.7.16
Compiling regex v1.6.0
Compiling async-executor v1.4.1
Compiling blocking v1.2.0
Compiling brotli v3.3.4
Compiling num_cpus v1.13.1
Compiling getrandom v0.2.7
Compiling socket2 v0.4.4
Compiling procfs v0.9.1
Compiling atty v0.2.14
Compiling time v0.3.13
Compiling time v0.1.44
Compiling dirs-sys-next v0.1.2
Compiling nix v0.24.2
Compiling sized-chunks v0.6.5
Compiling crypto-common v0.1.6
Compiling block-buffer v0.10.2
Compiling digest v0.9.0
Compiling block-buffer v0.9.0
Compiling cipher v0.2.5
Compiling universal-hash v0.4.1
Compiling crypto-mac v0.10.1
Compiling aead v0.3.2
Compiling inout v0.1.3
Compiling rand_core v0.6.3
Compiling nanorand v0.7.0
Compiling rand_core v0.5.1
Compiling colored v1.9.3
Compiling clap v3.2.17
Compiling approx v0.5.1
Compiling rmp v0.8.11
Compiling float_next_after v0.1.5
Compiling parking_lot v0.11.2
Compiling dirs-next v2.0.0
Compiling digest v0.10.3
Compiling sha2 v0.9.9
Compiling sha-1 v0.9.8
Compiling polyval v0.4.5
Compiling hmac v0.10.1
Compiling aes-soft v0.6.4
Compiling ctr v0.6.0
Compiling cipher v0.4.3
Compiling rand_chacha v0.3.1
Compiling password-hash v0.4.2
Compiling rand_xoshiro v0.6.0
Compiling rand_chacha v0.2.2
Compiling cexpr v0.4.0
Compiling dmp v0.1.1
Compiling ghash v0.3.1
Compiling hkdf v0.10.0
Compiling hmac v0.12.1
Compiling sha-1 v0.10.0
Compiling sha2 v0.10.2
Compiling blake2 v0.10.4
Compiling md-5 v0.10.1
Compiling aes v0.6.0
Compiling salsa20 v0.10.2
Compiling rand v0.7.3
Compiling rand v0.8.5
Compiling imbl v1.0.1
Compiling headers v0.3.7
Compiling rstar v0.9.3
Compiling aes-gcm v0.8.0
Compiling pbkdf2 v0.11.0
Compiling argon2 v0.4.1
Compiling scrypt v0.10.0
Compiling fd-lock v3.0.6
Compiling sct v0.6.1
Compiling webpki v0.21.4
Compiling toml v0.5.9
Compiling nanoid v0.4.0
Compiling wasm-bindgen-backend v0.2.82
Compiling darling_core v0.14.1
Compiling wasm-bindgen-macro-support v0.2.82
Compiling rquickjs-core v0.1.6
Compiling ctor v0.1.23
Compiling thiserror-impl v1.0.32
Compiling futures-macro v0.3.23
Compiling tokio-macros v1.8.0
Compiling tracing-attributes v0.1.22
Compiling prost-derive v0.7.0
Compiling prost-derive v0.9.0
Compiling pin-project-internal v1.0.12
Compiling derive-new v0.5.9
Compiling openssl-macros v0.1.0
Compiling time-macros-impl v0.1.2
Compiling async-recursion v1.0.0
Compiling surrealdb-derive v0.3.0
Compiling wasm-bindgen-macro v0.2.82
Compiling darling_macro v0.14.1
Compiling grpcio-sys v0.8.1
Compiling time-macros v0.1.1
Compiling darling v0.14.1
Compiling mio v0.8.4
Compiling want v0.3.0
Compiling polling v2.2.0
Compiling kv-log-macro v1.0.7
Compiling rustls v0.19.1
Compiling fail v0.4.0
Compiling multipart v0.18.0
Compiling fern v0.6.1
Compiling rustyline v10.0.0
Compiling pin-project v1.0.12
Compiling flume v0.10.14
Compiling js-sys v0.3.59
Compiling tracing v0.1.36
Compiling thiserror v1.0.32
Compiling prost v0.9.0
Compiling async-global-executor v2.2.0
Compiling proc-macro-crate v1.2.1
Compiling simple_asn1 v0.6.2
Compiling tracing-futures v0.2.5
Compiling async-std v1.12.0
Compiling prost v0.7.0
Compiling prost-types v0.9.0
Compiling rquickjs-macro v0.1.6
Compiling prost-types v0.7.0
Compiling grpcio-compiler v0.10.0
Compiling web-sys v0.3.59
Compiling protobuf-build v0.12.3
Compiling futures-executor v0.3.23
Compiling futures v0.3.23
Compiling tikv-client-proto v0.1.0
Compiling tokio-util v0.7.3
Compiling echodb v0.3.0
Compiling tokio-util v0.6.10
Compiling tokio-rustls v0.22.0
Compiling async-compression v0.3.14
Compiling tokio-stream v0.1.9
Compiling rquickjs v0.1.6
Compiling h2 v0.3.14
Compiling url v2.2.2
Compiling serde_urlencoded v0.7.1
Compiling serde_qs v0.8.5
Compiling geo-types v0.7.6
Compiling bigdecimal v0.3.0
Compiling storekey v0.3.0
Compiling chrono v0.4.22
Compiling uuid v1.1.2
Compiling rmp-serde v1.1.0
Compiling serde_cbor v0.11.2
Compiling tungstenite v0.14.0
Compiling geo v0.22.1
Compiling jsonwebtoken v8.1.1
Compiling tokio-tungstenite v0.15.0
Compiling tokio-native-tls v0.3.0
Compiling hyper v0.14.20
Compiling http-client v6.5.3
Compiling surf v2.3.2
Compiling hyper-tls v0.5.0
Compiling warp v0.3.2
Compiling reqwest v0.11.11
error: failed to run custom build command for `grpcio-sys v0.8.1`
Caused by:
process didn't exit successfully: `/home/chris/src/surrealdb/target/debug/build/grpcio-sys-1ce1c20ccd61db10/build-script-build` (exit status: 101)
--- stdout
cargo:rerun-if-changed=grpc_wrap.cc
cargo:rerun-if-changed=grpc
cargo:rerun-if-env-changed=UPDATE_BIND
cargo:rerun-if-env-changed=CARGO_CFG_TARGET_OS
cargo:rerun-if-env-changed=GRPCIO_SYS_USE_PKG_CONFIG
cargo:rerun-if-env-changed=CARGO_CFG_TARGET_OS
cargo:rerun-if-env-changed=CARGO_CFG_TARGET_OS
cargo:rerun-if-env-changed=CARGO_CFG_TARGET_OS
cargo:rerun-if-env-changed=CXX
OPT_LEVEL = Some("0")
TARGET = Some("x86_64-unknown-linux-gnu")
HOST = Some("x86_64-unknown-linux-gnu")
CC_x86_64-unknown-linux-gnu = None
CC_x86_64_unknown_linux_gnu = None
HOST_CC = None
CC = None
CFLAGS_x86_64-unknown-linux-gnu = None
CFLAGS_x86_64_unknown_linux_gnu = None
HOST_CFLAGS = None
CFLAGS = None
CRATE_CC_NO_DEFAULTS = None
DEBUG = Some("true")
CARGO_CFG_TARGET_FEATURE = Some("fxsr,sse,sse2")
cargo:rustc-link-search=native=/home/chris/src/surrealdb/target/debug/build/libz-sys-a9b86f3dea7681f0/out/build
cargo:rustc-link-search=native=/home/chris/src/surrealdb/target/debug/build/libz-sys-a9b86f3dea7681f0/out/lib
CMAKE_TOOLCHAIN_FILE_x86_64-unknown-linux-gnu = None
CMAKE_TOOLCHAIN_FILE_x86_64_unknown_linux_gnu = None
HOST_CMAKE_TOOLCHAIN_FILE = None
CMAKE_TOOLCHAIN_FILE = None
CMAKE_GENERATOR_x86_64-unknown-linux-gnu = None
CMAKE_GENERATOR_x86_64_unknown_linux_gnu = None
HOST_CMAKE_GENERATOR = None
CMAKE_GENERATOR = None
CMAKE_PREFIX_PATH_x86_64-unknown-linux-gnu = None
CMAKE_PREFIX_PATH_x86_64_unknown_linux_gnu = None
HOST_CMAKE_PREFIX_PATH = None
CMAKE_PREFIX_PATH = Some("/home/chris/src/surrealdb/target/debug/build/libz-sys-a9b86f3dea7681f0/out/build")
CMAKE_x86_64-unknown-linux-gnu = None
CMAKE_x86_64_unknown_linux_gnu = None
HOST_CMAKE = None
CMAKE = None
running: "cmake" "/home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc" "-DgRPC_INSTALL=false" "-DgRPC_BUILD_CSHARP_EXT=false" "-DgRPC_BUILD_CODEGEN=false" "-DgRPC_BENCHMARK_PROVIDER=none" "-DgRPC_SSL_PROVIDER=package" "-DgRPC_ZLIB_PROVIDER=package" "-DCMAKE_INSTALL_PREFIX=/home/chris/src/surrealdb/target/debug/build/grpcio-sys-f8d84882fc745783/out" "-DCMAKE_C_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_C_COMPILER=/usr/bin/cc" "-DCMAKE_CXX_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_CXX_COMPILER=/usr/bin/c++" "-DCMAKE_ASM_FLAGS= -ffunction-sections -fdata-sections -fPIC -m64" "-DCMAKE_ASM_COMPILER=/usr/bin/cc" "-DCMAKE_BUILD_TYPE=Debug"
-- The C compiler identification is GNU 11.2.0
-- The CXX compiler identification is GNU 11.2.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /usr/bin/cc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Looking for pthread.h
-- Looking for pthread.h - found
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
-- Looking for res_servicename in resolv
-- Looking for res_servicename in resolv - not found
-- Looking for gethostbyname in nsl
-- Looking for gethostbyname in nsl - found
-- Looking for gethostbyname in socket
-- Looking for gethostbyname in socket - not found
-- Looking for socket in socket
-- Looking for socket in socket - not found
-- Looking for clock_gettime in rt
-- Looking for clock_gettime in rt - found
-- Looking for include file sys/types.h
-- Looking for include file sys/types.h - found
-- Looking for include file sys/socket.h
-- Looking for include file sys/socket.h - found
-- Looking for include file arpa/inet.h
-- Looking for include file arpa/inet.h - found
-- Looking for include file arpa/nameser_compat.h
-- Looking for include file arpa/nameser_compat.h - found
-- Looking for include file arpa/nameser.h
-- Looking for include file arpa/nameser.h - found
-- Looking for include file assert.h
-- Looking for include file assert.h - found
-- Looking for include file errno.h
-- Looking for include file errno.h - found
-- Looking for include file fcntl.h
-- Looking for include file fcntl.h - found
-- Looking for include file inttypes.h
-- Looking for include file inttypes.h - found
-- Looking for include file limits.h
-- Looking for include file limits.h - found
-- Looking for include file malloc.h
-- Looking for include file malloc.h - found
-- Looking for include file memory.h
-- Looking for include file memory.h - found
-- Looking for include file netdb.h
-- Looking for include file netdb.h - found
-- Looking for include file netinet/in.h
-- Looking for include file netinet/in.h - found
-- Looking for include file netinet/tcp.h
-- Looking for include file netinet/tcp.h - found
-- Looking for include file net/if.h
-- Looking for include file net/if.h - found
-- Looking for include file signal.h
-- Looking for include file signal.h - found
-- Looking for include file socket.h
-- Looking for include file socket.h - not found
-- Looking for include file stdbool.h
-- Looking for include file stdbool.h - found
-- Looking for include file stdint.h
-- Looking for include file stdint.h - found
-- Looking for include file stdlib.h
-- Looking for include file stdlib.h - found
-- Looking for include file strings.h
-- Looking for include file strings.h - found
-- Looking for include file string.h
-- Looking for include file string.h - found
-- Looking for include file stropts.h
-- Looking for include file stropts.h - not found
-- Looking for include file sys/ioctl.h
-- Looking for include file sys/ioctl.h - found
-- Looking for include file sys/param.h
-- Looking for include file sys/param.h - found
-- Looking for include file sys/select.h
-- Looking for include file sys/select.h - found
-- Looking for include file sys/stat.h
-- Looking for include file sys/stat.h - found
-- Looking for include file sys/time.h
-- Looking for include file sys/time.h - found
-- Looking for include file sys/uio.h
-- Looking for include file sys/uio.h - found
-- Looking for include file time.h
-- Looking for include file time.h - found
-- Looking for include file dlfcn.h
-- Looking for include file dlfcn.h - found
-- Looking for include file unistd.h
-- Looking for include file unistd.h - found
-- Looking for include files winsock2.h, windows.h
-- Looking for include files winsock2.h, windows.h - not found
-- Looking for 3 include files winsock2.h, ..., windows.h
-- Looking for 3 include files winsock2.h, ..., windows.h - not found
-- Looking for include files winsock.h, windows.h
-- Looking for include files winsock.h, windows.h - not found
-- Looking for include file windows.h
-- Looking for include file windows.h - not found
-- Performing Test HAVE_SOCKLEN_T
-- Performing Test HAVE_SOCKLEN_T - Success
-- Performing Test HAVE_TYPE_SOCKET
-- Performing Test HAVE_TYPE_SOCKET - Failed
-- Performing Test HAVE_BOOL_T
-- Performing Test HAVE_BOOL_T - Success
-- Performing Test HAVE_SSIZE_T
-- Performing Test HAVE_SSIZE_T - Success
-- Performing Test HAVE_LONGLONG
-- Performing Test HAVE_LONGLONG - Success
-- Performing Test HAVE_SIG_ATOMIC_T
-- Performing Test HAVE_SIG_ATOMIC_T - Success
-- Performing Test HAVE_STRUCT_ADDRINFO
-- Performing Test HAVE_STRUCT_ADDRINFO - Success
-- Performing Test HAVE_STRUCT_IN6_ADDR
-- Performing Test HAVE_STRUCT_IN6_ADDR - Success
-- Performing Test HAVE_STRUCT_SOCKADDR_IN6
-- Performing Test HAVE_STRUCT_SOCKADDR_IN6 - Success
-- Performing Test HAVE_STRUCT_SOCKADDR_STORAGE
-- Performing Test HAVE_STRUCT_SOCKADDR_STORAGE - Success
-- Performing Test HAVE_STRUCT_TIMEVAL
-- Performing Test HAVE_STRUCT_TIMEVAL - Success
-- Looking for AF_INET6
-- Looking for AF_INET6 - found
-- Looking for O_NONBLOCK
-- Looking for O_NONBLOCK - found
-- Looking for FIONBIO
-- Looking for FIONBIO - found
-- Looking for SIOCGIFADDR
-- Looking for SIOCGIFADDR - found
-- Looking for MSG_NOSIGNAL
-- Looking for MSG_NOSIGNAL - found
-- Looking for PF_INET6
-- Looking for PF_INET6 - found
-- Looking for SO_NONBLOCK
-- Looking for SO_NONBLOCK - not found
-- Looking for CLOCK_MONOTONIC
-- Looking for CLOCK_MONOTONIC - found
-- Performing Test HAVE_SOCKADDR_IN6_SIN6_SCOPE_ID
-- Performing Test HAVE_SOCKADDR_IN6_SIN6_SCOPE_ID - Success
-- Performing Test HAVE_LL
-- Performing Test HAVE_LL - Success
-- Looking for bitncmp
-- Looking for bitncmp - not found
-- Looking for closesocket
-- Looking for closesocket - not found
-- Looking for CloseSocket
-- Looking for CloseSocket - not found
-- Looking for connect
-- Looking for connect - found
-- Looking for fcntl
-- Looking for fcntl - found
-- Looking for freeaddrinfo
-- Looking for freeaddrinfo - found
-- Looking for getaddrinfo
-- Looking for getaddrinfo - found
-- Looking for getenv
-- Looking for getenv - found
-- Looking for gethostbyaddr
-- Looking for gethostbyaddr - found
-- Looking for gethostbyname
-- Looking for gethostbyname - found
-- Looking for gethostname
-- Looking for gethostname - found
-- Looking for getnameinfo
-- Looking for getnameinfo - found
-- Looking for getservbyport_r
-- Looking for getservbyport_r - found
-- Looking for gettimeofday
-- Looking for gettimeofday - found
-- Looking for if_indextoname
-- Looking for if_indextoname - found
-- Looking for inet_net_pton
-- Looking for inet_net_pton - not found
-- Looking for inet_ntop
-- Looking for inet_ntop - found
-- Looking for inet_pton
-- Looking for inet_pton - found
-- Looking for ioctl
-- Looking for ioctl - found
-- Looking for ioctlsocket
-- Looking for ioctlsocket - not found
-- Looking for IoctlSocket
-- Looking for IoctlSocket - not found
-- Looking for recv
-- Looking for recv - found
-- Looking for recvfrom
-- Looking for recvfrom - found
-- Looking for send
-- Looking for send - found
-- Looking for setsockopt
-- Looking for setsockopt - found
-- Looking for socket
-- Looking for socket - found
-- Looking for strcasecmp
-- Looking for strcasecmp - found
-- Looking for strcmpi
-- Looking for strcmpi - not found
-- Looking for strdup
-- Looking for strdup - found
-- Looking for stricmp
-- Looking for stricmp - not found
-- Looking for strncasecmp
-- Looking for strncasecmp - found
-- Looking for strncmpi
-- Looking for strncmpi - not found
-- Looking for strnicmp
-- Looking for strnicmp - not found
-- Looking for writev
-- Looking for writev - found
-- Looking for __system_property_get
-- Looking for __system_property_get - not found
-- Found OpenSSL: /home/chris/src/surrealdb/target/debug/build/openssl-sys-53d5f7ede8b04507/out/openssl-build/install/lib/libcrypto.a (found version "1.1.1q")
-- Found ZLIB: /home/chris/src/surrealdb/target/debug/build/libz-sys-a9b86f3dea7681f0/out/lib/libz.a (found version "1.2.11")
-- Configuring done
-- Generating done
-- Build files have been written to: /home/chris/src/surrealdb/target/debug/build/grpcio-sys-f8d84882fc745783/out/build
running: "cmake" "--build" "." "--target" "grpc" "--config" "Debug" "--parallel" "32"
[ 0%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/bitstate.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_log_severity.dir/log_severity.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/numeric/CMakeFiles/absl_int128.dir/int128.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_civil_time.dir/internal/cctz/src/civil_time_detail.cc.o
[ 0%] Building C object CMakeFiles/address_sorting.dir/third_party/address_sorting/address_sorting.c.o
[ 0%] Building C object CMakeFiles/address_sorting.dir/third_party/address_sorting/address_sorting_posix.c.o
[ 0%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/compile.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_spinlock_wait.dir/internal/spinlock_wait.cc.o
[ 0%] Building C object CMakeFiles/address_sorting.dir/third_party/address_sorting/address_sorting_windows.c.o
[ 0%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/filtered_re2.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_format.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/hash/CMakeFiles/absl_city.dir/internal/city.cc.o
[ 0%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_if.cc.o
[ 0%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/mimics_pcre.cc.o
[ 1%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_info.cc.o
[ 1%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_fixed.cc.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/onepass.cc.o
[ 1%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/dfa.cc.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/parse.cc.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_libc.cc.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_impl.cc.o
[ 3%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/decode_fast.c.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/nfa.cc.o
[ 3%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/decode.c.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_exponential_biased.dir/internal/exponential_biased.cc.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_lookup.cc.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/time_zone_posix.cc.o
[ 3%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/def.c.o
[ 3%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time_zone.dir/internal/cctz/src/zone_info_source.cc.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/perl_groups.cc.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/prefilter.cc.o
[ 3%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares__close_sockets.c.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/prefilter_tree.cc.o
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/prog.cc.o
[ 3%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/encode.c.o
[ 3%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares__get_hostent.c.o
[ 3%] Linking C static library libaddress_sorting.a
[ 3%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/re2.cc.o
[ 3%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/json_decode.c.o
[ 5%] Linking CXX static library libabsl_spinlock_wait.a
[ 5%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/json_encode.c.o
[ 6%] Linking CXX static library libabsl_city.a
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares__read_line.c.o
[ 6%] Linking CXX static library libabsl_exponential_biased.a
[ 6%] Built target address_sorting
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares__timeval.c.o
[ 6%] Built target absl_spinlock_wait
[ 6%] Linking CXX static library libabsl_log_severity.a
[ 6%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/msg.c.o
[ 6%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/reflection.c.o
[ 6%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/regexp.cc.o
[ 6%] Built target absl_city
[ 6%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/set.cc.o
[ 6%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/table.c.o
[ 6%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/text_encode.c.o
[ 6%] Built target absl_exponential_biased
[ 6%] Building C object CMakeFiles/upb.dir/third_party/upb/upb/upb.c.o
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_android.c.o
[ 6%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/stringpiece.cc.o
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_cancel.c.o
[ 6%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/simplify.cc.o
[ 6%] Built target absl_log_severity
[ 6%] Building C object CMakeFiles/upb.dir/src/core/ext/upb-generated/google/protobuf/descriptor.upb.c.o
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_data.c.o
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_destroy.c.o
[ 6%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_expand_name.c.o
[ 6%] Linking CXX static library libabsl_civil_time.a
[ 6%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/unicode_casefold.cc.o
[ 8%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_expand_string.c.o
[ 8%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_fds.c.o
[ 8%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_free_hostent.c.o
[ 8%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/tostring.cc.o
[ 10%] Linking C static library libupb.a
[ 10%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_raw_logging_internal.dir/internal/raw_logging.cc.o
[ 10%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_getenv.c.o
[ 10%] Linking CXX static library libabsl_int128.a
[ 12%] Building CXX object third_party/re2/CMakeFiles/re2.dir/re2/unicode_groups.cc.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_free_string.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_gethostbyaddr.c.o
[ 12%] Building CXX object third_party/re2/CMakeFiles/re2.dir/util/rune.cc.o
[ 12%] Built target absl_civil_time
[ 12%] Building CXX object third_party/re2/CMakeFiles/re2.dir/util/strutil.cc.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_gethostbyname.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_getnameinfo.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_getsock.c.o
[ 12%] Built target absl_int128
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_init.c.o
[ 12%] Built target upb
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_aaaa_reply.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_library_init.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_nowarn.c.o
[ 12%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_options.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_create_query.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_llist.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_mkquery.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_a_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_mx_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_naptr_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_soa_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_srv_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_ns_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_platform.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_ptr_reply.c.o
[ 13%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_parse_txt_reply.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_process.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_query.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_send.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_strerror.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_strdup.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_search.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_strcasecmp.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_strsplit.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_version.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_writev.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/ares_timeout.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/inet_net_pton.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/bitncmp.c.o
[ 15%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/inet_ntop.c.o
[ 17%] Building C object third_party/cares/cares/CMakeFiles/c-ares.dir/windows_port.c.o
[ 17%] Linking CXX static library libabsl_raw_logging_internal.a
[ 17%] Linking C static library lib/libcares.a
[ 17%] Built target absl_raw_logging_internal
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings_internal.dir/internal/escaping.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_base.dir/internal/spinlock.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_base.dir/internal/cycleclock.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings_internal.dir/internal/ostringstream.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_base.dir/internal/sysinfo.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings_internal.dir/internal/utf8.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_debugging_internal.dir/internal/address_is_readable.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/types/CMakeFiles/absl_bad_variant_access.dir/bad_variant_access.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_base.dir/internal/thread_identity.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_throw_delegate.dir/internal/throw_delegate.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_debugging_internal.dir/internal/elf_mem_image.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/types/CMakeFiles/absl_bad_optional_access.dir/bad_optional_access.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_debugging_internal.dir/internal/vdso_support.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_base.dir/internal/unscaledcycleclock.cc.o
[ 17%] Built target c-ares
[ 17%] Linking CXX static library libabsl_time_zone.a
[ 17%] Built target absl_time_zone
[ 17%] Linking CXX static library libabsl_bad_optional_access.a
[ 17%] Linking CXX static library libabsl_strings_internal.a
[ 17%] Linking CXX static library libabsl_bad_variant_access.a
[ 17%] Built target absl_bad_optional_access
[ 17%] Linking CXX static library libabsl_throw_delegate.a
[ 17%] Built target absl_strings_internal
[ 17%] Linking CXX static library libabsl_debugging_internal.a
[ 17%] Built target absl_bad_variant_access
[ 17%] Built target absl_throw_delegate
[ 17%] Built target absl_debugging_internal
[ 17%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_stacktrace.dir/stacktrace.cc.o
[ 17%] Linking CXX static library libabsl_base.a
[ 17%] Built target absl_base
[ 17%] Building CXX object third_party/abseil-cpp/absl/base/CMakeFiles/absl_malloc_internal.dir/internal/low_level_alloc.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/charconv.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/ascii.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_demangle_internal.dir/internal/demangle.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/str_cat.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/internal/memutil.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/match.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/numbers.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/internal/charconv_bigint.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/str_split.cc.o
[ 17%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/escaping.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/str_replace.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/internal/charconv_parse.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/string_view.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_strings.dir/substitute.cc.o
[ 18%] Linking CXX static library libabsl_stacktrace.a
[ 18%] Linking CXX static library libre2.a
[ 18%] Linking CXX static library libabsl_demangle_internal.a
[ 18%] Built target absl_stacktrace
[ 18%] Built target absl_demangle_internal
[ 18%] Built target re2
[ 18%] Linking CXX static library libabsl_malloc_internal.a
[ 18%] Built target absl_malloc_internal
[ 18%] Building CXX object third_party/abseil-cpp/absl/synchronization/CMakeFiles/absl_graphcycles_internal.dir/internal/graphcycles.cc.o
[ 18%] Linking CXX static library libabsl_strings.a
[ 18%] Built target absl_strings
[ 18%] Building CXX object third_party/abseil-cpp/absl/debugging/CMakeFiles/absl_symbolize.dir/symbolize.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_cord.dir/cord.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time.dir/civil_time.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/arg.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/bind.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time.dir/clock.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/hash/CMakeFiles/absl_hash.dir/internal/hash.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/extension.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/float_conversion.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time.dir/duration.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time.dir/format.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/time/CMakeFiles/absl_time.dir/time.cc.o
[ 18%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/output.cc.o
[ 20%] Building CXX object third_party/abseil-cpp/absl/strings/CMakeFiles/absl_str_format_internal.dir/internal/str_format/parser.cc.o
[ 20%] Linking CXX static library libabsl_hash.a
[ 20%] Linking CXX static library libabsl_symbolize.a
[ 20%] Built target absl_hash
[ 20%] Built target absl_symbolize
[ 22%] Linking CXX static library libabsl_time.a
[ 22%] Built target absl_time
[ 22%] Linking CXX static library libabsl_str_format_internal.a
[ 22%] Built target absl_str_format_internal
[ 22%] Linking CXX static library libabsl_cord.a
[ 22%] Built target absl_cord
--- stderr
CMake Warning at cmake/protobuf.cmake:51 (message):
gRPC_PROTOBUF_PROVIDER is "module" but PROTOBUF_ROOT_DIR is wrong
Call Stack (most recent call first):
CMakeLists.txt:254 (include)
CMake Warning:
Manually-specified variables were not used by the project:
CMAKE_ASM_COMPILER
CMAKE_ASM_FLAGS
gmake: warning: -j32 forced in submake: resetting jobserver mode.
/home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc: In member function ‘void absl::lts_2020_09_23::synchronization_internal::GraphCycles::RemoveNode(void*)’:
/home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc:451:26: error: ‘numeric_limits’ is not a member of ‘std’
451 | if (x->version == std::numeric_limits<uint32_t>::max()) {
| ^~~~~~~~~~~~~~
/home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc:451:49: error: expected primary-expression before ‘>’ token
451 | if (x->version == std::numeric_limits<uint32_t>::max()) {
| ^
/home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc:451:52: error: ‘::max’ has not been declared; did you mean ‘std::max’?
451 | if (x->version == std::numeric_limits<uint32_t>::max()) {
| ^~~
| std::max
In file included from /usr/include/c++/11/algorithm:62,
from /home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/grpcio-sys-0.8.1/grpc/third_party/abseil-cpp/absl/synchronization/internal/graphcycles.cc:38:
/usr/include/c++/11/bits/stl_algo.h:3467:5: note: ‘std::max’ declared here
3467 | max(initializer_list<_Tp> __l, _Compare __comp)
| ^~~
gmake[3]: *** [third_party/abseil-cpp/absl/synchronization/CMakeFiles/absl_graphcycles_internal.dir/build.make:76: third_party/abseil-cpp/absl/synchronization/CMakeFiles/absl_graphcycles_internal.dir/internal/graphcycles.cc.o] Error 1
gmake[2]: *** [CMakeFiles/Makefile2:3153: third_party/abseil-cpp/absl/synchronization/CMakeFiles/absl_graphcycles_internal.dir/all] Error 2
gmake[2]: *** Waiting for unfinished jobs....
gmake[1]: *** [CMakeFiles/Makefile2:848: CMakeFiles/grpc.dir/rule] Error 2
gmake: *** [Makefile:247: grpc] Error 2
thread 'main' panicked at '
command did not execute successfully, got: exit status: 2
build script failed, must exit now', /home/chris/.cargo/registry/src/github.com-1ecc6299db9ec823/cmake-0.1.48/src/lib.rs:975:5
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
warning: build failed, waiting for other jobs to finish...
cargo build
on ubuntu
Build should work.
surreal for linux
No response
The commands in that document contain -vvv, which is not (or is no longer) supported. Furthermore, running without root authentication appears to lead only to an authentication failure.
$ cargo run -- -vvv start memory
Finished dev [unoptimized + debuginfo] target(s) in 0.17s
Running `target/debug/surreal -vvv start memory`
error: Found argument '-v' which wasn't expected, or isn't valid in this context
If you tried to supply `-v` as a value rather than a flag, use `-- -v`
USAGE:
surreal [SUBCOMMAND]
For more information try --help
Commands in that document should just run.
surreal 1.0.0-beta.7 for linux on x86_64
No response
The syntax for defining an embedded JavaScript function is a little obtuse and obscure.
CREATE event SET start_time = fn::future -> {
time::now() + 1w
};
We could simplify this and make the syntax similar to the casting expressions...
CREATE event SET start_time = <future> {
time::now() + 1w
};
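The idea behind the proposed <future> block can be illustrated outside SurrealQL as a deferred computation that is only evaluated when the value is read. A minimal Python sketch of the concept (the Future class here is hypothetical and not part of SurrealDB):

```python
from datetime import datetime, timedelta


class Future:
    """A value computed lazily, on access -- analogous to the proposed <future> block."""

    def __init__(self, compute):
        self._compute = compute

    def value(self):
        # Evaluated at read time, so the result reflects the clock at access, not at definition.
        return self._compute()


# Rough analogue of: start_time = <future> { time::now() + 1w }
start_time = Future(lambda: datetime.now() + timedelta(weeks=1))
```

Each call to start_time.value() re-evaluates the expression, mirroring how a future field would be recomputed whenever the record is selected.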
No other alternatives for this syntax improvement.
surreal 1.0.0-beta.2 for macos on aarch64
No response
When using the math::sum() function on a field which is a number, and when not using a GROUP BY clause to aggregate the record values, the function should return a number instead of NONE.
Running the following query...
INSERT INTO player (agility, strength, scores) VALUES (10, 10, [97, 83, 79]);
INSERT INTO player (agility, strength, scores) VALUES (10, 50, [87, 90, 88]);
SELECT math::sum(strength) AS strength FROM player;
currently returns...
[
{
"time": "203.041µs",
"status": "OK",
"result": [
{
"strength": null
},
{
"strength": null
}
]
}
]
If the field value is a number rather than an array of numbers, and when not using a GROUP BY clause, we would expect the function to return the field value as a number...
[
{
"time": "203.041µs",
"status": "OK",
"result": [
{
"strength": 10
},
{
"strength": 50
}
]
}
]
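The expected behaviour amounts to a pass-through for scalar numbers and a sum for arrays. A minimal Python model of the proposed semantics (math_sum is a hypothetical stand-in for SurrealDB's math::sum, not its actual implementation):

```python
def math_sum(value):
    """Proposed semantics: sum an array of numbers, pass a scalar number through."""
    if isinstance(value, list):
        return sum(value)
    if isinstance(value, (int, float)):
        return value
    return None  # NONE for any other type


# Mirrors the records above: scores is an array, strength is a scalar.
print(math_sum([97, 83, 79]))  # 259
print(math_sum(10))            # 10, not None
```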
surreal 1.0.0-beta.6 for macos on aarch64
No response
Strings that are not valid UTF-8 result in the database throwing the following error:
There is a problem with your request. Refer to the documentation for further information.
> SELECT * FROM is::uuid("67e55044-10b1-426f-9247-bb680e5\0e0c8");
{
"code":400,
"details":"Request problems detected",
"description":"There is a problem with your request. Refer to the documentation for further information.",
"information":"There was a problem with the database: Parse error on line 1 at character 16 when parsing '::uuid(\"67e55044-10b1-426f-9247-bb680e5\\0e0c8\");'"
}
This is not limited to UUIDs or to validation functions. Non-UTF-8 strings should be handled properly; in this particular case the string should be forwarded to is::uuid, which will then return false.
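The desired behaviour, where an invalid candidate string simply fails validation rather than aborting the whole query, can be sketched with Python's standard uuid module (is_uuid here is a hypothetical stand-in for SurrealDB's is::uuid):

```python
import uuid


def is_uuid(candidate: str) -> bool:
    """Return False for any string that is not a valid UUID, instead of raising an error."""
    try:
        uuid.UUID(candidate)
        return True
    except ValueError:
        return False


print(is_uuid("67e55044-10b1-426f-9247-bb680e50e0c8"))   # True
print(is_uuid("67e55044-10b1-426f-9247-bb680e5\0e0c8"))  # False: NUL byte in the string
```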
surreal 1.0.0-beta.7 for linux on x86_64
No response
Attempting to DEFINE NAMESPACE or DEFINE DATABASE without first selecting the NAMESPACE and DATABASE to use results in a failure to execute the query. The database appears to need both a NAMESPACE and a DATABASE selected before running any queries.
DEFINE NAMESPACE test;
[
{
"time": "76.958µs",
"status": "ERR",
"detail": "Specify a namespace to use"
}
]
USE NAMESPACE test;
DEFINE NAMESPACE test;
[
{
"time": "76.958µs",
"status": "ERR",
"detail": "Specify a database to use"
}
]
USE NAMESPACE test DATABASE test;
DEFINE NAMESPACE test;
[
{
"time": "48.875µs",
"status": "OK",
"result": null
}
]
The database should allow the user to create the NAMESPACE or DATABASE if they have the correct permissions, and without needing to first select the NAMESPACE or DATABASE.
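The proposed behaviour can be modelled as a session in which definitions only touch the catalogue, while selection is a separate concern. A toy Python sketch of that separation (hypothetical names, not SurrealDB's implementation):

```python
class Session:
    """Toy model: defining a namespace should not require one to be selected."""

    def __init__(self):
        self.namespaces = set()  # the catalogue of defined namespaces
        self.current_ns = None   # the selection, independent of definitions

    def define_namespace(self, name):
        # Proposed: succeeds with the right permissions, no prior USE required.
        self.namespaces.add(name)

    def use_namespace(self, name):
        if name not in self.namespaces:
            raise KeyError(f"namespace {name!r} does not exist")
        self.current_ns = name


s = Session()
s.define_namespace("test")  # works with no namespace selected
s.use_namespace("test")
```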
surreal 1.0.0-beta.3 for macos on aarch64
No response
When the server starts up, there is currently no obvious way of knowing whether root authentication is enabled or disabled.
We should log whether root authentication is enabled or not when the server is started.
Not applicable.
surreal 1.0.0-beta.2 for macos on aarch64
No response