arangodb / arangodb

🥑 ArangoDB is a native multi-model database with flexible data models for documents, graphs, and key-values. Build high-performance applications using a convenient SQL-like query language or JavaScript extensions.

Home Page: https://www.arangodb.com

License: Other

JavaScript 45.67% HTML 0.02% C++ 49.98% CMake 0.42% Shell 0.17% C 0.33% CSS 0.27% Python 0.38% Yacc 0.12% LLVM 0.02% PowerShell 0.01% TypeScript 1.04% SCSS 0.34% EJS 0.19% NASL 1.03% Dockerfile 0.01% SourcePawn 0.02% Batchfile 0.01% Makefile 0.01% Max 0.01%
multi-model graph-database document-database key-value database distributed-database arangodb nosql graphdb

arangodb's Introduction

[Logo: two stylized avocado halves and the product name]

ArangoDB

ArangoDB is a scalable graph database system to drive value from connected data, faster. Native graphs, an integrated search engine, and JSON support, via a single query language. ArangoDB runs on-prem, in the cloud – anywhere.

ArangoDB Cloud Service

The ArangoGraph Insights Platform is the simplest way to run ArangoDB. You can create deployments on all major cloud providers in many regions with ease.

Getting Started

For the impatient:

  • Test ArangoDB in the cloud with ArangoGraph for free.

  • Alternatively, download and install ArangoDB. Start the server arangod if the installer did not do it for you.

    Or start ArangoDB in a Docker container:

    docker run -e ARANGO_ROOT_PASSWORD=test123 -p 8529:8529 -d arangodb
    

    Then point your browser to http://127.0.0.1:8529/.
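For setups that should survive container restarts, the same container can be expressed as a minimal Docker Compose file. This is a sketch: the service and volume names are illustrative, and `/var/lib/arangodb3` is assumed to be the data directory used by the official image.

```yaml
# Minimal sketch of the docker run command above as a Compose file,
# with a named volume so data persists across container restarts.
services:
  arangodb:
    image: arangodb
    environment:
      ARANGO_ROOT_PASSWORD: test123
    ports:
      - "8529:8529"
    volumes:
      - arangodb_data:/var/lib/arangodb3
volumes:
  arangodb_data:
```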

Key Features of ArangoDB

Native Graph - Store both data and relationships, for faster queries even with multiple levels of joins and deeper insights that simply aren't possible with traditional relational and document database systems.

Document Store - Every node in your graph is a JSON document: flexible, extensible, and easily imported from your existing document database.

ArangoSearch - Natively integrated cross-platform indexing, text-search and ranking engine for information retrieval, optimized for speed and memory.

ArangoDB is available in a free and open-source Community Edition, as well as a commercial Enterprise Edition with additional features.

Community Edition features

  • Horizontal scalability: Seamlessly shard your data across multiple machines.
  • High availability and resilience: Replicate data to multiple cluster nodes, with automatic failover.
  • Flexible data modeling: Model your data as a combination of key-value pairs, documents, and graphs as you see fit for your application.
  • Work schema-free or use schema validation for data consistency. Store any type of data - date/time, geo-spatial, text, nested.
  • Powerful query language (AQL) to retrieve and modify data - from simple CRUD operations, through complex filters and aggregations, all the way to joins, graphs, and ranked full-text search.
  • Transactions: Run queries on multiple documents or collections with optional transactional consistency and isolation.
  • Data-centric microservices: Unify your data storage logic, reduce network overhead, and secure sensitive data with the ArangoDB Foxx JavaScript framework.
  • Fast access to your data: Fine-tune your queries with a variety of index types for optimal performance. ArangoDB is written in C++ and can handle even very large datasets efficiently.
  • Easy-to-use web interface and command-line tools for interacting with the server.

Enterprise Edition features

Focus on solving enterprise-scale problems for mission-critical workloads using secure graph data. The Enterprise Edition has all the features of the Community Edition and offers additional features for performance, compliance, and security, as well as further query capabilities.

  • Smartly shard and replicate graphs and datasets with features like EnterpriseGraphs, SmartGraphs, and SmartJoins for lightning fast query execution.
  • Combine the performance of a single server with the resilience of a cluster setup using OneShard deployments.
  • Increase fault tolerance with Datacenter-to-Datacenter Replication, and create incremental Hot Backups without downtime.
  • Enable highly secure work with Encryption 360, enhanced Data Masking, and detailed Auditing.
  • Perform parallel graph traversals.
  • Use ArangoSearch search highlighting and nested search for advanced information retrieval.

Latest Release

Packages for all supported platforms can be downloaded from https://www.arangodb.com/download/.

For what's new in ArangoDB, see the Release Notes in the Documentation.


arangodb's People

Contributors

apetenchea, cpjulia, danielhlarkin, dhly-etc, dothebart, dronplane, estebanlombeyda, fceller, gnusi, goedderz, graetzer, gschwab, hkernbach, jamesarango, jsteemann, kvahed, kvs85, maierlars, markuspf, mbkkt, mchacki, mpoeter, neunhoef, obiwahn, palashkaria, pluma, scottashton, simran-b, sleto-it, vasiliy-arangodb


arangodb's Issues

Reading CSV files

  1. Reading data:
    The two methods processCsvFile and processJsonFile are very useful. Something for writing may be needed later, too.
    Perhaps it makes sense to pack them into a module? In Node, a similar module is simply called fs, which is certainly not a bad name!

avocadodb --version shows incorrect version number

I just built from tag v0.2.0 (in order to verify the updated brew formula) and noticed that avocado --version shows 0.0.8 [1340M]. Also, when avocado is started, it prints the wrong version:

$ avocado
2012-03-01T16:31:09Z [8214] INFO no user init file \'/Users/basti/.avocado/avocado.conf\' found
2012-03-01T16:31:09Z [8214] INFO no system init file \'/usr/local/Cellar/avocadodb/0.2.0/etc/avocado.conf\' found
2012-03-01T16:31:09Z [8214] INFO using built-in JavaScript startup files
2012-03-01T16:31:09Z [8214] INFO using database action files at \'/usr/local/var/avocado/_ACTIONS\'
2012-03-01T16:31:09Z [8214] INFO using system action files at \'/usr/local/Cellar/avocadodb/0.2.0/share/avocado/js/system\'
2012-03-01T16:31:09Z [8214] INFO using JavaScript modules path \'/usr/local/Cellar/avocadodb/0.2.0/share/avocado/js/modules\'
2012-03-01T16:31:09Z [8214] INFO using JavaScript front-end files stored at \'/usr/local/Cellar/avocadodb/0.2.0/share/avocado/html/admin\'
2012-03-01T16:31:09Z [8214] INFO AvocadoDB (version 0.0.8 [1340M]) is ready for business
2012-03-01T16:31:09Z [8214] INFO HTTP client port: 127.0.0.1:8529
2012-03-01T16:31:09Z [8214] INFO HTTP admin port: 127.0.0.1:8530
2012-03-01T16:31:09Z [8214] INFO Have Fun!

HTTP Interface

  1. HTTP interface:
    Neither the first wiki article nor the tool itself gives any information about which port the HTTP server runs on.
    Perhaps the tool could print something on startup like:
    Starting AvocadoDB on Port 8529…
    Have Fun!
    (I found out the port via your second article on REST.) I think the REST interface is good.
    In the REST interface I would also point to the actions, which can after all be invoked via REST.

JSON.stringify

ECMAScript provides a standard JSON.stringify method for serializing values. Since this method also exists in V8, perhaps it could be used here?
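This is easy to verify in any V8-based environment: JSON.stringify and JSON.parse are built in, so no custom serializer is required.

```javascript
// JSON.stringify/JSON.parse are built into V8 (and therefore Node),
// so they are available without any library.
const doc = { name: 'avocado', tags: ['fruit', 'db'], count: 2 };

const json = JSON.stringify(doc);
console.log(json); // {"name":"avocado","tags":["fruit","db"],"count":2}

const back = JSON.parse(json);
console.log(back.count); // 2
```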

--server.admin-port and --server.http-port issues

The server has an option --server.admin-port.
If no value is set for this, a default value is used.
There currently is no way to turn off the admin port completely. There is also no way to use the same port for --server.http-port and --server.admin-port. If the following startup option is used, the server will try to bind to the same port twice:

$ ./avocado --server.http-port 9000 --server.admin-port 9000
2012-03-21T16:13:41Z [26997] ERROR bind failed with 98 (Address already in use)
2012-03-21T16:13:41Z [26997] INFO bind to address '::' port '9000'
2012-03-21T16:13:41Z [26997] ERROR bind failed with 98 (Address already in use)
2012-03-21T16:13:42Z [26997] INFO bind to address '0.0.0.0' port '9000'
2012-03-21T16:13:42Z [26997] ERROR bind failed with 98 (Address already in use)
2012-03-21T16:13:42Z [26997] INFO bind to address '::' port '9000'
...

Furthermore, if only a port number is specified for --server.http-port, then the server will issue an error on startup:

$ ./avocado --server.http-port 9000
2012-03-21T16:21:08Z [27173] ERROR bind failed with 98 (Address already in use)

The second bind does fail, but the server still starts.

Actions

  1. Actions:
    One thing I did not quite understand: where do I define the actions? Do I store them in the database? Do I then have to
    use the console for that? Or can I put a JS file somewhere and have it loaded?

variable potentially undefined

When compiling AvocadoDB I got compiler warnings about a variable "e" being potentially undefined.
Whether it is actually initialized or not depends on several things, so all I did was flag the relevant parts in the code with a comment:

V8/v8-conv.cpp:665: // TODO FIXME: my compiler complains about e being potentially undefined. what to do?
ShapedJson/shaped-json.c:866: // TODO: FIXME my compiler complains about e being potentially undefined. what to do?

fix TOC for "over the wire protocol" in wiki

The table of contents (TOC) for the "over the wire protocol" on the Reference Manual page differs from the TOC on the "over the wire protocol" page:

https://github.com/triAGENS/AvocadoDB/wiki/RefManual
https://github.com/triAGENS/AvocadoDB/wiki/OTWP

The TOC on the latter page contains more entries than are shown on the Reference manual page. The TOC on the reference page seems to be created manually and is therefore out of sync. I suggest removing the sub items from the TOC on the reference manual page so nothing gets out of sync again.

call munmap more frequently?

I cannot run the AvocadoDB unittests on my 32 bit laptop.
At some point during the test execution I am encountering mmap errors. mmap fails when creating datafiles and the database reports "cannot allocate memory".

I have marked all mmap and munmap calls to report the cumulated size of mapped memory.
Results below for running "make unittests":

|| SHELL SERVER TESTS ||

./avocado "/tmp/vocdir.28161" --unit-tests ./js/server/tests/shell-document.js --unit-tests ./js/server/tests/shell-collection.js --unit-tests ./js/server/tests/aql-simple.js --unit-tests ./js/server/tests/aql-keywords.js --unit-tests ./js/server/tests/aql-bind.js --unit-tests ./js/server/tests/aql-joins.js --unit-tests ./js/server/tests/aql-operators.js --unit-tests ./js/server/tests/aql-indexes.js
2012-04-11T15:40:10Z [28222] INFO no user init file '/home/steemann/.avocado/avocado.conf' found
2012-04-11T15:40:10Z [28222] INFO no system init file '/usr/local/etc/avocado.conf' found
2012-04-11T15:40:10Z [28222] INFO using built-in JavaScript startup files
2012-04-11T15:40:10Z [28222] INFO using database action files at '/tmp/vocdir.28161/_ACTIONS'
2012-04-11T15:40:10Z [28222] INFO using system action files at '/usr/local/share/avocado/js/actions/system'
2012-04-11T15:40:10Z [28222] INFO using JavaScript modules path '/usr/local/share/avocado/js/server/modules;/usr/local/share/avocado/js/common/modules'
AvocadoDB shell [V8 version 3.9.4, DB version 0.3.7]
2012-04-11T15:40:10Z [28222] INFO Running collectionDocumentSuiteErrorHandling
2012-04-11T15:40:10Z [28222] INFO 3 tests found

MMAP: total mmaped size is now: 2465792

MMAP: total mmaped size is now: 36020224

MMAP: total mmaped size is now: 69574656
2012-04-11T15:40:10Z [28222] INFO [PASSED] testErrorHandlingBadHandle

MMAP: total mmaped size is now: 72040448

MMAP: total mmaped size is now: 105594880

MMAP: total mmaped size is now: 139149312
2012-04-11T15:40:10Z [28222] INFO [PASSED] testErrorHandlingUnknownDocument

MMAP: total mmaped size is now: 141615104

MMAP: total mmaped size is now: 175169536

MMAP: total mmaped size is now: 208723968
2012-04-11T15:40:10Z [28222] INFO [PASSED] testErrorHandlingCrossCollection
2012-04-11T15:40:10Z [28222] INFO 3 tests passed
2012-04-11T15:40:10Z [28222] INFO 0 tests failed
2012-04-11T15:40:10Z [28222] INFO 392 milliseconds elapsed
2012-04-11T15:40:10Z [28222] INFO Running collectionDocumentSuiteReadDocument
2012-04-11T15:40:10Z [28222] INFO 5 tests found

MMAP: total mmaped size is now: 211189760

MMAP: total mmaped size is now: 244744192

MMAP: total mmaped size is now: 278298624
2012-04-11T15:40:10Z [28222] INFO [PASSED] testSaveDocument

MMAP: total mmaped size is now: 280764416

MMAP: total mmaped size is now: 314318848

MMAP: total mmaped size is now: 347873280
2012-04-11T15:40:10Z [28222] INFO [PASSED] testReadDocument

MMAP: total mmaped size is now: 350339072

MMAP: total mmaped size is now: 383893504

MMAP: total mmaped size is now: 417447936
2012-04-11T15:40:11Z [28222] INFO [PASSED] testReadDocumentConflict

MMAP: total mmaped size is now: 419913728

MMAP: total mmaped size is now: 453468160

MMAP: total mmaped size is now: 487022592
2012-04-11T15:40:11Z [28222] INFO [PASSED] testUpdateDocument

MMAP: total mmaped size is now: 489488384

MMAP: total mmaped size is now: 523042816

MMAP: total mmaped size is now: 556597248
2012-04-11T15:40:11Z [28222] INFO [PASSED] testDeleteDocument
2012-04-11T15:40:11Z [28222] INFO 5 tests passed
2012-04-11T15:40:11Z [28222] INFO 0 tests failed
2012-04-11T15:40:11Z [28222] INFO 911 milliseconds elapsed
2012-04-11T15:40:11Z [28222] INFO Running documentSuiteErrorHandling
2012-04-11T15:40:11Z [28222] INFO 2 tests found
2012-04-11T15:40:11Z [28222] INFO [PASSED] testErrorHandlingBadHandle

MMAP: total mmaped size is now: 559063040

MMAP: total mmaped size is now: 592617472

MMAP: total mmaped size is now: 626171904
2012-04-11T15:40:11Z [28222] INFO [PASSED] testErrorHandlingUnknownDocument
2012-04-11T15:40:11Z [28222] INFO 2 tests passed
2012-04-11T15:40:11Z [28222] INFO 0 tests failed
2012-04-11T15:40:11Z [28222] INFO 77 milliseconds elapsed
2012-04-11T15:40:11Z [28222] INFO Running documentSuiteReadDocument
2012-04-11T15:40:11Z [28222] INFO 4 tests found

MMAP: total mmaped size is now: 628637696

MMAP: total mmaped size is now: 662192128

MMAP: total mmaped size is now: 695746560
2012-04-11T15:40:11Z [28222] INFO [PASSED] testReadDocument

MMAP: total mmaped size is now: 698212352

MMAP: total mmaped size is now: 731766784

MMAP: total mmaped size is now: 765321216
2012-04-11T15:40:11Z [28222] INFO [PASSED] testReadDocumentConflict

MMAP: total mmaped size is now: 767787008

MMAP: total mmaped size is now: 801341440

MMAP: total mmaped size is now: 834895872
2012-04-11T15:40:12Z [28222] INFO [PASSED] testUpdateDocument

MMAP: total mmaped size is now: 837361664

MMAP: total mmaped size is now: 870916096

MMAP: total mmaped size is now: 904470528
2012-04-11T15:40:12Z [28222] INFO [PASSED] testDeleteDocument
2012-04-11T15:40:12Z [28222] INFO 4 tests passed
2012-04-11T15:40:12Z [28222] INFO 0 tests failed
2012-04-11T15:40:12Z [28222] INFO 697 milliseconds elapsed
2012-04-11T15:40:12Z [28222] INFO 14 total, 14 passed, 0 failed, 2077 ms
2012-04-11T15:40:12Z [28222] INFO Running collectionSuiteErrorHandling
2012-04-11T15:40:12Z [28222] INFO 6 tests found
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameUnderscore
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameEmpty
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameNumber
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameUnderscoreShortCut
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameEmptyShortCut
2012-04-11T15:40:12Z [28222] INFO [PASSED] testErrorHandlingBadNameNumberShortCut
2012-04-11T15:40:12Z [28222] INFO 6 tests passed
2012-04-11T15:40:12Z [28222] INFO 0 tests failed
2012-04-11T15:40:12Z [28222] INFO 1 millisecond elapsed
2012-04-11T15:40:12Z [28222] INFO Running collectionSuite
2012-04-11T15:40:12Z [28222] INFO 18 tests found

MMAP: total mmaped size is now: 906936320

MMAP: total mmaped size is now: 940490752

MMAP: total mmaped size is now: 974045184
2012-04-11T15:40:12Z [28222] INFO [PASSED] testReadingByName

MMAP: total mmaped size is now: 976510976

MMAP: total mmaped size is now: 1010065408

MMAP: total mmaped size is now: 1043619840
2012-04-11T15:40:12Z [28222] INFO [PASSED] testReadingByIdentifier

MMAP: total mmaped size is now: 1046085632

MMAP: total mmaped size is now: 1079640064

MMAP: total mmaped size is now: 1113194496
2012-04-11T15:40:12Z [28222] INFO [PASSED] testReadingByNameShortCut

MMAP: total mmaped size is now: 1115660288

MMAP: total mmaped size is now: 1149214720

MMAP: total mmaped size is now: 1182769152
2012-04-11T15:40:12Z [28222] INFO [PASSED] testReadingAll

MMAP: total mmaped size is now: 1185234944

MMAP: total mmaped size is now: 1218789376

MMAP: total mmaped size is now: 1252343808
2012-04-11T15:40:12Z [28222] INFO [PASSED] testCreatingDefaults

MMAP: total mmaped size is now: 1254809600

MMAP: total mmaped size is now: 1255858176

MMAP: total mmaped size is now: 1256906752
2012-04-11T15:40:12Z [28222] INFO [PASSED] testCreatingProperties

MMAP: total mmaped size is now: 1259372544

MMAP: total mmaped size is now: 1292926976

MMAP: total mmaped size is now: 1326481408
2012-04-11T15:40:12Z [28222] INFO [PASSED] testDroppingNewBornDB

MMAP: total mmaped size is now: 1328947200

MMAP: total mmaped size is now: 1362501632

MMAP: total mmaped size is now: 1396056064
2012-04-11T15:40:12Z [28222] INFO [PASSED] testDroppingLoadedDB

MMAP: total mmaped size is now: 1398521856

MMAP: total mmaped size is now: 1432076288

MMAP: total mmaped size is now: 1465630720
2012-04-11T15:40:13Z [28222] INFO [PASSED] testDroppingUnloadedDB

MMAP: total mmaped size is now: 1468096512

MMAP: total mmaped size is now: 1501650944

MMAP: total mmaped size is now: 1535205376
2012-04-11T15:40:13Z [28222] INFO [PASSED] testTruncatingNewBornDB

MMAP: total mmaped size is now: 1537671168

MMAP: total mmaped size is now: 1571225600
2012-04-11T15:40:13Z [28222] ERROR cannot memory map file '/tmp/vocdir.28161/collection-32699786/journal-33748362.db': 'Cannot allocate memory'
2012-04-11T15:40:13Z [28222] ERROR cannot create new journal in '/tmp/vocdir.28161/collection-32699786/journal-33748362.db'
(same error repeats here...)

As can be seen above, the first few tests (shell-document.js and shell-collection.js) already account for 1.5 GB of memory-mapped file data. This seems to be the maximum that my computer supports.
I think 1.5 GB of free RAM should be sufficient to run a few tests.

I think the problem is that the tests create new collections in the setUp routine. When creating a new collection, mmap is called for it. This is fine; however, munmap is not called when the collections are dropped in the tests' tearDown routine.
That means the mmap'ed total will only increase, at least while the tests are running. It would be helpful if munmap could be called when a collection is explicitly dropped (or at least shortly after) to free resources.

Segmentation fault on the Mac

Unfortunately, the latest version segfaults for me. I pulled, then ran make clean, autoconf, ./configure..., make. All without problems. But when I then start with ./avocado I get:
"Segmentation fault: 11"
I then started it with valgrind. Surprisingly, the server does come up there. But the log contains a huge number of errors. Do you know what that means?
https://gist.github.com/a6b7b8bd5aea05541c17

properties for edges

For edges, the ".properties" method that exists for vertices would also be very useful!

input / output

  1. Simple input/output:
    You introduced "print" and "output" for Avocado. I would rather fall back on the standard in the JS world
    that browsers and Node support: console. You can find more information here:
    http://nodejs.org/docs/v0.6.4/api/stdio.html
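A tiny sketch of the suggested aliasing: print can simply delegate to a console-style logger. Here a stub logger stands in for the real console so the output can be captured and inspected; this is illustrative, not AvocadoDB code.

```javascript
// Sketch: route a print-style function to a console-compatible logger,
// as the issue suggests. The stub logger records each line instead of
// writing to stdout, so the result can be checked.
const lines = [];
const logger = { log: (...args) => lines.push(args.join(' ')) };

const print = logger.log.bind(logger); // print(...) behaves like logger.log(...)
print('Have', 'Fun!');

console.log(lines[0]); // Have Fun!
```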

/_admin/log?search=xyz returns wrong results

  1. http://192.168.173.15:10000/_admin/log?search=no

{"lid":[1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27],"level":[3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 2, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2],"timestamp":[1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334829218, 1334835212, 1334907139, 1334907139, 1334910687, 1334910688, 1334910715, 1334910716, 1334910716, 1334910716, 1334910717, 1334910731, 1334910774, 1334913849, 1334918332, 1334918333, 1334918818, 1334921071],"text":["2012-04-19T09:53:38Z [7933] INFO no user init file '/home/wesselinux/hkernbach/.avocado/avocado.conf' found", "2012-04-19T09:53:38Z [7933] INFO no system init file '/usr/local/etc/avocado.conf' found", "2012-04-19T09:53:38Z [7933] INFO using built-in JavaScript startup files", "2012-04-19T09:53:38Z [7933] INFO using database action files at '/tmp/vocbase/_ACTIONS'", "2012-04-19T09:53:38Z [7933] INFO using system action files at '/usr/local/share/avocado/js/actions/system'", "2012-04-19T09:53:38Z [7933] INFO using JavaScript modules path '/usr/local/share/avocado/js/server/modules;/usr/local/share/avocado/js/common/modules'", "2012-04-19T09:53:38Z [7933] INFO using JavaScript front-end files stored at './html/admin'", "2012-04-19T09:53:38Z [7933] INFO AvocadoDB (version 0.3.10) is ready for business", "2012-04-19T09:53:38Z [7933] INFO HTTP client/admin port: [::]:10000", "2012-04-19T09:53:38Z [7933] INFO Have Fun!", "2012-04-19T11:33:32Z [7933] WARNING file './html/admin/log' not found", "2012-04-20T07:32:19Z [7933] ERROR cannot save collection parameter '/tmp/vocbase/collection-13548657/SHAPES': 'illegal parameter file'", "2012-04-20T07:32:19Z [7933] ERROR cannot open shapes collection", "2012-04-20T08:31:27Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:28Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:55Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:56Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:56Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:56Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:31:57Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:32:11Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T08:32:54Z [7933] WARNING file './html/admin/media/icons' contains illegal character", "2012-04-20T09:24:09Z [7933] WARNING file './html/admin/media/icons/connected_icon16.png' not found", "2012-04-20T10:38:52Z [7933] WARNING file './html/admin/media/icons/round_save_icon16.png' not found", "2012-04-20T10:38:53Z [7933] WARNING file './html/admin/media/icons/round_save_icon16.png' not found", "2012-04-20T10:46:58Z [7933] WARNING file './html/admin/media/icons/off_icon16.png' not found", "2012-04-20T11:24:31Z [7933] WARNING file './html/admin/media/icons' contains illegal character"],"totalAmount":27}

  2. http://192.168.173.15:10000/_admin/log?search=javascript or JavaScript
    {"lid":[],"level":[],"timestamp":[],"text":[],"totalAmount":0}
    No results, although matching entries exist.

  3. http://192.168.173.15:10000/_admin/log?search=cannot
{"lid":[1, 2],"level":[3, 3],"timestamp":[1334829218, 1334829218],"text":["2012-04-19T09:53:38Z [7933] INFO no user init file '/home/wesselinux/hkernbach/.avocado/avocado.conf' found", "2012-04-19T09:53:38Z [7933] INFO no system init file '/usr/local/etc/avocado.conf' found"],"totalAmount":2}
A completely wrong result here...

http interface doesn't handle escaped quotes

POST /document?collection=location&createCollection=false HTTP/1.1
Host: localhost
Content-Type: application/json
Content-Length: 18

{"hans":"kanns""}

HTTP/1.1 400 Bad Request
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 73

{"error":true,"code":400,"errorNum":600,"errorMessage":"expecting comma"}
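The 400 response is defensible here: the request body is not valid JSON, because the inner quote is doubled rather than backslash-escaped. A quick check with JavaScript's built-in JSON functions (illustrative, not AvocadoDB code):

```javascript
// The body from the request above is malformed: the inner quote is
// doubled (CSV-style) instead of backslash-escaped (JSON-style).
let parseError = null;
try {
  JSON.parse('{"hans":"kanns""}');
} catch (err) {
  parseError = err;                // SyntaxError: the parser rejects it
}
console.log(parseError !== null);  // true

// The correct JSON encoding of the intended value:
const valid = JSON.stringify({ hans: 'kanns"' });
console.log(valid); // {"hans":"kanns\""}
```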

Filtern

It should be possible to filter query results (such as all outgoing edges).
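On the client side, such filtering amounts to applying a predicate to the result set; a sketch with illustrative stand-in edge objects:

```javascript
// Sketch: filtering a set of outgoing edges with a plain predicate,
// as the issue requests. The edge objects are illustrative stand-ins,
// not AvocadoDB's actual result format.
const outEdges = [
  { to: 'b', label: 'knows', weight: 1 },
  { to: 'c', label: 'likes', weight: 7 },
  { to: 'd', label: 'knows', weight: 3 },
];

const filtered = outEdges.filter((e) => e.label === 'knows' && e.weight > 2);
console.log(filtered); // [ { to: 'd', label: 'knows', weight: 3 } ]
```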

Struktur der Dokumentation

  1. Documentation structure:
    I would perhaps use the following structure:
    1. Basics
      1.1 Starting the AvocadoDB
      1.2 AvocadoScript
    2. Advanced Topics
      2.1 The Rest Interface
      2.2 Making Queries with the Rest Interface (what is currently called "REST Interface for Documents")
      2.3 Actions (what is currently called "First Steps")
        2.3.1 Defining an Action
        2.3.2 Query Building Functions
      2.4 Graphs, Vertices, and Edges (what is currently called "Function Index")
      2.5 Command Line Options for Logging

Property access for edges and vertices

Can properties also be accessed with the bracket syntax that is common in JavaScript?
That is, instead of:
vertex.getProperty("hello");
also:
vertex["hello"];
And the same for the property setters? This is how plain JavaScript objects behave, for example:
a = {'hello': 'huhu'}; print(a['hello']);
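Plain JavaScript objects do behave exactly this way: dot access, bracket access, and the assignment forms are interchangeable. So if vertex properties were exposed as ordinary object properties, bracket access would work out of the box. The vertex object below is a stand-in, not AvocadoDB's actual API.

```javascript
// Plain JS objects support both access syntaxes interchangeably; a
// vertex exposing its properties as a plain object gets bracket
// access "for free". This vertex is an illustrative stand-in.
const vertex = { hello: 'huhu' };

console.log(vertex.hello);    // 'huhu'  (dot syntax)
console.log(vertex['hello']); // 'huhu'  (bracket syntax)

vertex['answer'] = 42;        // setter via bracket syntax
console.log(vertex.answer);   // 42
```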

Do not use `delete` as a method name

To delete a document from a collection in JavaScript, AvocadoDB provides the method delete. That is not a good idea: delete is a reserved word in JavaScript and should not be used as an identifier. I would suggest deleteDocument.
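A sketch of the suggested rename: the delete operator itself remains perfectly usable inside the method body; only the method identifier changes. The collection object here is an illustrative stand-in, not AvocadoDB's actual API.

```javascript
// "delete" is a reserved word: it cannot be used as a variable name,
// and ES3-era engines also rejected it as a property name. A more
// descriptive method name sidesteps the problem entirely.
const collection = {
  docs: { '1809181757/1811904640': { hans: 'kanns' } },
  deleteDocument(id) {            // suggested rename from the issue
    return delete this.docs[id];  // the delete *operator* is fine here
  },
};

console.log(collection.deleteDocument('1809181757/1811904640')); // true
console.log(Object.keys(collection.docs).length); // 0
```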

HTTP interface returns invalid JSON (unescaped quote)

The HTTP request side was fixed in issue #35, but the HTTP response is not fixed.

$ curl --data @- -X POST --dump - "http://127.0.0.1:8529/document?collection=unit_test_issue35"

{"xx": "xx\""}
HTTP/1.1 202 Accepted
etag: "1811904640"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
location: /document/1809181757/1811904640
content-length: 63

{"error":false,"_id":"1809181757/1811904640","_rev":1811904640}



$ curl -X GET --dump - http://127.0.0.1:8529/document/1809181757/1811904640

HTTP/1.1 200 OK
etag: "1811904640"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 60

{"xx":"xx"","_id":"1809181757/1811904640","_rev":1811904640}
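A standards-compliant encoder escapes the quote on the way out, so the stored value round-trips; a sketch using JavaScript's built-in JSON functions (the _id/_rev values are copied from the issue above, and this is illustrative, not AvocadoDB's serializer):

```javascript
// The stored value is the string xx" — serializing it with a
// compliant JSON encoder escapes the quote, so the response parses.
const doc = { xx: 'xx"', _id: '1809181757/1811904640', _rev: 1811904640 };

const body = JSON.stringify(doc);
console.log(body);
// {"xx":"xx\"","_id":"1809181757/1811904640","_rev":1811904640}

// The broken response shown in the issue does not parse:
let failed = false;
try {
  JSON.parse('{"xx":"xx"","_id":"1809181757/1811904640","_rev":1811904640}');
} catch (err) {
  failed = true;
}
console.log(failed); // true
```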

Purge API call to remove all backups

It is quite surprising how much disk space Avocado eats. I have run out of disk space twice on my 128 GB MacBook Air. That's not cool, especially because one run of my test suite eats ~5 GB of disk space - it performs 20-30 creations/deletions of empty collections.

Could you create some kind of "purge" API call which removes all backups?

I also think it would be very useful to add a note to the documentation recommending that this API call be executed every 24 hours, and to highlight it prominently.

Backups are important, but it is confusing when the database fills the disk so quickly.

Command Line Interface

  1. The command-line interface:
    I think the command can simply be called avocado; that is easier to remember! There are currently
    three parameters that can be passed to the command. I would perhaps structure them like this:

avocado [--interactive] [--log LEVEL] [DATABASEFILE]

I find interactive fits better than shell or debug, because you are opening an interactive session with a
database. .log.level is hard to remember; you would always have to look it up.
Finally, regarding the database file: a default location is a good idea, but I think you usually want to
start a specific database (or inspect it with --interactive), so you will mostly pass a database
file, which is why it should perhaps be the simplest parameter to pass.

Another useful parameter might be --port, so that you can also start two Avocado instances?

I would also print this information briefly with avocado --help.
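The proposed invocation could be parsed in a few lines; a hypothetical sketch purely to illustrate the suggested interface, not AvocadoDB's actual option handling:

```javascript
// Tiny argument parser for the proposed interface:
//   avocado [--interactive] [--log LEVEL] [DATABASEFILE]
// Purely illustrative; defaults are assumptions, not real behavior.
function parseArgs(argv) {
  const opts = { interactive: false, log: 'info', databaseFile: null };
  for (let i = 0; i < argv.length; i++) {
    if (argv[i] === '--interactive') opts.interactive = true;
    else if (argv[i] === '--log') opts.log = argv[++i]; // consume LEVEL
    else opts.databaseFile = argv[i]; // bare argument = database file
  }
  return opts;
}

const opts = parseArgs(['--interactive', '--log', 'debug', '/tmp/mydb']);
console.log(opts);
// { interactive: true, log: 'debug', databaseFile: '/tmp/mydb' }
```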

binary: no symbolic link to PATH on install (OS X)

When I brew install AvocadoDB from HEAD on Mac OS X 10.7, avocsh gets linked to /usr/local/bin so it is accessible from the command line.

However, this is not the case for the database binary itself (avocado). Is this intended?

I ran this line to fix it to my liking:
$ ln -s /usr/local/Cellar/avocadodb/HEAD/sbin/avocado /usr/local/bin/avocado

Did you leave that out on purpose, or is this an error that should be fixed? If this was intended, it would be a good idea to update the README or installation instructions accordingly, so new users don't get lost trying to find the binary.

addOutEdge and addInEdge

Two more methods would be very useful for vertices:
vertex.addOutEdge(v1, "label");
and
vertex.addInEdge(v1, "label");
Sure, you can also do this on the graph object, but this way it is certainly clearer to read in many places!
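As a sketch, such vertex-level convenience methods could simply delegate to the graph object. The Graph and Vertex shapes below are illustrative stand-ins based on the issue's examples, not AvocadoDB's actual API.

```javascript
// Hypothetical sketch: vertex-level convenience wrappers that delegate
// to a graph-level addEdge call, as the issue suggests.
class Graph {
  constructor() { this.edges = []; }
  addEdge(from, to, label) {
    this.edges.push({ from: from.id, to: to.id, label });
  }
}

class Vertex {
  constructor(graph, id) { this.graph = graph; this.id = id; }
  addOutEdge(target, label) { this.graph.addEdge(this, target, label); }
  addInEdge(source, label) { this.graph.addEdge(source, this, label); }
}

const g = new Graph();
const a = new Vertex(g, 'a');
const b = new Vertex(g, 'b');
a.addOutEdge(b, 'knows'); // edge a -> b
a.addInEdge(b, 'likes');  // edge b -> a
console.log(g.edges);
// [ { from: 'a', to: 'b', label: 'knows' },
//   { from: 'b', to: 'a', label: 'likes' } ]
```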

when mmap fails, a few thousand journal files may be created

For some reason mmap() temporarily fails on my computer when running the unittests.
When it fails, AvocadoDB seems to try again very often. Here's an output from the log file:

2012-04-11T13:26:37Z [6553] ERROR cannot memory map file '/tmp/vocdir.6491/collection-35539262/journal-36718910.db': 'Cannot allocate memory'
2012-04-11T13:26:37Z [6553] ERROR cannot create new journal in '/tmp/vocdir.6491/collection-35539262/journal-36718910.db'
2012-04-11T13:26:37Z [6553] ERROR cannot memory map file '/tmp/vocdir.6491/collection-35539262/journal-36784446.db': 'Cannot allocate memory'
2012-04-11T13:26:37Z [6553] ERROR cannot create new journal in '/tmp/vocdir.6491/collection-35539262/journal-36784446.db'
2012-04-11T13:26:37Z [6553] ERROR cannot memory map file '/tmp/vocdir.6491/collection-35539262/journal-36849982.db': 'Cannot allocate memory'
2012-04-11T13:26:37Z [6553] ERROR cannot create new journal in '/tmp/vocdir.6491/collection-35539262/journal-36849982.db'
... (goes on and on and on)

It seems that this loop creates a lot of files in the database directory without cleaning up the old ones.
After I killed the above process, I checked which files were left:

steemann@linux-stj:~$ DIR="/tmp/vocdir.6491"; for i in $(find "$DIR" -name "collection*" -type d); do echo "$i"; find "$i" -name "journ*" -type f | wc -l; done
/tmp/vocdir.6491/collection-7424318
2
/tmp/vocdir.6491/collection-291457342
2839
/tmp/vocdir.6491/collection-121719102
1251
/tmp/vocdir.6491/collection-30820670
2
/tmp/vocdir.6491/collection-15354174
2
/tmp/vocdir.6491/collection-13781310
2
/tmp/vocdir.6491/collection-5851454
2
/tmp/vocdir.6491/collection-291522878
1925
/tmp/vocdir.6491/collection-35539262
455
/tmp/vocdir.6491/collection-121850174
2
/tmp/vocdir.6491/collection-9062718
2
/tmp/vocdir.6491/collection-115886398
2
/tmp/vocdir.6491/collection-115755326
2
/tmp/vocdir.6491/collection-26102078
2
/tmp/vocdir.6491/collection-111036734
2
/tmp/vocdir.6491/collection-20334910
2
/tmp/vocdir.6491/collection-291391806
1074
/tmp/vocdir.6491/collection-102451518
2
/tmp/vocdir.6491/collection-16992574
2

It had created some 7,500 files before I killed it. This should be avoided, because otherwise the file system may become slow or run out of space.
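One way to avoid this failure mode would be to bound the retries and remove each failed file before giving up. A hypothetical sketch of that policy (not the actual AvocadoDB code; all names are assumptions):

```javascript
// Hypothetical sketch: cap the number of journal-creation attempts and
// clean up each failed file, instead of looping indefinitely and leaving
// a fresh journal file behind per attempt.
function createJournalBounded(tryCreate, removeFile, maxAttempts) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const result = tryCreate(attempt); // e.g. create the file, then mmap it
    if (result.ok) {
      return result;
    }
    removeFile(result.filename); // don't leave the unusable journal around
  }
  return { ok: false, error: "giving up after " + maxAttempts + " attempts" };
}
```

With a bound like this, a persistent mmap failure produces one clear error instead of thousands of orphaned journal files.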

unique constraint fails every second time

I've encountered a strange bug and have no idea where it comes from:

AvocadoDB shell [V8 version 3.8.5, DB version 0.3.5]
avocado> db.examples.ensureUniqueConstraint("a")
{ "id" : 1194947, "unique" : true, "type" : "hash", "fields" : ["a"], "isNewlyCreated" : true }
avocado> x = db.examples.save({ a : 1, b : 1 })
80835/1653699
avocado> db.examples.document(x)
{ "_id" : "80835/1653699", "_rev" : 1653699, "a" : 1, "b" : 1 }
avocado> x = db.examples.save({ a : 1, b : 2 })
JavaScript exception in file '(avocado)' at 1,17: cannot save document: unique constraint violated
!x = db.examples.save({ a : 1, b : 2 })
! ^
avocado> db.examples.document(x)
{ "_id" : "80835/1653699", "_rev" : 1653699, "a" : 1, "b" : 1 }
avocado> db.examples.toArray()
[{ "_id" : "80835/1653699", "_rev" : 1653699, "a" : 1, "b" : 1 }]
avocado> x = db.examples.save({ a : 1, b : 2 })
80835/1981379
avocado> db.examples.toArray()
[{ "_id" : "80835/1653699", "_rev" : 1653699, "a" : 1, "b" : 1 }, { "_id" : "80835/1981379", "_rev" : 1981379, "a" : 1, "b" : 2 }]
avocado> bye...

The first save creates a document => fine.
The second save gives a constraint violation => fine.
The third save creates a document without a violation => I have no idea why.
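For reference, the expected behavior can be modeled with a tiny in-memory unique hash index (a sketch of the expected semantics, not the actual index code): every save with a duplicate value of `a` should fail, not just every second one.

```javascript
// Minimal in-memory model of a unique hash index on one field --
// an illustration of the expected behavior, not the actual index code.
function UniqueIndex(field) {
  this.field = field;
  this.seen = new Map(); // indexed value -> document id
}
UniqueIndex.prototype.insert = function (id, doc) {
  const key = doc[this.field];
  if (this.seen.has(key)) {
    // must fail on *every* duplicate, not just every second one
    throw new Error("cannot save document: unique constraint violated");
  }
  this.seen.set(key, id);
};
```

Under this model, the third save in the transcript should have thrown exactly like the second one did.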

journal size

We need a stress test for the compactor and the journal size.
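A toy model of what such a stress test could exercise (the `Store`, its rotation rule, and the compaction policy below are all assumptions for illustration): write enough documents to force many journal rotations, then check that the compactor keeps the number of journal files bounded.

```javascript
// Toy journal store: journals rotate when full, and a trivial "compactor"
// caps how many journals are kept. Names and policy are assumptions.
function Store(journalCapacity, maxJournals) {
  this.journalCapacity = journalCapacity;
  this.maxJournals = maxJournals;
  this.journals = [[]]; // each journal is just an array of documents
}
Store.prototype.save = function (doc) {
  let current = this.journals[this.journals.length - 1];
  if (current.length >= this.journalCapacity) {
    this.journals.push([]); // rotate to a new journal
    if (this.journals.length > this.maxJournals) {
      this.journals.shift(); // "compacted" away (toy behavior)
    }
    current = this.journals[this.journals.length - 1];
  }
  current.push(doc);
};

// Stress driver: force many rotations, then report how many journals remain.
function stressJournals(store, docCount) {
  for (let i = 0; i < docCount; i++) {
    store.save({ i: i, payload: "x".repeat(100) });
  }
  return store.journals.length;
}
```

A real test would additionally vary the journal size and document sizes, and assert on the on-disk file count after compaction.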

error occurs in ./configure

I failed to execute the configure script.

OS : CentOS release 5.8
gcc version 4.1.2 20080704 (Red Hat 4.1.2-52)

The following is the config.log:

This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.

It was created by triAGENS AvocadoDB configure 0.3.7, which was
generated by GNU Autoconf 2.68. Invocation command line was

$ ./configure

## --------- ##
## Platform. ##
## --------- ##

hostname = lunes
uname -m = x86_64
uname -r = 2.6.18-194.26.1.el5
uname -s = Linux
uname -v = #1 SMP Tue Nov 9 12:54:20 EST 2010

/usr/bin/uname -p = unknown
/bin/uname -X = unknown

/bin/arch = x86_64
/usr/bin/arch -k = unknown
/usr/convex/getsysinfo = unknown
/usr/bin/hostinfo = unknown
/bin/machine = unknown
/usr/bin/oslevel = unknown
/bin/universe = unknown

PATH: /usr/kerberos/bin
PATH: /usr/local/bin
PATH: /bin
PATH: /usr/bin
PATH: /home/siggy/bin
PATH: /usr/local/java/bin
PATH: /usr/local/ant/bin
PATH: /home/siggy/play-2.0
PATH: /usr/local/scala/bin
PATH: /usr/local/pig/bin
PATH: /usr/local/hbase/bin
PATH: /usr/local/zookeeper/bin
PATH: /usr/local/ec2-api-tools/bin
PATH: /usr/local/hadoop/src/contrib/ec2/bin
PATH: /usr/local/hive/bin
PATH: /usr/local/jaql/bin
PATH: /home/siggy/ec2/RDSCli-1.3.003/bin
PATH: /usr/local/src/jython
PATH: /usr/local/src/IAMCli-1.2.0/bin
PATH: /usr/local/apache-maven/bin
PATH: /usr/local/storm/bin
PATH: /usr/local/mahout/bin

## ----------- ##
## Core tests. ##
## ----------- ##

configure:2691: ................................................................................
configure:2693: CHECKING BUILD SYSTEM
configure:2695: ................................................................................
configure:2702: checking build system type
configure:2716: result: x86_64-unknown-linux-gnu
configure:2736: checking host system type
configure:2749: result: x86_64-unknown-linux-gnu
configure:2769: checking target system type
configure:2782: result: x86_64-unknown-linux-gnu
configure:2843: checking for a BSD-compatible install
configure:2911: result: /usr/bin/install -c
configure:2922: checking whether build environment is sane
configure:2972: result: yes
configure:3113: checking for a thread-safe mkdir -p
configure:3152: result: /bin/mkdir -p
configure:3165: checking for gawk
configure:3181: found /bin/gawk
configure:3192: result: gawk
configure:3203: checking whether make sets $(MAKE)
configure:3225: result: yes
configure:3319: ................................................................................
configure:3321: CHECKING C/C++ COMPILER AND LINKER
configure:3323: ................................................................................
configure:3356: checking for style of include used by make
configure:3384: result: GNU
configure:3464: checking for g++
configure:3480: found /usr/bin/g++
configure:3491: result: g++
configure:3518: checking for C++ compiler version
configure:3527: g++ --version >&5
g++ (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

configure:3538: $? = 0
configure:3527: g++ -v >&5
Using built-in specs.
Target: x86_64-redhat-linux
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --enable-checking=release --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-libgcj-multifile --enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk --disable-dssi --disable-plugin --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre --with-cpu=generic --host=x86_64-redhat-linux
Thread model: posix
gcc version 4.1.2 20080704 (Red Hat 4.1.2-52)
configure:3538: $? = 0
configure:3527: g++ -V >&5
g++: '-V' option must have argument
configure:3538: $? = 1
configure:3527: g++ -qversion >&5
g++: unrecognized option '-qversion'
g++: no input files
configure:3538: $? = 1
configure:3558: checking whether the C++ compiler works
configure:3580: g++ conftest.cpp >&5
configure:3584: $? = 0
configure:3632: result: yes
configure:3635: checking for C++ compiler default output file name
configure:3637: result: a.out
configure:3643: checking for suffix of executables
configure:3650: g++ -o conftest conftest.cpp >&5
configure:3654: $? = 0
configure:3676: result:
configure:3698: checking whether we are cross compiling
configure:3706: g++ -o conftest conftest.cpp >&5
configure:3710: $? = 0
configure:3717: ./conftest
configure:3721: $? = 0
configure:3736: result: no
configure:3741: checking for suffix of object files
configure:3763: g++ -c conftest.cpp >&5
configure:3767: $? = 0
configure:3788: result: o
configure:3792: checking whether we are using the GNU C++ compiler
configure:3811: g++ -c conftest.cpp >&5
configure:3811: $? = 0
configure:3820: result: yes
configure:3829: checking whether g++ accepts -g
configure:3849: g++ -c -g conftest.cpp >&5
configure:3849: $? = 0
configure:3890: result: yes
configure:3915: checking dependency style of g++
configure:4025: result: gcc3
configure:4045: checking how to run the C++ preprocessor
configure:4072: g++ -E conftest.cpp
configure:4072: $? = 0
configure:4086: g++ -E conftest.cpp
conftest.cpp:11:28: error: ac_nonexistent.h: No such file or directory
configure:4086: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| /* end confdefs.h.  */
| #include <ac_nonexistent.h>
configure:4111: result: g++ -E
configure:4131: g++ -E conftest.cpp
configure:4131: $? = 0
configure:4145: g++ -E conftest.cpp
conftest.cpp:11:28: error: ac_nonexistent.h: No such file or directory
configure:4145: $? = 1
configure: failed program was: (same conftest as above)
configure:4227: checking for gcc
configure:4243: found /usr/bin/gcc
configure:4254: result: gcc
configure:4285: checking for C compiler version
configure:4294: gcc --version >&5
gcc (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

configure:4305: $? = 0
configure:4294: gcc -v >&5
Using built-in specs.
Target: x86_64-redhat-linux
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --enable-checking=release --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-libgcj-multifile --enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk --disable-dssi --disable-plugin --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre --with-cpu=generic --host=x86_64-redhat-linux
Thread model: posix
gcc version 4.1.2 20080704 (Red Hat 4.1.2-52)
configure:4305: $? = 0
configure:4294: gcc -V >&5
gcc: '-V' option must have argument
configure:4305: $? = 1
configure:4294: gcc -qversion >&5
gcc: unrecognized option '-qversion'
gcc: no input files
configure:4305: $? = 1
configure:4309: checking whether we are using the GNU C compiler
configure:4328: gcc -c conftest.c >&5
configure:4328: $? = 0
configure:4337: result: yes
configure:4346: checking whether gcc accepts -g
configure:4366: gcc -c -g conftest.c >&5
configure:4366: $? = 0
configure:4407: result: yes
configure:4424: checking for gcc option to accept ISO C89
configure:4488: gcc -c -g -O2 conftest.c >&5
configure:4488: $? = 0
configure:4501: result: none needed
configure:4523: checking dependency style of gcc
configure:4633: result: gcc3
configure:4760: checking for C++ compiler version
configure:4769: g++ --version >&5
g++ (GCC) 4.1.2 20080704 (Red Hat 4.1.2-52)
Copyright (C) 2006 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

configure:4780: $? = 0
configure:4769: g++ -v >&5
Using built-in specs.
Target: x86_64-redhat-linux
Configured with: ../configure --prefix=/usr --mandir=/usr/share/man --infodir=/usr/share/info --enable-shared --enable-threads=posix --enable-checking=release --with-system-zlib --enable-__cxa_atexit --disable-libunwind-exceptions --enable-libgcj-multifile --enable-languages=c,c++,objc,obj-c++,java,fortran,ada --enable-java-awt=gtk --disable-dssi --disable-plugin --with-java-home=/usr/lib/jvm/java-1.4.2-gcj-1.4.2.0/jre --with-cpu=generic --host=x86_64-redhat-linux
Thread model: posix
gcc version 4.1.2 20080704 (Red Hat 4.1.2-52)
configure:4780: $? = 0
configure:4769: g++ -V >&5
g++: '-V' option must have argument
configure:4780: $? = 1
configure:4769: g++ -qversion >&5
g++: unrecognized option '-qversion'
g++: no input files
configure:4780: $? = 1
configure:4784: checking whether we are using the GNU C++ compiler
configure:4812: result: yes
configure:4821: checking whether g++ accepts -g
configure:4882: result: yes
configure:4907: checking dependency style of g++
configure:5017: result: gcc3
configure:5037: checking how to run the C preprocessor
configure:5068: gcc -E conftest.c
configure:5068: $? = 0
configure:5082: gcc -E conftest.c
conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
configure:5082: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| /* end confdefs.h.  */
| #include <ac_nonexistent.h>
configure:5107: result: gcc -E
configure:5127: gcc -E conftest.c
configure:5127: $? = 0
configure:5141: gcc -E conftest.c
conftest.c:11:28: error: ac_nonexistent.h: No such file or directory
configure:5141: $? = 1
configure: failed program was: (same conftest as above)
configure:5170: checking whether gcc and cc understand -c and -o together
configure:5201: gcc -c conftest.c -o conftest2.o >&5
configure:5205: $? = 0
configure:5211: gcc -c conftest.c -o conftest2.o >&5
configure:5215: $? = 0
configure:5226: cc -c conftest.c >&5
configure:5230: $? = 0
configure:5238: cc -c conftest.c -o conftest2.o >&5
configure:5242: $? = 0
configure:5248: cc -c conftest.c -o conftest2.o >&5
configure:5252: $? = 0
configure:5270: result: yes
configure:5296: checking whether ln -s works
configure:5300: result: yes
configure:5307: checking whether make sets $(MAKE)
configure:5329: result: yes
configure:5408: checking for library containing sincos
configure:5439: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5439: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| /* end confdefs.h.  */
|
| /* Override any GCC internal prototype to avoid an error.
|    Use char because int might match the return type of a GCC
|    builtin and then its argument prototype would still apply.  */
| #ifdef __cplusplus
| extern "C"
| #endif
| char sincos ();
| int
| main ()
| {
| return sincos ();
| ;
| return 0;
| }
configure:5439: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c -lm >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5439: $? = 1
configure: failed program was: (same conftest as above)
configure:5456: result: no
configure:5563: checking for ranlib
configure:5579: found /usr/bin/ranlib
configure:5590: result: ranlib
configure:5827: checking for special C compiler options needed for large files
configure:5872: result: no
configure:5878: checking for _FILE_OFFSET_BITS value needed for large files
configure:5903: gcc -c -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5903: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <sys/types.h>
| /* Check that off_t can represent 2**63 - 1 correctly.
|    We can't simply define LARGE_OFF_T to be 9223372036854775807,
|    since some C++ compilers masquerading as C compilers
|    incorrectly reject 9223372036854775807.  */
| #define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62))
| int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721
| && LARGE_OFF_T % 2147483647 == 1)
| ? 1 : -1];
| int
| main ()
| {
|
| ;
| return 0;
| }
configure:5927: gcc -c -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5927: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #define _FILE_OFFSET_BITS 64
| #include <sys/types.h>
| /* Check that off_t can represent 2**63 - 1 correctly.
|    We can't simply define LARGE_OFF_T to be 9223372036854775807,
|    since some C++ compilers masquerading as C compilers
|    incorrectly reject 9223372036854775807.  */
| #define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62))
| int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721
| && LARGE_OFF_T % 2147483647 == 1)
| ? 1 : -1];
| int
| main ()
| {
|
| ;
| return 0;
| }
configure:5935: result: unknown
configure:5947: checking for _LARGE_FILES value needed for large files
configure:5972: gcc -c -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5972: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <sys/types.h>
| /* Check that off_t can represent 2**63 - 1 correctly.
|    We can't simply define LARGE_OFF_T to be 9223372036854775807,
|    since some C++ compilers masquerading as C compilers
|    incorrectly reject 9223372036854775807.  */
| #define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62))
| int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721
| && LARGE_OFF_T % 2147483647 == 1)
| ? 1 : -1];
| int
| main ()
| {
|
| ;
| return 0;
| }
configure:5996: gcc -c -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:5996: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #define _LARGE_FILES 1
| #include <sys/types.h>
| /* Check that off_t can represent 2**63 - 1 correctly.
|    We can't simply define LARGE_OFF_T to be 9223372036854775807,
|    since some C++ compilers masquerading as C compilers
|    incorrectly reject 9223372036854775807.  */
| #define LARGE_OFF_T (((off_t) 1 << 62) - 1 + ((off_t) 1 << 62))
| int off_t_is_large[(LARGE_OFF_T % 2147483629 == 721
| && LARGE_OFF_T % 2147483647 == 1)
| ? 1 : -1];
| int
| main ()
| {
|
| ;
| return 0;
| }
configure:6004: result: unknown
configure:6028: ................................................................................
configure:6030: CHECKING FOR PTHREADS
configure:6032: ................................................................................
configure:6207: checking for the pthreads library -lpthreads
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c -lpthreads >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6152: checking whether pthreads work without any flags
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was: (same conftest as above)
configure:6249: result: no
configure:6157: checking whether pthreads work with -Kthread
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -Kthread conftest.c >&5
gcc: unrecognized option '-Kthread'
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was: (same conftest as above)
configure:6249: result: no
configure:6157: checking whether pthreads work with -kthread
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -kthread conftest.c >&5
gcc: unrecognized option '-kthread'
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was: (same conftest as above)
configure:6249: result: no
configure:6207: checking for the pthreads library -llthread
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c -llthread >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was: (same conftest as above)
configure:6249: result: no
configure:6157: checking whether pthreads work with -pthread
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -pthread conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was: (same conftest as above)
configure:6249: result: no
configure:6157: checking whether pthreads work with -pthreads
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -pthreads conftest.c >&5
gcc: unrecognized option '-pthreads'
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6157: checking whether pthreads work with -mthreads
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -mthreads conftest.c >&5
cc1: error: unrecognized command line option "-mthreads"
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6207: checking for the pthreads library -lpthread
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement conftest.c -lpthread >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6157: checking whether pthreads work with --thread-safe
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement --thread-safe conftest.c >&5
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-fthread-safe"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6157: checking whether pthreads work with -mt
configure:6240: gcc -o conftest -g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement -mt conftest.c >&5
cc1: error: unrecognized command line option "-mt"
cc1: error: unrecognized command line option "-std=gnu90"
cc1: error: unrecognized command line option "-std=gnu90"
configure:6240: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "triAGENS AvocadoDB"
| #define PACKAGE_TARNAME "avocado"
| #define PACKAGE_VERSION "0.3.7"
| #define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
| #define PACKAGE_BUGREPORT "[email protected]"
| #define PACKAGE_URL "http://www.avocadodb.org"
| #define PACKAGE "avocado"
| #define VERSION "0.3.7"
| #define TRI_ENABLE_LOGGER 1
| #define TRI_ENABLE_LOGGER_TIMING 1
| #define TRI_ENABLE_TIMING 1
| #define TRI_ENABLE_FIGURES 1
| /* end confdefs.h.  */
| #include <pthread.h>
| int
| main ()
| {
| pthread_t th; pthread_join(th, 0);
| pthread_attr_init(0); pthread_cleanup_push(0, 0);
| pthread_create(0,0,0,0); pthread_cleanup_pop(0);
| ;
| return 0;
| }
configure:6249: result: no
configure:6165: checking for pthread-config
configure:6193: result: no
configure:6388: error: please install the pthread library

----------------

Cache variables.

----------------

ac_cv_build=x86_64-unknown-linux-gnu
ac_cv_c_compiler_gnu=yes
ac_cv_cxx_compiler_gnu=yes
ac_cv_env_CCC_set=
ac_cv_env_CCC_value=
ac_cv_env_CC_set=
ac_cv_env_CC_value=
ac_cv_env_CFLAGS_set=
ac_cv_env_CFLAGS_value=
ac_cv_env_CPPFLAGS_set=
ac_cv_env_CPPFLAGS_value=
ac_cv_env_CPP_set=
ac_cv_env_CPP_value=
ac_cv_env_CXXCPP_set=
ac_cv_env_CXXCPP_value=
ac_cv_env_CXXFLAGS_set=
ac_cv_env_CXXFLAGS_value=
ac_cv_env_CXX_set=
ac_cv_env_CXX_value=
ac_cv_env_LDFLAGS_set=
ac_cv_env_LDFLAGS_value=
ac_cv_env_LIBS_set=
ac_cv_env_LIBS_value=
ac_cv_env_build_alias_set=
ac_cv_env_build_alias_value=
ac_cv_env_host_alias_set=
ac_cv_env_host_alias_value=
ac_cv_env_target_alias_set=
ac_cv_env_target_alias_value=
ac_cv_host=x86_64-unknown-linux-gnu
ac_cv_objext=o
ac_cv_path_install='/usr/bin/install -c'
ac_cv_path_mkdir=/bin/mkdir
ac_cv_prog_AWK=gawk
ac_cv_prog_CPP='gcc -E'
ac_cv_prog_CXXCPP='g++ -E'
ac_cv_prog_ac_ct_CC=gcc
ac_cv_prog_ac_ct_CXX=g++
ac_cv_prog_ac_ct_RANLIB=ranlib
ac_cv_prog_acx_pthread_config=no
ac_cv_prog_cc_c89=
ac_cv_prog_cc_g=yes
ac_cv_prog_cc_gcc_c_o=yes
ac_cv_prog_cxx_g=yes
ac_cv_prog_make_make_set=yes
ac_cv_search_sincos=no
ac_cv_sys_file_offset_bits=unknown
ac_cv_sys_large_files=unknown
ac_cv_sys_largefile_CC=no
ac_cv_target=x86_64-unknown-linux-gnu
am_cv_CC_dependencies_compiler_type=gcc3
am_cv_CXX_dependencies_compiler_type=gcc3

-----------------

Output variables.

-----------------

ACLOCAL='${SHELL} /home/siggy/AvocadoDB/config/missing --run aclocal-1.11'
AMDEPBACKSLASH=''
AMDEP_FALSE='#'
AMDEP_TRUE=''
AMTAR='${SHELL} /home/siggy/AvocadoDB/config/missing --run tar'
AM_BACKSLASH=''
AM_DEFAULT_VERBOSITY='0'
AUTOCONF='${SHELL} /home/siggy/AvocadoDB/config/missing --run autoconf'
AUTOHEADER='${SHELL} /home/siggy/AvocadoDB/config/missing --run autoheader'
AUTOMAKE='${SHELL} /home/siggy/AvocadoDB/config/missing --run automake-1.11'
AWK='gawk'
BISON=''
BOOST_CPPFLAGS=''
BOOST_LDFLAGS=''
BOOST_LIBS=''
BUILD_H=''
CC='gcc'
CCDEPMODE='depmode=gcc3'
CFLAGS='-g -O2 -std=gnu90 -std=gnu90 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Wshadow -Wstrict-prototypes -Wdeclaration-after-statement'
CPP='gcc -E'
CPPFLAGS=''
CXX='g++'
CXXCPP='g++ -E'
CXXDEPMODE='depmode=gcc3'
CXXFLAGS='-g -O2 -Wall -Winit-self -Wno-long-long -Wno-variadic-macros -Woverloaded-virtual -Wstrict-null-sentinel'
CYGPATH_W='echo'
DEFS=''
DEPDIR='.deps'
DOT_PATH=''
ECHO_C=''
ECHO_N='-n'
ECHO_T=''
EGREP=''
ENABLE_32BIT_FALSE=''
ENABLE_32BIT_TRUE='#'
ENABLE_64BIT_FALSE='#'
ENABLE_64BIT_TRUE=''
ENABLE_ALL_IN_ONE_FALSE='#'
ENABLE_ALL_IN_ONE_TRUE=''
ENABLE_BISON_FALSE=''
ENABLE_BISON_TRUE='#'
ENABLE_BOOST_TEST_FALSE=''
ENABLE_BOOST_TEST_TRUE='#'
ENABLE_DARWIN_FALSE=''
ENABLE_DARWIN_TRUE='#'
ENABLE_FIGURES_FALSE='#'
ENABLE_FIGURES_TRUE=''
ENABLE_FLEX_FALSE=''
ENABLE_FLEX_TRUE='#'
ENABLE_FORCE_32BIT_FALSE=''
ENABLE_FORCE_32BIT_TRUE='#'
ENABLE_GCOV_FALSE=''
ENABLE_GCOV_TRUE=''
ENABLE_INCLUDED_BOOST_FALSE='#'
ENABLE_INCLUDED_BOOST_TRUE=''
ENABLE_INSTALL_DBDIR_FALSE='#'
ENABLE_INSTALL_DBDIR_TRUE=''
ENABLE_LIBEV_FALSE=''
ENABLE_LIBEV_TRUE=''
ENABLE_LOGGER_FALSE='#'
ENABLE_LOGGER_TIMING_FALSE='#'
ENABLE_LOGGER_TIMING_TRUE=''
ENABLE_LOGGER_TRUE=''
ENABLE_NCURSES_FALSE=''
ENABLE_NCURSES_TRUE=''
ENABLE_OPENSSL_FALSE=''
ENABLE_OPENSSL_TRUE=''
ENABLE_READLINE_FALSE=''
ENABLE_READLINE_TRUE=''
ENABLE_TIMING_FALSE='#'
ENABLE_TIMING_TRUE=''
EXEEXT=''
GCOV_CFLAGS=''
GCOV_CXXFLAGS=''
GCOV_LDFLAGS=''
GCOV_LIBS=''
GREP=''
HAVE_DOT=''
INSTALL_DATA='${INSTALL} -m 644'
INSTALL_PROGRAM='${INSTALL}'
INSTALL_SCRIPT='${INSTALL}'
INSTALL_STRIP_PROGRAM='$(install_sh) -c -s'
LDFLAGS=''
LEX=''
LEXLIB=''
LEX_OUTPUT_ROOT=''
LIBEV_CPPFLAGS=''
LIBEV_LDFLAGS=''
LIBEV_LIBS=''
LIBOBJS=''
LIBS=''
LN_S='ln -s'
LTLIBOBJS=''
MAKEINFO='${SHELL} /home/siggy/AvocadoDB/config/missing --run makeinfo'
MATH_CPPFLAGS=''
MATH_LDFLAGS=''
MATH_LIBS=''
MKDIR_P='/bin/mkdir -p'
NCURSES_CONFIG=''
NCURSES_CPPFLAGS=''
NCURSES_LDFLAGS=''
NCURSES_LIBS=''
OBJEXT='o'
OPENSSL_CPPFLAGS=''
OPENSSL_LDFLAGS=''
OPENSSL_LIBS=''
PACKAGE='avocado'
PACKAGE_BUGREPORT='[email protected]'
PACKAGE_NAME='triAGENS AvocadoDB'
PACKAGE_STRING='triAGENS AvocadoDB 0.3.7'
PACKAGE_TARNAME='avocado'
PACKAGE_URL='http://www.avocadodb.org'
PACKAGE_VERSION='0.3.7'
PATH_SEPARATOR=':'
PTHREAD_CC='gcc'
PTHREAD_CFLAGS=''
PTHREAD_LIBS=''
RANLIB='ranlib'
READLINE_CPPFLAGS=''
READLINE_LDFLAGS=''
READLINE_LIBS=''
SCONS=''
SET_MAKE=''
SHELL='/bin/sh'
STRIP=''
V8_CPPFLAGS=''
V8_LDFLAGS=''
V8_LIBS=''
VERSION='0.3.7'
ac_ct_CC='gcc'
ac_ct_CXX='g++'
acx_pthread_config='no'
am__EXEEXT_FALSE=''
am__EXEEXT_TRUE=''
am__fastdepCC_FALSE='#'
am__fastdepCC_TRUE=''
am__fastdepCXX_FALSE='#'
am__fastdepCXX_TRUE=''
am__include='include'
am__isrc=''
am__leading_dot='.'
am__quote=''
am__tar='${AMTAR} chof - "$$tardir"'
am__untar='${AMTAR} xf -'
bindir='${exec_prefix}/bin'
build='x86_64-unknown-linux-gnu'
build_alias=''
build_cpu='x86_64'
build_os='linux-gnu'
build_vendor='unknown'
datadir='${datarootdir}'
datarootdir='${prefix}/share'
docdir='${datarootdir}/doc/${PACKAGE_TARNAME}'
dvidir='${docdir}'
exec_prefix='NONE'
host='x86_64-unknown-linux-gnu'
host_alias=''
host_cpu='x86_64'
host_os='linux-gnu'
host_vendor='unknown'
htmldir='${docdir}'
includedir='${prefix}/include'
infodir='${datarootdir}/info'
install_sh='${SHELL} /home/siggy/AvocadoDB/config/install-sh'
libdir='${exec_prefix}/lib'
libexecdir='${exec_prefix}/libexec'
localedir='${datarootdir}/locale'
localstatedir='${prefix}/var'
mandir='${datarootdir}/man'
mkdir_p='/bin/mkdir -p'
oldincludedir='/usr/include'
pdfdir='${docdir}'
prefix='NONE'
program_transform_name='s,x,x,'
psdir='${docdir}'
sbindir='${exec_prefix}/sbin'
sharedstatedir='${prefix}/com'
sysconfdir='${prefix}/etc'
target='x86_64-unknown-linux-gnu'
target_alias=''
target_cpu='x86_64'
target_os='linux-gnu'
target_vendor='unknown'

-----------

confdefs.h.

-----------

/* confdefs.h */

#define PACKAGE_NAME "triAGENS AvocadoDB"
#define PACKAGE_TARNAME "avocado"
#define PACKAGE_VERSION "0.3.7"
#define PACKAGE_STRING "triAGENS AvocadoDB 0.3.7"
#define PACKAGE_BUGREPORT "[email protected]"
#define PACKAGE_URL "http://www.avocadodb.org"
#define PACKAGE "avocado"
#define VERSION "0.3.7"
#define TRI_ENABLE_LOGGER 1
#define TRI_ENABLE_LOGGER_TIMING 1
#define TRI_ENABLE_TIMING 1
#define TRI_ENABLE_FIGURES 1

configure: exit 1
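All of the pthread probes above fail for the same underlying reason: CFLAGS carries `-std=gnu90` twice (see the cache dump below), and this gcc predates the `gnu90` spelling; on older releases the same dialect is selected by `-std=gnu89`. A small illustrative helper, not part of the actual build system, showing how such a flags string could be repaired before re-running configure:

```python
def fix_cflags(cflags: str, gcc_supports_gnu90: bool = False) -> str:
    """Dedupe flags (keeping first occurrence) and downgrade -std=gnu90
    to the pre-gcc-4.5 spelling -std=gnu89 for older compilers."""
    seen, out = set(), []
    for flag in cflags.split():
        if flag == "-std=gnu90" and not gcc_supports_gnu90:
            flag = "-std=gnu89"  # same C dialect, older spelling
        if flag not in seen:
            seen.add(flag)
            out.append(flag)
    return " ".join(out)

print(fix_cflags("-g -O2 -std=gnu90 -std=gnu90 -Wall"))
# → -g -O2 -std=gnu89 -Wall
```

With the flags repaired, configure can then be re-invoked with an explicit override, e.g. `./configure CFLAGS="-g -O2 -std=gnu89 -Wall"`.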

addEdge without label

addEdge: I think that in many graphs an edge without a name would also be useful. Perhaps add these two methods:
addEdge(out, in)
and
addEdge(out, in, data)
Because the name is really just a special case – or am I overlooking something? It works that way for vertices, after all.
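The proposal above, sketched in Python (a hypothetical API shape, not the actual Graph implementation): the label becomes an optional argument instead of a required one, so the unlabeled edge is the general case:

```python
def add_edge(out_vertex, in_vertex, data=None, label=None):
    """Create an edge record; label and data are both optional (hypothetical sketch)."""
    return {
        "_out": out_vertex,
        "_in": in_vertex,
        "label": label,      # None for the unlabeled case
        "data": data or {},
    }

e = add_edge("v/1", "v/2")
# e["label"] is None; a labeled edge is just the special case add_edge(..., label="knows")
```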

REST API if-none-match doesn't work

As you can see, the if-none-match header value is the same as the etag/_rev value.
The expected response is an HTTP 304 (Not Modified).

curl -vX GET -H 'if-none-match: "28459431"' --dump - http://localhost:8529/document/3228071/28459431

* About to connect() to localhost port 8529 (#0)
*   Trying 127.0.0.1...
* connected
* Connected to localhost (127.0.0.1) port 8529 (#0)

> GET /document/3228071/28459431 HTTP/1.1
> User-Agent: curl/7.24.0 (x86_64-apple-darwin10.8.0) libcurl/7.24.0 OpenSSL/1.0.1 zlib/1.2.6 libidn/1.22
> Host: localhost:8529
> Accept: */*
> if-none-match: "28459431"

    < HTTP/1.1 200 OK
    HTTP/1.1 200 OK
    < etag: "28459431"
    etag: "28459431"
    < connection: Keep-Alive
    connection: Keep-Alive
    < content-type: application/json; charset=utf-8
    content-type: application/json; charset=utf-8
    < server: triagens GmbH High-Performance HTTP Server
    server: triagens GmbH High-Performance HTTP Server
    < content-length: 99
    content-length: 99

<

* Connection #0 to host localhost left intact
{"work":{"b":0,"l":60},"name":"Name/0/60","home":[0,60],"_id":"3228071/28459431","_rev":28459431}* Closing connection #0
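The contract the server violates, sketched as plain logic (a hypothetical helper, not ArangoDB's actual request handler): a matching If-None-Match value must short-circuit to 304 with no body:

```python
def conditional_get(if_none_match, current_etag, body):
    """Return (status, payload) following the HTTP If-None-Match contract."""
    if if_none_match is not None and if_none_match == current_etag:
        return 304, None   # Not Modified: the client's cached copy is current
    return 200, body

status, payload = conditional_get('"28459431"', '"28459431"', '{"_rev":28459431}')
# → (304, None), whereas the transcript above shows the server answering 200 with a body
```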

create tests for /_api/key

It seems there are no tests yet for the /_api/key API. Tests should be added to cover this interface.

/collection/collection-id/parameter is deprecated

The API call /collection/collection-id/parameter is deprecated now.
It has been replaced with /collection/collection-id/properties.

This should be noted in the release notes.
At some point, the old method should be removed.
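One way to keep the old path alive during the deprecation window (an illustrative sketch only, not the actual routing code) is a simple alias table consulted before dispatching the sub-resource:

```python
# Hypothetical alias table: deprecated sub-resource name -> replacement.
DEPRECATED_SUFFIXES = {"parameter": "properties"}

def resolve_suffix(suffix):
    """Map a deprecated sub-resource name to its replacement, else pass it through."""
    return DEPRECATED_SUFFIXES.get(suffix, suffix)

print(resolve_suffix("parameter"))   # → properties
print(resolve_suffix("count"))       # → count
```

Removing the old method later then amounts to deleting its entry from the table.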

waitForSync is unreachable

I have an issue related to the waitForSync collection parameter.
According to this document https://github.com/triAGENS/AvocadoDB/wiki/HttpCollection#wiki-1HttpCollectionParameter it should be returned by the describe-collection API call (GET /_api/collection/collection-identifier).

But actually that doesn't work: waitForSync is unreachable.

I've found a temporary hack: when I make a PUT /_api/collection/collection-identifier/parameter request
with empty data, it returns information about this parameter.

That's not RESTful and it should be fixed. I've noted this behavior in the tests.

Also, when I tried GET /_api/collection/collection-identifier/parameter I got
{"error":true,"code":404,"errorNum":404,"errorMessage":"expecting one of the resources 'count', 'figures'"}

As far as I understand, this API call may accept many more parameters than are described? Or is that just a resource-routing issue?
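What the report expects from the describe call, as a minimal sketch (the field names other than waitForSync are hypothetical; waitForSync itself is the documented parameter):

```python
def describe_collection(col):
    """GET /_api/collection/<id> payload should expose waitForSync directly,
    without requiring the PUT-with-empty-body workaround described above."""
    return {
        "id": col["id"],
        "name": col["name"],
        "waitForSync": col["waitForSync"],
    }

info = describe_collection({"id": 3228071, "name": "demo", "waitForSync": True})
# info["waitForSync"] → True
```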

Multibyte Unicode encoding is not correct.

Multibyte Unicode characters are not encoded correctly.

git-version: master, 665d18b

how to reproduce:

case1:

$ curl --data @- -X POST --dump - "http://127.0.0.1:8529/document?collection=unit_test_issue&createCollection=true"
"あ"

HTTP/1.1 202 Accepted
etag: "1824232426"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
location: /document/1822921706/1824232426
content-length: 63

{"error":false,"_id":"1822921706/1824232426","_rev":1824232426}

$ curl -X GET --dump - http://127.0.0.1:8529/document/1822921706/1824232426

HTTP/1.1 200 OK
etag: "1824232426"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 8

"\u0042"

It should be "\u3042".

case2:

$ curl --data @- -X POST --dump - "http://127.0.0.1:8529/document?collection=unit_test_issue&createCollection=true"
"\u3042"

HTTP/1.1 202 Accepted
etag: "1824363498"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
location: /document/1822921706/1824363498
content-length: 63

{"error":false,"_id":"1822921706/1824363498","_rev":1824363498}

$ curl -X GET --dump - http://127.0.0.1:8529/document/1822921706/1824363498
HTTP/1.1 200 OK
etag: "1824363498"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 8

"\u0042"

It should be "\u3042".

case3:

$ curl --data @- -X POST --dump - "http://127.0.0.1:8529/document?collection=unit_test_issue&createCollection=true"

"寿司"
HTTP/1.1 202 Accepted
etag: "1824297962"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
location: /document/1822921706/1824297962
content-length: 63

{"error":false,"_id":"1822921706/1824297962","_rev":1824297962}


$ curl -X GET --dump - http://127.0.0.1:8529/document/1822921706/1824297962

HTTP/1.1 200 OK
etag: "1824297962"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 14

"\u0BFF\u03F8"

It should be "\u5BFF\u53F8"

case4:

$ curl --data @- -X POST --dump - "http://127.0.0.1:8529/document?collection=unit_test_issue&createCollection=true"
"\u2606"

HTTP/1.1 202 Accepted
etag: "1824429034"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
location: /document/1822921706/1824429034
content-length: 63

{"error":false,"_id":"1822921706/1824429034","_rev":1824429034}


$ curl -X GET --dump - http://127.0.0.1:8529/document/1822921706/1824429034

HTTP/1.1 200 OK
etag: "1824429034"
connection: Keep-Alive
content-type: application/json; charset=utf-8
server: triagens GmbH High-Performance HTTP Server
content-length: 8

"\u0606"

It should be "\u2606".
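Across all four cases only the leading hex digit of each code point is zeroed (U+3042 → U+0042, U+5BFF → U+0BFF, U+53F8 → U+03F8, U+2606 → U+0606), which suggests the escaper masks the value incorrectly when emitting the \uXXXX form. A correct escaper keeps all 16 bits, as in this illustrative sketch (not the server's actual code):

```python
def escape_non_ascii(s):
    """Escape non-ASCII characters as \\uXXXX, keeping the full 16-bit code point."""
    out = []
    for ch in s:
        cp = ord(ch)
        # Emit the whole value, not something like cp & 0x0FFF,
        # which would reproduce the truncation seen above.
        out.append(ch if cp < 0x80 else "\\u%04X" % cp)
    return "".join(out)

print(escape_non_ascii("あ"))    # → \u3042
print(escape_non_ascii("寿司"))  # → \u5BFF\u53F8
```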

`db._drop` doesn't work correctly

Snippet from the Unit Test:

db._drop(vertex);
db._drop(edge);

graph = new Graph(vertex, edge);

First run: everything is OK. Second run:

[AvocadoError 1108: cannot create/rename collection because directory already exists: cannot create collection]
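The error suggests db._drop leaves the collection's directory on disk, so the second `new Graph(vertex, edge)` cannot recreate the collections. The invariant the bug violates, as a hypothetical file-system sketch (not the actual storage code):

```python
import os
import shutil
import tempfile

def drop_collection(path):
    """Dropping a collection must also remove its on-disk directory;
    otherwise the next create fails with 'directory already exists'."""
    if os.path.isdir(path):
        shutil.rmtree(path)

base = tempfile.mkdtemp()
vertex_dir = os.path.join(base, "vertex")
os.makedirs(vertex_dir)       # first run: collection created
drop_collection(vertex_dir)   # db._drop(vertex)
os.makedirs(vertex_dir)       # second run succeeds only because drop cleaned up
```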
