
PostgreSQL interface for Node.js

Home Page: https://vitaly-t.github.io/pg-promise

License: MIT License

JavaScript 95.62% HTML 0.12% CSS 1.67% TypeScript 2.59% Batchfile 0.01%
postgresql nodejs promises javascript transaction promise-library typescript

pg-promise's Introduction

pg-promise



PostgreSQL interface for Node.js



About

Built on top of node-postgres, this library adds the following:

  • Automatic connections
  • Automatic transactions
  • Powerful query-formatting engine + query generation
  • Declarative approach to handling query results
  • Global events reporting for central handling
  • Extensive support for external SQL files
  • Support for all promise libraries

At its inception in 2015, this library only added promises to the base driver, hence the name pg-promise. And while the original name was kept, the library's functionality was vastly extended, with promises now being only a tiny part of it.

Support & Sponsorship

I do free support here and on StackOverflow.

And if you want to help this project, I can accept Bitcoin: 1yki7MXMkuDw8qqe5icVdh1GJZSQSzKZp

Documentation

Chapter Usage below explains the basics you need to know, while the Official Documentation gets you started and provides links to all other resources.

Contributing

Please read the Contribution Notes before opening any new issue or PR.

Usage

Once you have created a Database object, according to the steps in the Official Documentation, you get access to the methods documented below.

Methods

All query methods of the library are based on the generic method query.

You should normally use only the derived, result-specific methods for executing queries, all of which are named according to how many rows of data the query is expected to return; so for each query you should pick the right method: none, one, oneOrNone, many, manyOrNone = any. Do not confuse the method name with the number of rows to be affected by the query, which is irrelevant here.

By relying on the result-specific methods you protect your code from an unexpected number of data rows: when the row count does not match, the query is automatically rejected (treated as an error).
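As a minimal sketch of picking the right method (the table and column names here are made up):

// Sketch: each method enforces the expected number of result rows
const user  = await db.one('SELECT * FROM users WHERE id = $1', 123);       // exactly 1 row, or reject
const maybe = await db.oneOrNone('SELECT * FROM users WHERE id = $1', 123); // 1 row or null
const rows  = await db.any('SELECT * FROM users WHERE active = $1', true);  // 0 or more rows
await db.none('DELETE FROM users WHERE id = $1', 123);                      // no rows expected back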

There are also a few specific methods that you will often need:

  • result, multi, multiResult - for verbose and/or multi-query results;
  • map, each - for simpler/inline result pre-processing/re-mapping;
  • func, proc - to simplify execution of SQL functions/procedures;
  • stream - to access rows from a query via a read stream;
  • connect, task, tx + txIf - for shared connections + automatic transactions, each exposing a connected protocol that has additional methods batch, page and sequence.

The protocol is fully customizable / extendable via event extend.
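For example, a minimal sketch of map from the list above (table and column names are made up):

// Sketch: re-map each result row inline, returning just the id values
const ids = await db.map('SELECT id FROM users WHERE active = $1', [true], row => row.id);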

IMPORTANT:

The most important methods to understand from the start are task and tx/txIf (see Tasks and Transactions). As documented for method query, it acquires and releases a connection on every call, which makes it a poor choice for executing multiple queries at once. For this reason, Chaining Queries is a must-read, to avoid writing code that misuses connections.

Learn by Example is a beginner's tutorial based on examples.

Query Formatting

This library comes with an embedded query-formatting engine that offers high-performance value escaping, flexibility and extensibility. It is used by default with all query methods, unless you opt out of it entirely via option pgFormatting within Initialization Options.

All formatting methods used internally are available from the formatting namespace, so they can also be used directly when needed. The main method there is format, used by every query method to format the query.
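For example, a minimal sketch of calling format directly (the table name and value here are arbitrary):

// Sketch: direct use of the formatting engine, outside of any query method
const query = pgp.as.format('SELECT * FROM $1:name WHERE price > $2', ['products', 100]);
//=> SELECT * FROM "products" WHERE price > 100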

The formatting syntax for variables is determined by the type of values passed in.

ATTENTION: Never use ES6 template strings or manual concatenation to generate queries, as both can easily result in broken queries! Only this library's formatting engine knows how to properly escape variable values for PostgreSQL.

Index Variables

The simplest (classic) formatting uses $1, $2, ... syntax to inject values into the query string, based on their index (from $1 to $100000) from the array of values:

await db.any('SELECT * FROM product WHERE price BETWEEN $1 AND $2', [1, 10])

The formatting engine also supports single-value parametrization for queries that use only variable $1:

await db.any('SELECT * FROM users WHERE name = $1', 'John')

This, however, works only for types number, bigint, string, boolean, Date and null, because types like Array and Object change the way parameters are interpreted. That is why passing values in via an array is advised as the safer approach, to avoid ambiguities.

Named Parameters

When a query method is parameterized with values as an object, the formatting engine expects the query to use the Named Parameter syntax $*propName*, with * being any of the following open-close pairs: {}, (), <>, [], //.

// We can use every supported variable syntax at the same time, if needed:
await db.none('INSERT INTO users(first_name, last_name, age) VALUES(${name.first}, $<name.last>, $/age/)', {
    name: {first: 'John', last: 'Dow'},
    age: 30
});

IMPORTANT: Never use the reserved ${} syntax inside ES6 template strings, as those have no knowledge of how to format values for PostgreSQL. Inside ES6 template strings you should only use one of the 4 alternatives - $(), $<>, $[] or $//. In general, you should either use the standard strings for SQL, or place SQL into external files - see Query Files.

Valid variable names are limited to the syntax of open-name JavaScript variables. And the name this has special meaning: it refers to the formatting object itself (see below).

Keep in mind that while property values null and undefined are both formatted as null, an error is thrown when the property does not exist.

this reference

Property this refers to the formatting object itself, to be inserted as a JSON-formatted string.

await db.none('INSERT INTO documents(id, doc) VALUES(${id}, ${this})', {
    id: 123,
    body: 'some text'    
})
//=> INSERT INTO documents(id, doc) VALUES(123, '{"id":123,"body":"some text"}')

Nested Named Parameters

Named Parameters support property name nesting of any depth.

Example
const obj = {
    one: {
        two: {
            three: {
                value1: 123,
                value2: a => {
                    // a = obj.one.two.three
                    return 'hello';
                },
                value3: function(a) {
                    // a = this = obj.one.two.three
                    return 'world';
                },
                value4: {
                    toPostgres: a => {
                        // Custom Type Formatting
                        // a = obj.one.two.three.value4
                        return a.text;
                    },
                    text: 'custom'
                }                
            }
        }
    }
};
await db.one('SELECT ${one.two.three.value1}', obj); //=> SELECT 123
await db.one('SELECT ${one.two.three.value2}', obj); //=> SELECT 'hello'
await db.one('SELECT ${one.two.three.value3}', obj); //=> SELECT 'world'
await db.one('SELECT ${one.two.three.value4}', obj); //=> SELECT 'custom'

The last name in the resolution can be anything: a simple value, a function (regular or arrow), or a Custom Type Formatting object, as shown in the example above.

i.e. the resolution chain is infinitely flexible, and supports recursion without limits.

Please note, however, that nested parameters are not supported within the helpers namespace.

Formatting Filters

By default, all values are formatted according to their JavaScript type. Formatting filters (or modifiers) change that, so the value is formatted differently.

Note that formatting filters work only for normal queries, and are not available within PreparedStatement or ParameterizedQuery, because those are, by definition, formatted on the server side.

Filters use the same syntax for Index Variables and Named Parameters, immediately following the variable name:

With Index Variables
await db.any('SELECT $1:name FROM $2:name', ['price', 'products'])
//=> SELECT "price" FROM "products"
With Named Parameters
await db.any('SELECT ${column:name} FROM ${table:name}', {
    column: 'price',
    table: 'products'    
});
//=> SELECT "price" FROM "products"

The following filters are supported, each described below: :name (or ~), :alias, :raw (or ^), :value (or #), :json, and :csv / :list.

SQL Names

When a variable name ends with :name, or shorter syntax ~ (tilde), it represents an SQL name or identifier, to be escaped accordingly:

Using ~ filter
await db.query('INSERT INTO $1~($2~) VALUES(...)', ['Table Name', 'Column Name']);
//=> INSERT INTO "Table Name"("Column Name") VALUES(...)
Using :name filter
await db.query('INSERT INTO $1:name($2:name) VALUES(...)', ['Table Name', 'Column Name']);
//=> INSERT INTO "Table Name"("Column Name") VALUES(...)

Typically, an SQL name variable is a text string, which must be at least 1 character long. However, pg-promise supports a variety of ways in which SQL names can be supplied:

  • A string that contains only * (asterisks) is automatically recognized as all columns:
await db.query('SELECT $1:name FROM $2:name', ['*', 'table']);
//=> SELECT * FROM "table"
  • An array of strings to represent column names:
await db.query('SELECT ${columns:name} FROM ${table:name}', {
    columns: ['column1', 'column2'],
    table: 'table'
});
//=> SELECT "column1","column2" FROM "table"
  • Any object that's not an array gets its properties enumerated for column names:
const obj = {
    one: 1,
    two: 2
};

await db.query('SELECT $1:name FROM $2:name', [obj, 'table']);
//=> SELECT "one","two" FROM "table"

In addition, the syntax supports this to enumerate column names from the formatting object:

const obj = {
    one: 1,
    two: 2
};

await db.query('INSERT INTO table(${this:name}) VALUES(${this:csv})', obj);
//=> INSERT INTO table("one","two") VALUES(1, 2)

Relying on this type of formatting for SQL names and identifiers, along with regular variable formatting, protects your application from SQL injection.

Method as.name implements the formatting.

Alias Filter

An alias is a simpler, less strict version of the :name filter, which only supports a text string, i.e. it does not support *, this, arrays or objects as input, the way :name does. However, it supports other popular, less strict cases that cover at least 99% of all use cases, as shown below.

  • It will skip adding surrounding double quotes when the name is a same-case single word:
await db.any('SELECT full_name as $1:alias FROM $2:name', ['name', 'table']);
//=> SELECT full_name as name FROM "table"
  • It will automatically split the name into multiple SQL names when encountering ., and then escape each part separately, thus supporting auto-composite SQL names:
await db.any('SELECT * FROM $1:alias', ['schemaName.table']);
//=> SELECT * FROM "schemaName".table

For more details see method as.alias that implements the formatting.

Raw Text

When a variable name ends with :raw, or shorter syntax ^, the value is to be injected as raw text, without escaping.

Such variables cannot be null or undefined, because of the ambiguous meaning in this case, and those values will throw error Values null/undefined cannot be used as raw text.

const where = pgp.as.format('WHERE price BETWEEN $1 AND $2', [5, 10]); // pre-format WHERE condition
await db.any('SELECT * FROM products $1:raw', where);
//=> SELECT * FROM products WHERE price BETWEEN 5 AND 10

Special syntax this:raw / this^ is supported, to inject the formatting object as raw JSON string.

WARNING:
This filter is unsafe, and should not be used for values that come from the client side, as it may result in SQL injection.

Open Values

When a variable name ends with :value, or shorter syntax #, the value is escaped as usual, except that when it is a string, the surrounding quotes are not added.

Open values exist primarily to make it possible to compose complete LIKE/ILIKE dynamic statements inside external SQL files, without having to generate them in the code.

i.e. you can either generate a filter like this in your code:

const name = 'John';
const filter = '%' + name + '%';

and then pass it in as a regular string variable, or you can pass in only name, and have your query use the open-value syntax to add the extra search logic:

SELECT * FROM table WHERE name LIKE '%$1:value%'
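The corresponding JavaScript call could then look like this sketch (reusing the table/column names from the SQL above):

// Sketch: the search value is escaped as a string, but without the surrounding quotes,
// so it can sit inside the quoted LIKE pattern of the query
await db.any("SELECT * FROM table WHERE name LIKE '%$1:value%'", ['John']);
//=> SELECT * FROM table WHERE name LIKE '%John%'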

WARNING:
This filter is unsafe, and should not be used for values that come from the client side, as it may result in SQL injection.

Method as.value implements the formatting.

JSON Filter

When a variable name ends with :json, explicit JSON formatting is applied to the value.

By default, any object that's not Date, Array, Buffer, null or Custom-Type (see Custom Type Formatting), is automatically formatted as JSON.
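For example, a minimal sketch where :json forces JSON formatting for an array, which otherwise would be formatted as a PostgreSQL array (the table and column names are made up):

// Sketch: explicit JSON formatting for an array value
await db.none('INSERT INTO events(tags) VALUES($1:json)', [['a', 'b']]);
//=> INSERT INTO events(tags) VALUES('["a","b"]')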

Method as.json implements the formatting.

CSV Filter

When a variable name ends with :csv or :list, it is formatted as a list of Comma-Separated Values, with each value formatted according to its JavaScript type.

Typically, you would use this for a value that's an array, though it works for single values also. See the examples below.

Using :csv filter
const ids = [1, 2, 3];
await db.any('SELECT * FROM table WHERE id IN ($1:csv)', [ids])
//=> SELECT * FROM table WHERE id IN (1,2,3)
Using :list filter
const ids = [1, 2, 3];
await db.any('SELECT * FROM table WHERE id IN ($1:list)', [ids])
//=> SELECT * FROM table WHERE id IN (1,2,3)

Using automatic property enumeration:

Enumeration with :csv filter
const obj = {first: 123, second: 'text'};

await db.none('INSERT INTO table($1:name) VALUES($1:csv)', [obj])
//=> INSERT INTO table("first","second") VALUES(123,'text')

await db.none('INSERT INTO table(${this:name}) VALUES(${this:csv})', obj)
//=> INSERT INTO table("first","second") VALUES(123,'text')
Enumeration with :list filter
const obj = {first: 123, second: 'text'};

await db.none('INSERT INTO table($1:name) VALUES($1:list)', [obj])
//=> INSERT INTO table("first","second") VALUES(123,'text')

await db.none('INSERT INTO table(${this:name}) VALUES(${this:list})', obj)
//=> INSERT INTO table("first","second") VALUES(123,'text')

Method as.csv implements the formatting.

Custom Type Formatting

The library supports dual syntax for CTF (Custom Type Formatting):

  • Explicit CTF - extending the object/type directly, for ease of use, while changing its signature;
  • Symbolic CTF - extending the object/type via Symbol properties, without changing its signature.

The library always checks for the Symbolic CTF first; only if no such syntax is used does it check for the Explicit CTF.

Explicit CTF

Any value/object that implements function toPostgres is treated as a custom-formatting type. The function is then called to get the actual value, with the object passed in both via the this context and as a single parameter (the latter is for when toPostgres is an ES6 arrow function, which has no this of its own):

const obj = {
    toPostgres(self) {
        // self = this = obj
        
        // return a value that needs proper escaping
    }
}

Function toPostgres can return anything, including another object with its own toPostgres function, i.e. nested custom types are supported.

The value returned from toPostgres is escaped according to its JavaScript type, unless the object also contains property rawType set to a truthy value, in which case the returned value is considered pre-formatted, and thus injected directly, as Raw Text:

const obj = {
    toPostgres(self) {
        // self = this = obj
        
        // return a pre-formatted value that does not need escaping
    },
    rawType: true // use result from toPostgres directly, as Raw Text
}

The example below implements a class that auto-formats ST_MakePoint from coordinates:

class STPoint {
    constructor(x, y) {
        this.x = x;
        this.y = y;
        this.rawType = true; // no escaping, because we return pre-formatted SQL
    }
    
    toPostgres(self) {
        return pgp.as.format('ST_MakePoint($1, $2)', [this.x, this.y]);
    }
}

And a classic syntax for such a class is even simpler:

function STPoint(x, y){
    this.rawType = true; // no escaping, because we return pre-formatted SQL
    this.toPostgres = () => pgp.as.format('ST_MakePoint($1, $2)', [x, y]);
}

With this class you can use new STPoint(12, 34) as a formatting value that will be injected correctly.

You can also use CTF to override any standard type:

Date.prototype.toPostgres = a => a.getTime();

Symbolic CTF

The only difference from Explicit CTF is that we set toPostgres and rawType as ES6 Symbol properties, defined in the ctf namespace:

const {toPostgres, rawType} = pgp.as.ctf; // Global CTF symbols

const obj = {
    [toPostgres](self) {
        // self = this = obj
        
        // return a pre-formatted value that does not need escaping
    },
    [rawType]: true // use result from toPostgres directly, as Raw Text
};

As CTF symbols are global, you can also configure objects independently of this library:

const ctf = {
    toPostgres: Symbol.for('ctf.toPostgres'),
    rawType: Symbol.for('ctf.rawType')
};

Other than that, it works exactly as the Explicit CTF, but without changing the object's signature.

If you are not familiar with this, read up on the ES6 Symbol API and its use for unique property names. In short, Symbol properties are not enumerated via for(name in obj), i.e. they are not generally visible in JavaScript, other than through the dedicated API Object.getOwnPropertySymbols.

Query Files

Use of external SQL files (via QueryFile) offers many advantages:

  • Much cleaner JavaScript code, with all SQL kept in external files;
  • Much easier to write large and well-formatted SQL, with many comments and whole revisions;
  • Changes in external SQL can be automatically re-loaded (option debug), without restarting the app;
  • Pre-formatting SQL upon loading (option params), automating two-step SQL formatting;
  • Parsing and minifying SQL (options minify + compress), for early error detection and compact queries.
Example
const {join: joinPath} = require('path');

// Helper for linking to external query files:
function sql(file) {
    const fullPath = joinPath(__dirname, file);
    return new pgp.QueryFile(fullPath, {minify: true});
}

// Create a QueryFile globally, once per file:
const sqlFindUser = sql('./sql/findUser.sql');

db.one(sqlFindUser, {id: 123})
    .then(user => {
        console.log(user);
    })
    .catch(error => {
        if (error instanceof pgp.errors.QueryFileError) {
            // => the error is related to our QueryFile
        }
    });

File findUser.sql:

/*
    multi-line comments are supported
*/
SELECT name, dob -- single-line comments are supported
FROM Users
WHERE id = ${id}

Every query method of the library can accept type QueryFile as its query parameter. Type QueryFile never throws any error, leaving it for query methods to gracefully reject with QueryFileError.

Use of Named Parameters within external SQL files is recommended over the Index Variables, because it makes the SQL much easier to read and understand, and because it also allows Nested Named Parameters, so variables in a large and complex SQL file can be grouped in namespaces for even easier visual separation.
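For example, a sketch of such a call, passing grouped variables into a query file (sqlFindEvents and the property names here are hypothetical), where the SQL would reference them as ${filter.type}, ${range.from} and ${range.to}:

// Sketch: variables grouped into namespaces for a QueryFile-based query
await db.any(sqlFindEvents, {
    filter: {type: 'login'},
    range: {from: '2020-01-01', to: '2020-12-31'}
});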

Tasks

A task represents a shared connection for executing multiple queries:

db.task(t => {
    // execute a chain of queries against the task context, and return the result:
    return t.one('SELECT count(*) FROM events WHERE id = $1', 123, a => +a.count)
        .then(count => {
            if(count > 0) {
                return t.any('SELECT * FROM log WHERE event_id = $1', 123)
                    .then(logs => {
                        return {count, logs};
                    })
            }
            return {count};
        });    
})
    .then(data => {
        // success, data = either {count} or {count, logs}
    })
    .catch(error => {
        // failed    
    });

A task provides a shared connection context for its callback function, released when the callback finishes, and tasks must be used whenever executing more than one query at a time. See also Chaining Queries to understand the importance of using tasks.

You can optionally tag tasks (see Tags), and use ES7 async syntax:

With ES7 async
db.task(async t => {
    const count = await t.one('SELECT count(*) FROM events WHERE id = $1', 123, a => +a.count);
    if(count > 0) {
        const logs = await t.any('SELECT * FROM log WHERE event_id = $1', 123);
        return {count, logs};
    }
    return {count};
})
    .then(data => {
        // success, data = either {count} or {count, logs}
    })
    .catch(error => {
        // failed    
    });
With ES7 async + tag
db.task('get-event-logs', async t => {
    const count = await t.one('SELECT count(*) FROM events WHERE id = $1', 123, a => +a.count);
    if(count > 0) {
        const logs = await t.any('SELECT * FROM log WHERE event_id = $1', 123);
        return {count, logs};
    }
    return {count};
})
    .then(data => {
        // success, data = either {count} or {count, logs}
    })
    .catch(error => {
        // failed    
    });

Conditional Tasks

Method taskIf creates a new task only when required, according to the condition.

The default condition is to start a new task only when necessary, such as on the top level.

With default condition
db.taskIf(t1 => {
    // new task has started, as the top level doesn't have one
    return t1.taskIf(t2 => {
        // Task t1 is being used, according to the default condition
        // t2 = t1
    });
})
With a custom condition - value
db.taskIf({cnd: false}, t1 => {
    // new task is created, i.e. option cnd is ignored here,
    // because the task is required on the top level
    return t1.taskIf({cnd: true}, t2 => {
        // new task created, because we specified that we want one;
        // t2 != t1
    });
})
With a custom condition - callback
const cnd = c => {
    // c.ctx - task/tx context (not available on the top level)
    // default condition: return !c.ctx;
    return someValue;
};

db.taskIf({cnd}, t1 => {
    // new task is always created, because it is required on the top level
    return t1.taskIf({cnd}, t2 => {
        // if someValue is truthy, a new task is created (t2 != t1);
        // otherwise, we continue with the containing task (t2 = t1).
    });
})

Transactions

Transaction method tx is like task, but it also executes BEGIN + COMMIT/ROLLBACK:

db.tx(t => {
    // creating a sequence of transaction queries:
    const q1 = t.none('UPDATE users SET active = $1 WHERE id = $2', [true, 123]);
    const q2 = t.one('INSERT INTO audit(entity, id) VALUES($1, $2) RETURNING id', ['users', 123]);

    // returning a promise that determines a successful transaction:
    return t.batch([q1, q2]); // all of the queries are to be resolved;
})
    .then(data => {
        // success, COMMIT was executed
    })
    .catch(error => {
        // failure, ROLLBACK was executed
    });

If the callback function returns a rejected promise or throws an error, the method will automatically execute ROLLBACK at the end. In all other cases the transaction will be automatically closed by COMMIT.

Just like tasks, transactions support Tags and ES7 async:

With ES7 async
db.tx(async t => {
    await t.none('UPDATE users SET active = $1 WHERE id = $2', [true, 123]);
    await t.one('INSERT INTO audit(entity, id) VALUES($1, $2) RETURNING id', ['users', 123]);
})
    .then(data => {
        // success, COMMIT was executed
    })
    .catch(error => {
        // failure, ROLLBACK was executed
    });
With ES7 async + tag
db.tx('update-user', async t => {
    await t.none('UPDATE users SET active = $1 WHERE id = $2', [true, 123]);
    await t.one('INSERT INTO audit(entity, id) VALUES($1, $2) RETURNING id', ['users', 123]);
})
    .then(data => {
        // success, COMMIT was executed
    })
    .catch(error => {
        // failure, ROLLBACK was executed
    });

Nested Transactions

Nested transactions automatically share the connection between all levels. This library sets no limitation as to the depth (nesting levels) of transactions supported.

Example
db.tx(t => {
    const queries = [
        t.none('DROP TABLE users;'),
        t.none('CREATE TABLE users(id SERIAL NOT NULL, name TEXT NOT NULL)')
    ];
    for (let i = 1; i <= 100; i++) {
        queries.push(t.none('INSERT INTO users(name) VALUES($1)', 'name-' + i));
    }
    queries.push(
        t.tx(t1 => {
            return t1.tx(t2 => {
                return t2.one('SELECT count(*) FROM users');
            });
        }));
    return t.batch(queries);
})
    .then(data => {
        // success
    })
    .catch(error => {
        // failure
    });

If you want to avoid automatic occurrence of nested transactions, see Conditional Transactions.

Limitations

It is important to know that PostgreSQL does not support full/atomic nested transactions; it only supports savepoints inside top-level transactions, to allow partial rollbacks.

Postgres uses BEGIN with COMMIT / ROLLBACK for top-level transactions, and SAVEPOINT name with RELEASE / ROLLBACK TO name for inner save-points.

This library automatically executes all such transaction and savepoint commands, with unique savepoint names based on the transaction level plus the index within the current level, in the form sp_x_y.

Here, x is the transaction level, starting with 1 (because 0 is the top-level transaction, which does not use savepoints), and y is the sub-transaction order/index within the current level, also starting with 1. So the first savepoint on the top level will be named sp_1_1.
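For illustration, a top-level transaction containing one nested transaction, which in turn contains another nested transaction, would produce a command sequence roughly like the sketch below (not verbatim library output):

BEGIN
-- top-level transaction queries...
SAVEPOINT sp_1_1
-- first-level nested transaction queries...
SAVEPOINT sp_2_1
-- second-level nested transaction queries...
RELEASE sp_2_1
RELEASE sp_1_1
COMMIT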

Configurable Transactions

TransactionMode type can extend your BEGIN command with transaction configuration:

const {TransactionMode, isolationLevel} = pgp.txMode;
 
// Create a reusable transaction mode (serializable + read-only + deferrable):
const mode = new TransactionMode({
    tiLevel: isolationLevel.serializable,
    readOnly: true,
    deferrable: true
});

db.tx({mode}, t => {
    // do transaction queries here
})
    .then(() => {
        // success;
    })
    .catch(error => {
        // failure    
    });

Instead of the default BEGIN, such a transaction will open with the following command:

BEGIN ISOLATION LEVEL SERIALIZABLE READ ONLY DEFERRABLE

Transaction Mode is set via option mode, preceding the callback function. See methods tx and txIf.

This is the most efficient and best-performing way of configuring transactions. In combination with Transaction Snapshots you can make the most out of transactions in terms of performance and concurrency.

Conditional Transactions

Method txIf executes a transaction / tx when a specified condition is met, or else it executes a task.

When no condition is specified, the default is to start a transaction if not currently inside one, or a task otherwise. This is useful when you want to avoid nested transactions (savepoints).

With default condition
db.txIf(t => {
    // transaction is started, as the top level doesn't have one
    return t.txIf(t2 => {
        // a task is started, because there is a parent transaction        
    });
})
With a custom condition - value
db.txIf({cnd: someValue}, t => {
    // if condition is truthy, a transaction is started
    return t.txIf(t2 => {
        // a task is started, if the parent is a transaction
        // a transaction is started, if the parent is a task
    });
})
With a custom condition - callback
const cnd = c => {
    // c.ctx - task/transaction context (not available on the top level)
    // default condition: return !c.ctx || !c.ctx.inTransaction;
    return someValue;
};

db.txIf({cnd}, t => {
    // if condition is truthy, a transaction is started
    return t.txIf(t2 => {
        // a task is started, if the parent is a transaction
        // a transaction is started, if the parent is a task
    });
})

Library de-initialization

This library manages all database connections via the connection pool, which internally caches them.

Connections in the cache expire after idleTimeoutMillis milliseconds of inactivity, which you can set only when creating the Database object.
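A minimal sketch of where that setting goes (the connection details below are placeholders):

// Sketch: idleTimeoutMillis is part of the connection/pool configuration,
// passed in when creating the Database object
const db = pgp({
    host: 'localhost',
    port: 5432,
    database: 'my_db',
    user: 'user_name',
    password: 'user_password',
    idleTimeoutMillis: 30000 // expire idle connections after 30 seconds
});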

While there is at least one open connection in the pool, the process cannot terminate by itself, only via process.exit(), unless allowExitOnIdle is used. If you want the process to finish by itself, without waiting for all connections in the pool to expire, you need to force the pool to shut down all the connections it holds:

db.$pool.end(); // shuts down the connection pool associated with the Database object

For example, if you are using the Bluebird library, you can chain the last promise in the process like this:

.finally(db.$pool.end);

IMPORTANT: Note that if your app is an HTTP service, or generally an application that does not have any exit point, then you should not do any de-initialization at all. Only if your app is a run-through process/utility might you want to use it, so the process ends without delays.

In applications that either use multiple databases or execute a multi-pool strategy for balanced query loads, you would end up with multiple Database objects, each with its own connection pool. In this scenario, in order to exit the process normally, at a particular point, you can call pgp.end to shut down all connection pools at once:

pgp.end(); // shuts down all connection pools created in the process

or promise-chained to the last query block in the process:

.finally(pgp.end);

Once you have shut down the pool associated with your Database object, you can no longer use that object, and any of its query methods will reject with the error Connection pool of the database object has been destroyed.

See the relevant API: pgp.end, Database.$pool


pg-promise's Issues

Alias obj.func as obj.exec

The func method would carry more semantic meaning if its name were a verb. The most obvious candidate I can think of is exec. So instead of code like this:

db.func('findAudit', [123])
    .then(function(data){
        console.log(data);
    });

we could have

db.exec('findAudit', [123])
    .then(function(data){
        console.log(data);
    });

This would actually be more familiar to people coming from SQL Server, where calling a stored procedure is done with

EXEC my_sp 'arg1', 'arg2'

If there's interest in this change, I can submit a PR for it.

Add promise validator

Add full validation for promise overrides, to test any promise library override for compatibility at run time, during initialization, so it is easier to diagnose issues with alternative promise implementations.

Add tests to cover the override functionality.

Formatting Error when storing ${name} values with embedded $1

Hello,

There seems to be an error when substituting values that contain embedded $1, $2, etc. sequences. Such is the case, for example, with bcrypt-hashed passwords. Here is an example:

f.formatQuery('${x}', {x:'$2a$10$tZ2eG7mVV2LfndEpIGNJi.BKSr36jcR6fqAG3mIZTnJDRdztMxkd2'})
# output: '\'a{0$tZ2eG7mVV2LfndEpIGNJi.BKSr36jcR6fqAG3mIZTnJDRdztMxkd2\''

While the following returns the expected result:

f.formatQuery('$1', '$2a$10$tZ2eG7mVV2LfndEpIGNJi.BKSr36jcR6fqAG3mIZTnJDRdztMxkd2')
# output:  '\'$2a$10$tZ2eG7mVV2LfndEpIGNJi.BKSr36jcR6fqAG3mIZTnJDRdztMxkd2\''

Otherwise, this is really an excellent project.
Keep up the good work.

Cheers

No way to call cancel() on query

Hello. I was previously using the native node-pg module, and if a query took more than 20 seconds to complete I would call pg.cancel(). I'm not sure if I'm missing something, but there doesn't seem to be any way of calling the cancel method from pg-promise. My intention was to use the query callback in the options object to access the pg object, but there is no cancel method off of that. Any help would be deeply appreciated!

Rollback a transaction with intermediate data

Hello @vitaly-t,

I believe this is more of a question/misunderstanding rather than an issue, similar to #33. I have a transaction that first inserts a user into the db. Then with the user id returned from that insertion, I insert a pin. Here's the problem: when inserting the pin fails, the user that was inserted remains. That is, the "insert into user ..." query is not rolled back when the "insert into pin ..." query fails.

db.tx(function(){
    console.log("inserting user:");
    return db.one('insert into user (name) values ($1) returning id', ["example"])
        .then(function (user) {
            console.log("inserting pin:");
            return db.one('insert into pin (user_id, pin) values ($1, $2) returning id',[user.id, pin])
                .then(function () {
                    return user.id;
                }, function(error){
                    console.log("error inserting pin")
                    throw error;
                });
        }, function(error){
            console.log("error inserting user");
            throw error;
        });

}).then(function(data){
    console.log("SUCCESS",data); // printing successful transaction output 
}, function(reason){
    console.log("FAIL", reason); // printing the reason why the transaction was rejected 
});

The transaction documentation reads: "4. Executes COMMIT, if the callback resolves, or ROLLBACK, if the callback rejects". So it appears that the "insert into user ..." query is committed when the corresponding promise is resolved (in order to get the user id).

To be very clear, I want to rollback the "insert into user ..." query when the "insert into pin ..." query fails. I've tried a few different approaches but so far I've had no luck. I'm sure there's a way to accomplish this but I could use some guidance.

I'm using pg-promise v.1.7.8 with PostgreSQL v9.4.

pg-promise is awesome, keep up the good work!
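Based on the transaction rules documented earlier in this README (all queries inside db.tx must go through the provided transaction context t, not through db), a corrected sketch would look like this (pin is assumed to be defined in the surrounding scope, as in the original):

// Sketch: queries inside the transaction use the context `t`,
// so they run on the same connection and roll back together
db.tx(t => {
    return t.one('insert into user (name) values ($1) returning id', ['example'])
        .then(user => {
            return t.one('insert into pin (user_id, pin) values ($1, $2) returning id', [user.id, pin])
                .then(() => user.id);
        });
})
    .then(data => {
        console.log('SUCCESS', data);
    })
    .catch(reason => {
        console.log('FAIL', reason);
    });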

Wildcard in query

How can I perform a wildcard search?

db.manyOrNone("select * from items where name like $1", "%Joe's%")

I know I can use '%$1^%', to avoid escaping the wildcard character, but that also leaves the quotes unescaped. I want to sanitize my input and use wildcards.
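One approach, using the regular value formatting documented earlier in this README, is to build the pattern in code and pass it in as a normal, fully escaped value (a sketch, reusing the table/column from the question):

// Sketch: build the LIKE pattern in code, then pass it as a regular value,
// so it gets escaped and quoted like any other string
const search = "Joe's";
await db.manyOrNone('select * from items where name like $1', ['%' + search + '%']);
//=> select * from items where name like '%Joe''s%'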

Connecting with "postgres://" does not work as it expects an object

The README says that you can use either an Object or a String to connect, but the following code is not working:

var db = pgp("postgresq://user:password@host:port/database");

getting a run-time exception:

        throw new Error("Invalid parameter 'options' specified.");

Switching to an Object representation worked like a charm.

Checking for no data found with db.one("select ...")

How can I check for a "no data found" or "too many rows" error when using the db.one("select ...") style of query? At the moment I am catching an error; if it's a real syntax error I get a complex object, but for no data found it's just a string saying "No data returned from the query.". It would be nicer if there were an error object with a code and message for no data found.
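A sketch of one workaround, using the result-specific methods described earlier in this README: oneOrNone resolves with null instead of rejecting when no rows are returned.

// Sketch: avoid the no-data rejection entirely
const row = await db.oneOrNone('select * from users where id = $1', 123);
if (row === null) {
    // no data found
}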

disconnect

Hello!
I don't know how to contact you, so I'm writing here.
A question:
Why does query call disconnect immediately after executing a query against the database?

Here is the output from the monitor:

16:28:23 connect(cardinal@cardinal)
16:28:23
                insert into
                individual(first_name, surname, patronymic)
                values('fdsfds', 'fdsfds', 'fdsfds')
                returning *

16:28:23 disconnect(cardinal@cardinal)
16:28:23 connect(cardinal@cardinal)
16:28:23 select * from contractor where contractor_id = 105
16:28:23 disconnect(cardinal@cardinal)
16:28:23 connect(cardinal@cardinal)
16:28:23 select * from document where document_id = 124
16:28:23 disconnect(cardinal@cardinal)
16:28:23 connect(cardinal@cardinal)
16:28:23
                update document set
                notes = 'rewrwer',
                date_start = '2015-09-30',
                number = 'rewrwer'
                where document_id = 124

16:28:23 disconnect(cardinal@cardinal)

I would like it to connect once and disconnect once.
Or better yet, not disconnect at all until I tell it to.

And here is the code:

    database
        .oneOrNone(`
            insert into 
            individual(first_name, surname, patronymic) 
            values($/individual_first_name/, $/individual_surname/, $/individual_patronymic/) 
            returning *
        `, req.body)
        .then(function(individual) {
            return database.oneOrNone("select * from contractor where contractor_id = $/contractor_id/", individual).then(function(contractor) {
                return {
                    main: individual,
                    contractor: contractor
                };
            });
        })
        .then(function(individual) {
            return database.oneOrNone("select * from document where document_id = $/document_id/", individual.contractor).then(function(document) {
                individual.document = document;
                return individual;
            });
        })
        .then(function(individual) {

            individual.document["notes"] = req.body["document_notes"];
            individual.document["date_start"] = req.body["document_date_start"];
            individual.document["number"] = req.body["document_notes"];

            return database.none(`
                update document set 
                notes = $/notes/, 
                date_start = $/date_start/, 
                number = $/number/ 
                where document_id = $/document_id/
            `, individual.document);


        })
        .then(function() {
            res.send({
                success: true,
                data: req.body
            });
        })
        .catch(function(error) {
            console.log("Произошла ошибка при вставке нового физического лица:", error);
        });

Reject with an error instead of string when a query mask is set

When a query mask (e.g. one) is set, the promise is rejected with the string "No data returned from the query.". This makes checking for errors very cumbersome and does not play well with the typical express/restify/whatever style:

.catch(err => {
  if (typeof err === 'string') return next(new Error(err));
  next(err);
});

There are several reasons why you should not reject with plain strings. It would be nice to have custom error objects like UnexpectedResultError

.catch(err => {
  if (err instanceof pgp.UnexpectedResultError) {
    return res.redirect('/somewhere');
  }
  next(err);
});

What is your guys opinion on that?

This relates to #37.
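For reference, a sketch of what such handling could look like, assuming a library version that exposes typed errors via pgp.errors (QueryResultError and queryResultErrorCode); somePromise, res and next below are placeholders from the express-style examples above:

// Sketch, assuming typed query-result errors are available in pgp.errors
const {QueryResultError, queryResultErrorCode} = pgp.errors;

somePromise.catch(err => {
    if (err instanceof QueryResultError && err.code === queryResultErrorCode.noData) {
        return res.redirect('/somewhere');
    }
    next(err);
});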

Inserting fails with suspect errors in postgresql log

NodeJS v0.10.25 (ubuntu package: nodejs 0.10.25~dfsg2-2ubuntu1 )
( libpq5:amd64 9.4.4-0ubuntu0.14.10 but thats not used afaik? )
against postgresql-9.4.3 server
npm installed:
[email protected]

var columns = [
"AGE",
"I_am_going_to_concerts",
"I_am_working_in_field",
"I_like_books",
"I_like_movies",
"I_like_music",
"I_like_specialties_from_kitchen",
"I_like_watching_movie",
"I_most_enjoy_good_food",
"I_mostly_like_listening_to_music",
"art_culture",
"body",
"body_type",
"cars",
"children",
"companies_brands",
"completed_level_of_education",
"completion_percentage",
"computers_internet",
"education",
"eye_color",
"favourite_color",
"fun",
"gender",
"hair_color",
"hair_type",
"health",
"hobbies",
"hobbies_interests",
"last_login",
"life_style",
"love_is_for_me",
"marital_status",
"more",
"movies",
"music",
"my_active_sports",
"my_eyesight",
"my_partner_should_be",
"my_passive_sports",
"on_pokec_i_am_looking_for",
"pets",
"politics",
"profession",
"public",
"region",
"registration",
"relation_to_alcohol",
"relation_to_casual_sex",
"relation_to_children",
"relation_to_smoking",
"relationships",
"science_technologies",
"sign_in_zodiac",
"spoken_languages",
"sport",
"the_idea_of_good_evening",
"travelling"
];
The table is created with this statement:
db.query('create table test_table (id VARCHAR PRIMARY KEY,' + columns.join(" VARCHAR, ") + ' VARCHAR)')

The insert statement is created like this:
var insertColumns = ['id'];
var binds = ['$1'];

for (var i in columns) {
  insertColumns.push(columns[i]);
  var j = parseInt(i, 10) + 2;
  binds.push('$'+ j);
}
module.exports.singleWriteQuery = 'insert into Profiles_temp  ('+ insertColumns.join(', ') + ') values(' + binds.join(',')+')' ;

And later on executed with multiple datasets. This one fails:
[ 'P/25720',
15,
'zriedkavo. :|',
'studujem, ale budem pracovat v zdravotnictve. :)',
'len a len: twilight saga. <3 -',
'komedie, fantasy, rodinne, romanticke. ^^',
'vsetky. :p',
null,
'u priatela, priatelky, v kine. :)',
null,
'na posteli, v aute, kedykolvek a kdekolvek, na dobru noc. ^^',
null,
'172 cm',
null,
null,
'v buducnosti chcem mat velaaa. -',
null,
'zakladne. :d',
59,
null,
null,
'nadherne modre. -',
'biela, cervena, hneda, cierna, modra, zelena. <3',
null,
0,
'hnede, ale chcem to odfarbit na rysave. -',
'dlhe. niekedy rovne, niekedy kucerave. :p',
null,
'cestovanie, pozeranie filmov, nakupovanie, stanovanie, fotografovanie, surfovanie po webe, pocuvanie hudby, diskoteky, kino, priatelia. -',
null,
'2012-05-12 22:14:00.0',
null,
'nie je nic lepsie, ako byt zamilovana. :$',<--------------this quote vanishes!
'cakam na zazrak. :d',
null,
null,
null,
null,
'vyborny. ;)',
'moj najlepsi priatel. ;)',
null,
'dobreho priatela, priatelku. :)',
'psy. ^^',
null,
'este nic. :d ale budem zdravotna sestra. :p',
0,
'trnavsky kraj, galanta',
'2011-11-23 00:00:00.0',
'hmm.. len niekedy pijem. :d',
null,
null,
'nefajcim. ^^',
null,
null,
'blizenci. :)',
'madarsky!, slovensky, nemecky, anglicky. :d',
null,
'zhasnut svetla a meditovat, pocuvat dobru hudbu, surfovat na sieti a chatovat, ist do kina, pozerat dobry film v tv. :)',
null ]

Support for Array as column value

Hi again,

It is stated in the readme that:

pg-promise automatically converts all basic javascript types
(text, boolean, date, number and null) into their Postgres format.

Is there any possibility of also supporting arrays, declared in postgres like:

xpto int[],

Currently trying to pass an array gives a:

Cannot convert type 'object' of parameter with index ...

Thanks!

Nested transaction behavior - maybe incorrect?

I have a situation that kind of looks like this... (stole this from the pg-promise tests)

db.tx(function (t1) {
    context1 = this;
    return this.none('update users set login=$1', ['External']).then(function () {
        return context1.tx(function (t2) {
            return t2.none('update users set login=$1', ['External']) 
        });
    }).then(function () {
        return context1.one('select * from unknowntable') // emulating a bad query;
    })
})

The premise: it creates a transaction, then does something, then creates an inner transaction that does something else, then finishes the inner transaction, then does something else which causes the outer transaction to roll back. I know it's a contrived situation, but when you have situations involving other dependent methods, the composition does make sense. This is just a simplified form.

The pg-monitor output looks like this...

12:21:30 tx/start
12:21:30 tx: begin
12:21:30 tx: update users set login='External'
12:21:30 tx/start
12:21:30 tx: begin
12:21:30 tx: select * from users
12:21:30 tx: commit
12:21:30 tx/end; duration: .004, success: true
12:21:30 tx: select * from unknowntable
12:21:30 error: relation "unknowntable" does not exist
tx: select * from unknowntable
12:21:30 tx: rollback
12:21:30 tx/end; duration: .013, success: false

You can see in the bolded line that pg-promise finishes the inner transaction and issues a commit. This is incorrect behaviour, as there is an outer scope in play, and PG does not support nested transactions. This commit actually commits the initial outer-scope update, as well as committing the inner scope. Shouldn't it hold the commit until the outer transaction completes?

Any advice on this situation? Thanks!

The new locks feature introduced in 1.10.3 made unit-test mocking/stubbing impossible

When the locks feature was introduced, it also added a side effect: mocking libraries like Sinon.js can no longer create mocks from the db object. The mocking library now throws "TypeError: Cannot redefine property:", because locking also sets the configurable property to false for functions. Here is a test case that worked with at least version 1.4.4, but is now broken with that TypeError on the latest 1.10.7:

var chai = require("chai");
var assert = chai.assert;
var sinon = require("sinon");
var pgpLib = require('pg-promise');

describe("Database query", function(){

  it("health check executed correct", function(){
    var pgp = pgpLib();
    var cn = {
      host: 'localhost', // server name or IP address;
      port: 5432,
      database: 'my_db_name',
      user: 'user_name',
      password: 'user_password'
    };
    var db = pgp(cn);
    var stub = sinon.stub(db, "one");
    var sqlString = "SELECT $1::int AS number";
    var params = ["1"];
    db.one("SELECT $1::int AS number", ["1"]);
    assert.isTrue(stub.calledWith(sqlString, params), "one should have been called");
    stub.restore();
  });
});

UPDATE and INSERT convenience methods

Hey there,

I am switching from mysql to postgres, and am having to rewrite some of my inserts and updates. I was wondering if there are convenience methods I'm missing that would help simplify building these queries. For example, the mysql library allowed this kind of syntax:

var insertUserQuery = 'INSERT INTO user SET ?';

and then:

mysql.query(insertUserQuery, userObject, callback);

but doing it with pg-promise can become cumbersome if you're working with an object that has a lot of properties, especially if you're doing an update with an object and aren't updating all of the fields.

Thanks.
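A sketch of what this can look like with the helpers namespace mentioned earlier in this README, assuming the installed version provides pgp.helpers.insert and pgp.helpers.update (the table/columns below are made up):

// Sketch: generate INSERT/UPDATE statements from an object's properties
const user = {name: 'John', email: 'john@example.com'};

// roughly: INSERT INTO "users"("name","email") VALUES('John','john@example.com')
const insertQuery = pgp.helpers.insert(user, null, 'users');

// roughly: UPDATE "users" SET "name"='John',"email"='john@example.com' WHERE id = 123
const updateQuery = pgp.helpers.update(user, null, 'users') + pgp.as.format(' WHERE id = $1', [123]);

await db.none(insertQuery);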

Error after Promise upgrade to 7.0.0

Everything was working until yesterday. Any ideas on what broke?

Users/USER/Documents/APPLICATION/api/node_modules/pg-promise/node_modules/promise/node_modules/asap/asap.js:45
throw e;
^
ReferenceError: reason is not defined
at /Users/USER/Documents/APPLICATION/api/app/routes/home.js:8:19
at /Users/USER/Documents/APPLICATIONUSER/Documents/APPLICATION/api/node_modules/pg-promise/node_modules/promise/node_modules/asap/asap.js:27:13)
at process._tickCallback (node.js:355:11)
[23:24:18] 'dev:app' errored after 19 s
[23:24:18] Error in plugin 'gulp-shell'
Message:
Command node app/app.js failed with exit code 1
Error running task sequence: { task: 'dev:app',
message: 'dev:app stream',
duration: 18.899271645,
hrDuration: [ 18, 899271645 ],
err:
{ [Error: Command node app/app.js failed with exit code 1]
message: 'Command node app/app.js failed with exit code 1',
showStack: false,
showProperties: true,
plugin: 'gulp-shell',
__safety: { toString: [Function] } } }

data with $s in quoted strings

I love this library and hope I wasn't too sarcastic with this bug last time :)
I've got a couple of issues with this format function:
object: function (query, obj) {
var pattern = /$(?:({)|(()|(<)|([)|(/))\s_[a-z0-9$]+^?\s(?:(?=\2)(?=\3)(?=\4)(?=\5)}|(?=\1)(?=\3)(?=\4)(?=\5))|(?=\1)(?=\2)(?=\4)(?=\5)>|(?=\1)(?=\2)(?=\3)(?=\5)]|(?=\1)(?=\2)(?=\3)(?=\4)/)/gi;
etc.....

It looks like it limits replacing to only 5 params, which would break when you have 6 and some text in the string such as "I will pay you $6 for that used bus ticket".
I'm a bit new to git, so I don't want to do a pull and push or whatever; can I suggest a better way instead?

object: function (query, obj) {
    return query.split(/(\$[0-9]+)/g).map(function (val) {
        var mat = val.match(/^\$([0-9]+)$/);
        if (mat) { return obj[mat[1] - 1]; }
        return val;
    }).join('');
etc... (with your raw stuff)

so with a call like so
var values = ['first', 'second', 'third', 'fourth', 'fifth', 'i bought this house for $6', 'seventh', 'eighth'];
var text = 'update username="$1" where id=$2 and email="$3" and addr1="$4" and addr2="$5" and addr3="$6" and addr4="$7" and addr5="$8"';
return db.query(text, values);

it would split the query string like so
[ 'update username="',
'$1',
'" where id=',
'$2',
' and email="',
'$3',
'" and addr1="',
'$4',
'" and addr2="',
'$5',
'" and addr3="',
'$6',
'" and addr4="',
'$7',
'" and addr5="',
'$8'
]
map the array to
[ 'update username="',
'first',
'" where id=',
'second',
' and email="',
'third',
'" and addr1="',
'fourth',
'" and addr2="',
'fifth',
'" and addr3="',
'i bought this house for $6',
'" and addr4="',
'seventh',
'" and addr5="',
'eighth'
]
avoiding any clashes with quoted values with a '$' in them
then join it to
'update username="first" where id=second and email="third" and addr1="fourth" and addr2="fifth" and addr3="i bought this house for $6" and addr4="seventh" and addr5="eighth"'
which wouldn't blow up with an address line3 of "i bought this house for $6" :)

thanks Zeus

p.s. the nice thing about pg is that you can call a query with an object
pg.query({text: 'some query', values: []})
and the nice thing about squel is the toParam() method returns that object and can be passed direct to pg.
var qobj = squel....
pg.query(qobj);
would be nice to do the same with pg-promise :)
not sure how easy that would be?

Formatting handling for functions

At the moment, when methods func and proc are called with invalid parameters that cannot be formatted, they just throw the formatting error, which is inconsistent with how the rest of the query methods behave:
those handle any formatting error, reject with that error, and send a notification to the global error handler. Calls into functions should be modified to do the same, i.e. made consistent with the other calls.

This is a very minor issue, so marked as enhancement.

Tests

Are you thinking about including tests?

Can I help you with this?

If you start, I'll try to help.

I'd like to use the new query syntax on prepared statements

I tried it, but it didn't work:

db.one({
  name: 'test-query',
  text: 'select id from file where id = ${id}',
  values: {
    id: 1,
  }
}).then(function(row){console.log(row)}).catch(function(err){console.log(err)})
error: syntax error at or near "$"

I see the documentation specifically says "Using the syntax supported by node-postgres". I guess that's my only choice? I guess this is a bit harder to implement, but it's just an argument mapping, so it should be possible...

WHERE col IN (values)

This is the example given in the docs:

var data = [1, 2, 3, 4];
db.query("SELECT * FROM table WHERE id IN ($1^)", pgp.as.csv(data))
    .then(function (data) {
        console.log(data); // print data;
    }, function (reason) {
        console.log(reason); // print error;
    });

Is there a way to apply this to arrays of string values? If done like above, the strings end up not being surrounded in quotes.

Thanks!
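A sketch using the :csv filter documented earlier in this README, which formats each array element according to its JavaScript type, so strings keep their quotes:

// Sketch: :csv quotes string elements automatically
const names = ['John', 'Mike'];
await db.any('SELECT * FROM table WHERE name IN ($1:csv)', [names]);
//=> SELECT * FROM table WHERE name IN ('John','Mike')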

In POST request, unknown colums have values

How can I structure a .post using pg-promise when I have an unknown number of columns with values? I don't know how many of $1, $2, $3, ... I should put, and which maps to which column.

I am building my api server using expressjs and postgres.
My postgres table is

$=> \d+ locations
                           Table "$.locations"
       Column        |            Type             |           Modifiers
---------------------+-----------------------------+------------------------------
 location_id         | integer                     | not null default nextval ...
 short_name          | character varying(100)      | not null
 long_name           | character varying(300)      |
 physical_address1   | character varying(300)      | not null
 physical_address2   | character varying(300)      |
 physical_city       | character varying(100)      | not null
 physical_state      | character varying(20)       |
 physical_postalcode | character varying(20)       |
 physical_country    | character varying(100)      | not null
 physical_latitude   | numeric(12,0)               |
 physical_longitude  | numeric(12,0)               |
 mailing_address1    | character varying(300)      |
 mailing_address2    | character varying(300)      |
 mailing_city        | character varying(100)      |
 mailing_state       | character varying(20)       |
 mailing_postalcode  | character varying(20)       |
 mailing_country     | character varying(100)      |
 created_at          | timestamp(6) with time zone | default now()
 modified_at         | timestamp(6) with time zone |
Indexes:
    "locations_pkey" PRIMARY KEY, btree (location_id)

My post block looks like

router.post('/locations', function(req, res){
     // convert req.body json object into array of values
     var locationVar = Object.keys(req.body).map(function(index) { return req.body[index]; });
     db.one("insert into findnow.locations " +
                 "values ($1, $2, $3, $4) "
                 +"returning location_id "
                 , locationVar)
         .then(function (data) {
             console.log(data.location_id);
         }, function (reason) {
             console.log('Insert into /locations failed for ', reason); // print error;
         })
         .done(function () {
             pgp.end(); // closing the connection pool, to exit immediately.
         });
  });

My issue is that the user may provide only some of these values, and I'd like to still accept the POST. How can I structure my code so that I don't have to test for what is being sent?

Thanks so much in advance.

Raj
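
A hedged sketch of how this can be approached in current versions of the library via the helpers namespace: a ColumnSet with per-column defaults lets the client send any subset of the optional columns, while the required ones stay required (the split between required and optional columns below is an assumption):

// Sketch only: column choices and defaults are illustrative.
var cs = new pgp.helpers.ColumnSet([
    'short_name',
    {name: 'long_name', def: null},
    'physical_address1',
    {name: 'physical_address2', def: null},
    'physical_city',
    {name: 'physical_state', def: null},
    'physical_country'
], {table: {table: 'locations', schema: 'findnow'}});

router.post('/locations', function (req, res) {
    // generates the full INSERT from whatever subset of columns was posted
    var insert = pgp.helpers.insert(req.body, cs) + ' RETURNING location_id';
    db.one(insert)
        .then(function (data) {
            res.json({location_id: data.location_id});
        })
        .catch(function (error) {
            res.status(500).json({error: error.message || error});
        });
});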

Oh, I didn't know you would delete it all!

Ugh, is there no way to restore it??? If I had known you would delete the question, I would have copied it... there was a lot of information in there that I hadn't absorbed yet(((

Export queryResult

Query Result Mask mentions that queryResult can be used to construct a bitmask to qualify expectations about a query result.

I know it's trivial to just use the integer values directly, but I think it would be nice if queryResult was exported from the library, so that it can be used when calling query().
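
A hedged sketch of what that could look like, assuming queryResult is exposed on the initialized library object (as pgp.queryResult in current versions):

// Sketch: combine flags into a Query Result Mask instead of using raw integers.
var qr = pgp.queryResult; // {one: 1, many: 2, none: 4, any: 6}
db.query('select * from users where id = $1', [123], qr.one | qr.none)
    .then(function (user) {
        console.log(user); // a single row, or null - same as oneOrNone
    });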

Resolve a transaction with intermediate data

Hi Vitaly,

First of all, let me say a big thank you for the time and effort you've spent creating this library. The documentation is well-written. I'm surprised to see so few stars, but I'm sure it will gain a lot of traction in the near future.

I have one specific use case which I can't seem to resolve. I have a detached transaction starting with a query returning one result and several subsequent queries which make use of that result. It seems like a simple task but I couldn't find guidance in any of the examples. I'm not a promise expert either so that doesn't help. Here's an example of what I have:

db.tx(function (t) {
    return t.one('INSERT INTO table1(col1) VALUES($1) RETURNING id', [123])
        .then(function (data) {
            return promise.all([
                t.none('INSERT INTO table2(col1, col2) VALUES($1, $2)', [data.id, "John"]),
                t.none('INSERT INTO table2(col1, col2) VALUES($1, $2)', [data.id, "Mary"])
            ]);
        });
})
    .then(function (data) {
        // Success, do something with data...
        return res.status(200).json(data);
    }, function (reason) {
        // Error
        return res.json(reason);
    });

As you can see, I use the result from the first query in subsequent queries. I also use this result after the transaction resolves successfully. What am I doing wrong here?
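
Nothing is wrong with the connection or transaction logic as such; the issue is only what the transaction resolves with: promise.all resolves with the results of the two t.none calls (both null), not with the first query's data. A hedged sketch of one way to pass the intermediate result out:

// Sketch: return the first query's result at the end of the chain,
// so it becomes what the whole transaction resolves with.
db.tx(function (t) {
    return t.one('INSERT INTO table1(col1) VALUES($1) RETURNING id', [123])
        .then(function (data) {
            return promise.all([
                t.none('INSERT INTO table2(col1, col2) VALUES($1, $2)', [data.id, 'John']),
                t.none('INSERT INTO table2(col1, col2) VALUES($1, $2)', [data.id, 'Mary'])
            ]).then(function () {
                return data; // <- this is what db.tx resolves with
            });
        });
})
    .then(function (data) {
        return res.status(200).json(data);
    }, function (reason) {
        return res.json(reason);
    });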

Access to ctx.db after disconnect()

Not quite sure about the circumstances, but I'm getting TypeError: Cannot read property 'client' of null in index.js:430.

ctx.db is null, probably because disconnect() was already called.

data with $s in quoted strings

I was getting a lot of problems with my passwords becoming invalid. I finally tracked it down to parameterised queries. I am using squel...toParam(), which returns something like this:
text: "UPDATE users SET password = $1 WHERE (id = $2) AND (resetpass = $3) RETURNING *"
values[0]: "$2a$08$4ImbUVRRASc1xlNFQ"
values[1]: 3
values[2]: "4ybcAQN8"
Then calling pg-promise like so:
return db.query(text, values);
mangles the password into
3a$08$4ImbUVRRASc1xlNFQ
i.e. the initial $2 in the password has been replaced by the '3' from values[1].

You shouldn't be replacing $n in quoted strings.
Imagine an
UPDATE bids SET text = $1 WHERE id = $2
values[0] "I agree to pay you the unreservedly binding sum of $2.00 for this used bus ticket"
values[1] 34856

Thanks Zeus

Add support for pg-native

Hi, did you consider adding support for pg-native? Since it's meant to be a drop-in replacement for pg, a flag in the init phase should do the trick.
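
A hedged sketch of what such a flag looks like at initialization in current versions (treat the option name as an assumption if you are on an older release; the connection string is hypothetical):

// Sketch: enable the pg-native bindings when initializing the library.
var pgp = require('pg-promise')({pgNative: true});
var db = pgp('postgres://user:password@localhost:5432/mydb');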

Nested transactions

At the moment the library supports only one-level transactions. A good few changes are needed to accommodate the logic of nested transactions.

Since nobody has complained about this being missing, it is considered an enhancement, to be future-proof.

And if somebody needs it already, please vote up.

money[] not being converted into JavaScript array

Not sure if this is an issue for pg-promise or pg. Columns of the type "money[]" are not being converted into JavaScript arrays. For example, given the following table and data:

CREATE TABLE public."MoneyArrayTest"
(
  moneys money[],
  int_array integer[]
);

INSERT INTO public."MoneyArrayTest" (moneys, int_array)
VALUES ('{2.50, 1.99, 200.00}', '{1,2,3}')

Executing the following:

db.any('select * from "MoneyArrayTest"').then(function (results) {
    console.log('results: ', results);
});

Will return this:

[
     {
          "moneys": "{$2.50,$1.99,$200.00}",
          "int_array": [ 1, 2, 3 ]
     }
]
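
A hedged workaround sketch: the underlying driver simply has no built-in parser for the money[] type, so it returns the raw text; a custom parser can be registered through pgp.pg.types. The OID 791 for money[] is an assumption to verify against your pg_type catalog:

// Sketch: register a custom type parser for money[] (OID 791 - verify for your server).
pgp.pg.types.setTypeParser(791, function (value) {
    if (value === null) {
        return null;
    }
    // '{$2.50,$1.99,$200.00}' -> [2.5, 1.99, 200]
    return value.slice(1, -1).split(',').map(function (s) {
        return parseFloat(s.replace(/[^0-9.\-]/g, ''));
    });
});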

Ready for production?

There are many different Postgres promise libraries out there; is this one ready for production use? Also, why would I want to use this library over something like http://knexjs.org/? These types of questions might be worth addressing in the readme or wiki.

Support named parameters

It'd be nice to be able to specify named parameters in the query that reference properties of an object passed in place of the array of values that would normally map to $1, $2, etc.

db.one("select * from users where id=:id", {id: 123}) // find the user from id;
    .then(function(data){
        // find 'login' records for the user found:
        return db.query("select * from audit where event=:event and userId=:userId",
        {event: "login", userId: data.id});
    })
    .then(function(data){
        // display found audit records;
        console.log(data);
    }, function(reason){
        console.log(reason); // display reason why the call failed;
    })

I created a small wrapper for the plain postgres module to do this for my internal project. I really like how your library looks and think this would be an excellent addition to essentially make my project obsolete.

https://github.com/joeandaverde/tinypg
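
For reference, a hedged sketch of the Named Parameters syntax the library supports (using ${} rather than :name), assuming a reasonably current version:

// Sketch: object properties are referenced by name via ${...} in the query text.
db.one('select * from users where id = ${id}', {id: 123})
    .then(function (user) {
        // find 'login' records for the user found:
        return db.any('select * from audit where event = ${event} and userId = ${userId}',
            {event: 'login', userId: user.id});
    })
    .then(function (records) {
        console.log(records);
    })
    .catch(function (reason) {
        console.log(reason);
    });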

json support?

Sorry to ask this prematurely as I have not tried to use the library yet. But is there json support? Should I just treat a json column as normal and just use JSON.stringify/parse?
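
A hedged note with a sketch: json/jsonb values generally don't need manual JSON.stringify/parse; the driver returns json columns as parsed objects, and on the write side the formatter offers an explicit :json modifier. Table and column names below are hypothetical:

// Sketch: serialize an object explicitly into a json column.
var doc = {name: 'test', tags: ['a', 'b']};
db.none('INSERT INTO documents(data) VALUES($1:json)', [doc]);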

Named Parameters with template strings collision

Since named parameters use the same syntax as ES6 template-string variables, there is a collision if you want to use both.

This will throw an error:

db.query(`select * from users where name=${name} and active=${active}`, {
    name: 'John',
    active: true
});

One reason why you would want to use template strings in this case is multiline support without concatenating.

So you can write, for example:

`
select * from users
where name = ${name}
and active = ${active}
`

instead of

"select * from users" +
"where name = ${name}" +
"and active = ${active}" 

INSERT batch w/no transaction

I have a function that processes large CSV data files (>1GB). The code below errors out due to memory exhaustion. Basically, I'm creating a promise (db.none) asynchronously for each CSV record, which is not good. Any recommendations? In my case, I don't need transaction support, although I do want to report on failed rows.

return promise.map(this.mappedFiles, function(file) {
  var stream = fs.createReadStream(file.file_name);
  return api.parsers.processCsvStream(stream, function(data) {
    return db.none('INSERT ...', data).then(function(result) { return result; });
  })
});
api.parsers.processCsvStream = function(passedStream, processor) {
  if(!(passedStream instanceof stream.Stream)) {
    return promise.reject('No passedStream received.');
  }

  if(!_.isFunction(processor)) {
    return promise.reject('No processor received.');
  }

  var parser = csv.parse(passedStream, {trim: true});

  parser.on('readable', function() {
    var data;
    while(data = parser.read()) {
      processor(data);
    }
  })

  passedStream.pipe(parser);

  return new Promise(function(resolve, reject) {
    parser.on('end', resolve);
    parser.on('error', reject);
  });
}
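
A hedged sketch of one way to keep memory bounded (assuming Node.js with async-iterable streams and the same csv/db objects as above): pull rows one at a time with for-await, so each INSERT settles before the next row is read, and collect failed rows for reporting instead of aborting:

// Sketch only: backpressure via async iteration over the csv parser stream.
api.parsers.processCsvStream = async function (passedStream, processor) {
    const parser = passedStream.pipe(csv.parse({trim: true}));
    const failed = [];
    for await (const row of parser) {
        try {
            await processor(row); // e.g. row => db.none('INSERT ...', row)
        } catch (error) {
            failed.push({row: row, error: error}); // report failed rows, keep going
        }
    }
    return failed; // resolves when the stream ends
};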

Multiple Databases

The initial library structure doesn't allow for connecting to multiple databases. It is not a frequent need, hence the omission.

This will require a small breaking change that will be delivered initially in version 0.2.0.
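
For reference, a hedged sketch of what multi-database usage looks like in current versions (connection strings are hypothetical):

// Sketch: one initialized library instance, multiple independent Database objects.
var pgp = require('pg-promise')();
var dbMain = pgp('postgres://user:password@localhost:5432/main');
var dbAudit = pgp('postgres://user:password@localhost:5432/audit');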

Add `query` event

Add a query event to the library's options, as an event handler for whenever a query is about to execute. This is just for debugging/logging purposes.
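
A hedged sketch of what such an option looks like at initialization (the query event exists in current versions; the handler body and connection string are just an illustration):

// Sketch: log every query about to be executed.
var initOptions = {
    query: function (e) {
        console.log('QUERY:', e.query);
    }
};
var pgp = require('pg-promise')(initOptions);
var db = pgp('postgres://user:password@localhost:5432/mydb');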
