denodrivers / postgres
PostgreSQL driver for Deno
Home Page: https://denodrivers.github.io/postgres
License: MIT License
One of the really nice features of PostgreSQL is range types:
tsrange — Range of timestamp without time zone
tstzrange — Range of timestamp with time zone
daterange — Range of date
int4range — Range of integer
int8range — Range of bigint
numrange — Range of numeric
https://www.postgresql.org/docs/current/rangetypes.html
It would be nice to add support for them. I have time to work on this, but it probably requires a bit of discussion on the TS types we would want to create for this. @bartlomieju?
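As a starting point for that discussion, here is one purely hypothetical TypeScript shape for a decoded range, together with a parser for the text form of int4range. The PgRange name and the parser are illustrative assumptions, not existing API, and special cases such as the "empty" range are not handled:

```typescript
// Hypothetical shape for a decoded Postgres range value; null bounds
// represent infinite bounds.
interface PgRange<T> {
  lower: T | null;
  upper: T | null;
  lowerInclusive: boolean;
  upperInclusive: boolean;
}

// Parse the text form of an int4range, e.g. "[1,10)", into this shape.
// Does not handle the special "empty" range.
function parseInt4Range(text: string): PgRange<number> {
  const lowerInclusive = text[0] === "[";
  const upperInclusive = text[text.length - 1] === "]";
  const [lo, hi] = text.slice(1, -1).split(",");
  return {
    lower: lo === "" ? null : parseInt(lo, 10),
    upper: hi === "" ? null : parseInt(hi, 10),
    lowerInclusive,
    upperInclusive,
  };
}
```

The same shape would work for tsrange/tstzrange/daterange by substituting Date for the type parameter.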
When selecting from a char(n) column.
We don't know the type of arguments that are passed to query, and right now it is reflected using the any[] type.
export interface QueryConfig {
text: string;
args?: any[];
name?: string;
encoder?: (arg: any) => EncodedArg;
}
This should be replaced with Array<unknown>
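A sketch of how the interface could look after the change (the EncodedArg alias is assumed here to keep the snippet self-contained). The payoff of unknown over any is that callers must narrow an argument before using it:

```typescript
// Assumed alias for self-containment; not necessarily the library's exact type.
type EncodedArg = null | string | Uint8Array;

interface QueryConfig {
  text: string;
  args?: Array<unknown>;
  name?: string;
  encoder?: (arg: unknown) => EncodedArg;
}

// With `unknown`, code handling an argument must narrow its type first:
function describeArg(arg: unknown): string {
  if (typeof arg === "number") return `number:${arg}`;
  if (typeof arg === "string") return `string:${arg}`;
  return "other";
}
```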
This repo should have a format.ts file to format all files in the repo, as well as a CI check that fails if files are unformatted. deno_std has such a check.
I’m wondering if this library supports listen/notify or if it needs to be adapted slightly to. Any suggestions on the best approach on getting it working? I’ve used pg-notify in node and may need to port it for deno use if nothing exists yet.
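For reference, the wire-level piece a LISTEN/NOTIFY port would need is handling the backend's NotificationResponse ('A') message. Below is a minimal sketch of decoding its body, meaning the bytes after the tag and length fields: an Int32 sender PID followed by two NUL-terminated strings (channel name and payload). This is illustrative, not part of this library:

```typescript
// Decode the body of a Postgres NotificationResponse ('A') message:
// Int32 sender PID (big-endian), then channel and payload as C strings.
function parseNotification(
  body: Uint8Array,
): { pid: number; channel: string; payload: string } {
  const view = new DataView(body.buffer, body.byteOffset, body.byteLength);
  const pid = view.getInt32(0); // network byte order (big-endian)
  let i = 4;
  const readCString = (): string => {
    const start = i;
    while (body[i] !== 0) i++;
    const s = new TextDecoder().decode(body.subarray(start, i));
    i++; // skip the NUL terminator
    return s;
  };
  const channel = readCString();
  const payload = readCString();
  return { pid, channel, payload };
}
```

A full implementation would also need to issue LISTEN channel queries and dispatch parsed notifications to registered callbacks.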
Following up on pq's encode method, a decode is needed as well, which will convert Postgres encoded data to native TypeScript types. Currently it's a TODO in QueryResult._parseDataRow, but it should be moved to Connection.
As in #13, in the future QueryConfig should take a decoder argument that is a user-provided function for custom data types.
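A minimal sketch of what such a user-provided decoder could look like. The Decoder signature, the OIDs used, and the layering approach are assumptions for illustration, not the library's API:

```typescript
// Assumed signature: map a column's raw text value plus its type OID
// to a TypeScript value.
type Decoder = (value: string, typeOid: number) => unknown;

// Minimal built-in default: int4 (OID 23) becomes a number,
// everything else passes through as text.
const defaultDecode: Decoder = (value, typeOid) =>
  typeOid === 23 ? parseInt(value, 10) : value;

// A user decoder can layer on top: handle numeric (OID 1700) itself
// and delegate every other type to the default.
const withNumeric: Decoder = (value, typeOid) =>
  typeOid === 1700 ? Number(value) : defaultDecode(value, typeOid);
```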
When querying OID values, I run into the following error:
Error: Don't know how to parse column type: 2206
at decodeText (decode.ts:214:13)
at decode (decode.ts:222:12)
at QueryResult._parseDataRow (query.ts:35:24)
at QueryResult.handleDataRow (query.ts:47:28)
at Connection._simpleQuery (connection.ts:319:18)
at async Connection.query (connection.ts:529:14)
at async Client.query (client.ts:24:12)
at async mod.ts:12:18
mod.ts:
import {Client} from "https://deno.land/x/[email protected]/mod.ts";
const client = new Client({
database: "testdb",
host: "127.0.0.1",
port: "5432",
password: "postgres",
user: "postgres"
})
await client.connect();
try {
const result = await client.query(`SELECT
*,
('"' || "udt_schema" || '"."' || "udt_name" || '"')::"regtype" AS "regtype"
FROM
"information_schema"."columns"
WHERE
(
"table_schema" = 'public'
AND "table_name" = 'post'
)
`);
console.log(result.rows);
} catch (err) {
console.error(err);
}
$ deno run --allow-net ./mod.ts
Error: Don't know how to parse column type: 2206
at decodeText (decode.ts:214:13)
at decode (decode.ts:222:12)
at QueryResult._parseDataRow (query.ts:35:24)
at QueryResult.handleDataRow (query.ts:47:28)
at Connection._simpleQuery (connection.ts:319:18)
at async Connection.query (connection.ts:529:14)
at async Client.query (client.ts:24:12)
at async mod.ts:12:18
The post table is as follows:
CREATE TABLE "post"(
"id" integer NOT NULL,
"characterVarying" character varying(50) NOT NULL,
"varchar" character varying(50) NOT NULL,
"character" character(50) NOT NULL,
"char" character(50) NOT NULL,
CONSTRAINT "PK_be5fda3aac270b134ff9c21cdee" PRIMARY KEY("id")
);
I just wanted to leave some short information for anyone interested in this project.
It may seem to be half dead, but that's not the case. A huge chunk of my time is dedicated to work in the main Deno repo at the moment.
I still intend to put more work into this module, but that will happen after Deno 1.0 is released.
inet - IPv4 and IPv6 hosts and networks
cidr - IPv4 and IPv6 networks
macaddr - MAC addresses
I came to the conclusion that Deno.env should be read lazily, and only in case a connection parameter is missing. That will allow getting rid of the required --allow-env permission.
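A minimal sketch of the lazy lookup, using a hypothetical resolveParam helper that only touches the environment when the explicit parameter is missing. The optional-chaining guard is just to keep the sketch runnable outside Deno:

```typescript
// Resolve a connection parameter: explicit value wins, then the named
// environment variable, then a fallback. The environment is only read
// when the explicit value is missing, so --allow-env is only needed then.
function resolveParam(
  explicit: string | undefined,
  envVar: string,
  fallback: string,
): string {
  if (explicit !== undefined) return explicit; // no env access at all
  const g = globalThis as any;
  const fromEnv: string | undefined =
    g.Deno?.env?.get?.(envVar) ?? g.process?.env?.[envVar];
  return fromEnv ?? fallback;
}
```

Usage would follow libpq conventions, e.g. resolveParam(config.hostname, "PGHOST", "127.0.0.1").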
node-postgres's Result class has two properties as follows:
.rowCount
The number of rows processed by the last command.
.command
The command type last executed:
INSERT
UPDATE
CREATE
SELECT
etc.
I think these properties are useful for implementing some features, such as optimistic locking.
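For context, Postgres reports both pieces of information in the CommandComplete message tag (strings like "SELECT 3", "UPDATE 2", or "INSERT 0 5"). A sketch of deriving .command and .rowCount from that tag could look like this (hypothetical helper, not existing API):

```typescript
// Parse a CommandComplete tag such as "INSERT 0 5" or "SELECT 3".
// The first word is the command; the trailing number (when present)
// is the affected-row count. Tags like "CREATE TABLE" have no count.
function parseCommandTag(
  tag: string,
): { command: string; rowCount: number | null } {
  const parts = tag.split(" ");
  const command = parts[0];
  const last = parts[parts.length - 1];
  const rowCount =
    parts.length > 1 && /^\d+$/.test(last) ? parseInt(last, 10) : null;
  return { command, rowCount };
}
```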
The whole documentation is just a README file; better documentation should be provided as a documentation page.
The public API should be annotated with JSDoc, so an auto-generated API reference can be obtained.
rowsOfObjects() is completely broken:
tx.query(`
SELECT 'title 1' title
UNION ALL
SELECT 'title 2' title
UNION ALL
SELECT 'title 3' title
`)
returns
[{"title":"title 1"},{},{}]
That's because:
rowsOfObjects() {
return this.rows.map((row, index) => {
const rv: { [key: string]: any } = {};
this.rowDescription.columns.forEach(column => {
rv[column.name] = row[index];
});
return rv;
});
}
the index of the row is being used to choose the column. It should be:
rv[column.name] = row[column.index];
and the row index is completely unnecessary.
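Pulled out as a standalone function, the corrected logic could look like this (the Column shape with an index field is assumed from the snippet above):

```typescript
// Assumed column description: name plus positional index in each row.
interface Column {
  name: string;
  index: number;
}

// Map positional row arrays to keyed objects using the column's own
// index, not the row's index in the result set.
function rowsOfObjects(
  rows: unknown[][],
  columns: Column[],
): Record<string, unknown>[] {
  return rows.map((row) => {
    const rv: Record<string, unknown> = {};
    for (const column of columns) {
      rv[column.name] = row[column.index];
    }
    return rv;
  });
}
```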
Client's constructor should support a string argument with DSN configuration.
import { Client } from "https://deno.land/x/postgres/mod.ts";
new Client("postgres://username:password@host:port/database")
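One possible sketch of the parsing itself, using the standard URL class (a hypothetical helper, not the library's current API; error handling and query-string options are omitted):

```typescript
// Parse a postgres:// DSN into connection parameters using the WHATWG
// URL parser, which handles userinfo, host, port, and path for
// non-special schemes like postgres://.
function parseDsn(dsn: string) {
  const url = new URL(dsn);
  return {
    user: url.username,
    password: url.password,
    host: url.hostname,
    port: url.port || "5432", // default Postgres port when omitted
    database: url.pathname.slice(1), // strip the leading "/"
  };
}
```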
This document should contain information about strict TS mode as well as formatting.
Hi there! I wanted to propose adding the following badge to the README to indicate how many // TODO comments are in this codebase:
The badge links to tickgit.com, which is a free service that indexes and displays TODO comments in public GitHub repos. It can help surface latent work and be a way for contributors to find areas of code to improve that might not otherwise be documented.
The markdown is:
[![TODOs](https://img.shields.io/endpoint?url=https://api.tickgit.com/badge?repo=github.com/buildondata/deno-postgres)](https://www.tickgit.com/browse?repo=github.com/buildondata/deno-postgres)
Thanks for considering, feel free to close this issue if it's not appropriate or you prefer not to!
One thing I love about node-postgres is its streaming result sets. I have applications that select thousands of rows, and I don't want them all in a single result array. It's important to have a stream-like option that uses Postgres cursors to work through result sets.
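As a rough illustration of the shape such an API could take, here is a sketch of cursor-style batching as an async generator. Everything here is hypothetical: fetchBatch stands in for issuing FETCH n FROM cursor against an open transaction, and rows are yielded without materializing the full result set:

```typescript
// Yield rows batch by batch until the source is exhausted. `fetchBatch`
// abstracts over "FETCH <limit> FROM <cursor>"; an empty batch signals
// that the cursor is drained.
async function* streamRows<T>(
  fetchBatch: (limit: number) => Promise<T[]>,
  batchSize = 2,
): AsyncGenerator<T> {
  while (true) {
    const batch = await fetchBatch(batchSize);
    if (batch.length === 0) return;
    yield* batch;
  }
}
```

A real implementation would also DECLARE the cursor inside a transaction and CLOSE it when iteration stops early.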
Waiting for tearDown and setUp from denoland/deno_std#128.
Then we need to prepare methods to quickly set up new tables.
With https://github.com/denolib/setup-deno we'll be able to easily set up CI using GitHub Actions.
I'd prefer to move there over Travis; that should allow spawning multiple Postgres instances for testing.
When parsing connection parameters, if a value is not supplied it should be read from an environment variable.
$ deno run -r --allow-net --allow-env ./test.ts
Compile file:///home/uki00a/work/deno/test.ts
Download https://deno.land/x/[email protected]/mod.ts
Download https://deno.land/x/[email protected]/client.ts
Download https://deno.land/x/[email protected]/error.ts
Download https://deno.land/x/[email protected]/pool.ts
Download https://deno.land/x/[email protected]/connection.ts
Download https://deno.land/x/[email protected]/query.ts
Download https://deno.land/x/[email protected]/connection_params.ts
Download https://deno.land/x/[email protected]/deps.ts
Download https://deno.land/x/[email protected]/packet_writer.ts
Download https://deno.land/x/[email protected]/utils.ts
Download https://deno.land/x/[email protected]/packet_reader.ts
Download https://deno.land/[email protected]/io/bufio.ts
Download https://deno.land/[email protected]/io/util.ts
Download https://deno.land/[email protected]/util/async.ts
Download https://deno.land/x/[email protected]/mod.ts
Download https://deno.land/[email protected]/testing/asserts.ts
Download https://deno.land/[email protected]/path/mod.ts
Download https://deno.land/[email protected]/strings/mod.ts
Download https://deno.land/[email protected]/path/win32.ts
Download https://deno.land/[email protected]/path/posix.ts
Download https://deno.land/[email protected]/path/constants.ts
Download https://deno.land/[email protected]/path/constants.ts
Download https://deno.land/[email protected]/path/interface.ts
Download https://deno.land/[email protected]/path/glob.ts
Download https://deno.land/[email protected]/path/globrex.ts
Download https://deno.land/[email protected]/path/utils.ts
Download https://deno.land/[email protected]/fmt/colors.ts
Download https://deno.land/[email protected]/testing/diff.ts
Download https://deno.land/[email protected]/testing/format.ts
Download https://deno.land/[email protected]/strings/encode.ts
Download https://deno.land/[email protected]/strings/decode.ts
Download https://deno.land/[email protected]/strings/pad.ts
Download https://deno.land/std/strings/mod.ts
Download https://deno.land/x/[email protected]/hash.ts
error: Uncaught Error: Import 'https://deno.land/std/strings/mod.ts' failed: 404 Not Found
► $deno$/ops/dispatch_json.ts:43:11
at unwrapResponse ($deno$/ops/dispatch_json.ts:43:11)
at sendAsync ($deno$/ops/dispatch_json.ts:98:10)
Let's say:
// Types for clarity of idea:
type Field = any
type IndexedRow = Field[]
type KeyedRow = { [key: string]: Field; }
result.rows: IndexedRow[]
result.row: IndexedRow | undefined // throws if >1 row returned
result.keyedRows: KeyedRow[]
result.keyedRow: KeyedRow | undefined // throws if >1 row returned
result.column: Field[] // throws if >1 column returned
result.field: Field | undefined // throws if >1 column or row returned
What do you think?
How about separating test dependencies from runtime dependencies, like the oak server does? That would reduce the number of downloads for users of this library.
While using code like this:
await postgres.query(
`INSERT TABLE (A, B, C) VALUE ($1, $2, $3)`,
fk_contact,
name,
area,
).catch((e: PostgresError) => {
if(!e.fields.constraint) throw e;
//Throw a custom message for constraint error here
});
However, a typo on line 17 of the PostgresError definition causes TypeScript to complain about the missing property. It's a quick fix, I think.
export interface ErrorFields {
//...
contraint?: string; //Should be constraint
//...
}
Edit: Spotted another typo on this interface
export interface ErrorFields {
//...
schemaName?: string; //Should be schema
//...
}
We need to be able to handle queries like this:
const query = `
CREATE TABLE ids(id integer);
INSERT INTO ids(id) values(1);
INSERT INTO ids(id) values(2);
`;
await client.query(query);
I'm not sure how to nicely handle results; currently QueryResult has a rows property that stores data returned from the query. One idea is to subclass QueryResult and create MultiQueryResult that has a slightly different API, and leave it up to the user to handle that properly.
Comments welcome.
I got an uncaught error when executing a query:
error: Uncaught Error: Don't know how to parse column type: 19
► decode.ts:212:13
212 throw new Error(`Don't know how to parse column type: ${typeOid}`);
^
at decodeText (decode.ts:212:13)
at decode (decode.ts:220:12)
at _parseDataRow (query.ts:35:24)
at handleDataRow (query.ts:47:28)
at _preparedQuery (connection.ts:505:18)
and the query is
SELECT * FROM information_schema.tables
According to the documents, PostgreSQL has a 'name' special character type, which is an internal type for object names.
https://www.postgresql.org/docs/current/datatype-character.html#DATATYPE-CHARACTER-SPECIAL-TABLE
However, in deno-postgres the 'name' column type is listed here
https://github.com/buildondata/deno-postgres/blob/7a27fd94c7b765ca256b3da96a9de94f380e6bbe/oid.ts#L5
but a parsing method is not implemented:
https://github.com/buildondata/deno-postgres/blob/7a27fd94c7b765ca256b3da96a9de94f380e6bbe/decode.ts#L182-L213
deno-postgres v0.3.4
deno 0.26.0
postgres 12.1
const result = await client.query('SELECT * FROM ids WHERE id < $1;', 2); // notice number here
results in:
TypeError: input is not iterable
at stringToCodePoints (gen/bundle/main.js:4438:23)
at TextEncoder.encode (gen/bundle/main.js:4664:42)
at config.args.forEach.arg (/dev/deno-postgres/connection.ts:239:49)
at Array.forEach (<anonymous>)
at Connection._sendBindMessage (/dev/deno-postgres/connection.ts:230:21)
at Connection._preparedQuery (/dev/deno-postgres/connection.ts:315:20)
When doing SELECT pg_advisory_xact_lock(...)
With recent changes to deno fmt we should update the format.ts script to leverage faster formatting.
JavaScript's Number cannot represent PostgreSQL's bigint type accurately, as follows:

| | MIN | MAX |
|---|---|---|
| JavaScript's Number | -9,007,199,254,740,991 | 9,007,199,254,740,991 |
| PostgreSQL's bigint | -9,223,372,036,854,775,808 | 9,223,372,036,854,775,807 |
node-postgres treats bigint as String.
const { Client } = require('pg');
(async () => {
const client = new Client({
host: '127.0.0.1',
port: 5432,
user: 'postgres',
database: 'deno_postgres',
password: 'postgres',
});
try {
await client.connect();
const res = await client.query('SELECT 9223372036854775807');
console.log(res.rows);
} finally {
await client.end();
}
})();
The output is as follows:
[ { '?column?': '9223372036854775807' } ]
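The tradeoff can be sketched in TypeScript: decoding int8's text form with Number silently loses precision, while keeping it as a string (node-postgres' choice) or converting with BigInt is lossless. Both helpers below are illustrative, and BigInt requires an ES2020+ target:

```typescript
// Lossless: keep the raw text form, matching node-postgres behavior.
function decodeInt8AsString(raw: string): string {
  return raw;
}

// Also lossless, but changes the runtime type of decoded values,
// which is a breaking change for existing callers.
function decodeInt8AsBigInt(raw: string): bigint {
  return BigInt(raw);
}
```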
It's already implemented in Query.prepareArgs, but it's very elegantly implemented in pq, so I'd rather move prepareArgs to encode.ts. Tests are needed.
Also, in the future QueryConfig should take an encoder: (any) => string argument; this will allow users to provide a custom encoding function.
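A minimal sketch of what such an encoder could look like. The EncodedArg alias and the specific per-type rules here are assumptions for illustration, not the library's actual prepareArgs behavior:

```typescript
// Assumed encoded forms: null, text, or raw bytes.
type EncodedArg = null | string | Uint8Array;

// Encode a query argument into its text (or binary) wire form.
function encodeArg(arg: unknown): EncodedArg {
  if (arg === null || arg === undefined) return null; // SQL NULL
  if (arg instanceof Uint8Array) return arg; // pass binary through
  if (arg instanceof Date) return arg.toISOString(); // timestamp text form
  if (typeof arg === "object") return JSON.stringify(arg); // e.g. json/jsonb
  return String(arg); // numbers, booleans, strings
}
```

A user-provided encoder option could then override any of these rules per query.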
Right now only simple defaults are applied. The config should check env variables as well for configuration.
Additionally, parsing a DSN would be great, so Client's signature might look like this:
class Client {
constructor(config?: ConnectionParams | string)
}
Turns out there was an edge case in copyBytes that was copied from deno_std; go back to importing this function from the standard lib.
QueryConfig.name is meant to prepare named queries; however, this parameter is currently ignored. Eventually we need to support it, but it's lower priority now.
To land #45 I had to disable Pool tests, rendering it unusable. A refactor has to be done to Pool; some initial thoughts:
Rename PooledClient to PoolClient
Pool can be constructed from a DSN
Pool.transaction helper that wraps Pool.query in a transaction
import pool from "../model/db.ts";
export default async function transaction(sql: string): Promise<string> {
  let client, affect;
  try {
    client = await pool.connect();
    await client.query("BEGIN");
    affect = await client.query(sql);
    await client.query("COMMIT");
  } catch (error) {
    await client.query("ROLLBACK");
    console.log(error);
    affect = null;
  } finally {
    client.release();
  }
  return affect;
}
transaction('select * from article order by create_time desc limit 10 offset 0;')
The error message is:
PostgresError: syntax error at or near " offset 0
Add FrameReader and FrameWriter for communication with the Postgres backend.
This will allow connection pools as you can do something like:
const pool = await Promise.all([...Array(5)].map(() => client.connect()))
and then funnel queries to this pool.
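The funneling part can be sketched as round-robin dispatch over the array of connections. SimplePool is hypothetical, and its query method stands in for Connection.query; a real pool would also track which connections are busy:

```typescript
// Round-robin dispatcher over a fixed set of connection-like objects.
class SimplePool<C extends { query: (sql: string) => Promise<unknown> }> {
  private next = 0;
  constructor(private conns: C[]) {}

  // Send each query to the next connection in rotation.
  query(sql: string): Promise<unknown> {
    const conn = this.conns[this.next];
    this.next = (this.next + 1) % this.conns.length;
    return conn.query(sql);
  }
}
```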
Both cleartext password and MD5 should be handled:
https://github.com/bartlomieju/deno-postgres/blob/master/connection.ts#L95:L112
When querying a table that has a numeric column, the following error occurs:
Error: Don't know how to parse column type: 1700
at decodeText (decode.ts:226:13)
at decode (decode.ts:234:12)
at QueryResult._parseDataRow (query.ts:35:24)
at QueryResult.handleDataRow (query.ts:47:28)
at Connection._simpleQuery (connection.ts:319:18)
at async Connection.query (connection.ts:530:14)
at async Client.query (client.ts:24:12)
JSONB columns are not supported by decode