ff00ff / mammoth

A type-safe Postgres query builder for TypeScript.

Home Page: https://mammoth.tools

License: MIT License

TypeScript 99.95% JavaScript 0.04% Batchfile 0.01%
deno nodejs postgresql query-builder type-safety typescript

mammoth's People

Contributors

auspexeu, cakoose, dependabot-preview[bot], dependabot[bot], echentw, galactic8291, gilbert, github-actions[bot], ivasilov, martijndeh, vitordino


mammoth's Issues

Tool to check that the Mammoth schema matches the Postgres schema.

In my test suite, I'd like to run something that verifies that my Postgres schema is compatible with the structure I've defined for Mammoth. Is there an easy way to do this already?

If not, I think I could probably figure out how to write it. But it seems like a generally useful tool and maybe should be part of Mammoth itself, maybe in a sibling package?
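If such a tool were built, its core would be a pure comparison step. A hypothetical sketch (the `information_schema` query itself is omitted; `diffColumns` and `camelToSnake` are invented names, not Mammoth API):

```typescript
// Hypothetical sketch: compare the column names Mammoth expects (camelCase,
// converted to snake_case) against what Postgres reports via
// information_schema.columns for the same table.

function camelToSnake(name: string): string {
  return name.replace(/[A-Z]/g, (c) => `_${c.toLowerCase()}`);
}

interface SchemaDiff {
  missingInDb: string[];   // defined in Mammoth, absent in Postgres
  missingInCode: string[]; // present in Postgres, absent in Mammoth
}

function diffColumns(mammothColumns: string[], dbColumns: string[]): SchemaDiff {
  const expected = new Set(mammothColumns.map(camelToSnake));
  const actual = new Set(dbColumns);
  return {
    missingInDb: [...expected].filter((c) => !actual.has(c)),
    missingInCode: [...actual].filter((c) => !expected.has(c)),
  };
}
```

A test suite could then assert that both arrays are empty for every defined table.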

Select * from table

Am I missing something or is it right now not possible to select * from a table?

count() returns a value of type string?

When I select count like so

const [{ count }] = await db.select(count()).from(db.users)

The type of count is string. Bug or feature? :)
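Likely a feature: `count(*)` in Postgres is `bigint` (int8), and node-postgres returns int8 values as strings by default because a JS number can't represent all 64-bit integers. A hedged sketch of a safe conversion for counts you expect to fit in a number:

```typescript
// count(*) is int8; pg delivers it as a string to avoid silent precision
// loss. Convert explicitly, failing loudly on values beyond 2^53 - 1.
function toSafeNumber(count: string): number {
  const n = Number(count);
  if (!Number.isSafeInteger(n)) {
    throw new RangeError(`count ${count} exceeds Number.MAX_SAFE_INTEGER`);
  }
  return n;
}
```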

Idea: Switching Mammoth to use native `async`/`await` might yield better stack traces.

Historically, JavaScript stack traces get truncated once you yield back to the event loop, e.g. for I/O. In the last few years, V8 added async stack traces, which fix that problem if you use native async/await instead of raw promises. This makes it way easier to track down errors.

It looks like Mammoth uses raw promises, so the stack traces I get are truncated. I think if it used native async/await, the stack traces would be better. I'm not familiar enough with Mammoth to understand the difficulties and downsides of doing that, though.

A workaround: wrap each function with code that catches exceptions and fixes the stack trace: GitHub gist. That works if I'm only using a few functions or there are only a few call sites. But Mammoth has a pretty big API and I use Mammoth everywhere in my app, so that would be too tedious.

PG < 8.0.3 incompatible with Node 14

Just took me hours to figure this out, but PG < 8.0.3 is incompatible with Node 14. It doesn't crash: no errors, no stack traces. When calling query, the runtime just stops. I saw you're already using dependabot, which opened a PR, and I've verified that all tests pass when upgrading.

I just thought I'd post the problem here in case someone else is baffled. Could you merge in the newest deps? Then this issue can be closed.

Double prepend cwd

The cwd in the CLI is prepended in two places, leading to errors about trying to access /Users/foo/project/Users/foo/project/src/db.ts.

const object = require(path.join(process.cwd(), databasePath));
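A likely fix (a sketch, not the actual patch): only prepend the cwd when the configured path isn't already absolute.

```typescript
import * as path from "path";

// Prepend cwd only for relative paths, so an already-absolute databasePath
// is not doubled into /Users/foo/project/Users/foo/project/src/db.ts.
function resolveDatabasePath(cwd: string, databasePath: string): string {
  return path.isAbsolute(databasePath) ? databasePath : path.join(cwd, databasePath);
}
```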

A concise type for a table?

Let's say I have a DB with a users table and the appropriate Mammoth defineTable and defineDb calls.

How can I get a concise type for that table, e.g. for use in a function signature:

function processUser(user: ???) { ... }

star() function returns results with snake_case fields

Querying a table

const dogs = defineTable({
  breedName: text().notNull(),
});

with

const rows = await db.select(star()).from(db.dogs);

Results in rows having a type of {breedName: string}[], but the actual value that gets returned is of the form {breed_name: string}[].
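Until the library fixes this, a user-land workaround is to map the keys back to camelCase. A sketch (the helper names are invented):

```typescript
// Convert a snake_case key to camelCase, e.g. "breed_name" -> "breedName".
function snakeToCamel(key: string): string {
  return key.replace(/_([a-z])/g, (_, c: string) => c.toUpperCase());
}

// Re-key a single result row so it matches the declared {breedName: string}
// shape. The cast is unavoidable since the runtime shape disagrees with
// the static one, which is exactly the bug being reported.
function camelCaseRow(row: Record<string, unknown>): Record<string, unknown> {
  return Object.fromEntries(
    Object.entries(row).map(([k, v]) => [snakeToCamel(k), v]),
  );
}
```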

Insert into with array of values

Currently, only inserting a single value works. In addition to that single value, we should also support an array, like so:

db.insertInto(foo).values([{ name: `A` }, { name: `B` }])

Which should produce the below SQL

INSERT INTO foo (name) VALUES ('A'), ('B')

Your starting point is probably around makeInsertInto, which includes the actual values() function. It has some massive type smarts to only require the not-null columns without a default. But here we also want to accept an array.

Please also add a simple test to insert.test.ts.
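The SQL-generation step of the feature can be sketched as follows (this is an illustration, not Mammoth's actual internals):

```typescript
// Given the column list and an array of row objects, emit one parenthesized
// placeholder group per row with sequential $n parameters, producing e.g.
// VALUES ($1), ($2) for two single-column rows.
function buildValuesClause(
  columns: string[],
  rows: Record<string, unknown>[],
): { sql: string; parameters: unknown[] } {
  const parameters: unknown[] = [];
  const groups = rows.map((row) => {
    const placeholders = columns.map((col) => {
      parameters.push(row[col]);
      return `$${parameters.length}`;
    });
    return `(${placeholders.join(", ")})`;
  });
  return { sql: `VALUES ${groups.join(", ")}`, parameters };
}
```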

Support pg & pool connections in createDatabase

Problem

Sometimes a connection needs to be shared with other services in the application.

And Mammoth allows passing only a databaseUrl option, establishing its own connection.

Solution

Allow BYOD (Bring Your Own Database) connection to be passed.

Returning to support expressions and queries

The .returning() function currently only supports strings (which should be the target table's column names). In reality, it can accept any expression or (returning) query.

This should be supported:

insert into bar (val) values (123) returning (select count(*) from bar), id, val || 'test'

which would equate to something like:

db.insertInto(bar).values({ val: 123 }).returning(db.select(count()).from(bar), concat(bar.val, `test`).as(`test`))

The returning function is used in insert, update and delete queries, so it's probably best to focus on getting one of those working first. Later we could maybe create some reusable type, if that's at all possible, similar to something like SelectFn.

The return type in ResultSet<Returning, Test> probably already supports getting the type right.

Helping detect when I forget to `await`?

I frequently forget to await my Mammoth .insertInto(...) and .update(...) calls. The code ends up running without an error but the DB actions never happen.

Normally, either TypeScript or WebStorm's static analysis will warn me about a missing await. This doesn't seem to happen for those Mammoth calls, though, possibly because the return value isn't a simple Promise?

I'm looking into enabling the ESLint "@typescript-eslint/no-floating-promises" rule. The problem is that rule requires running ESLint in "compute type information" mode, which seems to make ESLint 5x slower on our codebase.

I don't really have a recommendation here, but here are some ideas:

  • Is there a way to detect this dynamically? I wouldn't mind getting a runtime error at least.
  • Is there a modification to Mammoth (e.g. some kind of annotation) that would cause WebStorm's static analysis to catch this case? (Kind of a WebStorm-specific issue, so maybe I'll ask on their forums.)
  • Maybe Mammoth can include an ESLint rule to detect this particular issue without needing to run ESLint in "compute type information" mode?

Maybe someone else has figured out another way to solve this problem?

Documentation recommendation

For starters: love what you've done here. Definitely fills a need, so I want this lib to be successful.

But I think the README is structured poorly, to the point of being alienating to a potential new user.

This is the first bit of code in the readme:

const rows = await select(list.id, list.createdAt)
  .from(list)
  .where(list.createdAt.gt(now().minus(`2 days`)).or(list.value.eq(0)))
  .limit(10);

What's "list" and where does it come from? It just appears in your sample code. How do you import the module?

After reading the full readme the picture becomes clear(ish), but you should start the README with the "Schema" section, so it's clear how to define a Mammoth class (and how you provide static type information to TypeScript), before talking about the operations you can perform using that class.

This lib deserves more attention than it's getting currently! Keep up the good work! I ended up here after you got a shoutout here: https://news.ycombinator.com/item?id=22739121 so I expect many people are discovering this at the moment.

PS Shouldn't "list" be capitalized in most of the sample code?

Support Postgres `bytea` type for binary data?

Mammoth currently doesn't have a type corresponding to Postgres' bytea type for binary data. Can it be added?

For my use case, it would be nice to be able to pass/receive Buffer or Uint8Array values.

Question: best way to define a concise type for the Mammoth DB object?

I want to create a type alias for the DB object so that I can use it in function signatures. For example, I have a wrapper around my controller functions that takes care of fetching the DB connection and managing the DB transaction, e.g.

async function wrapHandlerAsync<T>(handlerAsync: (db: DbType, ...) => Promise<T>): Promise<T> {
    const client = globalPool.getClient();
    const db = ...;
    client.query("BEGIN");
    try {
        await handlerAsync(db);
    } finally {
        ...
    }
}

What's the best way to write DbType in the function signature? What I'm doing now:

function createDb(client: pg.Client) {
    return defineDb({...}, async (query, parameters) => {...});
}

type Db = ReturnType<typeof createDb>;

One downside is that I'm relying on inference. For types that cross module boundaries, I prefer to not rely on inference. Is there a better way to do it?

Row-wise comparison operations

Is there a way to write a row-wise comparison, e.g.

SELECT *
FROM cities
WHERE (state, county, city) > ($1, $2, $3)
ORDER BY state, county, city
LIMIT 10

This is useful when paging through a table with a cursor that is based on multiple columns. There's a way to do it without row-wise comparisons, but it's a little more complex to write and read.

If there were a way to insert a custom SQL fragment into a larger Mammoth query, that would be a pretty good workaround.
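If raw fragments became possible, the tuple comparison itself is mechanical to build. A hypothetical sketch (`rowWiseGt` is an invented helper, not a Mammoth function):

```typescript
// Build a row-wise comparison fragment such as
// "(state, county, city) > ($1, $2, $3)", with placeholders numbered from
// startIndex so the fragment can be embedded in a larger query.
function rowWiseGt(columns: string[], startIndex = 1): string {
  const params = columns.map((_, i) => `$${startIndex + i}`);
  return `(${columns.join(", ")}) > (${params.join(", ")})`;
}
```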

Support constraints / indices

Currently, constraints and indices are only supported if you manually add them to your migrations. Instead, we want to be able to define them similar to tables in TypeScript land using e.g. a defineConstraint() call.

This should fix #185.

Project status?

This project is pre-1.0 and was last updated 2018-10-29.

  1. Is the API stable and the implementation production-ready?
  2. What future fixes/additions -- if any -- are planned?

Your .dependabot/config.yml contained invalid details

Dependabot encountered the following error when parsing your .dependabot/config.yml:

Automerging is not enabled for this account. You can enable it from the [account settings](https://app.dependabot.com/accounts/Ff00ff/settings) screen in your Dependabot dashboard.

Please update the config file to conform with Dependabot's specification using our docs and online validator.

Self referencing tables

How would one write the correct table definition to reflect a composite pattern where a row can reference another row in the same table?

Idea: replace length-specific overloads with TS 4.0 variadic tuple types

TypeScript 4.0 added variadic tuple types.

Mammoth's variable-length methods like .select(...) are currently implemented with one signature overload for each length of arguments, from 1-75. Can/should that be replaced with variadic tuple types?

Potential advantages:

  1. My IDE occasionally freezes for ~3 seconds when trying to show the autocomplete for methods like .select(...). I wonder if that's because of the number of length-specific overloads.
  2. I wonder if things like this will be easier: #218. (I don't actually know...)
  3. Potentially better type error messages.
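A minimal illustration of the idea, using an invented mini-API rather than Mammoth's real types: one variadic-tuple signature replaces the 75 per-arity overloads while still preserving the arity of the arguments as a tuple type.

```typescript
// Invented stand-in for a column reference.
interface Column { name: string }

// One signature covers any number of arguments; [...T] makes TypeScript
// infer a tuple type rather than an unordered array.
function select<T extends Column[]>(...columns: [...T]): { columns: T } {
  return { columns };
}

const q = select({ name: "id" }, { name: "createdAt" });
```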

Support selecting from multiple tables

Currently, it's not possible to select from multiple tables e.g. db.select(foo.id).from(foo, bar) (you could do a cross join but ok). So we want to accept multiple from items in the from function.

Introspection

I don't really want to use Mammoth because it relies on migrations or manually maintained schemas. Rather, I'd use something like Prisma 2, where I just set up my Postgres database and run an introspection query to generate the correct schema file.

However, I don't want to use Prisma 2, because they're adding too much abstraction/magic on top of SQL. I'd much rather use Mammoth, which is just a thin veneer of TypeScript on top of a SQL query builder.

So here is a proposal to pick and mix the best of both worlds: adding an "introspect" command to the CLI. It would parse the specified Postgres schemas and generate a Mammoth schema that matches as closely as possible.

It's going to be quite an undertaking, so I just wanted to see if you're interested in actually seeing a PR for this, or if we should rather just fork this project, since it seems to go directly against one of your "core features".

Sadly there isn't a lot of code we can borrow from Prisma 2 (since its core logic is in Rust), but it can definitely drive a lot of the design to get us started, and if we made an on-par MVP we'd be able to use:

  • domains, enums, types etc,
  • core column types
  • views
  • foreign keys
  • checks
  • defaults
  • ...

Thoughts?

Get query as a string

I'm building most of my queries with Mammoth.

There are some cases where I want to do this:

WITH inner AS (...)
SELECT count(*) FROM inner

Mammoth can't do "WITH ..." yet, so I'm writing it with raw SQL. However, the "..." is something I can do with Mammoth, and in fact, I'm already doing it somewhere else.

What would be nice: If that Mammoth object had a `.toSql()` function, I could use that to create the "..." instead of writing it in raw SQL, then plug it into the bigger "WITH ..." expression.

What would .toSql() return?

It would have to return the query and the values in some format. Returning the query as a single string with $1/$2 placeholders would be problematic because they're referencing absolute positions. What might work: something similar to JavaScript's custom template literals, e.g. an Array<string> for the query and an Array<unknown> for the values.
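The template-literal-style format can be sketched like this (an illustration of the proposal, not an existing Mammoth API):

```typescript
// A query represented as template-literal-style parts: the strings array is
// always one element longer than the values array.
interface SqlFragment { strings: string[]; values: unknown[] }

// Flatten a fragment into a single string with $n placeholders, starting at
// a given parameter offset so the fragment can be spliced into a larger
// query whose earlier placeholders already consumed $1..$offset.
function render(fragment: SqlFragment, offset = 0): string {
  return fragment.strings.reduce(
    (acc, s, i) => (i === 0 ? s : `${acc}$${offset + i}${s}`),
    "",
  );
}
```

The absolute-position problem is solved by the offset: the same fragment renders as `$1` on its own and as `$6` when embedded after five earlier parameters.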

How to accept a list of columns as a parameter?

I have a helper class that iterates over a table, returning a page of rows at a time. I'd like the class to accept a list of columns to project, e.g.

const lister = new UserLister(db);
const users = await lister.nextPageAsync([db.users.firstName, db.users.lastName]);

for (const user of users) {
    console.log(user.firstName);
}

Is there a way to write nextPageAsync? I've tried a couple things but eventually get stuck on trying to pass the list of columns to db.select(...). (I'd be fine using a limited amount of as any to get the body of nextPageAsync working.)

Possibly a tangent, but Mammoth's variable-length methods like .select(...) are implemented by having separate overrides for 1-75 arguments. Is it possible to write that with TS 4's variadic tuple types? Would that make these kinds of things easier?

Bug: leftJoin produces the wrong SQL

Here's the source code:

mammoth/src/select.ts

Lines 230 to 237 in 144b464

leftJoin<JoinTable extends Table<any, any>>(
  table: JoinTable,
): LeftJoin<SelectQuery<Columns, IncludesStar>, JoinTable> {
  return this.newSelectQuery(
    [...this.tokens, new StringToken(`INNER JOIN`), this.getTableStringToken(table)],
    table,
  ) as any;
}

Looks like leftJoin produces an INNER JOIN. The correct thing should be a LEFT JOIN, right?

Where can be called on select directly

The following code will compile without errors.

db.select(db.users.id).where(db.users.id.eq(1)).from(db.users)

but generates invalid SQL:

SELECT users.id WHERE users.id = $1 FROM users

leaving the user with:

error: syntax error at or near "FROM"

SelectQuery isn't compatible with PromiseLike?

declare function acceptPromiseLike<T>(promise: PromiseLike<T>): void;
...
const query = db.select(db.foo.id).from(db.foo);
await acceptPromiseLike(query); // error

Full error message:

Argument of type 'SelectQuery<{ id: Column<"id", "foo", string, false, false, undefined>; }, false>' is not assignable to parameter of type 'PromiseLike<{ id: string | undefined; }[]>'.
  Types of property 'then' are incompatible.
    Types of parameters 'onRejected' and 'onrejected' are incompatible.
      Type 'TResult2 | PromiseLike<TResult2>' is not assignable to type 'void | PromiseLike<void>'.
        Type 'TResult2' is not assignable to type 'void | PromiseLike<void>'.
          Type 'TResult2' is not assignable to type 'PromiseLike<void>'.

     await acceptPromiseLike(query); // error
                             ~~~~~

Is this mismatch accidental?

`sum()` doesn't work for non-`number` columns

In my Mammoth schema definition, I'm using big.js for Postgres numeric and JS BigInt for Postgres int8.

The Mammoth sum function doesn't work on those types:

sum: (expression: Expression<number, boolean, any>) => Expression<number, false, "sum">;

The quick fix for me is to define my own sum function with a different type. That's what I'm going to do for now.

But I wonder if there's a more general solution. One option would be to add a new "IsSummable" type parameter to the Expression type. But another type parameter ends up adding a bunch of noise to the IDE auto-complete and error messages.

I think Scala "type members" (called "associated types" in Rust and Swift) might provide a cleaner solution, but I'm not sure if that's possible in TypeScript. Here's one attempt to simulate them in TypeScript: StackOverflow link.

int8/bigint/bigserial: number vs BigInt

Thanks for adding int8/bigint/bigserial support in efdb893.

One problem: a JS number can't losslessly represent all the values of a 64-bit integer.

  • A JS number can represent integers up to about 53 bits. That might be enough for some use cases, but it seems dangerous for Mammoth to default to something that's lossy. Might catch people by surprise.
  • Node 10.4+ supports BigInt, which can represent all int8 values.

Suggestion: expose two int8 types, one for number and one for BigInt.

  • Option 1: Call them int8AsJsBigInt and int8AsJsNumber so the user knows what they're getting in to.
  • Option 2: Call them int8 and int8AsJsNumber. Node 10 is the oldest actively-supported version of Node, so defaulting to BigInt might be reasonable. And the user can opt-in to number if they want.
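The risk in defaulting to number is easy to demonstrate: just past 2^53, number silently loses precision while BigInt round-trips exactly.

```typescript
// An int8 value as pg would deliver it over the wire (a string).
const wire = "9007199254740993"; // 2^53 + 1

const asNumber = Number(wire); // rounds to the nearest representable double
const asBigInt = BigInt(wire); // exact
```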

Support for ENUM?

Is there a way to use Postgres ENUM fields with Mammoth v1.x?

The v0.10.x docs have something called EnumColumn, but I can't see anything like that in the v1.x codebase.

Concise type for a field?

[Another question about using Mammoth types in function signatures! Please let me know if these should not be put under issues.]

I have a helper function that gets a single record from a table, e.g.:

function fetch(db: Db, id: string): Promise<Array<{name: string, email: string}>> {
    db.select(
        db.user.name,
        db.user.email,
    ).from(db.user).where(db.user.id.eq(id));
}

What if I want to allow the caller to specify additional fields for inclusion in the SELECT, e.g.:

function fetch(db: Db, id: string, ...fields: ???): Promise<Array<???>> {
    db.select(
        db.user.name,
        db.user.email,
        ...fields,
    ).from(db.user).where(db.user.id.eq(id));
}

Is there a way to do this?

ts-typed-sql

Hi there!
I am the author of ts-typed-sql.
Maybe we can join efforts? Currently I have some bugs to fix in my library, but due to my studies, I am missing time to fix them.

Optional columns when inserting

I noticed nullable columns still must be specified when inserting. For example:

export class Test {
  id = new UuidColumn().primaryKey().notNull();
  name = new TextColumn();
  createdAt = new TimestampWithTimeZoneColumn().notNull().default(new Now());
}
// ...
db.insertInto(db.test).values({
  name: null,
  createdAt: null,
})

here name: null is required even though the column is nullable, and createdAt: null is required even though it has a default.

Is this a limitation that you ran into? Or just haven't gotten around to making them optional in JS-land too? I can write a PR if it's possible.

Multicolumn indexes?

Have you considered how to do indexes for multiple columns? I don't need them now but might in the future.

Maybe handle empty list passed to `.in(...)` expression?

await db.select(db.foo.id)
    .from(db.foo)
    .where(db.foo.id.in([])); 

I get a syntax error from Postgres.

QUERY: "SELECT foo.id FROM foo WHERE foo.id IN"
PARAMS: []
error: syntax error at end of input
    ...
  length: 91,
  severity: 'ERROR',
  code: '42601',
  ...

This is a common problem with query building in general.

I think it's bad for Mammoth to generate invalid SQL. The quickest "fix" for that would be to have Mammoth throw its own exception if the .in(...) argument is empty.

Another option would be for Mammoth to replace the whole expression with false. I think that would actually be quite helpful in many cases. However, in some of those cases the resulting query might actually be useless and the user might want to know (with an exception) rather than just silently wasting resources.
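Both options are cheap to implement at the point where the IN clause is generated. A sketch (an illustration of the proposal, not Mammoth's internals):

```typescript
// Generate an IN clause, handling the empty list explicitly: either throw
// early (the default) or rewrite the whole expression to "false".
function inClause(
  column: string,
  values: unknown[],
  opts: { emptyAsFalse?: boolean } = {},
): { sql: string; parameters: unknown[] } {
  if (values.length === 0) {
    if (opts.emptyAsFalse) return { sql: "false", parameters: [] };
    throw new Error(`.in(...) called with an empty list for ${column}`);
  }
  const placeholders = values.map((_, i) => `$${i + 1}`);
  return { sql: `${column} IN (${placeholders.join(", ")})`, parameters: values };
}
```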

[Bug] cannot drop multiple columns in one `next`

export const foo = m.defineTable({
  id: m.uuid().primaryKey().default(`gen_random_uuid()`),
  name: m.text().notNull(),
  value: m.integer(),
});

then

mammoth next --dbFile=src/db.js

got

CREATE TABLE foo (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  name text NOT NULL,
  value pg_catalog.int4
);

then change ts

export const foo = m.defineTable({
  id: m.uuid().primaryKey().default(`gen_random_uuid()`),
});

then running next again gives only

ALTER TABLE foo DROP COLUMN name;

even though the value column was removed as well and should also be dropped.

Besides, another bug: mammoth-cli's default dbFile should be db.js, otherwise you get SyntaxError: Cannot use import statement outside a module.

Non-brittle way of determining pkey constraint violation?

I sometimes attempt an insert, knowing that there's a chance it might fail, e.g.

try {
    await db.insertInto(db.users).values({
        handle: ...,
        ...
    });
} catch (err) {
    if (err && err.severity === 'ERROR' && err.code === '23505' && err.constraint === 'users_pkey') {
        return {success: false, reason: 'user handle is already taken'};
    }
}

The problem is that specifying the table name as a string ('users_pkey') is brittle. We had a case where we updated the table name (via an IDE refactoring) but forgot to update the constraint string.

(A "user handle is already taken" error should obviously have a test. But we also check for more rare pkey clashes and probably won't have test coverage for all of them.)

Can Mammoth help with this problem? For example, one minimalist solution would be something like:

const pkeyConstraintName = `${mammoth.tableName(db.users)}_pkey`;

A less minimalist option:

const pkeyConstraintName = mammoth.pkeyName(db.users);
// `pkeyName` can be a bit smarter and adapt to Postgres' 63-char limit on identifier names.
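The truncation the comment refers to can be sketched like this. Note the hedges: `pkeyName` is the hypothetical helper proposed above, and Postgres truncates by bytes (NAMEDATALEN - 1 = 63) while this sketch counts characters; it also only approximates how Postgres composes default constraint names.

```typescript
const NAMEDATALEN = 64; // Postgres identifiers hold at most NAMEDATALEN - 1 bytes

// Approximate the default primary-key constraint name: truncate the table
// name so that the "_pkey" suffix still fits within the 63-char limit.
function pkeyName(tableName: string): string {
  const suffix = "_pkey";
  const maxTable = NAMEDATALEN - 1 - suffix.length;
  return tableName.slice(0, maxTable) + suffix;
}
```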

`undefined` vs `null` is inconsistent for nullable columns

There seems to be a bug where Mammoth's typing tells you that the type of a field in a row result is undefined, when in fact the value is null.

Here's my minimal reproduction, let me know if I'm misunderstanding something or made a mistake!

So let's say I have a table defined:

const person = defineTable({
    name: text().notNull(),
    age: integer(),
});

And now I want to select some rows from this table:

const results = await db.select(db.person.name, db.person.age).from(db.person);

My IDE is telling me the inferred type of results is {name: string, age: number | undefined}[]. However, when I console.log(results), I notice that I actually get null (rather than undefined) values for age.
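Until the typing and the runtime agree, a user-land workaround is to normalize the rows so they match the declared types. A sketch (the helper name is invented):

```typescript
// Replace null field values with undefined so a row like
// { name: "a", age: null } matches the declared {name: string, age?: number}
// shape. Only does a shallow pass over top-level fields.
function nullsToUndefined<T extends Record<string, unknown>>(row: T): T {
  return Object.fromEntries(
    Object.entries(row).map(([k, v]) => [k, v === null ? undefined : v]),
  ) as T;
}
```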

Restrict functions which aren't allowed e.g. WHERE after LIMIT

Whatever you type in Mammoth gets translated to SQL more or less 1-to-1. This makes the order of functions important. For example, currently you may write select(foo.id).from(foo).limit(2).where(foo.value.eq(123)), which would output a query with the limit first and the where later, which is obviously incorrect.

Because we want to stick to SQL as close as possible, we don't want to sort the output tree to fix the query, instead, we want to give type hints to indicate something is not possible.

To achieve this, we can probably omit certain functions from the query type once they would be invalid, e.g. we cannot call .where() anymore after a .limit(), as that would be invalid SQL. This makes Mammoth a little more type-safe and the autocomplete really nice :).
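The omission idea can be demonstrated with a tiny invented builder (not Mammoth's real types): once .limit() is called, the returned type simply no longer has a .where() method, so the invalid ordering fails to compile.

```typescript
// After .limit(), only rendering is possible.
interface Limited { toString(): string }

// Before .limit(), both .where() and .limit() are available.
interface Filterable extends Limited {
  where(condition: string): Filterable;
  limit(n: number): Limited;
}

function query(parts: string[] = ["SELECT 1"]): Filterable {
  return {
    where: (condition) => query([...parts, `WHERE ${condition}`]),
    limit: (n) => ({ toString: () => [...parts, `LIMIT ${n}`].join(" ") }),
    toString: () => parts.join(" "),
  };
}

// query().limit(2).where("x = 1") is now a compile-time error:
// Property 'where' does not exist on type 'Limited'.
```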

Table names which are keywords cause invalid sql statement errors

Hi,

first of all, thx a lot for this great library!

I'm running into issues with table names that are keywords (I assume column names could cause trouble too). A table named user, for example, currently doesn't work. This could be solved by quoting table names, but it seems that's not yet supported. It would be awesome if there were a solution for this.
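Quoting itself is well defined in Postgres: wrap the identifier in double quotes and double any embedded quote characters. A sketch of what Mammoth would need to apply to table and column names:

```typescript
// Standard Postgres identifier quoting: "user" stays usable even though
// user is a reserved keyword, and embedded quotes are escaped by doubling.
function quoteIdentifier(name: string): string {
  return `"${name.replace(/"/g, '""')}"`;
}
```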

Idea: Allow Mammoth to serialize/deserialize values, for more precise customization

A problem I have:

  • I want different int8 columns to deserialize differently: number vs BigInt vs Buffer.
  • I have schema-aware JSON deserializers. I'd like to use those to deserialize different jsonb columns differently.

With "pg", you can only have a single serializer/deserializer for a type.

Mammoth has richer type information, e.g. jsonb<Article>. If Mammoth allowed transforming values, I could get exactly the serialization/deserialization I want.

One downside is performance. But maybe there won't be a performance hit if Mammoth hooks in to "pg" at a lower level, parsing the text/binary protocol directly. (Hopefully "pg-protocol" can do most of the work.)

Can Mammoth be used with PostGIS for spatial queries, and how?

Is Mammoth suitable for PostGIS spatial queries?

Apologies, I'm neither a TypeScript nor a Postgres expert, but I'd like to use both in a cloud project I'm starting.

I come from Java and a little C#, so I'd be thankful for strongly-typed queries.
