Show HN: Postgres.js – Fastest Full-Featured PostgreSQL Client for Node and Deno



Fastest full-featured PostgreSQL client for Node.js and Deno

Getting started

Good UX with Postgres.js

Installation
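
The package is published on npm under the same name used in the imports below, so installation is presumably just:

npm install postgres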

Usage

Create your sql database instance

// db.js
import postgres from 'postgres'

const sql = postgres({ /* options */ }) // will use psql environment variables

export default sql

Simply import for use in other files

// users.js
import sql from './db.js'

async function getUsersOver(age) {
  const users = await sql`
    select
      name,
      age
    from users
    where age > ${ age }
  `
  // users = Result [{ name: 'Walter', age: 80 }, { name: 'Murray', age: 68 }, ...]
  return users
}


async function insertUser({ name, age }) {
  const users = await sql`
    insert into users
      (name, age)
    values
      (${ name }, ${ age })
    returning name, age
  `
  // users = Result [{ name: 'Murray', age: 68 }]
  return users
}


Connection

postgres([url], [options])

You can use either a postgres:// url connection string or the options to define your database connection properties. Options in the object will override any present in the url. Options will fall back to the same environment variables as psql.

const sql = postgres('postgres://username:password@host:port/database', {
  host                 : '',            // Postgres ip address[es] or domain name[s]
  port                 : 5432,          // Postgres server port[s]
  database             : '',            // Name of database to connect to
  username             : '',            // Username of database user
  password             : '',            // Password of database user
  ...and more
})

More options can be found in the Connection details section.

Queries

await sql`...` -> Result[]

Postgres.js utilizes Tagged template functions to process query parameters before interpolation. Using tagged template literals benefits developers by:

  1. Enforcing safe query generation
  2. Giving the sql`` function powerful utility and query building features.

Any generic value will be serialized according to an inferred type, and replaced by a PostgreSQL protocol placeholder $1, $2, .... The parameters are then sent separately to the database which handles escaping & casting.

All queries will return a Result array, with objects mapping column names to each row.

const xs = await sql`
  insert into users (
    name, age
  ) values (
    'Murray', 68
  )

  returning *
`

// xs = [{ user_id: 1, name: 'Murray', age: 68 }]

Please note that queries are only executed when awaited – or manually by using .execute().
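
A minimal sketch of that behavior:

// nothing is sent to the database yet
const query = sql`select name, age from users`

// the query is executed on await – or kick it off manually and await it later
query.execute()
const users = await query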

Query parameters

Parameters are automatically extracted and handled by the database so that SQL injection isn't possible. No special handling is necessary, just use tagged template literals as usual. Dynamic queries and query building can be seen in the next section.

const name = 'Mur'
    , age = 60

const users = await sql`
  select
    name,
    age
  from users
  where
    name like ${ name + '%' }
    and age > ${ age }
`
// users = [{ name: 'Murray', age: 68 }]

Be careful with quotation marks here. Because Postgres infers column types, you do not need to wrap your interpolated parameters in quotes like '${name}'. This will cause an error because the tagged template replaces ${name} with $1 in the query string, leaving Postgres to do the interpolation. If you wrap that in a string, Postgres will see '$1' and interpret it as a string rather than a parameter.
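
A minimal sketch of the difference:

const name = 'Murray'

// wrong – quoting the placeholder makes Postgres see the literal string '$1'
sql`select * from users where name = '${ name }'`

// right – leave it unquoted so it becomes a real parameter
sql`select * from users where name = ${ name }`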

Dynamic column selection

const columns = ['name', 'age']

sql`
  select
    ${ sql(columns) }
  from users
`

// Which results in:
select "name", "age" from users

Dynamic inserts

const user = {
  name: 'Murray',
  age: 68
}

sql`
  insert into users ${
    sql(user, 'name', 'age')
  }
`

// Which results in:
insert into users ("name", "age") values ($1, $2)

You can omit column names and simply do sql(user) to get all the fields from the object as columns. Be careful not to allow users to supply columns that you do not want to be inserted.
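
For example, mirroring the insert above with the column names inferred from the object:

const user = {
  name: 'Murray',
  age: 68
}

sql`insert into users ${ sql(user) }`

// Which results in:
insert into users ("name", "age") values ($1, $2)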

Multiple inserts in one query

If you need to insert multiple rows at the same time it's also much faster to do it with a single insert. Simply pass an array of objects to sql().

const users = [{
  name: 'Murray',
  age: 68,
  garbage: 'ignore'
},
{
  name: 'Walter',
  age: 80
}]

sql`insert into users ${ sql(users, 'name', 'age') }`

// Is translated to:
insert into users ("name", "age") values ($1, $2), ($3, $4)

// Here you can also omit column names, which will use object keys as columns
sql`insert into users ${ sql(users) }`

// Which results in:
insert into users ("name", "age") values ($1, $2), ($3, $4)

Dynamic columns in updates

This is also useful for update queries

const user = {
  id: 1,
  name: 'Murray',
  age: 68
}

sql`
  update users set ${
    sql(user, 'name', 'age')
  }
  where user_id = ${ user.id }
`

// Which results in:
update users set "name" = $1, "age" = $2 where user_id = $3

Dynamic values and where in

Value lists can also be created dynamically, making where in queries simple too.

const users = await sql`
  select
    *
  from users
  where age in ${ sql([68, 75, 23]) }
`

or

const [{ a, b, c }] = await sql`
  select
    *
  from (values ${ sql(['a', 'b', 'c']) }) as x(a, b, c)
`

Building queries

Postgres.js features a simple dynamic query builder by conditionally appending/omitting query fragments.
It works by nesting sql`` fragments inside other sql`` calls or fragments. This allows you to build dynamic queries safely without risking sql injections through regular string concatenation.

Partial queries

const olderThan = x => sql`and age > ${ x }`

const filterAge = true

sql`
  select
   *
  from users
  where name is not null ${
    filterAge
      ? olderThan(50)
      : sql``
  }
`
// Which results in:
select * from users where name is not null
// Or
select * from users where name is not null and age > 50

Dynamic filters

sql`
  select
    *
  from users ${
    id
      ? sql`where user_id = ${ id }`
      : sql``
  }
`

// Which results in:
select * from users
// Or
select * from users where user_id = $1

SQL functions

Using keywords or calling functions dynamically is also possible by using sql`` fragments.

const date = null

sql`
  update users set updated_at = ${ date || sql`now()` }
`

// Which results in:
update users set updated_at = now()

Table names

Dynamic identifiers like table names and column names are also supported like so:

const table = 'users'
    , column = 'id'

sql`
  select ${ sql(column) } from ${ sql(table) }
`

// Which results in:
select "id" from "users"

Advanced query methods

.cursor()

await sql``.cursor([rows=1], [fn])

Use cursors if you need to throttle the amount of rows being returned from a query. You can use a cursor either as an async iterable or with a callback function. For a callback function, new results won't be requested until the promise / async callback function has resolved.

callback function

await sql`
  select
    *
  from generate_series(1,4) as x
`.cursor(async ([row]) => {
  // row = { x: 1 }
  await http.request('https://example.com/wat', { row })
})
for await...of

// for await...of
const cursor = sql`select * from generate_series(1,4) as x`.cursor()

for await (const [row] of cursor) {
  // row = { x: 1 }
  await http.request('https://example.com/wat', { row })
}

A single row will be returned by default, but you can also request batches by setting the number of rows desired in each batch as the first argument to .cursor:

await sql`
  select
    *
  from generate_series(1,1000) as x
`.cursor(10, async rows => {
  // rows = [{ x: 1 }, { x: 2 }, ... ]
  await Promise.all(rows.map(row =>
    http.request('https://example.com/wat', { row })
  ))
})

If an error is thrown inside the callback function, no more rows will be requested and the outer promise will reject with the thrown error.

You can close the cursor early either by calling break in the for await...of loop, or by returning the token sql.CLOSE from the callback function.

await sql`
  select * from generate_series(1,1000) as x
`.cursor(row => {
  return Math.random() > 0.9 && sql.CLOSE // or sql.END
})
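
And a minimal sketch of the break variant:

for await (const [row] of sql`select * from generate_series(1,1000) as x`.cursor()) {
  if (row.x > 10)
    break // exiting the loop also closes the cursor
}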

.forEach()

await sql``.forEach(fn)

If you want to handle rows returned by a query one by one, you can use .forEach, which returns a promise that resolves once there are no more rows.

await sql`
  select created_at, name from events
`.forEach(row => {
  // row = { created_at: '2019-11-22T14:22:00Z', name: 'connected' }
})

// No more rows

describe

await sql``.describe([rows=1], fn) -> Result[]

Rather than executing a given query, .describe will return information utilized in the query process. This information can include the query identifier, column types, etc.

This is useful for debugging and analyzing your Postgres queries. Furthermore, .describe will give you access to the final generated query string that would be executed.
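
A minimal sketch, assuming the returned description exposes the generated query string and column info as the text above suggests (the property names here are assumptions):

const description = await sql`
  select name, age from users where age > ${ 50 }
`.describe()

// inspect what would have been executed, without running it
console.log(description.string)  // final generated query string (assumed property)
console.log(description.columns) // column type info (assumed property)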

Raw

sql``.raw()

Using .raw() will return rows as an array with Buffer values for each column, instead of objects.

This can be useful for receiving identically named columns, or for specific performance/transformation reasons. The column definitions are still included on the result array, plus access to parsers for each column.
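
A minimal sketch:

const rows = await sql`select name, age from users`.raw()

// each row is an array of Buffer values instead of an object,
// e.g. rows[0][0] holds the raw bytes of the first row's "name" column
Buffer.isBuffer(rows[0][0]) // true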

File

await sql.file(path, [args], [options]) -> Result[]

Using a .sql file for a query is also supported, with optional parameters to use if the file includes $1, $2, etc.

const result = await sql.file('query.sql', ['Murray', 68])

Canceling Queries in Progress

Postgres.js supports canceling queries in progress. It works by opening a new connection with a protocol level startup message to cancel the current query running on a specific connection. That means there is no guarantee that the query will be canceled, and due to the possible race conditions it might even result in canceling another query. This is fine for long running queries, but in the case of high load and fast queries it might be better to simply ignore results instead of canceling.

const query = sql`select pg_sleep(100)`.execute()
setTimeout(() => query.cancel(), 100)
const result = await query

Unsafe raw string queries

Advanced unsafe use cases

await sql.unsafe(query, [args], [options]) -> Result[]

If you know what you are doing, you can use unsafe to pass any string you'd like to postgres. Please note that this can lead to SQL injection if you're not careful.

sql.unsafe('select ' + danger + ' from users where id = ' + dragons)

Transactions

BEGIN / COMMIT await sql.begin([options=''], fn) -> fn()

Use sql.begin to start a new transaction. Postgres.js will reserve a connection for the transaction and supply a scoped sql instance for all transaction uses in the callback function. sql.begin will resolve with the returned value from the callback function.

BEGIN is automatically sent with the optional options, and if anything fails ROLLBACK will be called so the connection can be released and execution can continue.

const [user, account] = await sql.begin(async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Murray'
    )
  `

  const [account] = await sql`
    insert into accounts (
      user_id
    ) values (
      ${ user.user_id }
    )
  `

  return [user, account]
})

It's also possible to pipeline the requests in a transaction if needed by returning an array of queries from the callback function like this:

const result = await sql.begin(sql => [
  sql`update ...`,
  sql`update ...`,
  sql`insert ...`
])

SAVEPOINT await sql.savepoint([name], fn) -> fn()

sql.begin('read write', async sql => {
  const [user] = await sql`
    insert into users (
      name
    ) values (
      'Murray'
    )
  `

  const [account] = (await sql.savepoint(sql =>
    sql`
      insert into accounts (
        user_id
      ) values (
        ${ user.user_id }
      )
    `
  ).catch(err => {
    // Account could not be created. ROLLBACK SAVEPOINT is called because we caught the rejection.
  })) || []

  return [user, account]
})
.then(([user, account]) => {
  // great success - COMMIT succeeded
})
.catch(() => {
  // not so good - ROLLBACK was called
})

Do note that you can often achieve the same result using WITH queries (Common Table Expressions) instead of using transactions.
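
For instance, the user + account insert above could be sketched as a single WITH query, assuming the same tables:

const [account] = await sql`
  with new_user as (
    insert into users (name) values ('Murray')
    returning user_id
  )
  insert into accounts (user_id)
  select user_id from new_user
  returning user_id
`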

Listen & notify

When you call .listen, a dedicated connection will be created to ensure that you receive notifications in real-time. This connection will be used for any further calls to .listen.

.listen returns a promise which resolves once the LISTEN query to Postgres completes, or if there is already a listener active.

await sql.listen('news', payload => {
  const json = JSON.parse(payload)
  console.log(json.this) // logs 'is'
})

Notify can be done as usual in SQL, or by using the sql.notify method.

sql.notify('news', JSON.stringify({ no: 'this', is: 'news' }))

Realtime subscribe

Postgres.js implements the logical replication protocol of PostgreSQL to support subscription to real-time updates of insert, update and delete operations.

NOTE To make this work you must create the proper publications in your database, enable logical replication by setting wal_level = logical in postgresql.conf and connect using either a replication or superuser.

Quick start

Create a publication (eg. in a migration)

CREATE PUBLICATION alltables FOR ALL TABLES

Subscribe to updates

const sql = postgres({ publications: 'alltables' })

const { unsubscribe } = await sql.subscribe('insert:events', (row, { command, relation, key, old }) =>
  // tell about new event row over eg. websockets or do something else
)

Subscribe pattern

You can subscribe to specific operations, tables, or even rows with primary keys.

operation : schema . table = primary_key

operation is one of * | insert | update | delete and defaults to *

schema defaults to public

table is a specific table name and defaults to *

primary_key can be used to only subscribe to specific rows

Examples

sql.subscribe('*',                () => /* everything */ )
sql.subscribe('insert',           () => /* all inserts */ )
sql.subscribe('*:users',          () => /* all operations on the public.users table */ )
sql.subscribe('delete:users',     () => /* all deletes on the public.users table */ )
sql.subscribe('update:users=1',   () => /* all updates on the users row with a primary key = 1 */ )

Numbers, bigint, numeric

Number in javascript is only able to represent 2^53-1 safely, which means that types in PostgreSQL like bigint and numeric won't fit into Number.

Since Node.js v10.4 we can use BigInt to match the PostgreSQL type bigint, which is returned for eg. count(*). Unfortunately, it doesn't work with JSON.stringify out of the box, so Postgres.js will return it as a string.

If you want to use BigInt you can add this custom type:

const sql = postgres({
  types: {
    bigint: postgres.BigInt
  }
})

There is currently no guaranteed way to handle numeric / decimal types in native Javascript. These [and similar] types will be returned as a string. The best way in this case is to use custom types.

Connection details

All Postgres options

const sql = postgres('postgres://username:password@host:port/database', {
  host                 : '',            // Postgres ip address[es] or domain name[s]
  port                 : 5432,          // Postgres server port[s]
  path                 : '',            // unix socket path (usually '/tmp')
  database             : '',            // Name of database to connect to
  username             : '',            // Username of database user
  password             : '',            // Password of database user
  ssl                  : false,         // true, prefer, require, tls.connect options
  max                  : 10,            // Max number of connections
  max_lifetime         : null,          // Max lifetime in seconds (more info below)
  idle_timeout         : 0,             // Idle connection timeout in seconds
  connect_timeout      : 30,            // Connect timeout in seconds
  no_prepare           : false,         // No automatic creation of prepared statements
  types                : [],            // Array of custom types, see more below
  onnotice             : fn,            // Defaults to console.log
  onparameter          : fn,            // (key, value) when server param change
  debug                : fn,            // Is called with (connection, query, params)
  transform            : {
    column             : fn,            // Transforms incoming column names
    value              : fn,            // Transforms incoming row values
    row                : fn             // Transforms entire rows
  },
  connection           : {
    application_name   : 'postgres.js', // Default application_name
    ...                                 // Other connection parameters
  },
  target_session_attrs : null,          // Use 'read-write' with multiple hosts to
                                        // ensure only connecting to primary
  fetch_types          : true,          // Automatically fetches types on connect
                                        // on initial connection.
})

Note that max_lifetime = 60 * (30 + Math.random() * 30) by default. This resolves to an interval between 45 and 90 minutes to optimize for the benefits of prepared statements and working well with Linux's OOM killer.

SSL

Although vulnerable to MITM attacks, a common configuration for the ssl option for some cloud providers is to set rejectUnauthorized to false (if NODE_ENV is production):

const sql =
  process.env.NODE_ENV === 'production'
    ? // "Unless you're using a Private or Shield Heroku Postgres database, Heroku Postgres does not currently support verifiable certificates"
      // https://help.heroku.com/3DELT3RK/why-can-t-my-third-party-utility-connect-to-heroku-postgres-with-ssl
      postgres({ ssl: { rejectUnauthorized: false } })
    : postgres()

For more information regarding ssl with postgres, check out the Node.js documentation for tls.

Multi-host connections – High Availability (HA)

Multiple connection strings can be passed to postgres() in the form of postgres('postgres://localhost:5432,localhost:5433', ...). This works the same as the native psql command. Read more at multiple host URIs.

Connections will be attempted in order of the specified hosts/ports. On a successful connection, all retries will be reset. This ensures that hosts can come up and down seamlessly.

If you specify target_session_attrs: 'primary' or PGTARGETSESSIONATTRS=primary, Postgres.js will only connect to the primary host, allowing for zero downtime failovers.

The Connection Pool

Connections are created lazily once a query is created. This means that simply doing const sql = postgres(...) won't have any effect other than instantiating a new sql instance.

No connection will be made until a query is made.

This means that we get a much simpler story for error handling and reconnections. Queries will be sent over the wire immediately on the next available connection in the pool. Connections are automatically taken out of the pool if you start a transaction using sql.begin(), and automatically returned to the pool once your transaction is done.

Any query which was already sent over the wire will be rejected if the connection is lost. It will automatically defer to the error handling you have for that query, and since connections are lazy it will automatically try to reconnect the next time a query is made. The benefit of this is no weird generic "onerror" handler that tries to get things back to normal, and also simpler application code since you don't have to handle errors out of context.

There are no guarantees about queries executing in order unless using a transaction with sql.begin() or setting max: 1. Of course doing a series of queries, one awaiting the other, will work as expected, but that's just due to the nature of js async/promise handling, so it's not necessary for this library to be concerned with ordering.
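
For instance, forcing strictly ordered execution over a single connection (a configuration sketch using the max option described above):

const sql = postgres({ max: 1 }) // all queries share one connection, so they run in order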

Since this library automatically creates prepared statements, it also has a default max lifetime for connections to prevent memory bloat on the database itself. This is a random interval for each connection between 45 and 90 minutes. This allows multiple connections to come up and down seamlessly without user interference.

Connection timeout

By default, connections will not close until .end() is called. However, it may be useful to have them close automatically when:

  • re-instantiating multiple sql`` instances
  • using Postgres.js in a Serverless environment (Lambda, etc.)
  • using Postgres.js with a database service that automatically closes connections after some time (see ECONNRESET issue)

This can be done using the idle_timeout or max_lifetime options. These configuration options specify the number of seconds to wait before automatically closing an idle connection and the maximum time a connection can exist, respectively.

For example, to close a connection that has either been idle for 20 seconds or existed for more than 30 minutes:

const sql = postgres({
  idle_timeout: 20,
  max_lifetime: 60 * 30
})

Auto fetching of array types

Postgres.js will automatically fetch table/array-type information when it first connects to a database.

If you have revoked access to pg_catalog this feature will no longer work and will need to be disabled.

You can disable this feature by setting fetch_types to false.
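
A minimal sketch:

const sql = postgres({ fetch_types: false }) // e.g. when access to pg_catalog is revoked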

Environmental variables

It's also possible to connect to the database without a connection string or any options. Postgres.js will fall back to the common environment variables used by psql as in the table below:

Option             Environment Variables
host               PGHOST
port               PGPORT
database           PGDATABASE
username           PGUSERNAME or PGUSER
password           PGPASSWORD
idle_timeout       PGIDLE_TIMEOUT
connect_timeout    PGCONNECT_TIMEOUT
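
For example, connecting purely from the environment (a minimal sketch, assuming the variables above are set):

// PGHOST, PGPORT, PGDATABASE, PGUSER(NAME) and PGPASSWORD are picked up automatically
const sql = postgres()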

Prepared statements

Prepared statements will automatically be created for any queries where it can be inferred that the query is static. This can be disabled by using the no_prepare option. For instance, this is useful when using PGBouncer in transaction mode.
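
A minimal sketch of disabling prepared statements, e.g. for PGBouncer in transaction mode:

const sql = postgres({ no_prepare: true })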

Custom Types

You can add ergonomic support for custom types, or simply use sql.typed(value, type) inline, where type is the PostgreSQL oid for the type and the correctly serialized string. (oid values for types can be found in the pg_catalog.pg_type table.)

Adding query helpers is the cleanest approach, which can be done like this:
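The following is a sketch loosely modeled on the types option shown earlier; the rect shape and the oid 1337 are made-up placeholders, not a confirmed part of the library.

const sql = postgres({
  types: {
    rect: {
      to        : 1337,   // oid to serialize to (placeholder value)
      from      : [1337], // oids to parse from (placeholder value)
      serialize : ({ x, y, width, height }) => [x, y, width, height],
      parse     : ([x, y, width, height]) => ({ x, y, width, height })
    }
  }
})

// or inline with sql.typed(value, oid) as described above
sql`select ${ sql.typed(90, 23) }` // 23 is the oid for int4 in pg_catalog.pg_type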
