Show HN: Quickgres.js, 430 LoC pipelined PostgreSQL client library


Quickgres is a native-JS PostgreSQL client library.

It's around 400 lines of code, with no external dependencies.

Features

  • Queries with parameters (along with prepared statements and portals).
  • Every parameterized query creates a cached prepared statement and row parser.
  • COPY protocol for fast table dumps and inserts.
  • Lightly tested SSL connection support.
  • Plaintext & MD5 password authentication.
  • Partial query readback.
  • You should be able to do 2GB-size queries (Want to store movies in TOAST columns? (Maybe use large objects instead.)) I haven't tried it, though.
  • Canceling long-running queries.
  • Binary params, binary query results.
  • Fast raw protocol pass-through to output stream.
  • Client-side library for parsing PostgreSQL query results in the browser.

Missing

  • Full test suite
  • SASL authentication
  • Streaming replication (Want your JavaScript DB synced via WAL shipping?)
  • No type parsing (That's more like a feature.)
  • Simple queries are deprecated in favor of parameterized queries.

What's it good for?

  • It's fairly small, so you can actually read it.
  • It has no deps, so you don't have to worry about npm dep-hell.
  • Performance-wise it's OK. Think 100,000 DB-hitting HTTP/2 requests per second on a 16-core server.
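That kind of throughput comes from pipelining: queries are written to the socket back-to-back instead of waiting for each reply. A minimal sketch of why that matters, using a mock connection with a fixed 5 ms round-trip (no Quickgres or PostgreSQL involved; FakeConnection is made up for illustration):

```javascript
// Mock connection with a fixed round-trip time, to contrast
// one-at-a-time queries with pipelined ones.
class FakeConnection {
  constructor(rttMs) { this.rttMs = rttMs; }
  query(sql) {
    return new Promise(resolve => setTimeout(() => resolve(sql), this.rttMs));
  }
}

async function main() {
  const conn = new FakeConnection(5);
  const n = 20;

  // Sequential: each query waits for the previous reply (~n * rtt total).
  let t0 = Date.now();
  for (let i = 0; i < n; i++) await conn.query('SELECT 1');
  const sequentialMs = Date.now() - t0;

  // Pipelined: all n queries are in flight at once (~1 * rtt total).
  t0 = Date.now();
  await Promise.all(Array.from({ length: n }, () => conn.query('SELECT 1')));
  const pipelinedMs = Date.now() - t0;

  console.log({ sequentialMs, pipelinedMs });
}

main();
```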

Usage


const { Client } = require('quickgres');
const assert = require('assert'); // Used below for sanity checks.

async function go() {
    const client = new Client({ user: 'myuser', database: 'mydb', password: 'mypass' });
    await client.connect('/tmp/.s.PGSQL.5432'); // Connect to a UNIX socket.
    // await client.connect(5432, 'localhost'); // Connect to a TCP socket.
    // await client.connect(5432, 'localhost', {}); // Connect to a TCP socket with SSL config (see tls.connect).
    console.error(client.serverParameters);

    // Access row fields as object properties.
    let { rows, rowCount } = await client.query(
        'SELECT name, email FROM users WHERE id = $1', ['adb42e46-d1bc-4b64-88f4-3e754ab52e81']);
    console.log(rows[0].name, rows[0].email, rowCount);
    console.log(rows[0][0], rows[0][1], rowCount);

    // You can convert the row into an object or an array.
    assert(rows[0].toObject().name === rows[0].toArray()[0]);

    // Stream the raw query result protocol to stdout (why waste cycles on parsing data...)
    await client.query(
        'SELECT name, email FROM users WHERE id = $1', 
        ['adb42e46-d1bc-4b64-88f4-3e754ab52e81'], 
        Client.STRING, // Or Client.BINARY. Controls the format of data that PostgreSQL sends you.
        true, // Cache the parsed query (default is true. If you use the query text only once, set this to false.)
        process.stdout // The result stream. Client calls stream.write(buffer) on this. See RowReader for details.
    );

    // Binary data
    const buf = Buffer.from([0,1,2,3,4,5,255,254,253,252,251,0]);
    const result = await client.query('SELECT $1::bytea', [buf], Client.BINARY, false);
    assert(buf.toString('hex') === result.rows[0][0].toString('hex'), "bytea roundtrip failed");

    // Query execution happens in a pipelined fashion, so when you fire off a million 
    // random SELECTs, they get written to the server right away, and the server
    // replies are streamed back to you.
    const promises = [];
    for (let i = 0; i < 1000000; i++) {
        const id = Math.floor(Math.random()*1000000).toString();
        promises.push(client.query('SELECT * FROM users WHERE id = $1', [id]));
    }
    const results = await Promise.all(promises);

    // Partial query results
    client.startQuery('SELECT * FROM users', []);
    while (client.inQuery) {
        const resultChunk = await client.getResults(100);
        // To stop receiving chunks, send a sync.
        if (resultChunk.rows.length > 1) {
            await client.sync();
            break;
        }
    }

    // Copy data
    const copyResult = await client.query('COPY users TO STDOUT (FORMAT binary)');
    console.log(copyResult.rows[0]);

    const copyIn = await client.query('COPY users_copy FROM STDIN (FORMAT binary)');
    console.log(copyIn.columnFormats);
    copyResult.rows.forEach(row => client.copyData(row));
    await client.copyDone();

    await client.end(); // Close the connection socket.
}

go();
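A note on the binary COPY dump above: the stream as a whole follows PostgreSQL's binary COPY format, which begins with an 11-byte PGCOPY signature, a 32-bit flags field, and a 32-bit header-extension length. A sketch of checking that header (parseCopyHeader is a hypothetical helper for illustration, not part of Quickgres):

```javascript
// The PostgreSQL binary COPY stream begins with an 11-byte signature,
// a 32-bit flags field, and a 32-bit header-extension length.
const COPY_SIGNATURE = Buffer.from('PGCOPY\n\xff\r\n\0', 'latin1');

function parseCopyHeader(buf) {
  if (!buf.slice(0, 11).equals(COPY_SIGNATURE)) {
    throw new Error('not a binary COPY stream');
  }
  const flags = buf.readUInt32BE(11);
  const extensionLength = buf.readUInt32BE(15);
  return { flags, extensionLength, dataOffset: 19 + extensionLength };
}

// Construct a minimal header (flags = 0, no extension) and parse it back.
const header = Buffer.concat([COPY_SIGNATURE, Buffer.alloc(8)]);
console.log(parseCopyHeader(header)); // { flags: 0, extensionLength: 0, dataOffset: 19 }
```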

Changelog

  • 0.4.0: Optimized allocations, efficient raw protocol pass-through, client-side protocol parsing lib.

  • 0.3.1: Treating undefined as null in query parameters. DB error messages start with 'PostgreSQL Error:'.

  • 0.3.0-rc1: Removed CopyReader and rolled .copy() into .query(). Commented source code.

  • 0.3.0-rc0: Binary query params, binary query results, lazy result parsing, rolled ObjectReader, ArrayReader and RawReader into RowReader, questionable life choices, bloat up to 401 lines.

  • 0.2.2-rc1: Query canceling with kill, made statement caching optional, tests for bytea roundtrips and large objects, recover from connection-time EAGAIN, squeeze to 349 lines.

  • 0.2.1: Allocate exact size message write buffers (yay), removed describe methods, more tests, inlined row parsers, added RawReader, minor optimizations, cut lines to 365 from 441.

  • 0.2.0: Deprecated simpleQuery, merged copyTo and copyFrom into copy, optimized number of socket writes on hot path (this improved SSL perf a bit), added more tests to tests.js, changed sync() and copyDone() to async methods to simplify API.

Test output

On a 13″ MacBook Pro 2018 (2.3 GHz Intel Core i5), PostgreSQL 11.3.

$ node test/test.js testdb
46656.29860031104 'single-row-hitting queries per second'
268059 268059 1
268059 268059 1

README tests done

got 1000016 rows
573403.6697247706 'partial query (100 rows per fetch) rows per second'
got 10000 rows
454545.45454545453 'partial query (early exit) rows per second'
warming up 30000 / 30000     
38510.91142490372 'random queries per second'
670241.2868632708 '100-row query rows per second'
925069.3802035153 'streamed 100-row query rows per second'
3.0024 'stream writes per query'
1170973.0679156908 'binary query rows per second piped to test.dat'
916600.3666361136 'string query rows per second piped to test_str.dat'
595247.619047619 'query rows per second'
359717.9856115108 'query rows as arrays per second' 10000160
346505.8905058905 'query rows as objects per second' 1000016
808420.3718674212 'binary query rows per second'
558980.4359977641 'binary query rows as arrays per second' 10000160
426264.27962489345 'binary query rows as objects per second' 1000016
Kill test: PostgreSQL Error: 83 ERROR VERROR C57014 Mcanceling statement due to user request Fpostgres.c L3070 RProcessInterrupts  
Elapsed: 18 ms
Deleted 1000016 rows from users_copy
47021.94357366771 'binary inserts per second'
530794.0552016986 'text copyTo rows per second'
461474.8500230734 'csv copyTo rows per second'
693974.3233865371 'binary copyTo rows per second'
Deleted 30000 rows from users_copy
328089.56692913384 'binary copyFrom rows per second'

done

Testing SSL connection
30959.752321981425 'single-row-hitting queries per second'
268059 268059 1
268059 268059 1

README tests done

got 1000016 rows
454346.2062698773 'partial query (100 rows per fetch) rows per second'
got 10000 rows
454545.45454545453 'partial query (early exit) rows per second'
warming up 30000 / 30000     
23094.688221709006 'random queries per second'
577034.0450086555 '100-row query rows per second'
745156.4828614009 'streamed 100-row query rows per second'
3 'stream writes per query'
1019379.2048929663 'binary query rows per second piped to test.dat'
605333.5351089588 'string query rows per second piped to test_str.dat'
508655.13733468973 'query rows per second'
277243.13834211254 'query rows as arrays per second' 10000160
252848.54614412136 'query rows as objects per second' 1000016
722033.21299639 'binary query rows per second'
432907.3593073593 'binary query rows as arrays per second' 10000160
393242.62681871804 'binary query rows as objects per second' 1000016
Kill test: PostgreSQL Error: 83 ERROR VERROR C57014 Mcanceling statement due to user request Fpostgres.c L3070 RProcessInterrupts  
Elapsed: 41 ms
Deleted 1000016 rows from users_copy
33407.57238307349 'binary inserts per second'
528829.1909042834 'text copyTo rows per second'
501010.0200400802 'csv copyTo rows per second'
801295.6730769231 'binary copyTo rows per second'
Deleted 30000 rows from users_copy
222176.62741612975 'binary copyFrom rows per second'

done

Simulating a web session workload: a request comes in with a session id, use it to fetch the user id and user data string, then update the user with a modified version of the data string.
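Sketched against an in-memory stand-in for the database (the real tests hit PostgreSQL; the table shapes and names here are made up), one session RW looks like this:

```javascript
// In-memory stand-ins for the sessions and users tables.
const sessions = new Map([['sess-1', 'user-1']]); // session id -> user id
const users = new Map([['user-1', 'hello']]);     // user id -> data string

// One session RW: look up the user behind a session id,
// then write back a modified version of their data string.
function sessionRW(sessionId) {
  const userId = sessions.get(sessionId);
  const data = users.get(userId);
  const modified = data + '!';
  users.set(userId, modified);
  return modified;
}

console.log(sessionRW('sess-1')); // → 'hello!'
```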


The max-r one is just fetching a full session row based on the session id, so it's a pure read workload.

$ node test/test-max-rw.js testdb
    32574 session RWs per second              
done

$ node test/test-max-r.js testdb
    130484 session Rs per second              
done

Yes, the laptop hits Planetary-1: one query per day per person in the world. On the RW side, it could serve 2.8 billion requests per day. Note that the test DB fits in RAM, so if you actually wanted to store 1k of data per user, you'd need 10 TB of RAM to hit this performance with 10 billion users.
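The back-of-the-envelope math behind those numbers, spelled out:

```javascript
// Requests per day at the measured RW rate.
const rwsPerSecond = 32574;
const perDay = rwsPerSecond * 86400; // seconds per day
console.log(perDay); // 2814393600, i.e. ~2.8 billion requests per day

// RAM needed for 10 billion users at 1 kB of data each.
const bytes = 10e9 * 1000;
console.log(bytes / 1e12, 'TB'); // 10 TB
```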

On a 16-core server, 2xE5-2650v2, 64 GB ECC DDR3 and Optane. (NB the numCPUs and connections
