Show HN: wrk-utils: run wrk in clusters and have stats

wrk-utils is a set of commands and wrk lua scripts to run wrk in clusters and have stats.

This project consists of two main parts:

Commands, which you can enable and get the help message for by sourcing the commands script.

stats.lua, which collects stats. You can use stats.lua as the main script or as a library.

Features

  • Running wrk in clusters using SSH
  • Live stats in JSON format, separated by status codes
  • Option to use stats.lua as a library to have stats in custom scripts
  • Live 5xx status alerts
  • Slack integration
  • JSON formatted logs for both requests and responses in files
  • Commands to initialize servers and update files
  • Commands to start and stop wrk
  • Run custom commands on servers
  • Read IPs from file or stdin (pipe)
  • Debug mode

Final stats sample data:

{
    "event": "done",
    "start_time": 1639990232,
    "node": "10.10.1.1",
    "target_url": "https://example.com/list?limit=100",
    "total_completed_requests": 778511,
    "total_sent_requests": 778707,
    "total_timeouts": 2809,
    "connect_error": 0,
    "socket_status_error": 341,
    "read_error": 0,
    "write_error": 0,
    "interval": "484.686049",
    "rps": 1606.22,
    "recv_bytes": "690.31mb",
    "status": {
        "502": 315,
        "200": 778170,
        "504": 26
    }
}
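As a quick consistency check on the report above, the per-status counts in the "status" object should sum to total_completed_requests; a small sketch with awk, using the sample's numbers:

```shell
# Cross-check the sample report above: the per-status counts
# (502: 315, 200: 778170, 504: 26) should sum to total_completed_requests.
awk 'BEGIN {
    total = 315 + 778170 + 26
    print total   # matches total_completed_requests: 778511
}'
```

The same idea applies to the error counters: 502 + 504 (315 + 26) matches socket_status_error (341) in the sample.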

Live stats sample data:

{"event": "response", "node": "10.0.1.2", "thread_id": 1001, "count": 231, "status": {"200": 231}}
{"event": "response", "node": "10.0.1.2", "thread_id": 1000, "count": 234, "status": {"200": 234}}
{"event": "response", "node": "10.0.1.2", "thread_id": 1004, "count": 234, "status": {"200": 234}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1004, "count": 353, "status": {"200": 353}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1000, "count": 354, "status": {"200": 354}}
{"event": "response", "node": "10.0.1.2", "thread_id": 1003, "count": 360, "status": {"200": 360}}
{"event": "response", "node": "10.0.1.2", "thread_id": 1002, "count": 355, "status": {"200": 355}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1001, "count": 351, "status": {"200": 351, "502": 3}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1003, "count": 360, "status": {"200": 370}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1002, "count": 355, "status": {"200": 373}}
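Because the live stats are newline-delimited JSON, per-node totals can be tallied with standard tools. A sketch with awk (the log file name is made up, and the field extraction relies on lines shaped exactly like the samples above):

```shell
# Hypothetical log file holding live stats lines like the samples above
# (only a few lines reproduced here for the sketch).
cat > live-stats.log <<'EOF'
{"event": "response", "node": "10.0.1.2", "thread_id": 1001, "count": 231, "status": {"200": 231}}
{"event": "response", "node": "10.0.1.3", "thread_id": 1004, "count": 353, "status": {"200": 353}}
{"event": "response", "node": "10.0.1.2", "thread_id": 1000, "count": 234, "status": {"200": 234}}
EOF

# Tally "count" per node; splitting on double quotes puts the node
# address in field 8 for lines with this exact key order.
awk -F'"' '{
    node = $8
    if (match($0, /"count": [0-9]+/))
        counts[node] += substr($0, RSTART + 9, RLENGTH - 9)
}
END { for (n in counts) print n, counts[n] }' live-stats.log | sort
```

For the three sample lines this prints 10.0.1.2 465 and 10.0.1.3 353.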

Setup & run

  1. wrk-utils requires sshpass. Make sure it is installed.
  2. Prepare a set of servers with ssh keys OR the same ssh username/password, and provide the list of IPs in servers.txt or pipe addresses to the commands.
  3. Add SSH_USR="", SSH_PWD="" and SLACK_WEBHOOK="" to config.env. Look at config.env.sample.
  4. Run . commands.sh in bash or . commands.fish in fish.
  5. Now you can see the help message and use the commands.
  6. Run the init-servers command to set up wrk on the provided servers by copying the wrk binary and lua scripts there. (There is a compiled version of wrk in this repository. You can replace it with another one.)
  7. You can run sync-file wordlist.txt *.csv to copy any other files you need to the servers.
  8. Run available-node-count to make sure all of your servers are alive.
  9. Run exec-wrk 0 -t10 -c300 -d600s -s stats.lua 'https://example.com/'
  10. You can kill wrk instances using the kill-all command.
# 10 is the delay before starting the next wrk instance.
# All other params after 10 are passed to wrk on the servers.
exec-wrk 10 -t10 -c300 -d600s -s stats.lua 'https://example.com/path/?id=1'

# run all wrk instances at once by setting the delay to zero
exec-wrk 0 -t10 -c300 -d600s -s custom.lua 'https://example.com/'
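As a worked example of the delay argument (the cluster size of 5 is a made-up value): with a 10 second delay, the last of 5 servers starts wrk 40 seconds after the first.

```shell
awk 'BEGIN {
    delay   = 10                  # first argument to exec-wrk, in seconds
    servers = 5                   # hypothetical number of servers
    print (servers - 1) * delay   # seconds until the last instance starts
}'
```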

Custom script sample with stats

You can use stats.lua as a library to enable stats in your custom scripts like this:

require('stats') -- load stats.lua into your custom script

function request()

    -- you can add request_logger to the request function
    request_logger(false)

    -- ...

    return wrk.format(nil, '/')

end

function response(status, headers, body)

    -- you need to add response_logger to the response function
    response_logger(status, headers)
    -- ...
end

Check out the examples

Commands

ssh-all: executes a command on all servers

ssh-all 'ps aux | grep something'

ssh-one: executes a command on a random server

ssh-one 'ps aux | grep something'

ssh-all-sudo: executes a command on all servers as sudo

ssh-one-sudo: executes a command on a server as sudo

kill-all: kills all wrk instances on all servers (graceful)

kill-all-force: kills all wrk instances on all servers (force, will lose logs)

available-node-count: prints the number of available servers

active-node-count: prints the number of active wrk instances in a loop

live-stats: live stats for all servers (per thread)

init-servers: creates the wrk directory on servers and copies wrk and lua scripts into it

sync-file: copies the provided files to the wrk directory on all servers

sync-file wordlist.txt *.jpg

exec-wrk: executes wrk step by step or all at once (the first argument is the delay before starting the next instance)

exec-wrk 10 -t10 -c300 -d600s -s stats.lua 'https://example.com/path/?id=1'
exec-wrk 0 -t10 -c300 -d600s -s custom.lua 'https://example.com/'

Known issues

  • If you want to pass a URL with multiple parameters to exec-wrk, you need to quote that URL twice, e.g. exec-wrk 0 -t1 -c3 -d6s -s stats.lua "'https://example.com/?a=1&b=2'"
  • You need to rerun live-stats when a new node comes up.
  • stats.lua counts requests sent over HTTP pipelines as one request. You need to multiply the number of requests by the number of requests in each pipeline.
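The double quoting in the first issue can be reproduced locally: your shell consumes the outer double quotes, and the inner single quotes survive for the remote shell that ssh starts, protecting characters like &. A minimal sketch, with sh -c standing in for the remote shell:

```shell
url="'https://example.com/?a=1&b=2'"   # double-quoted URL, as passed to exec-wrk
sh -c "echo $url"                      # inner quotes keep & from backgrounding
```

Without the inner single quotes, the remote shell would treat everything after & as a separate background command.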

TODO

  • A dashboard to collect logs
  • Some documentation
  • More examples
  • Improve live stats
