Show HN: I made a tool to serve your machine learning models in no time


Simple, but Powerful.



Pinferencia



English Doc
|

中文文档

Help wanted. Translation, rap lyrics, all wanted. Feel free to create an issue.


Pinferencia tries to be the simplest AI model inference server ever!

Serving a model with a REST API has never been so easy.

If you want to

  • find a simple but robust way to serve your model
  • write minimal code while keeping control over your service
  • avoid any heavyweight solutions
  • easily integrate with your CI/CD
  • make your model and service portable and runnable across machines

you are in the right place.

Features

Pinferencia features include:

  • Fast to code, fast to go alive. Minimal code needed, minimal transformation needed. Just based on what you have.
  • 100% test coverage: both statement and branch coverage, no kidding.
  • Easy to use, easy to understand.
  • Automatic API documentation page. All APIs are explained in detail with an online try-out feature.
  • Serve any model; even a single function can be served.

Install

pip install "pinferencia[uvicorn]"

Quick Start

Serve Any Model

from pinferencia import Server


class MyModel:
    def predict(self, data):
        return sum(data)


model = MyModel()

service = Server()
service.register(
    model_name="mymodel",
    model=model,
    entrypoint="predict",
)

Just run:

uvicorn app:service --reload

Hooray, your service is alive. Go to http://127.0.0.1:8000/ and have fun.
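Once the service is up, a prediction is a plain JSON POST. The endpoint path and body layout below are assumptions based on common Pinferencia setups, not taken from this README; confirm them on the auto-generated API documentation page. The expected result follows from the `predict` entrypoint above, which just sums the list.

```python
import json
import urllib.request

# Hypothetical endpoint for the "mymodel" registered above; the
# /v1/models/.../predict path is an assumption -- check the API docs page.
req = urllib.request.Request(
    "http://127.0.0.1:8000/v1/models/mymodel/predict",
    data=json.dumps({"data": [1, 2, 3]}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the server running, urllib.request.urlopen(req) would send the
# request; the predict entrypoint above sums the list, so [1, 2, 3]
# should come back as 6.
```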

Any deep learning models? Just as easy. Simply train or load your model, and register it with the service. Go alive immediately.

Pytorch

import torch

from pinferencia import Server


# train your models
model = "..."

# or load your models (1)
# from state_dict
model = TheModelClass(*args, **kwargs)
model.load_state_dict(torch.load(PATH))

# complete model
model = torch.load(PATH)

# torchscript
model = torch.jit.load('model_scripted.pt')

model.eval()

service = Server()
service.register(
    model_name="mymodel",
    model=model,
)

Tensorflow

import tensorflow as tf

from pinferencia import Server


# train your models
model = "..."

# or load your models (1)
# saved_model
model = tf.keras.models.load_model('saved_model/model')

# HDF5
model = tf.keras.models.load_model('model.h5')

# from weights
model = create_model()
model.load_weights('./checkpoints/my_checkpoint')
loss, acc = model.evaluate(test_images, test_labels, verbose=2)

service = Server()
service.register(
    model_name="mymodel",
    model=model,
    entrypoint="predict",
)

Any model of any framework will just work the same way. Now run uvicorn app:service --reload and enjoy!

Contributing

If you'd like to contribute, details are here.
