
MicroBatcher

X-Model server for boosting ML inference

What is this?

A server that takes care of your inference-to-deployment needs and boosts performance, without requiring any heavy lifting from you in the backend.
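
The performance boost comes from the idea the project's name suggests: micro-batching, i.e. grouping concurrent inference requests into small batches so the model runs one forward pass for many callers at once. The following is a minimal sketch of that idea, assuming a hypothetical predict_batch(inputs) model function and caller-supplied reply queues; it illustrates the technique, not this project's actual implementation.

    # Illustration only: dynamic micro-batching of inference requests.
    import queue
    import time

    def run_batcher(request_queue, predict_batch,
                    max_batch_size=8, max_wait_s=0.01):
        """Collect requests into micro-batches and run one forward pass per batch."""
        while True:
            batch = [request_queue.get()]              # block until a request arrives
            deadline = time.monotonic() + max_wait_s
            while len(batch) < max_batch_size:
                remaining = deadline - time.monotonic()
                if remaining <= 0:
                    break                              # waited long enough, flush now
                try:
                    batch.append(request_queue.get(timeout=remaining))
                except queue.Empty:
                    break                              # no more requests arrived in time
            inputs = [req["input"] for req in batch]
            outputs = predict_batch(inputs)            # one forward pass amortizes model overhead
            for req, out in zip(batch, outputs):
                req["reply"].put(out)                  # hand each result back to its caller

Callers would enqueue a dict like {"input": x, "reply": queue.Queue()} and block on their reply queue; latency stays bounded by max_wait_s while throughput improves as concurrent requests are batched together.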

Quickstart

  1. Clone this repo & install

    git clone https://github.com/biswaroop1547/microbatcher.git && cd microbatcher
    make install
  2. Define the model path and start the server (a client sketch for sending requests follows these steps)

    echo
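
Once the server is running, you can send it inference requests over HTTP. The snippet below is only a hypothetical client; the host, port, route, and payload shape are assumptions for illustration, so check the project's source for the actual API.

    # Hypothetical client call; the /predict route, port 8000, and payload shape are assumed.
    import requests

    resp = requests.post(
        "http://localhost:8000/predict",
        json={"inputs": ["example text to run inference on"]},
    )
    resp.raise_for_status()
    print(resp.json())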

Philosophy

lorem ipsum

Why do you need this?

lorem ipsum

What will it enable?

lorem ipsum

Features

  • lorem ipsum
  • lorem ipsum
