OpenArc – Lightweight Inference Server for OpenVINO


OpenArc is a lightweight inference API backend built on Optimum-Intel (from Hugging Face Transformers) that leverages hardware acceleration on Intel CPUs, GPUs, and NPUs through the OpenVINO runtime and OpenCL drivers. By exposing the `conversation` parameter from the generation method, it grants complete control over what gets passed into and out of the model, with no intervention required at the template or application level. If you use the CLI tool and hit an error about an unsupported architecture, follow the link and open an issue referencing the model card; the maintainers will get back to you.
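As a rough illustration of what "exposing the `conversation` parameter" implies for a client, the sketch below builds a request body where the caller supplies the full chat history directly. The endpoint shape and field names here are assumptions for illustration, not the documented OpenArc API; the point is that the raw message list reaches the server unmodified, so the client decides exactly what the model's chat template sees.

```python
import json

# Hypothetical sketch: a client-side request body for an OpenArc-style
# server that accepts the `conversation` parameter as-is. Field names
# and defaults below are assumptions, not OpenArc's documented schema.

def build_request(conversation, max_new_tokens=128):
    """Serialize a chat history into a JSON request body.

    Because the server forwards `conversation` untouched, the caller
    fully controls roles, ordering, and content -- no template-level
    or application-level rewriting happens in between.
    """
    return json.dumps({
        "conversation": conversation,
        "max_new_tokens": max_new_tokens,
    })

conversation = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize OpenVINO in one sentence."},
]

body = build_request(conversation)
print(body)
```

A client could then POST `body` to the server and apply the same control on the way back, e.g. keeping or discarding the assistant turn before the next request.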

Or read this on Hacker News

Read more on: openvino, openarc

Related news:


Intel's OpenVINO Now Available In openSUSE


Intel Xeon Max Sees Some Performance Gains For OpenVINO & ONNX With Linux 6.9