Show HN: JibarOS, a shared inference runtime for Android

JibarOS is an Android 16 fork I’ve been building to explore a shared runtime for on-device inference.

It adds:

- a system service in system_server
- a native daemon
- pluggable inference backends
- a Binder AIDL interface for capability-based calls across text, audio, and vision

The goal is to centralize concerns that apps otherwise handle independently, like model residency, scheduling, fairness, and backend routing.

The current interface exposes 12 capabilities, including completion, translation, reranking, embeddings, transcription, synthesis, VAD, OCR, detection, and description.
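To illustrate the capability-based idea, here is a minimal, hypothetical sketch (not code from the repo, and the names are invented): a router maps each capability to whichever backend currently serves it, so apps call a capability rather than bundling their own runtime.

```java
import java.util.EnumMap;
import java.util.Map;
import java.util.function.Function;

// Conceptual sketch of capability-based routing. In JibarOS itself this
// dispatch would sit behind the Binder AIDL interface; here it is modeled
// as a plain in-process map for clarity.
public class CapabilityRouter {
    enum Capability {
        COMPLETION, TRANSLATION, RERANK, EMBEDDINGS, TRANSCRIPTION,
        SYNTHESIS, VAD, OCR, DETECTION, DESCRIPTION
    }

    // Each capability maps to one registered backend (a String -> String
    // function stands in for a real inference backend).
    private final Map<Capability, Function<String, String>> backends =
            new EnumMap<>(Capability.class);

    public void register(Capability cap, Function<String, String> backend) {
        backends.put(cap, backend);
    }

    public String invoke(Capability cap, String request) {
        Function<String, String> backend = backends.get(cap);
        if (backend == null) {
            throw new IllegalStateException("no backend registered for " + cap);
        }
        return backend.apply(request);
    }

    public static void main(String[] args) {
        CapabilityRouter router = new CapabilityRouter();
        // A stub backend; a real one would route to an on-device model.
        router.register(Capability.TRANSLATION, text -> "translated:" + text);
        System.out.println(router.invoke(Capability.TRANSLATION, "hola"));
    }
}
```

The point of the indirection is that the system service, not each app, decides which backend handles a capability, which is what makes centralized scheduling and backend swapping possible.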

Repo: https://github.com/Jibar-OS/JibarOS

Interested in feedback from anyone who has worked on Android framework/services, ML runtimes, or device-level resource scheduling.


Comments URL: https://news.ycombinator.com/item?id=47873249

Points: 1

# Comments: 0