Projects

Research and engineering software developed alongside graduate research. All source code is available on GitHub.

llmrc active

Local LLM runtime written in Rust. Loads GGUF-format models, bridges to C++ inference libraries via FFI, and exposes an HTTP inference interface. Designed for cross-platform (macOS / Linux), reproducible local LLM execution, with a focus on deployment on resource-constrained hardware.
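The FFI bridge can be pictured as a thin C-ABI shim on the C++ side, which the Rust runtime then binds with `extern "C"` declarations. A minimal sketch follows; all names (`llm_load`, `llm_model_path`, `llm_free`) and the `Context` type are hypothetical, not llmrc's actual API:

```cpp
// Hypothetical C-ABI shim over a C++ inference library, of the kind a
// Rust runtime can bind with `extern "C"` FFI declarations.
#include <cstring>
#include <string>

namespace {
// Stand-in for a real inference backend state (e.g. a llama.cpp context).
struct Context {
    std::string model_path;
};
}

extern "C" {

// Returns an opaque handle the Rust side stores as a raw pointer.
void* llm_load(const char* model_path) {
    return new Context{model_path};
}

// Copies the loaded model path into a caller-owned buffer and returns
// its length, or -1 if the buffer is too small. A real bridge would
// run token generation through calls shaped like this one.
int llm_model_path(void* handle, char* buf, int buf_len) {
    auto* ctx = static_cast<Context*>(handle);
    int n = static_cast<int>(ctx->model_path.size());
    if (n + 1 > buf_len) return -1;
    std::memcpy(buf, ctx->model_path.c_str(), n + 1);
    return n;
}

// Rust's Drop impl would call this to release the handle.
void llm_free(void* handle) {
    delete static_cast<Context*>(handle);
}

} // extern "C"
```

Keeping the boundary to plain C types (pointers, `int`, `char*`) is what makes the shim bindable from Rust without a C++ ABI dependency.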

QT_Kernel_OS active

CLI-based OS internals learning project in Qt/C++. Implements and visualises core OS concepts: process scheduling, memory management, and file system structures. Designed for systematic study of kernel architecture through hands-on experimentation.
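Of the concepts listed, process scheduling is the easiest to sketch. Below is a minimal round-robin scheduler for illustration only; the `Process` fields and fixed time quantum are chosen for the example, not taken from the project:

```cpp
// Minimal round-robin scheduling sketch: each process runs for up to
// `quantum` ticks, then is preempted and re-queued until it finishes.
#include <queue>
#include <string>
#include <vector>

struct Process {
    std::string name;
    int remaining;   // remaining CPU burst, in ticks
};

// Returns the order in which processes complete.
std::vector<std::string> round_robin(std::vector<Process> procs, int quantum) {
    std::queue<Process> ready;
    for (auto& p : procs) ready.push(p);

    std::vector<std::string> finished;
    while (!ready.empty()) {
        Process p = ready.front();
        ready.pop();
        p.remaining -= quantum;
        if (p.remaining > 0)
            ready.push(p);              // preempted: back of the ready queue
        else
            finished.push_back(p.name); // burst exhausted within this slice
    }
    return finished;
}
```

For example, with bursts A=3, B=1, C=5 and a quantum of 2, the completion order is B, A, C: B finishes in its first slice, A in its second, and C needs three.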

ml-engine research

C++ inference environment built on LibTorch. Implements a model loading pipeline, a lightweight REST API for inference serving, and experimental integration with GGUF/llama.cpp backends. Developed as a practical substrate for edge AI research.
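A loading pipeline with both LibTorch and GGUF/llama.cpp paths implies some backend-selection layer. One way to picture it is a small registry that picks a backend factory by model-file extension; everything here (`Backend`, `BackendRegistry`, the `GgufStub`) is an illustrative sketch, not ml-engine's real interface:

```cpp
// Sketch of a pluggable backend registry: different inference engines
// (LibTorch, llama.cpp, ...) sit behind one interface, selected by the
// model file's extension.
#include <functional>
#include <map>
#include <memory>
#include <stdexcept>
#include <string>

struct Backend {
    virtual ~Backend() = default;
    virtual std::string infer(const std::string& prompt) = 0;
};

// Stub standing in for a real llama.cpp-backed implementation.
struct GgufStub : Backend {
    std::string infer(const std::string& prompt) override {
        return "gguf:" + prompt;
    }
};

class BackendRegistry {
public:
    using Factory = std::function<std::unique_ptr<Backend>()>;

    void register_backend(const std::string& ext, Factory f) {
        factories_[ext] = std::move(f);
    }

    // Picks a factory from the file extension of `path`.
    std::unique_ptr<Backend> open(const std::string& path) {
        auto dot = path.rfind('.');
        auto ext = (dot == std::string::npos) ? "" : path.substr(dot + 1);
        auto it = factories_.find(ext);
        if (it == factories_.end())
            throw std::runtime_error("no backend for " + path);
        return it->second();
    }

private:
    std::map<std::string, Factory> factories_;
};
```

The REST layer then only ever talks to `Backend`, so swapping LibTorch for a llama.cpp path is a registration change rather than a serving-code change.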

Other Work

  • Additional repositories & experiments: smaller tools, build scripts, and exploratory experiments related to edge AI and systems programming. See the GitHub profile for the full listing.