nanograph 1.1.2
Multimodal is here. nanograph now handles images, audio, video, and PDFs as first-class graph citizens — external media URIs with vector embeddings you can search and traverse like any other node.
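As a rough illustration of the JSONL ingest shape, a media node might look like the line below. Only `@media_uri` is named in this release; the other field names (`id`, `label`, `caption`) are assumptions for the sketch, not the documented schema:

```jsonl
{"id": "img:cat-001", "label": "Image", "@media_uri": "s3://assets/cats/001.jpg", "caption": "tabby cat on a windowsill"}
```

The Blobs and Media Nodes doc covers the actual JSONL formats and media-root behavior.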
Under the hood, the runtime and query engine have been substantially reworked. Reads are lazy and metadata-first. Relational query tails run through DataFusion. Storage has been refactored with v4-ready seams, and derived Lance mirror tables now materialize committed graph history. Both SDKs ship with full media support.
This is the canonical stable release for the current Lance v3 line — it supersedes v1.1.0 and v1.1.1.
What shipped
Upgrading
Run `brew upgrade nanograph`, or grab the binary from the release. The Rust crates are published on crates.io as `nanograph`, `nanograph-cli`, `nanograph-ffi`, and `nanograph-ts`.
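For Rust projects, the published crates can be added via Cargo. A minimal manifest fragment (assuming no optional feature flags; check the crate docs for any):

```toml
[dependencies]
nanograph = "1.1.2"
```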
New databases default to Lance v3 / storage format 2.2. Existing 2.0 databases remain readable. See the migration guide if you want to upgrade storage format.
Skills updated
The nanograph-skills package has been updated for 1.1.2, and both skills now cover the new features.
New docs
- Blobs and Media Nodes — media storage, @media_uri, JSONL formats, and media-root behavior
- Embeddings — providers, @embed, text and multimodal vectors, hybrid search
- Skills — install agent skills so your AI tools know how to operate nanograph
Star the repo if you want to follow along.