Coggle

A local, completely offline CLI framework that lets any tool expose its capabilities through natural language. Adapter-based, GPU-free, no cloud, no APIs.

Description

The natural language layer for your CLI tools. Coggle works completely offline, uses an adapter-driven architecture, and saves you some of those Claude tokens ;)


Instead of learning tool-specific flags or prompting a cloud LLM, you describe what you want in plain language. Coggle figures out the right tool and the right command, and runs it locally.

$ coggle "move all files modified before 2022 to the archive"
$ coggle "trim yt_short_download.webm from 00:32 to 1:45 and save it as clip.mp4"
$ coggle "convert all pngs in this folder to webp"

The best part is that all processing happens locally in sub-second time (you have no idea how powerful a lightweight text-processing model and regex can be).

Why

Routine file tasks like format conversions, video trimming, bulk renames, and PDF splitting or merging typically mean uploading to a sketchy website, finding a one-off app, or prompting a cloud LLM. All three are slow, limiting, or dependent on third-party availability.

The tools that actually do this work (ffmpeg, ImageMagick, pandoc) are already on your machine. They're free, fast, and quite capable. Their one problem is CLI syntax, which gets messy very quickly and which most people don't want to deal with for one-off tasks.


How it Works

Coggle runs a lightweight, staged NLP pipeline: intent classification, query preprocessing, span splitting, pre-classification, selector classification, and slot mapping, driven by a hybrid of spaCy and a fine-tuned MiniLM. The stack is small enough to run on edge hardware and fast enough to feel almost instant. No CUDA. The heaviest dependencies are a 12 MB spaCy model and MiniLM's sentence embedding model, all-MiniLM-L6-v2, at roughly 80 MB.
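To give a feel for the selection step, here is a minimal sketch of embedding-based operation matching. It is not Coggle's actual pipeline: the operation names and descriptions are made up, and a bag-of-words vector stands in for the real sentence embedding model (all-MiniLM-L6-v2) so the example stays self-contained.

```python
import math
import re

def embed(text):
    # Stand-in for a sentence embedding model: a bag-of-words count vector,
    # just to show the cosine-similarity selection mechanism.
    vec = {}
    for word in re.findall(r"[a-z]+", text.lower()):
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    dot = sum(count * b.get(word, 0) for word, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical adapter-declared operations (names are illustrative).
OPERATIONS = {
    "ffmpeg.trim": "trim cut video clip from start to end time",
    "imagemagick.convert": "convert image format png jpg webp",
    "fs.move": "move files folder archive",
}

def select_operation(query):
    # Pick the operation whose description is closest to the query.
    q = embed(query)
    return max(OPERATIONS, key=lambda op: cosine(q, embed(OPERATIONS[op])))

print(select_operation("convert all pngs in this folder to webp"))
# → imagemagick.convert
```

A real implementation would replace `embed` with dense sentence embeddings, but the selection logic is the same shape: score every declared operation against the query and take the best match.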

See our GitHub repository for more on how this works under the hood.

Adapter System

Coggle's adapter system is its flagship feature: it turns Coggle into an efficient CLI wrapper ecosystem. Any CLI tool can be integrated by defining an adapter.

Adapter: a structured declaration of what a tool can do and what each operation needs. Basically, enough description for the pipeline to match user intent to the right command. The pipeline does the rest.

This makes Coggle something of a platform in itself. Third-party CLI developers can ship adapters for their own tools, and the moment an adapter is installed (put the YAML file in the folder, lol) that tool becomes accessible through plain language, automatically, with no changes to Coggle or its pipeline.
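As a rough illustration, an adapter might look something like the YAML below. This is a hypothetical sketch, not Coggle's actual schema; every field name here is an assumption.

```yaml
# Hypothetical adapter sketch — field names are illustrative,
# not Coggle's real schema.
tool: ffmpeg
operations:
  - name: trim
    description: cut a clip out of a video between two timestamps
    command: "ffmpeg -i {input} -ss {start} -to {end} {output}"
    slots:
      input:  {type: path}
      start:  {type: timestamp}
      end:    {type: timestamp}
      output: {type: path}
```

The idea is that the description and slot types are all the pipeline needs to route a query like "trim clip.webm from 00:32 to 1:45" to the right command template.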


Epic in Plain Sight

  • For End Users: You don't need to know ffmpeg flags or ImageMagick syntax. You just describe the task. Coggle handles the rest locally, with no privacy tradeoffs and no dependence on internet connectivity or third-party APIs being up.

  • For Developers: If your tool has a CLI, it can have a Coggle adapter. In fact, you can use any LLM to generate the adapter once. You define the capability once; Coggle makes it discoverable and usable in plain language for anyone who installs your adapter.

  • For Agents: Coggle can be used as a subprocess tool by other agents that need fast, reliable, local file operations without a round trip to an API.
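For the agent use case, invocation is just a subprocess call. A minimal sketch (the `run_coggle` helper is hypothetical, and it guards against `coggle` not being on `PATH`):

```python
import shutil
import subprocess

def run_coggle(request: str) -> str:
    """Invoke coggle as a subprocess and return its stdout (sketch)."""
    if shutil.which("coggle") is None:
        # Hypothetical fallback for machines without coggle installed.
        return "coggle not installed"
    result = subprocess.run(
        ["coggle", request],  # the request is one plain-language argument
        capture_output=True,
        text=True,
    )
    return result.stdout

print(run_coggle("convert all pngs in this folder to webp"))
```

Because the interface is a single natural-language string, an agent never has to know ffmpeg or ImageMagick syntax either.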


Current Status

Coggle is in active development. The filesystem vertical is the current proof of concept. We are currently building the NLU pipeline: the span pre-classification stage and adapter design.