1 comment

  • alonsovm 2 hours ago

    Hello HN,

    I’m the author of Yori. It’s a command-line tool that treats natural language (English/Spanish) as source code and compiles it into standalone executables.

    The Problem: I was working on a hobby OS and got tired of the context-switching between "Architecting" (thinking through the logic) and "Implementing" (fighting syntax and boilerplate). I wanted a tool where I could write the intent and let the machine handle the implementation details.

    How it works: Yori isn't just a wrapper around an LLM. It acts as a build system that uses the C++ compiler (g++) as a "Truth Filter."

    Draft: It reads a .yori file (natural language instructions) and prompts a local model (via Ollama) or a cloud model (Gemini) to generate C++ code.

    Verify: It attempts to compile the output with g++.

    Evolve: If compilation fails, it captures the stderr output, feeds it back into the LLM along with the broken code, and asks for a fix. It repeats this loop until a valid binary is produced.
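
    To make the loop concrete, here is a stripped-down sketch of the idea in C++. This is not the actual Yori source: the model call is stubbed out, the retry cap is arbitrary, and the file names are placeholders.

        #include <cstdio>
        #include <cstdlib>
        #include <fstream>
        #include <iterator>
        #include <string>

        // Stand-in for the model call (Ollama/Gemini). The real tool sends the
        // prompt, and on retries the broken code plus compiler errors, to the
        // LLM. Hard-coded here so the sketch compiles and runs on its own.
        std::string generate_cpp(const std::string& prompt) {
            (void)prompt;
            return "#include <iostream>\nint main(){ std::cout << \"hi\\n\"; }\n";
        }

        // Compile the candidate with g++, capturing its stderr; true on success.
        bool try_compile(const std::string& source, std::string& errors) {
            std::ofstream("candidate.cpp") << source;
            FILE* pipe = popen("g++ -o candidate candidate.cpp 2>&1", "r");
            if (!pipe) return false;
            char buf[4096];
            errors.clear();
            while (fgets(buf, sizeof buf, pipe)) errors += buf;
            return pclose(pipe) == 0;
        }

        int main() {
            std::ifstream in("hello.yori");            // the natural-language "source"
            std::string prompt((std::istreambuf_iterator<char>(in)),
                               std::istreambuf_iterator<char>());

            std::string source = generate_cpp(prompt); // Draft
            std::string errors;
            for (int attempt = 0; attempt < 5; ++attempt) {
                if (try_compile(source, errors))       // Verify
                    return 0;                          // valid binary produced
                // Evolve: hand the broken code and stderr back for a fix
                source = generate_cpp(prompt + "\n\nBroken code:\n" + source +
                                      "\n\nCompiler errors:\n" + errors);
            }
            return 1;                                  // give up after a few rounds
        }

    The point is that g++'s exit status and stderr are the only ground truth the loop trusts; nothing ships unless the compiler accepts it.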

    Features:

    100% Local: Defaults to using qwen2.5-coder via Ollama. No API keys or internet required.

    Incremental Builds: Can update existing C++ files without rewriting them from scratch.

    Unity Build System: Supports IMPORT: tags to merge multiple prompt files into a single compilation unit.
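
    For instance, a top-level .yori file can pull in other prompt files with something like the following (file names are placeholders and the prompt body is elided):

        IMPORT: parser.yori
        IMPORT: ui.yori

        ...main program description in plain English...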

    It’s open source and written in C++. I know "AI coding" is a saturated market, but I wanted something that felt like a standard Unix tool—lightweight, local, and focused on producing binaries rather than just chat suggestions.

    The repo is here: https://github.com/alonsovm44/yori. I’d love to hear your feedback on the self-correction loop logic!