Tagged: llm
Showing 11–15 of 36 articles
Data Pipelines in 50 Tokens
ilo now handles the full fetch, parse, transform, aggregate, write cycle. The builtins that made it possible.

jCodeMunch: Use 80% Less Context
jCodeMunch-MCP uses tree-sitter AST parsing to build a symbol index of your codebase, letting AI agents pull individual functions instead of whole files. The token savings are significant.

Getting the most out of Claude Code
I ran a half-day Claude Code workshop for a team of technical and non-technical staff. Here's what I covered, what landed, and the habits that stuck.

Tokenizers Don't Care About Your Abbreviations
A short-naming convention for ilo, tested against cl100k_base. The tokenizer already handles common English words in one token.

--explain: Reading a Language You Don't Know
ilo's annotation tool shows the structural role of every statement. Built for agents, useful for humans too.