Running an AI Agent on Your Own GPU: A Practical Walkthrough

There’s something satisfying about running a large language model on your own hardware. No API keys, no usage limits, no data leaving your machine. Just your GPU, some Docker containers, and a surprisingly capable AI sitting right there on localhost.

Running Kiro CLI in Docker with MicroK8s

For those using agentic command-line tools in development workflows, running them in containers on MicroK8s offers useful benefits: sandboxed execution, automated scheduling, and environment consistency. I walk through the approach in my latest post.
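As a rough illustration of the scheduling benefit, a Kubernetes CronJob can run a containerized CLI tool on a fixed cadence. This is only a sketch: the image name, registry address, and arguments below are placeholders, not details from the post.

```yaml
# Hypothetical CronJob: runs a containerized CLI agent nightly on MicroK8s.
# Image name, job name, and args are assumptions for illustration.
apiVersion: batch/v1
kind: CronJob
metadata:
  name: agent-nightly
spec:
  schedule: "0 2 * * *"        # every day at 02:00
  concurrencyPolicy: Forbid    # never overlap two runs
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: agent
              image: localhost:32000/agent-cli:latest  # MicroK8s registry addon default address
              args: ["--task", "nightly-report"]
```

Applied with `microk8s kubectl apply -f cronjob.yaml`, this gives each run a fresh, sandboxed container and a consistent environment, with no host-side cron entries to maintain.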

Exploring Model Context Protocol (MCP)

I’m always exploring new protocols and tools to improve the performance, clarity, and maintainability of the systems we build. One intriguing development I’ve been examining recently is the Model Context Protocol. Today, I want to introduce you to this concept.