Looking back, 2025 has been a good year. Actually, it has been a transformative one.
It was a year defined by taking control: taking control of my code by removing bloated frameworks, taking control of my rights against a billion-dollar bank, and taking control of my time during the strike to rethink where I want to go.
Here is the breakdown of a very busy year.
The Legal Battle: David vs. BMO & KRMC
This was perhaps the most intense "side project" of the year. I represented myself in court against BMO and their legal team.
It wasn't just a dispute; it became a lesson in the legal system. I learned the law, practiced it, and stood in front of a judge against a billion-dollar institution. The result wasn't just a quiet settlement: it triggered a reported judgment by the BC Supreme Court and forced regulatory enforcement actions.
The Public Record:
- BC Supreme Court Judgment: 2025 BCSC 1371 (CanLII)
- Enforcement Action against KRMC (Law Firm): Case #32096 (Consumer Protection BC)
- Enforcement Action against BMO: Case #32104 (Consumer Protection BC)
My actions also triggered a Law Society investigation, and the FCAC has been informed. I will write a detailed blog post on this soon to warn others about how these institutions operate. But for now, I am proud that I didn't back down. I forced the issue, and now their undertakings are on the public record.
AI: From Work to the Metal
AI has been the theme of my professional life this year.
- At Work: I am working on AI initiatives at my company, which keeps me engaged and feeling good about my career trajectory.
- At Home (The Deep Dive): I didn't just want to use AI; I wanted to understand it.
I built a full implementation of GPT-2 in C: forward pass, backpropagation (a few numerical bugs remain), everything from scratch with AI's help. No PyTorch, no TensorFlow, just raw memory management and math. The inference engine achieved numerical parity with PyTorch (diff < 1e-5), which means my hand-written C produces the same outputs as the industry standard. That validation took 106 commits and countless debugging sessions, but it proved the approach works.
The Repository: github.com/antshiv/C-Transformer
What's Next: The C Kernel Engine
The GPT-2 project taught me what I needed to know, but it also taught me what not to do. The codebase grew to ~15,000 lines of monolithic code, functional but hard to extend. So I'm starting fresh with a kernel-based approach.
The C Kernel Engine treats each operation (matrix multiply, attention, normalization) as a modular kernel with forward and backward passes in one file. Think of it like building blocks: you can compose different architectures (GPT, LLaMA, Qwen) from the same validated kernels. The focus is CPU-first: x86, ARM, and eventually embedded systems, because I believe there's a real opportunity in efficient inference on commodity hardware.
Coming Soon: github.com/antshiv/c-kernel-engine
Linux Renaissance (AI-Enabled)
I used to work on Linux years ago at Ericsson: kernel modules, DPDK, the whole carrier-grade stack. But I drifted away from it. This year, AI brought me back.
Having Claude, ChatGPT, and Gemini as pair programmers changes everything. Debugging obscure segfaults, navigating unfamiliar codebases, setting up build systems: tasks that used to take hours now take minutes. AI does write a lot of the code for me, and it removes the friction that made deep systems work feel like a slog. It also lets me go deeper into refreshing my calculus, computer architecture, Linux system programming, performance tuning, and more. Linux is fun again, and I'm spending more time in terminals than I have in years.
Flight Controller Progress
On the robotics side, I've been making real progress on my custom flight controller. The goal is a from-scratch autopilot that I fully understand: sensor fusion, state estimation, control loops, all of it. This connects back to the AI work; eventually, I want on-board inference running on CPUs I control, not black-box vendor stacks. The 3-4 month build cycles I outlined on my GitHub profile keep this moving forward alongside everything else.
The BCGEU Strike & The Antsand Rewrite
I was on strike with the BCGEU for two months. While strikes are stressful, this time became a gift. It gave me the mental space to finally tackle the technical debt that had been weighing on me.
I used those two months to modernize Antsand. In practical terms: I ripped out Vue 2 (a JavaScript framework that's now outdated and bloated) and replaced it with pure ES6 and PHP. Why this matters: the platform now loads faster, has fewer dependencies to break, and I understand every line of code that runs it. No more waiting for framework maintainers to fix bugs. No more security patches I don't control. It's mine.
I doubt I could have achieved this speed without AI assistance, but the strike gave me the hours I needed to execute it.
Habits & Physical Conditioning
Beyond the code and the courts, I carried forward a few critical habits from last year that have kept me grounded.
- The 3 AM Start: I’ve maintained the discipline of waking up early, usually around 3:00 or 4:00 AM. After a shower and the morning routine, that quiet window before the world wakes up is when I get my best deep work done.
- Cold Exposure: I still end every shower with cold water. It’s a small daily test of will that wakes up the system.
- Tacfit Commando: I restarted my Tacfit training and am now 6–7 months into continuous conditioning. I’ve realized that Tacfit is almost entirely about conditioning: building a body that works, not just one that looks good.
Goal for 2026: While the training is consistent, my diet needs work. Next year, I want to dial in my diet: more meal prep, more cooking, and exploring a more diverse, healthy menu to fuel all this work properly.
Family
On the home front, my daughter Lucy is growing up fast. She has learned so much this year. Watching her grow is the backdrop to all of this—it puts the legal fights and the coding marathons into perspective.
Looking Ahead to 2026
I am ending 2025 with a modernized web platform, a deeper understanding of the law, and a C kernel engine in the works.
Next year, my focus shifts to the fuel that powers these models. I plan to spend a lot more time on training data and refining my understanding of the core algorithms. The architecture is built; now it's time to teach it.
Happy New Year.