The Loyal Wingman

All this machinery making modern music
Can still be open-hearted
Not so coldly charted

Rush, 'The Spirit of Radio'

It all started, as these things usually go, with a failure -- but at the time it seemed like the definitive ending. Despite numerous books and printout guides (this was before Coursera), I just could not figure out programming. Something did not click for me. After several stabs at books with encouraging titles like 'Assembler for Beginners', and after flunking out of a math major at Moscow State, I closed that chapter of my life for good. Or so I thought. I moved on to video game marketing, startups, and eventually, product management.

Fast forward to 2023. ChatGPT. The frenetic vibe in the air, at least if you were on Twitter. I'm flying somewhere with my daughter, not yet 2 at that point, and to alleviate the stress of travel I start making storybook pictures for her -- little scenes to help her imagine the trip. Days later, I'm stumbling toward the store to pick up groceries, jet-lagged and groggy, when my brain slowly catches up: Wait a minute. This could be an app.

Days pass, then weeks. Months. I learn about Heroku, Vercel, TypeScript, APIs, Python. "Learn" is probably not the right word -- this is more a process of frenetic discovery, fueled by endless questions and late-night cursing at LLMs. The first app is terrible and doesn't really work; neither does the second, nor the third. Vibe coding is suddenly a thing, and everyone is doing it. Bolt, Lovable, v0. I land a PR at my job a week after being hired -- ecstatic and terrified. Land another in the open-source repo at the job that came after. Then comes the slow, slow shift in thinking: from seeing every engineering problem as incomprehensible and messy to breaking each one down into individual pieces you can test, validate, and improve. "Wait a minute. This reminds me of something else," I say to myself as I open a bunch of old notebooks. Very old notebooks.

I've always wanted to be a writer. "Wanted" in the same sense that kids (present company included) want to be astronauts, or cops, or truck drivers: a passive desire with nothing attached to it except frustration when it doesn't happen. I had ideas, of course -- solid ones -- but nothing that would magically blossom overnight. I would write things down, sure -- only to throw them away in disgust the next morning. Characters would live in my head, but end up as shambling zombies on paper. Dialogue lines would flow effortlessly -- until I tried nailing them down.

Armed with the "vibe coding" (pah!) insight, I came to the obvious realization that I needed to iterate to get better. But I couldn't iterate when every draft felt like slop. Discovery writing -- the kind you need when you have no idea what your story will look like -- requires output, but I couldn't stand my own output long enough to discover anything. What I needed was a sparring partner.

And where better to start than in a well-defined genre? I've always been a huge SF&F nerd, and dark fantasy in particular holds a special place in my heart. Planescape, The Witcher, Vampire: The Masquerade, Gaunt's Ghosts: morally ambiguous characters navigating an uncaring world have always held sway with me. (Yes, this includes Taylor Swift's 'Anti-Hero.') And so Alisa Chernova -- Alice Black -- was born. 'Grimdark Buffy in Putin's Russia' lived in my head for many, many years, until AI showed me how to let her loose.

It didn't happen overnight. There were multiple attempts to get going, a failed AI-enabled-writing platform going down and taking a 3,000-word draft with it (for the best), a year of spinning my wheels, and a scandal-inspired short story (still a work in progress). What moved things forward was deciding on the end goal: "I want to write a novel -- at least 80,000 words, genre-appropriate -- self-publish it at reasonable cost, build a modest community, and learn. No expectations of success."

So, what have I learned while learning to think like an engineer? First: clean, single-purpose tools beat bloated "Swiss Army knife" platforms. A simple editor like Obsidian.md works for me (I'll never embrace vim, more's the pity). Similarly, Claude Code, with its minimalistic CLI interface, needs no fancy UX: it's extremely good at following instructions in plain English and interfaces seamlessly with Obsidian's markdown files. Everything is backed up to GitHub -- necessary self-preservation when LLMs can wreak havoc on your filesystem, and insurance against "YC-backed startup going down with all your data" scenarios. Finally, there's acknowledging that writing is modularized work: web research, world bible creation, outlining, drafting, copyediting, polishing, the LLM yelling at the writer (yes, that happened, and no, that is a story for another post) -- all independent tasks requiring different, programmable approaches.
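For the curious, the GitHub backup half of that setup amounts to very little -- here is a minimal Python sketch, assuming a local Obsidian vault that already lives in a git repository with a GitHub remote. The vault path, branch name, and commit message are illustrative placeholders, not my exact configuration.

```python
#!/usr/bin/env python3
"""Snapshot the writing vault to GitHub after a session.

A sketch under assumptions: the vault lives at ~/writing-vault (a
placeholder path) and is already a git repository with an "origin"
remote pointing at GitHub, default branch "main".
"""
import subprocess
from datetime import datetime
from pathlib import Path

VAULT = Path.home() / "writing-vault"  # hypothetical Obsidian vault location


def snapshot(message: str | None = None) -> None:
    """Stage all changes, commit with a timestamped message, and push."""
    message = message or f"Session snapshot {datetime.now():%Y-%m-%d %H:%M}"
    subprocess.run(["git", "add", "-A"], cwd=VAULT, check=True)
    # "git commit" exits non-zero when there is nothing new to commit,
    # so only push when a commit was actually created.
    committed = subprocess.run(["git", "commit", "-m", message], cwd=VAULT)
    if committed.returncode == 0:
        subprocess.run(["git", "push", "origin", "main"], cwd=VAULT, check=True)


if __name__ == "__main__":
    snapshot()
```

Run something like this at the end of a writing session (by hand or on a schedule), and any havoc an LLM wreaks on the vault is one git checkout away from being undone.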

Which brings me to the big, ugly elephant in the room.

In September 2024, National Novel Writing Month (NaNoWriMo) endorsed AI use. The backlash was immediate and brutal -- prominent authors Daniel José Older and Maureen Johnson publicly resigned from the board. It was described as "one of the biggest backlashes against the nonprofit yet." Over 70 authors signed an open letter demanding publishers "never release books that were created by machines." Many presses now auto-reject -- and ban -- writers who submit AI-assisted work.

Professional writers surveyed by Josh Bernoff don't hold back:

"There is no place for AI in the creative writing sphere, in my opinion."

"The rise of AI has shown me what people in general really think of writing — they hate doing it, and they don't see it as valuable work."

I could go on about steam engines, and buggy-and-horse-whip manufacturers, and creative destruction, but this is not the point. People are perfectly capable of forming their own opinions, and if they consider the spoor of the thinking machine anathema to creativity, who am I to challenge that? Nonetheless, I have two thoughts on this.

First, as a creative replacement, AI flat-out sucks. It does not hold the writer's voice, it loses context, it overuses tropes, it just plainly messes things up. By their nature, large language models predict the highest-probability outcomes, which is, of course, antithetical to most (all?) great writing. Art thrives on the margins, and AI-as-creator produces lifeless, flat prose.

That said, AI excels in many other crucial roles. Research assistant: I can perfectly well do without manually running 150 Google searches on the price of PM ammo in 2007 Moscow, thank you very much. Brainstorm partner: listen to my stupid ideas for 2 hours while I ride a train, then give me a decent summary of our rambling conversation. Process unblocker: nudge me towards doing 5 minutes of something useful instead of researching blog cover images. Ruthless editor: too ruthless, sometimes! (more on that later). I think of AI as a loyal wingman -- a powerful tool, but a terrible master.

I'm not J.K. Rowling or Brandon Sanderson -- writers with immense talent and drive whom I deeply respect. But I don't need to be. Much like in open source, if the work is good enough, if it's not slop, then maybe it's worth contributing. Maybe the world is a tiny bit more interesting with the story of a Moscow monster-hunting girl with a pair of oversized guns and an attitude. And if documenting how I got there helps even one other stuck writer start shipping their own thing -- with or without AI -- then this blog will have done its job. Avant!