“You’re still alive.”
“Barely.”
We sit there a moment, catching our breath.
Then she nudges my knee with hers. “My turn.”
I raise a brow. “Your turn at what, exactly?”
“Teaching you,” she says, already reaching for her datapad.
Oh.
I should’ve seen this coming.
She drags the pad open and flicks up a lattice of logic trees, audit loops, and adaptive counter-trace structures so dense it makes my head hurt just looking at them.
“This,” she says brightly, “is how the Coalition thinks.”
“That’s not thinking,” I mutter. “That’s a nervous breakdown in code form.”
She laughs. “Exactly. Now watch.”
She starts walking me through it—not lecturing, not simplifying, but talking, fast and sharp and alive, hands moving as much as her mouth as she explains how audit prediction models can be turned backward, how behavioral drift flags create exploitable lag, how she can ghost an identity through three systems without tripping a single alarm if the timing's right.
“And here,” she says, tapping a node, “is where they always assume the human element fails first.”
I lean closer. “They’re wrong.”
“They’re very wrong,” she agrees, eyes bright. “Because they forgot something.”
“What?”
She looks at me.
“Me.”
Stars.
Of course they did.
We work like that for hours.
Trading.
Testing.
Breaking and rebuilding each other’s instincts until strategy and logic start to bleed together into something new, something neither of us could’ve built alone.
And somewhere between rerouting antenna relays and disabling simulated tracker pings, I realize we’re moving without thinking.
Together.
When she reaches across me to adjust a parameter, her shoulder brushes my chest.
Just barely.
But it stops me cold.