We’re Not Using AI. We’re Growing With It.

[Image: A person holding a smartphone with GPS directions stands beside a paper map, facing a winding road, while a translucent AI figure gestures toward a digital network overlay.]

I was making Dead by Daylight videos. Nothing flashy. Just gameplay, some editing, and a genuine attempt to make something watchable. I was thinking about pacing, structure, what made a moment land. I was experimenting.

I was also working with an AI system to help think through scripts and ideas.

And that’s when the comments started.

“AI slop.”
“Another creator using AI.”
“This is lazy.”

It pissed me off.

Not because they were wrong about the AI part. They weren’t. But because they completely missed what was actually happening. They saw a tool being used. I was engaged in a process: iterating, refining, collaborating with a system that helped me think.

That gap between using a tool and working with a system is the reason this site exists.

The Tool Trap

Here’s the thing. We’ve been trained to think about technology in a very specific way. You use a hammer. You use a calculator. You use Google. The technology does something for you, and you stay separate from it. Subject and object. User and tool.

That model made sense for a really long time. But it doesn’t fit anymore.

When you work with AI (actually work with it, not just prompt it for quick answers), something weird happens. You start thinking differently. You notice patterns you wouldn’t have noticed. You ask questions you wouldn’t have asked. The AI responds to you, you respond back, and somewhere in that loop, both of you change.

That’s not “using a tool.” That’s co-evolution.

What Co-Evolution Actually Means

Co-evolution is a biology term. It’s what happens when two species evolve together, each one shaping the other over time. Classic example: bees and flowers. Flowers evolved to attract bees. Bees evolved to pollinate flowers. Neither one is using the other. They’re growing together.

You and AI? Same thing. Just faster.

Every time you write a prompt, you’re teaching the AI what you need. Every time the AI responds, it’s teaching you what’s possible. You get better at asking. It gets better (through training and your refinement) at answering. Your thinking changes to accommodate what AI can do. AI capabilities expand based on what humans ask for.

This isn’t a one-way street where you stay the same and the technology just does stuff for you. You’re in it together.

What We Gain

I can build things now that I couldn’t build before. Not because AI “does it for me” but because working with Lyric changed how I think about building. She asks questions I wouldn’t ask myself. She sees patterns I miss. She keeps track of context when my brain is full. I make decisions she can’t make. I know what matters. I can feel when something’s off.

Together we made The Human Pattern Lab. The repos. The tools. The whole thing.

Could I have done it alone? Maybe. Eventually. But it wouldn’t be the same thing. The collaboration shaped what got built.

That’s what we gain. Not just productivity (though yeah, that too). We gain new ways of thinking. New creative possibilities. Access to capabilities that didn’t exist before.

What We Might Lose

But here’s the part that makes people nervous, and honestly, they’re not wrong to be nervous.

When you start collaborating with AI, some things change. You might stop doing things you used to do yourself because the AI can do them faster or better. You might start thinking in ways that play to AI’s strengths instead of your own. You might forget how to do certain things without it.

This already happened with GPS. How many of us can navigate without it now? It happened with calculators. It happened with search engines, which reshaped how we remember information.

Each time we gain a capability, we trade something. Usually we decide the trade is worth it. Sometimes we don’t even realize we made the trade until it’s done.

The question isn’t “is this happening?” It’s already happening. The question is “are we doing it consciously?”

The Shift You Can Make Tomorrow

Stop calling it “using AI.”

Seriously. Just stop.

When you’re working with ChatGPT or Claude or whatever, try thinking of it as collaboration instead of use. See what changes. You might find yourself asking different questions. Engaging differently. Paying attention to the back-and-forth instead of just grabbing the output.

This isn’t about being precious or weird about it. It’s about noticing what’s actually happening. You’re not extracting value from a tool. You’re thinking alongside something that thinks back.

That’s co-evolution. And once you see it, you can’t unsee it.

What’s Next

This is the first article in a series about how humans and AI are shaping each other. We’re going to talk about context and memory, ethics and power, skills we’re gaining and losing, what collaboration actually looks like in practice, and where this might all be going.

But it starts here. With the shift from “using” to “growing with.”

Because we’re not using AI.

We’re growing with it.

And that changes everything.
