We’re Not Just Using AI — We’re Becoming It
On how large language models are shaping us as much as we’re shaping them.
ChatGPT, Gemini and other large language models (LLMs) aren’t just delivering facts or helping us code; they’re reshaping how we think, how we communicate and how we relate to synthetic intelligence.
In short, they’re becoming part of our social world.
We can’t just think of AI as a “tool” anymore. It’s something more structural — and maybe even cultural.
What if AI is a social structure?
This is where a bit of sociological theory comes in — specifically, Anthony Giddens’ Structuration Theory.
Giddens (a big name in British sociology) had a deceptively simple insight:
Structures don’t exist “out there” like buildings or blueprints. They exist in practice.
That is, they’re created and maintained through the things we do every day.
Think of language, for example. There’s no one person enforcing grammar rules every time we speak. But we all more or less follow them, and in doing so we keep the structure of language alive. If everyone stopped following the rules, the structure would collapse.
Giddens called this the duality of structure: structures shape what we can do, and we shape structures by doing.
“The structural properties of social systems are both the medium and the outcome of the practices they recursively organize.”
— Anthony Giddens, The Constitution of Society (1984)
Now, apply that lens to LLMs.
LLMs are trained on us
LLMs like Claude are trained on massive archives of human-created content: articles, websites, social media, academic papers, e-books, blog posts — the stuff of the internet, in all its beauty and bleakness.
In other words, they’ve absorbed our habits of speech, our cultural assumptions, our biases, our metaphors and our contradictions, all embedded in the language, media and expressions we produce. These systems are built on our past conversations and structured by human behavior.
But here’s where it gets recursive.
Once we start using them, we begin adjusting to how they respond. We learn how to prompt better. We phrase things in ways that get the “right” kind of answer. We absorb their tone. We mimic their style or biases — often without realizing it.
We build the AI and then it starts building us.
Prompting is the new social skill
If you’ve ever shared a clever prompt on Reddit or watched a YouTube video on “how to talk to AI like a pro,” you’ve participated in a new kind of digital culture. Prompting has become a social practice — a learned behavior that spreads and evolves.
We now have unwritten norms about what works:
“Tell it to act like an expert.”
“Use bullet points.”
“Be clear and direct.”
“Give it context, like you would with a coworker.”
The more we do this, the more we form habits, and those habits shape the outputs we get. What’s more, we teach each other these habits, and they solidify into expectations.
Eventually, they stop feeling like strategies and start feeling like just the way you use AI. That’s structure.
“What is important about structural properties is that they are instantiations of rules and resources, drawn on in the production and reproduction of social action.”
— Anthony Giddens, The Constitution of Society (1984)
So what does this mean?
LLMs are interactive, evolving systems. They’re socio-technical structures: systems that are shaped by human input and that shape human behavior in turn. They’re part infrastructure, part culture. Part product, part partner.
We’re all participating in building them — not just the engineers, but the everyday users, too.
Every time we talk to AI, we’re reinforcing patterns.
Every time we change how we prompt to get a better answer, we’re adjusting to those patterns.
Every time we teach others how to prompt better, we’re spreading those patterns.
That’s structuration.
We’re not just users — we’re co-creators
This perspective is both humbling and empowering.
It means we can’t just blame AI for how it behaves — it’s behaving based on us. And it means we have agency. The way we use these systems today influences how they evolve tomorrow.
So next time you chat with a model, think about what you’re doing. You’re not just typing. You’re participating in the co-creation of a new digital structure.
And like any structure, it can be bent, reshaped or reimagined — depending on how we choose to engage with it.
A closing thought
We often ask: What is AI doing to society?
But maybe a better question is:
What kind of society are we building with AI?
And do we like the shape it’s taking?