My book was scraped to train AI. Yet I still use these tools every day.
The fact that Big Tech corporations have stolen the intellectual property of small creators makes many people – including me – uncomfortable.
One of my books was used to train large language models without my permission or compensation. I’m just one of tens of thousands of creators this happened to. I did not agree to it. I do not like it. And I don’t expect a cheque from Big Tech. That window has closed.
At the same time, I use these tools every day. They help me think, draft, organise, and test ideas faster than I could alone.
Is that contradictory? Possibly. But to me it feels more honest than pretending I can step outside technology that is already reshaping how we all work.
We still have to decide how we respond.
From where I sit, there are three options.
- We can refuse to engage with AI and keep our hands clean. That choice has a cost. Competitors move faster. Teams experiment. Markets shift. We stay “pure” and become increasingly irrelevant.
- We can engage without thinking. We use the tools, ignore where they came from, and brush off the ethical questions as someone else’s problem. That choice has a cost too. It strips creators of agency and trust.
- Or we can engage deliberately. We use these tools because they are useful, while staying alert to the tensions they introduce and speaking honestly about them.
I choose that third path.
This is not a new position for me.
Since long before AI entered the mainstream, I’ve been talking about the value of sharing your knowledge freely.
I firmly believe that demonstrating your expertise publicly does not reduce your value:
A mechanic can post a video explaining how to change your oil. You do not suddenly become an expert mechanic by watching it. What you do learn is that they know their craft. You see the complexity involved. You can even decide to do the oil change yourself, using the video as a guide.
But quite often you’ll decide it’s easier to skip getting dirty, buying the right type of oil, raising the front of the car, opening the sump nut, and disposing of six litres of used motor oil – not to mention the risk of hurting yourself – and instead pay the $100 for a professional to handle it in 45 minutes while you have a coffee and read a great book.
Awareness and application have never been the same thing.
Knowledge creates understanding. Application requires judgement, context, and experience. That gap is where professional value lives.
AI has not erased it.
The AI debate brings an old tension into sharper focus.
My IP was taken without consent. That matters.
I benefit from tools partly built on that work. That also matters.
I am not convinced those facts can be fully reconciled.
And the complexity doesn’t end there.
When I use these tools, I’m not just navigating MY own stolen content.
I’m also benefiting from other experts’ work – some shared freely, some taken without their consent too.
Meanwhile, I’m actively encouraging AI to train on my public content, hoping it recommends me.
And that’s not even addressing the US and UK governments signalling support for training on copyrighted materials without fair compensation – a position welcomed by Big Tech and opposed by creators everywhere.
The layers keep unfolding.
What I am convinced of is this: pretending the question does not exist helps no one.
So I work with principles.
I use AI extensively as a collaborator: for drafts, brainstorming, structure, refinement, even final copy in “low-stakes” environments. Human judgement stays in the loop. Final sign-off is always mine.
Am I transparent about AI use? Yes, within reason. I don’t hide it. But I also don’t perform disclosure for its own sake: the work is mine because the responsibility is mine. If it’s wrong, I own that. The tool doesn’t.
This approach will not satisfy everyone. Some people want a boycott. Others want silence. I think both miss the reality on the ground.
AI processes information. Humans make decisions. The responsibility for those decisions has not disappeared. If anything, it has increased.
Your work product is your work product, regardless of the tools you used – you are professionally responsible for it.
This matters now because the norms are still being set.
How we talk about this tension shapes what comes next.
So that is where I land. Acknowledging the discomfort. Using the tools anyway. Staying honest about the trade-offs.
Where do you land on this? Do you draw a hard line, or are you trying to work inside the mess too?


