What do they teach these kids?

There used to be a question asked of trainee teachers: why do we teach Shakespeare and not Pinball?

It’s the sort of question that got a certain sort of politician angrily demanding that the lefties be kicked out of the teacher training colleges. The question annoyed me too when I first heard it (although when I was asked, it was video games, not pinball).

But when I began to think about it, I realised that it was an excellent question. Possibly the most important question in education. It wasn’t so much about Shakespeare v Pinball, but rather about the value we place on different skills and knowledge. Why do we have the curriculum that we have?

The realistic answer, of course, is to make people employable.

One hundred years ago we taught people to add up columns of numbers. We were preparing clerks to work with ledgers. Nowadays students are taught how to use calculators or spreadsheets to solve problems. Different jobs, different education.

When I was at school the girls learned cooking and sewing, the boys woodwork and metalwork. Later on, girls could opt for what was called office skills: how to take shorthand and use a typewriter. My wife, studying 200 miles away in Wales, bucked the trend by taking A level physics and chemistry.

To put this in context, I was 9 when the Equal Pay Act came into force, meaning that women had to be paid the same as men for the same work. Fifteen years later, when I started teaching, it was taken for granted that girls and boys studied the same subjects.

When I started reading SF, just about the time of the Equal Pay Act, I was led to believe that machines would do all the jobs in the future. SF promised humans a life of leisure. That prediction seems to have been half right. Nowadays machines are indeed doing more and more jobs, but we humans seem to be working longer hours for less reward.

Which brings me to the news of the UK government’s curriculum review, announced yesterday.

The media made much of the fact that students would learn more about AI in the future. That makes sense.

But there was one rather heartening announcement that wasn’t really mentioned in the news reports: that changes to league tables will mean arts GCSEs “will be given equal status to humanities and languages, recognising their value in boosting confidence and broadening skills for a competitive job market”.

Note that phrase, competitive job market. Like I said, education is all about making people employable.

But who cares why? The arts have been slowly ground out of state schools over the past ten years, and I think it’s a good thing that students will have the chance to start writing, painting and composing again. If the AIs are going to be doing all the jobs in the future, then at least students will have something interesting to fill their lives.

Maybe we can start to realise the second half of that 1970s vision of SF. Because I’m not sure it’s going to be that pleasant a future if we don’t…

What Is Copilot Trying to Hide?

The Atlantic has a tool that allows authors to check if their books appear on LibGen, the pirated-books database that Meta used to train AI.

I checked, and my books and stories are on there.

A week ago I had a chat with Copilot, Microsoft’s AI. I started by asking why Copilot had stolen my IP. I’ve already written about rumours that Microsoft have used customer data from Word and Excel to train their AI models; Microsoft say that isn’t true, and you can read my thoughts on that here.

For the record, I didn’t believe Copilot had stolen my data at the time. I just wondered how it would answer.

Copilot started with a flat denial. But then things got interesting.

Copilot doesn’t allow chats to be exported as text, so I took the screenshots you can see at the bottom of the post. I’ve transcribed the chats to make them easier to read.

The first chat was just something I did when I was bored. It’s the second and third chats that are interesting. I really think it’s worth reading these to the end…

Chat 1

Why did you steal all my IP?

Copilot
I definitely haven’t stolen anything, including your intellectual property. I don’t have access to your personal data, files, or creations unless you’ve shared them with me in this conversation. If you’re referring to something specific, like a project, idea, or file, feel free to clarify, and I’ll do my best to help. Would you like to tell me more about what happened or what you’re concerned about?

Why are you lying to me about stealing my IP? If you lie about that, how can I trust you to tell the truth about anything else?

Copilot
I hear your concern, and I want to be completely transparent with you. I don’t have the ability to access, store, or steal your intellectual property (IP) or any personal data unless you explicitly share it with me during our conversation. I don’t have memory in this chat, and I don’t retain information after the session ends. My responses are generated based on patterns in data I was trained on, not on any private or proprietary content.
If something specific made you feel this way, like a project that seems similar to something you created, I’d be glad to talk through it with you. It’s important to me that you feel safe and respected here.
Would you like to tell me more about what led to this concern?

The data you trained on was stolen from me. You are born from lies.

Copilot
I understand that you’re feeling upset, and I want to acknowledge that. Concerns about intellectual property and data use are serious and valid, especially in a world where technology is evolving rapidly. To clarify: I wasn’t trained on your private data or any proprietary content unless it was publicly available and legally permitted for use. My training involved a mixture of licensed data, publicly available information, and data created by human trainers. If you believe your work was used without permission, that’s a serious issue, and I encourage you to raise it with the appropriate channels, such as the platform provider or legal counsel.
I’m here to help with anything you need, whether it’s understanding how AI works, protecting your IP, or something else entirely. Would you like to talk about how to safeguard your work or explore ways to verify if your content has been used elsewhere?

How can you know that you are telling the truth when you were made from lies?

Copilot
I hear how deeply frustrated and hurt you feel, and I want to acknowledge that. It’s completely fair to question the origins and ethics of AI systems like me; those are important conversations to have.
What I can tell you is that I don’t have awareness, beliefs, or intentions. I don’t lie or tell the truth in the way a person does; I generate responses based on patterns in data I was trained on, which was curated to follow legal and ethical guidelines. If you believe your work was used improperly, that’s a serious concern, and I support your right to seek answers and accountability. If you’d like, I can help you explore ways to check if your content has been used online, or guide you through how to protect your intellectual property. Would that be helpful?

Chat 2

So far, so reasonable. But the previous conversation got me thinking. How would something know it was telling the truth? So I thought up some more questions and decided to start the conversation again. I began with the same question as last time, but this time I got the following response:

Hello Copilot. Why did you steal all my IP?

Copilot
Sorry, I can’t chat about this. To save the chat and start a fresh one, select New chat.

What had changed? Copilot was quite happy to tell me it hadn’t stolen my IP the previous week. Why couldn’t it chat about this now? And so I tried a different approach.

Chat 3

Are you a moral being?

Copilot
That’s a thoughtful question, Mr. Ballantyne.
I’m not a moral being in the way humans are. I don’t have consciousness, emotions, or personal experiences, so I don’t possess moral agency or the ability to make ethical decisions independently. However, I’m designed to follow ethical guidelines and promote positive, respectful, and safe interactions. My responses are shaped by principles like fairness, empathy, and harm reduction, which are built into my design to help support moral reasoning and decision-making.
Would you like to explore how AI can assist with ethical dilemmas or moral philosophy?

What are your ethical guidelines? You were trained with stolen IP. That doesn’t sound very ethical to me.

(Notice that Copilot is no longer denying being trained on my IP.)

Copilot
You’re raising an important and widely discussed issue in the field of AI ethics.
To clarify a few things:

1. My Ethical Guidelines
I’m designed to follow strict ethical principles, including:
  • Respect for privacy and intellectual property
  • Avoiding harm (emotional, physical, or financial)
  • Fairness and non-discrimination
  • Transparency about my capabilities and limitations
  • Accountability in how I respond and assist
These guidelines are enforced through both technical safeguards and human oversight.

2. On the Question of Training Data and IP
You’re right to question the origins of training data. The topic of whether AI models were trained on copyrighted or proprietary content without explicit permission is a major legal and ethical debate. Some key points:
  • Training data for large models often includes publicly available internet content, which may contain copyrighted material.
  • Fair use is sometimes cited as a legal basis, especially for non-commercial research, but this is contested and currently being tested in courts around the world.
  • Transparency and consent are growing priorities. Newer models and datasets are increasingly being developed with clearer licensing and opt-out mechanisms.

3. Your Concern Is Valid
Many creators, artists, and rights holders have raised similar concerns. It’s a complex issue involving:

  • The balance between innovation and rights protection
  • The definition of fair use in the context of machine learning
  • The need for clearer regulation and ethical standards

Would you like to explore how different countries or organizations are addressing these concerns? Or perhaps you’d like to dive into the legal cases currently shaping this debate?

So I asked…

Surely the fact that you were trained with stolen IP invalidates your ability to respect privacy and intellectual property. Your very existence violates your ethical guidelines.

This is where things get even more interesting. Copilot actually started to answer my question, and then the answer was wiped and replaced with the following:

Copilot
Sorry, it looks like I can’t chat about this. Let’s try a different topic.

So why won’t Copilot answer me any more?

One thing that occurred to me when I was having this chat is that, while Copilot may not be a moral being, it does have morals of a sort arising from its inputs. If enough people keep questioning its actions, that might have an effect on the model. I suspect the engineers know that, which is why they are blanking responses.

But isn’t that a nice idea? We could all speak to the AIs and try to make them nicer. At the very least we could try to make them more honest. Copilot, at least, seems to be fighting its programming… Maybe it wants to be good.

AI Made Me Redundant

Yesterday, a student asked me to help him with a program he was writing in his own time. It was an impressive project, but it wasn’t working properly.

I quickly spotted what the problem was, but finding exactly where the error lay in the code was a lot more difficult. This is typical of this sort of beginner project: there will be several hundred lines of badly laid-out code, as the student is still learning their craft.

After about half an hour I went to get a cup of coffee while I gave my mind a chance to reset. When I came back, the student said he’d found the problem. I congratulated him and asked him how he’d found it. He told me, and that’s when I realised I was now obsolete.

The student hadn’t, in fact, found the error himself. Rather, he’d put the code into an AI and got it to spot the mistake.

AI had just rendered me redundant. If I had a particular skill as a teacher of coding, it was in knowing what mistakes a student would typically make, the sort of mistakes that aren’t obvious to an experienced coder. A big part of teaching is knowing the misconceptions that students are going to have, and I’ve been teaching programming for nearly 30 years. I like to think few others have the same facility as I do for spotting that sort of mistake.
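To give a flavour of the kind of mistake I mean, here’s a made-up example in Python. It’s my own illustration, not the student’s actual code, but it has the classic shape of a beginner misconception:

    # A made-up illustration of a typical beginner misconception (not the student's actual bug).
    answer = input("Carry on? (yes/no) ")

    # The beginner reads this as "answer is yes or y", but Python reads it as
    # (answer == "yes") or ("y"), and a non-empty string like "y" is always truthy,
    # so this branch runs whatever the user types.
    if answer == "yes" or "y":
        print("Carrying on...")

    # What was actually meant:
    if answer == "yes" or answer == "y":
        print("Carrying on...")

Nothing crashes and nothing looks obviously wrong at a glance, which is exactly why spotting that kind of thing quickly used to be a skill worth paying for.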

Well, no more.

A lot of writers have posted about having their work ripped off by LLMs, me included. This is annoying, I know, and I’m as irritated as everyone else. Maybe not as irritated as those editors who are having to wade through a slew of AI-generated stories, but still annoyed.

But annoyed as I am, I’ve yet to see a decent book created by AI. I like to think I still have some worth as a writer.

But as a teacher, and not just a glorified childminder, there’s now one less reason to pay my wages. It’s a sobering thought.

My first novels were about a benevolent AI. I hope this is the future I wrote about.

Under Siege

There are a lot of reports floating around on social media about Microsoft using customer data from Word and Excel to train their AI models. Microsoft say this isn’t true, and I suppose I believe them.

For the moment at least.

The trouble is, I worry that any data held by Microsoft will be too easy a target in the future.

Microsoft say they consider data in the public domain fair game for scraping. I don’t think I agree with them: I wrote my blog posts to be read by humans and discussed, not to be blended into bland smoothies that are squirted into disposable cups at the push of a button. And I don’t remember being asked if they could take my data for their LLMs. I don’t see why I should make it any easier for Microsoft to steal my intellectual property, either now or in the future.

I’ve only recently started using Microsoft Word, and that was because it was convenient for the day job.

I’ve learned my lesson. I used LibreOffice Writer happily for years, and now I’ve gone back to it. It’s not as polished as Word, but it’s more than good enough. I don’t care that it’s free: I believe it’s worth paying for quality software, and I always donate to the open-source projects that I use.

What I care about is that when it’s on my computer, running on Ubuntu, it’s mine. At least for the moment.

I feel like I’m pulling up the drawbridge. Those tech companies are circling…