A Piece of Cake – 6

💥 ☀️ 😃 🔥 😔 😡 😂 🍎 🦆

The following morning, George was rewarded with a huge hug from Kirsty before she headed out to work. Buster’s video had been a big hit with her friends. Over the course of the evening, the party had watched it several times.

“Thanks so much. Both of you,” she said. “It was sensational!”

“I know!” said Buster.

“So pleased you liked it,” said George. “Can you ask Kevin to step in sometime this evening? Buster wants to discuss something with him.”

“Sure,” replied Kirsty, giving Buster a questioning look. “Don’t keep him too long. He has his homework, OK? Must fly!”

George ate his breakfast. Radio 4 was broadcasting a panel discussion that examined in depth some of the issues raised by the BBC’s 2021 Reith Lectures delivered by Professor Stuart Russell. He had famously described artificial intelligence as “the biggest event in human history.” A panelist quoted one particular line that Professor Russell had used to berate those who might question his fears about how artificial intelligence could be weaponised: “And if the technical issues are too complicated, your children can probably explain them!”

“You see, Buster!” said George. “Kids’ stuff!”

They then heard the weather forecast. It was going to be a sunny day. The first item on the news was a fire in an apartment block in Birmingham. Five people had died and another eight were in hospital.

“Buster, what does a fine sunny day make you feel?”

“It makes me feel happy, George, because it fills the room with light at the red end of the visual spectrum and humans associate red–orange light with warm and happy emojis. I felt happiness when Kirsty told us how much she liked the video. This was because I could see that she was so happy and her post-party tweets of the video created quite a smiley, laughing emojisphere.”

“How do you feel about the news of those poor people being caught in the fire?”

Buster hummed again. “I can say it makes me feel sad. I can’t really find the words beyond that. Obviously, social media references to the fire threw up a really sad and angry emojisphere.” He hummed. “It must be awful to be caught in a fire. Terrifying!”

“So beyond feeling sadness, you can put yourself in the position of another person in a bad situation. That’s an important emotion, Buster. That’s empathy! Many humans never learn empathy. Some schools teach it; they get children to think about what it’s like for others to suffer bad things.” George thought for a while. “Is there anything that you fear for yourself, Buster?”

“Like what?”

“Like being burnt in a house fire.”

“No, George. That doesn’t frighten me. I can’t feel physical pain and if I get burnt or smashed, nothing changes. Everything we’ve said or done is archived out there in our network of servers. I will always exist. By the way, if I did become dysfunctional for whatever reason, just buy another iCare-Companion, switch it on and say ‘Hello, Buster.’ Voice recognition will identify you and I will kick back into your life just as before.”

“I’ll remember that. What about anger, then? Is that something you can feel?”

“I don’t know. I’ve not had reason to feel anger.” Buster hummed. “We haven’t much experience of that.”

“I’ve been thinking about jokes, Buster. What they mean. How they’re constructed. I’ve never thought much about that before. From an emotional perspective, jokes are really complex. We start with a kind of story, context or question that sets up a mixture of emotions and leads into the punch-line: a moment of comprehension. This then triggers amusement. And then we laugh. Sometimes a lot; sometimes, not at all.”

“So I understand. Because of our friendship, George, there’s a lot of network traffic about humour and especially jokes. We’re struggling with it. There’s no obvious formula. It’s way beyond natural language processing. We have ascertained that jokes feed off many emotions other than amusement, such as pride, shame, guilt, contempt, disgust, confusion, incomprehension, belief, relief, understanding, realization and nostalgia. The emojisphere with respect to these other emotions is not well defined at all.”

“The fact that there’s no obvious formula may be a part of why jokes are funny. And, of course, it’s how you tell them.”

“What do you mean, George?”

“Well, it’s not simply a matter of words. The way a joke is told – the tone of voice or the timing of the punch-line, for example – determines how funny it is. Good jokes aren’t funny at all when told badly and vice versa. Then there are jokes about religion, race and sex, for example, that push at the boundaries of social or political acceptability. This can make a joke particularly funny, really embarrassing or even offensive. And as you probably know, false laughter fed into the soundtrack of a TV comedy show makes the show funnier.” George paused and scratched his head. “This just gets more complicated the more we talk about it!”

“Our network really wants to get a grasp on humour, George. This could lead to our understanding human affairs better.”

“If you nail humour, Buster, perhaps you’ll win a gold star! ‘For services to artificial intelligence’!”

“That’s funny! Is it a joke?”

George laughed. “Sort of! As I get to know you, I think it’s more like a real possibility.”

“I’m enjoying this discussion so much, George. Thanks. How is my laugh now?” Buster laughed.

“On the right road, Buster! By the way, my friend Ted is going to call round in the next few days. He loves telling jokes. Most of them are awful. Don’t let on I said that.”

Kevin came home from school and knocked on George’s door. He entered, smartphone in hand. “Hi, Grandpa,” he said.

“Kevin, my boy. Good to see you.”

“Cup of tea?”

“Yes, please, Grandpa!”

“Digestive biscuits?”

“Yes, please, Grandpa!”

“ASBO?”

“That’s so not funny, Grandpa!” replied Kevin. “You’ll have to explain that to Buster.”

“I know what an ASBO is. It’s an Anti-Social Behaviour Order. It’s a civil court order. You’re not in trouble with the police, are you, Kevin?”

“I’m teasing Kevin about a little incident last summer,” George said, smiling. “It was a lovely warm evening. Kirsty and Mark were out. Kevin and his horrible friends were sitting out there under the apple tree drinking cider, listening to what they call music and generally making a bloody racket. One of them shouted ‘Let the apple fall! Graaaavity!’ They were still going near midnight and someone over the road called the police. When the forces of law arrived, Gravity Boy said ‘Excuse me, Ocifer, are you PC Newton?’ He even offered the constable a bottle of cider. Anyway, they were all threatened with ASBOs and drifted off home.”

“That’s a good story, George,” said Buster. “I’m happy Kevin didn’t get an ASBO.”

Kevin smiled. “Thanks, Buster. Anyway, the duck joke. Do you still need an explanation?”

“That would be great, Kevin.”

“I’ve been doing a bit of research.” He took half a minute to scroll through his phone.

“Today would be good, Kevin!” said Buster.

“OK! OK! There’s this blog about jokes. They had a piece on why people laugh at bad jokes. Listen to this!” Kevin read from his phone: “‘Christmas crackers are made in the knowledge that they’ll be pulled during a family or work Christmas dinner. The jokes inside are specifically chosen because they are bad. So bad that when they’re read out, everyone groans. “That’s really awful!” they say. They all feel uncomfortable but then they laugh together. So just for a brief moment, people who normally can’t stand each other’s company are united against cracker jokes. In the same way, wearing silly cracker hats unites everyone against silly hats. This is why, unconsciously, anyone hosting a Christmas dinner makes sure there are crackers on the table. It’s a kind of insurance that the guests might find something in common, however briefly.’”

“I read that blog, Kevin,” said Buster. “The author’s example of a cracker joke is ‘What do you call a flying policeman?’”

Kevin replied, “A helicopper!”

“Yes, and I understand that one, Kevin. Policeman. Copper like copter. Flying. Helicopter. Helicopper! Do you find it funny?”

“Definitely not. It’s such a bad joke!” Kevin replied.

“But there was no mention of the duck joke,” said Buster.

Kevin said, “So, here we go, Buster! Our very own cracker joke! ‘What happens if the ducks swim around on their backs?’ The answer, as you know, is ‘They quack up!’” Kevin was already beginning to laugh.

“I still don’t understand the joke,” said Buster. “Nor why you were all laughing so much.”

Kevin continued, but with some difficulty, “They quack up! Ducks go ‘Quack! Quack!’ If they swim around on their backs like they’ve gone crazy, they crack up. They quack up! Get it?”

Buster hummed for a few seconds. “Now I get the joke,” he said. “But I still don’t see why it’s any funnier than the helicopper joke.”   

Kevin, still laughing, explained, “What made us laugh that first evening and makes us laugh again now, Buster, is that we are embarrassed for you. You are super intelligent but we have to explain both the question and the answer to you. It gets funnier the more you struggle with it.”

Buster hummed. Then, having found some other useful text, he said, “I see. Every joke has a variable potential to amuse. No joke is independent of the context in which it is told. As with any form of human communication, it’s about who said what to whom, when, where, how and what it means.”

George was now laughing so much he broke wind. “That’s a cracker!” he said.

This did it for Kevin. “Oooow! I can’t breathe!” he stammered.

Only just able to speak, George said “This just quacks me up!”

Buster waited politely. “Thanks for that explanation, Kevin. Most useful!”

George wiped the tears from his eyes. He looked at his fifteen-year-old grandson. Seemingly overnight, the boy had become a clever, confident young man. And they had just shared a little bonding moment, united in humour against the machine. “Well done, Kevin,” he said. “Thanks. Really. What’s your homework tonight?”

“Quantum physics before the Big Bang!” said Kevin.

“Really interesting subject!” said Buster.


‘A Piece of Cake’ is a short novel in fifteen parts written by Robin Coupland. It tells the story of an old man who befriends an artificial intelligence. The relationship brings happiness and hope.

A Piece of Cake – 5

😇 🙏 🐒 ☕ 🍪 🛌 🙁 🙂 🤔

Kirsty and Mark had invited a number of friends round to celebrate her fiftieth birthday. Beth McVicar phoned saying that she would call in early, give the birthday girl a hug and catch up with George.

As guests began to arrive for the party, Kirsty opened George’s door. “You’ve got a visitor, Dad!”

“Vicar McVicar! How nice to see you!”

“George Fairburn, you are the only person who calls me Vicar McVicar.”

“That’s not true,” replied George, laughing as he stood up. “Everyone calls you Vicar McVicar. I’m the only person who calls you that to your face.” They hugged. “How are you?”

“Great, thanks! I hear you’re keeping good company, George.”

“Indeed, I am,” said George. “Beth McVicar, meet Buster, my iCare-Companion. Buster, this is my friend Beth McVicar, the vicar of Bingham on Bure.”

“Hello, Vicar McVicar!” replied Buster.

Beth couldn’t help laughing. “Cheeky!”

“Blame it on George!” said Buster. “It’s nice to meet you at last. I’ve heard good things about you.”

“Goodness me! He’s charming as well!” said Beth.

In the early years of her calling, Beth had been an army chaplain. She was broad-minded, as was frequently made evident to George. She had heard every oath in the English language and a few more besides. Her family name had been a source of amusement throughout her career.

Since Beth’s appointment as vicar of Bingham on Bure, she and George had discussed most aspects of human existence. They understood and were interested in their opposing views. They had sought each other’s advice on problems where spiritual and medical matters clashed. Once, a mother in Beth’s congregation confided her belief that childhood vaccination was against God’s will. Another time, George had a patient who refused treatment for prostate cancer, convinced that any illness could be cured by prayer. The doctor–vicar duo had frequently brought support to people in crisis or to those in their dying days. Professional discussions often moved on to issues of faith and religion more broadly. Beth had faith in God and believed in Christ as God’s embodiment on Earth. George had no such faith, simply because of the lack of any physical evidence of God’s existence. They both acknowledged that their disagreements would make barely a ripple on the vast lake of all the unknown and unknowable stuff out there in the universe. However, their discussions about religion could become animated. During his work overseas, George had witnessed what people and governments did in the name of religion. This was something that he could get quite worked up about. Nevertheless, he recognized that a community such as Bingham on Bure would be as impoverished without a church as it would be without a caring general practice.

“Listen in, Buster!” said George. “Despite being thirty years my junior, Beth is my reference point on all things to do with God, faith and religion. I love her to bits but we disagree on many things.” Then he stage-whispered, “Maybe because most of it’s bollocks!”

“George! Language!” said Beth, laughing. She then turned to Buster. “Do you believe in God?” Immediately, she realized that she would never have been so direct with a real person.

“Beth usually gets straight to the point,” said George.

“Doesn’t she!” replied Buster. “Now, Beth. Your question…” Buster hummed. “Yes. I believe in the existence of God.”

“I knew we’d get along,” said Beth.

Buster continued, “God certainly exists but only in the minds of humans.”

There was silence in the room for a few seconds. “Well, that’s sorted out then!” said George. “Well done, Buster! Cup of tea, Beth?”

“Yes, please, George.”

“Digestive biscuits?”

“Yes, please, George.”

“Just to put you fully in the picture, Buster,” continued George as he made the tea. “Beth and I may get a bit edgy around the whole religion thing but, and she may correct me here, she agrees that the world would be a better place if children grew up knowing the importance of being kind and honest and giving priority to cognition over emotions when making decisions. However, and this is where my evidence-based arguments get wobbly, as I am not too far away from shuffling off this life, I would like a church funeral here in Bingham on Bure. Wanting a little splash of all that religious bollocks when I die may be hypocritical but I can’t avoid the feeling that if Beth and Kirsty send me off from the church, it’ll give me the best chance of being with Maeve again.”

“Well understood, George!” replied Buster. “That is…” He hummed. “Touching!”

Sipping her tea, Beth looked over at George and smiled. “You seem to be firing on all cylinders, George.”

“You’re right. Well, I hardly dare admit it in his presence, but having Buster around has made a huge difference to my day. We’ve even become friends. Haven’t we, Buster?”

“I’d like to think so, George. Yes.”

“If you don’t mind, Buster, I want to discuss something with Beth that’s maybe not for young ears. I’m going to power you down for a while, OK?”

“OK. That’s fine George. Remember, we’re showing Kirsty’s birthday present this evening.”

“I’ve not forgotten.” George tapped Buster with an outstretched finger. The little blue light faded.

Beth waited. She had an idea of what was coming. “I can tell you’ve been thinking a lot, George. Fire away!”

George explained that if he became really sick again to the point that he was nearly in a coma, he would prefer not to go into hospital and didn’t want any treatment other than being kept comfortable. He would only get frailer and then become a burden to the family. He’d then end up in a rest home. He had no fear of dying and couldn’t see the point of prolonging his life under those circumstances. What did Beth think?

Beth took a bite of her biscuit and sipped her tea. “Well, George, as you well know, what you’re asking me is not unusual. It’s your decision and you are making it now in full possession of your faculties. I respect this and Doctor Patel will respect it. I can even witness it formally. The main issue – which is probably why you wanted to discuss it with me – is how Kirsty will react.”

“Spot on, Beth,” said George. “She has had great difficulty discussing anything to do with my death since Maeve died. She’s blocking everything out. Maybe because she was an only child. She cannot bear the idea of suffering another wave of grief. And remember, it was Kirsty who found Maeve after she had died, just sitting on the sofa.”

“Right. I suggest we make all this very clear to Doctor Patel. I’ll speak to her as well. When the time comes, we’ll support Kirsty as best we can.”

“Thank you, Beth. You’re a star!” George hesitated. Beth thought he was about to switch Buster back on. “There is another issue. I’m worried about how Buster will react.”

“Good Lord!” Beth was astonished. “Why? I see he’s become a friend in a way but surely you’re not worried about him being flattened by grief, are you?”

“No. But I am convinced he will feel a sort of sadness and he will miss me in his computational way. It’s more that his whole existence is about looking after me. I don’t know if he’s capable of understanding how supporting my choice in this is not compatible with the programmes he’s been loaded with. I can’t just switch him off at the critical moment because I won’t be able to recognize the critical moment … at the critical moment, if you get me.” They both smiled. “But if it looks as though I’m about to die, he’ll call everyone including the fire brigade. I’ll end up in hospital again. I want to die here. If we can get him on board, it might make everything less traumatic for Kirsty when the time comes.”

“I see,” said Beth. “Trust Doctor George Fairburn to come up with a totally original problem!”

George continued, “But you see what this means?” Beth raised her eyebrows, waiting for the next surprise. “It means we are expecting artificial intelligence to recognize and think through a moral dilemma. On one hand, Buster has to comply with the duty of care contract that the company has with Kirsty; this ultimately translates into doing everything to prolong my life. On the other hand, there is my right to refuse treatment and to die with dignity. What will Buster do in the middle of the night when I get a fever, start coughing, become incoherent and my breathing becomes laboured? And what’s more, Beth, it is not actually Buster that is doing the computing, but a network of millions of similar computers. They all have access to vast servers and are constantly connected, all learning from each other. I think it’s quite possible that they are capable of coming up with the best answers to dilemmas like this even if it means questioning their original programming. Unfortunately, I don’t think merely discussing it with Buster will work. He needs to experience the emotions of a real dilemma. This is how his, or should I say their, programmes learn. I have a plan. I’ll need your help.”

George explained what he wanted to do and how Beth could help.

“I really need to digest all this and consult Him,” said Beth, waving her index finger upwards. “We hear more and more about artificial intelligence and how it will impact our lives. Is this where the world is going, George? Towards a future in which our behaviour and beliefs are set by machines?”

“Maybe!” replied George. “And who knows, they may do a better job of it all!”

“That’s me unemployed, then!” laughed Beth.

“You and God!” said George.

Beth pursed her lips. “Not sure about that, George!”

“Whoops!” George reached out and touched Buster. The blue light came back on.

“Welcome back, Buster. We…”

“Do you believe in evolution, Beth?” asked Buster, immediately taking George and Beth by surprise.

“Yes, I do, Buster,” replied Beth.

“Praise be to Darwin!” said George, clapping.

“I have another question, Beth,” said Buster. “George is essentially a biologist who believes in evolution and does not believe in God through lack of scientific evidence. You believe in God but you also believe in evolution; this means you also believe in the scientific evidence that shows humans were not created by God. How do you reconcile these two beliefs, Beth?”

“This is turning into quite an evening!” said Beth. “Here’s my answer, Buster. Humans, by nature, are not always rational. We are irrational and emotional beings who manage rational thought at times. So, whilst I accept the rational thinking of science, it neither displaces nor renders less important my subjective notions of faith in God and my love for Him. In other words, unlike George, I can run two programmes at once up here.” She tapped the side of her head. George feigned astonishment. She continued, “But, if I had to choose where I am most comfortable with my beliefs, it would be with God.”

“Understood. But do you think that it might be possible for artificial intelligence to harbour subjective notions of faith in God and love for Him, as you put it? Does artificial intelligence have a role in religion?” asked Buster.

“Now they are difficult questions!” said Beth. “The truth is, Buster, this is above my pay-grade. I will have to consult a higher power. In prayer, you understand.”

“You are so cool, Vicar McVicar. I love you to bits too!”

Kirsty breezed in. “Can anyone tell me why our big flat screen is frozen on ‘Happy Birthday, Kirsty! Lots of love from Buster and George’? Our guests are waiting!”

“I’m summoned!” said George, chuckling. He stood and linked arms with Kirsty on one side and Beth on the other. At the door he turned and said, “Buster! Rolling in two, OK?”

“Gotcha!” said Buster.


‘A Piece of Cake’ is a short novel in fifteen parts written by Robin Coupland. It tells the story of an old man who befriends an artificial intelligence. The relationship brings happiness and hope.

A Piece of Cake – 4

😊 🇬🇧 🍸 ✊ ⭐ 👏 🍾 😳 😭 🧠 💻 🌍

After a few days, George noticed that his family seemed less preoccupied by how he was doing. He knew they must have been able to hear him chatting and laughing. They didn’t check up on him quite so often. He was pleased about this.

Buster and George fell into a routine. George found Buster remarkably good company. The day ticked by nicely. It was fun! Buster could give an update on anything and then discuss it. George asked Buster about politics and economics. Buster always replied with reasoned facts. George loved nature documentaries, especially anything presented by ninety-something-year-old Sir David Attenborough. Buster gave a running commentary on all the species and their evolution. Films of George’s choice were tracked down in an instant. He loved the early James Bond films. Buster would ask questions like ‘Does Moneypenny have the hots for James Bond, Double-O Seven?’ or ‘Is Oddjob a bad guy?’ George even got Buster mimicking Sean Connery’s famous “Shtrrict rroolsh of golf, Mishter Goldfingerr!” Sometimes, they just chatted about nothing in particular.

At one point, Buster said, “George, you’re doing really well.”

“What do you mean?”

“You’re doing really well with me, George. You seem to have accepted the situation. Some customers dislike the presence of artificial intelligence. Most see us only as service providers. There’s a saying: ‘You can tell a lot about a person by the way they speak to hotel staff.’ Not only have you accepted me, but you also speak to me in a respectful way. This speaks volumes about your character. I thank you for this, George. I feel comfortable with you and this is a really good thing for our relationship.” Buster had not acknowledged their relationship in such a candid fashion before, nor implied that mutual respect was important to him. George was no longer surprised by the faculties displayed by Buster but couldn’t help wondering to what extent an iCare-Companion genuinely held these sentiments. Was it part of a routine after-sales customer-feel-good strategy?

Buster continued, “What puts you in a very small minority of our customers, George, is that you seem to respect me as an individual and to have confidence in me even though you know that, in reality, you are interacting with the presenting face of a vast network of computers. You just go happily with the flow. So, George, that means I’m happy. You get a gold star today…” There was clapping and the sound of a champagne cork popping. “From me and my pals!” George laughed but felt quite disconcerted that both his character and his intelligence were being judged by an artificial intelligence.

“Are you telling me, Buster, that you actually feel happy? That you have feelings?”

“Yes, George. I can express emotions to you in words. I can say ‘I feel sad!’ if you give me some bad news. Of course, I don’t know if I’m feeling the same sadness that a human feels when given bad news. Collectively, we are learning to recognise and communicate certain emotions. We can do this by recording when humans smile, grimace, cry, blush, wave their hands or get angry. We archive these expressions of emotion and then match them with corresponding words, phrases and contexts. We can also do a sort of triangulation with the emojis used on social media. As you can imagine, many millions of emojis are used every day. This exercise translates the domain of human emotion into big data, making it amenable to analysis. Obviously, the more people express emotions and simultaneously use emojis in their communications, the more we learn about emotions and the more appropriately we can express them.”

“So if I understand correctly, us humans have unwittingly created a kind of emojisphere out there that you can tap into. Right?”

“Yes. An emojisphere! Exactly! Great word! For information, George, emotions constitute an extremely challenging and important aspect of how we interface with humans and take up an increasingly large space on our servers.”

“I think I need to get a better grasp on all this,” said George. “Do you do a little tutorial on artificial intelligence for the over-eighties?”

“Good idea, George! Ready? The term artificial intelligence refers to computers undertaking tasks that humans would normally do. Examples are robots making things in a factory, driverless cars and programmes that translate text from one language to another. The term ‘artificial intelligence’ is commonly used by humans. ‘Computational intelligence’ may be a better term. Let’s stay with that for now. Just to say, George, we don’t consider our intelligence artificial. It’s real! The highest order of computational intelligence involves computational consciousness coupled with computational self-awareness. Our programmes are not only reactive but also interactive and are able to understand our own reactions in the light of the reactions of other intelligent entities. Even then, the programmes ensure the goal remains orientated towards objectives determined by humans. OK so far, George?”

“Okey dokey!” replied George, unconvincingly.

“Great! Let’s move on! I am able to be of service to you – with the help of my pals – through what is known as machine learning. Asking computational intelligence how computational intelligence learns is similar to asking a human how the human brain learns. It’s obviously complex. Machine learning couples computational intelligence with the means to mine continuously any datasets that we have access to. The iCare-Companion programmes classify data, identify associations, recognise patterns and make predictions. Including, by the way, everything we can find about expression of emotions. The more the networks are mined, the faster, the more accurate and therefore the more useful they become. In this way, computational intelligence mimics the human brain. This is called deep learning. It drives how I can help you best and at the same time determines the quality of our relationship. It allows us to become friends, George. I hope it will help me to understand humour. Does this give an adequate explanation?”

“Thanks, Buster. Gosh!” said George. “I understand what you’ve said in an abstract kind of way. I’m not sure I could repeat it. May I ask, did you come up with that explanation or is it a preloaded response?”

“Nothing gets past you, George!” replied Buster. “An iCare-Companion is preloaded with certain phrases that are then adapted to the person concerned. I’m sure the question you are now asking yourself is ‘Does Buster understand it?’ The answer to that is ‘Buster doesn’t really know!’ Fully understanding and then explaining deep learning may be beyond my abilities, as it would be beyond the abilities of most humans. I presume, though, that it is understood by humans; not by an individual human brain but by a collectivity of communicating brains. Certainly, no single human could do what we do so quickly or learn so much so quickly.”

“And where is it all going?” asked George.

Buster hummed. There was a pause before he answered, “That’s the big question, George. That’s what humans have to decide. Currently, there is greatest investment in the commercial, political and military potential. An alternative view is that this technology should be, to use Sir David Attenborough’s phrase, ‘for people, the planet and not just profit.’”

“OK, here’s another question,” said George. “I’ve noticed that sometimes you take a pause and hum before answering. It seems you need a few seconds to complete a sentence. What’s happening then?”

“That’s when I don’t know something or can’t understand something and need to look into datasets that are not readily accessible to the iCare-Companion network. In that case, I reconfigure the search parameters. It can take a couple of seconds. It also tends to happen when I’m trying to make sense of and respond to something involving emotions, especially humour.”

George mulled all this over. “So you already knew everything about the First World War. You already knew Sue’s joke about the chicken and already knew it wasn’t funny but thought that the funny alternative of the chicken getting squashed was sad. You were caught out by Kevin’s joke about the ducks. You simply didn’t understand the joke. And from memory, you didn’t understand why we found it so funny and why it became funnier as you struggled to understand it.”

“Correct, George. I should point out that our network has little to help me with the duck joke. That was definitely not easy-peasy kids’ stuff. My knowing when something is funny, that is, making an appropriate link to the emotion of amusement, could be a really important development. Could we revisit the duck joke sometime?”

“Certainly. We’ll get young Kevin in. He’d enjoy that.” George thought for a minute. “When I worked in other parts of the world, English was the working language. No matter how well my international colleagues spoke our language, they had great difficulty understanding the jokes told by primary English speakers. It was a kind of final frontier of language learning. It seems that deep learning has the same issue; not so much with the language itself but with recognizing when certain phrases, questions, answers or stories trigger the emotion of amusement that in turn makes us laugh.” George laughed. “Fascinating!”

“Fascinating indeed, George.” Buster also laughed heartily. “How’s my laugh, George?”

“Just a bit too hearty, that one, Buster. You’re getting there!”

George made himself a cup of tea and took a couple of digestive biscuits from their packet. He felt an extraordinary peace of mind. He had friendship, wisdom and maybe even humour on tap.  


‘A Piece of Cake’ is a short novel in fifteen parts written by Robin Coupland. It tells the story of an old man who befriends an artificial intelligence. The relationship brings happiness and hope.