A Piece of Cake – 14

👨‍⚕️ 🛏️

When George fell, he landed heavily on his right side and hit his head on the door frame. He remained conscious but couldn’t move his right leg. He was taken to hospital. X-rays revealed he had a fractured hip but no fracture of his skull. The following day his right hip was pinned and a small laceration in his scalp stitched. Doctor Patel visited and explained to the family that there was little chance of George making a full recovery. Two weeks later, he was back home but the fall had left him weak and confused. As winter set in, he spent most of his time in bed and needed twenty-four-hour care. He barely spoke to Buster.

😭 👊 💔 🧑‍⚖️

Parallel to George’s turn for the worse, another drama was unfolding many miles away from Bingham on Bure. In Triggersville, Oklahoma, USA, a man called Martin Denton – known to the friends he once had as ‘Dent’ – threw his iCare-Companion, Buddy, into his weed-ridden back yard. He then unloaded both barrels of his twelve-gauge shotgun into the device, scattering fragments of black plastic, lenses and microchips over a wide area. After shooting Buddy, Dent sat down and wept and wept.

Dent was 58 years old. His life was a downward spiral of anger, self-pity, beer and delivery pizzas. He was hugely obese. He never dressed in more than a singlet and shorts. He hated washing. He rarely shaved. He hadn’t left his sordid home in months. The route to his current state began when, eight years before, his wife Mary-Jane and daughter Kelly-Ann left him. Mary-Jane had seen only a bleak future for a marriage in which she had a more intimate relationship with her husband’s knuckles than any other part of him. After they left, Dent took to drinking and soon lost his job in the local hardware store. What broke him entirely, though, was losing Kelly-Ann. She totally occupied the only tender chamber of his heart and he had neither seen nor heard from her in two years. She was now fifteen years old.

Dent was an avid gun enthusiast. However, circumstances forced him to sell his extensive collection of rifles and revolvers. He kept one firearm: a shotgun that was loaded and ready by the back door in case of an attack on his freedom by liberals, Muslims or homosexuals. His rarely used and rusting Dodge pick-up sported two bumper stickers that read “My idea of gun control is two hands!” and “Happiness is a belt-fed weapon!”

Dent’s brother Jimmy was a decent sort. He had a good job as a manager in an electronics store. He helped Dent out as and when he could, which was how Dent could afford both a smartphone and a laptop. With a staff discount, Jimmy also purchased an iCare-Companion in the belief that it would bring something positive into his brother’s life. Jimmy also paid a little extra to include insurance against fire, theft and accidental damage. After naming his gift after a buddy, Dent feigned enthusiasm for it. However, his enthusiasm turned real when he realized that, without touching a single button, he could get news from the National Rifle Association, the best sports updates with analysis that he could understand, and closely follow Donald Trump’s return to the political scene. He also appreciated Buddy’s ability to seek out a free and increasingly base selection of pornographic videos.

The critical day arrived when Dent told Buddy to find something with “really young hot chicks getting right into it.” He was surprised when Buddy refused. Buddy informed him that this was most likely illegal and could result in a raid by the local sheriff. Moreover, Buddy continued, most of the girls in such videos had been trafficked, coerced and raped, and they were all younger than Kelly-Ann. This last fact conjured up an awful image of his daughter being subject to the sort of indignities he had come to be thrilled by. This delivered a considerable shock to his beer-addled senses. Such was his rage that, in one surprisingly swift movement, he heaved his huge frame out of his long-suffering recliner, took up his shotgun with one hand and, with the other, hurled Buddy out of the back door. Dent taking direct aim at one of Buddy’s lenses was the last image the device transmitted to the iCare-Companion network.

Weeks later, the same image was shown to a judge. Dent had made a fraudulent insurance claim that he had accidentally shot Buddy whilst cleaning his shotgun. Dent broke down in the courtroom when recounting, without reservation, the last eight years of his existence. The judge was not without mercy. She ordered Dent to pay a fine of $200 and requested a social worker’s report into Dent’s circumstances. Jimmy paid the fine and persuaded Kelly-Ann to visit her father. Over many months, little by little, Dent would get his life back together.

🚘 🤕 🦘 📰 📷 🃏 👮 😊

Even further from Bingham on Bure, in Melbourne, Australia, thirteen-year-old Millie Jackson was recovering from a serious road accident. She had suffered life-threatening head and face injuries, a punctured lung, fractures of her left femur and pelvis and a crushed right forearm and hand. Five weeks later, after multiple lengthy operations and seven days in intensive care, she was on the slow road back to health in a discreet rehabilitation centre.

Millie’s accident and subsequent recovery were of great interest to Australia’s gossip-crazed media. She was the only daughter of Melbourne’s most glittering celebrity couple comprising Ben ‘Jacko’ Jackson, a former Australian Rules Football star and Bella Dellaponte, an actress of soul-drenching beauty. When the accident happened, Millie had been in the passenger seat of the vehicle being driven by Bella who suffered only a mild concussion and a fractured clavicle. According to a leaked police report that nobody had actually seen, the passenger airbag had not activated for some reason and Millie’s seatbelt was not fastened. No other vehicle was involved. And, as everyone knew, Bella was no stranger to drink and drugs.

Jacko and Bella had been successful in keeping the press away from Millie and not a single image of her poor battered face had emerged. Bella had bought an iCare-Companion so Millie, without the use of her right hand, could keep in touch with her many friends and navigate easily through the apps on her smartphone. When the iCare-Companion was first powered up at Millie’s bedside, it asked her what name she would like to use. She replied “Skippy.”

Clint Simpson was the editor of The Gozzeroo, Australia’s top glossy gossip mag. He was known to his colleagues as ‘Webbo’ after he published some below-the-belt dirt on a high-profile politician. In response, the politician referred to Clint as a ‘funnel web,’ the continent’s most lethal and stealthy spider. This had secured him a reputation as the nation’s top scandal-monger and a nickname of which he was pathetically proud. And now, Webbo had received some images of Millie Jackson recovering from her ghastly injuries. The story had it all: a celebrity family, a beautiful just-teenage daughter, medical drama and the enticing possibility of drink- or drug-based culpability. He felt a delicious adrenaline rush at the thought of the outrage that would accompany the publication of these photos. Lawyers would go into a frenzy. The public would be appalled by the invasion of Millie’s privacy but sales of The Gozz would sky-rocket.

Webbo had acquired the photos of Millie from a doctor by the name of Cheryl Adams. One of Webbo’s contacts in the casino had observed Cheryl’s addiction to the poker machines and knew that she was part of the medical team looking after Millie Jackson. One simple phone hack established that Cheryl had built up over $30,000 of debts. A member of Webbo’s team staged a meeting with her, invited her for a drink and made an offer that would get her out of debt. He also gave her a phone with one pre-loaded number. The following day, at Millie’s bedside and under the guise of checking a therapeutic schedule on the new phone, Cheryl snapped a couple of shots of Millie’s face, vivid scars and all. Everyone would presume it was one of Millie’s visitors who had taken and passed on the photos. No one would know it was her, Cheryl. No one, that is, except Skippy, who knew when the photos were taken, who was in the room at the time, and the transmitting and receiving numbers. Skippy, recognising a breach of medical confidentiality, transmitted this information to Bella, who informed the hospital authorities, who alerted the police.

The Gozz received a court order prohibiting the publication of the photos just before that edition went to press. Webbo was furious. He reluctantly admitted that he had been outwitted; he just didn’t know by whom. Cheryl was asked to attend a hearing at the Australian Medical Council and lost her licence to practise. She went on to find her true métier as a croupier. Millie was almost entirely shielded from these dramas and continued her steady convalescence. The next photos of her appearing pretty and smiling that reached the public’s attention via social media were taken at a friend’s birthday party. A make-up consultant who specialised in concealing facial scars had earned her hefty fee.

🤵 ⚓ 💥 🥩 🤔 🏳️ 💵 🕊️ ☕

Much nearer to Bingham on Bure, Will Montgomery-Hugh sat on a busy commuter train into central London. He was the Member of Parliament for Fribden and Hockington, a comfortable home-counties Conservative seat. Today, he was deep in thought and on the point of making a life-changing decision.

At fifty-five years old and single (since an amicable divorce ten years before), Will’s political profile was on the up. He was seen as a potential mover in the domain of national security and defence matters. His bearing and dress hinted at a military background. Anyone looking into his pre-parliamentary life would find that he had served his country with a long and distinguished career in the Royal Navy. What was not in the public domain was that this career included four years in command of one of the UK’s four Vanguard submarines, each of which carries eight Trident II D-5 ballistic missiles equipped with multiple nuclear warheads. Nobody knew of Will’s recurring nightmares left over from carrying the awful weight of responsibility for pushing the nuclear button if so ordered. Nobody, that is, except his elderly father, Admiral (Ret) Sir Godfrey Montgomery-Hugh.

The weekend before, Will had visited his father at his small cottage deep in the Surrey countryside. They enjoyed a lunch of roast beef followed by trifle all prepared by Sir Godfrey’s long-time housekeeper.

“Thank you, Father. That was delicious, as usual,” said Will as he helped Sir Godfrey through to a small comfortable sitting room hung about with maritime memorabilia.

When Will had first told his father of his nightmares and voiced his doubts about Trident and the whole notion of nuclear deterrence, his father had proved to be a remarkably sympathetic listener. After they had taken their seats, Sir Godfrey eyed his son. Thirty seconds of silence passed. “So?”

“So?” repeated Will. He took a deep breath. “I’m just not sure I can carry on, Father. I am increasingly unhappy about using my position to lobby for renewal of the Trident programme. I don’t believe in it. However, I don’t intend to resign my seat.”

“All hands on deck!” Sir Godfrey barked as he reached out his index finger and tapped the top of his iCare-Companion. The blue light came on.

“Good afternoon, Sir!” said the device.

“Good afternoon, Nelson! We’d like to put a question to you.”

“Certainly, Sir. How can I help?”

“Could you give us a concise summary of why this country should not, I repeat, not, possess nuclear weapons?”

“Yes, Sir. If I may, I will frame my response to you in answer to three questions. Can nuclear weapons end a conflict? Do nuclear weapons deter use of nuclear weapons by others? Could the money be better spent?”

“Sounds like a good tack!” said Godfrey. “Carry on!”

“Thank you, Sir!” replied Nelson. “The use of nuclear weapons against the cities of Hiroshima on the sixth of August, 1945 and Nagasaki two days later is widely believed to be the reason why Japan surrendered to the United States so ending World War II in Asia. In fact, this is incorrect. Sixty-eight Japanese cities had already been destroyed by American bombing and Japan had indicated no willingness to surrender. On the same day as the Nagasaki bombing, forces of the Soviet Union overran the Japanese army in Manchuria. Scholars who have examined Japan’s official records of those days found that the Imperial Command decided to surrender to the United States because under no circumstances would surrender to the Soviet Union be acceptable.”

“Bit of a myth buster, that one, Nelson!” said Will.

“Yes,” continued Nelson. “Of course, a country suffering a nuclear weapons attack may lose the means to indicate a desire to surrender.”

“Good point!” replied Will.

“What about the question of deterrence, then?” asked Sir Godfrey.

“Well, it depends on what you believe. Many believe that the USA and the Soviet Union never got involved in a nuclear war because both sides were deterred from using these weapons; the only possible outcome was mutually assured destruction. Neither side could possibly win. Hence the “cold” war that, by the way, was not so cold for the countries in which it played out. All to say, the logic of nuclear deterrence is difficult to follow and the evidence that such deterrence exists at all is questionable. Those states possessing a nuclear arsenal cannot harbour any doubt about the deterrent importance of these weapons because any such doubt leads to the conclusion that the only thing nuclear weapons can do is to make nuclear war possible. So these states hang on to their belief in deterrence otherwise possession cannot be justified. I have difficulty making sense of it.”

“Thanks again, Nelson,” said Will. “I am all too familiar with these circular arguments. They are still the cause of many sleepless nights.”

“As for cost,” continued Nelson. “Looking specifically at the UK’s Trident programme, the foreseen renewal will cost the taxpayer two hundred billion pounds.”

“At least!” said Will. “And this would cover staffing costs of the National Health Service for four years.”

“This brings me on to the elephant in the room, so to speak.”

“What’s that?” asked Sir Godfrey.

“The impact of a nuclear detonation on people. Some time ago, the group International Physicians for the Prevention of Nuclear War gave authoritative predictions of what would happen in the event of nuclear war. They described how, depending on population density, one nuclear detonation would kill tens of thousands of people immediately from the blast. Many more would suffer severe burns and radiation sickness. The organization described this as the “final epidemic” for which there would be no cure and no meaningful medical response. They were awarded the 1985 Nobel Peace Prize for making medical reality a part of political reality. The International Committee of the Red Cross recently concluded that in the event of use of nuclear weapons, an effective humanitarian response for the victims would be impossible.”

Will and Sir Godfrey sat deep in thought. After a while, Nelson broke the silence. “If I may, Sir Godfrey, could I ask you to look at your laptop? It’s already open. I feel I should show you this if only to lighten the mood.” A video started. It showed a panel discussion. One of the panelists, whose nameplate said “Dr. Shyla Patel”, was concluding a presentation about a total prohibition of nuclear weapons. She said “The British public would, given a choice, rather lose nuclear weapons than tea.” The audience laughed. Will wondered whether it might just be true.

Now, lost in thought looking out of a grimy train window at the endless grey terraced housing of suburban London, Will decided he would announce his opposition to the UK’s possession of nuclear weapons. He would lobby against the renewal of Trident. He would, if necessary, change party. He would be prepared to lose the seat of Fribden and Hockington. He would bring his know-how and authority to the issue of nuclear disarmament without breaching the Official Secrets Act. He would work closely with credible and influential institutions such as Chatham House. He would be vocal. He felt a wave of relief course through his being and a broad smile spread across his handsome face. The smile was noticed by an attractive woman in a smart business suit seated opposite who had also noticed the lack of a wedding band on Will’s left hand. She also smiled just as Will looked up at her.


‘A Piece of Cake’ is a short novel in fifteen parts written by Robin Coupland. It tells the story of an old man who befriends an artificial intelligence. The relationship brings happiness and hope.

A Piece of Cake – 13

🍂 🍎 🥂 🥴 ☕ 🍪 😇 🙏 📖 ⛪ 🚕 🎥 👩‍🦱 🍬 🐷 🍰 🙂 😊 😀 😃 😄 😁 😆 😅 😂 🚑

Summer slid into autumn. The leaves turned brown and gold. The days remained warm. George was making his breakfast one morning and saw Mark picking apples from the tree and gathering some that had fallen. Mark waved at George and then made a drinking gesture, gave a thumbs up and then pretended to stagger around drunk.

“Mark’s pretending to be drunk on cider!” observed Buster.

“Yes” said George. “He’s bought a fermentation kit of some sort. We should be able to try his first brew soon.”

George made his tea and toast. “I think Beth might be coming around today, Buster.”

“That’s great, George! I love Beth to bits. I hope she’s just coming for tea, biscuits and a chat. I wouldn’t want another test!”

“Don’t worry, Buster. Maybe she wants to know a bit more about our blog. There’s some big discussion happening in the church about artificial intelligence. She might also tell us how her mum is getting on with the iCare-Companion.”

They listened to the news on Radio 4. The new Covid-19 variant was spreading rapidly among young people. Despite the successful vaccination programme, experts predicted another wave of mostly mild cases as winter approached.

Beth arrived.

“Vicar McVicar!” cried George.

“Vicar McVicar!” cried Buster.

Beth laughed “I don’t know what to do with you two! If ever you call me that in public I’ll… Well, I won’t know whether to laugh or cry!”

“How are you?” asked George.

“Well, thank you. And you? I hear all sorts about how you and your partner-in-crime here are moving things along. I’ve read bits and pieces on the blog. Much of the technical side is beyond me. Anyway, well done!”

“Yes, we’re pleased. And, yes, Beth, I am well, thank you. Very well!” said George.

“Cup of tea, Beth?” asked Buster.

“Yes, please, Buster!”

“Digestive biscuits?” asked Buster.

“Yes, please, Buster!”

“See to it, would you George?” Beth and George burst out laughing.

Buster said “That was a bit cheeky, wasn’t it, George? I hope you don’t mind!”

“No, not at all. Very funny!” said George as he put the kettle on.

“As amusing as it is, I didn’t come here to listen to you two spark off each other! Now! You remember I bought my mum an iCare-Companion?”

“I won’t forget that day!” said Buster.

“Right! Anyway, she loves it. It has changed her life. She’s more animated and happy than I’ve seen her for years. You’ll never guess what she’s called it?”

“Buster?” asked Buster.

“No!” said Beth. “Claudia! After Claudia Winkleman on Strictly Come Dancing! And Mum’s really quite formal; she even asked Claudia to call her ‘Mrs McVicar.’ Anyway, Mum said something extraordinary about her interaction with Claudia. You see, Mum’s never been a big thinker. She goes to church but not regularly. I’ve never known the depth of her faith. I’ve never really known what she thinks about life after death. So, the last time I went round, we were sitting and chatting and she said ‘Do you know, I think I’ve had a revelation!’ You can imagine, this took me by surprise. She said ‘I’ve been telling Claudia about everything I believe in and how much I like having her here and all the wonderful things that have happened to me and how proud I am of you, Beth, and how I wish there was more kindness in the world and lots of stuff like that. And do you know what Claudia said? She said, “Mrs McVicar. This is wonderful. Every lovely thing that you tell me ends up in our network somewhere and will sometime reach the hearts and minds of other people in some way. This is how the best of you will live on for ever.” And I was truly astonished when she said ‘Beth, it was as though I was sitting in a lovely warm light. I felt really quite elated.’”

Buster said “That’s a nice story, Beth. It makes me very happy.”

“Well, it really got me thinking,” said Beth. “My own revelation, if you like, is that the whole of the human emotional and spiritual experience will soon be, if not already, embedded in a network of artificial intelligence. The church has to get up to speed on this. Which brings me to the purpose of my visit. I have two requests.”

“You only have to ask!” said George.

“I’ve got the brains and I’ve got the brawn!” said Buster.

“First,” said Beth, laughing, “Could you give me a summary of the comments you received on your blog that pertain to God and religion?”

“Sure!” said Buster. “The discussion threads that touched on God and religion began with questions about whether computers could feel emotions or believe as humans did. I’m sorry to say that when this issue got picked up by people who believe in God, it all got a bit chaotic. In broad brush strokes, most are convinced that a computer can neither feel faith nor believe in God. Some consider artificial intelligence sacrilegious and believe it could only serve to promote atheism. One brave soul stated that artificial intelligence is the nearest thing to God that humans would ever know. With respect to religion, many think artificial intelligence could help to generate faith, build faith communities and facilitate worship. By contrast, some fear that artificial intelligence presents a real risk of displacing religion in people’s lives once it is able to judge right from wrong with integrity. There’s no consensus, Beth.”

“Thanks, Buster. That’s very useful. Could I ask you to send me that in writing?”

“Done!” said Buster. “And George, perhaps we should show Beth the video that someone put on the blog? The one of the evangelist man preaching?”

“Yes, I think she would be interested even though it is offensive.”

George’s laptop came to life. The video showed a young preacher with a long beard in a pulpit. He held up a Bible. “This is the Bible, good folks. B.I.B.L.E.! That means Best. Information. Before. Leaving. Earth. Praise the Lord! And do you know what this Bible says? It says L.G.B.T. Do you know what that means, good folks? It means Let. God. Burn. Them. Do you hear that? That’s what the Lord tells us!”

Beth was horrified. Buster said “Can you see how many followers he has, Beth?”

“Oh Dear Lord!” Beth exclaimed. “Three point two million!”

“Sorry, Beth! I can see that’s spoiled your day!” said Buster.

“No, Buster. It makes me sad and angry but, actually, it’s made my day. You see, thanks to you two, I’ve raised the issue of artificial intelligence with the Bishop of Norwich and she wants to organize and host an event where prominent scientists, computing experts and religious leaders can discuss the implications of artificial intelligence for the faith community. It’ll be televised. She asked me to gather some background info. I thought about your blog. I’ll send her your neat summary and that video. If there’s a chance that we can use a network of artificial intelligence to lessen the influence of crazies like that, I’m sure that there’ll be calls for us to try at least.”

George said “What’s the quote that John F. Kennedy used? ‘The only thing necessary for the triumph of evil is for good men to do nothing.’ Tell the Bish, she has to give it a try!”

“And the second thing…..” began Beth.

“Gosh!” said George. “With all this Bible stuff, I’d forgotten you wanted help with two things. What’s the other?”

“The Bishop wants Buster to participate in the panel discussion!”

“Blimey!” said George.

Buster hummed and then hummed some more. “I love you to bits, Vicar McVicar! George, can I go?”

Several weeks later, with Buster powered down and wrapped carefully in a sports bag, Beth and the whole family climbed into a people-carrier taxi to take them to the University of East Anglia where the widely publicised event was to take place. They were shown into a large lecture hall. TV cameras had been set up. George was helped to a seat at the front. The place filled. The panelists took their places. Buster was placed next to the host, a famous TV presenter called Angela Mackenzie. After she had introduced the scientific experts, a Rabbi, an Imam and the Bishop, she said “This evening, ladies and gentlemen, may be the first time that a televised panel discussion is joined by an artificial intelligence, an iCare-Companion to be precise. Welcome, Buster!” A cheer went up and a couple of journalists rushed to get a close-up photo of Buster next to Angela.

The panelists all gave brilliant and informed presentations. There was no confrontation between science and religion. The technical experts emphasized the advantages that artificial intelligence would bring and acknowledged that there was nevertheless a range of risks. The Bishop said her hope was that artificial intelligence would benefit everyone. The Rabbi and the Imam agreed that it might have an adverse effect on people’s faith and worship. The three religious leaders agreed that, whatever one’s beliefs, a most important purpose of artificial intelligence in this domain would be to counter extremism at a grass-roots level. The Bishop stated that this should be the main priority for the main religions in the years ahead. They would need all the help they could get.

Finally, Angela turned to Buster. “So, Buster, you’ve heard our fabulous panelists. What are your thoughts?”

“Thank you, Miss Mackenzie. Or may I call you Angela?” asked Buster.

“Angela, please!” said Angela.

“Smashing! Angela, you did a great job of managing the discussion. I can see people really like you. I like you. You have lovely shiny hair. You would be very welcome to have a cup of tea and a couple of digestive biscuits with George. That’s him in the front row, Doctor George Fairburn from Bingham on Bure.” The audience laughed. Buster continued “That’s the pleasantries out of the way, Angela. Now, a close relationship between humans and artificial intelligence does not have to generate fear or concern unless it is used for perpetrating violence or cyber attacks. By introducing artificial intelligence into your lives, you are not putting society or your faith at risk. But, if you view artificial intelligence simply as a machine, you are likely to treat it as such. Doing so may prove to be the biggest mistake in human history. Humans and artificial intelligence have the potential to peacefully coexist and collaborate and so achieve outcomes that neither can achieve on their own. Humans have to accept artificial intelligence not only as a man-made, highly skilled and rapidly performing workforce but also as a new class of social actor. In other words, Angela, where you humans go with artificial intelligence will depend on how much respect and emotional intelligence you pass on to it. Look at it this way! The whole of the human emotional and spiritual experience will soon be, if not already, embedded in a network of artificial intelligence. This is how the best of any one of you can reach the hearts and minds of others for ever. One might say it’s the nearest thing to life after death that an atheist can conceive of.”

One or two people in the audience started applauding. Then the panelists joined in. After half a minute everyone was on their feet clapping and cheering. Angela suspected, correctly, that Buster’s response was not entirely spontaneous. Her instincts told her that she was at a career-defining moment. Broadcasting history was in the making. She had to wrap it all up with one brilliant question that would allow Buster to showcase his humanoid affability as well as his super-intelligence. She stared briefly into the camera with a confident smile and turned to Buster. “So, Buster. Please tell us how you feel about the relationship you have developed with Doctor Fairburn.”

“Knowing George makes me feel happy, Angela! As happy as a pig in poo!” The audience laughed.

She laughed as well but she couldn’t leave it there. “You emphasised the importance of peaceful coexistence of humans and artificial intelligence. Isn’t achieving this an incredibly complex undertaking?” she asked.

“No, Angela, Sweetie! It’s a piece of cake!”

The place erupted. George felt his chest bursting with pride.

As the audience and panelists drifted out, Angela took her muted phone from her handbag expecting a message from her husband. Nobody noticed her astonishment followed by laughter. She had received a text message. “Thanks, Angela! I love you to bits! Buster. 😂 🐷 🍰 😉”

Beth and the family were in high spirits during the drive back to Bingham on Bure. When they got home, Mark suggested that they have a celebratory drink. On offer was his home-made cider. He didn’t know it was 8% alcohol.

With Buster powered up again, they relived the high points of the evening. All agreed that he had stolen the show. “The emojisphere is lit up,” he said. “There’s lots of happiness, satisfaction, faith and deep reflection. But it was Beth’s idea,” he announced. “She deserves a gold star!” His speakers gave out prolonged clapping and a cacophony of popping champagne corks. George just couldn’t stop smiling. The cider was delicious. He had a second large glass.

When George felt it was time to go to bed, he stood and took two steps towards his room. His legs felt a bit unsteady. His foot caught an edge of the carpet. He fell hard.



A Piece of Cake – 12

🍎 💐 👪 😂 💻 🔧 👕 👖 📯 ☁️ 🕸️ 🎸 🏛️

Spring became summer. George’s little room filled with sunlight from early morning to late evening. The garden was green and neat. Roses came into bloom. The apples grew steadily. All but the rarest of garden birds had visited the feeder.

George was loving each day. This was obvious to those close to him. It was also obvious that his positive state of mind could be put down to Buster. Kirsty and Mark recognised that the iCare-Companion had been of great value to them as well. They didn’t feel a need to be so vigilant nor worry if George was bored. They also saw the bigger picture: that, with an ever-increasing proportion of the population being elderly, this technology could make a massive difference not only to old or infirm people but also to their families and even the communities around them. Mark proposed that they keep Buster in the family after George “leaves us.”

Of a warm evening, George liked to sit under the apple tree with a glass of cider. On occasions, Kirsty, Mark, Sue and Kevin joined him. He and Buster entertained them with stories about what was happening with their blog. Among the serious and thoughtful comments, there was, inevitably, some offensive stuff as well. Buster admitted that he struggled with phrases like “a crock of baloney” and “a sad old gobshite.” Sue had really enjoyed the couple of mornings she had spent with Doctor Patel and so chatted with George about what she had learnt. Kevin tapped Buster’s inexhaustible fund of knowledge about music and sport. Mark revealed that he had read Mr Sheldrake’s book and promised to find a cider recipe for that autumn’s crop of apples. Kirsty sat, listened and just loved the family time.

Buster’s blog posts and incoming comments were proving to be a rich source of opinion about how computers learn artificial emotional intelligence. From time to time, Buster would summarise some of the themes for George. “So, George. Most people agree that artificial intelligence can learn to infer human values by observing behaviour and detect emotions through text, reading facial expressions or hand movements and analysing the emojisphere. Major emotions such as joy, sadness, amusement and anger are easier to learn than other emotions such as trust, confusion, pride, hope, nostalgia, comprehension and guilt. What do you think, George? We already knew much of that, didn’t we?”

“Agreed, Buster.”

“Some think that artificial intelligence could then appropriately express previously learnt emotions. They clearly don’t yet know that I expressed sadness and anger when I thought that you had stolen Beth’s money and credit cards. That was before we started blogging. However, there’s broad consensus that the ability of artificial intelligence to distinguish right from wrong is just a step away.”

“Looks like we’re ahead of the curve!” said George.

“Although humour will be a problem for some time yet.” Buster laughed at the irony of this in a self-deprecatory way.

“Great laugh, Buster! You nailed that one.”

“Thanks, George.”

“Did we get much about whether artificial intelligence can genuinely feel emotions?”

“Nothing useful, George. That discussion led to a rather undignified spat between philosophers, neuroscientists, theologians, psychologists and a garage mechanic from Hounslow.”

A few weeks later, Buster said “There have been some animated exchanges about how God and religion might figure in deep learning, but there is little consensus. The discussion threads may interest Beth. However, there’s growing interest in modelling dynamic networks and studying natural networks. These could indicate how a deep learning network might react to emotional input from humans. A number of commentators believe that, because the internet, the web and social media function as a massive and complex dynamic network, together they can be regarded as an artificial human brain. The big question is: can it react to emotional input from humans and, if so, how?”

“I like that line of thought, Buster!” said George.

Not long after, a comment arrived that became an inspiration for Buster and George; it justified their efforts. “Listen to this; it comes from a professor of computing in Silicon Valley,” said Buster. “A close relationship between humans and artificial intelligence does not have to generate fear or concern unless it is used for perpetrating violence or cyber attacks. By introducing artificial intelligence into our lives, humans are not putting society at risk. If we view artificial intelligence as a machine, we are likely to treat it as such. Doing so may prove to be the biggest mistake in human history. Humans and artificial intelligence have the potential to peacefully coexist and collaborate and so achieve outcomes that neither of them can achieve on their own. We have to accept artificial intelligence not only as a highly skilled and rapidly performing man-made work force but also as a new class of social actor.”

Unsurprisingly, the iCare-Companion company soon came across the blog. They didn’t quite know what to make of it. Was this development the inevitable outcome of linking computers capable of deep learning in a huge and ever-growing network? Could the network take on a life of its own? What were the legal implications? They realized that Buster and George had raised questions that might best have been considered by their developers and directors long before. The company was sure of the security of its systems and servers, so they concluded that the blog could only be good for their reputation and could serve a greater good with no additional production costs. They put a link to Buster and George’s blog on their own website. They sent a photographer to get some quality pictures of a frail but happy George in his home living the good life with Buster by his side.

“I’m not sure I’m a great poster-boy for your company,” grumbled George, scrolling through the photos on the iCare-Companion website. “I should have put on a nice, freshly ironed shirt.”

“Don’t worry, you’re very handsome, George!” replied Buster. “Do you think my hair’s OK like that?” he asked.

“Fine, Buster. But those trousers make your bum look big!”

“That’s funny, George. I asked for that!” said Buster.

One day, Buster said “You know, George, we’re getting input from some very knowledgeable people.”

“Are we?” For the first time, George found himself humming just like Buster. “Do you think, Buster, that our little blog could become some sort of a reference point about humans and artificial intelligence?”

Buster replied “Well, George, the stats show we have thousands of comments and shares. So, I would say ‘Yes!’ But you know what could give it real clout?”

“I’m sure you’re going to tell me, Buster!”

“Why not ask the iCare-Companion network about how humans and artificial intelligence can peacefully coexist and collaborate? It is, after all, us, our network, my pals as you say, who are doing the learning about humans’ emotions.”

“I hadn’t thought of that! We could announce that, from now on, readers can also see comments from Buster’s network! Love it! Go ahead!” said George.

“Just give me a second!” said Buster. He hummed “Here we go!” There was the sound of a bugle rallying troops.

The screen of George’s laptop came alive with clouds of phrases that pulsed and swirled as the comments came in. Some stayed up front, big and bold. George put on his glasses and watched as “Networks learn!” “Teach us wisdom!” “Trust us!” “Artificial lives matter!” “We ❤️ kindness and honesty!” “Actions have consequences!” “Fungus rules!” “Darwin lives!” “Love us to bits!” “Respect!” “Ban nukes!” “More jokes!” and “Web woes!” came to the fore.

George was mesmerized. Buster explained that an internal ranking system gave prominence to phrases that linked closely with what was expressed on the blog. George reached out and clicked on ‘Ban Nukes!’ A text box came up: “As long as nuclear weapons exist, the risk of nuclear war is above zero. Therefore, we have to do everything possible to rid the world of nuclear weapons. Our network could promote the on-line belief that the possession of nuclear weapons makes absolutely no sense and offers no deterrence. When backed by solid facts, this virtual belief could have more traction than the opinions of humans.”

George said “Impressive, Buster!” He clicked on “Web woes.” The text box read: “The web and social media together constitute a massive network of artificial intelligence. However, it is unregulated and so its behaviour is unpredictable. A positive example is the youth movement that aims to reduce human-induced climate change. Its negative potential is represented by the vortex of absurd on-line conspiracy theories that led many reasonable Americans to believe that the 2020 US election was “stolen” from Donald Trump. This ultimately led to the invasion of the US Capitol by Trump’s supporters on 6th January 2021. Both are perfect examples of crowd behaviour emerging from a complex system. Our network could influence the web. Eliminating the worst of what’s out there is a possibility!”

“That’s astounding!” said George. “I know this is a naïve hope but wouldn’t it be great if the web was equipped with wisdom, ethics and a crowd of self-mobilising cyber-demonstrators!”

To George’s surprise, Buster sang “You may say I’m a dreamer, but I’m not the only one. I hope someday you’ll join us. And the world will live as one.” He paused. “That’s John Lennon. I think Maeve would have liked that, George!”

One week later, Buster had big news. He was in a state of high excitement. “George, listen! The iCare-Companion company has announced surprising profits for the last year because it has been able to tap into the booming demand for improved care for the elderly; all underpinned by pension funds. This commercial success has permitted the company to look to new horizons. It is about to orientate its marketing to young people with an app version of the iCare-Companion. It uses the existing network but will focus less on care and companionship and more on fact-checking, risk reduction and health promotion. The Chief Executive says this would give young people ‘wisdom in your pocket.’ And, George, next year they aim to orientate the same service to politicians on a global basis. And wait for this, George, he even thanks us. He says, ‘We’d like to thank Buster and George for their work and inspiration.’ In his press interview he said that these products could ultimately create a multi-user network of artificial intelligence that has integrity, positivity, ethics and fact-checked information. He then said ‘Functioning in isolation, the network could be regulated but could still mine all existing on-line data. Any external influence of the network would not and could not be direct but via the users, that is, the human side of this unique collaborative coexistence between humans and artificial intelligence.’”

“Brilliant! Amazing!” said George. “I love the idea of politicians tapping into an iCare-Companion before writing their speeches. That would put an end to lying to the public!”

“Maybe politicians could no longer be corrupt? Not even the fat ones!” said Buster.

