IT started like any other day. By the time the sun went down, I felt like I was in a psychological thriller.

Spending 20 minutes trying to log into my laptop, via a new system that has tighter security than MI5, left me wrung out and stressed before the day had properly begun. Then I tried to submit my gas and electricity readings, normally a straightforward process but suddenly inexplicably complicated: diverting me to an online link, demanding my password and not recognising it.

Then came my parcel delivery. Due to have a new phone delivered, I was told it would arrive between the very precise times of 12.03pm and 1.03pm. At 12.15pm I got a text saying ‘Sorry we missed you. Your driver Tim won’t be back today.’

This was odd, as I’d listened out for the delivery; there was no van pulling up or anyone knocking on the door.

A link in the text showed a photo of the house Tim went to - a new-build with a double garage. Definitely not my little terraced cottage. Turns out he’d gone to the wrong address. The text said the driver would be back the next day so, to avoid him going to the wrong address yet again and leaving my parcel in a random wheelie bin, I clicked on the link (there was no phone number) and entered the depths of hell - otherwise known as a conversation with a chatbot.

The Robot suggested I download the delivery company app and share a photo of my house, to help the driver to locate it. Excuse me? I’ve paid for this service, and surely any delivery driver with an ounce of sense can find an address by punching a postcode into their satnav. So no, I won’t be faffing about downloading an app.

The Robot said someone would ring within 60 minutes. No-one rang so I was back on the chatbot. The Robot said someone would ring within 30 minutes. At the 29th minute an actual human called to say my parcel was back at the depot. She checked my postcode and said it would be with me by 6.30pm. “So it’s coming today?” I asked. “Yes,” she said.

Of course it didn’t arrive. I rang the depot and it was closed, so I went on the chatbot and even the Robot was calling it a day. ‘Are you happy with our service today?’ it wrote. Was it actually having a laugh? I pressed the ‘No’ option.

Next morning I rang and was told my parcel would be delivered that afternoon. “Could the driver call me if they’re running late?” I asked. Seemed reasonable to me. Not asking much, just a little professionalism and courtesy. “They use their own phones so it’s up to them,” said the human, who clearly couldn’t wait to get me off the line.

Despite my frustration, I didn’t kick off about the wrong address or complain about the driver (his name isn’t Tim, btw). I was patient and polite. Which got me nowhere. I’m still waiting for my delivery.

It’s a familiar scenario. Deliveries don’t always go to plan, fair enough, but what has us tearing our hair out is the process of trying to fix it. Invariably, the only option is a computer-generated chat. Let’s not be fooled by the word ‘chat’ - a chatbot is a chilling interface programme designed to simulate human conversation and handle customer queries. It has its uses, but many systems are dysfunctional, leading to low customer satisfaction. The responses are automatic and often leave us hanging, with no solution.

There’s something sinister about conversing with something you know isn’t human. When a chatbot simply ends the chat, without being any use, we’re left feeling invisible, isolated, with wild-eyed paranoia creeping in. Then we lose patience with it all and feel so stressed we could cry until we remind ourselves it’s a First World Problem.

It’s more than that though. There is genuine despair at the erosion of human interaction in public services. It was what fuelled the protests of hundreds of people against the closure of railway station ticket offices. As consumers, we feel disrespected and defeated. Many shoppers - over two thirds, in a recent survey - simply give up on websites or apps when confronted with digital obstacles. And, without a flinch, comes the chatbot response: “Are you happy with our service today?”