By now, most people have heard of phishing: the classic email-based scam. But fewer are familiar with vishing, or voice phishing — an increasingly common and highly sophisticated attack method targeting UK businesses in 2026.
In the past week alone, we’ve seen a new type of cyber attack emerge on our service desk — one that doesn’t rely on links, malware, or even immediate access to systems.
It starts with a simple phone call.
What’s changing, however, is how these attacks are being carried out.
Attackers are now using AI-driven techniques to capture and replicate employee voices — a tactic often referred to as AI voice scraping.
Traditional phishing attacks target your inbox. Vishing targets the person.
Attackers use phone calls to build rapport, gather small pieces of information, and pressure people into acting against their better judgement.
Historically, these attacks relied solely on social engineering.
Now, Artificial Intelligence (AI) is changing the game.
Cybercriminals are increasingly using AI tools to capture, analyse, and clone the voices of the people they call.
This means a single phone call today could enable a far more convincing attack tomorrow.
In the past week alone, our service desk has reported multiple suspicious calls that appear to follow a similar pattern.
In each case, the caller pressed the employee to repeat specific details aloud: the company name, an email subject line, even their own name.
Individually, these requests seemed harmless.
But taken together, they strongly suggest an attempt to capture voice samples for AI training purposes.
This is what we now refer to as AI voice scraping — the process by which attackers gather small snippets of speech that can later be used to clone a person’s voice.
Now, let's hear from one of our First Line Engineers as they reveal how they were targeted in a recent encounter:
“They were very insistent about me repeating things back – the company name, the email subject, even my own name. I even considered giving them a fake name because of how pushy they were, but my honesty got the better of me.”
“It was frustrating. They weren’t a customer, but if it was legitimate, it could have been a potential client, and first impressions are critical. I didn’t want to be rude or risk losing business.”

It was only later, when similar calls started targeting colleagues, that the pattern became clear.
“I didn’t think it was anything more than a hard‑sell call at first. It wasn’t until my colleagues mentioned the same kind of pushy behaviour that we realised this could be voice scraping. Then the internal warning went out, and I had to put my hand up and say, ‘I’ve already had that call.’”

Once the engineer recognised what had happened, they did exactly the right thing: they flagged it to their manager and shared details so the wider team could be warned.
“After the call, I messaged my manager because I suspected I’d be name-dropped in whatever email they sent. When we realised they’d tried my colleagues too, it opened up a wider discussion – and I was honest that I’d already spoken to them and fallen for it.”

The emotional impact is an important part of this story, too.
“Afterwards, when they started targeting my colleagues, I felt shaken and honestly quite awful – like I’d opened our company and our clients up to attacks. That’s exactly what these scammers exploit: people trying to be helpful.”

From the experience, they’ve taken away some clear lessons:
“Unfortunately, being too nice can mean being taken advantage of – that’s what the scammers rely on: the kindness and ignorance of others. Going forward, I’m going to be much less patient with rudeness or hostility on the phone. If something feels off, I’m going to challenge it or just hang up.”

This incident underlines a crucial point: good customer service should never come at the expense of security. Your team need to know that it’s OK to say no, to slow things down, and to end a call if it doesn’t feel right.
Unlike traditional vishing attacks, AI-driven scams don’t always aim for immediate results.
Instead, they often happen in two stages:
1. Voice data collection: Attackers gather recordings of employees speaking naturally.
2. Voice cloning & impersonation: Those recordings are then used to mimic the employee’s voice and deceive colleagues, customers, or other external contacts.
Because the voice sounds real, these attacks are significantly harder to detect — especially for external contacts or customers.
During a vishing attack, you may be asked to repeat the company name, confirm your own name, or read details such as an email subject line back to the caller.
While these requests may seem low-risk, they can contribute to a much larger attack.
As vishing attacks become more sophisticated, businesses need to adapt their security approach.
1. Train Your People
2. Strengthen Your Processes
3. Use the Right Technology
Vishing is no longer just a social engineering tactic — it’s becoming an AI-powered threat that finds new and increasingly clever ways to reach personal information and business-critical data.
As a society, we have to do more to combat the seemingly endless wave of threats that are emerging year on year. The rise of AI voice scraping means that even seemingly harmless phone interactions can have serious consequences down the line.
Businesses that recognise and respond to this shift early will be far better positioned to protect their people, their customers, and their data. If you'd like to discuss vishing, AI voice scraping, or any other cybersecurity threat that's playing on your mind, please do give us a call.