Except now they record your voice and use it to train voice AI and scam you harder. My coworker's ex-husband got a call from their "daughter": distressed, "kidnapped", needing ransom money. He sent it, then called the ex-wife. The daughter was asleep at home.
I wonder if they do. That seems like a lot of effort for a scammer to go to for the average person.
It seems easier to have a generic voice, rely on the fact that phone audio quality isn’t great to bridge the gap, and use a shotgun approach.
Some places do, since there were a few high profile attacks, but they were nearly all targeting organisations by pretending to be the CEO or something.
Once it's automated it's the same effort either way. Probably something even vibe coding could pull off.
that’s why they only get one word from me. and it’s said like a jolly game show host.
I’ve heard of this scenario as an example of why not to put your face on the internet. Now with AI it’s actually happening.