Artificial Intelligence (AI) has become an integral part of our daily lives, assisting with everything from office tasks to household activities.
However, over time, AI has been misused by some individuals to commit fraud.
AI-related scams, particularly those using deepfake technology, are increasingly taking advantage of unsuspecting victims. Deepfake technology uses AI to generate fake audio or video that is nearly indistinguishable from the real thing, manipulating images, video, or sound to make it seem as though someone did or said something they never did.
Here’s an explanation of AI-based scams, examples of how they work, and tips to protect yourself.
AI Voice Scams
The Indonesian Financial Services Authority (OJK) has urged the public to stay alert to new types of fraud involving deepfake technology, particularly over phone calls.
The scam call features a voice that sounds like a family member, friend, or someone else you know: AI voice cloning can mimic a person's voice closely enough to fool even the people closest to them.
This often leads victims to follow the caller's instructions without question.
AI Video Scams
Recently, a viral social media case involved scammers using video calls featuring deepfake-generated visuals of celebrities.
In this scam, victims received video calls from unknown numbers, showing what appeared to be a celebrity’s face and voice, offering fake prizes or rewards.
A deepfake is a sophisticated AI-generated impersonation of a person's voice or appearance. The process involves analyzing large amounts of a person's videos and other online content, after which the AI algorithm produces a highly accurate imitation.
Scammers typically use deepfakes to impersonate individuals that potential victims know or trust, aiming to extort money or collect personal data.
Because the voice and face are familiar, the unsuspecting victim is likely to comply with the request without hesitation.
Tips to Stay Safe from AI Scams
Here are some practical steps to avoid becoming a victim of AI fraud:
1. Avoid Sharing Personal Information on Social Media
The first and most important tip is to avoid carelessly sharing your personal information on social media. Even if what you post seems harmless, cybercriminals can use it, especially photos, videos, and voice recordings, to build convincing deepfakes and scams.
2. Be Cautious with Unknown Numbers or Accounts
While not all unknown numbers or accounts are scams, it’s best to avoid responding if you’re not expecting a call or message.
If someone claiming to be a family member, friend, or official contacts you from an unfamiliar number, verify their identity through a trusted source, such as a family WhatsApp group, before proceeding.
3. Ask Specific Questions
If you’ve already answered the phone and the voice or visuals seem familiar, reconfirm their identity by asking specific questions. Ask the caller or video caller to mention things that only the two of you would know. If their answers seem suspicious, hang up the phone or end the video call immediately.
4. Do Not Comply with Suspicious Requests
Using AI to imitate the voice or face of someone the victim knows, the perpetrator typically calls or video-calls with bait such as valuable goods supposedly held by immigration and offered at a discount, or houses and vehicles at auction for tempting prices.
They may also deliver urgent news: a parent needs help, your car has broken down, or the police claim you are implicated in online gambling.
If they then ask for money or personal data, stay alert and don't hesitate to refuse, even if the person claims to be a family member, close friend, or a known official or celebrity.
This is especially true if the caller rarely contacts you and has never made such a request before; the safest course of action is to refuse firmly and end the call or video call.
5. Listen for Signs of AI Voice Fraud
While a person's voice can be imitated, cloned audio and video usually contain unusual pauses or distortions that can give a deepfake away.
So, pay close attention to the caller's voice, especially the rhythm and pace of their speech, which are often harder to replicate. If you're unsure whether it sounds like someone you know, disconnect the call immediately.
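For readers comfortable with a bit of code, long, unnatural pauses can even be measured programmatically. Below is a minimal sketch in Python using the open-source librosa audio library; the file name and thresholds are illustrative assumptions, not a tuned fraud detector:

```python
import librosa

# Load the recording (file name is a placeholder for your own clip).
y, sr = librosa.load("suspect_call.wav", sr=16000)

# Split the audio into non-silent intervals; top_db is an assumed threshold.
intervals = librosa.effects.split(y, top_db=30)

# Measure the silent gaps between consecutive speech segments, in seconds.
gaps = [(start - prev_end) / sr
        for prev_end, start in zip(intervals[:-1, 1], intervals[1:, 0])]

# Flag pauses longer than 1.5 seconds (an assumed cutoff) as worth a second look.
long_pauses = [round(g, 2) for g in gaps if g > 1.5]
print(f"Found {len(long_pauses)} unusually long pause(s): {long_pauses}")
```

A genuine caller pauses too, of course, so treat this only as one extra signal alongside listening carefully yourself.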
6. Watch for Signs of Visual AI Fraud
Deepfake videos often have imperfections. Look for:
- Mismatched facial movements and voice synchronization
- Unnatural eye movements or gazes
- Blurry edges or distorted backgrounds
- Inconsistent video quality in certain areas
- Overly perfect or exaggerated visuals
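As a rough illustration of the "inconsistent quality" sign above, the short Python sketch below uses the OpenCV library to compare sharpness between the centre of a video frame (where a swapped face usually sits) and its border. The file name and the interpretation of the ratio are assumptions for illustration, not a reliable deepfake detector:

```python
import cv2

# Grab one frame from the suspect video (file name is a placeholder).
cap = cv2.VideoCapture("suspect_call.mp4")
ok, frame = cap.read()
cap.release()
if not ok:
    raise SystemExit("Could not read the video file")

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
h, w = gray.shape

def sharpness(region):
    # Variance of the Laplacian is a standard focus/sharpness measure.
    return cv2.Laplacian(region, cv2.CV_64F).var()

centre = gray[h // 4: 3 * h // 4, w // 4: 3 * w // 4]  # likely face area
border = gray[: h // 8, :]                             # background sample

ratio = sharpness(centre) / max(sharpness(border), 1e-6)
# A large mismatch in sharpness (threshold assumed) can hint at a pasted-in face.
print(f"Centre/border sharpness ratio: {ratio:.1f}")
```

Real videos vary in sharpness too, so, as with the audio check, treat the result as one more hint rather than proof.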
Those are some examples of how deepfake scams use AI voice and visual technology, along with tips for staying safe.
Stay Informed to Stay Safe
To help protect yourself, here’s a list of official BCA contact numbers and accounts:
- Halo BCA Phone Number: 1500888 (no prefix)
- WhatsApp Bank BCA: 08111500998 (look for the verified blue checkmark)
- Official BCA Social Media Accounts: Check at bca.co.id/socialmedia
- HaloBCA App: Only download from the official Play Store or App Store.
If you suspect fraud or have fallen victim to a scam, contact BCA immediately. For the latest updates on fraud schemes, visit www.bca.co.id/awasmodus.