After being scammed into thinking her daughter had been kidnapped, an Arizona woman testified before the US Senate about the dangers of artificial intelligence technology in the hands of criminals.

  • animist@lemmy.one · 1 year ago

    This is gonna become waaaaay more common. I am already working on code words with my family members just in case.

  • DarkThoughts@kbin.social · 1 year ago

    Is no one questioning how the alleged kidnappers managed to build a voice profile of a random 15-year-old girl convincing enough to pull this off? The only source claiming this was potentially an AI scam was, in fact, just another parent:

    But another parent with her informed her police were aware of AI scams like these.

    Isn’t it more likely that dad & daughter did this and it backfired?

    • davidhun@lemmy.sdf.org · 1 year ago

      Given the prevalence of social media platforms where people post videos of themselves, it seems pretty easy to collect enough voice samples to generate a convincing clone. And depending on how much personal info she and her family put out on social media, it’s trivial to connect the dots and concoct a plausible scenario to scam someone.

      Now whether or not it was “just a prank, bro” from family or whomever, I don’t know.

  • Gray@lemmy.ca · 1 year ago

    My grandma fell for a scammer who pretended to be one of her grandchildren stuck in a jail in Mexico over a mixup. No AI voice or anything, just an actor and a vulnerable 90+ year old woman. She sent the scammer $10,000. I cannot fucking begin to imagine what AI is going to do to the scamming industry.

  • carnha@lemmy.one · 1 year ago

    I’d only been thinking about the implications of faking a celebrity’s voice - personalizing it like this makes me sick to my stomach. I had no idea it’s already that easy. I don’t think the voice would even have to be that realistic - if they’re faking a life-threatening situation, my first thought isn’t going to be “Hey, their voice sounded a little off”. Absolutely horrifying.

  • WorseDoughnut@kbin.social · 1 year ago

    When DeStefano tried to file a police report after the ordeal, she was dismissed and told this was a “prank call”.

    Why am I not surprised.

  • Plume (She/Her)@beehaw.org · 1 year ago

    I hate everything about AI and this is not helping. It feels like we threw open a door we should never have touched.

  • TruthButtCharioteer@kbin.social · 1 year ago

    Soooo… holdup.

    1. Take out kidnapping insurance
    2. “Go to Mexico”
    3. Get “kidnapped”
    4. Run the scam with your fancy schmancy AI
    5. Get the payout and get “rescued”