• OpenPassageways@lemmy.zip
    2 months ago

    I’ve seen this advertised as a fraud detection and prevention service, even before ChatGPT. I’m assuming there’s a standard disclosure that the call may be recorded for training purposes; it’s only recently that “training” has included “training AI”.

    • Pandemanium@lemm.ee
      2 months ago

      It doesn’t prevent any fraud when anyone on the Internet can now easily recreate anyone’s voice using AI. Banks should know better.

    • lolola@lemmy.blahaj.zone
      2 months ago

      Training then: Making all the new hires sit down and listen to a recording of you getting increasingly frustrated with their dumbass coworker.

      Training now: Burning through a neighborhood’s worth of power for data processing to ensure that the phone tree understands you with absolute certainty when you say “speak to a representative”, yet continues to ignore you anyway.

    • Fredselfish@lemmy.world
      2 months ago

      Yeah, but it should still be illegal. I mean, this AI is listening and gathering information about the customer while they discuss private banking matters.

      • Jakeroxs@sh.itjust.works
        2 months ago

        It really depends on how it’s being stored and used. Like the other commenter mentioned, it’s standard practice in the banking/brokerage industry to record all calls for training, litigation, and coaching purposes.

      • Flocklesscrow@lemm.ee
        2 months ago

        100%

        And how long before the same systems transition to healthcare (as if they haven’t already)? AI wants to know all the salacious details, baby.