Posted: 2024-01-27 18:03:50 RSS feed roundup through 2024-01-27 18:00 (4 items)
Category | Site | Article title | Link URL | Frequent words / summary | Registered |
---|---|---|---|---|---|
IT | 気になる、記になる… | Fossil to exit the smartwatch business | https://taisy0.com/2024/01/27/194468.html | fossil,google,theverge | 2024-01-27 08:41:12 |
IT | 気になる、記になる… | Apple refurbished store update 2024/1/27 − Many M2 MacBook Air models added | https://taisy0.com/2024/01/27/194466.html | apple,macbookair,initial defects | 2024-01-27 08:05:28 |
Program | New posts tagged AWS - Qiita | Heartfelt thanks. (AWS Cloud Quest) | https://qiita.com/nikorasu277/items/8ba948f0228937537b81 | awscloudquest | 2024-01-27 17:09:39 |
Overseas TECH | Engadget | ElevenLabs reportedly banned the account that deepfaked Biden's voice with its AI tools | https://www.engadget.com/elevenlabs-reportedly-banned-the-account-that-deepfaked-bidens-voice-with-its-ai-tools-083355975.html?src=rss | ElevenLabs, an AI startup that offers voice-cloning services, has banned the user who created an audio deepfake of Joe Biden used in an attempt to disrupt the elections, according to Bloomberg. The audio impersonating the president was used in a robocall that went out to some voters in New Hampshire last week, telling them not to vote in their state's primary. It initially wasn't clear what technology was used to copy Biden's voice, but a thorough analysis by security company Pindrop showed that the perpetrators used ElevenLabs' tools. The security firm removed the background noise and cleaned the robocall's audio before comparing it to samples from a large set of voice synthesis technologies used to generate deepfakes. Pindrop CEO Vijay Balasubramaniyan told Wired that the result came back with a very high likelihood that it was ElevenLabs. Bloomberg says the company was notified of Pindrop's findings and is still investigating, but it has already identified and suspended the account that made the fake audio. ElevenLabs told the news organization that it can't comment on the issue itself, but that it's dedicated to preventing the misuse of audio AI tools and that it takes any incidents of misuse extremely seriously. The deepfaked Biden robocall shows how technologies that can mimic somebody else's likeness and voice could be used to manipulate votes in the upcoming presidential election in the US. "This is kind of just the tip of the iceberg in what could be done with respect to voter suppression or attacks on election workers," Kathleen Carley, a professor at Carnegie Mellon University, told The Hill. "It was almost a harbinger of what all kinds of things we should be expecting over the next few months." It only took the internet a few days after ElevenLabs launched the beta version of its platform to start using it to create audio clips that sound like celebrities reading or saying something questionable. The startup allows customers to use its technology to clone voices for artistic and political speech contributing to public debates. Its safety page does warn users that they cannot clone a voice for abusive purposes such as fraud, discrimination, hate speech, or any form of online abuse without infringing the law. But clearly it needs to put more safeguards in place to prevent bad actors from using its tools to influence voters and manipulate elections around the world. This article originally appeared on Engadget. | 2024-01-27 08:33:55 |