S30 Ep1: Dr. Andrew Newell - Deep Fakes: An attack on human identity

23:35
 
Content is provided by Audioboom and Information Security Forum Podcast. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Audioboom and Information Security Forum Podcast or their podcast platform partners. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ja.player.fm/legal
Today, Steve sits down with Dr. Andrew Newell, Chief Scientific Officer at the British biometrics firm iProov, for a conversation about deep fakes. As technology improves, it’s becoming ever more difficult to determine what’s real and what’s fake. Steve and Andrew discuss what this will mean going forward for security, social media platforms, and everyday technology users.
Key Takeaways:
1. Technology is the key to mitigating the threat of deep fakes, which are synthetic images or videos created to deceive.
2. Deep fakes are becoming increasingly sophisticated, making them hard to spot.
3. Newell breaks down the problem into two parts: secure identity verification and detecting synthetic images.
4. Incentives for verifying imagery will radically shift as deep fakes become more prevalent.

Tune in to hear more about:
1. Deep fake technology and its potential impact on identity verification processes (5:57)
2. Preventing deep fake images and videos using technology and algorithmic systems (9:57)
3. Deep fakes and their potential uses, including filmmaking and education (13:11)
4. Deep fakes and their impact on society, with a focus on technology’s role in verifying authenticity (18:43)

Standout Quotes:
1. “I think the urgency here — and this is the absolutely key part — is that we need to get the technology in place to make sure that the processes that rely on the genuineness of the person in imagery, that we can have something in place that we know works, that we know that we can trust, and is something that is very easy to use.” - Andrew Newell
2. “I think on the protection of identity proofing systems against the threat from deep fakes, we have a technology solution now. And the urgency is to make sure that this technology is used wherever that we need to actually guard against that threat.” - Andrew Newell
3. “And one of the most important things, if not the most important thing, is: when we think about a way to mitigate these threats, it has to be something that works for everybody. We cannot end up with a system that only works for certain groups in a society.” - Andrew Newell
Mentioned in this episode:
Read the transcript of this episode
Subscribe to the ISF Podcast wherever you listen to podcasts
Connect with us on LinkedIn and Twitter
From the Information Security Forum, the leading authority on cyber, information security, and risk management.
