
Content is provided by Paul White-Jennings. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Paul White-Jennings or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ja.player.fm/legal

Steering the future of IT ft. Elizabeth M. Adams and Joe Weinman

26:01

Archived series ("Inactive feed" status)

When? This feed was archived on April 07, 2023 15:16 (1y ago). Last successful fetch was on October 25, 2022 09:51 (1+ y ago)

Why? Inactive feed status. The server was unable to retrieve a valid podcast feed for a sustained period.

What now? You might be able to find a more up-to-date version using the search function. This series will no longer be checked for updates. If you believe this to be in error, please check if the publisher's feed link below is valid and contact support to request the feed be restored or if you have any other concerns about this.

Manage episode 317845737 series 2874135
コンテンツは Paul White-Jennings によって提供されます。エピソード、グラフィック、ポッドキャストの説明を含むすべてのポッドキャスト コンテンツは、Paul White-Jennings またはそのポッドキャスト プラットフォーム パートナーによって直接アップロードされ、提供されます。誰かがあなたの著作権で保護された作品をあなたの許可なく使用していると思われる場合は、ここで概説されているプロセスに従うことができますhttps://ja.player.fm/legal

Data is being called the new oil as companies race to get as much of it as possible. But without the right measures to keep certain AI-centric biases in check, or forethought to incorporate ethical measures early, the real power of data can become too slick to handle. Listen as CEO Elizabeth M. Adams and author Joe Weinman talk about what the future of IT can (and should) look like. From AI bias to unreliable algorithms to the working harmony between humans and machines, see what we should be accounting for as our IT-enabled future fast approaches.

Key Takeaways:

[2:43] Why does bias happen in IT systems? When there isn’t enough diversity in the data sets a model is trained on, that bias carries through the life of the algorithm. Elizabeth shares an example from facial recognition: the data ends up being sold on the market, a customer uses it to decide whether or not someone is deemed a threat, and that decision rests on biased information. Then, if law enforcement agencies decide to use that data, they can over-police in already over-policed communities and cause a systemic problem, all because of that data.

[5:35] All areas of our life are impacted by algorithms, from traffic patterns to predictions about who should get a loan, their interest rate, health insurance, and what type of health coverage someone is granted.

[6:13] Joe shares two scenarios for how humans will interact with machines over the coming decades. In the first, humans are replaced by machines. In the second, and the most likely scenario, humans collaborate with machines to create better solutions and higher productivity.

[8:00] Human supervision can be extremely relevant in using information technology and AI. Joe shares some examples from MIT’s Kevin Slavin, such as flash crashes caused by program trading.

[10:54] Responsibility in AI is shared between technical and non-technical teams. Building ethical technology doesn’t eliminate the possibility of unethical results, and we need more resources dedicated to areas like AI ethics and governance within our companies, especially large ones acting as nation-states.

[16:27] Elizabeth discusses some best practices for adding ethics to more computer science courses so that students gain a critical perspective early on.

[18:09] Companies that don’t consider themselves to be in the tech business will need to play catch-up fast and take on that responsibility themselves before the government has to step in. Hopefully, more companies will begin to take a more serious look at the ethical components of the tech they rely on. Elizabeth discusses long-wave theory, which describes how long each technological revolution takes to unfold.

[23:27] Will we be in a Terminator Skynet scenario? Quite possibly, says Joe, but we have to figure out where humans are going to be in the loop and understand what our algorithms are doing and how they're training other algorithms.

Quotes:

  • [2:25] “Data is what they’re calling the new oil, and there’s a race to how much data a company can consume.” - Elizabeth
  • [5:39] “All the technologies that make sense of more data in less time and more intricate ways are fueling some of the most exciting and polarizing advancements.” - Joe
  • [7:57] “The best performance sometimes is through a joint human and machine.” - Joe
  • [14:10] “If you look at human behavior, you have a wide spectrum of possibilities, ranging from Mother Teresa to say a dictator that kills millions of people. The way the technology gets employed, and that is not the technology's fault.” - Joe
  • [18:48] “For those companies who are not able to quickly adapt to this digital moment that we are having, I don’t think they will be around for long. That’s where we are, where we are going to stay, and where the jobs are going to be.” - Elizabeth
  • [25:18] “We have to put ethics at the forefront of all of our business. Whether you think you work in tech or not.” - Joe

Continue on your journey:

pega.com/podcast


45 episodes

