#111 - Leslie Nooteboom - Co-founder & Chief Product Officer, Humanising Autonomy

43:37
 
Content is provided by Jayesh Jagasia. All podcast content, including episodes, graphics and podcast descriptions, is uploaded and provided directly by Jayesh Jagasia or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://ja.player.fm/legal

Data is the lifeblood of Artificial Intelligence. Quite simply, the better and richer the data, the more capable the algorithm. This applies both to the training data used to train the algorithm and, just as importantly, to the input data available to the algorithm when it does its job.

Take the case of autonomous vehicles or advanced driver assistance systems. These systems rely on the eyes - cameras, LIDARs and RADARs - to see the environment around the vehicle. The input from these eyes is then passed on to the brain - the algorithm - which makes sense of what the eyes see.

Most state-of-the-art ADAS and AV algorithms today are designed to perceive what these sensors see by drawing bounding boxes around road users. That’s how they perceive pedestrians, vehicles, other road users and obstacles.

But human behaviour rarely fits in a box. And human behaviour has a huge impact on how well an AV algorithm performs. A bounding box alone is not sufficient to really perceive pedestrian behaviour, for instance. Is that pedestrian about to cross the road? How much risk does this road user pose? Is that a vulnerable road user?

Enter Humanising Autonomy. A company on a mission to create a global standard for human interaction with automated systems. This is an incredibly interesting company, and I was delighted to have the opportunity to speak to their Co-founder and Chief Product Officer, Leslie Nooteboom.

Think of Humanising Autonomy as a module you could add to the AV brain, one that makes the brain capable of perceiving - and predicting - human behaviour on roads. I would imagine a solution like this could improve road safety by orders of magnitude.
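
To make that idea concrete, here is a small, purely hypothetical Python sketch of the difference between a plain bounding-box detection and one augmented with behaviour attributes. The class and field names are my own illustration of the concept, not Humanising Autonomy's actual interface.

    # Purely illustrative: a bounding-box detection as most perception stacks
    # produce it, and the same detection augmented with the kind of behaviour
    # attributes a prediction module could add. All names are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class BoundingBoxDetection:
        label: str          # e.g. "pedestrian", "vehicle"
        x: float            # top-left corner, image coordinates
        y: float
        width: float
        height: float
        confidence: float   # detector confidence, 0.0 to 1.0

    @dataclass
    class BehaviourAwareDetection(BoundingBoxDetection):
        crossing_intent: float   # estimated probability the person will cross
        vulnerability: float     # how vulnerable this road user is, 0.0 to 1.0
        risk_to_vehicle: float   # overall risk score a planner could act on

    # A plain detection only says where a pedestrian is...
    box_only = BoundingBoxDetection("pedestrian", 412.0, 220.0, 60.0, 140.0, 0.93)

    # ...while the augmented detection also says something about what they are
    # likely to do next, which is what the driving policy actually needs.
    with_behaviour = BehaviourAwareDetection(
        "pedestrian", 412.0, 220.0, 60.0, 140.0, 0.93,
        crossing_intent=0.8, vulnerability=0.9, risk_to_vehicle=0.7,
    )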

These guys are up to some really fascinating stuff that sits at the intersection of behavioural psychology, vision perception and artificial intelligence. How does that impact the world of autonomous driving? Find out in my very interesting chat with Leslie.

http://ai-in-automotive.com/aiia/111/leslienooteboom

AI in Automotive Podcast
