Artificial intelligence is evolving at an unprecedented pace—what does that mean for the future of technology, venture capital, business, and even our understanding of ourselves? Award-winning journalist and writer Anil Ananthaswamy joins us for our latest episode to discuss his latest book, Why Machines Learn: The Elegant Math Behind Modern AI. Anil helps us explore the journey and the many breakthroughs that have propelled machine learning from simple perceptrons to the sophisticated algorithms shaping today's AI revolution and powering GPT and other models. The discussion aims to demystify some of the underlying mathematical concepts behind modern machine learning, so that anyone can grasp this technology shaping our lives, even if your last math class was in high school.

Anil walks us through the power of scaling laws, the shift from optimizing training to optimizing inference, and the debate among AI's pioneers about the road to AGI—should we be concerned, or are we still missing key pieces of the puzzle? The conversation also delves into AI's philosophical implications: could understanding how machines learn help us better understand ourselves? And what challenges remain before AI systems can truly operate with agency?

If you enjoy this episode, please subscribe and leave us a review on your favorite podcast platform. Sign up for our newsletter at techsurgepodcast.com for exclusive insights and updates on upcoming TechSurge Live Summits.

Links:
Read Why Machines Learn, Anil's latest book on the math behind AI: https://www.amazon.com/Why-Machines-Learn-Elegant-Behind/dp/0593185749
Learn more about Anil Ananthaswamy's work and writing: https://anilananthaswamy.com/
Watch Anil Ananthaswamy's TED Talk on AI and intelligence: https://www.ted.com/speakers/anil_ananthaswamy
Discover the MIT Knight Science Journalism Fellowship that shaped Anil's AI research: https://ksj.mit.edu/
Understand the Perceptron, the foundation of neural networks: https://en.wikipedia.org/wiki/Perceptron
Read about the Perceptron Convergence Theorem and its significance: https://www.nature.com/articles/323533a0…
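For listeners curious about the perceptron mentioned in the links above, here is a minimal sketch of the classic perceptron learning rule. It is illustrative only, not material from the episode or the book; the function name, NumPy implementation, and toy data are all made up for demonstration.

```python
# Illustrative sketch of the perceptron learning rule (Rosenblatt-style).
import numpy as np

def train_perceptron(X, y, epochs=20, lr=1.0):
    """Learn weights w and bias b so that sign(w.x + b) matches labels y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only on a misclassified point. The Perceptron Convergence
            # Theorem guarantees this loop stops making updates if the data
            # are linearly separable.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy, linearly separable data: points with x1 + x2 > 1 are labeled +1.
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.2, 0.1], [0.9, 0.8]])
y = np.array([-1, 1, -1, 1])
w, b = train_perceptron(X, y)
print(w, b, np.sign(X @ w + b))  # learned hyperplane and its predictions
```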
Bismillah... A recent study from the California Institute of Technology (Caltech) reveals a surprising fact about the speed of human thought. The research team, led by Prof. Markus Meister and Jieyu Zheng, found that our brains process information at a rate of only about 10 bits per second. What does that mean? Consider that our senses gather up to roughly 1 billion bits of data per second, yet out of that flood of data, only about 10 bits per second are actually processed by the brain to make sense of the world and make decisions. Is it really true that the human brain works that slowly? Listen to the full episode for the explanation.
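To get a feel for the gap between those two numbers, here is a quick back-of-the-envelope calculation. The figures are simply the ones quoted above; the snippet is plain arithmetic, not part of the study.

```python
# Rough scale check of the figures quoted above.
sensory_bits_per_second = 1_000_000_000   # ~1 billion bits/s gathered by the senses
processed_bits_per_second = 10            # ~10 bits/s reportedly processed by the brain

ratio = sensory_bits_per_second / processed_bits_per_second
print(f"Only about 1 in {ratio:,.0f} incoming bits is used")  # ~1 in 100,000,000

# At 10 bits/s, working through one second's worth of raw sensory input
# would take on the order of years.
seconds_per_year = 60 * 60 * 24 * 365
print(f"{ratio / seconds_per_year:.1f} years")  # ~3.2 years
```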