This week in AI & Machine Learning: Ego4D teaches AI to perceive the world, detecting colon cancer, robots in nursing homes, dermatology uses for TensorFlow.js, responsible AI, and more!

Author's note:

I have two upcoming workshops this month! You can always see future events at

My Top AI Highlight:

Facebook AI Announces the Ego4D dataset for Teaching AI to Perceive the World from a First-person Perspective

Creating computer vision applications to understand, navigate and interact with the world from a first-person perspective has been a difficult task, but one that’s becoming increasingly important for new applications of augmented reality, virtual reality, and robotics.

Facebook AI hopes to make research into these tasks easier with Ego4D, the world's largest first-person video dataset. It contains over 2,200 hours of unscripted activity captured by more than 700 people in 9 different countries.

The dataset aims to set new benchmarks for five complex interaction tasks: episodic memory, hand-object manipulation, audio-visual diarization, social interaction, and forecasting. Learn more about how Ego4D can help accomplish this in the video below or the official Facebook AI blog post.

🤖 Artificial Intelligence News:

🛠️ Developer Tools & Education:

📅 Upcoming Online AI & Data Science Events:

🎤 Interesting Podcasts & Interviews:

📄 Notable Research Papers:

🐘 About the Author & Plainsight:

  • Sage Elliott is a Developer Evangelist at Plainsight & passionate about making AI more approachable. Connect with Sage on Twitter or LinkedIn.
  • Plainsight’s vision AI platform streamlines the end-to-end machine learning process. From data annotation through deployment, customers quickly create and successfully operationalize their own vision AI applications to solve highly diverse business challenges. Join the conversation on Slack.