
Special Episode: European Robotics Forum (ERF) 2026



Overview

Recorded live at the European Robotics Forum 2026 in Stavanger, this special episode of Among Us: Human–Robot Teaming takes you inside one of Europe’s most important robotics gatherings—through conversations, demos, and cutting-edge research.

This is not theory. This is what robotics actually looks like today.

🎤 Who You’ll Hear in This Episode

🔹 Opening Perspective: The Vision Behind ERF

We start with Nabil Belbachir, organizer of ERF, discussing:

  1. The strategic direction of robotics in Europe
  2. Where industry and research are actually converging
  3. Why events like ERF matter beyond networking

🔹 Bio-Inspired Flight & Flapping Drones

Next, a fascinating discussion with Matěj Karásek from Flapper Drones:

  1. How flapping-wing drones differ from traditional UAVs
  2. The role of bio-inspired design in robotics
  3. Real-world applications where conventional drones fail

🔹 Precision Haptics & Human Interaction

We then speak with Marco Aggravi, Project Manager and R&D Engineer at Haption:

  1. High-fidelity force feedback systems
  2. The role of haptics in teleoperation and training
  3. Why touch is still the missing piece in many robotic systems

🔹 Live Demo: Immersive Interaction with SenseGlove

A hands-on demo with SenseGlove showcases:

  1. Real-time haptic feedback in virtual and robotic environments
  2. Applications in training, simulation, and remote manipulation
  3. The transition from demo tech → deployable systems

🔹 Research Spotlight: Human Trajectory Prediction

The episode concludes with a presentation of the paper:

“Transformer-Based Human Trajectory Prediction in Manufacturing Settings”

Authored by:

  1. Even Langås
  2. Atle Aalerud
  3. Daniel Hagen
  4. Filippo Sanfilippo

Key ideas explored:

  1. Predicting human motion in industrial environments
  2. Using transformer architectures for temporal modeling
  3. Improving safety and fluency in human–robot collaboration
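To make the core idea concrete, here is a minimal NumPy sketch of attention-based trajectory prediction: a single-head self-attention pass over a short history of worker (x, y) positions, with sinusoidal positional encoding and a linear readout. All function names, dimensions, and (untrained, random) weights are illustrative assumptions for this episode summary, not the architecture from the paper itself.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sinusoidal_pe(T, d):
    # Standard sinusoidal positional encoding so the model sees time order.
    pos = np.arange(T)[:, None]
    i = np.arange(d)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(X, Wq, Wk, Wv):
    # X: (T, d) embedded position history -> (T, d) context-mixed features.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

def predict_next_position(track, d=8, seed=0):
    # track: (T, 2) past (x, y) positions of a human worker.
    # Weights are random (untrained) -- this shows the forward pass only.
    rng = np.random.default_rng(seed)
    T = len(track)
    W_in = rng.normal(scale=0.1, size=(2, d))   # embed (x, y) into d dims
    Wq, Wk, Wv = (rng.normal(scale=0.1, size=(d, d)) for _ in range(3))
    W_out = rng.normal(scale=0.1, size=(d, 2))  # readout back to (x, y)
    X = track @ W_in + sinusoidal_pe(T, d)
    H = self_attention(X, Wq, Wk, Wv)
    return H[-1] @ W_out  # predicted next position from the last time step
```

In a real system the weights would be trained on recorded motion data, the model would stack several attention layers, and the output would be a multi-step future trajectory that the robot's planner can keep clear of.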

🧠 What Connects All of This?

At first glance, these look like separate topics:

  1. Event organization
  2. Bio-inspired drones
  3. Haptics
  4. Wearable interfaces
  5. AI-based trajectory prediction

They’re not.

They all point toward one thing:

👉 Robots that understand, adapt, and physically interact with humans in real environments

⚙️ The Real Insight (No Sugarcoating)

Most robotics systems today:

  1. Perceive poorly
  2. Predict weakly
  3. Interact unnaturally

This episode shows what it takes to fix that:

  1. Better sensing (event cameras, multimodal systems)
  2. Better prediction (transformers, temporal models)
  3. Better interaction (haptics, wearable feedback)

🎯 Who This Episode Is For
  1. Robotics researchers and PhD students
  2. Engineers working on HRI/HRC systems
  3. Anyone serious about moving from demos → deployable systems

🚀 Final Takeaway

If you want real human–robot teaming, you need to combine:

Perception + Prediction + Physical Interaction

Miss one, and your system breaks in the real world.
