
Exploring a Multi-Sensor Picking Process in the Future Warehouse



  1. Exploring a Multi-Sensor Picking Process in the Future Warehouse. Alexander Diete, University of Mannheim, September 9, 2016.

  2. About the project

  3. Problem. Figure 1: Picking process in warehouses.

  4. Idea: Use sensor and video data to enhance the picking process.

  5. Hardware
     • Data glasses (Vuzix M100)
     • Wristband (custom 3D print)
     • Depth sensor (Project Tango tablet)

  6. Data gathering

  7. Data collected (a representation sketch follows this list)
     • Data glasses
       • IMU data
       • Video stream
     • Wristband
       • IMU data
       • RFID reads
     • Tango
       • Point cloud data
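The slides do not specify a data format; the following is a minimal sketch of how one recording's streams could be held in memory. All class and field names (ImuSample, RfidRead, Recording) are illustrative assumptions, not the project's actual schema.

```python
# Hypothetical in-memory representation of one recording session.
# All names and fields are assumptions made for illustration.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class ImuSample:
    timestamp: float                  # seconds on the stream's own clock
    accel: Tuple[float, float, float]  # (x, y, z) acceleration in m/s^2
    gyro: Tuple[float, float, float]   # (x, y, z) angular velocity in rad/s


@dataclass
class RfidRead:
    timestamp: float
    tag_id: str                       # tag seen by the wristband reader


@dataclass
class Recording:
    glasses_imu: List[ImuSample] = field(default_factory=list)
    glasses_video_path: str = ""                      # recorded video stream
    wristband_imu: List[ImuSample] = field(default_factory=list)
    wristband_rfid: List[RfidRead] = field(default_factory=list)
    tango_pointcloud_paths: List[str] = field(default_factory=list)
```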

  8. Recording session. Figure 2: Different parts being recorded.

  9. Point cloud. Figure 3: Third-person depth view.

  10. Recording application. Figure 4: Sensor Data Collector app.

  11. Activities to be recognized
     • Navigation (walking to shelf)
     • Locating shelf
     • Grabbing into shelf

  12. Problems
     • Time synchronization
     • Consistent recording rate for the sensors
     • Start and end points of labels

  13. Solutions
     • Zero-lining for time synchronization
     • Align datasets in post-processing (a minimal alignment sketch follows this list)
     • Manual sensor rate adjustment for the glasses
     • Use the observation video to pinpoint the start and end of activities
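The slides name zero-lining and post-processing alignment without further detail; the sketch below shows one plausible reading, assuming each stream carries its own timestamps: shift every stream so its first sample sits at t = 0, then resample onto a shared timeline. The function names and the 50 Hz target rate are assumptions.

```python
# A minimal sketch of "zero-lining" and post-processing alignment, under the
# assumption that each device logs its own timestamps. Values are 1-D signals
# (e.g., a single accelerometer axis).
import numpy as np


def zero_line(timestamps):
    """Shift a stream's timestamps so the recording starts at t = 0."""
    t = np.asarray(timestamps, dtype=float)
    return t - t[0]


def align_streams(t_a, values_a, t_b, values_b, rate_hz=50.0):
    """Resample two zero-lined streams onto a shared, evenly spaced timeline."""
    t_a, t_b = zero_line(t_a), zero_line(t_b)
    t_end = min(t_a[-1], t_b[-1])                    # overlap of both streams
    common_t = np.arange(0.0, t_end, 1.0 / rate_hz)
    a = np.interp(common_t, t_a, values_a)           # linear interpolation
    b = np.interp(common_t, t_b, values_b)
    return common_t, a, b
```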

  14. Solutions: Alignment tool

  15. Solutions: Labeling tool

  16. Dataset

  17. Description
     • First recording session resulted in 2.7 GB of data
     • Different processes recorded:
       • Picking from one shelf
       • Picking from multiple shelves
       • Picking with different hands

  18. Example. Figure 5: Accelerometer data from the wristband.
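A plot like Figure 5 could be reproduced from an exported wristband log roughly as follows; the file name and column layout (timestamp, ax, ay, az) are assumptions about the export format, not the project's actual files.

```python
# Sketch: plot wristband accelerometer data from a CSV export.
# File name and column names are placeholder assumptions.
import matplotlib.pyplot as plt
import numpy as np

data = np.genfromtxt("wristband_imu.csv", delimiter=",", names=True)

plt.figure(figsize=(10, 4))
for axis in ("ax", "ay", "az"):
    plt.plot(data["timestamp"], data[axis], label=axis)
plt.xlabel("time (s)")
plt.ylabel("acceleration (m/s^2)")
plt.title("Accelerometer data from wristband")
plt.legend()
plt.tight_layout()
plt.show()
```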

  19. Future Work

  20. Recording optimization
     • Switch to a full client-server architecture (a control-server sketch follows this list)
       • Synchronized start of recording on all devices
       • Health status of the sensors
       • Reduce the overall setup time
     • Better live preview of the data
       • Video stream and plots of the data
       • Includes health status of the sensors
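One possible shape for the synchronized start in a client-server setup is sketched below: a small control server accepts connections from the recording devices and broadcasts a single START command to all of them. The port number and message format are assumptions, not the project's actual protocol.

```python
# A minimal sketch of a control server that starts all recording devices at
# once. Port number and the "START" message format are assumptions.
import socket
import threading

HOST, PORT = "0.0.0.0", 5005
clients = []
clients_lock = threading.Lock()


def accept_clients(server):
    """Accept recording devices as they connect and keep their sockets."""
    while True:
        conn, addr = server.accept()
        with clients_lock:
            clients.append(conn)
        print(f"device connected: {addr}")


def broadcast(message):
    """Send the same command to every connected device."""
    with clients_lock:
        for conn in clients:
            conn.sendall(message.encode("utf-8"))


if __name__ == "__main__":
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen()
    threading.Thread(target=accept_clients, args=(server,), daemon=True).start()
    input("Press Enter to start recording on all devices...")
    broadcast("START\n")
```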

  21. Machine learning
     • Video stream
       • Object recognition (boxes, shelves)
       • Motion detection
     • Sensor data
       • Activity recognition (walking, standing, arm movement; a windowed-classification sketch follows this list)
     • Combination of both data streams
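For the sensor-data side, activity recognition is commonly done over sliding windows of IMU data. The sketch below illustrates that idea on synthetic accelerometer data; the window length, the statistical features, and the random-forest classifier are assumptions rather than the approach actually used in the project.

```python
# Sketch: sliding-window activity recognition on accelerometer data.
# Window size, features, and classifier are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def window_features(signal, window=100, step=50):
    """Cut a (n, 3) signal into windows and compute simple statistics."""
    feats = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]
        feats.append(np.hstack([w.mean(axis=0), w.std(axis=0),
                                w.min(axis=0), w.max(axis=0)]))
    return np.array(feats)


# Synthetic stand-in data; in practice this would come from the wristband IMU.
rng = np.random.default_rng(0)
walking = rng.normal(0, 2.0, (5000, 3))     # larger variance while walking
standing = rng.normal(0, 0.2, (5000, 3))    # small variance while standing

X = np.vstack([window_features(walking), window_features(standing)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) - len(X) // 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("window accuracy:", clf.score(X_test, y_test))
```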

  22. Depth information
     • Third-person perspective vs. first-person perspective
     • The third-person perspective is feasible for recognition but hard to deploy
     • First-person perspective: the depth sensor's minimum range is 30 cm
       • This means that detection of objects is not feasible
       • But: it can be recognized when the background is blocked by some object
       • Thus grabbing detection should be possible (an occlusion-based sketch follows this list)
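The grabbing-detection idea from the slide (an object closer than the roughly 30 cm minimum range cannot be measured, but it blocks the known background) could be prototyped as below. Depth frames are assumed to be arrays in meters with 0 marking invalid pixels; the thresholds are arbitrary placeholder values.

```python
# Sketch: grab detection from first-person depth frames. An object inside the
# sensor's minimum range yields invalid depth, but it occludes the background.
# The 0 = invalid convention and both thresholds are assumptions.
import numpy as np


def grab_detected(background, frame, depth_margin=0.15, occluded_ratio=0.25):
    """Return True if a large part of the background is blocked in `frame`.

    background, frame: (H, W) depth images in meters; 0 marks invalid pixels.
    """
    valid_bg = background > 0
    # A pixel counts as occluded if it became invalid (object too close to
    # measure) or is now clearly nearer than the recorded background.
    occluded = valid_bg & ((frame == 0) | (frame < background - depth_margin))
    ratio = occluded.sum() / max(valid_bg.sum(), 1)
    return ratio > occluded_ratio


# Tiny synthetic example: a shelf ~1 m away, then a hand entering the view.
bg = np.full((240, 320), 1.0)
hand_frame = bg.copy()
hand_frame[80:200, 80:260] = 0.0         # hand closer than 30 cm -> invalid
print(grab_detected(bg, hand_frame))     # expected: True
```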

  23. Conclusion

  24. Summary
     • Created a framework for collecting multiple data sources
     • Built tools to align and label data
     • Proposed multiple approaches for activity recognition

  25. Open questions
     • Is the selection of sensors sufficient for the task?
     • Can machine learning be applied to the combined data?
     • Is semi-supervised learning applicable across different warehouse locations?

  26. Thank you for your attention
