TVM for edge computing platform




  1. TVM for edge computing platform. NTT Software Innovation Center, Kazutaka Morita

  2. Inference in the 5G era. Devices can offload inference to a MEC (Mobile Edge Computing) server at the base station (~10 ms latency) or to the cloud over the Internet.

  3. Benefits of offloading inference, in terms of computing resources and of inference with data. Edge: high-end, server-spec accelerators (GPU; the edge is one of the targets of AI accelerators), real-time inference, and interaction with other devices. Cloud: big data are available, enabling inference with big data. Device: 5G, CPU, and possibly an AI chip, but an AI chip is unavailable for low-end devices.

  4. Example: Augmented Reality (HYPER-REALITY: https://vimeo.com/166807261). AR involves many inference tasks: object detection on captured images, plane detection so that a bouncing object will not collide, and point cloud inference on point cloud data in the cloud for occlusion and object segmentation. Inference with big data in the cloud can also provide colliders from moving real-world objects.

  5. Edge computing platform with TVM. The developer uses a framework for developing edge computing applications that distributes runtimes to the device, the edge, and the cloud. Through a TVM SDK, the device offloads inference data to the edge or to the cloud over the Internet when necessary, based on device and communication status.
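
The offload decision sketched on this slide can be illustrated with a small, hedged example: probe the link to the edge server and offload only when the measured round trip plus an edge inference budget beats running the model on the device. The host name, port, thresholds, and probe below are assumptions for illustration; the presentation does not specify how the TVM SDK would implement this.

```python
import time
import socket

# Hypothetical sketch of the offload decision described on the slide:
# run inference locally unless the edge link is healthy and fast enough.
# Endpoints, thresholds, and timings are illustrative placeholders.

EDGE_HOST, EDGE_PORT = "edge.example.net", 9090
EDGE_INFERENCE_MS = 10.0          # assumed inference cost on the edge server
LOCAL_INFERENCE_MS = 45.0         # assumed inference cost on the device

def probe_edge_latency_ms(host, port, timeout=0.2):
    """Measure TCP connect time to the edge server as a cheap link probe."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000.0
    except OSError:
        return float("inf")       # edge unreachable; fall back to the device

def choose_execution_target():
    rtt_ms = probe_edge_latency_ms(EDGE_HOST, EDGE_PORT)
    # Offload only when the round trip plus edge inference beats local inference.
    if rtt_ms + EDGE_INFERENCE_MS < LOCAL_INFERENCE_MS:
        return "edge"
    return "device"

if __name__ == "__main__":
    print("Run inference on:", choose_execution_target())
```

A real scheduler would also weigh payload size, battery, and accelerator availability, which the slide groups under "device and communication status".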

  6. What is required for TVM? First, a heterogeneous runtime with dynamic offloading support: execute on the device, or on the edge via RPC, with a scheduler that switches between them based on device and communication status. Second, Smart NIC support on the edge (alongside CPU, GPU, and FPGA), with no overhead from PCIe communication or host memory access. Auto tuning support would also be nice.
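
Running a compiled model "on edge via RPC", as the slide asks for, is already close to TVM's documented RPC workflow: compile on a development machine, upload the library to a remote RPC server, and execute it through the graph executor. The sketch below assumes a placeholder edge host/port and a trivial Relay function; the calls follow recent TVM releases and may differ slightly between versions.

```python
import numpy as np
import tvm
from tvm import relay, rpc
from tvm.contrib import graph_executor

# Sketch of "execute on edge via RPC" using TVM's RPC support.
# The edge host/port and the tiny Relay function are placeholders.

# 1. Build a trivial Relay module. Target "llvm" assumes an x86 edge server;
#    a different edge architecture would need a cross-compilation target.
x = relay.var("x", shape=(1, 64), dtype="float32")
func = relay.Function([x], relay.nn.relu(x))
mod = tvm.IRModule.from_expr(func)
lib = relay.build(mod, target="llvm")

# 2. Export the compiled library and push it to the edge server over RPC
#    (the edge side runs `python -m tvm.exec.rpc_server --port 9090`).
lib.export_library("edge_model.tar")
remote = rpc.connect("edge.example.net", 9090)
remote.upload("edge_model.tar")
rlib = remote.load_module("edge_model.tar")

# 3. Run the module remotely and fetch the result back to the device.
dev = remote.cpu()
module = graph_executor.GraphModule(rlib["default"](dev))
module.set_input("x", np.random.rand(1, 64).astype("float32"))
module.run()
print(module.get_output(0).numpy().shape)
```

TVM's auto-tuning flow (AutoTVM and the auto-scheduler) can reuse the same RPC server to measure candidate kernels directly on the edge hardware, which lines up with the slide's note that auto tuning support would also be nice.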

  7. Thank You! Email: kazutaka.morita.fp@hco.ntt.co.jp

