BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//https://techplay.jp//JP
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALDESC:[1st AIP Open Seminar] Talks by Tensor Learning Team
X-WR-CALNAME:[1st AIP Open Seminar] Talks by Tensor Learning Team
X-WR-TIMEZONE:Asia/Tokyo
BEGIN:VTIMEZONE
TZID:Asia/Tokyo
BEGIN:STANDARD
DTSTART:19700101T000000
TZOFFSETFROM:+0900
TZOFFSETTO:+0900
TZNAME:JST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
UID:797205@techplay.jp
SUMMARY:[1st AIP Open Seminar] Talks by Tensor Learning Team
DTSTART;TZID=Asia/Tokyo:20201111T150000
DTEND;TZID=Asia/Tokyo:20201111T170000
DTSTAMP:20260502T085945Z
CREATED:20201022T140016Z
DESCRIPTION:Event details:\nhttps://techplay.jp/event/797205?utm_medium=re
 ferral&utm_source=ics&utm_campaign=ics\n\nTensor Learning Team (https://a
 ip.riken.jp/labs/generic_tech/tensor_learn/?lang=en) at RIKEN AIP\n\nSpea
 ker 1 (15:00 - 15:45): Qibin Zhao\nTitle: Overview of Tensor Networks in M
 achine Learning\nAbstract: In recent years\, tensor networks (TNs) have b
 een increasingly applied to machine learning and deep neural networks (DN
 Ns). This talk will present an overview of recent progress in TN technolo
 gy in machine learning from several aspects\, including TNs for data deco
 mposition\, model parameter representation\, and function representation. O
 ur team conducts research on this topic toward one question: whether TNs c
 an be developed into a powerful ML model with new perspectives.\n\nSpeake
 r 2 (15:45 - 16:10): Cesar Caiafa\nTitle: Sparse tensor representations a
 nd applications\nAbstract: It has been demonstrated that sparse coding o
 f natural data allows information to be captured efficiently (compressio
 n)\, providing a powerful linear model for signal processing and machin
 e learning tasks. In this talk\, I will present a generalization of spar
 se representations for multidimensional data (tensors). The Sparse Tuck
 er (ST) model provides a computationally efficient tool for classical si
 gnal processing as well as for modeling diffusion-weighted Magnetic Reso
 nance Images (dMRI)\, providing a valuable new tool for the constructio
 n and validation of macroscopic brain connectomes in neuroscience studie
 s. These results were presented at NIPS 2017 and NeurIPS 2019.\n\nSpeake
 r 3 (16:10 - 16:35): Chao Li\nTitle: Evolutionary Topology Search for Te
 nsor Network Decomposition\nAbstract: Tensor diagram notation is a simpl
 e yet powerful framework for rigorously formulating tensor network (TN) d
 ecomposition using graphs in a topological space. In this talk\, we intro
 duce a genetic algorithm (GA) to search for the (near-)optimal graphical s
 tructure of TN decomposition for a given tensor\, and use empirical resul
 ts to demonstrate that the GA can tackle this issue in an affordable mann
 er. This work was presented at ICML 2020.\n\nSpeaker 4 (16:35 - 17:00): T
 atsuya Yokota\nTitle: Nonnegative Matrix Factorization in Application to D
 ynamic PET Image Reconstruction\nAbstract: In this seminar\, we will intr
 oduce research on applying non-negative matrix factorization to the probl
 em of dynamic PET image reconstruction. Noise-robust reconstruction was a
 chieved by a method that successfully combines non-negativity constraint
 s\, low-rank constraints\, an image prior on the spatial basis\, and a sm
 oothness prior on the temporal basis. This result was presented at ICCV 2
 019.
LOCATION:Online
URL:https://techplay.jp/event/797205?utm_medium=referral&utm_source=ics&utm
 _campaign=ics
END:VEVENT
END:VCALENDAR
