Publications and Preprints

  1. Teshima, T., Tojo, K., Ikeda, M., Ishikawa, I., and Oono, K.,
    Universal approximation property of neural ordinary differential equations. NeurIPS 2020 Workshop: Differential Geometry meets Deep Learning, 2020 (DiffGeo4DL).
    preprint (arXiv)
  2. Teshima, T.*, Ishikawa, I.*, Tojo, K., Oono, K., Ikeda, M., and Sugiyama, M.,
    Coupling-based invertible neural networks are universal diffeomorphism approximators. Thirty-fourth Conference on Neural Information Processing Systems, 2020 (NeurIPS 2020).
    * Equal contribution. Oral presentation (one of the 105 orals among the 1900 accepted papers; paper acceptance rate 20.1%, oral acceptance rate 1.1%).
    Proceedings code (figure) slides Conference session
  3. Fujisawa, M., Teshima, T., and Sato, I.,
    γ-ABC: Outlier-robust approximate Bayesian computation based on a robust divergence estimator. The 24th International Conference on Artificial Intelligence and Statistics, 2021 (AISTATS 2021).
    preprint (arXiv) conference
  4. Kato, M. and Teshima, T.,
    Non-negative Bregman divergence minimization for deep direct density ratio estimation. arXiv:2006.06979 [cs.LG].
    preprint (arXiv)
  5. Kato, M., Teshima, T., and Honda, J.,
    Learning from positive and unlabeled data with a selection bias. Seventh International Conference on Learning Representations, 2019 (ICLR 2019).
    paper
  6. Teshima, T., Xu, M., Sato, I., and Sugiyama, M.,
    Clipped matrix completion: a remedy for ceiling effects. Thirty-Third AAAI Conference on Artificial Intelligence, 2019 (AAAI-19).
    paper supplementary code slides