Applied Mathematics Postdoc @ VTech
    Pedro Juan Soto

    Researcher in Coding Theory, Distributed Computing, Algebraic Computing, Machine Learning, and Theory of Computation.

    • Google Scholar
    • Orcid
    • GitHub
    • Twitter
    • Instagram

    Publications

    • Gretchen Matthews, Pedro Soto. Algebraic Geometric Rook Codes for Coded Distributed Computing. To appear in IEEE Information Theory Workshop (ITW), 2024
    • Pedro Soto. Random Alloy Codes and the Fundamental Limits of Coded Distributed Tensors. To appear in IEEE Information Theory Workshop (ITW), 2024
    • Mariya Bessonov, Ilia Ilmer, Tatiana Konstantinova, Alexey Ovchinnikov, Gleb Pogudin, Pedro Soto. Faster Groebner bases for Lie derivatives of ODE systems via monomial orderings. In International Symposium on Symbolic and Algebraic Computation (ISSAC), Feb 2024
    • Keren Censor-Hillel, Yuka Machino, Pedro Soto. Near-Optimal Fault Tolerance for Efficient Batch Matrix Multiplication via an Additive Combinatorics Lens. In International Colloquium on Structural Information and Communication Complexity (SIROCCO), May 2024
    • Soo Go, Victor Pan, Pedro Soto. Root-Squaring for Root-Finding. In Computer Algebra in Scientific Computing (CASC), Aug 2023
    • Ilia Ilmer, Alexey Ovchinnikov, Gleb Pogudin, Pedro Soto. More Efficient Identifiability Verification in ODE Models by Reducing Non-Identifiability. arXiv preprint, Apr 2022
    • Xiaodi Fan, Pedro Soto, Yuchun Zou, Xian Su, Jun Li. Sequence-Aware Coding for Leveraging Stragglers in Coded Matrix Multiplication. In IEEE International Conference on Communications (ICC), 2023 (accepted)
    • Felisa Vazquez-Abad, Oliver Shetler, and Pedro Soto. Quantile Formulation for Optimization under a Qualitative Risk Constraint. In IEEE Conference on Decision and Control, 2022
    • Pedro Soto, Ilia Ilmer, Haibin Guan, Jun Li. Lightweight Projective Derivative Codes for Compressed Asynchronous Gradient Descent. In Proceedings of the 39th International Conference on Machine Learning, volume 162 of Proceedings of Machine Learning Research, pages 20444–20458. PMLR, 17–23 Jul 2022
    • Pedro Soto, Xiaodi Fan, Angel Saldivia, and Jun Li. Rook Coding for Batch Matrix Multiplication. In IEEE Transactions on Communications (TCOM), 2022
    • Xiaodi Fan, Angel Saldivia, Pedro Soto, and Jun Li. Coded Matrix Chain Multiplication. In 2021 IEEE/ACM 29th International Symposium on Quality of Service (IWQoS), pages 1–6, 2021
    • Xiaodi Fan, Pedro Soto, and Jun Li. Leveraging Stragglers in Coded Computing with Heterogeneous Servers. In International Symposium on Quality of Service (IWQoS), Los Angeles, California, 2020
    • Pedro Soto and Jun Li. Straggler-free coding for concurrent matrix multiplications. In 2020 IEEE International Symposium on Information Theory (ISIT), pages 233–238, 2020
    • Pedro Soto, Jun Li, and Xiaodi Fan. Dual Entangled Polynomial Code: Three-Dimensional Coding for Distributed Matrix Multiplication. In Proceedings of the 36th International Conference on Machine Learning (ICML), volume 97 of Proceedings of Machine Learning Research, pages 5937–5945, Long Beach, California, Jun 2019