Hi, I’m Kuba Perlin.
I’m currently working as a Science Research Engineer at DeepMind.
Background
I studied Computer Science at the University of Cambridge (BA, Triple First) and the University of Oxford (MSc, Distinction).
My studies focussed on machine learning and theoretical computer science, including probabilistic algorithms, complexity theory, game theory, and computational learning theory.
I wrote my MSc thesis at the Oxford Robotics Institute, on 3D rotation invariance in neural networks for point cloud processing.
Work Experience
Before joining DeepMind, I worked full-time at Cohere, a start-up training and serving Large Language Models. Prior to that, I interned at EPFL, Google, and NASA, among other places.
At NASA, I worked on applications of deep learning to Computational Fluid Dynamics.
I have also worked as a Teaching Assistant (a.k.a. supervisor) at the University of Cambridge, teaching Probability and Complexity Theory courses in 2020, and Discrete Mathematics in 2021.
Last but not least, for two years I taught an after-school maths class preparing a cohort of gifted Polish students for the national Junior Math Olympiad. In my second year of teaching, my school arguably performed best in the country.
Publications
- Interlocking Backpropagation: Improving depthwise model-parallelism (2021, JMLR)
Aidan N. Gomez*, Oscar Key*, Kuba Perlin, Stephen Gou, Nick Frosst, Jeff Dean, Yarin Gal
- Scalable Training of Language Models using JAX pjit and TPUv4 (2022)
Joanna Yoo*, Kuba Perlin*, Siddhartha Rao Kamalakara, João G.M. Araújo