Alejandro Carderera

Quantitative Researcher

Quantfury

Biography

I am a Quantitative Researcher at Quantfury, working on Machine Learning techniques for algorithmic trading. I obtained my Ph.D. in Machine Learning at the Georgia Institute of Technology, where I worked with Prof. Sebastian Pokutta on designing novel convex optimization algorithms with solid theoretical convergence guarantees and good numerical performance.

Prior to joining the Ph.D. program, I worked at HP as an R&D Systems Engineer for two years. I obtained a Bachelor of Science in Industrial Engineering from the Universidad Politécnica de Madrid and a Master of Science in Applied Physics from Cornell University.

Interests

  • Convex Optimization
  • Theoretical Machine Learning
  • Causal Inference
  • Statistical Estimation

Education

  • Ph.D. in Machine Learning, 2021

    Georgia Institute of Technology

  • MS in Applied Physics, 2016

    Cornell University

  • BSc in Industrial Engineering, 2014

    Universidad Politécnica de Madrid

Experience

Quantitative Researcher - ML for algorithmic trading

Quantfury

Jan 2022 – Present · Atlanta, USA

Quantitative Researcher - Summer Associate

J.P. Morgan

Jun 2021 – Aug 2021 · New York, USA

Quantitative Researcher - Summer Associate

J.P. Morgan

Jul 2020 – Aug 2020 · New York, USA

Graduate Research Assistant

Georgia Institute of Technology

Aug 2018 – Dec 2021 · Atlanta, USA

R&D Systems Engineer

HP

Aug 2016 – Jul 2018 · Barcelona, Spain

Graduate Research Assistant

Cornell University

Jul 2015 – Jul 2016 · Ithaca, USA

Publications

Simple steps are all you need: Frank-Wolfe and generalized self-concordant functions - NeurIPS 2021

Generalized self-concordance is a key property present in the objective function of many important learning problems. We establish the …
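
As a point of reference (the classical textbook definition, not a claim taken from the paper itself): a three-times differentiable convex function f on the real line is self-concordant in the classical sense when

    |f'''(x)| \le 2\, f''(x)^{3/2},

and generalized self-concordance relaxes the constant and the exponent in this inequality, which brings in common objectives, such as the logistic loss, that fail the classical condition.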

FrankWolfe.jl - INFORMS Journal on Computing

We present FrankWolfe.jl, an open-source implementation of several popular Frank-Wolfe and Conditional Gradients variants for …
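
FrankWolfe.jl itself is written in Julia; as a rough sketch of the kind of method the package implements, here is a minimal NumPy version of the classic Frank-Wolfe iteration over the probability simplex. The function name, the step-size rule 2/(t+2), and the quadratic example objective are illustrative choices, not the package's API.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, max_iter=1000):
        # Vanilla Frank-Wolfe over the probability simplex.
        # grad: callable returning the gradient of a smooth convex f at x.
        x = x0.copy()
        for t in range(max_iter):
            g = grad(x)
            # Linear minimization oracle: argmin over the simplex of <g, v>
            # is attained at a vertex, i.e. the coordinate with the most
            # negative gradient entry.
            v = np.zeros_like(x)
            v[np.argmin(g)] = 1.0
            # Classic agnostic step size; no projection is ever needed,
            # since x remains a convex combination of simplex vertices.
            gamma = 2.0 / (t + 2.0)
            x = (1.0 - gamma) * x + gamma * v
        return x

    # Example: minimize ||x - b||^2 over the simplex for a point b
    # outside it (b and the objective are made up for illustration).
    b = np.array([0.8, 0.6, -0.2])
    x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.ones(3) / 3)

Each iterate only ever moves toward a vertex returned by the linear oracle, which is what makes the method projection-free.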

Parameter-Free Locally Accelerated Conditional Gradients - ICML 2021

Projection-free conditional gradient (CG) methods are the algorithms of choice for constrained optimization setups in which projections …

Second-order Conditional Gradient Sliding - Fields Institute Communications Series on Data Science and Optimization

Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, …

Locally Accelerated Conditional Gradients - AISTATS 2020

Conditional gradients constitute a class of projection-free first-order algorithms for smooth convex optimization. As such, they are …
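
In symbols (standard Frank-Wolfe notation, not specific to this paper): given a feasible region \mathcal{X} and a smooth convex f, each iteration solves a linear problem over \mathcal{X} instead of computing a projection,

    v_t \in \arg\min_{v \in \mathcal{X}} \langle \nabla f(x_t), v \rangle, \qquad x_{t+1} = x_t + \gamma_t (v_t - x_t),

with step size \gamma_t \in [0, 1], so the iterates stay feasible as convex combinations of points of \mathcal{X}.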

Breaking the Curse of Dimensionality (Locally) to Accelerate Conditional Gradients

Conditional gradient methods constitute a class of first-order algorithms for solving smooth convex optimization problems that are …