Advanced Optimization Algorithms in the Era of Machine Learning

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Mathematics and Computer Science".

Deadline for manuscript submissions: 31 December 2024

Special Issue Editor


Dr. Francesco Alesiani
Guest Editor
Machine Learning for Computational Science Group, NEC Laboratories Europe, 69115 Heidelberg, Germany
Interests: machine learning; information theory; optimization; control

Special Issue Information

Dear Colleagues, 

We are witnessing the rapid advance of machine learning and its application across a wide range of domains. With this Special Issue, we would like to provide an opportunity to explore alternative optimization algorithms that offer a different perspective on machine learning.

We would like to explore extensions of optimization algorithms that improve the efficiency of learning, support the modelling of Bayesian systems, reveal connections with computational physics, integrate discrete variables, or incorporate the solution of bilevel or multilevel optimization problems. In addition, we would like to consider the integration of discrete and continuous formulations, for example where the cost function and constraints of an optimization problem are estimated from data, either in a two-step procedure or through end-to-end training. We also welcome work on Hessian-free optimization methods for bilevel problems (a minimal sketch is given below), as well as efficient approaches to optimal transport, variational optimization, equilibrium problems, normalizing flows, and generative diffusion models.
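
To make the bilevel direction concrete, the following is a minimal, illustrative sketch (not part of the call itself) of a Hessian-free hypergradient computed via the implicit function theorem: the inner Hessian is never formed explicitly and is accessed only through Hessian-vector products inside a conjugate-gradient solve. The toy ridge-regression setup, data sizes, and hyper-parameters are assumptions chosen purely for illustration.

```python
import torch

# Toy bilevel problem (illustrative assumption): the outer variable is a
# ridge penalty lam, the inner problem is regularized least squares.
torch.manual_seed(0)
X_tr, y_tr = torch.randn(80, 5), torch.randn(80)
X_val, y_val = torch.randn(40, 5), torch.randn(40)

def inner_loss(w, lam):
    # Inner (lower-level) objective: training loss + ridge penalty.
    return ((X_tr @ w - y_tr) ** 2).mean() + lam * (w ** 2).sum()

def outer_loss(w):
    # Outer (upper-level) objective: validation loss (independent of lam).
    return ((X_val @ w - y_val) ** 2).mean()

lam = torch.tensor(0.1, requires_grad=True)

# 1) Approximately solve the inner problem by plain gradient descent.
w = torch.zeros(5, requires_grad=True)
for _ in range(500):
    g, = torch.autograd.grad(inner_loss(w, lam), w)
    w = (w - 0.05 * g).detach().requires_grad_(True)

# 2) Hessian-free access: inner Hessian only via Hessian-vector products.
def hvp(v):
    g, = torch.autograd.grad(inner_loss(w, lam), w, create_graph=True)
    Hv, = torch.autograd.grad(g, w, grad_outputs=v)
    return Hv

# 3) Conjugate gradient to solve H q = dL_out/dw without forming H.
b, = torch.autograd.grad(outer_loss(w), w)
q, r = torch.zeros_like(b), b.clone()
p = r.clone()
for _ in range(50):
    Hp = hvp(p)
    alpha = (r @ r) / (p @ Hp)
    q = q + alpha * p
    r_new = r - alpha * Hp
    if r_new.norm() < 1e-8:
        break
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

# 4) Implicit-function hypergradient: since L_out has no direct lam term,
#    dL_out/dlam = -q^T d(grad_w L_in)/dlam.
g_w, = torch.autograd.grad(inner_loss(w, lam), w, create_graph=True)
mixed, = torch.autograd.grad(g_w, lam, grad_outputs=q)
print("hypergradient w.r.t. lambda:", (-mixed).item())
```

The same pattern scales to neural-network inner problems, since only gradient and Hessian-vector-product calls are required.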

Within the scope of the Special Issue are, among others:

  • lower-bound approximations and stochastic relaxations of discrete problems;
  • adjoint methods for optimization problems;
  • solver-free learning of optimization problems;
  • optimization-free gradient estimation methods;
  • continuous relaxations of discrete operations and algorithms (e.g., sorting, ranking, argmax, shortest path, if-else constructs, loops, top-k, logic operators, indexing), as sketched after this list;
  • smoothing and variational methods for the optimization of discrete structures (e.g., graphs, trees, sequences);
  • semidefinite relaxations of discrete or logical optimization problems (e.g., SAT and MaxSAT solvers);
  • optimization and integration of differentiable numerical simulators (e.g., field equations, fluid dynamics, molecular dynamics, particle simulators, protein binding, protein folding).
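
As one concrete example of a continuous relaxation of a discrete operation, the short sketch below (illustrative only; the function name and toy numbers are our own assumptions) replaces the non-differentiable argmax/one-hot selection with a temperature-controlled softmax, so that gradients can flow through the selection step; as the temperature goes to zero, the relaxation approaches the hard one-hot result.

```python
import torch

def soft_one_hot(scores: torch.Tensor, tau: float) -> torch.Tensor:
    """Differentiable relaxation of one_hot(argmax(scores)).

    As tau -> 0 the output approaches the hard one-hot vector;
    larger tau gives a smoother (more uniform) distribution.
    """
    return torch.softmax(scores / tau, dim=-1)

scores = torch.tensor([1.0, 2.5, 0.3], requires_grad=True)

# Hard selection: exact but non-differentiable (no gradient to scores).
hard = torch.nn.functional.one_hot(scores.argmax(), num_classes=3).float()
print("hard one-hot:", hard)

# Soft selection at decreasing temperatures approaches the hard result.
for tau in (1.0, 0.1, 0.01):
    print(f"tau={tau}:", soft_one_hot(scores, tau).detach())

# Gradients flow through the relaxed selection, e.g. when softly
# selecting one entry of a value vector inside a loss.
values = torch.tensor([10.0, -3.0, 7.0])
loss = (soft_one_hot(scores, 0.1) * values).sum()
loss.backward()
print("grad on scores:", scores.grad)
```

The same idea underlies smoothed sorting, ranking, and top-k operators, where a temperature parameter trades off faithfulness to the discrete operation against gradient quality.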

Dr. Francesco Alesiani
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • optimization for machine learning
  • gradient estimation for continuous–discrete programming
  • efficient algorithms for optimal transport, diffusion models, and normalizing flows
  • optimization algorithms for discrete and combinatorial optimization
  • optimization algorithms for bilevel optimization
  • optimization algorithms for Bayesian learning
  • physics-inspired optimization and learning methods

Published Papers

This special issue is now open for submission.