Symmetry with Optimization in Real-World Applications

A special issue of Symmetry (ISSN 2073-8994). This special issue belongs to the section "Mathematics".

Deadline for manuscript submissions: 31 May 2024 | Viewed by 2794

Special Issue Editors


Dr. Jiankang Zhang
Guest Editor
Department of Computing & Informatics, Bournemouth University, Poole BH12 5BB, UK
Interests: artificial intelligence algorithms; ad hoc networks; aeronautical communications; wireless communications

Dr. Bin Li
Guest Editor
School of Computer Science, Northeast Electric Power University, Jilin 132012, China
Interests: traffic classification; support vector machine; feature selection; parameter optimization

Special Issue Information

Dear Colleagues,

Optimization methods are widely used to solve engineering problems. For example, engineering problems can be formulated as multi-objective optimization problems, computer models can be solved or identified using optimization methods, and optimization methods can be used to train deep neural networks. Practical engineering problems are often based on nonlinear data or models, and nonlinear models usually exhibit symmetry, non-convexity, and multiple equivalent solutions. Solving an optimization problem therefore requires combining mathematical knowledge, problem information, and methodological insight in a rational way. In practice, simple methods such as gradient descent often perform very well. Mining the symmetry relationships and structure of a nonlinear model can therefore help in devising simple and effective optimization methods, and in selecting the appropriate method for a specific engineering problem.
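As an illustrative aside, the interplay between symmetry and simple optimizers can be shown in a few lines of code. The sketch below is a minimal example (not taken from any submitted paper; all function names are illustrative): plain gradient descent on a non-convex objective with two symmetric, equivalent minima, where the minimum reached depends only on which side of the symmetry axis the iteration starts from.

```python
import numpy as np

def symmetric_objective(x):
    """A simple non-convex objective with two symmetric minima at (+1, +1) and (-1, -1)."""
    return (x[0] ** 2 - 1.0) ** 2 + (x[1] - x[0]) ** 2

def numerical_grad(f, x, eps=1e-6):
    """Central-difference gradient; adequate for a 2-D illustration."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

def gradient_descent(f, x0, lr=0.05, iters=500):
    """Plain gradient descent; which symmetric minimum it finds depends only on the start point."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x -= lr * numerical_grad(f, x)
    return x

print(gradient_descent(symmetric_objective, [0.5, 0.0]))   # converges near ( 1,  1)
print(gradient_descent(symmetric_objective, [-0.5, 0.0]))  # converges near (-1, -1)
```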

Dr. Jiankang Zhang
Dr. Bin Li
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Symmetry is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • optimization methods
  • symmetry in engineering problems
  • nonlinear models
  • optimization of deep neural network
  • optimization methods in engineering
  • dynamic programming problem
  • global optimization algorithm
  • constrained/unconstrained programming methods

Published Papers (2 papers)


Research

12 pages, 1913 KiB  
Article
Adaptive Multi-Channel Deep Graph Neural Networks
by Renbiao Wang, Fengtai Li, Shuwei Liu, Weihao Li, Shizhan Chen, Bin Feng and Di Jin
Symmetry 2024, 16(4), 406; https://doi.org/10.3390/sym16040406 - 01 Apr 2024
Viewed by 758
Abstract
Graph neural networks (GNNs) have shown significant success in graph representation learning. However, the performance of existing GNNs degrades seriously as their layers deepen, due to the over-smoothing issue: the node embeddings tend to converge to a common value as GNNs repeatedly aggregate the representations of their receptive fields. The main reason for over-smoothing is that the receptive fields of different nodes become similar as the layers increase, so different nodes aggregate similar information. To solve this problem, we propose an adaptive multi-channel deep graph neural network (AMD-GNN) to adaptively and symmetrically aggregate information from the deep receptive field. The proposed model ensures that the receptive field of each node in the deep layers is different, so that the node representations remain distinguishable. The experimental results demonstrate that AMD-GNN achieves state-of-the-art performance on node classification tasks with deep models.
(This article belongs to the Special Issue Symmetry with Optimization in Real-World Applications)
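For readers unfamiliar with the over-smoothing issue mentioned in the abstract, the following sketch is a generic illustration (assuming simple mean aggregation, not the AMD-GNN model itself): repeated neighborhood averaging makes node embeddings collapse toward a common value as the number of layers grows.

```python
import numpy as np

# Toy graph: a 5-node path with self-loops, written as an adjacency matrix.
A = np.eye(5)
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0

# Mean aggregation D^{-1} A: each layer replaces a node's embedding with the
# average over its neighborhood, as in many message-passing GNNs.
P = A / A.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 3))  # random initial node features

for depth in (1, 4, 16, 64):
    H_k = np.linalg.matrix_power(P, depth) @ H
    spread = H_k.std(axis=0).mean()  # how distinguishable the node embeddings remain
    print(f"depth {depth:>2}: mean std across nodes = {spread:.4f}")

# The spread decays toward zero: with depth, every node aggregates essentially the
# same receptive field, so the embeddings become indistinguishable (over-smoothing).
```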

15 pages, 1537 KiB  
Article
A Dynamic Fusion of Local and Non-Local Features-Based Feedback Network on Super-Resolution
by Yuhao Liu and Zhenzhong Chu
Symmetry 2023, 15(4), 885; https://doi.org/10.3390/sym15040885 - 09 Apr 2023
Cited by 1 | Viewed by 1090
Abstract
Many Symmetry blocks have been proposed for the Single Image Super-Resolution (SISR) task. Attention-based blocks are powerful but costly for non-local features, while Convolutional-based blocks handle local features efficiently. However, assembling two different Symmetry blocks generates an Asymmetry block, so the classic Symmetry-block-based Super-Resolution (SR) architecture fails to deal with these Asymmetry blocks. In this paper, we propose a new Dynamic fusion of Local and Non-local features-based Feedback Network (DLNFN) for SR, which optimizes the traditional Symmetry-block-based SR architecture to hold two Symmetry blocks in parallel, so that each block works on what it does best. (1) We introduce a Convolutional-based block for local features and an Attention-based block for non-local features, and propose the Delivery–Adjust–Fusion framework to hold these blocks. (2) We propose a Dynamic Weight block (DW block) that generates different weight values to fuse the outputs at different feedback iterations. (3) We introduce the MAConv layer to optimize the In block, which is critical for our two-block feedback algorithm. Experiments show that the proposed DLNFN takes full advantage of the two different blocks and outperforms other state-of-the-art algorithms.
(This article belongs to the Special Issue Symmetry with Optimization in Real-World Applications)
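The general idea of dynamically weighted fusion of a local (convolutional) branch and a non-local (attention-based) branch can be sketched as follows. This is a hypothetical PyTorch illustration under assumed names (DynamicFusion, the gating network, and all hyperparameters are invented for the example); it is not the DLNFN implementation.

```python
import torch
import torch.nn as nn

class DynamicFusion(nn.Module):
    """Illustrative fusion of a local (convolutional) and a non-local (attention) branch.

    A small gating network predicts per-sample weights that sum to 1 and blends the two
    branch outputs; this mirrors the general idea of dynamically weighted fusion only.
    """

    def __init__(self, channels: int = 64):
        super().__init__()
        # Local branch: plain convolutions capture neighborhood structure.
        self.local = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # Non-local branch: multi-head self-attention over spatial positions.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=4, batch_first=True)
        # Gate: global pooling of both branches -> two fusion weights (softmax).
        self.gate = nn.Sequential(nn.Linear(2 * channels, 2), nn.Softmax(dim=-1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        local_out = self.local(x)

        tokens = x.flatten(2).transpose(1, 2)            # (B, H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        attn_out = attn_out.transpose(1, 2).reshape(b, c, h, w)

        pooled = torch.cat([local_out.mean(dim=(2, 3)), attn_out.mean(dim=(2, 3))], dim=1)
        w_local, w_attn = self.gate(pooled).unbind(dim=-1)  # per-sample scalar weights
        return (w_local.view(b, 1, 1, 1) * local_out
                + w_attn.view(b, 1, 1, 1) * attn_out)

# Example: fuse a small batch of 64-channel feature maps.
feats = torch.randn(2, 64, 32, 32)
fused = DynamicFusion(64)(feats)
print(fused.shape)  # torch.Size([2, 64, 32, 32])
```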
