# A Hybrid Forward–Backward Algorithm and Its Optimization Application

## Abstract

## 1. Introduction

**Remark 1.**

## 2. Preliminaries

**Lemma 1.** Suppose that $W:H\to \mathbb{R}$ is Fréchet differentiable with $\nabla W:H\to H$ being κ-strongly monotone and ι-Lipschitz continuous. Define $S:=Id-\chi \alpha \nabla W$, where $\chi \in [0,1]$ and $\alpha \in (0,2\kappa /{\iota}^{2})$. Then $\parallel S(x)-S(y)\parallel \le (1-\chi \vartheta )\parallel x-y\parallel$ for all $x,y\in H$, where $\vartheta :=1-\sqrt{1-\alpha (2\kappa -\alpha {\iota}^{2})}\in (0,1]$.
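The contraction bound of Lemma 1 can be checked numerically on a quadratic $W(x)=\frac{1}{2}x^{\top}Qx$ (a hypothetical instance; the matrix $Q$ and the choices of $\chi$ and $\alpha$ below are illustrative), for which $\nabla W(x)=Qx$ is κ-strongly monotone and ι-Lipschitz with $\kappa ={\lambda}_{min}(Q)$ and $\iota ={\lambda}_{max}(Q)$:

```python
import numpy as np

rng = np.random.default_rng(0)
# Quadratic W(x) = 0.5 x^T Q x, so grad W = Qx is kappa-strongly
# monotone and iota-Lipschitz with kappa = lambda_min(Q), iota = lambda_max(Q).
M = rng.standard_normal((5, 5))
Q = M @ M.T + np.eye(5)            # symmetric positive definite
kappa = np.linalg.eigvalsh(Q)[0]
iota = np.linalg.eigvalsh(Q)[-1]

alpha = kappa / iota**2            # any alpha in (0, 2*kappa/iota**2) works
chi = 0.7                          # any chi in [0, 1]
theta = 1 - np.sqrt(1 - alpha * (2 * kappa - alpha * iota**2))

S = lambda x: x - chi * alpha * (Q @ x)   # S = Id - chi*alpha*grad W
x, y = rng.standard_normal(5), rng.standard_normal(5)
lhs = np.linalg.norm(S(x) - S(y))
rhs = (1 - chi * theta) * np.linalg.norm(x - y)
assert 0 < theta <= 1
assert lhs <= rhs + 1e-9           # the contraction estimate of Lemma 1
```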

**Lemma 2.** Let $B:H\to H$ be a ρ-inverse strongly monotone operator. Then:

- (i) B is a $\frac{1}{\rho}$-Lipschitz continuous and monotone operator;
- (ii) if ν is any constant in $(0,2\rho ]$, then $Id-\nu B$ is nonexpansive, where $Id$ stands for the identity operator on H.
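Item (ii) admits a quick numerical check, using the standard fact that a linear operator $B(x)=Qx$ with $Q$ symmetric positive semidefinite is ρ-inverse strongly monotone for $\rho =1/{\lambda}_{max}(Q)$ (the matrix and sample points below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# B(x) = Qx with Q symmetric PSD is rho-inverse strongly monotone
# for rho = 1/lambda_max(Q).
M = rng.standard_normal((4, 4))
Q = M @ M.T
rho = 1.0 / np.linalg.eigvalsh(Q)[-1]
B = lambda x: Q @ x

nu = 2 * rho                       # the extreme admissible value in (0, 2*rho]
T = lambda x: x - nu * B(x)        # T = Id - nu*B
for _ in range(100):
    x, y = rng.standard_normal(4), rng.standard_normal(4)
    # nonexpansiveness of Id - nu*B, as claimed in (ii)
    assert np.linalg.norm(T(x) - T(y)) <= np.linalg.norm(x - y) + 1e-9
```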

**Lemma 3.**

- (i) $\parallel {(Id+sA)}^{-1}(Id-sB)x-x\parallel \le 2\parallel {(Id+tA)}^{-1}(Id-tB)x-x\parallel$, $\forall x\in H$;
- (ii) ${(A+B)}^{-1}\left(0\right)=Fix\left({(Id+\lambda A)}^{-1}(Id-\lambda B)\right)$, $\forall \lambda >0$.
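Property (ii) is what makes the forward–backward iteration work: iterating $x\mapsto {(Id+\lambda A)}^{-1}(Id-\lambda B)x$ drives $x$ toward a zero of $A+B$. A one-dimensional sketch with $A=\partial \left|\cdot \right|$ (whose resolvent is soft-thresholding) and $B(x)=x-b$; the instance is illustrative, not from the paper:

```python
import numpy as np

def soft(x, t):
    # Resolvent (Id + t*A)^{-1} for A = subdifferential of |.|:
    # the soft-thresholding operator.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

b, lam = 3.0, 0.5                  # lam in (0, 2): B is 1-inverse strongly monotone
B = lambda x: x - b                # B = gradient of 0.5*(x - b)^2

x = 0.0
for _ in range(200):               # forward-backward iteration
    x = soft(x - lam * B(x), lam)  # x <- (Id + lam*A)^{-1}(Id - lam*B)x

# The fixed point solves 0 in d|x| + (x - b): here x* = 2, since
# d|2| = {1} and 1 + 2 - 3 = 0.
assert abs(x - 2.0) < 1e-8
```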

**Proof.**

**Lemma 4.** Let S be a nonexpansive mapping defined on a closed convex subset C of a Hilbert space H. Then $Id-S$ is demi-closed, i.e., whenever ${\left({x}_{n}\right)}_{n\ge 0}$ is a sequence in C weakly converging to some $x\in C$ and the sequence ${\left((Id-S){x}_{n}\right)}_{n\ge 0}$ strongly converges to some $y\in H$, it follows that $y=(Id-S)x$.

**Lemma 5.** Let ${\left({\mu}_{n}\right)}_{n\ge 0}$ be a sequence of nonnegative real numbers such that

**Lemma 6.** Let ${\left({\zeta}_{n}\right)}_{n\ge 0}$ be a sequence of nonnegative real numbers for which there exists a subsequence $\left({\zeta}_{{n}_{i}}\right)$ of ${\left({\zeta}_{n}\right)}_{n\ge 0}$ such that ${\zeta}_{{n}_{i}+1}>{\zeta}_{{n}_{i}}$ for all $i\in \mathbb{N}$. Then there exists a nondecreasing sequence $\left({m}_{j}\right)$ in $\mathbb{N}$ such that ${lim}_{j\to \infty}{m}_{j}=\infty$ and, for all sufficiently large $j\in \mathbb{N}$, ${\zeta}_{{m}_{j}+1}\ge {\zeta}_{{m}_{j}}$ and ${\zeta}_{{m}_{j}+1}\ge {\zeta}_{j}$. In fact, ${m}_{j}$ is the largest number n in the set $\{1,2,\cdots ,j\}$ such that ${\zeta}_{n+1}\ge {\zeta}_{n}$.
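The index sequence $\left({m}_{j}\right)$ of Lemma 6 is constructive, which makes it easy to illustrate (the sample sequence below is hypothetical; indices are 0-based in code):

```python
def mainge_indices(zeta):
    # For each j, m_j = max{n <= j : zeta[n+1] >= zeta[n]}, the latest
    # "uphill" index up to j; None until the first such index appears.
    m, out = None, []
    for j in range(len(zeta) - 1):
        if zeta[j + 1] >= zeta[j]:
            m = j
        out.append(m)
    return out

# A sequence whose values increase infinitely often along a subsequence:
zeta = [5, 3, 4, 2, 6, 1, 7]
m = mainge_indices(zeta)
print(m)  # -> [None, 1, 1, 3, 3, 5]

# (m_j) is nondecreasing and zeta[m_j + 1] >= zeta[m_j] wherever defined,
# as the lemma asserts.
assert all(a <= b for a, b in zip(m, m[1:]) if a is not None)
assert all(zeta[k + 1] >= zeta[k] for k in m if k is not None)
```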

## 3. Main Results

**Assumption 1.**

Algorithm 1: The hybrid forward–backward algorithm
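The pseudocode in the algorithm box did not survive extraction. As a hedged sketch only, an inertial forward–backward step of the generic form ${y}_{n}={p}_{n}+{\mu}_{n}({p}_{n}-{p}_{n-1})$, ${p}_{n+1}={(Id+\lambda A)}^{-1}({y}_{n}-\lambda B{y}_{n})$ can be written as follows for a hypothetical lasso-type instance ($M$, $b$, the inertia $0.2$, and the step $\lambda =1/L$ are illustrative choices; the paper's Algorithm 1 additionally involves the hybrid steepest-descent operator of Lemma 1):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical splitting: A = subdifferential of the l1-norm (resolvent =
# soft-thresholding), B = gradient of 0.5*||Mx - b||^2.
M = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 1.0 / np.linalg.norm(M, 2) ** 2   # step 1/L, L = Lipschitz constant of B

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

p_prev = p = np.zeros(10)
for n in range(5000):
    y = p + 0.2 * (p - p_prev)                 # inertial extrapolation
    grad = M.T @ (M @ y - b)                   # forward (explicit) step: B y
    p_prev, p = p, soft(y - lam * grad, lam)   # backward (resolvent) step

# At a fixed point, p = (Id + lam*A)^{-1}(Id - lam*B)p, i.e. a zero of A + B
residual = np.linalg.norm(p - soft(p - lam * (M.T @ (M @ p - b)), lam))
assert residual < 1e-6
```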

**Assumption 2.**

**Remark 2.**

**Theorem 1.**

**Proof.**

**Case 1.** There exists $N\in \mathbb{N}$ such that ${\parallel {p}_{n+1}-z\parallel}^{2}\le {\parallel {p}_{n}-z\parallel}^{2}$ for all $n\ge N$. Recalling that ${\parallel {p}_{n}-z\parallel}^{2}$ is bounded below, one deduces that ${lim}_{n\to \infty}{\parallel {p}_{n}-z\parallel}^{2}$ exists. Using Assumption 2 (C${}_{2}$) and letting n tend to infinity in (22), one finds that

**Case 2.** There exists a subsequence ${\left({\parallel {p}_{{n}_{j}}-z\parallel}^{2}\right)}_{j\ge 0}$ of ${\left({\parallel {p}_{n}-z\parallel}^{2}\right)}_{n\ge 0}$ such that

The same argument as in Case 1 guarantees that

Algorithm 2: The hybrid forward–backward algorithm without the inertial term
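Algorithm 2 drops the inertial extrapolation, so on a splitting $0\in Ap+Bp$ its core step reduces to the classical forward–backward iteration. A sketch on the same kind of hypothetical lasso-type instance as above ($M$, $b$, and the step $\lambda =1/L$ are illustrative; this is not the paper's exact scheme):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical splitting: A = subdifferential of the l1-norm, B = gradient
# of 0.5*||Mx - b||^2.
M = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 1.0 / np.linalg.norm(M, 2) ** 2   # step 1/L

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

p = np.zeros(10)
for n in range(5000):
    # plain forward-backward: no inertial term
    p = soft(p - lam * (M.T @ (M @ p - b)), lam)

residual = np.linalg.norm(p - soft(p - lam * (M.T @ (M @ p - b)), lam))
assert residual < 1e-6
```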

## 4. Numerical Experiment

## 5. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest


**Figure 1.** Circuit architecture diagram of model (42), where $e=\dot{p}$, $f=\ddot{p}$, $w=p+\alpha \dot{p}$, $q=\dot{p}+\ddot{p}$, $g=q/k$.

**Figure 2.** From top to bottom: the original signal, the noisy measurement, the minimum norm solution, and the signals reconstructed by Algorithm 1 (${\mu}_{n}=0.5$) and Algorithm 2, respectively.

**Figure 3.** The behavior of $\parallel {p}_{n}\parallel$ with the number of iterations (**left**) and the behavior of $\parallel {D}_{n}\parallel$ with the running time (**right**). The number of iterations is 2000. The CPU times to compute $\parallel {p}_{2000}\parallel$ are about $123.41$ s, $116.64$ s, $116.59$ s, and $138.18$ s, respectively (from top to bottom in the legend).

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Liu, L.; Qin, X.; Yao, J.-C.
A Hybrid Forward–Backward Algorithm and Its Optimization Application. *Mathematics* **2020**, *8*, 447.
https://doi.org/10.3390/math8030447
