where the third term is for low-rank regularization, and the fourth term is for total variation regularization. $\lambda_{\text{rank}}$ and $\lambda_{\text{tv}}$ are the respective tuning parameters. These two additional terms are introduced below.
Low-Rank Regularization. Low-rank is an assumption often used in matrix completion tasks, where the matrix is incomplete and the goal is to estimate missing values from a small number of entries. Here we use low-rank as a regularization term to help retrieve useful information from remote regions. The rank of a 3D image $X$ is defined as [11]:

$\mathrm{Rank}(X) = \sum_{i=1}^{3} \omega_i \, \| X_{(i)} \|_*$,

where the rank is computed as the combination of trace norms of all matrices unfolded along each dimension. The $\omega_i$ are parameters satisfying $\omega_i \geq 0$ and $\sum_{i=1}^{3} \omega_i = 1$, and $X_{(i)}$ is the image $X$ unfolded along the $i$-th dimension.
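As a concrete illustration (not code from the paper), this weighted sum of trace norms over the three unfoldings can be evaluated as in the following minimal NumPy sketch; the function name and the uniform default weights are assumptions.

```python
import numpy as np

def weighted_nuclear_norm(x, weights=(1/3, 1/3, 1/3)):
    """Illustrative Rank(X): weighted sum of nuclear (trace) norms of the
    mode-i unfoldings X_(i) of a 3D array x. Weights assumed uniform here."""
    total = 0.0
    for i, w in enumerate(weights):
        # Unfold along dimension i: move axis i to the front, flatten the rest.
        unfolded = np.moveaxis(x, i, 0).reshape(x.shape[i], -1)
        # Nuclear (trace) norm = sum of singular values.
        total += w * np.linalg.norm(unfolded, ord='nuc')
    return total
```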
Total-Variation Regularization. Total variation is defined as the integral of the absolute gradients of the image [12]: $\mathrm{TV}(X) = \int |\nabla X| \, dx$. It has proven useful in image denoising and super-resolution [12]. One of the main advantages of total variation is its ability to effectively preserve edges in the image.
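For illustration only, a discrete isotropic approximation of this 3D total variation can be computed with forward differences; the function name and the boundary handling below are assumptions, not the authors' implementation.

```python
import numpy as np

def total_variation_3d(x):
    """Discrete isotropic TV of a 3D array: sum of voxel-wise gradient magnitudes."""
    # Forward differences along each axis; replicating the last slice makes the
    # boundary difference zero.
    dx = np.diff(x, axis=0, append=x[-1:, :, :])
    dy = np.diff(x, axis=1, append=x[:, -1:, :])
    dz = np.diff(x, axis=2, append=x[:, :, -1:])
    return np.sum(np.sqrt(dx**2 + dy**2 + dz**2))
```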
2.4 Optimization
We employ the alternating direction method of multipliers (ADMM) algorithm to
optimize the cost function in Eq. (6). Following [13], we introduce auxiliary variables $M_i$ ($i = 1, 2, 3$) and equality constraints $M_i = X$, and thus the Lagrangian cost function is:

$\min_{X, M_i, Y_i} \; f(X) + \lambda_{\text{rank}} \sum_{i=1}^{3} \omega_i \| M_{i(i)} \|_* + \lambda_{\text{tv}} \, \mathrm{TV}(X) + \frac{\rho}{2} \sum_{i=1}^{3} \| M_i - X + Y_i \|_F^2$ (7)

where $f(X)$ denotes the data-fidelity terms of Eq. (6), $\rho$ is the penalty parameter, and $Y_i$ are the Lagrangian dual variables used to integrate the equality constraints
into the cost function. Then, we break the cost function into three subproblems and
iteratively update them. Note that when optimizing each of the three subproblems, the remaining variables are kept fixed, so that each subproblem is convex. The entire super-resolution process is summarized as Algorithm 1.
Subproblem 1: Update $X$ by minimizing:

$X^{k+1} = \arg\min_{X} \; f(X) + \lambda_{\text{tv}} \, \mathrm{TV}(X) + \frac{\rho}{2} \sum_{i=1}^{3} \| M_i^{k} - X + Y_i^{k} \|_F^2$ (8)
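Because the data-fidelity term of Eq. (6) is not reproduced in this section, the X update can only be sketched abstractly. One simple (assumed) strategy is a (sub)gradient step on the objective of Eq. (8), as below; grad_f and grad_tv are hypothetical callables for the gradients of the data-fidelity and TV terms, and the step size is illustrative.

```python
import numpy as np

def x_gradient_step(x, m, y, grad_f, grad_tv, lam_tv, rho, step=0.1):
    """One (sub)gradient step on the X subproblem of Eq. (8).
    m and y are lists of the three auxiliary variables M_i and dual variables Y_i."""
    grad = grad_f(x) + lam_tv * grad_tv(x)
    # Gradient of (rho/2) * sum_i ||M_i - X + Y_i||_F^2 with respect to X.
    grad += rho * sum(x - m_i - y_i for m_i, y_i in zip(m, y))
    return x - step * grad
```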
Subproblem 2: Update $M_i$ by minimizing:

$M_i^{k+1} = \arg\min_{M_i} \; \lambda_{\text{rank}} \, \omega_i \| M_{i(i)} \|_* + \frac{\rho}{2} \| M_i - X^{k+1} + Y_i^{k} \|_F^2$ (9)
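Subproblem 2 is a nuclear-norm proximal step, so its minimizer has the standard closed form given by singular value thresholding (SVT) of the unfolded matrix. The sketch below illustrates this update; the fold/unfold helpers and function names are assumptions, not the authors' code.

```python
import numpy as np

def unfold(x, axis):
    """Unfold a 3D array into a matrix along the given axis."""
    return np.moveaxis(x, axis, 0).reshape(x.shape[axis], -1)

def fold(mat, axis, shape):
    """Inverse of unfold: reshape a matrix back into a 3D array of the given shape."""
    moved = (shape[axis],) + tuple(s for i, s in enumerate(shape) if i != axis)
    return np.moveaxis(mat.reshape(moved), 0, axis)

def update_m(x, y_i, axis, lam_rank, w_i, rho):
    """M_i update of Eq. (9): SVT of the unfolding of (X - Y_i) with
    threshold lam_rank * w_i / rho."""
    u, s, vt = np.linalg.svd(unfold(x - y_i, axis), full_matrices=False)
    s_thr = np.maximum(s - lam_rank * w_i / rho, 0.0)
    return fold((u * s_thr) @ vt, axis, x.shape)
```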
Subproblem 3: Update $Y_i$ by:

$Y_i^{k+1} = Y_i^{k} + M_i^{k+1} - X^{k+1}$ (10)
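Putting the three updates together, a schematic version of the overall loop (in the spirit of Algorithm 1, but not a reproduction of it) might look as follows, reusing the x_gradient_step and update_m helpers sketched above; all parameter defaults are illustrative assumptions.

```python
import numpy as np

def lrtv_admm(x0, grad_f, grad_tv, lam_rank, lam_tv,
              weights=(1/3, 1/3, 1/3), rho=1.0, n_iter=50, step=0.1):
    """Schematic ADMM loop alternating the three subproblem updates."""
    x = x0.copy()
    m = [x.copy() for _ in range(3)]          # auxiliary variables M_i
    y = [np.zeros_like(x) for _ in range(3)]  # dual variables Y_i
    for _ in range(n_iter):
        # Subproblem 1: approximate X update, Eq. (8).
        x = x_gradient_step(x, m, y, grad_f, grad_tv, lam_tv, rho, step)
        # Subproblem 2: closed-form M_i updates via SVT, Eq. (9).
        m = [update_m(x, y[i], i, lam_rank, weights[i], rho) for i in range(3)]
        # Subproblem 3: dual variable updates, Eq. (10).
        y = [y[i] + m[i] - x for i in range(3)]
    return x
```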