
A Classification of Screening-like Methods

Author: Tran Thu Le
Date: 10/02/2023

The map of screening methods

Screening methods have proved to be a powerful tool for reducing the dimension of optimization problems: they identify coordinates that are guaranteed (safe rules) or expected (non-safe rules) to be zero at the solution, so the corresponding columns of the design matrix can be discarded before or during optimization. We may classify these methods into the following categories (a sketch of the generic safe-region test is given after the first list):

Safe Screening rules with safe regions:

  1. Static Ball [8, 2012]
  2. Static Ellipsoid [9, 2012]
  3. Static Dome [10, 2012]
  4. Static balls (ST1, ST2 and ST3 balls) [12, 2011] (here ST1 = the ball in [8])
  5. Sequential Balls (DPP, EDPP ball) [11, 2013]
  6. Sequential Dome (SASVI dome) [13, 2014]
  7. Sequential Dome-like [14, 2014]
  8. Dynamic Ball [15, 2015]
  9. Dynamic Ball and Dome (GAP ball and GAP dome - best regions) [16, 2015]
  10. Sequential and Dynamic Balls (FNE ball) [17, 2016] (here Sequential FNE = EDPP [11])
  11. Sequential Ball, Dome and double-cuts Dome [18, 2016]
  12. Dynamic SASVI Dome and Dynamic EDPP Ball [22, 2021]
  13. Dynamic Hölder Dome [23, 2022] (here Hölder dome = Dynamic SASVI dome [22])
  14. Static double-cuts Dome [26, 2022] (quantile regression problem)
  15. Sequential Dome [29, 2021] (logistic problem)
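
All of the regions above feed the same elimination test. As a sketch, assume the standard Lasso $\min_\beta \frac{1}{2}\|y - X\beta\|_2^2 + \lambda\|\beta\|_1$ (an assumption here; the papers listed consider various formulations), with rescaled dual optimum $\hat\theta = (y - X\hat\beta)/\lambda$. A safe region $\mathcal{R}$ is any set guaranteed to contain $\hat\theta$, and the generic rule reads

$$\sup_{\theta \in \mathcal{R}} |x_j^\top \theta| < 1 \quad\Longrightarrow\quad \hat\beta_j = 0,$$

so the $j$-th column $x_j$ of $X$ can be safely removed. When $\mathcal{R}$ is a ball $B(c, r)$, the supremum has the closed form $|x_j^\top c| + r\,\|x_j\|_2$, which is what makes ball regions so convenient; smaller regions (domes, ellipsoids, double-cuts domes) give stronger tests at the cost of a slightly more involved supremum.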

GAP safe ball: applications, generalizations, variants

  1. [16, 2015] for Lasso problem [original definition] (the ball is sketched after this list)
  2. [19, 2017] for norm regularization problem (logistic loss, smooth SVM) [generalization]
  3. [20, 2021] for Kullback-Leibler divergence problem [generalizations: generalized-GAP and refined-GAP]
  4. [21, 2021] for atomic-norm problem [variant]
  5. [24, 2022] for elastic net support vector machine problem [applications, keeping method]
  6. [25, 2021] for SLOPE problem [application]
  7. [27, 2020] for Huber regression problem
  8. [28, 2022] for Kullback-Leibler divergence problem [a sequential version]
  9. [1, 2020] for antisparse coding problem [application, squeezing method]
  10. [2, 2022] for elastic-net problem [application, relaxing method]
  11. [5, 2022] for support vector machine problem [relation]
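
For concreteness, here is a sketch of the GAP ball of [16] in the standard Lasso setting used above (again, an assumption on the exact formulation): for any primal point $\beta$ and any dual-feasible point $\theta$, weak duality and the strong concavity of the dual give

$$\hat\theta \in B\!\left(\theta,\ \tfrac{1}{\lambda}\sqrt{2\,\mathrm{Gap}_\lambda(\beta,\theta)}\right), \qquad \mathrm{Gap}_\lambda(\beta,\theta) = P_\lambda(\beta) - D_\lambda(\theta),$$

where $P_\lambda$ and $D_\lambda$ denote the primal and dual objectives. The radius shrinks to zero as the solver converges, which is why this ball works so well dynamically; the generalizations above essentially rederive such a gap-based radius for other losses and regularizers.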

Some other screening-like methods:

  1. Screening-like methods: Squeezing [1, 2020], Relaxing [2, 2022], Subspace screening [f, 2023]
  2. Non-safe Screening methods: Strong screening [3, 2012] and [4, 2020] (the strong rule is sketched after this list)
  3. Safe Screening methods without using safe region: Region-free [5, 2022], Hessian [6, 2021] and [7, 2020]
  4. Screening methods for non-convex problems: $\ell_0$ [d, 2020], $\ell_0$-$\ell_2$ [e, 2022], …
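
To see why the rules in item 2 are not safe, here is a sketch of the sequential strong rule of [3] for the Lasso (same formulation as above): when moving from $\lambda_{k-1}$, with solution $\hat\beta(\lambda_{k-1})$, to $\lambda_k < \lambda_{k-1}$, predictor $j$ is discarded if

$$\bigl|x_j^\top \bigl(y - X\hat\beta(\lambda_{k-1})\bigr)\bigr| < 2\lambda_k - \lambda_{k-1}.$$

The bound rests on a unit-slope (non-expansiveness) assumption on the dual correlations that can fail, so discarded predictors are not guaranteed to be zero and a KKT check is needed afterwards; safe rules, by contrast, certify the zeros they remove.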

Other recent papers about screening:

  • Safe screening for generalized Lasso [b, 2018]
  • Safe screening for rank Lasso [c, 2022]

References

[1]: Elvira, Clément, and Cédric Herzet. “Safe squeezing for antisparse coding.” IEEE Transactions on Signal Processing 68 (2020): 3252-3265.
[2]: Guyard, Théo, Cédric Herzet, and Clément Elvira. “Screen & relax: accelerating the resolution of Elastic-Net by safe identification of the solution support.” ICASSP 2022-2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2022.
[3]: Tibshirani, Robert, et al. “Strong rules for discarding predictors in lasso‐type problems.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 74.2 (2012): 245-266.
[4]: Larsson, Johan, Malgorzata Bogdan, and Jonas Wallin. “The strong screening rule for SLOPE.” Advances in neural information processing systems 33 (2020): 14592-14603.
[5]: Herzet, Cédric, Clément Elvira, and Hong-Phuong Dang. “Region-free Safe Screening Tests for $\ell_1$-penalized Convex Problems.” 2022 30th European Signal Processing Conference (EUSIPCO). IEEE, 2022.
[6]: Larsson, Johan, and Jonas Wallin. “The Hessian Screening Rule.” arXiv preprint arXiv:2104.13026 (2021).
[7]: Sun, Yifan, and Francis Bach. “Safe screening for the generalized conditional gradient method.” arXiv preprint arXiv:2002.09718 (2020).
[8]: El Ghaoui, L., V. Viallon, and T. Rabbani. “Safe feature elimination in sparse supervised learning.” Technical Report UCB/EECS-2010-126, EECS Department, University of California, Berkeley (2010).
[9]: Dai, Liang, and Kristiaan Pelckmans. “An ellipsoid based, two-stage screening test for BPDN.” 2012 Proceedings of the 20th European Signal Processing Conference (EUSIPCO). IEEE, 2012.
[10]: Xiang, Zhen James, and Peter J. Ramadge. “Fast lasso screening tests based on correlations.” 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2012.
[11]: Wang, Jie, et al. “Lasso screening rules via dual polytope projection.” Advances in neural information processing systems 26 (2013).
[12]: Xiang, Zhen, Hao Xu, and Peter J. Ramadge. “Learning sparse representations of high dimensional data on large scale dictionaries.” Advances in neural information processing systems 24 (2011).
[13]: Liu, Jun, et al. “Safe screening with variational inequalities and its application to lasso.” International Conference on Machine Learning. PMLR, 2014.
[14]: Wang, Jie, et al. “A safe screening rule for sparse logistic regression.” Advances in neural information processing systems 27 (2014).
[15]: Bonnefoy, Antoine, et al. “Dynamic screening: Accelerating first-order algorithms for the lasso and group-lasso.” IEEE Transactions on Signal Processing 63.19 (2015): 5121-5132.
[16]: Fercoq, Olivier, Alexandre Gramfort, and Joseph Salmon. “Mind the duality gap: safer rules for the lasso.” International Conference on Machine Learning. PMLR, 2015.
[17]: Malti, Abed, and Cédric Herzet. “Safe screening tests for LASSO based on firmly non-expansiveness.” 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2016.
[18]: Xiang, Zhen James, Yun Wang, and Peter J. Ramadge. “Screening tests for lasso problems.” IEEE transactions on pattern analysis and machine intelligence 39.5 (2016): 1008-1027.
[19]: Ndiaye, Eugene, et al. “Gap safe screening rules for sparsity enforcing penalties.” The Journal of Machine Learning Research 18.1 (2017): 4671-4703.
[20]: Dantas, Cassio F., Emmanuel Soubies, and Cédric Févotte. “Expanding boundaries of Gap Safe screening.” The Journal of Machine Learning Research 22.1 (2021): 10665-10721.
[21]: Fan, Zhenan, Huang Fang, and Michael P. Friedlander. “Safe-screening rules for atomic-norm regularization.” arXiv preprint arXiv:2107.11373 (2021).
[22]: Yamada, Hiroaki, and Makoto Yamada. “Dynamic sasvi: Strong safe screening for norm-regularized least squares.” Advances in Neural Information Processing Systems 34 (2021): 14645-14655.
[23]: Tran, Thu-Le, et al. “Beyond GAP screening for Lasso by exploiting new dual cutting half-spaces with supplementary material.” arXiv preprint arXiv:2203.00987 (2022).
[24]: Wang, Hongmei, and Yitian Xu. “A safe double screening strategy for elastic net support vector machine.” Information Sciences 582 (2022): 382-397.
[25]: Elvira, Clément, and Cédric Herzet. “Safe rules for the identification of zeros in the solutions of the SLOPE problem.” arXiv preprint arXiv:2110.11784 (2021).
[26]: Shang, Pan, and Lingchen Kong. “$\ell_1$-Norm Quantile Regression Screening Rule via the Dual Circumscribed Sphere.” IEEE Transactions on Pattern Analysis and Machine Intelligence 44.10 (2021): 6254-6263.
[27]: Chen, Huangyue, et al. “Safe feature screening rules for the regularized Huber regression.” Applied Mathematics and Computation 386 (2020): 125500.
[28]: Wang, Hongmei, Kun Jiang, and Yitian Xu. “Sequential safe feature elimination rule for L1-regularized regression with Kullback–Leibler divergence.” Neural Networks 155 (2022): 523-535.
[29]: Pan, Xianli, and Yitian Xu. “A Safe Feature Elimination Rule for $L_1$-Regularized Logistic Regression.” IEEE Transactions on Pattern Analysis and Machine Intelligence 44.9 (2021): 4544-4554.

[b]: Ren, Shaogang, et al. “Safe feature screening for generalized LASSO.” IEEE transactions on pattern analysis and machine intelligence 40.12 (2017): 2992-3006.
[c]: Shang, Pan, Lingchen Kong, and Dashuai Liu. “A Safe Feature Screening Rule for Rank Lasso.” IEEE Signal Processing Letters 29 (2022): 1062-1066.
[d]: Atamturk, Alper, and Andrés Gómez. “Safe screening rules for $\ell_0$-regression from perspective relaxations.” International Conference on Machine Learning. PMLR, 2020.
[e]: Deza, Anna, and Alper Atamturk. “Safe Screening for Logistic Regression with $\ell_0$-$\ell_2$ Regularization.” arXiv preprint arXiv:2202.00467 (2022).
[f]: Zhong, Peiwei, and Yitian Xu. “Subspace screening rule for multi-label estimator with sparsity-inducing regularization.” Neurocomputing (2023).
