SUPPORT UNION RECOVERY IN HIGH-DIMENSIONAL MULTIVARIATE REGRESSION

Bibliographic Details
Published in: The Annals of Statistics, 2011-02, Vol. 39 (1), p. 1-47
Main Authors: Obozinski, Guillaume; Wainwright, Martin J.; Jordan, Michael I.
Format: Article
Language: English
ISSN: 0090-5364
EISSN: 2168-8966
DOI: 10.1214/09-aos776
Publisher: Institute of Mathematical Statistics (Cleveland, OH)

Abstract:
In multivariate regression, a K-dimensional response vector is regressed upon a common set of p covariates, with a matrix B* ∈ R^{p×K} of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the ℓ1/ℓ2 norm is used for support union recovery, that is, recovery of the set of s rows for which B* is nonzero. Under high-dimensional scaling, we show that the multivariate group Lasso exhibits a threshold for recovery of the exact row pattern, with high probability over the random design and noise, specified by the sample complexity parameter θ(n, p, s) := n/[2ψ(B*) log(p − s)]. Here n is the sample size, and ψ(B*) is a sparsity-overlap function measuring a combination of the sparsities and overlaps of the K regression coefficient vectors that constitute the model. We prove that the multivariate group Lasso succeeds for problem sequences (n, p, s) such that θ(n, p, s) exceeds a critical level θ_u, and fails for sequences such that θ(n, p, s) lies below a critical level θ_l. For the special case of the standard Gaussian ensemble, we show that θ_l = θ_u, so that the characterization is sharp. The sparsity-overlap function ψ(B*) reveals that, if the design is uncorrelated on the active rows, ℓ1/ℓ2 regularization for multivariate regression never harms performance relative to an ordinary Lasso approach and can yield substantial improvements in sample complexity (up to a factor of K) when the coefficient vectors are suitably orthogonal. For more general designs, it is possible for the ordinary Lasso to outperform the multivariate group Lasso. We complement our analysis with simulations that demonstrate the sharpness of our theoretical results, even for relatively small problems.
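
The abstract's two main objects translate directly into code. The sketch below is ours, not the authors' implementation: it fits the ℓ1/ℓ2 block-regularized estimator by proximal gradient descent with row-wise block soft-thresholding, and evaluates the sample complexity parameter θ(n, p, s). The function names are illustrative, and the identity-design specialization of ψ(B*) in `sparsity_overlap_identity` is an assumption flagged in its docstring; the paper defines ψ for general design covariances.

```python
# Minimal sketch (under the assumptions stated above) of the multivariate
# group Lasso: least squares with an l1/l2 block penalty on the rows of B,
# solved by proximal gradient descent with block soft-thresholding.
import numpy as np

def multivariate_group_lasso(X, Y, lam, n_iter=500):
    """Minimize (1/(2n)) * ||Y - X B||_F^2 + lam * sum_i ||B_i||_2
    over B in R^{p x K}, where B_i denotes the i-th row of B."""
    n, p = X.shape
    K = Y.shape[1]
    B = np.zeros((p, K))
    L = np.linalg.norm(X, 2) ** 2 / n      # Lipschitz constant of the gradient
    for _ in range(n_iter):
        G = X.T @ (X @ B - Y) / n          # gradient of the smooth loss
        Z = B - G / L
        # Prox of (lam/L) * sum_i ||.||_2: shrink each row of Z in l2 norm,
        # zeroing any row whose norm falls below the threshold.
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        B = np.maximum(0.0, 1.0 - (lam / L) / np.maximum(norms, 1e-12)) * Z
    return B

def sparsity_overlap_identity(B_star, tol=1e-12):
    """psi(B*) when the design is uncorrelated (identity covariance) on the
    active rows: the maximal eigenvalue of zeta^T zeta, where zeta stacks the
    s nonzero rows of B* rescaled to unit l2 norm. This specialization is an
    assumption here; see the paper for the general definition."""
    rows = B_star[np.linalg.norm(B_star, axis=1) > tol]
    zeta = rows / np.linalg.norm(rows, axis=1, keepdims=True)
    return np.linalg.eigvalsh(zeta.T @ zeta).max()

def theta(n, p, s, psi):
    """Sample complexity parameter theta(n, p, s) = n / [2 psi(B*) log(p - s)]."""
    return n / (2.0 * psi * np.log(p - s))
```

Under this identity-design specialization, the eigenvalues of ζ^T ζ sum to s (each of the s rows has unit norm), so ψ(B*) lies between s/K and s: it equals s when all K regressions share a common coefficient vector, giving no gain over the ordinary Lasso, and s/K when the rescaled coefficient vectors are orthogonal with balanced weights, which is one way to read the abstract's "up to a factor of K" improvement in sample complexity.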

Subjects:
62F07
62J07
block-norm
Covariance matrices
Eigenvalues
Exact sciences and technology
General topics
group Lasso
high-dimensional scaling
Infinity
Jordan matrices
LASSO
Linear regression
Mathematics
Matrices
Modeling
Multivariate analysis
multivariate regression
Normal distribution
Numerical analysis
Numerical analysis in abstract spaces
Numerical analysis. Scientific computation
Numerical linear algebra
Probability and statistics
Regression analysis
Regression coefficients
Sample size
Sciences and techniques of general use
second-order cone program
Simulation
simultaneous Lasso
sparsity
Statistics
Studies
variable selection