Springer Theses: Recognizing Outstanding Ph.D. Research

For further volumes: http://www.springer.com/series/8790

Aims and Scope

The series "Springer Theses" brings together a selection of the very best Ph.D. theses from around the world and across the physical sciences. Nominated and endorsed by two recognized specialists, each published volume has been selected for its scientific excellence and the high impact of its contents for the pertinent field of research. For greater accessibility to non-specialists, the published versions include an extended introduction, as well as a foreword by the student's supervisor explaining the special relevance of the work for the field. As a whole, the series will provide a valuable resource both for newcomers to the research fields described, and for other scientists seeking detailed background information on special questions. Finally, it provides an accredited documentation of the valuable contributions made by today's younger generation of scientists.

Theses are accepted into the series by invited nomination only and must fulfill all of the following criteria:

• They must be written in good English.
• The topic should fall within the confines of Chemistry, Physics, Earth Sciences and related interdisciplinary fields such as Materials, Nanoscience, Chemical Engineering, Complex Systems and Biophysics.
• The work reported in the thesis must represent a significant scientific advance.
• If the thesis includes previously published material, permission to reproduce this must be gained from the respective copyright holder.
• They must have been examined and passed during the 12 months prior to nomination.
• Each thesis should include a foreword by the supervisor outlining the significance of its content.
• The theses should have a clearly defined structure including an introduction accessible to scientists not expert in that particular field.

Sohail Bahmani

Algorithms for Sparsity-Constrained Optimization

Sohail Bahmani
Carnegie Mellon University
Pittsburgh, Pennsylvania, USA

ISSN 2190-5053          ISSN 2190-5061 (electronic)
ISBN 978-3-319-01880-5          ISBN 978-3-319-01881-2 (eBook)
DOI 10.1007/978-3-319-01881-2
Springer Cham Heidelberg New York Dordrecht London

Library of Congress Control Number: 2013949675

© Springer International Publishing Switzerland 2014

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. Exempted from this legal reservation are brief excerpts in connection with reviews or scholarly analysis or material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work. Duplication of this publication or parts thereof is permitted only under the provisions of the Copyright Law of the Publisher's location, in its current version, and permission for use must always be obtained from Springer. Permissions for use may be obtained through RightsLink at the Copyright Clearance Center. Violations are liable to prosecution under the respective Copyright Law.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)

To my parents...

Supervisor's Foreword

The problem of sparse optimization has gathered a lot of attention lately. The reason is simple: sparsity is a fundamental structural characteristic of much of the data we encounter. Indeed, one may claim that the structure in these data is an expression of sparsity. The sparsity may manifest in different ways. Often the data themselves are sparse, in that the majority of their components are zero-valued. More commonly, the data may simply be restricted in the values they can take or the patterns they may follow, and the structure in the data can then be characterized as sparsity in a transformed domain. For instance, the data may be restricted to inhabit only a small set of subspaces; in this case, descriptions of the data in terms of their projections on these subspaces will be sparse. This sparsity can be exploited for a variety of purposes: compressed sensing techniques exploit sparsity in signals to characterize them using far fewer measurements than would otherwise be required, RADAR and SONAR applications exploit the spatial sparsity of sources for better detection and localization, and so on.

At other times, sparsity may be imputed to characterizations of various aspects of the data, in an attempt to bring out the structure in them. Thus, statistical analyses and various machine learning techniques often attempt to fit sparse models to data to enable better predictions, identify important variables, etc. At yet other times, sparsity may be enforced simply to compensate for a paucity of data when learning richer or more detailed models.

In all cases, one ends up having to estimate the sparsest solution that minimizes a loss function of some kind, i.e., with an instance of the aforementioned sparse-optimization problem. The specifics vary chiefly in the loss function minimized. For instance, compressed sensing attempts to minimize the squared error between observations of the data and the observations that would be engendered by the sparse solution, machine learning techniques attempt to minimize the negative log probability of the observed data as predicted by the model, and so on.

Obtaining sparse solutions, however, is not trivial. Sparsity is defined through the ℓ₀ norm (the number of nonzero components) of the variable being optimized. To obtain a sparse solution, this norm must hence be directly minimized or, alternately, imposed as a constraint on the optimization problem. Unfortunately, optimization problems involving the ℓ₀ norm require determination of the optimal set of components to be assigned nonzero values; they are hence combinatorial in nature and generally computationally intractable. As a result, one must either employ greedy algorithms to obtain a solution or employ proxies that are relaxations of the ℓ₀ norm. Both of these approaches have yielded highly effective algorithms for optimization when the loss function is quadratic or, more generally, convex in nature.

For more generic classes of loss functions, however, the situation is not so clear. Proxies to the ℓ₀ norm that can be shown to result in optimally sparse solutions for quadratic or convex loss functions are no longer guaranteed to provide optimal solutions for other loss functions. It is similarly unclear whether greedy algorithms that are effective for well-behaved loss functions will be equally effective in the most general case.

This is the problem space that Sohail tackles in this monograph. In an outstanding series of results, he develops and analyzes a greedy framework for sparsity-constrained optimization of a wide class of loss functions, shows how it may be applied to various problems, and finally extends it to handle the case where the solutions are not merely sparse, but restricted to lie in specified subspaces.

GraSP is the proposed greedy framework for sparse optimization of loss functions. Through rigorous analysis, Sohail demonstrates that it imposes far fewer constraints on the loss function, requiring only that it be convex on sparse subspaces, and that it converges linearly to the optimal solution. As an illustrative application, he applies GraSP to the problem of feature selection through sparse optimization of logistic functions and demonstrates that it yields significantly better solutions than current methods. One-bit compressive sensing, the problem of reconstructing a signal from a series of one-bit measurements, is a challenging but exciting problem; Sohail demonstrates that GraSP-based solutions can deliver greatly improved signal recovery over all other current methods.
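To give a concrete picture of the greedy template such a framework follows, below is a minimal Python sketch of a GraSP-style loop (identify the largest gradient coordinates, merge them with the current support, minimize over the merged support, then prune), exercised on a toy sparse least-squares problem. The function names, the fixed iteration budget, the choice of 2s gradient coordinates, and the exact least-squares solve used for the restricted minimization are illustrative assumptions of this sketch, not the precise algorithm or guarantees developed in the monograph.

import numpy as np

def hard_threshold(x, k):
    # Projection onto k-sparse vectors: keep the k largest-magnitude
    # entries of x and zero out the rest.
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]
    out[keep] = x[keep]
    return out

def grasp(grad_f, restricted_argmin, n, s, iters=20):
    # GraSP-style loop (illustrative, not the monograph's exact method):
    # locate the 2s largest gradient coordinates, merge them with the
    # current support, minimize the loss over the merged support, then
    # prune back to an s-sparse iterate.
    x = np.zeros(n)
    for _ in range(iters):
        z = grad_f(x)
        Z = np.argsort(np.abs(z))[-2 * s:]
        T = np.union1d(Z, np.flatnonzero(x))
        x = hard_threshold(restricted_argmin(T), s)
    return x

# Toy instance: sparse least squares, f(x) = ||Ax - y||^2 / 2.
rng = np.random.default_rng(0)
n, m, s = 100, 40, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
y = A @ x_true

grad_f = lambda x: A.T @ (A @ x - y)

def restricted_argmin(T):
    # Minimization restricted to vectors supported on T; for a quadratic
    # loss this is an ordinary least-squares solve on the columns in T.
    b = np.zeros(n)
    sol, *_ = np.linalg.lstsq(A[:, T], y, rcond=None)
    b[T] = sol
    return b

x_hat = grasp(grad_f, restricted_argmin, n, s)
print("recovery error:", np.linalg.norm(x_hat - x_true))

On a quadratic loss this loop behaves like CoSaMP-style recovery; the appeal of the GraSP template is that the same identify-merge-minimize-prune cycle applies to non-quadratic losses such as the logistic loss mentioned above.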
Subsequently, he develops a solution to deal with model-based sparsity: problems where the solutions are not only required to be sparse, but are further restricted to lie on only specific subspaces. Such problems frequently arise, for instance, when additional information is available about the interdependence between the locations of the nonzero values in the estimated variables.

Finally, he reverses gear and addresses a more philosophical problem: identifying the best proxy for gradient-based algorithms for sparsity-constrained least-squares optimization. He arrives at the remarkable result that the optimal proxy is the ℓ₀ norm itself.

Together, the contributions of this monograph lay a solid foundation of techniques and results for any aspiring or established researcher wishing to work on the problem of sparse optimization of difficult-to-optimize loss functions. As such, I believe that this monograph is a mandatory inclusion in the library of anybody working on the topic.

Language Technologies Institute
Carnegie Mellon University
Pittsburgh, USA

Prof. Bhiksha Raj

Acknowledgements

I would like to thank Professor Bhiksha Raj, my advisor, for his continuous support and encouragement during my studies at Carnegie Mellon University. He made every effort to allow me to achieve my goals in research during the course of my Ph.D. studies. I would also like to thank Dr. Petros T. Boufounos for his insightful comments, which helped me improve the quality of my work during our collaboration, and for serving on my thesis defense committee. I would like to thank Professor José M. F. Moura and Professor Soummya Kar, who also served on the defense committee, for their enlightening advice and comments on my thesis. Above all, I would like to express my sincere gratitude to my parents, who supported me throughout my life in every aspect. I especially thank my mother for giving me the motivation and hope that helped me endure and overcome difficulties.