CFDP: Common Frequency Domain Pruning


Bibliographic Details
Main Authors: Khaki, Samir; Luo, Weihan
Format: Conference Proceeding
Language: English
Subjects: Benchmark testing; Computer architecture; Frequency-domain analysis; Measurement; Neural networks; Pipelines; Resistance
Online Access: Request full text
cited_by
cites
container_end_page 4724
container_issue
container_start_page 4715
container_title
container_volume
creator Khaki, Samir
Luo, Weihan
description As the saying goes, sometimes less is more - and when it comes to neural networks, that couldn't be more true. Enter pruning, the art of selectively trimming away unnecessary parts of a network to create a more streamlined, efficient architecture. In this paper, we introduce a novel end-to-end pipeline for model pruning via the frequency domain. This work aims to shed light on the interoperability of intermediate model outputs and their significance beyond the spatial domain. Our method, dubbed Common Frequency Domain Pruning (CFDP) aims to extrapolate common frequency characteristics defined over the feature maps to rank the individual channels of a layer based on their level of importance in learning the representation. By harnessing the power of CFDP, we have achieved state-of-the-art results on CIFAR-10 with GoogLeNet reaching an accuracy of 95.25%, that is, +0.2% from the original model. We also outperform all benchmarks and match the original model's performance on ImageNet, using only 55% of the trainable parameters and 60% of the FLOPs. In addition to notable performances, models produced via CFDP exhibit robustness to a variety of configurations including pruning from untrained neural architectures, and resistance to adversarial attacks. The implementation code can be found at https://github.com/Skhaki18/CFDP.
doi_str_mv 10.1109/CVPRW59228.2023.00499
format conference_proceeding
fulltext fulltext_linktorsrc
identifier EISSN: 2160-7516
ispartof 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2023, p.4715-4724
issn 2160-7516
language eng
recordid cdi_ieee_primary_10208515
source IEEE Xplore All Conference Series
subjects Benchmark testing
Computer architecture
Frequency-domain analysis
Measurement
Neural networks
Pipelines
Resistance
title CFDP: Common Frequency Domain Pruning
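The description above ranks a layer's channels by common frequency characteristics of their feature maps. As a rough, hypothetical sketch of that general idea (not the authors' pipeline; their implementation is linked at https://github.com/Skhaki18/CFDP), the following PyTorch snippet scores channels by the average magnitude of the 2D FFT of their feature maps and marks the lowest-scoring ones as pruning candidates. The function names, the choice of a plain FFT, and the scoring rule are illustrative assumptions, not details taken from the paper.

# Hypothetical illustration of frequency-domain channel ranking, loosely inspired by
# the CFDP abstract. Not the authors' code; see https://github.com/Skhaki18/CFDP.
import torch
import torch.nn as nn

def channel_frequency_scores(feature_maps: torch.Tensor) -> torch.Tensor:
    # feature_maps: (batch, channels, height, width) activations from one layer.
    # Score each channel by the mean magnitude of its 2D FFT over the batch.
    spectrum = torch.fft.fft2(feature_maps, norm="ortho")  # complex spectrum per channel
    return spectrum.abs().mean(dim=(0, 2, 3))              # one score per channel

def lowest_ranked_channels(scores: torch.Tensor, prune_ratio: float) -> torch.Tensor:
    # Indices of the channels with the smallest frequency scores.
    num_prune = int(prune_ratio * scores.numel())
    return torch.argsort(scores)[:num_prune]

if __name__ == "__main__":
    torch.manual_seed(0)
    conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
    images = torch.randn(8, 3, 32, 32)        # stand-in for a CIFAR-10 batch
    with torch.no_grad():
        maps = conv(images)                   # intermediate feature maps
    scores = channel_frequency_scores(maps)
    candidates = lowest_ranked_channels(scores, prune_ratio=0.5)
    print("channels proposed for pruning:", candidates.tolist())

A complete pipeline would also rebuild the layer without the pruned channels and fine-tune the network, which this sketch leaves out.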