{"id":1631,"date":"2026-03-30T00:24:36","date_gmt":"2026-03-29T16:24:36","guid":{"rendered":"https:\/\/cchsu.info\/wordpress\/zh\/research\/"},"modified":"2026-03-30T00:24:36","modified_gmt":"2026-03-29T16:24:36","slug":"research","status":"publish","type":"page","link":"https:\/\/cchsu.info\/wordpress\/zh\/research\/","title":{"rendered":"\u7814\u7a76"},"content":{"rendered":"<div id=\"pl-1631\"  class=\"panel-layout\" ><div id=\"pg-1631-0\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-0\" ><div id=\"pgc-1631-0-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-0-0-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"0\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<p class=\"page-language-switcher\"><strong>\u8a9e\u8a00\uff1a<\/strong><a href=\"https:\/\/cchsu.info\/wordpress\/research\/\">English<\/a> | \u7e41\u9ad4\u4e2d\u6587<\/p>\n<h1>\u7814\u7a76\u9858\u666f<\/h1>\n<p>Advanced Computer Vision Lab \u2014 <strong>A<\/strong>ssured <strong>C<\/strong>omputer <strong>V<\/strong>ision: <strong>L<\/strong>ean, <strong>A<\/strong>utonomous, <strong>B<\/strong>road-Spectrum<\/p>\n<p>\u7576 generative AI \u9010\u6b65\u6a21\u7cca\u771f\u5be6\u8207\u507d\u9020\u5a92\u9ad4\u7684\u908a\u754c\u3001\u81ea\u4e3b\u7cfb\u7d71\u5c0d vision \u7684\u53ef\u9760\u5ea6\u8981\u6c42\u6108\u4f86\u6108\u9ad8\uff0c\u800c Earth observation \u4e5f\u9032\u5165\u9ad8\u8cc7\u6599\u91cf\u7684\u65b0\u968e\u6bb5\u6642\uff0c\u771f\u6b63\u80fd\u90e8\u7f72\u5230\u73fe\u5834\u7684 visual intelligence \u9580\u6abb\u81ea\u7136\u8ddf\u8457\u63d0\u9ad8\u3002ACVLab \u7684\u7814\u7a76\u4e3b\u8ef8\u53ef\u6574\u7406\u6210\u56db\u500b\u5f7c\u6b64\u6263\u5408\u7684 pillars\u3002<\/p>\n<p><b>Assured Visual Intelligence<\/b> \u95dc\u5fc3\u7684\u662f\u6bcf\u4e00\u6b21 visual AI 
\u8f38\u51fa\u80fd\u4e0d\u80fd\u88ab\u4fe1\u4efb\uff0c\u7121\u8ad6\u662f\u9ad8\u58d3\u7e2e\u689d\u4ef6\u4e0b\u7684 DeepFake detection\u3001adversarial perturbation defense\uff0c\u6216\u900f\u904e proactive watermarking \u9032\u884c media authentication\uff0c\u6838\u5fc3\u90fd\u662f\u8b93 forensic\u3001medical \u8207 regulatory \u5834\u666f\u6709\u8db3\u5920\u7684 accountability\u3002<\/p>\n<p><b>Lean Visual Architectures<\/b> \u5247\u5f9e computation abstraction \u7684\u4e0d\u540c\u5c64\u6b21\u91cd\u65b0\u8a2d\u8a08\u7cfb\u7d71\uff1a\u5305\u542b exact attention \u7684 prefix-scan reformulation\uff08ELSA\uff09\u3001\u7565\u904e pixel decoding \u7684 bitstream-level forensics\u3001\u5728 ultra-low bit width \u4ecd\u76e1\u91cf\u5b88\u4f4f accuracy \u7684 adaptive quantization\uff08QuantTune\/FracQuant\uff09\uff0c\u4ee5\u53ca bandwidth-constrained satellites \u4e0a\u7684 joint transmission-restoration\uff0c\u76ee\u6a19\u662f\u628a latency\u3001memory \u8207 energy cost \u4e00\u8d77\u964d\u4e0b\u4f86\u3002<\/p>\n<p><b>Autonomous Visual Perception<\/b> \u628a vision \u5f9e 2D \u5f71\u50cf\u63a8\u9032\u5230 3D physical space\uff1amaterial-aware scene reconstruction with hyperspectral unmixing\u3001BEV adversarial defense for self-driving\uff08BFDM\uff09\u3001\u80fd\u70ba\u4e0b\u6e38 robotic pipelines \u63d0\u4f9b\u7a69\u5065\u7279\u5fb5\u7684 shadow \/ reflection removal\uff08PhaSR\u3001ReflexSplit\uff09\uff0c\u4ee5\u53ca uncertainty-aware 3D annotation for autonomous driving datasets\u3002<\/p>\n<p><b>Broad-Spectrum Scientific Sensing<\/b> \u5247\u628a\u611f\u77e5\u80fd\u529b\u63a8\u5230\u53ef\u898b\u5149\u4e4b\u5916\uff1avision-language prompts \u9a45\u52d5\u7684 universal hyperspectral restoration\uff08PromptHSI\uff09\u3001\u7372\u5f97\u672a\u4f86\u79d1\u6280\u734e\u80af\u5b9a\u7684 real-time CubeSat compressed sensing\u3001\u900f\u904e sparse spectral representations \u9032\u884c\u7684 hyperspectral 
pansharpening\uff08S<sup>3<\/sup>RNet\uff09\uff0c\u4ee5\u53ca\u80fd\u63ed\u9732 RGB \u770b\u4e0d\u5230\u64cd\u5f04\u75d5\u8de1\u7684 cross-spectral forgery detection\u3002<\/p>\n<p>\u9019\u4e9b pillars \u4e26\u4e0d\u662f\u5404\u81ea\u7368\u7acb\u3002Hyperspectral forensics \u628a trust \u548c spectral sensing \u63a5\u8d77\u4f86\uff0con-satellite real-time inference \u628a efficiency \u548c broad-spectrum data \u63a5\u8d77\u4f86\uff0cBEV adversarial defense \u5247\u628a trust \u548c embodied perception \u63a5\u8d77\u4f86\u3002\u5c0d ACVLab \u800c\u8a00\uff0c\u771f\u6b63\u80fd\u843d\u5730\u7684 visual intelligence\uff0c\u5fc5\u9808\u540c\u6642\u517c\u9867 trustworthy\u3001efficient\u3001embodied \u8207 perceptually complete\u3002<\/p>\n<h2>\u7814\u7a76\u652f\u67f1<\/h2>\n<ul>\n<li><strong>Autonomous Visual Perception<\/strong>: PhaSR\u3001ReflexSplit\u3001autonomous driving\u3001tracking\u3001embodied perception\u30013D reconstruction<\/li>\n<li><strong>Assured Visual Intelligence<\/strong>: GRACEv2\u3001UMCL\u3001DDD-Net\u3001DeepFake detection\u3001proactive authentication\u3001trustworthy media analysis<\/li>\n<li><strong>Broad-Spectrum Scientific Sensing<\/strong>: PromptHSI\u3001S<sup>3<\/sup>RNet\u3001CubeSat compressed sensing\u3001remote sensing\u3001satellite imaging<\/li>\n<li><strong>Lean Visual Architectures<\/strong>: ELSA\u3001QuantTune\u3001FracQuant\u3001bitstream-level inference\u3001CubeSat on-board processing\u3001edge deployment<\/li>\n<\/ul>\n<p>\u7814\u7a76\u7c21\u4ecb\uff1a [<a href=\"https:\/\/www.dropbox.com\/scl\/fi\/wjivz198w7soqxu6t0hlz\/Recent-Research_DFD2HSI_v2.pdf?rlkey=abaukudj7vnz338oayo0pv17j&amp;st=3lfjprys&amp;dl=0\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]\uff08\u6700\u8fd1\u66f4\u65b0\uff1a2024 \u5e74 10 \u6708\uff09<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-1\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-1\" ><div id=\"pgc-1631-1-0\"  
class=\"panel-grid-cell\" ><div id=\"panel-1631-1-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"1\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/PhaSR_teaser.png\" title=\"\u7814\u7a76\" alt=\"PhaSR: Generalized Image Shadow Removal with Physically Aligned Priors\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-1-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-1-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"2\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u9b6f\u68d2\u9670\u5f71\u79fb\u9664<\/h1>\n<p><strong>PhaSR: Generalized Image Shadow Removal with Physically Aligned Priors<\/strong><\/p>\n<p>\u5df2\u7372 <strong>IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2026<\/strong> \u63a5\u6536\u3002<\/p>\n<p>\u5728\u8907\u96dc\u4e14\u591a\u5149\u6e90\u7684\u60c5\u5883\u4e0b\uff0c\u9670\u5f71\u79fb\u9664\u5bb9\u6613\u53d7\u5230\u7269\u7406\u7167\u660e\u5148\u9a57\u8207\u5b78\u7fd2\u7279\u5fb5\u4e0d\u4e00\u81f4\u7684\u5f71\u97ff\u3002PhaSR \u7d50\u5408 physically aligned normalization \u8207 geometry-semantic rectification\uff0c\u5728\u8d85\u8d8a\u55ae\u4e00\u5149\u6e90\u5047\u8a2d\u7684\u771f\u5be6\u5834\u666f\u4e2d\u4ecd\u80fd\u7dad\u6301\u7a69\u5065\u8868\u73fe\u3002<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> \u81ea\u4e3b\u8996\u89ba\u611f\u77e5 \/ \u9b6f\u68d2\u5834\u666f\u6062\u5fa9<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/2601.17470\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/github.com\/ming053l\/PhaSR\" target=\"_blank\" rel=\"noopener 
noreferrer\">GitHub<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-2\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-2\" ><div id=\"pgc-1631-2-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-2-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"3\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/ReflexSplit_vis.png\" title=\"\u7814\u7a76\" alt=\"ReflexSplit: Single Image Reflection Separation via Layer Fusion-Separation\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-2-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-2-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"4\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u771f\u5be6\u4e16\u754c\u53cd\u5c04\u5206\u96e2<\/h1>\n<p><strong>ReflexSplit: Single Image Reflection Separation via Layer Fusion-Separation<\/strong><\/p>\n<p>\u5df2\u7372 <strong>IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2026<\/strong> \u63a5\u6536\u3002<\/p>\n<p>\u73bb\u7483\u53cd\u5c04\u6703\u9020\u6210\u9ad8\u5ea6\u975e\u7dda\u6027\u7684\u5716\u5c64\u6df7\u5408\uff0c\u8b93\u65e2\u6709\u5206\u96e2\u6a21\u578b\u5728\u771f\u5be6\u4e16\u754c\u4e2d\u5bb9\u6613\u5931\u6548\u3002ReflexSplit \u900f\u904e dual-stream fusion-separation blocks \u8207 curriculum training\uff0c\u5728\u5408\u6210\u8207\u771f\u5be6\u8cc7\u6599\u4e0a\u90fd\u9054\u5230\u66f4\u7a69\u5065\u7684\u53cd\u5c04\u5206\u96e2\u80fd\u529b\u3002<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> \u81ea\u4e3b\u8996\u89ba\u611f\u77e5 \/ 
Robust Scene Restoration<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/2601.17468\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/github.com\/wuw2135\/ReflexSplit\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-3\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-3\" ><div id=\"pgc-1631-3-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-3-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"5\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/ELSA_teaser.png\" title=\"Research\" alt=\"ELSA: Exact Linear-Scan Attention for Fast and Memory-Light Vision Transformers\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-3-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-3-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"6\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>Efficient AI Inference<\/h1>\n<p><strong>ELSA: Exact Linear-Scan Attention for Fast and Memory-Light Vision Transformers<\/strong><\/p>\n<p>Accepted to the <strong>CVPR 2026 Findings Workshop<\/strong>.<\/p>\n<p>ELSA rewrites exact softmax attention as a prefix scan over an associative monoid, delivering memory-lighter inference without retraining and with provable FP32 stability. With Triton and CUDA C++ 
implementations, it improves deployability in data centers and on edge hardware alike.<\/p>\n<p><strong>Research direction:<\/strong> Lean Visual Architectures \/ Hardware-Agnostic Inference<\/p>\n<p>arXiv preprint coming soon<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-4\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-4\" ><div id=\"pgc-1631-4-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-4-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"7\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/QuantTune_method.png\" title=\"Research\" alt=\"QuantTune: Optimizing Model Quantization with Adaptive Outlier-Driven Fine Tuning\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-4-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-4-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"8\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>Quantization-Friendly Deployment<\/h1>\n<p><strong>QuantTune: Optimizing Model Quantization with Adaptive Outlier-Driven Fine Tuning<\/strong><\/p>\n<p>Published at the <strong>IEEE International Conference on Multimedia Information Processing and Retrieval (MIPR) 2025<\/strong>.<\/p>\n<p>QuantTune targets the outlier-driven dynamic range 
amplification that arises when quantizing Transformers, markedly reducing accuracy loss at low bit widths without adding inference-side hardware complexity, and it transfers across ViT, BERT, and OPT models.<\/p>\n<p><strong>Research direction:<\/strong> Lean Visual Architectures \/ Quantization-Aware Deployment<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/2403.06497\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/ieeexplore.ieee.org\/document\/11225997\/\" target=\"_blank\" rel=\"noopener noreferrer\">IEEE Xplore<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-5\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-5\" ><div id=\"pgc-1631-5-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-5-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"9\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/PromptHSI_teaser.png\" title=\"Research\" alt=\"PromptHSI: Universal Hyperspectral Image Restoration with Vision-Language Modulated Frequency Adaptation\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-5-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-5-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"10\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>Universal Hyperspectral Restoration<\/h1>\n<p><strong>PromptHSI: Universal Hyperspectral Image Restoration with Vision-Language Modulated Frequency Adaptation<\/strong><\/p>\n<p>Published in <strong>IEEE 
Transactions on Geoscience and Remote Sensing (TGRS), Early Access, Feb. 2026<\/strong>.<\/p>\n<p>PromptHSI is an all-in-one hyperspectral image restoration framework that combines frequency-aware modulation with vision-language guided prompt learning, so a single model can handle diverse remote sensing degradations such as cloud occlusion, blur, noise, and missing spectral bands.<\/p>\n<p><strong>Research direction:<\/strong> Broad-Spectrum Scientific Sensing \/ Hyperspectral Restoration<\/p>\n<p>[<a href=\"https:\/\/ieeexplore.ieee.org\/document\/11371358\" target=\"_blank\" rel=\"noopener noreferrer\">IEEE Xplore<\/a>] [<a href=\"https:\/\/arxiv.org\/abs\/2411.15922\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/github.com\/chingheng0808\/PromptHSI\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-6\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-6\" ><div id=\"pgc-1631-6-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-6-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"11\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/TIFS_GRACEv2_overview.png\" title=\"Research\" alt=\"Towards Robust DeepFake Detection under Unstable Face Sequences: Adaptive Sparse Graph Embedding with Order-Free Representation and Explicit Laplacian Spectral Prior\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-6-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-6-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child 
panel-last-child\" data-index=\"12\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u5a92\u9ad4\u5b89\u5168\u8207 DeepFake \u9b6f\u68d2\u6027<\/h1>\n<p><strong>Towards Robust DeepFake Detection under Unstable Face Sequences: Adaptive Sparse Graph Embedding with Order-Free Representation and Explicit Laplacian Spectral Prior<\/strong><\/p>\n<p>\u76ee\u524d\u6295\u7a3f\u81f3 <strong>IEEE Transactions on Information Forensics and Security (TIFS)<\/strong>\u3002<\/p>\n<p>GRACEv2 \u91dd\u5c0d\u58d3\u7e2e\u3001\u906e\u64cb\u3001\u5f71\u683c\u7f3a\u6f0f\u8207\u9806\u5e8f\u64fe\u52d5\u6240\u9020\u6210\u7684\u4e0d\u7a69\u5b9a\u4eba\u81c9\u5e8f\u5217\u8a2d\u8a08\uff0c\u900f\u904e order-free temporal graph embedding \u8207 explicit Laplacian spectral prior\uff0c\u5728\u56b4\u82db\u771f\u5be6\u689d\u4ef6\u4e0b\u63d0\u5347 DeepFake \u5075\u6e2c\u7684\u7a69\u5065\u5ea6\u3002<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> \u53ef\u4fe1\u8996\u89ba\u667a\u6167 \/ \u9b6f\u68d2 DeepFake \u5075\u6e2c<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/2512.07498\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-7\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-7\" ><div id=\"pgc-1631-7-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-7-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"13\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/research_assets\/20260324\/IJCV_UMCL_paradigm.jpg\" title=\"\u7814\u7a76\" alt=\"UMCL: Unimodal-Generated Multimodal Contrastive Learning for Cross-compression-rate Deepfake Detection\" 
\t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-7-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-7-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"14\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u8de8\u58d3\u7e2e\u7387 DeepFake \u5075\u6e2c<\/h1>\n<p><strong>UMCL: Unimodal-Generated Multimodal Contrastive Learning for Cross-compression-rate Deepfake Detection<\/strong><\/p>\n<p>\u767c\u8868\u65bc <strong>International Journal of Computer Vision (IJCV)<\/strong>, Jan. 2026.<\/p>\n<p>UMCL \u5f9e\u55ae\u4e00\u8996\u89ba\u8f38\u5165\u4e2d\u5408\u6210\u5c0d\u58d3\u7e2e\u66f4\u7a69\u5065\u7684\u591a\u6a21\u614b\u7dda\u7d22\uff0c\u5305\u62ec rPPG\u3001\u6642\u9593 landmark \u8207\u8a9e\u610f\u5d4c\u5165\uff0c\u85c9\u6b64\u63d0\u5347\u8de8\u58d3\u7e2e\u689d\u4ef6\u4e0b\u7684 DeepFake \u5075\u6e2c\u80fd\u529b\uff0c\u540c\u6642\u4fdd\u7559\u8f03\u5177\u53ef\u89e3\u91cb\u6027\u7684\u7279\u5fb5\u95dc\u4fc2\u3002<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> \u53ef\u4fe1\u8996\u89ba\u667a\u6167 \/ \u8de8\u58d3\u7e2e\u9451\u8b58<\/p>\n<p>[<a href=\"https:\/\/link.springer.com\/article\/10.1007\/s11263-025-02606-0\" target=\"_blank\" rel=\"noopener noreferrer\">Springer<\/a>] [<a href=\"https:\/\/doi.org\/10.1007\/s11263-025-02606-0\" target=\"_blank\" rel=\"noopener noreferrer\">DOI<\/a>] [<a href=\"https:\/\/arxiv.org\/abs\/2511.18983\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-8\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-8\" ><div id=\"pgc-1631-8-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-8-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"15\" 
><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/11\/drct_fix.gif\" title=\"Research\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-8-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-8-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"16\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>Next-Generation Super-Resolution<\/h1>\n<p><strong>DRCT: Saving Image Super-Resolution away from Information Bottleneck<\/strong><\/p>\n<p>Presented at the <strong>IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024, NTIRE Workshop<\/strong> <span style=\"color: red;\">[Oral]<\/span>.<\/p>\n<p><a href=\"https:\/\/cchsu.info\/\" target=\"_blank\" rel=\"noopener noreferrer\">Chih-Chung Hsu<\/a>, Chia-Ming Lee, Yi-Shiuan Chou<\/p>\n<p><strong>Research direction:<\/strong> Lean Visual Architectures \/ Efficient Super-Resolution<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/pdf\/2404.00722.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/arxiv.org\/abs\/2404.00722\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/github.com\/ming053l\/DRCT\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>] [<a href=\"https:\/\/allproj002.github.io\/drct.github.io\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>] [<a href=\"https:\/\/drive.google.com\/file\/d\/1zR9wSwqCryLeKVkJfTuoQILKiQdf_Vdz\/view?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">Poster<\/a>] [<a 
href=\"https:\/\/docs.google.com\/presentation\/d\/1MxPPtgQZ61GFSr3YfGOm9scm23bbbXRj\/edit?usp=sharing&amp;ouid=105932000013245886245&amp;rtpof=true&amp;sd=true\" target=\"_blank\" rel=\"noopener noreferrer\">Slides<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-9\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-9\" ><div id=\"pgc-1631-9-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-9-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"17\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"http:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/11\/4SFL.png\" title=\"\u7814\u7a76\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-9-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-9-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"18\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>CT \u5f71\u50cf\u534a\u76e3\u7763\u5075\u6e2c<\/h1>\n<p><strong>A Closer Look at Spatial-Slice Features for COVID-19 Detection<\/strong><\/p>\n<p>Presented at <strong>IEEE\/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2024, DEF-AI-MIA Workshop<\/strong>.<\/p>\n<p><a href=\"https:\/\/cchsu.info\/\" target=\"_blank\" rel=\"noopener noreferrer\">Chih-Chung Hsu<\/a>, Chia-Ming Lee, Yang Fan Chiang, Yi-Shiuan Chou, Chih-Yu Jiang, Shen-Chieh Tai, Chi-Han Tsai<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> \u53ef\u4fe1\u8996\u89ba\u667a\u6167 \/ \u91ab\u7642\u5f71\u50cf<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/2404.01643.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a 
href=\"https:\/\/arxiv.org\/abs\/2404.01643\" target=\"_blank\" rel=\"noopener noreferrer\">arXiv<\/a>] [<a href=\"https:\/\/github.com\/ming053l\/E2D\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>] [<a href=\"https:\/\/allproj001.github.io\/cov19d.github.io\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-10\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-10\" ><div id=\"pgc-1631-10-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-10-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"19\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"http:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/11\/RTCS.png\" title=\"\u7814\u7a76\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-10-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-10-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"20\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u9ad8\u901f\u9ad8\u5149\u8b5c\u58d3\u7e2e\u611f\u6e2c<\/h1>\n<p><strong>Real-Time Compressed Sensing for Joint Hyperspectral Image Transmission and Restoration for CubeSat<\/strong><\/p>\n<p>\u767c\u8868\u65bc <strong>IEEE Transactions on Geoscience and Remote Sensing (TGRS)<\/strong>\u3002<\/p>\n<p><strong>Future Technology Award\uff08\u672a\u4f86\u79d1\u6280\u734e\uff09<\/strong><\/p>\n<p><a href=\"https:\/\/cchsu.info\/\" target=\"_blank\" rel=\"noopener noreferrer\">Chih-Chung Hsu<\/a>, Chih-Yu Jian, Eng-Shen Tu, Chia-Ming Lee, Guan-Lin Chen<\/p>\n<p><strong>\u7814\u7a76\u65b9\u5411\u3002<\/strong> 
Broad-Spectrum Scientific Sensing \u00d7 Lean Visual Architectures<\/p>\n<p>[<a href=\"https:\/\/ieeexplore.ieee.org\/abstract\/document\/10474407\" target=\"_blank\" rel=\"noopener noreferrer\">IEEE Xplore<\/a>] [<a href=\"https:\/\/github.com\/ming053l\/RTCS\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-11\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-11\" ><div id=\"pgc-1631-11-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-11-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"21\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/pos2.png\" width=\"486\" height=\"477\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/pos2.png 486w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/pos2-300x294.png 300w\" sizes=\"(max-width: 486px) 100vw, 486px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-11-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-11-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"22\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>COVID-19 Detection in CT Scans<\/h1>\n<p><strong>Selected Challenge Papers and Results<\/strong><\/p>\n<p><strong>ECCV Workshop 2022<\/strong> [1st place, COV19D challenge]<\/p>\n<p><a 
href=\"https:\/\/openaccess.thecvf.com\/content\/ICCV2021W\/MIA-COV19D\/papers\/Kollias_MIA-COV19D_COVID-19_Detection_Through_3-D_Chest_CT_Image_Analysis_ICCVW_2021_paper.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Spatial-Slice Feature Learning using Visual Transformer and Essential Slices Selection Module for COVID-19 Detection of CT Scans in the Wild<\/a><\/p>\n<p><strong>IEEE ICCV Workshop 2021<\/strong>\uff3bCOV19D challenge \u7b2c 3 \u540d\uff3d<\/p>\n<p><a href=\"https:\/\/ieeexplore.ieee.org\/document\/9607525\" target=\"_blank\" rel=\"noopener noreferrer\">Adaptive Distribution Learning with Statistical Hypothesis Testing for COVID-19 CT Scan Classification<\/a><\/p>\n<p>\u9019\u7cfb\u5217\u6a21\u578b\u5c08\u70ba noisy\u3001in-the-wild \u7684 CT \u5f71\u50cf\u8a2d\u8a08\uff0c\u5728\u4e0d\u540c\u7a7a\u9593\u89e3\u6790\u5ea6\u8207 slice resolution \u4e0b\u4ecd\u80fd\u7dad\u6301\u7a69\u5065\u8868\u73fe\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-12\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-12\" ><div id=\"pgc-1631-12-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-12-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"23\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/IT-SMP_flowchart.png\" width=\"583\" height=\"443\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/IT-SMP_flowchart.png 583w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/IT-SMP_flowchart-300x228.png 300w\" sizes=\"(max-width: 583px) 100vw, 583px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-12-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-12-1-0\" class=\"so-panel widget 
widget_sow-editor panel-first-child panel-last-child\" data-index=\"24\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>Longitudinal Modeling for Social Media Prediction<\/h1>\n<p><strong>A Comprehensive Study of Spatiotemporal Feature Learning for Social Media Popularity Prediction<\/strong><\/p>\n<p>Published at <strong>ACM Multimedia 2022<\/strong>.<\/p>\n<p><strong>C.C. Hsu<\/strong>, P.J. Tsai, T.C. Yeh, and X.U. Hou<\/p>\n<p>We reformulate social media popularity prediction as an identity-preserving longitudinal task and analyze how multimodal temporal features improve prediction stability over long time horizons.<\/p>\n<p>[<a href=\"https:\/\/dl.acm.org\/doi\/abs\/10.1145\/3503161.3551593\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-13\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-13\" ><div id=\"pgc-1631-13-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-13-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"25\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD.png\" width=\"3189\" height=\"2013\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD.png 3189w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD-300x189.png 300w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD-1024x646.png 1024w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD-768x485.png 768w, 
https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD-1536x970.png 1536w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2022\/12\/result_ACD-2048x1293.png 2048w\" sizes=\"(max-width: 3189px) 100vw, 3189px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-13-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-13-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"26\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u81ea\u99d5\u5834\u666f\u8a9e\u610f\u5206\u5272<\/h1>\n<p><strong>\u7cbe\u9078\u8ad6\u6587\uff1a\u805a\u7126\u7a69\u5065\u4e14\u9ad8\u6548\u7387\u7684\u5834\u666f\u7406\u89e3<\/strong><\/p>\n<p><strong>IEEE ICME Workshop 2022<\/strong><\/p>\n<p>Augmented-Training-Aware Bisenet for Real-Time Semantic Segmentation [<a href=\"https:\/\/ieeexplore.ieee.org\/document\/9859497\/\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]<\/p>\n<p><strong>IEEE ICASSP 2022<\/strong><\/p>\n<p>DCSN: Deformable Convolutional Semantic Segmentation Neural Network for Non-Rigid Scenes [<a href=\"https:\/\/ieeexplore.ieee.org\/document\/9747586\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]<\/p>\n<p>\u9019\u4e9b\u5de5\u4f5c\u805a\u7126\u65bc\u81ea\u99d5\u5834\u666f\u4e2d\u7684\u5373\u6642\u8a9e\u610f\u7406\u89e3\uff0c\u5728 robustness \u8207 low-compute deployment \u4e4b\u9593\u53d6\u5f97\u5e73\u8861\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-14\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-14\" ><div id=\"pgc-1631-14-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-14-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"27\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image 
so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/fakevsreal-1.png\" width=\"200\" height=\"150\" sizes=\"(max-width: 200px) 100vw, 200px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-14-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-14-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"28\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u507d\u9020\u5f71\u50cf \/ \u5f71\u7247\uff08DeepFake\uff09\u5075\u6e2c<\/h1>\n<p><strong>\u7cbe\u9078\u8ad6\u6587\u8207\u5ef6\u4f38\u63a8\u5ee3<\/strong><\/p>\n<p><strong>IEEE ICIP 2019<\/strong> and <strong>Applied Sciences<\/strong><\/p>\n<p>Detecting Generated Image Based on Coupled Network with Two-Step Pairwise Learning<\/p>\n<p><strong>IEEE IS3C 2018<\/strong><\/p>\n<p>Learning to Detect Fake Face Images in the Wild<\/p>\n<p>[\u5a92\u9ad4\u5831\u5c0e] <a href=\"https:\/\/view.ctee.com.tw\/technology\/17461.html\" target=\"_blank\" rel=\"noopener noreferrer\">\u5de5\u5546\u6642\u5831<\/a> \/ <a href=\"https:\/\/smctw.tw\/3352\/\" target=\"_blank\" rel=\"noopener noreferrer\">\u53f0\u5927\u65b0\u8208\u5a92\u9ad4\u4e2d\u5fc3<\/a><\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/?p=138\" target=\"_blank\" rel=\"noopener noreferrer\">Project<\/a>] [<a href=\"https:\/\/arxiv.org\/abs\/1809.08754\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/github.com\/jesse1029\/Fake-Face-Images-Detection-Tensorflow\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>] [<a href=\"http:\/\/divd.cchsu.info\" target=\"_blank\" rel=\"noopener noreferrer\">Online Demo<\/a>]<\/p>\n<p>\u507d\u9020 \/ 
\u9020\u5047\u7167\u7247\u5075\u6e2c\uff0c\u805a\u7126\u65bc\u53ef\u4fe1\u5a92\u9ad4\u5206\u6790\u8207\u6253\u64ca\u5047\u7167\u7247\u3001\u5047\u65b0\u805e\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-15\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-15\" ><div id=\"pgc-1631-15-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-15-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"29\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/flowchart_DCN.png\" width=\"477\" height=\"301\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/flowchart_DCN.png 477w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/flowchart_DCN-300x189.png 300w\" sizes=\"(max-width: 477px) 100vw, 477px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-15-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-15-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"30\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u9ad8\u5149\u8b5c\u5f71\u50cf\u7684\u6df1\u5ea6\u58d3\u7e2e\u611f\u6e2c<\/h1>\n<p><strong>\u7cbe\u9078\u8ad6\u6587\uff1a\u805a\u7126\u9ad8\u6548\u7387\u885b\u661f\u611f\u6e2c<\/strong><\/p>\n<p><strong>IEEE Transactions on Geoscience and Remote Sensing<\/strong><\/p>\n<p>DCSN: Deep Compressed Sensing Network for Efficient Hyperspectral Data Transmission of Miniaturized Satellite [<a href=\"https:\/\/ieeexplore.ieee.org\/document\/9257426\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]<\/p>\n<p><strong>CVGIP 2020<\/strong><\/p>\n<p>Deep 
Joint Compression and Super-Resolution Low-Rank Network for Fast Hyperspectral Data Transmission<\/p>\n<p>[<a href=\"https:\/\/chihungkao.github.io\/DCSN\/DCSN\" target=\"_blank\" rel=\"noopener noreferrer\">Project<\/a>] [<a href=\"https:\/\/github.com\/jesse1029\/DCSN\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>]<\/p>\n<p>\u805a\u7126\u9ad8\u5149\u8b5c \/ \u591a\u5149\u8b5c\u5f71\u50cf\u7684\u8d85\u89e3\u6790\u5ea6\u8207\u58d3\u7e2e\u611f\u6e2c\uff0c\u652f\u63f4\u66f4\u9ad8\u6548\u7387\u7684\u885b\u661f\u8cc7\u6599\u50b3\u8f38\u8207\u5fa9\u539f\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-16\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-16\" ><div id=\"pgc-1631-16-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-16-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"31\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous.png\" width=\"2156\" height=\"1444\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous.png 2156w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-300x201.png 300w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-1024x686.png 1024w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-768x514.png 768w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-1536x1029.png 1536w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-2048x1372.png 2048w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-380x254.png 380w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-285x190.png 285w, 
https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/autonomous-272x182.png 272w\" sizes=\"(max-width: 2156px) 100vw, 2156px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-16-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-16-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"32\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u4ee5\u8996\u89ba\u8cc7\u8a0a\u9032\u884c\u81ea\u99d5\u8eca\u6c7a\u7b56<\/h1>\n<p><strong>\u7cbe\u9078\u7814\u7a76\uff1a\u805a\u7126\u7a69\u5065\u7684\u8996\u89ba\u6c7a\u7b56<\/strong><\/p>\n<p><strong>Multimedia Tools and Applications<\/strong><\/p>\n<p>Deep Learning-based Vehicle Trajectory Prediction based on Generative Adversarial Network for Autonomous Driving Applications<\/p>\n<p><strong>IEEE ICCE-TW 2020<\/strong><\/p>\n<p>Learning to Predict Risky Driving Behaviors for Autonomous Driving<\/p>\n<p>[Large-Scale Vehicle Collision Dataset @ TW] [<a href=\"https:\/\/sites.google.com\/view\/tvcd-tw\/\" target=\"_blank\" rel=\"noopener noreferrer\">Link<\/a>]<\/p>\n<p>\u805a\u7126\u81ea\u99d5\u8eca\u8996\u89ba\u7cfb\u7d71\u4e2d\u7684\u5371\u96aa\u99d5\u99db\u884c\u70ba\u9810\u6e2c\uff0c\u4ee5\u53ca\u53f0\u7063\u9053\u8def\u8cc7\u6599\u5eab\u7684\u5efa\u7f6e\u8207\u5206\u6790\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-17\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-17\" ><div id=\"pgc-1631-17-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-17-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"33\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img 
\n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a01.jpg\" width=\"371\" height=\"315\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a01.jpg 371w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a01-300x255.jpg 300w\" sizes=\"(max-width: 371px) 100vw, 371px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-17-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-17-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"34\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u793e\u7fa4\u5a92\u9ad4\u71b1\u5ea6\u9810\u6e2c<\/h1>\n<p><strong>\u7cbe\u9078\u6210\u679c\u8207\u734e\u9805<\/strong><\/p>\n<ul>\n<li><strong>ACM Multimedia 2017-2020<\/strong><\/li>\n<li><em>Social Media Prediction Based on Residual Learning and Random Forest<\/em>\uff082017\uff09\u3002\u8f03\u65b0\u7684\u7248\u672c\u53ef\u53c3\u8003 publications \u9801\u9762\u3002<\/li>\n<li><span style=\"color: #800000;\">2 Best-Performance Awards and 2 Top-Performance Awards<\/span><\/li>\n<li><span style=\"color: #800000;\">Best Grand Challenge Paper Award (2017)<\/span><\/li>\n<li>[<a href=\"https:\/\/github.com\/jesse1029\/SMHP2018\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>] [<a href=\"https:\/\/dl.acm.org\/citation.cfm?id=3127894\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>]<\/li>\n<\/ul>\n<p>\u805a\u7126\u793e\u7fa4\u8cbc\u6587\u9ede\u64ca\u7387\u8207\u9577\u671f\u6d41\u884c\u5ea6\u7684\u9810\u6e2c\u8207\u5206\u6790\u3002<\/p><\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-18\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-18\" ><div id=\"pgc-1631-18-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-18-0-0\" class=\"so-panel widget widget_sow-image 
panel-first-child panel-last-child\" data-index=\"35\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN.png\" width=\"1816\" height=\"825\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN.png 1816w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN-300x136.png 300w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN-1024x465.png 1024w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN-768x349.png 768w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/SGAN-1536x698.png 1536w\" sizes=\"(max-width: 1816px) 100vw, 1816px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-18-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-18-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"36\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u8eab\u5206\u4fdd\u6301\u7684\u4eba\u81c9\u8d85\u89e3\u6790\u5ea6<\/h1>\n<p><strong>SiGAN: Siamese Generative Adversarial Network for Identity-Preserving Face Hallucination<\/strong><\/p>\n<p>\u767c\u8868\u65bc <strong>IEEE Transactions on Image Processing (TIP)<\/strong>, 2019.<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/1807.08370\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/github.com\/jesse1029\/SiGAN\" target=\"_blank\" rel=\"noopener noreferrer\">GitHub<\/a>]<\/p>\n<p>\u9084\u539f\u4e0d\u6e05\u695a\u3001\u6a21\u7cca\u7684\u4f4e\u89e3\u6790\u5ea6\u4eba\u81c9\u7167\u7247\uff0c\u540c\u6642\u4fdd\u7559\u539f\u59cb\u8eab\u5206\u8cc7\u8a0a\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-19\"  
class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-19\" ><div id=\"pgc-1631-19-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-19-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"37\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a02.png\" width=\"371\" height=\"312\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a02.png 371w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a02-300x252.png 300w\" sizes=\"(max-width: 371px) 100vw, 371px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-19-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-19-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"38\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>巨量影像分群<\/h1>\n<p><strong>CNN-Based Joint Clustering and Representation Learning with Feature Drift Compensation for Large-Scale Image Data<\/strong><\/p>\n<p>發表於 <strong>TMM 2018<\/strong>，會議版本先發表於 <strong>ICIP 2017<\/strong>。<\/p>\n<p>[<a href=\"https:\/\/arxiv.org\/abs\/1705.07091\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/github.com\/jesse1029\/Large-scale-image-clustering-feature-drifting\" target=\"_blank\" rel=\"noopener noreferrer\">Code<\/a>]<\/p>\n<p>針對巨量影像資料的聯合分群與表徵學習演算法，透過特徵漂移補償維持訓練過程中的分群穩定性。<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-20\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-20\" ><div
id=\"pgc-1631-20-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-20-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"39\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a03.png\" width=\"371\" height=\"303\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a03.png 371w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a03-300x245.png 300w\" sizes=\"(max-width: 371px) 100vw, 371px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-20-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-20-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"40\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>去方塊效應與超解析度<\/h1>\n<p><strong>Learning-Based Joint Super-Resolution and Deblocking for a Highly Compressed Image<\/strong><\/p>\n<p>發表於 <strong>TMM 2015<\/strong>，會議版本先發表於 <strong>MMSP 2013<\/strong>。<\/p>\n<p><strong>MMSP 2013 Top 10% Paper Award<\/strong><\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/Project\/LQSR\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>] [<a href=\"https:\/\/drive.google.com\/file\/d\/0B3-EGmMjT8dqM2JLM3BlUkZNX3c\/view?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/cchsu.info\/Project\/LQSR\/LQSR_Code_20150319.zip\" target=\"_blank\" rel=\"noopener noreferrer\">Matlab Source Code<\/a> (32-bit 
only)]<\/p>\n<p>同時去除區塊效應並提高解析度，讓放大後的影像維持清晰。<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-21\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-21\" ><div id=\"pgc-1631-21-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-21-0-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"41\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<p><iframe loading=\"lazy\" width=\"365\" height=\"352\" src=\"https:\/\/www.youtube.com\/embed\/5AdJU6VOBZM\" frameborder=\"0\" allow=\"autoplay; encrypted-media\" allowfullscreen><\/iframe><\/p>\n<\/div>\n<\/div><\/div><\/div><div id=\"pgc-1631-21-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-21-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"42\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>具紋理視訊的超解析度<\/h1>\n<p><strong>Temporally Coherent Super-Resolution of Textured Video via Dynamic Texture Synthesis<\/strong><\/p>\n<p>發表於 <strong>IEEE Transactions on Image Processing (TIP)<\/strong>，會議版本先發表於 <strong>MMSP 2014<\/strong>。<\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/Project\/VideoSR\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>] [<a href=\"https:\/\/drive.google.com\/file\/d\/0B5bMFjPkQlkkbi03ZGk5a0hodlE\/view?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a 
href=\"http:\/\/www.google.com\/url?q=http%3A%2F%2Fcchsu.info%2FProject%2FVideoSR%2FReleased_DTSSR.zip&amp;sa=D&amp;sntz=1&amp;usg=AFQjCNFH0HR8d9gWCJ__4nW_66lXfBuswA\" target=\"_blank\" rel=\"noopener noreferrer\">Matlab Code<\/a>]<\/p>\n<p>提供動態紋理視訊的超解析度技術，改善放大後的細節與時間一致性。<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-22\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-22\" ><div id=\"pgc-1631-22-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-22-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"43\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a04.jpg\" width=\"150\" height=\"125\" sizes=\"(max-width: 150px) 100vw, 150px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-22-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-22-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"44\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>影像重定向品質評估<\/h1>\n<p><strong>Objective Quality Assessment for Image Retargeting Based on Perceptual Geometric Distortion and Information Loss<\/strong><\/p>\n<p>發表於 <strong>IEEE Journal of Selected Topics in Signal Processing<\/strong>，會議版本先發表於 <strong>VCIP 2013<\/strong>。<\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/Project\/IQA\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>] [<a 
href=\"https:\/\/drive.google.com\/file\/d\/0B5bMFjPkQlkkbGk2NDJXcW1ITjQ\/edit?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/cchsu.info\/Project\/IQA\/SFMetric_Released_20150327.rar\" target=\"_blank\" rel=\"noopener noreferrer\">Matlab Code<\/a>]<\/p>\n<p>評估影像重定向技術的品質，量化感知幾何失真與資訊流失。<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-23\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-23\" ><div id=\"pgc-1631-23-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-23-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"45\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a05.png\" width=\"346\" height=\"240\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a05.png 346w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a05-300x208.png 300w\" sizes=\"(max-width: 346px) 100vw, 346px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-23-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-23-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"46\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>影像超解析度<\/h1>\n<p><strong>Image Super-Resolution via Feature-Based Affine Transform<\/strong><\/p>\n<p>發表於 <strong>MMSP 2011<\/strong>。<\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/Project\/ImageSR\/\" target=\"_blank\" rel=\"noopener 
noreferrer\">Project Page<\/a>] [<a href=\"https:\/\/docs.google.com\/viewer?a=v&amp;pid=sites&amp;srcid=ZGVmYXVsdGRvbWFpbnxudGh1amVzc2V8Z3g6MjhiMDg0MWU3MWUzNWM1Nw\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/cchsu.info\/Project\/ImageSR\/Released_Matlab20150318.rar\" target=\"_blank\" rel=\"noopener noreferrer\">Executable Code (Matlab)<\/a>]<\/p>\n<p><strong>\u8aaa\u660e\u3002<\/strong> \u672c\u9801\u63d0\u4f9b\u4ee5 NLM \u70ba\u4f8b\u7684\u5be6\u4f5c\u7248\u672c\uff0c\u793a\u7bc4\u6240\u63d0\u51fa\u65b9\u6cd5\u7684\u4f7f\u7528\u65b9\u5f0f\u3002<\/p>\n<p>\u9019\u9805\u5f71\u50cf\u8d85\u89e3\u6790\u5ea6\u65b9\u6cd5\u8457\u91cd\u65bc\u64f4\u5145\u8cc7\u6599\u5eab\u4e2d\u7684\u6a23\u614b\uff0c\u63d0\u5347\u653e\u5927\u5f8c\u7684\u91cd\u5efa\u6548\u679c\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-24\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-24\" ><div id=\"pgc-1631-24-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-24-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"47\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/header_fig.png\" width=\"296\" height=\"141\" sizes=\"(max-width: 296px) 100vw, 296px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-24-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-24-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"48\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u4eba\u81c9\u8d85\u89e3\u6790\u5ea6<\/h1>\n<p><strong>Face Hallucination Using Bayesian Global Estimation and 
Local Basis Selection<\/strong><\/p>\n<p>\u767c\u8868\u65bc <strong>MMSP 2010<\/strong>\u3002<\/p>\n<p>[<a href=\"https:\/\/cchsu.info\/Project\/Hallucination\/\" target=\"_blank\" rel=\"noopener noreferrer\">Project Page<\/a>] [<a href=\"https:\/\/docs.google.com\/viewer?a=v&amp;pid=sites&amp;srcid=ZGVmYXVsdGRvbWFpbnxudGh1amVzc2V8Z3g6Nzk0ZTMwOWE3OTE2MTE3Zg\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/drive.google.com\/file\/d\/0B3-EGmMjT8dqQldQcTZHWTlXamc\/view?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">Matlab Code &amp; Database<\/a>]<\/p>\n<p>\u4eba\u81c9\u8d85\u89e3\u6790\u5ea6\u653e\u5927\uff0c\u5f9e\u6975\u4f4e\u89e3\u6790\u5ea6\u4eba\u81c9\u5f71\u50cf\u91cd\u5efa\u51fa\u8f03\u6e05\u6670\u7684\u4eba\u81c9\u7d50\u679c\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-25\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-25\" ><div id=\"pgc-1631-25-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-25-0-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"49\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<p><iframe loading=\"lazy\" width=\"365\" height=\"316\" src=\"https:\/\/www.youtube.com\/embed\/ZwPC-AGrWIw\" frameborder=\"0\" allow=\"autoplay; encrypted-media\" allowfullscreen><\/iframe><\/p>\n<\/div>\n<\/div><\/div><\/div><div id=\"pgc-1631-25-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-25-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"50\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce textwidget\">\n\t<h1>\u8996\u8a0a\u9451\u8b58<\/h1>\n<p><strong>Video Forgery Detection Using the Correlation of Noise Residue<\/strong><\/p>\n<p>\u767c\u8868\u65bc 
<strong>MMSP 2008<\/strong>\u3002<\/p>\n<p><strong>\u5f15\u7528\u6b21\u6578 &gt; 100<\/strong><\/p>\n<p>[<a href=\"https:\/\/docs.google.com\/viewer?a=v&amp;pid=sites&amp;srcid=ZGVmYXVsdGRvbWFpbnxudGh1amVzc2V8Z3g6NTZjN2ZkMDZhNTM5ZDdhMA\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"https:\/\/cchsu.info\/Project\/vf_released.zip\" target=\"_blank\" rel=\"noopener noreferrer\">Matlab Code<\/a>] [<a href=\"https:\/\/drive.google.com\/file\/d\/0B3-EGmMjT8dqZzJMZ29zTHZNWjg\/view?usp=sharing\" target=\"_blank\" rel=\"noopener noreferrer\">Database<\/a>]<\/p>\n<p>\u8996\u8a0a\u9451\u8b58\u6280\u8853\uff0c\u805a\u7126\u65bc\u5f71\u7247\u507d\u9020\u5075\u6e2c\u8207\u53ef\u4fe1\u5a92\u9ad4\u5206\u6790\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><div id=\"pg-1631-26\"  class=\"panel-grid panel-has-style\" ><div class=\"panel-row-style panel-row-style-for-1631-26\" ><div id=\"pgc-1631-26-0\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-26-0-0\" class=\"so-panel widget widget_sow-image panel-first-child panel-last-child\" data-index=\"51\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-image so-widget-sow-image-default-dbf295114b96-1631\"\n\t\t\t\n\t\t>\n<div class=\"sow-image-container\">\n\t\t<img \n\tsrc=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a07.png\" width=\"352\" height=\"275\" srcset=\"https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a07.png 352w, https:\/\/cchsu.info\/wordpress\/wp-content\/uploads\/2020\/12\/a07-300x234.png 300w\" sizes=\"(max-width: 352px) 100vw, 352px\" alt=\"\" \t\tclass=\"so-widget-image\"\/>\n\t<\/div>\n\n<\/div><\/div><\/div><div id=\"pgc-1631-26-1\"  class=\"panel-grid-cell\" ><div id=\"panel-1631-26-1-0\" class=\"so-panel widget widget_sow-editor panel-first-child panel-last-child\" data-index=\"52\" ><div\n\t\t\t\n\t\t\tclass=\"so-widget-sow-editor so-widget-sow-editor-base\"\n\t\t\t\n\t\t>\n<div class=\"siteorigin-widget-tinymce 
textwidget\">\n\t<h1>\u5f71\u50cf\u8a8d\u8b49\u8207\u7ac4\u6539\u5b9a\u4f4d<\/h1>\n<p><strong>Image Authentication and Tampering Localization Based on Watermark Embedding in the Wavelet Domain<\/strong><\/p>\n<p>\u767c\u8868\u65bc <strong>Optical Engineering<\/strong>\u3002<\/p>\n<p>[<a href=\"http:\/\/ieeexplore.ieee.org\/document\/1442247\/\" target=\"_blank\" rel=\"noopener noreferrer\">PDF<\/a>] [<a href=\"http:\/\/cchsu.info\/Project\/WaveletWaterMarking_Released.rar\" target=\"_blank\" rel=\"noopener noreferrer\">Source Code<\/a>]<\/p>\n<p>\u5c07\u6d6e\u6c34\u5370\u85cf\u5165\u5f71\u50cf\u4e2d\uff0c\u4e26\u53ef\u8010\u53d7\u4e0d\u540c\u653b\u64ca\u4ee5\u9032\u884c\u5f71\u50cf\u8a8d\u8b49\u8207\u7ac4\u6539\u5b9a\u4f4d\u3002<\/p>\n<\/div>\n<\/div><\/div><\/div><\/div><\/div><\/div>","protected":false},"excerpt":{"rendered":"<p>\u8a9e\u8a00\uff1aEnglish | \u7e41\u9ad4\u4e2d\u6587 \u7814\u7a76\u9858\u666f Advanced Computer Vision Lab \u2014 A [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"parent":1629,"menu_order":2,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-1631","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/pages\/1631","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/types\/page"}],"replies":[{"embeddable":true,"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/comments?post=1631"}],"version-history":[{"count":0,"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/pages\/1631\/revisions"}],"up":[{"embeddable":true,"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/pages\/1629"}],"wp:attachment":[{"href":"https:\/\/cchsu.info\/wordpress\/wp-json\/wp\/v2\/media?parent=1631"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]
}}