Since the Lipschitz properties of CNNs are widely considered to be related to
adversarial robustness, we theoretically characterize the $\ell_1$ norm and
$\ell_\infty$ norm of 2D multi-channel convolutional layers and provide
efficient methods to compute the exact $\ell_1$ norm and $\ell_\infty$ norm.
Based on our theorem, we propose a novel regularization method termed norm
decay, which can effectively reduce the norms of convolutional layers and
fully-connected layers. Experiments show that norm-regularization methods,
including norm decay, weight decay, and singular value clipping, can improve
the generalization of CNNs. However, they can slightly hurt adversarial robustness.
Observing this unexpected phenomenon, we compute the norms of layers in the
CNNs trained with three different adversarial training frameworks and
surprisingly find that adversarially robust CNNs have comparable or even larger
layer norms than their non-adversarially robust counterparts. Furthermore, we
prove that, under a mild assumption, adversarially robust classifiers can be
achieved and that they can have arbitrarily large Lipschitz constants. For this
reason, enforcing small norms on CNN layers may be neither necessary nor
effective in achieving adversarial robustness. The code is available at
https://github.com/youweiliang/norm_robustness.
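
As a rough illustration of the quantities involved, the sketch below computes induced $\ell_1$ and $\ell_\infty$ operator norms of fully-connected and convolutional layers in PyTorch and adds them to a training loss as a norm-decay-style penalty. The function names, the closed-form conv-layer expressions, and the penalty coefficient `lam` are illustrative assumptions, not the exact formulas or implementation from the paper; see the repository above for the authors' code.

```python
# Minimal, illustrative sketch (not the authors' implementation) of the
# induced operator norms discussed in the abstract. For a fully-connected
# layer y = Wx, the induced l1 norm is the maximum absolute column sum of W
# and the induced l_inf norm is the maximum absolute row sum. The conv-layer
# expressions below treat the layer as a linear map and use common
# closed-form quantities (summed absolute kernel weights per output/input
# channel); the exact formulas proved in the paper may differ in detail.

import torch
import torch.nn as nn


def linear_l1_norm(layer: nn.Linear) -> torch.Tensor:
    # ||W||_1 = max_j sum_i |W_ij| (maximum absolute column sum)
    return layer.weight.abs().sum(dim=0).max()


def linear_linf_norm(layer: nn.Linear) -> torch.Tensor:
    # ||W||_inf = max_i sum_j |W_ij| (maximum absolute row sum)
    return layer.weight.abs().sum(dim=1).max()


def conv_linf_norm(layer: nn.Conv2d) -> torch.Tensor:
    # Assumed closed form: max over output channels of the total absolute
    # kernel weight feeding into that channel.
    w = layer.weight  # shape: (out_channels, in_channels, kH, kW)
    return w.abs().sum(dim=(1, 2, 3)).max()


def conv_l1_norm(layer: nn.Conv2d) -> torch.Tensor:
    # Symmetric expression: max over input channels of the total absolute
    # kernel weight emanating from that channel.
    w = layer.weight
    return w.abs().sum(dim=(0, 2, 3)).max()


if __name__ == "__main__":
    conv = nn.Conv2d(3, 16, kernel_size=3, padding=1, bias=False)
    fc = nn.Linear(128, 10, bias=False)

    # A "norm decay"-style penalty could simply add the layer norms to the
    # task loss, scaled by a coefficient lam (hypothetical usage).
    lam = 1e-3
    task_loss = torch.tensor(0.0)  # placeholder for cross-entropy, etc.
    loss = task_loss + lam * (conv_linf_norm(conv) + linear_linf_norm(fc))
    loss.backward()
    print("penalized loss:", loss.item())
```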
