Outsourcing neural network inference tasks to an untrusted cloud raises data
privacy and integrity concerns. To address these challenges, several
privacy-preserving and verifiable inference techniques have been proposed based
on replacing the non-polynomial activation functions such as the rectified
linear unit (ReLU) function with polynomial activation functions. Such
techniques usually require polynomials with integer coefficients or polynomials
over finite fields. Motivated by such requirements, several works proposed
replacing the ReLU activation function with the square activation function. In
this work, we empirically show that the square function is not the best
degree-$2$ polynomial that can replace the ReLU function even when restricting
the polynomials to have integer coefficients. We instead propose a degree-$2$
polynomial activation function with a first order term and empirically show
that it can lead to much better models. Our experiments with various
architectures on the CIFAR-$10$ and CIFAR-$100$ datasets show that our proposed
activation function improves the test accuracy by up to $9.4\%$ compared to the
square activation function.
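A minimal sketch of the two activation choices discussed above: the square
function and a degree-$2$ polynomial with a first-order term. The integer
coefficients `a` and `b` below are placeholders for illustration, not the
specific values proposed in this work.

```python
import numpy as np

def relu(x):
    # Standard rectified linear unit, the activation being replaced.
    return np.maximum(x, 0.0)

def square_act(x):
    # Degree-2 polynomial replacement without a first-order term.
    return x * x

def poly_act(x, a=1, b=1):
    # Degree-2 polynomial replacement with a first-order term.
    # a and b are hypothetical integer coefficients chosen for
    # illustration; they are compatible with schemes that require
    # integer-coefficient polynomials.
    return a * (x * x) + b * x

x = np.linspace(-2.0, 2.0, 5)
print("relu:  ", relu(x))
print("square:", square_act(x))
print("poly:  ", poly_act(x))
```

Unlike the square function, the polynomial with a first-order term is not
symmetric about zero, so it distinguishes negative from positive inputs,
which is one intuitive reason it can approximate ReLU-like behavior more
closely.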
