Scaling-up the Analysis of Neural Networks by Affine Forms: a Block-Wise Noising Approach

EasyChair Preprint 10867
9 pages • Date: September 8, 2023

Abstract

The effectiveness of neural networks in handling visual perturbations is frequently assessed using abstract transforms, such as affine transformations. However, these transforms may lose precision and be computationally expensive in both time and memory. In this article we propose a novel approach, called block-wise noising, to overcome these limitations. Block-wise noising simulates real-world situations in which particular portions of an image are disrupted, by inserting non-zero noise symbols only inside a given section of the image. Using this method, it is possible to assess neural networks' resilience to these disturbances while preserving scalability and accuracy. The experimental results demonstrate that block-wise noising achieves a 50% speed improvement over the usual affine forms on specific trained neural networks. It can be especially helpful for applications such as computer vision, where real-world images may be susceptible to different forms of disturbance.

Keyphrases: Artificial Intelligence, Abstract Interpretation, Optimisation, Scalability
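The core idea of block-wise noising — attaching noise symbols of an affine form only to pixels inside a chosen image region, so the rest of the image stays exact — can be sketched as below. This is an illustrative reconstruction under stated assumptions (one noise symbol per perturbed pixel, interval concretization), not the authors' implementation; all function names are hypothetical.

```python
import numpy as np

def blockwise_noise_affine(image, top, left, h, w, eps):
    """Build an affine form  x = center + sum_k a_k * eps_k,  eps_k in [-1, 1],
    where noise symbols are attached only to pixels inside the (h x w) block
    whose top-left corner is (top, left). Pixels outside the block carry
    zero coefficients, so the number of symbols is h*w, not H*W.
    (Hypothetical helper; the per-pixel-symbol encoding is an assumption.)"""
    H, W = image.shape
    center = image.astype(float).copy()
    coeffs = []
    for i in range(top, top + h):
        for j in range(left, left + w):
            a = np.zeros((H, W))
            a[i, j] = eps  # symbol scaled by the perturbation radius
            coeffs.append(a)
    return center, np.stack(coeffs)  # coeffs: (num_symbols, H, W)

def concretize(center, coeffs):
    """Interval concretization of the affine form: center +/- sum |a_k|."""
    radius = np.abs(coeffs).sum(axis=0)
    return center - radius, center + radius
```

A short usage example: perturbing only a 2x2 block of a 4x4 image yields 4 noise symbols instead of the 16 a whole-image perturbation would require, which is the source of the scalability gain the abstract describes.

```python
image = np.zeros((4, 4))
center, coeffs = blockwise_noise_affine(image, top=1, left=1, h=2, w=2, eps=0.1)
lo, hi = concretize(center, coeffs)
# pixels inside the block are uncertain; pixels outside remain exact
```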