POPL'24 · Verification · ReLU · Convex Hull · Formal Methods

WraLU: Fast and Precise ReLU Hull Approximation

A fast and precise approach to over-approximating the convex hull of the ReLU function, achieving 10x to 10^6x runtime improvements and up to 50% fewer constraints on the evaluated benchmarks.

Overview

WraLU is a fast and precise approach to over-approximating the convex hull of the ReLU function (referred to as the ReLU hull), one of the most widely used activation functions. The work was published at POPL'24 (the 51st ACM SIGPLAN Symposium on Principles of Programming Languages).

Key Idea

Our key insight is to formulate a convex polytope that "wraps" the ReLU hull by reusing the linear pieces of the ReLU function as the lower faces and constructing upper faces adjacent to them. The upper faces can be constructed efficiently from the edges and vertices of the lower faces, since a hyperplane in n-dimensional space is determined by an (n−1)-dimensional hyperplane together with a point outside it.
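The idea is easiest to see in the single-neuron case, where it reduces to the classic "triangle" relaxation. The sketch below is illustrative only (it is not the WraLU implementation and the function name `relu_hull_1d` is hypothetical): the two linear pieces of ReLU serve as lower faces, and the single upper face is the line through the endpoints (l, 0) and (u, u) of those pieces.

```python
# Illustrative sketch, NOT the WraLU implementation: the exact ReLU hull
# for one neuron with input bound l <= x <= u (l < 0 < u).

def relu_hull_1d(l: float, u: float):
    """Return constraints (a, b, c), meaning a*x + b*y <= c, for the hull."""
    assert l < 0 < u, "the hull is only non-trivial when 0 lies inside [l, u]"
    # Lower faces, reused directly from ReLU's linear pieces:
    faces = [
        (0.0, -1.0, 0.0),   # -y <= 0       (y >= 0)
        (1.0, -1.0, 0.0),   #  x - y <= 0   (y >= x)
    ]
    # Upper face: the line through (l, 0) and (u, u),
    # i.e. y <= u * (x - l) / (u - l).
    k = u / (u - l)
    faces.append((-k, 1.0, -k * l))  # -k*x + y <= -k*l
    return faces

faces = relu_hull_1d(-1.0, 2.0)
# Points on the ReLU graph, and one interior point, satisfy all faces:
for x, y in [(-1.0, 0.0), (0.0, 0.0), (2.0, 2.0), (0.0, 0.5)]:
    assert all(a * x + b * y <= c + 1e-9 for a, b, c in faces)
```

In higher dimensions the same principle applies: each upper face is anchored on a ridge of a lower face plus one additional vertex, which is what makes the construction cheap compared to exact convex-hull computation.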

Performance Highlights

  • 10x to 10^6x runtime improvement over the compared baselines in the reported evaluation
  • Up to 50% fewer constraints while maintaining or improving precision
  • Handles arbitrary input polytopes and higher-dimensional cases
  • Used in verification experiments on CNNs and ResNet-style models with tens of thousands of neurons
  • Designed for LP-based neural network verification pipelines that benefit from tighter ReLU relaxations

How It Works

  1. Lower Faces: Reuse the linear pieces of the ReLU function directly
  2. Upper Face Construction: Build upper faces adjacent to lower faces using edge and vertex information
  3. Wrapping Polytope: The resulting convex polytope tightly wraps the ReLU hull
  4. Integration: Feed the constraints to LP solvers for neural network verification
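To make step 4 concrete, here is a hypothetical sketch of how the wrapping polytope's constraints yield output bounds via linear programming. For this tiny 2-D example we solve the LP by enumerating vertices (intersections of constraint pairs) rather than calling an external LP solver; the helper `lp_max` is illustrative, not part of WraLU.

```python
# Hypothetical sketch of step 4: bounding a neuron's output over the
# wrapping polytope is a small linear program. Here we solve it by
# vertex enumeration, which is feasible for a 2-D toy example.
from itertools import combinations

def lp_max(c_obj, faces, tol=1e-9):
    """Maximize c_obj . (x, y) subject to a*x + b*y <= c for each face."""
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(faces, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < tol:
            continue  # parallel constraints: no intersection vertex
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + tol for a, b, c in faces):
            value = c_obj[0] * x + c_obj[1] * y
            best = value if best is None else max(best, value)
    return best

# Triangle relaxation of ReLU over [l, u] = [-1, 2] (see "Key Idea"):
faces = [(0.0, -1.0, 0.0), (1.0, -1.0, 0.0), (-2/3, 1.0, 2/3)]
print(round(lp_max((0.0, 1.0), faces), 6))  # upper bound on the output y
```

An LP solver reaches the same optimum at a vertex of the polytope; tighter wrapping polytopes (fewer, better-placed faces) therefore translate directly into tighter verified bounds.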
Related Projects

  • WraAct: Extension to general activation functions (Sigmoid, Tanh, MaxPool)
  • wraact (library): Unified Python library for hull approximation
  • propdag: Bound propagation framework for DAG-structured networks

Citation

@article{ma2024relu,
  title={ReLU Hull Approximation},
  author={Ma, Zhongkui and Li, Jiaying and Bai, Guangdong},
  journal={Proceedings of the ACM on Programming Languages},
  volume={8},
  number={POPL},
  pages={2260--2287},
  year={2024},
  publisher={ACM}
}