WraLU: Fast and Precise ReLU Hull Approximation
A fast and precise approach to over-approximating the convex hull of the ReLU function, achieving a 10X to 10⁶X speedup while producing 50% fewer constraints.
Overview
WraLU is a fast and precise approach to over-approximating the convex hull of the ReLU function, one of the most widely used activation functions; this convex hull is referred to as the ReLU hull. The work was published at POPL’24 (the 51st ACM SIGPLAN Symposium on Principles of Programming Languages).
Key Idea
Our key insight is to formulate a convex polytope that “wraps” the ReLU hull by reusing the linear pieces of the ReLU function as the lower faces and constructing upper faces adjacent to them. The upper faces can be constructed efficiently from the edges and vertices of the lower faces, because a hyperplane in n-dimensional space is uniquely determined by a hyperplane in an (n−1)-dimensional subspace together with a point outside it.
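To make this concrete, here is a minimal sketch of the one-dimensional case, where the wrapping polytope coincides with the well-known triangle relaxation. This is an illustration only, not the WraLU API, and the function name `relu_hull_1d` is hypothetical: the ReLU's two linear pieces serve as the lower faces, and the single upper face is determined by the two lower-face vertices (l, 0) and (u, u).

```python
# Illustrative sketch (not the WraLU API): the 1-D ReLU hull.
# For y = max(0, x) with input bounds l < 0 < u, the exact hull is a
# triangle whose lower faces are the ReLU's own linear pieces and whose
# upper face passes through the lower-face vertices (l, 0) and (u, u).

def relu_hull_1d(l: float, u: float):
    """Return constraints (a, b, c) meaning a*x + b*y <= c."""
    assert l < 0.0 < u, "the hull is only nontrivial when the input spans 0"
    slope = u / (u - l)             # upper face fixed by the two vertices
    return [
        (0.0, -1.0, 0.0),           # -y <= 0             i.e. y >= 0
        (1.0, -1.0, 0.0),           #  x - y <= 0         i.e. y >= x
        (-slope, 1.0, -slope * l),  #  y <= slope*(x - l)
    ]

print(relu_hull_1d(-1.0, 2.0))
```

In higher dimensions the same picture generalizes: each upper face is pinned down by an edge of the lower faces plus one additional vertex, which is what makes the construction cheap.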
Performance Highlights
- 10X to 10⁶X speedup compared to existing methods
- 50% fewer constraints while maintaining or improving precision
- Handles arbitrary input polytopes and higher-dimensional cases
- Scales to networks with tens of thousands of neurons, including modern ResNets and CNNs
- Integrated into PRIMA, a state-of-the-art neural network verifier
How It Works
- Lower Faces: Reuse the linear pieces of the ReLU function directly
- Upper Face Construction: Build upper faces adjacent to lower faces using edge and vertex information
- Wrapping Polytope: The resulting convex polytope tightly wraps the ReLU hull
- Integration: Feed the resulting constraints to an LP solver for neural network verification (see the sketch after this list)
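The integration step can be pictured with a toy LP. The sketch below feeds the 1-D hull constraints from the earlier example to SciPy's `linprog`, an illustrative stand-in for the solver a real verifier would use; the bounds l = −1, u = 2 are assumed purely for illustration.

```python
# Hedged sketch: with hull constraints A @ [x, y] <= c in hand, bounding a
# neuron's output reduces to a small LP. SciPy's linprog stands in for the
# LP solver used inside a verifier.
import numpy as np
from scipy.optimize import linprog

l, u = -1.0, 2.0
slope = u / (u - l)
# The three faces of the 1-D ReLU hull: y >= 0, y >= x, y <= slope*(x - l).
A = np.array([[0.0, -1.0],
              [1.0, -1.0],
              [-slope, 1.0]])
c = np.array([0.0, 0.0, -slope * l])

# Upper bound on y over the hull: maximize y, i.e. minimize -y.
res = linprog(c=[0.0, -1.0], A_ub=A, b_ub=c,
              bounds=[(l, u), (None, None)])
print("upper bound on ReLU output:", -res.fun)  # expect 2.0 (= u)
```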
Related Tools
- WraAct: Extension to general activation functions (Sigmoid, Tanh, MaxPool)
- wraact (library): Unified Python library for hull approximation
- propdag: Bound propagation framework for DAG-structured networks
Citation
@article{ma2024relu,
  title     = {{ReLU} Hull Approximation},
  author    = {Ma, Zhongkui and Li, Jiaying and Bai, Guangdong},
  journal   = {Proceedings of the ACM on Programming Languages},
  volume    = {8},
  number    = {POPL},
  pages     = {2260--2287},
  year      = {2024},
  publisher = {ACM}
}